Statistics Has Its Moment

Back when I was a student at RPI, I took numerous courses as part of the requirements for my degrees in Management Engineering.  Most of them offered a good foundation in aspects of business which proved very useful in the years to follow.  But only one of them had a massive and lasting effect on my worldview: Statistics.

Statistics?  Why Statistics?  Probably part of the reason was that it took me a while to get the hang of it.  My professor, Dr. George Manners, had a wonderful Georgian accent which was quite different from what those of us from New England were used to.  About mid-semester, he gave us a test and virtually everybody did poorly.  Then he did a surprising thing.  He got up in front of the class and said that obviously he hadn’t done a good job of teaching the material, so he would cover it again and give a re-test. Wow! That made me take notice.  So I paid close attention and discovered there was magic in the art of choosing a sample size, normalizing the data and running statistical tests.  I did pretty well on the re-test and I also struck up a friendship with the professor which extended well beyond graduation.  Later, he taught me Marketing, another course which has had a long-standing impact.

Getting back to statistics, I never looked at data the same way after taking this course and a successor course in the design of experiments.  I learned a valuable lesson that many politicians and other public figures have not learned, which is that a sample size of one, or even a few more data points than that, is basically statistically meaningless.  We also learned about the margin of error.  If a political poll is conducted and one candidate leads another by three percentage points, but the margin of error is six percentage points, it’s basically a statistical tie, not a conclusion one can bank on.  So when the press or politicians tout a particular result, I pay close attention to the statistics behind the conclusion and decide whether the suggested results pass muster.  Frankly, this is a skill which can be valuable to anybody who wants to follow news stories built on statistics or other data and make sense of them.
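
To make the margin-of-error point concrete, here is a minimal sketch in Python using the standard formula for a polled proportion at roughly 95% confidence; the poll numbers are invented purely for illustration.

```python
# Margin of error for a polled proportion, in percentage points.
# Sample size and the "3 point lead" are hypothetical illustration values.
import math

def margin_of_error(sample_size: int, proportion: float = 0.5, z: float = 1.96) -> float:
    """Approximate margin of error (percentage points) at ~95% confidence."""
    return 100 * z * math.sqrt(proportion * (1 - proportion) / sample_size)

lead = 3.0                        # candidate A leads candidate B by 3 points
moe = margin_of_error(267)        # hypothetical poll with only 267 respondents
print(f"margin of error: +/- {moe:.1f} points")
print("statistical tie" if lead < moe else "lead is outside the margin of error")
```

With only a few hundred respondents the margin of error is around six points, so a three-point lead is well inside the noise; quadrupling the sample size roughly halves the margin.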

Now, we are all living in the Covid-19 era. And for many of us in the public, the people whose statements we can trust the most are the ones who understand data and are able to present conclusions based on statistical evidence, rather than on gut feel.  This is closely related to looking at news stories or public statements and determining whether the information is evidence-based, using techniques such as the scientific method and peer review, or anecdotal, and therefore not to be trusted.  Intuition can be very powerful, but it is far more useful when combined with a data-driven outlook on these types of challenges.

I teach an Introduction to Business course at Northeastern University and we have the students conduct surveys as part of their business projects.  Using survey and statistical tools such as those offered by Qualtrics, they run surveys and test conclusions on ideas for their business models.  The surveys carry much more weight for me if the sample size is at least 30, and ideally much more, and if the respondents have a demographic profile which fits the proposed target market for their products or services.  This is hardly rigorous statistical work, but it is much more valid, and more convincing for a reader, than stating opinions without any research to back them up.

So statistics and big data are having a moment. It’s a good reminder that certain skills, such as statistics, investment planning and effective written and oral communications, can offer value that goes well beyond the classroom and better equips all of us to be effective and informed citizens and consumers.  That’s why I encouraged both of my sons to take college level statistics courses and add this particular skill to their life toolkit.  So the next time you hear a news story or listen to a public figure, think about whether the opinions stated are backed up by data. It’s a basic skill from which we can all benefit.

Reflections on Robot Proof

I recently finished reading the book Robot Proof: Higher Education in the Age of Artificial Intelligence, written by Dr. Joseph Aoun, the President of Northeastern University.  My interest had been sparked by a panel on which Dr. Aoun participated concerning the future of work and how it will be affected by developments in Artificial Intelligence (AI).

I’ve written previously about applications of Artificial Intelligence (AI) in a post called “Contact Centers Get Smarter,” but Aoun’s book encouraged me to take a deeper dive into the topic. The intent of this post is to share my thoughts on what I’ve learned and consider some implications.  For context, I am a member of the part-time faculty at Northeastern after working many years within the high tech industry in disciplines which included Product Management, Information Technology, Engineering Management, Operations and Consulting.  I also have two children who have been studying at universities in recent years, one in Liberal Arts and one in Engineering. So I have a personal stake in the matters which Dr. Aoun discusses in his book.

Robot Proof contends that as AI matures, it will have a dramatic effect on the careers of human workers. Many employment roles which involve a variety of repetitive steps will begin to be either replaced or supplemented by algorithms. Thus, a key point in Robot Proof is that human workers should hone cognitive capacities where human skills and adaptability will offer an advantage. Examples of such cognitive capacities include critical thinking, systems thinking, entrepreneurship and cultural agility.  Another overarching theme is the importance of lifelong learning for people in the workforce at all levels.  As technologies change rapidly, workers will need to develop enhanced skills and new areas of domain expertise. Aoun also argues that the skills taught in leading liberal arts programs, such as communications, critical thinking and creativity, can be valuable as part of one’s “robot proof” toolkit.

Robot Proof offers a lot of depth and thought on what kinds of skills and literacies will be useful over the course of one’s career. For example, I’ve found that systems thinking and strong written communications skills have been very useful within my own career in enabling me to shift between different roles depending upon my needs and those of employers or clients.  Being able to make connections across diverse disciplines and then apply them is an example of a skill that Aoun calls far transfer. Aoun also talks about the value of being able to shift perspective and change one’s mindset to expand the range of potential solutions. So this book offers encouragement for people who have diverse interests and like to bring a generalist’s perspective into their work or creative endeavors.

Dr. Aoun also contends that the university system in the United States needs to evolve from its current focus on teaching undergraduates and engaging in research, and take on the challenge of partnering with industry, government and other institutions to offer programs of lifelong learning. This would be a substantial transformation, but I believe these ideas should be in the mix in the ongoing discussion on how to make higher education both more effective and more affordable.

To summarize, I think Dr. Aoun’s book offers a set of useful approaches for anybody who wonders about the future of work in an era where robots and other forms of artificial intelligence will play a greater role. The ideas here are highly relevant for current students and members of today’s workforce who want to stay competitive in their fields.  The book also poses ample challenges for the education and business communities. The value of building a “Robot Proof” toolkit of skills resonates for me as a teacher, parent and consultant. Welcome to the workplace of the near future.

What do you think?  Is your current set of skills and career path sufficiently robot proof?  To continue the conversation, please feel free to comment here or contact me on LinkedIn.

Shifting Gears — From Product Management to Teaching

The year 2018 was a time of career changes for me. My last corporate assignment, as a Product Line Manager for the Converged Communications Division of Dialogic, ended in early January.  The division was divested to another company, Sangoma, and all of the roughly eight product lines I managed were part of the deal, but my time was up.

I’d had a productive three-year run in my return to the company, so I decided this presented an opportunity to re-assess how I wanted to spend my time. I considered various possibilities, but one of the most attractive directions was to explore the potential to get involved in university teaching. Teaching or training has often been part of my work in product management and consulting, but I liked the idea of being on a college campus and bringing my business experience into the classroom. Several of my business colleagues had previously made the move from business into academia, so I began reaching out to them as part of my networking process.

It turns out that having a master’s degree is one of the requirements to be considered for some adjunct teaching roles, so my Master of Engineering in Management Engineering from Rensselaer (RPI) covered that prerequisite. The Boston area has a rich selection of diverse universities and Northeastern University in particular has a strong business program. But it quickly became clear through my networking meetings that if I wanted to seriously pursue a teaching role, I needed to target a particular semester and make sure I’d be available to teach for several months at a time.  My early contacts were promising, so I decided to go for it.

I continued job hunting, but also looked hard for potential consulting assignments that would give me more time flexibility in the event a teaching assignment opened up. I was open to a lot of options and got a tentative offer from Northeastern to teach an Introduction to Business course around March. The details firmed up a couple of months later.

In parallel, I started doing some consulting, so I was busy with that work and had a teaching assignment which would start just after Labor Day. In preparation, I continued my networking, but now with a specific focus. I had several informal discussions with colleagues who had moved into academia and they were virtually all willing to share takeaways from their experiences in the classroom. This was all helpful and I felt good about the upcoming change.  One big change from my other teaching work was the extent to which the current teaching tools are online and highly integrated.  I took a course in Blackboard, which is the teaching platform used at Northeastern, but had to come up to speed very fast in its practical use for this particular class. My years of IT experience helped, but there were many fine details which weren’t always obvious.

Intro to Business at Northeastern is somewhat unusual in various respects. I became part of a faculty teaching team which teaches the class to several hundred incoming freshmen at the D’Amore McKim School of Business.  A standard syllabus had been prepared, but it was up to each individual teacher to deliver the material in an effective manner. We met as a team before the semester began and typically met every two weeks after that. Our class sizes were relatively small — my fall semester sections had fewer than 15 students.

Classes began on the Wednesday after Labor Day.  From this point forward, I taught the students three days a week in the classroom, but quickly found I needed to spend much more time outside of the classroom to prepare. As teachers, our goal was to provide students with a foundation in the basic elements of business — addressing topics such as entrepreneurship, marketing and accounting / finance — while simultaneously helping them form teams which would apply the concepts in a variety of assignments. The early workload was substantial, but the students quickly had chances to develop new skills in areas such as conducting research surveys and learning about finance using Bloomberg terminals. As teachers, a key challenge was to engage with the students on various aspects of business through readings of a textbook, numerous articles and other tools such as videos, and to encourage the students to use critical thinking in applying the material. The teaching was less about lecturing than about alternative methods; as a professor, I facilitated in-class discussions and encouraged the teams to work together to reach conclusions.

Here are a few takeaways from my first semester on campus:

  1. Teaching a class for the first time is a lot of work. I’d seen this before in my earlier teaching experiences, but it was particularly true for this class. Most of the effort was outside the classroom and included preparation, review and grading of assignments and meeting with students. It felt like I learned a lot over the course of the semester and this will help me to be more efficient and effective in teaching future classes.
  2. I found this particular class drew upon a wide range of my business experiences and skills.  For example, the review of supply chain and operations tied well into my original degree studies in Management Engineering at RPI and experiences from the three years when I managed purchasing and materials management for Burroughs and Fujitsu.
  3. Managing the class relied heavily upon online technology, notably the learning management tool known as Blackboard. Blackboard is a bit quirky, and the integration between third-party software and Blackboard is brittle and held a few surprises. My many years of IT experience helped here.
  4. In business, the focus is on meeting customer needs. When teaching university classes, meeting the needs of students is the central focus. One key goal was to bring my business experiences to the classroom, but the nuances of teaching a broad set of concepts required a great deal of focus.  Since this class included several team-based projects, I spent a lot of time coaching the teams on how to successfully complete their assigned presentations and projects. The improvements the students made in areas such as making presentations and preparing business plans demonstrated how they were able to take the concepts of the class and apply them to entrepreneurial projects.
  5. My business focus has been heavily in the business-to-business (B2B) arena, but most of the examples used in this class were business-to-consumer (B2C), since the major projects were tied to the consumer retailing giant TJX (the owner of stores which include Marshalls, TJ Maxx and Homegoods).  I enjoyed applying my marketing background in this somewhat different business context.

In summary, the transition from conducting business to teaching business has proven to be a career change with many challenging elements.  I enjoy the work, especially when the students progress in learning many new skills over the course of a semester.  I also like the university environment; Northeastern treats part-time faculty and staff as professionals and I’ve enjoyed my interactions with other professors and the support staff team. So this career change has been a positive one for me and might also work for other business professionals who’d like to apply their career skills in a university environment.

Have you considered a career change which leverages your experiences in a different way?  If you’ve had similar experiences in re-imagining your career or are contemplating such a move, I’d love to hear from you on LinkedIn.

Contact Centers Get Smarter

Contact centers have evolved consistently over the past two decades and always seem to be utilizing a mix of both old school voice technologies and newer solution elements.  My exposure to contact centers has been as a customer, as a product manager for related connectivity products and as a contributor to an important SIP-based IETF standard for the collection of user information. I also get to hear war stories on a regular basis from people I know who work in call centers.

One of the newest trends is to add Artificial Intelligence (AI) into the mix.  Google has recently announced their Cloud Contact Center AI solution and it’s described in some detail within a blog post on the Google web site.

Google themselves aren’t in the Contact Center solution market (yet!) and this solution is designed to complement solutions from other providers. There is a rich history of solutions provided by companies such as Avaya, Genesys, Cisco and many others that were originally all premise-based, but contact center solutions increasingly have been moving to the Cloud in recent years.  A review of the blog post noted above shows an interesting mix of how AI is injected into the fray.  Contact centers are a people-intensive business, as agents take incoming calls and customers get queued up until an agent is available.  Google’s Dialogflow development tool enables contact center providers to create automated virtual agents that can take incoming calls and use a combination of Interactive Voice Response (IVR) tools and access to databases to start interacting with incoming callers.  There are limits at this point, but tools such as Virtual Agent (shown as being in a Beta status) can start analyzing the caller’s needs, answer some questions and determine if a handoff to a live agent is needed.
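
As a rough illustration of the triage logic described above (answer the simple questions automatically, hand off to a person otherwise), here is a minimal sketch in Python. It is not Google’s actual API; the intents, canned answers and confidence threshold are all assumptions made for the example.

```python
# Simplified virtual-agent triage: answer well-understood intents directly,
# hand off to a live agent when confidence is low. All intents, answers and
# the threshold below are hypothetical.
HANDOFF_THRESHOLD = 0.6

CANNED_ANSWERS = {
    "store_hours": "We are open 9am to 9pm, Monday through Saturday.",
    "order_status": "Your most recent order shipped yesterday.",
}

def classify(utterance: str) -> tuple[str, float]:
    """Stand-in for an NLU service that returns (intent, confidence)."""
    text = utterance.lower()
    if "hours" in text:
        return "store_hours", 0.92
    if "order" in text:
        return "order_status", 0.81
    return "unknown", 0.20

def handle_call(utterance: str) -> str:
    intent, confidence = classify(utterance)
    if confidence >= HANDOFF_THRESHOLD and intent in CANNED_ANSWERS:
        return CANNED_ANSWERS[intent]           # virtual agent answers directly
    return "Transferring you to a live agent."  # low confidence: hand off

print(handle_call("What are your hours?"))
print(handle_call("I want to dispute a charge on my bill."))
```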

Another new tool is called Agent Assist.  Assuming that some of the incoming calls eventually do need to connect to a human agent, this tool (shown as being at Alpha level) can augment the agent’s progress through the conversation by providing tips such as relevant articles or other shortcuts.

The big picture here is fascinating. There’s been a long term debate about whether AI should replace human roles or augment human capabilities.  Walter Isaacson’s book, The Innovators, has some interesting discussion from AI experts on both sides of this argument.  Google’s Cloud Contact Center AI pursues both directions.  At the business level, contact centers employ a lot of humans and need to assist many customers via tools which can include voice, chat, speech recognition and much more.  Customers want answers or perhaps want to make a purchase. So whether AI is used to deal directly with the customer’s needs or to help an agent get to answers more quickly, it’s a win for the customers. For the companies that deploy contact centers, AI offers another approach to get more productivity out of the investment they have already made in contact center solutions and in the agent resources that utilize these solutions. Human call agents do add value in this equation, particularly when the issues are complex or emotions come into play, so don’t expect these virtual assistants to eliminate those roles. But over time, the trend is likely to be toward making more use of AI at various stages of the customer interaction.

If you or your company is active in the contact center eco-system, feel free to weigh in with your comments.  If your company would like advice on how trends like AI will affect strategies for providing contact center solutions, you can reach me on LinkedIn or at our web site.

Leveraging Industry Standards for Success – A Case Study

Telecom is an example of an industry that has created national and international standards for communications in ways that benefit companies large and small. As a consultant, I’ve frequently advised companies about specific standards and how they can be aligned with business strategies. Let’s consider an example.

During the last 15 years, facsimile communications has been dramatically affected by technological forces.  The circuit-switched network is being replaced by IP networks throughout the world, as I noted in a previous post.  As a result, all fax communications companies have had to develop a strategy for the transition to IP. The vendor community anticipated this in the late Nineties and key new standards for sending fax messages over IP were developed by the Internet Engineering Task Force (IETF) and the International Telecommunications Union (ITU). The IETF focused on integrating fax with Internet email and the ITU split its efforts between supporting the IETF Internet Fax standards by reference (T.37) and devising a new standard for real-time fax communications (T.38).

Standards adoption often takes time and such was the case for IP fax. There were some early adopters of the email based approach (for example, Panafax and Cisco), but despite backing by both the ITU and IETF, the market didn’t take off. One big reason was the emergence of voice communications over IP (VoIP), primarily based upon the IETF’s Session Initiation Protocol (SIP), which gained increasing momentum during the first decade of the 21st century.

Several of us in the ITU and IETF took a small but critical step which allowed T.38 IP fax to ride this wave. In the year 2000, we completed an annex to T.38 which specified how it could be used with SIP.  As a result, when implementors wanted to add fax support to their SIP-based Voice over IP solutions, the steps required to enable a Voice over IP session to spawn a T.38 fax session had already been specified in a T.38 annex. During this same period, Voice over IP gateways were emerging as the preferred approach to connect the existing circuit-based network to the emerging IP network based on SIP. Cisco and other gateway manufacturers such as Audiocodes and Cantata (later renamed Dialogic) cut over to T.38 as their favored solution to support fax over IP.  The fax board manufacturers such as Brooktrout (later Dialogic) followed suit and T.38 became the most widely adopted solution for Fax over IP.  The use of T.38 for IP fax was also supported by the Third Generation Partnership Project (3GPP) for 4th generation mobile networks and by the SIP Connect initiative for SIP Trunking driven by the SIP Forum.
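
For a sense of what that annex makes possible in practice, the sketch below builds the sort of SDP offer a SIP endpoint could send in a re-INVITE to switch an established voice call over to a T.38 fax stream. The address, port and attribute values are illustrative placeholders, not a definitive interoperability profile.

```python
# Minimal sketch of an SDP body offered in a SIP re-INVITE to move an
# established voice call to a T.38 fax session. Address, port and attribute
# values are placeholders chosen only for illustration.
def t38_sdp_offer(ip: str, port: int) -> str:
    lines = [
        "v=0",
        f"o=fax-gw 2890844526 2890844527 IN IP4 {ip}",
        "s=T.38 fax session",
        f"c=IN IP4 {ip}",
        "t=0 0",
        f"m=image {port} udptl t38",   # replace the audio stream with T.38
        "a=T38FaxVersion:0",
        "a=T38MaxBitRate:14400",
        "a=T38FaxRateManagement:transferredTCF",
        "a=T38FaxUdpEC:t38UDPRedundancy",
    ]
    return "\r\n".join(lines) + "\r\n"

print(t38_sdp_offer("192.0.2.10", 49172))
```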

When I was advising my fax industry clients in the late Nineties, I suggested they keep a close eye on the trends in both fax over IP and Voice over IP in deciding upon their product directions. At that time, the IETF standards for Internet Fax via email had the early momentum, but in the standards community, we kept working on both the email and real-time IP fax solutions. As noted above, ensuring that T.38 could eventually be used with SIP in a standards-based solution became very important as Voice over IP became a much bigger industry trend than Fax over IP.  As a result, fax solutions that would work over the emerging Voice over IP networks became successful and are still being sold by many communications vendors today. The story didn’t stop there. Other important trends have emerged in recent years, such as the need for enhanced security and the transition from physical products to software-based solutions in the Cloud, which communications vendors need to bake into their strategies going forward.

If you have been in a business scenario where leveraging industry standards helped your company’s products gain success, please feel free to weigh in with your comments. If you’d like to explore strategies on how to evolve your company’s solutions and leverage current or potential industry standards, you can reach me on LinkedIn or on our web site.

What can the Internet of Things Market Learn from Telecom?

The Internet of Things is widely perceived as a hot market and has the usual hockey stick projections of massive growth laid out by market researchers such as Gartner and IHS Markit.  In this post, I’d like to consider one broad slice of the IoT market, the Industrial Internet of Things (aka IIoT), which applies IoT technology to address business problems. Let’s also consider if the IIoT industry could benefit from lessons learned by an adjacent market, Telecom.

Last year, Dialogic, the company where I worked in Product Management, started looking at the Internet of Things (IoT) as a potential market where some of our telecom expertise could come into play.  I wrote about my experience in exploring a product concept for an IoT gateway in a post earlier this year.

The IIoT market has had good success so far by tackling individual problems within vertical markets and spinning up solutions.  There have also been attempts to create a de facto standard architecture for IoT, such as the platform architecture specification developed by Intel.  One of the challenges for IIoT is the proliferation of different vertical markets. In my last role at Dialogic, I talked with several companies that support monitoring applications in areas as diverse as home health care, security and vehicular emergency services. These companies are prime candidates to use IIoT technologies, but their current implementations often run over the circuit-switched network, make extensive use of proprietary technologies and sometimes use dial-up modem connections. So a common challenge for many of these companies is the need to move forward and evolve solutions that will be well suited to the emerging technological environment of IP-based networks and take advantage of newer software approaches such as virtualization.

So what could these IIoT companies learn from the experience of telecom solution providers?  Part of the answer may be to look at the standards-based toolkit that has emerged as Telecom has swapped out its own network. Both mobile and fixed telecom networks have moved over to IP and solutions are usually built using the Session Initiation Protocol (SIP).  Mobile networks are currently a hybrid of IP and circuit-switched technologies, but the data portion of the network that would apply for many IoT solutions has been all IP since 3G networks were implemented and has been greatly enhanced with the fourth generation Long Term Evolution (LTE) networks. SIP has been widely used for voice and other applications, but other technologies such as WebRTC have emerged which provide standards-based approaches to build applications which support both multimedia (such as voice or video) and data.  The common element of the next-gen IP networks, SIP and WebRTC mentioned above is that they were built using standards approved by international bodies such as the Internet Engineering Task Force (IETF) and the Third Generation Partnership Project (3GPP).

One of the key challenges for companies building IIoT applications and infrastructure will be to have solutions that can scale from very small to very large implementations and to use approaches which don’t need to be revamped to address markets in different countries. The leaders of IIoT have been able to solve customer problems from the bottom up, but the challenges of scale and of addressing multiple geographic markets will benefit greatly from the use of standards.  These may include existing standards, such as those which support mobile connectivity, as well as new standards which are well targeted to solving IoT problems and which enable the development of eco-systems that will promote best-of-breed solutions.  In summary, the use of existing and new standards is a way that Industrial Internet of Things providers can leverage lessons learned in the telecom industry to massively expand the available market for their solutions.

If you or your company are participants in driving change in the Industrial Internet of Things market, feel free to weigh in with comments. If you’d like to explore strategies on how to evolve your application solutions or other IoT products and services to leverage standards and better address the rapidly changing industrial environment, you can reach me on LinkedIn or on our web site.

More Business Disruption: Telecom’s Move to IP

In the late Nineties, the Telecom business was dominated by big companies who had built their phone network over many years using switching technology. But a massive storm was on the horizon as the same IP technology which helped revolutionize commerce on the world wide web started to be applied to phone-based voice communications. Early attempts at Voice over IP were primarily targeted to the long distance market. International long distance calling was expensive, so a number of startups began to bypass the traditional long distance network with a much lower cost IP network. The quality wasn’t great, but the price per call over international routes dropped dramatically and IP infrastructure and solutions gathered momentum.

The early leader in IP protocols for voice was the H.323 protocol developed within the traditional standards group for phone networks, the International Telecommunications Union (ITU). But competitive protocol models were also on the rise. The Internet Engineering Task Force (IETF) developed a new IP communications protocol, the Session Initiation Protocol (SIP), and both the IETF and ITU worked on a softswitch protocol called Megaco (later standardized by the ITU as H.248).

Around 2001, two important organizations endorsed SIP and the train which would ultimately displace much of the traditional switched phone network was set in motion. Microsoft had been an early user of H.323 and had added it to their instant messaging client support and included multi-point data sharing using T series protocols from the ITU. But Microsoft decided their future communications would be SIP-based and quickly phased out use of H.323. Then, the Third Generation Partnership Project (3GPP), a standards group which had specified the very popular second generation wireless protocol GSM, said that they would be using SIP to build their next generation network and shift both data and voice services over to IP.

But first, the core SIP protocol needed to be finished. IETF participants likely spent millions of man-hours devising an updated version of SIP, which was standardized in June 2002 as RFC 3261, along with four other RFCs for related methods and operations. But this was just the beginning. In the time since, the IETF has produced at least 100 SIP-related documents, either standards track or informational, to guide SIP developers.

On the business side, it took quite a while, but the current public phone networks have largely cut over to IP, although there are still elements of the switched network in place.  In the world of mobile communications, the fourth generation network specified by 3GPP was the first to use SIP in its core. The related Long Term Evolution (LTE) network has been deployed throughout the world, although the voice portion of the network (Voice over LTE) has lagged behind. The move to LTE and SIP has required a massive investment in new capital equipment and software by mobile service providers and most of that deployment dates from about 2012. The industry also experienced lots of turmoil during the period between 2001 and 2012.  One of the biggest equipment vendors, Nortel, declared Chapter 11 and chunks were sold off to other companies before the company went out of business. Many of the remaining vendors have gone through multiple mergers and acquisitions, greatly reducing both the number of telecom-related companies and the number of employees.

The other major SIP endorser from 2001, Microsoft, has shifted its IP voice communications strategy numerous times, but one of its flagship offerings, Skype for Business, is predominantly based on SIP.  Microsoft’s use of SIP is primarily within enterprises, though the company has also been a strong advocate of SIP Trunking, which enables enterprises to connect to the service provider IP phone network. In the meantime, Microsoft has many competitors in the enterprise voice and communications space, but SIP remains a dominant technology. Vestiges of circuit-based phone systems remain, but all of the major players have long since switched their current product and service offerings to be IP-based.

IP and SIP are doing well, but voice is now a much smaller portion of the communications business and service providers make much of their money from data services. The era of premise-based equipment is also winding down, as the shift to IP has enabled companies to move both service provider and enterprise applications to the massive conglomeration of servers known as The Cloud. I’ll be writing more in future posts about lessons learned from the Telecom move to IP and on how the move to the Cloud is also causing major business disruptions.

If you or your company participated in the Telecom move to IP, feel free to weigh in with comments. If you’d like to explore strategies on how to evolve your application solutions or other products and services in the face of rapid business and technical change, you can reach me on LinkedIn or on our web site.

Business Disruption in Document Communications – What Happened?

In the late 1990s, the Internet and the World Wide Web created massive technical disruption for the worlds of document communications and messaging. Now, nearly twenty years later, business communications looks much different than it did going into the Millennium, and once-major businesses, such as the marketing of enterprise fax machines, are deep into their long-tail phase. In my last post, I noted several trends in both fax and email as the related standards communities pushed to transform these technologies for the new IP world. Let’s look at what happened.

One major driver of the success of fax in the Nineties was the classic network effect as postulated by Ethernet inventor Robert Metcalfe. In essence, Metcalfe observed that a network becomes much more valuable as the number of connected devices increases, since the number of possible connections grows far faster than the device count.  In the Nineties, the fax machine vendors and computer fax companies were often on opposing sides in technical battles, but all of these companies benefited from Metcalfe’s network effect as it applied to the overall fax network. But as we crossed into the 21st century, fax machines designed to run on the circuit-switched phone network (aka the Public Switched Telephone Network or PSTN) had much less utility in an increasingly IP-connected world. As a result, physical fax machines began to disappear from larger enterprise offices, and in smaller offices they were often replaced by less expensive multi-function peripherals (MFPs), which were basically printers that also included fax and scanning features. This meant that the total number of Group 3 fax devices at first plateaued and then began a decline. In essence, Metcalfe’s network effect played out in reverse. The fax machines and MFPs of the Nineties did not evolve to use the new IP fax standards, so as document communications moved to IP, these physical fax or MFP devices still only sent faxes over the PSTN and became less connected as IP communications grew more prevalent.
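
A minimal sketch of that network effect, with device counts invented purely for illustration: the value of the fax network is proxied by the number of distinct device-to-device connections, which grows roughly with the square of the installed base, and shrinks the same way when the base declines.

```python
# Sketch of Metcalfe's network effect for a fax-style network. The device
# counts below are hypothetical and chosen only to show the shape of the curve.

def possible_connections(n_devices: int) -> int:
    """Distinct device-to-device links: n * (n - 1) / 2."""
    return n_devices * (n_devices - 1) // 2

for year, devices in [(1995, 10_000_000), (2000, 20_000_000), (2010, 12_000_000)]:
    print(f"{year}: {devices:,} devices -> {possible_connections(devices):,} possible connections")

# Doubling the installed base roughly quadruples the possible connections;
# a shrinking base plays the same effect out in reverse.
```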

If we consider the trends in computer-based fax, they played out differently. Companies like Brooktrout sold fax boards to independent software developers and the boards were incorporated in local area network solutions. These solutions also typically included tight integration with email.  By 2004, Fax over IP enabling technology started to be commercialized, using the ITU-T T.38 IP fax standard. T.38 had some technical issues, but it could use the same call control protocols — SIP, H.323 and H.248 — that were being adopted by the new Voice over IP networks, so T.38 became a popular choice for conveying fax over these VoIP networks. By contrast, the T.37 approach of Internet Fax over email did not get much adoption, most likely because it didn’t mesh very well with Voice over IP.  The computer-based fax solutions that ran on Local Area Networks continued to have healthy growth in the first decade of the 2000s, in large part due to the continued validity of fax as a legal document, perceived security advantages compared to email over the Internet, a slow ramp-up in the use of digital signatures on other electronic documents, and regulations such as the Health Insurance Portability and Accountability Act of 1996 (HIPAA), which meshed well with receiving fax documents in electronic form (rather than on a paper tray).

During the same period, email use continued to grow, but rising issues such as lack of security and massive amounts of spam made the use of email outside of corporate networks subject to a number of hassles. As noted above, electronic signatures started to become available as a legal alternative to fax signatures, but didn’t gain widespread use until the past few years. As a result, enterprises tended to standardize on a particular commercial email package and to communicate whenever possible over secured private IP networks, making use of security tools such as Virtual Private Networks (VPNs).

Now, in 2018, the messaging world is highly fragmented. Large enterprises have tended to choose unified communications eco-systems from large players like Microsoft, Cisco and Avaya, but even these solutions are rapidly evolving as the momentum shifts toward pushing enterprise communications into the Cloud.  Hence, Microsoft is shifting its emphasis from Lync to Skype for Business and now on to Teams, and other vendors such as Cisco are doing much the same.  Upstarts such as Slack have started by offering cloud-based team communications and have forced reactions from the traditional Unified Communications players.  As messaging has evolved, voice is becoming less important and fax is now more of a niche play.  One thing I don’t see too much of is the use of business communications that can effectively cross the boundaries between organizations. In theory, Cloud-based communications could get us there, but the late-Nineties vision of being able to communicate documents and other types of media effectively across the entire Internet has been hobbled by security, privacy and spam issues. We’ll have to see if the Cloud and better cross-network security mechanisms could form the foundation for approaches that will be superior to today’s highly balkanized communications landscape.

If you or your company have participated in the massive changes to the communications eco-system since the 1990s, feel free to weigh in with comments. If you’d like to explore strategies on how to evolve your application solutions or other communications products and services to better address the rapidly changing business environment, you can reach me on LinkedIn or on our web site.

A Tale of Business Disruption in Document Communications

In the middle of the 1990s, the Internet and its associated IP protocols were like a huge wave that was off the shore of the business world, but poised to come in and cause massive disruption. At that time, I ran a consulting business for telecom clients (Human Communications) and was active on several fronts in preparing for what was coming.  In the TR-29 fax standards committee, we started work on how fax communications could take place over the Internet. A small group began work on an initiative called Group 5 Messaging, whose goal was to take the best ideas of fax, email and telex and spin up the next generation of business communications. In late 1996, the Internet Engineering Task Force (IETF) held an informal Birds of a Feather (BOF) session on Internet Fax.  In meetings of Study Group 8 of the International Telecommunications Union (ITU), discussions began on how to extend fax protocols to work over the Internet or on private IP networks.

On the business side, fax was very hot and even very small businesses such as pizza parlors had purchased fax machines. Corporations had been adopting fax over Local Area Networks, and companies like Rightfax, Omtool, Optus and Biscom had very healthy businesses selling into this space. Brooktrout Technology had introduced multi-channel fax boards and drivers for Windows NT, and had built up market momentum that enabled the company to go public. But all of this fax technology was based on sending faxes over circuit-switched networks. What would be the impact of the Internet and its technology on fax and business communications?

By 1999, the business communications landscape had changed dramatically. On the standards front, the IETF had created several standards for providing fax services via email and the ITU had referenced these standards in the T.37 standard. The ITU had also independently created a new T.38 standard which essentially extended the T.30 Group 3 fax protocol into the IP packet world. The Group 5 initiative had lost momentum, as the fax and other communications players lined up to support the new IP-based standards from the IETF and ITU, which appeared to solve the problem of how to send faxes over IP.  Related standards work continued and I was active in making sure that the new T.38 fax protocol was supported under both the existing H.323 call control protocol and under the new SIP and Megaco (later H.248) protocols.

On the business side, fax was still doing well, but now had new competition. The advent of the World Wide Web had totally wiped out the Fax on Demand business that had done well in the early Nineties. Various pundits were saying that email was the future of business communications and that new portable document formats like the PDF from Adobe would be used in place of fax.  Curiously, the email experts who participated in the IETF Internet Fax work weren’t so sure. Fax had business quality of service elements which were hard to duplicate in email — notably instant confirmation of delivery at the end of a session, negotiations between the endpoints on what document formats were acceptable and the legal status of fax, where fax messages over the circuit network were accepted as legal documents for business purposes.  The IETF work group tried to upgrade email protocols to address the technical elements, but the work was hard and the path to adoption slow.

I also shifted my career and suspended my consulting business to join Brooktrout Technology and help them participate in the new Voice over IP business. But just before I left my business, I advised my fax clients and newsletter subscribers to get diversified and not put all of their eggs in the fax communications basket.  I saw both challenges and opportunities ahead. There had been a large number of new startups that had attempted to ride IP fax to success in the late Nineties, but most of them crashed and burned within a couple of years. E-Fax had introduced “free” IP fax mailboxes and that approach was quickly emulated by competitors, but the business model for “free” wasn’t obvious.  I’d helped form a new industry association called the Internet Fax and Business Communications Association in early 1999, but we had difficulty getting fax and other communications industry vendors to sign on. The times were turbulent and the way forward was less than obvious.

In my next post, I’ll talk about how the trends toward IP Fax and its communications competitors played out and which related business communications issues still need to be addressed.

If your organization has participated in the evolution of fax or other business communications during this transition from the circuit-switched phone network to IP, please feel free to comment. If you’d like to explore strategies on how to evolve your application solutions or other communications products and services in this rapidly changing business environment, you can reach me on LinkedIn or on our web site.

Virtual Software: Changing Business Models

One of the best texts I’ve ever read about business models was written by Cory Doctorow, a famous writer and entrepreneur. His novel Makers was not only a great story, but virtually a doctoral thesis on how business models can change and have a radical impact on everything they touch.

A couple years ago, I helped launch a new virtualized software product line for Dialogic. The PowerVille™ Load Balancer was different in many ways from other products I’d managed. The software was totally agnostic to the underlying hardware, courtesy of a Java code base which was highly portable to multiple topologies. As a result, it fit nicely into a variety of virtual environments and also was poised to make the leap into emerging Cloud architectures, in line with trends like the emerging Virtualized Network Function (VNF) and approaches like the use of HEAT templates for configuration.

A few months into the launch, my manager and I talked about how to take this product to the next level and realized that we needed different business models for this kind of product. The traditional load balancer provided by industry leaders such as F5 was built on top of proprietary hardware platforms, and the business model followed suit. Pricing was typically based on a purchase, where all of the hardware (and software) was purchased upfront, accompanied by a service agreement which was renewed year by year.  This approach is often called the perpetual model.

But with the Cloud taking over, customers were looking for different answers. Cloud services such as Amazon Web Services (AWS) and lots of industry software had moved to subscription or usage-based business models. For example, if you buy a subscription to a software product like Adobe Acrobat, you get the right to use the product so long as you keep paying the monthly subscription fees. Amazon went further. You can buy rights to AWS services and only pay for the usage of the Cloud infrastructure you have purchased. In the world of virtual services, this permits customers to scale up for high-usage events—think of the capacity needed to support online voting via text for a television program like American Idol—and then scale back down as needed, perhaps even to zero.

We considered these kinds of changes for the Dialogic load balancer, but other virtual software products at the company ended up taking the lead in becoming available under subscription or usage-based models. The implications were huge. Sales reps loved the perpetual model, since they’d get a big chunk of commissions every time they sold a big box.  In a subscription or usage-based model, the revenue—and the commissions—move to a “pay as you go” model. Hence, there is no big upfront commissions payout and you need to keep the customer happy to get that recurring revenue stream. Finance executives, by contrast, now had a revenue stream which was less bumpy, since there was somewhat less incentive for Sales to go out and close those end-of-quarter deals. Customers also like the flexibility of subscription models. Typically, they may pay more over the long haul than with the perpetual model, but they also have the option to change to a new product or service mid-stream. In summary, the move to virtual software and related technical innovations such as Software as a Service (SaaS), Infrastructure as a Service (IaaS), or by extension Anything as a Service, is likely to drag in new business models.  These new business models change the finances on both the customer and vendor side and not everybody will be pleased with the results, but momentum for these trends continues to grow.
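
To show the trade-off in rough numbers, here is a back-of-the-envelope comparison of cumulative cost under the two models; every price below is invented for illustration and is not based on Dialogic’s actual pricing.

```python
# Hypothetical perpetual vs. subscription cost comparison. All figures are
# made-up illustrations, not real product pricing.
PERPETUAL_LICENSE = 100_000      # one-time license fee, paid up front
ANNUAL_SUPPORT = 18_000          # yearly service agreement on the perpetual model
MONTHLY_SUBSCRIPTION = 3_500     # all-in monthly subscription price

def cumulative_cost(years: int) -> tuple[int, int]:
    perpetual = PERPETUAL_LICENSE + ANNUAL_SUPPORT * years
    subscription = MONTHLY_SUBSCRIPTION * 12 * years
    return perpetual, subscription

for years in (1, 3, 5, 7):
    perpetual, subscription = cumulative_cost(years)
    print(f"year {years}: perpetual ${perpetual:,} vs. subscription ${subscription:,}")

# The subscription starts out far cheaper, then overtakes the perpetual model
# after a few years -- the "pay more over the long haul" trade-off noted above.
```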

If your organization has participated in the evolution from perpetual to subscription or usage-based business models, please weigh in with your comments. If you’d like to explore strategies on how to evolve your application solutions or other communications products in this rapidly changing business environment, you can reach me on LinkedIn or on our web site.