The Future of IoT
Leveraging the Shift to a Data Centric World
by Don DeLoach, Emil Berthelsen and Wael Elrifai

With an introduction by Charles Paumelle, CMO, Microshare.io

A new book from three Big Data thinkers, Don DeLoach, former President and CEO of Infobright, Emil Berthelsen, Research Director at Gartner, and Wael Elrifai, a speaker and IoT entrepreneur, captures the essence of the IoT revolution and communicates it in the clearest, most concise language I’ve yet come across. We are proud at Microshare.io and the Microshare Economy blog to be able to provide an excerpt for our readers.

The coming change to the way we view data and business generally is best illustrated by a personal example: I have been using my personal Fitbit device for a while to track my activity. The information is transmitted from the Fitbit device to the app on my phone, and I have allowed a number of friends to see my data too – so we can encourage each other. Recently, I have also chosen to give my insurance company access to my Fitbit data in return for perks generated by my positive behavior (regular exercise). Slowly but surely, I am making more and more decisions about this new information (data) I am generating, whom I choose to share it with, and why, but I want to keep control of that data.

Today, there are millions of companies starting to generate and use new data much as I do. There are many product suppliers (like Fitbit, but for the enterprise market) building individual solutions for very niche markets, helping companies collect important data for their own use (for example, connected leak detection systems sold to insurance companies, or remote waste level sensors for facilities management providers).

The shift these authors describe is one where we all think beyond our personal or business use and look at how the data we capture might be useful to other people and companies, therefore generating additional value from that same data (the authors use the term “utility value”).

For example: a municipality may gather data on the use of its parking lots in order to monitor current requirements in the community, make decisions about pricing, future development, etc. Today, that municipality can very easily share that data with the public in real time, increasing the chances that drivers will find a parking spot. The result is a happier ‘customer’ and the cost is relatively low.
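The parking example above can be sketched in a few lines. This is a minimal, hypothetical illustration: the lot names, the sensor readings, and the feed shape are all invented for the sketch, and the key design point is that only derived counts are published, while the raw sensor data stays internal to the municipality.

```python
import json
from datetime import datetime, timezone

# Hypothetical occupancy readings from connected parking sensors:
# lot id -> (occupied spaces, total capacity).
readings = {
    "lot-north": (142, 180),
    "lot-central": (75, 80),
    "lot-station": (40, 120),
}

def build_feed(readings):
    """Build a public real-time availability feed from raw occupancy data.

    Only derived availability counts are published; per-sensor data
    (and anything sensitive, like camera footage) never leaves the system.
    """
    return {
        "updated": datetime.now(timezone.utc).isoformat(),
        "lots": [
            {"id": lot, "available": total - occupied, "capacity": total}
            for lot, (occupied, total) in readings.items()
        ],
    }

print(json.dumps(build_feed(readings), indent=2))
```

A real deployment would serve this JSON from an API endpoint or push it to a public data portal, but the shape of the decision is the same: the data owner chooses which derived view of the data to share.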

Some of the biggest issues in relation to sharing all that data are compliance, regulation and data protection. The authors tackle this subject head-on – and provide an elegant concept they’ve called ‘first receiver’. The idea is that the first receiver of the information decides what to do with it. They can either share it with a named party, or publish it and allow other people to ‘subscribe’ to it (as in the case of the parking space example above). Critically, the information collected must be handled responsibly. For example, sharing data about the number of parking spots available might be fine, whereas sharing video footage of cars going in and out of specific parking lots might not. Anyone trying to build an IoT strategy or the infrastructure to support it would benefit from reading this book.

Introduction: From Smart Connected Products to System of Systems, Shifting Focus to the Enterprise

At this point, much has been written about the Internet of Things. Many who weren’t paying attention before have become enthralled, while others may be growing tired of hearing yet another “oh so smart product” story. Smart cars, smart grids, smart homes, smart watches, and more are filling our airwaves, our stores, and our lives. So why another book on IoT? For the same reason people still write books about IT architectures, or real estate, or raising kids. We evolve. And while considerations about real estate and raising kids may change slowly over time, the path forward tends to be more aggressive when technology is involved. Erik Brynjolfsson and Andrew McAfee’s recent book The Second Machine Age did a wonderful job of highlighting how the basic advances in physics associated with technology and the combinatorial element of new technological ideas create an almost hyper-acceleration of technological advance. So, the world in 1980 may have looked entirely different from the world in 1960, the world of the nineties was more different still from that of the eighties, and by 2000 it had changed on an even more radical scale. And yet, the progress since 2000 is a quantum leap from all of history up until then, and in five years, we will see another quantum leap forward still.

There have been certain critical steps along the way. The semiconductor and the microchip, to be certain, were two of them. The computer and then the personal computer were two more. Cellular technology, for certain. And then there was the Internet, which, in and of itself, was spawned from an initial military and academic effort (as many are) called ARPANET. But when Sir Tim Berners-Lee developed the “map,” if you will, called the World Wide Web, and Marc Andreessen at the NCSA at the University of Illinois developed Mosaic as a graphical, intuitive interface to the web, the Internet had its underpinnings to move from the realm of what was possible to what was practical, and they paved the way for the web becoming mainstream. Technology becoming mainstream defines the beginning of the wave.

To reduce this discussion to the limited examples here does not do justice to the evolution of technology, but it conveys the idea that certain seminal ideas take root in a way that affects our society at large. What’s more, these waves actually occur in phases of maturity. The Internet, which started as not much more than an insecure vehicle for looking at marketing brochures, has evolved into an elaborate system where thousands, if not millions, of applications are used by billions of people each day. Use cases ranging from ERP to sales force automation to banking to photo sharing, once restricted to either physical process or applications that would never be considered for external hosting or deployment over the Internet, are now the rule and not the exception, but this didn’t happen overnight. It matured in phases.

Now we have the Internet of Things. In some ways, it too has a long history. The IoT, as we know it, began to see life around 2008 to 2009. Kevin Ashton from MIT is largely credited with coining the term and has been a key driver since then. By 2011 it picked up steam. Some called it “the Semantic Web” (namely, the W3C organization founded by none other than Sir Tim Berners-Lee himself), some called it “M2M (Machine to Machine) Communications,” some called it the “Internet of Things.” There were other names as well. And if you were looking, you would have seen more and more buzz about it in the market. There was the “Peggy Smedley Show” that interviewed person after person, company after company doing things in M2M, and increasingly IoT. There were books like Daniel Kellmereit and Daniel Obodovski’s The Silent Intelligence. And there were companies beginning to place big bets that this would become the “new new thing.”

One of the most prominent was IBM and their “Smarter Planet” initiative. From 2012 to 2014, we saw the likes of Cisco, GE, Siemens, Ericsson, PTC, and numerous others going all in on IoT. And while IBM was early to the game, Cisco, PTC, and GE were undoubtedly all in. Cisco framed this with their “Internet of Everything” (IoE) initiative, and authored an oft-quoted paper projecting 50 billion connected devices by 2020. GE went all in as well, ranging from their general advertising around the new connected world, to establishing a software excellence center in San Ramon, California, to the introduction of GE Predix, the “operating system for machines” to talk to each other, and the corresponding ad campaign centered on attracting new talent for the effort. Then there was PTC. The company had evolved from its origins in computer-aided design (CAD) and Product Lifecycle Management (PLM) through acquisitions such as Computervision, which had previously absorbed Prime Computer; its lineage was all about product design and development. If you were building products, PTC probably knew you, and you them. PTC, especially their CEO Jim Heppelmann, seemed to embrace the Internet of Things and in 2013 purchased ThingWorx, an IoT development platform. Many would soon suggest they overpaid, but the fact remained they were seemingly committed to the space. They reinforced that with the acquisitions of Axeda (Machine Cloud), ColdLight (predictive analytics) and others, basically doubling down on their IoT focus.

In late 2014, Jim Heppelmann, along with renowned Harvard Business School professor Michael Porter, co-authored what would become a seminal article about the Internet of Things in the Harvard Business Review. In it, they described what they saw as the likely evolution of the Internet of Things. The progression seemed to make sense to us, as it did to many. Simply stated, there were five specific stages of IoT’s evolution. The first was the “product” stage. Simple enough. An air conditioner is an air conditioner. We get it. The second stage was the “smart product” stage. So now we have a programmable air conditioner. As technology evolved, it became possible to put computational power, even if it was minimal, into products like an air conditioner. In doing so, the air conditioner might be capable of adapting to certain conditions, or changing settings based on time of day. This was quite an advance.

Then came stage three, which is “smart connected products.” Now the air conditioner is accessible by the Internet. This has big implications. You can control your air conditioner from your phone. The company providing the air conditioner can look at its operation at a component level, and compare that to millions of other units to do predictive maintenance, so when it “sees” certain markers in the data, it can dispatch a service agent. This is a critical stage. In many ways, this is where much of the market focus has been. This is where, in 2017, a great deal of the focus and understanding remains, but it won’t be for long.

The subsequent two stages, as outlined by Heppelmann and Porter, are more meaningful than most people seem to realize, and they indicate where the broader market is heading. This is where focus will begin to shift from the product provider to the enterprise. The fourth stage is “product systems.” This is where the smart thermostat talks to the connected HVAC and the smart window blinds and heated floors. In some ways, this is common sense. It’s reasonable that all functionally related elements should be “connected” to “talk to each other,” or more specifically, interoperate. One of the earliest indicators of the inevitability of this phase was Google’s Nest division integrating with Big Ass Fans, after which the Nest thermostat communicated with the smart connected Big Ass Fans. The battle over the smart home is one of the early proving grounds of this stage.

The practical (or impractical) reality of smart connected products in the home suggested there was a need for them to work together, so key industry players began to jockey for dominance. This pertained to the communications standards, as well as the ultimate command and control platforms ranging from Apple HomeKit to Amazon Echo to Google Home, Samsung SmartThings, and others. The AllSeen Alliance (primarily driven by Qualcomm) got involved to broker standards for consumer IoT as well. And while the focus today in most elements of IoT is still largely on smart connected products, the progression to product systems is clearly happening.

Larger players, like GE and Hitachi, bringing forward solutions like GE Predix and Hitachi Lumada, further demonstrate this. In some ways, these are the industrial/commercial equivalents to HomeKit or SmartThings. This might also suggest that if you have a factory or hospital or distribution center running Predix or Lumada, you might benefit from having, respectively, GE or Hitachi smart products connected into these platforms. And while these will allow other products to coexist, there is an implied level of tighter integration when you connect a specific company’s products into its platform. This is not to suggest that Predix and Lumada are equivalent. There are many differences, but both are designed with a larger ecosystem in mind.

This isn’t new. We have seen Apple leverage iTunes for years, where iPods, iPhones, and more all very seamlessly coexist. They will work with Windows, but is that really what you want? Apple makes billions of dollars by leveraging the accessibility of their products to meet a variety of their customer needs. They make it easy. That is a big play for GE, Hitachi, and others, but before we explore this further, let’s look at the fifth and final stage.

“System of systems” is the fifth and final stage. Think of this as your home appliances talking with your home security talking with your home entertainment talking with your car and your wearable devices. The reasons for this will be beyond obvious soon enough, but consider what you can begin to do if the data—the information—from these devices can be shared. Imagine appliances working with energy systems in order to coordinate maximum efficiency. This includes recharging electric cars at night. The HVAC isn’t only looking at temperature, but air quality, and based on input from your wearables, may adjust such things as temperature, humidity, and airflow to suit your preferences. The entertainment system may interact with the lighting system and security system to adapt to certain conditions, ranging from when you are away from home to when intruders are detected on your property.

Beyond somewhat obvious examples of basic system-to-system interaction, these signatures gain value when further enriched either by other non-IoT operational systems (like your personal schedule, which informs the security system, the energy system, the entertainment system, and your cars as to when and how they need to adapt) or by external data, IoT or otherwise, such as weather data, traffic data, etc.

All of this information contributes to the generation of a richer signature, which can create greater insight and actionable intelligence to enhance productivity and quality of life, and, on a commercial level, can enhance competitiveness, profitability, and quality of work life. To say people and organizations should care about this is an understatement. Nobody would carry a flip phone along with a separate text pager, camera, dictation device, mp3 player, video player, compass, and a myriad of other devices; they carry a smartphone. You want to fully leverage the resources at your disposal in the most effective way. IoT will become the ultimate integration. To not leverage all the available data would be to artificially curtail the use of that data and restrict what would otherwise be incremental and meaningful insight.

Perhaps the most profound examples will be in the delivery of healthcare and the operation of smart cities. There will be people who live better and longer because of IoT, whether they realize it or not. Some people get excited about IoT because it’s a strong business opportunity. Others see it changing the world. Effectively leveraging data from IoT will deliver the greatest value, and here is where it gets really interesting. If the value created is a function of your ability to leverage the data—then who controls it? Who owns the data? Who can use the data? Why? How does this happen today, and why? And how will it happen in the future, and again, why?

Let’s consider the Heppelmann/Porter thesis on the progression of IoT. Product companies provide smart connected products. The majority of the effort to create and deliver IoT technology has been aimed at the product providers, so the myriad of IoT platforms out there are being sold to product providers as a platform for them to deliver smart connected products. This is where the market is. This is where the money is. So, this makes sense.

In most of these products, the sensor devices in the products are designed to talk to the “machine cloud” of the product provider, where the appropriate controls are executed and historical analysis is conducted, so the product company can provide predictive maintenance and better servicing. In other words, the sensor devices create sensor data, and that data is consumed by the application associated with the products, operated and owned by the product company. It is a closed-loop, message-response system. In many cases it is compelling, to be sure, but it is hardly open and leveraged beyond a minimal point. But again, the stage of smart connected products suggests the money to be made will be a function of meeting these, albeit basic, demands.

When we move to product systems, the interoperability and, more importantly, the underlying association of the data from IoT system A to IoT system B may or may not be deliverable via one product company. It is certainly in their best interest to do this, because, in these circumstances, they make more money and retain more control. This is why large product companies are spending so much to provide interoperability between their related products. As the market continues to demand that products from different companies interoperate and share data, the orchestration and delivery of data will likely become more difficult. When you contemplate “system of systems,” the logical focus goes from the product company to the enterprise or organization that has invested in numerous IoT subsystems.

Shifting your focus from the product provider to the enterprise changes the game. An aha moment comes when you realize an organization’s ability to leverage underlying IoT data is a function of its ability to own and control that very data. When the market makes its way from stage three to stage five, the focus will most certainly shift from the product provider to the enterprise.

In doing so, the whole concept of ownership and control of the data should be re-cast. This isn’t to say the product provider shouldn’t see the same data they were getting before. They should. It just means they shouldn’t get that data at the expense of the enterprise that owns and uses the IoT subsystems. And while this may seem to be a big problem, in fact, it’s technologically quite simple. It has to do with architecture and governance. The issues are commercial and economic. And with the right architecture and related accommodations, everyone can win. There can be a fairly simple component in the equation called the “first receiver” that can make the effective propagation and leverage of IoT data a reality.

The first receiver would basically sit inside or behind one or more edge devices at a given entity (factory, hospital, retail store, etc.) and persist the data coming off the variety of IoT subsystems associated with that entity. The data would be stored in a master data model for that specific entity, where it could be cleansed and enriched, and from which the underlying data could then be propagated, either to local consuming applications or constituents (like the point-of-sale or inventory system at a local store), and/or to remote constituents or applications. For example, the retail store may propagate one set of information to the regional headquarters, but a different set to the national headquarters.
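The ingestion side of the first receiver idea can be sketched as follows. This is a minimal illustration, not the authors’ implementation: the `Reading` record shape, the subsystem names, and the entity id are all hypothetical, chosen only to show readings from multiple subsystems being normalized into one master store that local applications query.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any, List

# Hypothetical master data model for one entity (e.g. a retail store):
# every reading from every IoT subsystem is normalized into one record shape.
@dataclass
class Reading:
    subsystem: str             # e.g. "hvac", "pos", "leak-detection"
    sensor_id: str
    value: Any
    entity: str = "store-001"  # the entity this first receiver serves
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

class FirstReceiver:
    """Persist readings from all local subsystems into one master store."""

    def __init__(self) -> None:
        self.store: List[Reading] = []

    def ingest(self, subsystem: str, sensor_id: str, value: Any) -> Reading:
        # Cleansing/enrichment (unit conversion, deduplication, tagging)
        # would happen here before the record is persisted.
        reading = Reading(subsystem, sensor_id, value)
        self.store.append(reading)
        return reading

    def readings_for(self, subsystem: str) -> List[Reading]:
        # Local consuming applications query the master store,
        # not the raw devices.
        return [r for r in self.store if r.subsystem == subsystem]

fr = FirstReceiver()
fr.ingest("hvac", "vent-3", {"temp_c": 21.5})
fr.ingest("pos", "till-1", {"sale": 14.99})
```

The point of the single record shape is that once HVAC, point-of-sale, and other subsystem data share one model, the entity can correlate across subsystems rather than leaving each stream siloed in its vendor’s cloud.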

The first receiver would likely also propagate (under contract) all the HVAC data to the HVAC vendor, just like it did before the first receiver, but only the HVAC data, nothing else, and likewise for the other product providers. In fact, it might also send a different subset of data to supply chain partners, and another set of data to OSHA, the FDA, or other regulatory monitoring bodies. The first receiver would also provide accommodation to the product providers to maintain their firmware, much like we see today with printer drivers. They would also likely enhance security as well. In short, the first receiver concept is designed to allow the maximum leverage of the utility value of IoT data. The first receiver ensures the right data gets to the right constituencies at the right time.
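The propagation side described above amounts to per-constituency filtering: each receiving party gets only the contracted subset of the data. A minimal sketch, with invented constituency names and routing rules standing in for real contracts:

```python
# Hypothetical propagation rules: each constituency is contracted
# to receive data only from certain subsystems, nothing else.
ROUTES = {
    "hvac-vendor": {"hvac"},            # only its own product's data
    "regional-hq": {"pos", "inventory"},
    "national-hq": {"pos"},
    "regulator": {"leak-detection"},
}

def propagate(readings, routes):
    """Return, per constituency, only the readings its contract allows."""
    return {
        constituent: [r for r in readings if r["subsystem"] in allowed]
        for constituent, allowed in routes.items()
    }

# Readings drawn from the entity's master store (simplified records).
readings = [
    {"subsystem": "hvac", "sensor": "vent-3", "value": 21.5},
    {"subsystem": "pos", "sensor": "till-1", "value": 14.99},
    {"subsystem": "leak-detection", "sensor": "pipe-7", "value": "ok"},
]

out = propagate(readings, ROUTES)
# The HVAC vendor still sees all of its HVAC data, exactly as before the
# first receiver existed, but nothing from any other subsystem.
```

The governance lives in the routing table, not the code: changing who receives what is a contractual decision expressed as data, which is what lets “the right data get to the right constituencies at the right time.”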

This won’t come easily at first. Product providers have too much to gain by “owning” everything. They all want to be Apple, but market forces will dictate a more open, democratized set of capabilities. And this is the focus of our book. We believe the key value—the maximum value—will ultimately come from leveraging the utility value of IoT data. For this to happen, one needs to travel from the world of “smart connected products” to the world of “system of systems,” and understand the basic underpinnings of what will deliver, or inhibit, that progression.

Don DeLoach

Emil Berthelsen

Wael Elrifai

This excerpt from “The Future of IoT: Leveraging the Shift to a Data Centric World” was reprinted with permission from author Don DeLoach. Watch this space for a podcast interview with the author coming soon. The full book is available for purchase on Amazon.