Every business process, every customer interaction, every manufacturing step, and every manufactured artifact, digital or physical, throws off data. That is not new. What is new is the volume of data that can be acquired and analyzed to yield actionable insights. From enterprise network access logs to tiny MEMS sensors to 4K video streams from drones, data is being sensed in ways and in places like never before. The amount of data we record as a species is growing exponentially, but the vast majority of those bytes never fulfill their potential. Whether explicitly lost when an edge device trims its logs, or implicitly stripped of value when dumped without context in a data lake, businesses are forfeiting the insight that data could provide. From every edge to any cloud, HPE is already helping customers combat this problem, providing the technology and insight to waste not, want not.
How does your enterprise think about data?
How does your enterprise view data: as a cost to be contained, as a source of insight to increase revenue and profit, or as a potential source of revenue in its own right? Data is one of the most valuable resources your company can have. Data yields insight, insight drives action, and action drives progress, profit, and improved performance. Data analysis can reveal strengths and weaknesses, and it can help you develop an ever-evolving strategy. With AI tools like unsupervised machine learning and anomaly detection, data analysis can surface insights in places you never thought to look.
Enterprises must understand that every process and every contact, in every place, creates data. Advancements at the edge mean everything is, or soon will be, something that throws off data. This endless source of data can't be dismissed as something that merely takes time to process: it cannot be ignored, it should not be given away, and if you don't act on it, you will be left behind. Whether it is you, your fiercest competitor, or a new entrant coming up from nowhere, as soon as someone in your segment leans into the full opportunity, you will either be a hyper-competitive, real-time, analytics-driven business or you will be desperately wondering how to compete with one. It will be a sea change comparable to the shift to just-in-time manufacturing. You need to know what is happening in your organization right now: real time is the new just-in-time.

So why are companies still wasting data? Why are some businesses discarding it, or even giving it away to someone else? Data is a raw commodity, and as such it needs to be refined or processed. Think of it like a food source: to be useful, it must be processed before consumption, and, like food, it is perishable, so that processing must happen in a timely manner. The question is, how do we extract the most value at the best cost? Consider both numerator and denominator in the return on investment of gaining insight from data: how much does it cost to refine the data, and how much is the refined data worth? In essence, is the data processing economically viable?
How Much Data is Currently Being Created and Where?
Every two years, we create as much new data as in all of human history before it; that is the exponential data growth curve, doubling every two years. This is not something that will level out soon: data sensors are not slowing down in cost reduction or proliferation. The more we move to digital and virtual manufacturing and business processes, the more we can instrument our economies. What is changing is that the data is growing disproportionately at the edge. In as few as five years, the vast majority of enterprise information may never see anything we'd call a data center; the term itself will become an oxymoron.
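That growth claim is equivalent to a simple doubling law: if the cumulative total doubles every two years, then the data created in any two-year window equals everything created before it. A minimal sketch of the arithmetic, where the starting size is an arbitrary assumption:

```python
def total_after(periods, start=1.0):
    """Cumulative data after `periods` two-year doublings.

    `start` is an arbitrary baseline (say, exabytes at year zero).
    Each period, the new data created equals all prior history,
    so the running total doubles.
    """
    total = start
    for _ in range(periods):
        total += total  # new data this period == everything so far
    return total

# After a decade (five two-year periods), the total is 32x the baseline.
print(total_after(5))  # 32.0
```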
As an example of the scale of potential growth, consider that the billion-plus users of Facebook contribute 4 petabytes of data a day to the platform, data that is of extremely high value both because it is human generated and because it can be correlated with over a decade of accumulated insight. We are still wrestling with the implications of that. Now consider the sensor fusion platform of a connected vehicle: LIDAR, ultrasonic sensing, front and rear HD cameras, GPS, power-train sensors, and user actions. At hundreds of megabytes per second, it takes only 1,000 vehicles to match the data-generating capability of a billion people. Today, almost all of that data is used only to move the passengers safely from A to B, and then it is discarded. What if that data were not wasted, but could be correlated and analyzed in place? The sea of information currently being thrown aside could give insights into vehicle efficiency, environmental conservation, traffic control, and city planning; connected vehicles could be the fluid intelligence of tomorrow's smart cities. What business-process and technical innovations are required to realize the potential of every single byte?
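The vehicle comparison holds up to back-of-the-envelope arithmetic. The figures below are the article's rough estimates, and the 200 MB/s per-vehicle rate is an assumed mid-range value, not a measurement:

```python
FACEBOOK_PB_PER_DAY = 4      # ~1e9 users contributing ~4 PB/day
MB_PER_S_PER_VEHICLE = 200   # assumed mid-range of "hundreds of MB/s"
SECONDS_PER_DAY = 86_400
MB_PER_PB = 1e9              # 10^9 MB per petabyte (decimal units)

vehicle_pb_per_day = MB_PER_S_PER_VEHICLE * SECONDS_PER_DAY / MB_PER_PB
vehicles_to_match = FACEBOOK_PB_PER_DAY / vehicle_pb_per_day

print(f"One vehicle: {vehicle_pb_per_day:.4f} PB/day")
print(f"Vehicles to match Facebook: {vehicles_to_match:.0f}")
```

At this assumed rate, a few hundred vehicles already match the daily volume, so the article's figure of 1,000 vehicles is comfortably conservative.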
Data Analysis Road Blocks
So what is stopping everyone from driving toward big-data-based insights? What keeps us from admitting all that data to analysis for societal and enterprise benefit? Physics and law. The reality of the post-Dennard-scaling, Moore's Law twilight world drives us to seek the economies of hyperscale cloud data centers. Moving information, for example from the edge to the cloud, carries both the capital cost of infrastructure, bandwidth, backbones, and networks, and the operational cost of energy. Even if we ignore those costs, we are left with the speed of light. Latencies may seem insignificant measured at 5 nanoseconds per meter in photonic fibers, but they become material at the data-center scale for real-time enterprise analytics, let alone metropolitan or continental crossings. Then there is the law governing data protection, a complex issue exacerbated by the patchwork of laws and behaviors around the world, across borders and cultures. How is it possible to transport vast amounts of data from its source and then deliver analysis based on it to sites around the world? The physical requirements of such transfers, coupled with the governance of such procedures, are an obstacle to the data utilization that could so benefit businesses and society.
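The speed-of-light point can be made concrete. Using the 5 ns per meter figure for fiber, here is the round-trip propagation delay alone (ignoring switching, queuing, and serialization) for a few illustrative path lengths; the distances are assumptions chosen to show the scales involved:

```python
NS_PER_METER = 5  # signal propagation in optical fiber (figure from the text)

def round_trip_ms(meters):
    """Round-trip propagation delay over a fiber path, in milliseconds."""
    return 2 * meters * NS_PER_METER / 1e6  # ns -> ms

for name, meters in [
    ("across a data center (500 m)", 500),
    ("across a metro area (50 km)", 50_000),
    ("across a continent (4,000 km)", 4_000_000),
]:
    print(f"{name}: {round_trip_ms(meters):.3f} ms round trip")
```

A 40 ms continental round trip, before a single byte is processed, is already an eternity for a real-time analytics loop.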
Every Byte, Everywhere
Before you throw away or give away another byte, let's have that conversation. Your data could be your most important asset, but you might still be thinking of it as a liability because conventional computing cannot extract its value. When we publicly announced our intention to define a new computer architecture back in June 2014, our prototypes were a rough sketch on my whiteboard in Palo Alto. Alongside those sketches was that exponential data growth curve, a curve which demands new approaches, new physics, and new models of security, trust, and control, but in return will unlock the enduring value of data. Since then we've realized those sketches as working prototypes and engaged in hundreds of conversations across the globe about the potential; now it's time to shift to implementation. Intelligent, distributed systems that span every edge and any cloud are more complex than centralized systems, but they are more sustainable, more available, more secure, and more equitable, which makes them not just vastly more competitive but arguably more just. Compute for a world thirsty for wisdom.
