When Digital Transformation Polarizes the Organization
Part One of Four: The Fallacies and Fallout from Industry “Buzz”
By Kurt Jonckheer
Chief Executive Officer
Klarrio
Big data. IoT. AI. ML. Analytics. Cloud. These, and what sometimes seems like a never-ending string of other trendy tech terms, are often cited as catalysts for digital transformation. Ask 100 people, and you will get 100 different definitions and interpretations of what digital transformation entails.
I, for one, have no intention of imposing another. But I do want to separate the truth from the buzz, which is constantly fueled by hype that all too often comes from the research and advisory firms of this world. Legitimate technology breakthroughs such as IoT, AI and ML originated long before the industry hype became the norm, and it’s frustrating, sometimes even tragic, to see how much damage is caused by baseless, self-serving claims that disregard the most basic concepts many of these buzzwords are founded on.
Unfortunately, this meaningless hype ripples through every level of society, industry and the workplace, where hordes of executives, information specialists, data executives, architects, engineers and businesspeople manipulate one another in an effort to be heard, followed and believed.
In well-established enterprises with decades of legacy IT, reliance on industry buzzwords, followed by blind adoption of the latest technology crazes, creates divisions and problems throughout the organization. Core objectives get lost entirely, while existing weaknesses, such as unmanaged complexity, missing cost structures, unreliable forecasting, unrealistic delivery cycles and poor strategic integration of systems and solutions, only get worse.
The resulting lack of operational knowledge, combined with infighting and turf wars, can split what was meant to be a unified, highly advanced system into a set of fractured processes that not only fail to achieve their goals but also cause devastating financial damage to the organization.
The Roots of Digital Transformation
A good way to grasp the origins of digital transformation is to read Claude Shannon’s article, “A Mathematical Theory of Communication.” Shannon, dubbed the “father of modern digital communications and information theory,” published his theory in 1948, not only introducing the concept of information theory itself but also outlining the fundamental process by which a digitally sent message becomes one that is received. Digital communication has certainly come a long way since then. As manual and analog devices reached their saturation point, the digital capabilities of their offspring took center stage and continued to multiply. Later, the popular interpretation of “Digital Transformation” was born, fueled by a growing need for digitization throughout the entire organization, a continually evolving data society, an always-connected world, and today’s instant-gratification culture.
As a result, the never-ending growth of, and reliance on, data, along with accelerating processing speeds, has driven a fundamental shift away from traditional client-server architectures and toward event-driven streaming architectures with distributed and parallel paradigms at their core.
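The difference between the two architectural styles can be sketched in a few lines. The toy publish/subscribe bus below (all class, topic and field names are illustrative, not drawn from any particular product) shows the event-driven idea: consumers react to events as they arrive, rather than repeatedly asking a server for state as in a request/response, client-server design.

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process event bus: producers publish events to topics,
    and subscribers react as each event arrives."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Push the event to every subscriber immediately;
        # no consumer ever polls for new data.
        for handler in self._subscribers[topic]:
            handler(event)

# Hypothetical sensor-reading topic
bus = EventBus()
readings = []
bus.subscribe("sensor.temperature", readings.append)
bus.publish("sensor.temperature", {"device": "d-42", "celsius": 21.5})
```

In a production system the bus would be a distributed message broker rather than an in-process object, but the inversion of control is the same: data flows to the logic, not the other way around.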
Despite the hype and self-interests that continue to poison the digital-transformation environment, the process itself—as well as the need for event-driven architectures—is revolutionizing the workplace.
Empowering the Data-Driven Enterprise
With the rise of machine-to-machine (M2M) communication and the Internet of Things, we are now entering the “Industrial Revolution of Data” (Hellerstein, 2008), in which most data is no longer generated by people but by devices. Web pages, social networks, media content, and more are increasingly created by devices, and at a much faster rate. This proliferation of data is also referred to as “Big Data.”
According to The Economist, the amount of digital information now grows tenfold every five years. Increasingly, it is less about the quantity or size of the data and more about what to do with it.
Commonly available data analysis tools are unable to keep up with the increase in size, diversity and rate of data change. Consequently, the ability to analyze and manage skyrocketing amounts of data is a distinct competitive advantage.
This has led to the continued growth of real-time data processing and streaming data architectures. So much so, in fact, that multiple reports contend real-time streaming has become the data-processing paradigm for the modern enterprise.
Streaming data sets the bar high for the most compelling future use cases, most notably artificial intelligence and event-driven applications, giving rise to a growing number of platforms, frameworks and tools for building and running event-stream processing at scale.
Such rapid growth in emerging streaming-data applications requires a true understanding of how data can be used to make strategic business decisions going forward. (See Figure 1.) The real-world demands of digital transformation leave no room for decisions based on buzz rather than facts.
In addition, the rise of data streaming requires familiarity with streaming-data architectures. From the early days of analytics and the transformation of batch ETL into streaming pipelines and streaming analytics, to running complex event processing and event-driven logic for mission-critical applications, the adoption of data streaming is the new norm.
Unlike working with legacy client-server systems, this requires:
- Choosing and integrating the right tools and stream-processing framework
- Ensuring specific technology frameworks can scale to your needs without compromising functionality
- Checking the framework’s interoperability and integration with the existing technology stack
- Ensuring the chosen stream processor provides the latency and throughput characteristics the application scenarios demand
- Ensuring the development teams have the knowledge, resources and skills they need to build, launch and profit from an entirely new IT paradigm
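To make the contrast with batch ETL concrete, here is a minimal sketch of a streaming computation. It assumes a simple in-process generator as a stand-in for a real message broker (the metric name and values are purely illustrative), and it emits a sliding-window mean as each record arrives instead of waiting for a nightly batch job over the full data set.

```python
import statistics
from collections import deque

def stream_events():
    # Stand-in for an unbounded event source such as a message broker;
    # in practice this iterator would never terminate.
    for value in [10, 12, 11, 50, 13, 12]:
        yield {"metric": "latency_ms", "value": value}

def sliding_mean(events, window=3):
    """Yield the mean of the last `window` values as each event arrives,
    processing records one at a time rather than as a finished batch."""
    buf = deque(maxlen=window)  # old values fall out automatically
    for event in events:
        buf.append(event["value"])
        yield round(statistics.mean(buf), 2)

means = list(sliding_mean(stream_events()))
```

Note how the spike to 50 shows up in the rolling output within one event of arriving; a batch pipeline would only surface it at the end of the next run, which is exactly the latency gap streaming architectures exist to close.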
No matter how you choose to define digital transformation, it’s a legitimate game-changer that requires a deep understanding of event-driven streaming data applications and architectures—not meaningless buzzwords that ultimately do all of us a disservice.