This microbook is a summary/original review based on the book: Digital Transformation: Survive and Thrive in an Era of Mass Extinction
With a career spanning four decades in technology, legendary Silicon Valley entrepreneur Thomas Siebel has been at the forefront of several major innovation cycles, and has seen many billion-dollar corporations go out of business after failing to adapt to new technologies. In “Digital Transformation,” he claims that we are currently in the middle of one such era of mass corporate extinction and diversification, which is powered by four emerging technologies: cloud computing, big data, artificial intelligence, and the internet of things. Get ready to find out more about all of them!
In 1859, Charles Darwin proposed that natural selection was the main force driving speciation and evolution. In his explanation, organisms morph gradually from one species into another through the accumulation and enhancement of their fittest traits. Our planet’s fossil record, however, doesn’t support this claim, showing – quite the contrary – an absence of continuity between different life forms. Darwin believed the gaps to be the result of an incomplete fossil record. In 1972, paleontologists Niles Eldredge and Stephen Jay Gould published a landmark paper in which they reinterpreted the punctuated fossil record as a signal rather than a void.
In Gould and Eldredge’s theory of punctuated equilibria, evolution and speciation didn’t happen gradually but rather in a series of bursts of evolutionary change, separated from each other by periods of evolutionary stability. During these periods, species stayed in equilibrium for hundreds of thousands of years, changing very little in the grand scheme of things. But then, some environmental disruption – such as a meteor hitting the Earth and causing an Ice Age – would instigate mass extinction of the dominant species and create rapid diversification of the most adaptive ones, practically reinventing life on Earth.
In Siebel’s opinion, Gould and Eldredge’s theory of punctuated equilibria explains the modern economy much better than Darwin’s evolutionary model does. The gradual and flat rate of change implied by the well-known Moore’s law – which states that the number of transistors on an integrated circuit doubles every two years at half the cost – doesn’t take into consideration the so-called “Schumpeter’s gale of creative destruction,” which is the real driving force of progress. Put simply, companies – just like species – don’t evolve and diversify gradually, but in bursts, triggered by revolutionary inventions. Unfortunately, the price of being unable to adapt to these transformational technological shifts is never monetary, but existential.
Just like environmental cataclysmic events are responsible for the cyclic nature of species (inception, diversification, extinction, repeat), revolutionary technological improvements are responsible for the rapid rise of startups and the swift downfall of industrial giants. “The corporate graveyard is littered with once-great companies that failed to change,” comments Siebel.
Take, for example, video rental chain Blockbuster. At its peak in 2004, the company employed more than 80,000 people in more than 9,000 stores, earned $5.9 billion in revenue, and boasted a $5 billion market capitalization. A few years earlier, in 2000, Blockbuster had declined to buy a small online DVD-rental startup called Netflix for $50 million. Then broadband and fiber-optic networks revolutionized the way the internet worked and, by 2010, Blockbuster had to file for bankruptcy. In the span of just six years, the company went from dominating the video business to going bust. Meanwhile, Netflix evolved to become the largest entertainment company in the world by market capitalization.
Blockbuster, of course, isn’t the exception, but the rule. Just think of what happened to Borders after Amazon arrived on the scene, or to Yahoo! after Google came along. There’s no need to add more names to the list: stats show that since 2000, 52% of the Fortune 500 companies have either been acquired, merged, or gone out of business. It seems that the bursts of technological change in the corporate world are just as harsh as nature’s environmental triggers: you either adapt to them or perish.
Currently, claims Siebel, we are in the middle of one such revolutionary change. He calls it the digital transformation, and claims that it is driven by four powerful disruptive technologies: cloud computing, big data, artificial intelligence (AI), and the Internet of things (IoT). “Companies that harness these technologies and transform into vibrant, dynamic digital enterprises will thrive,” he warns. “Those that do not will become irrelevant and cease to exist.” So, let’s delve a bit deeper into each.
In the simplest terms possible, cloud computing entails the storing and accessing of data and programs over the internet, as opposed to your hard drive. In the time of Blockbuster, we could only watch movies we had bought or rented. Now, cloud-based streaming media services like Netflix and Amazon Prime allow us to watch thousands of movies they have bought or rented, for only a fraction of the price. And why wouldn’t they? Thanks to the “elastic cloud” – so called because of its “ability to rapidly and dynamically expand and contract to satisfy compute and storage resource needs” – storage costs have dropped by a whopping 98% over the past decade!
According to Siebel, the core features of cloud computing make it essential to digital transformation.
In addition to using the cloud for storing, organizing and accessing data – something referred to as infrastructure-as-a-service (or IaaS) – businesses can harness the power of cloud computing through two other service models. The first one, called software-as-a-service (or SaaS), offers companies the option to use software applications hosted on cloud infrastructure. The second service model, called platform-as-a-service (or PaaS), allows companies to build, test, and deploy applications on the cloud, with the ready-to-use development platform managing everything from the underlying infrastructure to scaling and security. Regardless of the service, cloud computing makes everything easier, cheaper and faster, so the question isn’t if your company should move to the cloud, but when. The answer? Today.
The basic unit of information in computing is a bit. Think of a bit as a light bulb that can be either on or off – that’s the only information it can relay. But group eight bits in a string, and you get a byte. Eight light bulbs can relay much more information than a single one, because each of them can be either on or off, which gives us 256 different combinations of light and dark. In computing, one byte can store one character, meaning one thousand bytes, or a kilobyte, can store no more than a single paragraph. Just 50 years ago, that was already considered a lot. For illustration, Intel’s 8008 processor, released in 1972, was an 8-bit processor, and the world’s first floppy disk, introduced by IBM just a year earlier, had a storage capacity of about 80 kilobytes – a few dozen pages of text.
But for a while now, we’ve stopped dealing with kilobytes. We’ve stopped dealing with megabytes and gigabytes as well, which contain one million and one billion bytes, respectively. Your computer’s hard drive probably has a storage capacity measured in terabytes – trillions of bytes. To understand how much that is, consider that “all the information contained in the U.S. Library of Congress is on the order of 15 terabytes.” But Amazon, Google, Microsoft and Facebook don’t store mere terabytes of data: together, they are presumed to house on the order of an exabyte – one quintillion bytes! Now, that’s big data.
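The arithmetic behind these units is simple: each step up the ladder multiplies by 1,000 (using decimal prefixes). A minimal sketch, not from the book, just to make the magnitudes concrete:

```python
# Storage units, each a factor of 1,000 larger than the last (decimal prefixes).
UNITS = ["byte", "kilobyte", "megabyte", "gigabyte",
         "terabyte", "petabyte", "exabyte"]

def unit_in_bytes(unit: str) -> int:
    """Return the size of a storage unit in bytes."""
    return 1000 ** UNITS.index(unit)

# Eight on/off bulbs (bits) give 2**8 = 256 combinations - one byte, one character.
assert 2 ** 8 == 256
# A kilobyte holds about a paragraph of one-byte characters:
assert unit_in_bytes("kilobyte") == 1_000
# An exabyte is one quintillion bytes:
assert unit_in_bytes("exabyte") == 10 ** 18
# The ~15-terabyte Library of Congress would fit into an exabyte ~66,000 times:
print(unit_in_bytes("exabyte") // (15 * unit_in_bytes("terabyte")))  # → 66666
```

In other words, an exabyte isn’t just “bigger” than a terabyte – it is a million times bigger, which is why a complete data set at that scale changes what analysis can do.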
Thanks to cheap computing and even cheaper cloud storage, these internet giants have been able to collect mindbogglingly massive amounts of data in the space of a single decade. The best part isn’t the amount, but what it allows them to do: analyzing a complete data set – as opposed to analyzing a sample – provides an organization with better prediction models. That’s why incumbent organizations such as the aforementioned Big Four have a serious competitive advantage over startups. However, unless startups start using similar technologies, they will never be able to bridge the gap. Moreover, big data “lays the foundation for the broad adoption and application of AI,” which is where both humanity and our summary are heading next.
The term “artificial intelligence” – or AI, for short – dates back to 1955, when a young Dartmouth math professor named John McCarthy coined it as a way to describe an emerging field. The following year, McCarthy organized a six-week workshop at his alma mater which is today considered the founding event of AI as a field of research. Over the following decade and a half, in the words of Siebel, “the AI buzz set the world aflame” and “dramatic predictions flooded popular culture.” However, by the mid-1970s, mainly as a result of many funding agencies losing interest in supporting AI research, the field became (ironically) a thing of the past.
With one or two short breaks, the long winter of AI lasted until the very beginning of the third millennium, when three major forces propelled AI to the forefront of international competition: 1) the rapid improvement of computational power, 2) the advent of big data, and 3) the advancement of machine learning. Put simply, thanks to broadband internet and cloud computing, companies such as Google and Amazon acquired access to data from millions of users. However, they didn’t know how to process and interpret this data, even though they understood that it was in their best business interest to do so.
The Big Four’s attempts to deal with big data led to the development of mathematical techniques that allowed them to “convert complex nonlinear problems to linear formulations with numerical solutions.” Then, in the mid-2000s, due to improvements in hardware and processing speed, neural networks became more practical than ever before, and deep learning started gaining traction. In the span of two decades, the field of AI had evolved from rule-based expert systems, to mathematical predictive models, to AI agents capable of mastering chess or Jeopardy by themselves through self-play and deep learning. No wonder tech giants spend billions of dollars a year on AI and related technologies. Once again, only those organizations that invest money and energy to adapt to this new business climate can hope not to perish.
As defined by Siebel, the internet of things (IoT) refers to “the ubiquitous sensoring of value chains so all devices in the chains become remotely machine addressable in real or near-real time.” Put in simpler terms, the IoT is a network of interrelated physical objects (usually called smart objects) embedded with sensors or other technologies and able to transfer data over the internet without the prerequisite of a human-to-human or human-to-computer interaction. Up until recently, it was the stuff of science-fiction movies; nowadays, physical objects can actually “talk” to each other.
It all started with wearable fitness trackers from companies such as Fitbit and Garmin – the first smart objects consumers actually showed interest in. Earlier attempts had flopped: LG, for example, introduced an internet-connected “smart refrigerator” back in 2000, but very few people were interested in buying it at the time. That all changed with the introduction of Fitbit’s wearables, the Nest remote thermostats and the Philips Hue smart lightbulbs. Nowadays, consumer IoT objects such as Amazon Echo and Google Home are ubiquitous.
More importantly, the interest of large corporations in IoT technology is soaring. Factories and transportation companies are rapidly deploying sensors to monitor operations, and tech giants are constantly trying to find more ways to employ IoT to their benefit. The reason? IoT generates valuable data and allows them to increase their value propositions to customers. For example, a sensor indicating the current position of all cars owned by a cab company allows the company to promptly send the nearest cab to your location.
“I expect,” Siebel concludes, “that in the next few years virtually everything will have become a computer – from eyeglasses to pill bottles, heart monitors, refrigerators, fuel pumps, and automobiles. The internet of things, along with AI, creates a powerful system that was barely imaginable at the beginning of the 21st century, enabling us to solve problems previously unsolvable.”
To quote former U.S. Secretary of Defense Robert M. Gates, “Digital Transformation” “should alarm every CEO and government leader about the simultaneous arrival of an existential technological threat – and an historic opportunity. A must-read for every leader in business and government.”
Whether in nature or in business, evolution isn’t gradual: it comes in bursts. Companies that aren’t alert and choose to rest on their laurels instead will go from boom to bust in the blink of an eye.
Thomas M. Siebel is an American business executive, serial entrepreneur, and bestselling author. He is the founder and CEO of C3.ai, an AI software provider, and the chairman of First Virtual Group, a diversified holding company.