Many are unaware that the digital revolution started well before the first computer was created. The journey began in 1843 with the collaboration of Charles Babbage and Ada Lovelace. With each new decade, new innovations emerged and built on one another, culminating in the creation of the computer and the internet. In The Innovators, the author introduces us to the great inventors who contributed to the birth of modern technology. We also discover that men were not solely responsible for the invention of computers: many women participated directly, and some were the very first to program computers.
The Innovators tells the story of the people who created computers and the internet. It is a guide to innovation that presents the history of the digital revolution: the story of how the minds of great creators worked and what made them so creative.
The Countess Ada Lovelace was the daughter of the poet Lord Byron and his wife, Anne. Her life was a constant mixture of mathematics and art, a combination that allowed her to have ideas and imagine possibilities far ahead of her time. She began collaborating with the mathematician Charles Babbage after being impressed by his Difference Engine, a machine he designed to calculate polynomial functions.
Ada believed in Babbage's new idea, the Analytical Engine. He wanted to design a "computer" that could perform various tasks, not just one like his previous machine. Ada was fascinated by the idea. One of the features of the design that most pleased her was that it ran on punched cards, like the mechanical weaving looms she loved.
Babbage imagined that his Analytical Engine would be able to perform different operations, and even switch between them on its own. Taking an even bigger step, Ada theorized that the machine would be able to process not only numbers, but also poetry, musical notes, and artistic patterns.
Babbage's machine was described in French in a set of notes written by Luigi Menabrea, which Ada translated into English. This translation, to which she added comments that nearly doubled the length of the original, was published in a scientific journal in 1843.
Ada's version of Menabrea's notes became far more influential than the original and anticipated the functionality of future computers. The article laid out four concepts that would become famous: the machine would serve many purposes; it would work through a sequence of operations; it would be able to process anything that could be translated into symbols; and it would never be able to think on its own.
She also wrote what has been widely recognized as the first example of a computer program in the form of a diagram and a table explaining how to use punch cards.
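Ada's table described the step-by-step computation of Bernoulli numbers. As a rough modern illustration of what such a "sequence of operations" looks like, here is a minimal Python sketch of the same calculation using the standard recurrence (exact-fraction arithmetic standing in for the Engine's gears; this is an illustration, not a transcription of her notes):

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """First n+1 Bernoulli numbers via the standard recurrence:
    sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1, with B_0 = 1."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))  # solve the recurrence for B_m
    return B

print([str(b) for b in bernoulli(4)])  # ['1', '-1/2', '1/6', '0', '-1/30']
```

Each loop iteration depends only on the results of earlier iterations, which is exactly the kind of repetitive, rule-driven procedure Ada argued a machine could carry out.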
Babbage's machine was never built, but the idea was sound and served as the basis for the digital revolution that followed. Babbage's knowledge of machines, combined with Ada's artistic and mathematical imagination, produced an idea that proved quite prophetic.
The early twentieth century saw the development of many computer-like machines, but only a few combined the four characteristics that define the modern computer: an electronic core; a binary language of 0s and 1s; the ability to perform multiple functions; and digital rather than analog operation. The first machine to meet all four criteria was the Electronic Numerical Integrator and Computer (ENIAC).
Invented by John Mauchly and J. Presper Eckert in 1945, ENIAC served as a prototype for future computers. It could perform 5,000 additions and subtractions per second and possessed a variety of capabilities. Although no ENIAC predecessor met all four criteria of the modern computer, some came close. The Atanasoff-Berry Computer (ABC) was so similar that Mauchly and Eckert later had to fight for the patent on their invention.
The ABC, invented by John Atanasoff in 1942, couldn't be programmed to perform general functions; it could do only one thing, solve linear equations. The machine was not widely known, but while developing his own machine, Mauchly heard about the ABC and became very interested. He discussed the prototype with Atanasoff and was even able to examine it. From this exchange of ideas, he acquired important concepts that he later used to create ENIAC.
Because of the collaborative nature of these innovations, patent disputes were a constant problem. Honeywell, a technology company, challenged the ENIAC patent of 1964 on the grounds that its fundamental concepts were not original. Atanasoff was a key witness in the lawsuit, and in the end the judge found that Mauchly and Eckert had based their work on Atanasoff's ideas; their patent was invalidated.
The computer, like most innovations, was not the creation of one or two individuals but the result of many people working separately, and together, toward a common goal. Mauchly and Eckert benefited from the ideas of countless inventors who, in one way or another, helped shape their machine. In technological innovation, people build on the work of others until someone makes the breakthrough.
Computers can only be multifunction machines if they are programmable. Programming a computer means creating a sequence of instructions for the machine and storing it in its electronic memory. The development of this capability fulfilled Countess Ada Lovelace's dream of a machine that could perform any logical operation; she had written the first example of a computer program before such machines even existed.
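The stored-program idea can be illustrated with a toy machine whose program is simply data held in memory. The four-instruction set below (LOAD, ADD, JUMPNZ, HALT) is invented for illustration and does not correspond to any historical machine:

```python
# A toy stored-program machine: the program itself is data in memory,
# just like the instruction sequences a real computer stores electronically.
def run(program):
    acc = 0  # single accumulator register
    pc = 0   # program counter: which stored instruction runs next
    while True:
        op, arg = program[pc]
        if op == "LOAD":      # put a value in the accumulator
            acc = arg
        elif op == "ADD":     # add a value to the accumulator
            acc += arg
        elif op == "JUMPNZ":  # jump to another instruction if acc != 0
            if acc != 0:
                pc = arg
                continue
        elif op == "HALT":    # stop and report the result
            return acc
        pc += 1

# A stored program that counts down from 3 to 0.
countdown = [("LOAD", 3), ("ADD", -1), ("JUMPNZ", 1), ("HALT", None)]
print(run(countdown))  # 0
```

Because the program is just data, swapping in a different instruction list makes the same machine do a different job, which is what "multifunction" means in practice.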
Nowadays we think of programming as a male-dominated industry, but at the outset the programming process was considered tedious and secondary, and for that reason it was a task given to women. The first computer programmer after Ada Lovelace and her punched cards was Grace Hopper, a mathematics teacher who enlisted in the US Navy during World War II and was later sent to work on the Mark I computer at Harvard University.
Hopper's job was to write a detailed programming manual for the Mark I. With great dedication, she perfected subroutines for the most frequent operations and ran programming as a collaborative effort to produce extremely accurate instructions. With her team's help, she continued to refine the manual until every process was right. Thanks to Hopper, by 1945 the Mark I was the easiest computer in the world to program.
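A subroutine in Hopper's sense is one carefully checked routine for a frequent operation, written and verified once and then reused by every program that needs it. A hypothetical modern example (not code from the Mark I itself):

```python
import math

def hypotenuse(a, b):
    """Frequent operation, packaged as a reusable subroutine:
    length of a right triangle's hypotenuse."""
    return math.sqrt(a * a + b * b)

# Instead of re-deriving the steps each time, callers reuse the subroutine.
print(hypotenuse(3, 4))   # 5.0
print(hypotenuse(5, 12))  # 13.0
```

The payoff is the same then as now: the error-prone steps are debugged once, and every caller inherits the verified version.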
Programming computers at that time was incredibly complex, and each new program could take days to set up. The process involved moving cables between channels, resetting switches, and making constant adjustments. Around the same time that Grace Hopper was working on the Mark I, an all-women team took full responsibility for programming ENIAC.
Since the creation of the computer, thanks to the work of Grace Hopper and ENIAC programmers, the world of technology has been inhabited by both men and women.
Early computers were huge compared to modern personal computers; ENIAC occupied an area of approximately 93 square meters. This made computers very expensive and too large for everyday use, and for many years they were limited to universities, government agencies, and large corporations. But the invention of a small semiconductor device known as the transistor paved the way for smaller, cheaper computers, making it possible to run complex programs on much smaller machines. This innovation changed the history of computing, with the help of three Bell Labs employees.
Bell Labs, AT&T's research arm named for Alexander Graham Bell, was famous for its collaborative culture of research and development. From the beginning, Bell Labs worked to pair talent with effective group work, because its leaders understood how fundamental collaboration is to successful innovation. They encouraged employees to share and develop ideas together, which produced excellent results. This collaborative environment created the perfect conditions for John Bardeen, Walter Brattain, and William Shockley to revolutionize the world of computing forever.
Until then, computers were built using vacuum tubes, devices capable of amplifying electronic signals. But the size and number of these tubes made computers so large that they occupied entire rooms and consumed enormous amounts of energy, which made them expensive and impractical for general use. The goal of the three Bell Labs inventors was to eliminate the need for vacuum tubes by replacing them with semiconductors, thereby shrinking computers and making them much more efficient.
In 1947, the team achieved its goal by inventing the transistor, which used a semiconductor to amplify an electrical signal and switch it on or off. This device paved the way for today's personal computers. Bardeen, Brattain, and Shockley won the Nobel Prize in 1956 for their innovation, but they achieved this breakthrough only through collaboration and the sharing of knowledge.
In the 1970s, the personal computer became a mass-market product. Until then, computers had mostly been used for research, but thanks to the efforts of technology enthusiasts and entrepreneurs, they reached the general public. These "hippies and hackers" gathered in the area south of San Francisco Bay and began experimenting, learning computers from the inside out. The region became known as Silicon Valley, a key hub for technological innovation and development.
These techies felt that personal computers would empower people and change the way they learned, connected, and communicated. Among them was Ed Roberts, who joined forces with his friend Forrest Mims to form Micro Instrumentation and Telemetry Systems (MITS). In 1974, MITS designed the first kit for a basic personal computer, the Altair 8800. The computer had to be assembled by the buyer and had no input devices such as a mouse or keyboard. It was rudimentary by today's standards, but any technology geek with the patience to learn could use it.
Technology enthusiasts now had a way to interact directly with computers, which were no longer exclusively controlled by the military or corporations. In its first month, the Altair 8800 sold hundreds of units, and some were shared in technology clubs.
One such group was the Homebrew Computer Club, where tech nerds could collaborate on topics such as computer technology, philosophy, and culture. It was there that many enthusiasts had their first inspiring experience with the Altair 8800, including two young future entrepreneurs, Steve Wozniak and Steve Jobs. That basic computer taught Wozniak about the microprocessor, which he used to design a new computer that came pre-assembled and included a screen and a keyboard: the Apple II.
As computer components grew smaller, the possibilities for what they could do grew exponentially. The counterculture of hippies and hackers was built on the principle of freely sharing ideas and technologies, in the hope that inspiring and empowering others would carry the technology forward. Building on the work of others, these technology nerds created the personal computer and brought the technological revolution into our homes.
The Altair 8800 also inspired two more modern computing giants, Paul Allen and Bill Gates, to develop software for personal computers. Together they formed a company later known as Microsoft. Their version of the programming language BASIC (Beginner's All-purpose Symbolic Instruction Code), distributed on perforated paper tape, became the basis for personal-computer software for decades. It was freely copied and spread like wildfire. At first, Gates was upset by the unauthorized copying of his program, but in the end it turned out to be a positive thing.
With all that copying, the program spread quickly, and BASIC soon established itself as the standard for personal-computer programming, a position it held for over thirty years as enthusiasts used it to create their own programs. One of the biggest reasons for its success was that Allen and Gates designed it to work on just about any computer, regardless of brand. Over time, other companies also started writing software, and this competition led to constant innovation and ever-improving user experiences.
While Microsoft promoted the separation of software from hardware, Apple integrated the two into a single package. The Apple computer had its own software, which was continuously developed to improve the user experience. It was among the first to offer a graphical user interface (GUI), which used icons instead of typed commands. This gave Apple an edge over Microsoft and other competitors until the GUI became standard across the industry.
Because software companies were in constant competition, they were forced to keep improving their products. This rivalry fed a constant stream of new ideas into the software industry, ideas that were later incorporated into competitors' products. With hard work and constant innovation driving technological advancement, personal computers became easy and enjoyable to use.
Personal computers were an exciting innovation, but it was the creation of the internet that made computers popular. Originally, the internet was a collaborative effort among the military, universities, and corporations. MIT professor Vannevar Bush urged the government to fund research across these three sectors, and as a result, computer geniuses and enthusiasts from diverse fields were able to work together and develop something truly groundbreaking.
The National Science Foundation was created to bring together specialists from each of the three groups. One such expert was J.C.R. Licklider, who came up with the idea of creating decentralized networks that would facilitate the free flow of information and allow people and machines to interact in real time.
From Licklider's idea came the ARPANET, a network that facilitated social interaction and communication. Modern email was born when Ray Tomlinson devised the method of using the @ symbol to route messages to users on other machines. Later, virtual communities began to appear, including The Well by Stewart Brand, The Source by William von Meister, and America Online (AOL) by Steve Case. In a short time, the internet became so popular that Senator Al Gore sponsored the National Information Infrastructure Act of 1993, opening the internet to public use.
The internet changed the personal computer forever, connecting machines all over the world through the World Wide Web. As the network grew, it became ever easier to get online, access endless amounts of information, and communicate with people around the globe. This explosion of the internet changed the world, and it would not have been possible without the collaborative sharing of ideas by a vast network of people and disciplines.
From Ada Lovelace and Charles Babbage to Steve Jobs and Bill Gates, the technological revolution couldn't have happened without the work of thousands of talented people. And each breakthrough was built on previous work.
When people share ideas, creativity flows and gets to work. As the history of technology demonstrates, innovation is best cultivated when teams work together, and most successful inventors and entrepreneurs have understood this. Often it was the combination of their talents with the work of peers, and even competitors, that produced the best results. Top players must constantly innovate and develop to stay in the game, and that pressure has delivered fast, endless improvements for end users.
By sharing and exploring great ideas as a team, great minds discover exciting possibilities and find new ways to use them. The internet has made collaboration easier than ever, and who knows where it will lead?
Walter Isaacson is a professor of history at Tulane University. He served as president and CEO of the Aspen Institute, chairman of CNN, and editor of Time magazine. Isaacson is also the author of multiple biographies, including “Leonardo da Vinci.”