From calculators to the cloud. How did we get here? This week, we are going to explore how we have taken bits of metal, combined them with zeros and ones, and created machines that power our world. We’ll have a look at just what this “Cloud” is, how it works, and how shared mainframe usage in the 1950s laid the foundation for today: a world where unfathomable amounts of data are stored, shared, and accessed virtually by millions of users every single second.
“If I have seen a little further it is by standing on the shoulders of Giants.” Sir Isaac Newton.
Like all culture-forming advancements, computing started as a forward-thinking idea. What if? How might we?
These questions, asked by the early visionaries of computing, created the foundation for a present that was unimaginable just a few generations ago. In this section, we pay tribute to some of the giants of computing.
Ada Lovelace
Computer Programmer, Mathematician
1815-1852
It would be hard to overstate Ada Lovelace’s contributions to the world of computing. At the age of 17, Lovelace met inventor and mathematician Charles Babbage and watched him demonstrate a model portion of his difference engine, an enormous mathematical calculating machine that has led to his being dubbed the “father of the computer.” Lovelace later wrote of how Babbage’s proposed Analytical Engine could be programmed with a code to calculate Bernoulli numbers, which some consider to be the first algorithm intended to be carried out by a machine and thus the first computer program.
Lovelace foresaw the multi-purpose functionality of the modern computer. Although Babbage believed the use of his machines was confined to numerical calculations, she mused that any piece of content—including music, text, pictures and sounds—could be translated to digital form and manipulated by machine.
Lovelace’s ideas about computing were so far ahead of their time that it took nearly a century for technology to catch up. During the 1970s, the U.S. Department of Defense developed a high-order computer programming language to supersede the hundreds of different ones then in use by the military. When U.S. Navy Commander Jack Cooper suggested naming the new language “Ada” in honor of Lovelace in 1979, the proposal was unanimously approved. Ada is still used around the world today in the operation of real-time systems in the aviation, health care, transportation, financial, infrastructure and space industries.
Alan Turing
Computer Scientist, Logician, Mathematician
1912-1954
Turing is widely considered the father of modern computing. In 1936, he developed the idea of the Universal Turing Machine, the theoretical basis for the stored-program computer. A decade later, in March 1946, he produced a detailed design for what was called the Automatic Computing Engine (ACE), a digital computer in today’s sense, storing programs in its memory.
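To give a flavor of the Universal Turing Machine idea, here is a toy sketch in Python: one simple, fixed mechanism that does whatever its rule table tells it to. The miniature rule table is invented for this page, not Turing’s own formulation.

```python
# Toy Turing-style machine: a single mechanism driven by a rule table.
# The "program" below (a made-up example) flips every 1 to 0 and 0 to 1.

# (state, symbol read) -> (symbol to write, head movement, next state)
RULES = {
    ("scan", "1"): ("0", +1, "scan"),
    ("scan", "0"): ("1", +1, "scan"),
    ("scan", " "): (" ", 0, "halt"),  # blank cell: stop
}

def run(tape, state="scan", head=0):
    cells = list(tape) + [" "]  # tape with a blank cell at the end
    while state != "halt":
        symbol, move, state = RULES[(state, cells[head])]
        cells[head] = symbol
        head += move
    return "".join(cells).strip()

print(run("1101"))  # prints 0010
```

Change the rule table and the very same machine performs a different task; that interchangeability is the essence of Turing’s universality.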
After the war, Turing worked at the University of Manchester, where electronic engineers had already demonstrated a very small stored-program computer, and there he focused on the use of computers. His main interest was investigating the power of a computer to rival human thought. In 1950, he published a philosophical paper that included the idea of an ‘imitation game’ for comparing human and machine outputs, now called the Turing Test. This paper remains his best-known work and was a key contribution to the field of Artificial Intelligence.
Turing was also instrumental in breaking the Nazi Enigma code during World War II. His work gave the Allies the edge they needed to win the war in Europe and shaped the development of the earliest computers.
The machines that began the age of the computer were massive, expensive, single-function, and, by today’s standards, capable only of basic computation. But whether they were analog or digital, decimal or binary, mechanical or electrical, these groundbreaking machines changed the course of history and laid the foundation for the remarkable digital technology environment that we know today.
NOTABLE INVENTIONS PRE-DATING THE MODERN PERSONAL COMPUTER
1645 FIRST CALCULATOR | Blaise Pascal
Pascal was a French mathematician and philosopher who developed a calculator called the “Pascaline.” The device used a series of toothed wheels, turned by hand, that could handle numbers up to 999,999,999. Pascal’s device was also called the “numerical wheel calculator” and was one of the world’s first mechanical adding machines.
1694 STEPPED RECKONER | Gottfried Leibniz
The Stepped Reckoner was a digital mechanical calculator invented by German mathematician Gottfried Leibniz. Based on his Leibniz wheel and completed in 1694, it was the first calculator that could perform all four arithmetic operations: addition, subtraction, multiplication, and division. By combining principles of arithmetic and logic, Leibniz imagined the computer as something more than a calculator: a logical, thinking machine. He was also one of the first to recognize that computing processes can be carried out much more easily using the binary number system.
1801 AUTOMATED LOOM | Joseph-Marie Jacquard
The Jacquard Loom was invented in 1801 by a Frenchman named Joseph-Marie Jacquard. For the first time, patterned silk could be woven automatically, with punch cards encoding the elaborate patterns, and in the process the loom transformed the textile industry. It also influenced the development of future programmable machines, including Charles Babbage’s Analytical Engine.
1834 ANALYTICAL ENGINE | Charles Babbage
The Analytical Engine was much more than a calculator; it marked the progression to fully fledged general-purpose computation. Essentially, it was a decimal digital general-purpose machine that was programmable and had many fundamental features found in the modern digital computer. It used punched cards, an idea borrowed from the Jacquard Loom, had a ‘Store’ where numbers and intermediate results could be held, and a separate ‘Mill’ where the arithmetic processing was performed. It was one hundred years ahead of its time.
1890 U.S. CENSUS TABULATOR | Herman Hollerith
Herman Hollerith devised a system to process and tabulate data for the 1890 U.S. census. It consisted of electrically operated components that captured and processed census data by “reading” holes on paper punch cards. The primary components of the system were a pantograph, a card reader, tabulator dials, and a sorting table. Modified versions of his technology would continue to be used at the Census Bureau until they were replaced by computers in the 1950s.
1931 DIFFERENTIAL ANALYZER | Vannevar Bush
The Differential Analyzer was the world’s first electromechanical analog computer. Vannevar Bush designed it to model power networks but quickly saw its value as a general-purpose analog computer. It filled a room with a complicated array of gears and shafts driven by electric motors. It solved problems in physics, seismology, and ballistics and inspired similar devices in the US, Britain, continental Europe, the Soviet Union, and Australia.
1943 COLOSSUS | Max Newman and Tommy Flowers
Colossus was the world’s first fully electronic, programmable, operational digital computer. It used vacuum tubes and was created at Bletchley Park by a team that included Tommy Flowers, Max Newman, and Alan Turing. It was not a general-purpose computer, however. Its mission was to decipher the encrypted messages between Hitler and his generals during World War II, and it is widely acknowledged to have shortened the war by many months, saving tens of thousands of lives.
1944 HARVARD MARK I | Howard Aiken
The Mark I was designed in 1937 by Harvard graduate student Howard H. Aiken to solve advanced mathematical physics problems encountered in his research. It expanded the concepts of Charles Babbage and was built in collaboration with IBM, the company that grew out of Herman Hollerith’s Tabulating Machine Company. It was easily programmable, digital (though decimal, not binary), and electromechanical rather than electronic. It could quickly change tasks, but because of its switches it was extremely slow, executing about three commands per second.
1945 ENIAC | Presper Eckert and John Mauchly
ENIAC was designed by Presper Eckert and John Mauchly and was the first machine to incorporate the full set of traits of a modern computer. It was programmable (though not easily), digital (though decimal, not binary), all-electronic, and very fast. Capable of executing 5,000 commands per second, ENIAC could solve in less than an hour an equation that would have taken Harvard’s Mark I close to eighty hours.
Please watch this two-minute film about the ENIAC machine.
In 1948, Alan Turing wrote that “We do not need to have an infinity of different machines doing different jobs. A single one will suffice.” But to accomplish that, there needed to be a way to program individual machines to perform a variety of tasks. And so the era of programming languages began, and it turned out to be an era shaped by women.
GRACE HOPPER
A remarkable programming pioneer, Grace Hopper was a naval officer who worked with Howard Aiken on the Harvard Mark I and later with Presper Eckert and John Mauchly on UNIVAC. One of her great strengths was her ability to translate scientific problems into mathematical equations and then articulate them in ordinary English. Because of this ability, she was assigned the job of writing what was to become the world’s first computer programming manual.
JEAN JENNINGS (BARTIK)
Born on a farm near Alanthus Grove, Missouri, Jean Jennings took a job at the University of Pennsylvania as a “computer,” as people who performed routine math tasks were called in those days, and began calculating artillery trajectory tables for the Army. Shortly thereafter she was selected to work on a new computing machine at Penn that was soon to be called ENIAC. She became one of its primary programmers and was one of the pioneers who developed the use of subroutines.
BETTY SNYDER (HOLBERTON)
After graduating from the University of Pennsylvania, Betty Snyder, along with five other women including Jean Jennings, was chosen as one of the original ENIAC programmers. A few years later she helped design the UNIVAC computer and, along with Grace Hopper, contributed to the development of the COBOL and Fortran programming languages. In 1997 she received the Augusta Ada Lovelace Award, the highest honor given by the Association for Women in Computing.
KAY MCNULTY (MAUCHLY)
Kay McNulty was one of the original ENIAC programmers. Her area of concentration was operating the Differential Analyzer, a huge analog machine of which there were only a few in the world. She and her colleague Fran Bilas led the team of women who used this machine to calculate ballistics equations. After the war, she continued with ENIAC, programming equations for some of the world’s foremost mathematicians. McNulty married Dr. John Mauchly, who, together with J. Presper Eckert, invented the ENIAC and UNIVAC computers, and the two of them worked on program designs and techniques for many years.
Watch these two videos showcasing early programmers, Grace Hopper and Jean Jennings.
To oversimplify, cloud computing is the ability for many users to virtually access the power, storage, and applications housed in physical computing infrastructure that is not part of their own network. Server farms, like the one just east of us in Bonner, provide massive capacity for IT functions at reduced prices to the consumer, because economies of scale make it advantageous to use services from a company that has already made the capital investment in hardware.
Terms you should know:
Infrastructure as a Service (IaaS): This is virtual access to fundamental computing resources such as servers, storage, and networking. Examples of IaaS include Cisco Metacloud and Amazon Web Services (see the code sketch after this list).
Platform as a Service (PaaS): This type of cloud computing offers virtual access to platforms such as operating systems, database services, and web servers. Examples of PaaS include Apache Stratos and Google App Engine.
Software as a Service (SaaS): This is a type of cloud computing through which providers make software accessible to customers over the internet. Examples of SaaS include Adobe Creative Cloud and Microsoft 365.
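To make IaaS concrete, here is a minimal sketch using boto3, the Python SDK for Amazon Web Services (one of the IaaS examples above). The machine image ID and region below are placeholders, not real values, and actually running this would require an AWS account and credentials.

```python
# A minimal IaaS sketch: renting a virtual server from AWS with boto3.
# The AMI ID and region are placeholders, not real values.
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

# Ask the provider for one small virtual machine carved out of its
# physical infrastructure -- the essence of Infrastructure as a Service.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder machine image
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)

print("Launched:", response["Instances"][0]["InstanceId"])
```

The point is not the particular calls but the model: a few lines of code now stand in for what once required buying and housing a room-sized machine.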
CLOUD HISTORY
The origins of modern-day cloud computing date back to the 1950s, when establishments like schools and corporations would invest in a single mainframe (often so large it could fill an entire room) that multiple users could access through point-to-point dumb terminals. The single mainframe was a function of economics, as mainframes were far too expensive to purchase for each individual user. This sharing of data storage and computing power among many users became a framework for future innovation.
The next big breakthrough for cloud computing happened in the 1970s, when virtual machines (VMs) were developed. VMs are essentially software computers that behave like hardware. This advancement allowed a single mainframe to run multiple operating environments at once (in the 1950s, a mainframe performed only a single function at a time).
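To illustrate the idea, here is a toy sketch in Python of a “computer implemented in software.” The miniature instruction set is invented for this page and does not correspond to any real machine, but the principle, software emulating hardware so that several such machines can share one physical host, is the same one behind 1970s VMs.

```python
# Toy virtual machine: software behaving like hardware.
# The instruction set below is made up for illustration only.

PROGRAM = [
    ("LOAD", 2),   # put 2 in the accumulator
    ("ADD", 40),   # add 40 to it
    ("PRINT",),    # output the result
    ("HALT",),     # stop the machine
]

def run(program):
    accumulator = 0
    pc = 0  # program counter, just like a hardware CPU's
    while True:
        instruction = program[pc]
        op = instruction[0]
        if op == "LOAD":
            accumulator = instruction[1]
        elif op == "ADD":
            accumulator += instruction[1]
        elif op == "PRINT":
            print(accumulator)  # prints 42
        elif op == "HALT":
            return
        pc += 1

run(PROGRAM)
```

Each call to run() is, in miniature, its own independent machine, which is why one physical mainframe could host several of them side by side.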
In the 1990s, US Government research (remember DARPA) and the development of the Internet and World Wide Web allowed ISPs to deliver cloud computing services to individuals, small businesses, and students through VPNs (virtual private networks). In the mid-2000s, the open-source cloud computing platform OpenNebula emerged, and NASA launched a cloud initiative of its own (fun fact: NASA has an entire site dedicated to its open-source code). By this time, cloud technologies were being developed by all of the big players; Google, Amazon, Microsoft, IBM, and many more would all debut their cloud computing platforms and services within a few years.
Today, cloud computing is part of our daily lives. Most people’s phones are backed up in the cloud, and personal photos, videos, and documents live on SaaS platforms like Google Docs; the examples of cloud computing in our day-to-day activities are abundant.
ASSIGNMENT: This five-episode documentary, The People’s Cloud, by Matt Parker traces the path from the advent of the Internet and World Wide Web to today’s uses of the cloud. Please watch all five episodes (9-12 minutes each) as this week’s assignment.