The Very First Computer (~500 BC)

 

The Abacus

An Abacus

The very first “computer” is known as the abacus. First used in China around 500 BC, the
abacus served to assist people with basic math. This was the first time a machine was
used to help people with calculations, and the concept would go on to inspire the
creation of the computer, making the abacus the very first “computer”.

An abacus (also known as a counting tray) is one of the oldest devices used to assist
people with basic number crunching and arithmetic. An abacus consists of a
bamboo or wooden frame with beads sliding on wires (as pictured above).
Originally beans or stones were used instead of beads, but the general structure of
the abacus has remained mostly the same throughout the years. A video tutorial on how
to use an abacus is linked in the Sources section.
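
An abacus is essentially a place-value register: each rod (wire) holds one decimal digit, and carrying a sum past 9 means pushing a bead on the next rod over. Below is a minimal Python sketch of that idea; the rod layout is simplified and the function names are illustrative, not a model of any particular abacus style.

```python
# Each rod on the abacus holds one decimal digit, least significant rod first.
def to_rods(n):
    """Represent a non-negative integer as a list of digits (one per rod)."""
    rods = []
    while True:
        rods.append(n % 10)
        n //= 10
        if n == 0:
            return rods

def add_on_abacus(a_rods, b_rods):
    """Add two rod representations the way an abacus user would:
    digit by digit, carrying a bead to the next rod when a rod passes 9."""
    result, carry = [], 0
    for i in range(max(len(a_rods), len(b_rods))):
        total = carry
        if i < len(a_rods):
            total += a_rods[i]
        if i < len(b_rods):
            total += b_rods[i]
        result.append(total % 10)   # beads left on this rod
        carry = total // 10         # bead carried to the next rod
    if carry:
        result.append(carry)
    return result

print(add_on_abacus(to_rods(478), to_rods(256)))  # [4, 3, 7], i.e. 734
```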

 

Brief History of the Abacus and its Legacy

The name “abacus” is derived from the Greek word “abax”, which means “calculating table”. The exact origins of the abacus are unknown, but its oldest recorded use dates back to roughly 500 BC in China. Nowadays abacuses are mostly used as a method of teaching kids how to add, subtract, multiply and divide, but from around 500 BC to the nineteenth century, abacuses were used everywhere. They were used by merchants to calculate profits, by builders to count required materials, as a teaching tool, as a way for governments to gather statistics, and much more. As technology advanced, the abacus was replaced, but its legacy survives to this day: it inspired the creation of the machines that would eventually become the computer we know today.

 

The First Mechanical Computer (1822)

 

The Difference Engine

Working Model of the Difference Engine

In 1822, English mathematician and engineer Charles Babbage designed the first machine capable of computing tables of numbers. Known as “The Difference Engine”, it is often called the world’s first computer and was, in essence, a very large, slow and basic calculator: it could tabulate polynomial functions to many decimal places using little more than repeated addition. The Difference Engine was a major advancement in technology; the concept of having a machine perform menial tasks such as basic math was very new and would go on to inspire countless inventions with a similar purpose. The Difference Engine was 8 ft (2.4 m) tall, weighed 15 tons (13,600 kg) and was designed to be driven by steam. A more in-depth look at how the engine computed its tables appears below, and a working model of the Difference Engine can now be seen at the Science Museum in London.
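
The trick behind the Difference Engine is the method of finite differences: for a polynomial of degree n, the n-th differences between successive values are constant, so an entire table can be produced with nothing but addition. Here is a minimal Python sketch of that method; the example uses x² + x + 41, a polynomial Babbage himself used in demonstrations.

```python
def tabulate(initial_values, count):
    """Tabulate a degree-n polynomial by the method of finite differences.

    initial_values: the first n+1 values of the polynomial, from which the
    difference columns are derived. After setup, every further table entry
    is produced purely by addition, exactly what the engine mechanized.
    """
    # Build the column of differences from the first n+1 values.
    diffs = list(initial_values)
    columns = [diffs[0]]
    while len(diffs) > 1:
        diffs = [b - a for a, b in zip(diffs, diffs[1:])]
        columns.append(diffs[0])

    table = []
    for _ in range(count):
        table.append(columns[0])
        # Cascade the additions: each column absorbs the one below it.
        for i in range(len(columns) - 1):
            columns[i] += columns[i + 1]
    return table

# f(x) = x^2 + x + 41: degree 2, so 3 seed values are enough.
first = [x * x + x + 41 for x in range(3)]
print(tabulate(first, 8))  # [41, 43, 47, 53, 61, 71, 83, 97]
```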

The Difference Engine’s Successors

Despite having a working prototype, the British government halted funding for the project, and it was later labelled a failure. Plans for “The Difference Engine No. 2” were drawn up but couldn’t be completed due to lack of funding. The Difference Engine was succeeded by “The Analytical Engine” in 1837, also designed by Charles Babbage. The Analytical Engine was designed to do everything the Difference Engine did while being faster, cheaper, less power-hungry and considerably smaller. It was also designed to store numbers in a limited memory, and it experimented with the idea of using punch cards to deliver instructions to the machine.

Charles Babbage

Charles Babbage

Born on the 26th of December 1791 in London, Charles Babbage grew up with a strong interest in mathematics and entered the University of Cambridge in 1810. After graduating, Babbage was hired by the Royal Institution, where he gave lectures on calculus and other mathematics and did some work for the British government. During the 1820s and 1830s, Babbage worked on the Difference Engine and the Analytical Engine; he also held a professorship in mathematics at Cambridge during this period, while occasionally offering his expertise to the British government and the Royal Institution.

 

Punch Card Computing (1890)

 

The U.S. Census Crisis

After the U.S.A. gained its independence from the U.K. in 1776, it drew up its own constitution. One very important clause in the constitution stated that the U.S. would perform a census every 10 years. By 1890, the United States of America was having a major census problem: the numbers of immigrants, people, businesses, farms, etc. were simply growing too quickly for the American government to count using the traditional method of hand tallying. At the rate it was going, the 1890 census wasn’t going to be complete until well into the 1900s, and so there was a call for a new method of tracking and counting statistics.

 

The Tabulating Machine

Tabulating Machine and Storage Unit

In 1887, while the hand count of the 1880 census was still dragging on, American engineer Herman Hollerith had the idea to use a machine to input and keep track of the data during the census. This new machine was named “The Tabulating Machine” and worked on a punch card system. Punch cards were inserted into a “reader”; depending on which holes were punched, the machine would advance the counter for each punched hole by one, and each “hole counter” represented a different statistic (e.g. the number of males in a family). A video that goes into much greater detail on how the Tabulating Machine worked is linked in the Sources section (it’s a lot easier to see how it works than to read about it). The Tabulating Machine accelerated the counting of the census by roughly 8 years and saved the U.S. government 129 million dollars (adjusted for inflation). Despite its seemingly narrow purpose, the Tabulating Machine made a much bigger impact than expected: it showed the world that computers could be used for more than just calculation, and that there was a lot of creative freedom to be had with them.

 

Herman Hollerith

Herman Hollerith

Herman Hollerith was born to German immigrants in 1860 in Buffalo, New York. He performed poorly in school but was still able to enroll at the City College of New York in 1875, and later at the Columbia School of Mines, where he became the assistant to professor W.P. Trowbridge. He followed Trowbridge to the 1880 census, where he worked as a statistician. It was during this work that he realised a more efficient way of counting the census was required, and he got to work creating the Tabulating Machine. After the rousing success of the Tabulating Machine in the 1890 census, Hollerith went on to sell the machine to governments around the world, and he used the money to create a company that would eventually become IBM.

 

The Beginning of Computer Programming (1936)

 

The Turing Machine

In 1936, British mathematician Alan Turing described the first programmable computer, a theoretical device now known as “The Turing Machine”. This machine was unique in that it could be programmed to do different things, as opposed to being built with only one function. The Turing Machine was also the first computer capable of following an algorithm. An algorithm is similar to a flow chart: it poses a question that the computer must answer, and depending on that answer, the machine receives new instructions and then a new question (or the same question again), answers it, and the cycle continues. The Turing Machine was the first machine that could be programmed to follow many different algorithms.

The machine worked as follows. A reader was fed a piece of rewritable tape that could be as long as it needed to be. The tape was split into sections, each containing a 1, a 0 or a blank space. The reader would look at the current section and act on the instruction the algorithm gave it (e.g. if section 1 contains a 0, erase the 0, replace it with a 1, then move to section 19), and then follow the rest of the algorithm. The idea was that you would put a question in coded form on a piece of tape, and by the time the tape exited the machine, it would have been rewritten into the answer. A video going into greater detail about the Turing Machine is linked in the Sources section.
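
The description above translates almost directly into code. Below is a minimal Python sketch of a Turing machine; the transition table shown is a made-up example that flips a string of bits and then halts.

```python
def run_turing_machine(program, tape, state="start", head=0, max_steps=1000):
    """Simulate a Turing machine.

    program maps (state, symbol) -> (symbol_to_write, move, next_state),
    where move is -1 (left) or +1 (right). The tape is a dict so it can
    grow without bound in either direction, like Turing's unlimited tape.
    """
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, " ")       # blank cells read as " "
        write, move, state = program[(state, symbol)]
        tape[head] = write                 # rewrite the current section
        head += move                       # step to the next section
    return tape

# Example program: walk right, flipping every bit, halt at the first blank.
flip_bits = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", " "): (" ", 0, "halt"),
}

tape = {i: c for i, c in enumerate("10110")}
result = run_turing_machine(flip_bits, tape)
print("".join(result[i] for i in sorted(result)).strip())  # -> 01001
```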

 

The Purpose and Legacy of the Turing Machine

The Turing Machine itself is useless in practice; it serves no real purpose. But the concept of a computer being given direct instructions and then a question, being able to answer that question, and then performing a task based on that answer is one of the largest jumps in the history of computing, and it has become the basis of modern computing to this day. No computer created today can do anything more than what the Turing Machine can do; most modern computers serve the same purpose the Turing Machine does: they receive instructions and perform actions based on those instructions, no more, no less.

 

Alan Turing

Alan Turing

Born on June 23, 1912, in London, England, Alan Turing grew up with a talent for mathematics and studied at a top private school. He later graduated from the University of Cambridge and proceeded to work as a mathematician and logician. In 1936, Turing published his landmark answer to the iconic Entscheidungsproblem, showing that no machine could solve it in general. Following the success of this work, he went on to serve as one of Britain’s code breakers during WW2 and the Cold War. He also laid the groundwork for the creation of artificial intelligence, or A.I. for short. In 1952, he was charged with “gross indecency”, that is to say, homosexuality (which was a crime in Britain at the time). He chose to undergo 12 months of hormone therapy as opposed to imprisonment and hard labour. He committed suicide in 1954 because of the effects the hormone therapy had on his body.

 

The First General Purpose Computer (1945)

 

ENIAC

The ENIAC Super Computer

In 1945, a team of American engineers led by John Mauchly and J. Presper Eckert created the world’s first fully electronic general purpose computer, which they named the Electronic Numerical Integrator And Computer, or ENIAC for short. ENIAC was the fastest computer of its time: it was completely programmable and could solve, at lightning speed, math problems previously thought solvable only by humans. For example, when calculating the trajectory of a missile travelling thousands of miles from the US, it would take a single human roughly 20 hours to finish a single trajectory, with plenty of room for human error; ENIAC could calculate that same trajectory in about 30 seconds. ENIAC filled a 20 m by 30 m room, cost some $95 million (adjusted for inflation), often broke down, weighed 27,000 kg and consumed 160 kilowatts of power whenever it ran. It was built in Philadelphia, where parts of ENIAC can still be visited at the University of Pennsylvania’s engineering school.
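
Firing-table work of the kind described above boils down to numerically integrating the equations of motion, one small time step at a time. Here is a toy Python sketch using simple Euler integration; the launch parameters are arbitrary and air resistance is ignored.

```python
import math

def trajectory(speed, angle_deg, dt=0.01, g=9.81):
    """Integrate a projectile's flight with Euler steps until it lands.

    Returns (range_m, flight_time_s). Each step updates position from
    velocity and velocity from gravity, the same arithmetic that human
    "computers" once ground through by hand for each trajectory.
    """
    angle = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = speed * math.cos(angle), speed * math.sin(angle)
    t = 0.0
    while True:
        x += vx * dt
        y += vy * dt
        vy -= g * dt
        t += dt
        if y <= 0.0:
            return x, t

rng, tof = trajectory(speed=500.0, angle_deg=45.0)
print(f"range ~{rng / 1000:.1f} km, time of flight ~{tof:.0f} s")
```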

ENIAC’s Successors and Legacy

John Mauchly (left) and Dr Presper Eckert Jr with ENIAC

Even though ENIAC was built to be used only by the government and military, the concept of one general purpose computer, instead of many computers each with a different purpose, became very popular, and a successor to ENIAC was built to be sold commercially. In 1949, John Mauchly and J. Presper Eckert tried their hands at creating a version of ENIAC that businesses and companies could use more easily, and thus the Binary Automatic Computer (BINAC) was born, although it never succeeded commercially and proved economically unviable. Using what they learned from the BINAC, Mauchly and Eckert created the Universal Automatic Computer (UNIVAC). The UNIVAC was originally created to be sold to the United States Census Bureau, and it became famous for its surprisingly accurate prediction of the 1952 presidential election (full details in the Sources section). The UNIVAC would go on to be built in different models and sold to many big-name companies, but it wasn’t practical for household use. It wouldn’t be until 1977 that the first household computers were released to the public. These machines, such as the Apple II and the Commodore PET, were small enough to fit on a table and required little to no prior technical knowledge to use.

 

The First Computing Language (1953)

 

COBOL

In 1953, American computer scientist Grace Hopper proposed the then radical idea of a programming language built from English words; that idea grew into the world’s first widely adopted human-readable business language, the Common Business Oriented Language (COBOL, finalised in 1959). Before COBOL, computer languages consisted almost entirely of numbers and were very limited in what they could express. The creation of a human-like programming language meant that it was not only much easier for people to write code, but also much easier for people to read code and be more creative about how the computer could get a task done. COBOL was also a major contributor to the rise of mainframes, also known as “Big Iron”. Mainframes are similar to supercomputers but have a much different purpose: while supercomputers focus on mass number crunching, mainframes focus on moving and processing large volumes of data, such as transactions, as quickly and reliably as possible. That ability to handle data with no downtime is relied on constantly in the modern day; the most common use of mainframes is in processing bank transactions. A video that goes into greater detail about mainframes is linked in the Sources section.
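
To get a feel for what “English-like” means in practice, here is a tiny hypothetical Python sketch that interprets a few COBOL-flavoured statements. The statement shapes and variable names are invented for illustration; real COBOL is far richer than this.

```python
# A toy interpreter for two COBOL-flavoured statement shapes:
#   "ADD <number> TO <variable>"
#   "DISPLAY <variable>"
def run(program):
    variables = {}
    for line in program:
        words = line.split()
        if words[0] == "ADD" and words[2] == "TO":
            name = words[3]
            variables[name] = variables.get(name, 0) + int(words[1])
        elif words[0] == "DISPLAY":
            print(words[1], "=", variables.get(words[1], 0))

run([
    "ADD 100 TO TOTAL-SALES",
    "ADD 250 TO TOTAL-SALES",
    "DISPLAY TOTAL-SALES",   # prints: TOTAL-SALES = 350
])
```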

COBOL would go on to help pave the way for the many high-level programming languages that followed, such as C, Python, JavaScript and many more.

 

Grace Hopper

Grace Hopper

Born on December 9, 1906, in New York City, New York, Grace Hopper was known for many things, among them being one of the first women to receive a master’s degree in mathematics from Yale University (1930). After the US entered World War 2 in December 1941, Grace joined the US Navy (in 1943), where she earned the rank of lieutenant. When the war drew to a close, Grace returned to her mathematical roots and became a computer programmer, but she remained in the Navy as a reserve officer until her retirement in 1966. While working as a programmer, Grace helped program the UNIVAC in 1949, and during the 1950s she led the team of programmers whose work laid the foundation for COBOL. In 1967, Grace was recalled to the Navy to work on the military’s data processing and communications until her final retirement in 1986, at age 79.

 

The First GUI (1973)

 

GUI

One of the First GUI Models

In 1973, a company known as Xerox invented the Graphical User Interface, or GUI for short. The GUI is a method of interaction between the user of a computer and the computer itself, built around graphical icons and visual indicators. To allow such interaction, many new “input devices” were created to be used with the GUI. Input devices are computer peripherals that allow the user to feed information into the computer; before the GUI, only two input devices were commonly used: the keyboard and the punch card reader. Many input devices were tried and tested, but the most effective proved to be the computer mouse, which is still used to this day. Other input devices that became popular during this time include the joystick, the light pen (a pen pressed directly against the screen), other variations of the computer mouse, trackballs, and smaller keyboards with arrow keys. Although the GUI began on Xerox’s own machines, other large tech companies like Apple, Microsoft and Compaq would get their hands on the concept and use it for their own home computers.
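
At its core, a GUI is an event loop: the program draws graphical elements, then waits for input-device events (mouse clicks, key presses) and dispatches them to handlers. Here is a minimal Python/tkinter sketch of that idea; the window contents are arbitrary.

```python
import tkinter as tk

def on_click(event):
    """Handler invoked by the event loop whenever the canvas is clicked."""
    label.config(text=f"mouse clicked at ({event.x}, {event.y})")

root = tk.Tk()
root.title("Minimal GUI sketch")

# A drawable surface standing in for a desktop full of icons.
canvas = tk.Canvas(root, width=300, height=200, bg="white")
canvas.create_rectangle(120, 70, 180, 130, fill="lightblue")  # an "icon"
canvas.bind("<Button-1>", on_click)   # route mouse input to our handler
canvas.pack()

label = tk.Label(root, text="click anywhere")
label.pack()

root.mainloop()   # the event loop: wait for input, dispatch, repeat
```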

 

Miscellaneous

Some other important jumps in computer history include (but aren’t limited to):

 

The Invention of the Operating System (1969)

The invention of the operating system allowed much more efficient communication between the user, the hardware and software applications. Although Unix, one of the most influential operating systems, was created at Bell Labs in 1969, the advancement and popularisation of the operating system is generally credited to Microsoft and Apple. A video explaining operating systems in greater detail is linked in the Sources section.

The Invention of the Floppy Disk (1971)

The invention of the floppy disk in 1971 by IBM engineer Alan Shugart made it dramatically easier to transfer information from one computer to another. Before the floppy disk, moving data between machines typically meant re-entering it by hand or relying on bulkier media such as punched cards or magnetic tape; needless to say, this was a long and tedious process. Floppy disks would later be made obsolete by the advancement of technology and replaced by CDs and USB keys.

The Invention of the Smartphone (1992)

Invented by IBM in 1992, the smartphone fused the functionality of the cell phone and the computer to create a new, more futuristic type of device. This very first smartphone could make calls, send texts from its touch screen, create and save small documents, and even read and send emails.

 

Blockchain

Blockchain

When performing a transaction with another person over the internet, there’s one person sending a payment, one person sending a product or service in exchange for that payment, and a middle man who, for a small fee, ensures that the correct amount of money is sent to the correct person and that the correct product is delivered. In 2008, a person (or group) writing under the pseudonym Satoshi Nakamoto invented a method of facilitating this process through a system known as blockchain. A blockchain is a continuously growing list of records, called blocks, which are linked and secured using cryptography. Each block usually contains a timestamp, transaction data, and a cryptographic hash of the previous block (a minimal sketch of this structure appears after the list below). Blockchain has 4 main purposes:

1. Facilitate communication between buyer and seller before, during and after their transaction.
2. Make it easier to send and receive money during a transaction.
3. Reduce the fee required to complete said transaction.
4. Remove the middle man from the transaction, meaning that the buyer can easily and safely send the correct amount of money to the seller, and the seller can safely send the product and receive the money, without the need for someone else to regulate all of it.
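
As described above, each block carries a timestamp, its data and the hash of the previous block, so tampering with any block breaks every hash that comes after it. Below is a minimal Python sketch of that chaining; it illustrates the data structure only, not a real cryptocurrency network.

```python
import hashlib
import json
import time

def block_hash(block):
    """Hash a block's contents (timestamp, data, previous hash)."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    """Append a block that commits to the hash of the block before it."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"timestamp": time.time(), "data": data, "prev_hash": prev})

def verify(chain):
    """Recompute every link; any edited block breaks the chain after it."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
add_block(chain, "Alice pays Bob 5 coins")
add_block(chain, "Bob pays Carol 2 coins")
print(verify(chain))                          # True
chain[0]["data"] = "Alice pays Bob 500 coins" # tamper with history
print(verify(chain))                          # False: the first link breaks
```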

Bitcoin and Other Cryptocurrencies

Cryptocurrency is a digital currency that secures its transaction information using cryptography and exchanges it through a blockchain. Bitcoin is the first and most well-known cryptocurrency and is the primary currency used in transactions that utilise blockchain. Since its creation in 2009, over 4,000 bitcoin alternatives have been created. These alternatives are commonly referred to as “altcoins”.

Sources:

Written and researched by Alexander K.

The First Computer:

https://www.livescience.com/20718-computer-history.html
https://www.computerhope.com/jargon/a/abacus.htm
http://brainomagic.com/why-brain-o-magic-old/abacus-and-its-history/

The First Mechanical Computer:

http://www.computerhistory.org/babbage/engines/
http://www.bbc.co.uk/history/historic_figures/babbage_charles.shtml

 

Punch Card Computing:

https://www.youtube.com/watch?v=9HXjLW7v-II
https://www.census.gov/history/www/census_then_now/notable_alumni/herman_hollerith.html
https://www.census.gov/history/www/innovations/technology/the_hollerith_tabulator.html
http://www-03.ibm.com/ibm/history/ibm100/us/en/icons/tabulator/

The Beginning of Computer Programming:

https://www.explainthatstuff.com/historyofcomputers.html
https://www.youtube.com/watch?v=dNRDvLACg5Q
http://www.turing.org.uk/publications/dnb.html
https://www.britannica.com/biography/Alan-Turing

The First General Purpose Computer:

https://www.youtube.com/watch?v=k4oGI_dNaPc
https://www.edn.com/electronics-blogs/edn-moments/4398199/BINAC-gets-under-way--October-9--1947
http://ethw.org/UNIVAC_and_the_1952_Presidential_Election

The First Computing Language:

https://devops.com/the-beauty-of-the-cobol-programming-language-v2/
https://www.biography.com/people/grace-hopper-21406809

The First GUI:

https://searchwindevelopment.techtarget.com/definition/GUI
http://www.catb.org/esr/writings/taouu/html/ch02s05.html

Miscellaneous:

http://www.businessinsider.com/worlds-first-smartphone-simon-launched-before-iphone-2015-6
https://www.britannica.com/technology/floppy-disk
https://www.webopedia.com/TERM/O/operating_system.html
http://www.mobileindustryreview.com/2016/10/the-history-of-the-smartphone.html

Blockchain:

https://hbr.org/2017/02/a-brief-history-of-blockchain
https://www.investopedia.com/terms/b/blockchain.asp