History of Digital Technology

People

Augusta Ada King, Countess of Lovelace

1815

First woman programmer. She created instructional routines (programs) to feed into a computer, and she understood how Charles Babbage's proposed steam-powered machine could perform mathematics without mistakes.
Merchant

Charles Babbage

1822 - 1849

Charles Babbage was born in England in 1791. He studied mathematics at Cambridge and received an honorary degree in 1814. Babbage had an idea for a “difference machine” in 1822. This machine would be able to do basic polynomial calculations. The government loaned him over £17,000 to build this machine, but it was never finished.

Babbage also had an idea for an “analytical machine” which would be able to do more complicated calculations. He created designs for this machine in 1835 and continued to work on its plans through 1842. He collaborated with the Countess of Lovelace on the programming of this machine. The machine was not built.
After creating the designs for the analytical machine, Babbage wrote new plans for a difference machine between 1846 and 1849. This new plan was smaller and used fewer parts. A working difference machine built from this second set of plans was completed in 1991 and is now at the London Science Museum. It is able to calculate numbers up to 31 digits long.
Doles

Jean E. Sammet

1928

Jean was a mathematician and computer scientist who developed the FORmula MAnipulation Compiler (FORMAC) programming language during her 27 years at IBM. FORMAC was the first widely used computer language for symbolic manipulation of mathematical formulas. She was also a member of the subcommittee which created COmmon Business-Oriented Language (COBOL).
Shaffer

Alan Turing

1937 - 1950

Turing was born in Britain and came to Princeton in 1936 to study mathematics and work towards his doctorate. He focused on problem solving using binary numbers and Boolean logic, and showed how this work could be used to program computers to carry out many different jobs based on instructions (or software) provided to the computer. These instructions were written in binary code and translated into holes punched into paper tape. The computer used this information to carry out the required job.
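
To illustrate the idea (a hypothetical Python sketch, not drawn from Turing's own work), a half adder built from two Boolean operations shows how logic alone can add binary digits:

    # Half adder: Boolean logic (XOR, AND) adding two binary digits.
    # Illustrative sketch only; not taken from Turing's papers.
    def half_adder(a, b):
        total = a ^ b   # XOR gives the sum bit
        carry = a & b   # AND gives the carry bit
        return total, carry

    for a in (0, 1):
        for b in (0, 1):
            print(a, "+", b, "->", half_adder(a, b))  # prints (sum, carry)
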
Turing wrote the paper “Computing Machinery and Intelligence” in 1950, which suggested that computers could mimic human intelligence in the future.
Doles

George Stibitz

Nov 1937

While working for Bell Labs, Stibitz invented the "Model K", named for the kitchen table where he built it. He was the first to use binary circuits to perform arithmetic operations.
Merchant

Hedy Lamarr

1942

Hedy Lamarr was an actress from an early age, when her parents sent her away to boarding school. Hedy married an arms manufacturer who was possessive and controlling, which led to the failure of their marriage. By keeping Hedy from acting, though, her husband may have given her a very big gift: time to develop another love, science. Hedy patented a Secret Communication System that made it impossible for enemies to intercept the radio signals guiding launched torpedoes, by quickly switching frequencies. The Navy didn't take advantage of this technology until after the patent had expired, which resulted in Hedy receiving no fame or fortune for her invention. The Electronic Frontier Foundation later uncovered the patent information and presented Hedy Lamarr, by then in her eighties, with its Pioneer Award. She died three years later. WiFi and most wireless connections are possible because of this discovery, now referred to as Spread Spectrum Communication Technology.
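
A rough sketch of the frequency-hopping idea (the key and channel count below are assumptions for illustration; Lamarr and George Antheil's patent synchronized hops mechanically, reportedly across 88 frequencies): a sender and receiver sharing a secret seed can derive the same pseudorandom hop sequence, while an eavesdropper without the seed cannot predict the next channel.

    import random

    CHANNELS = 88        # number of available radio channels (assumed)
    SHARED_KEY = 1942    # secret seed known only to sender and receiver (assumed)

    def hop_sequence(key, hops):
        """Derive a pseudorandom sequence of channels from a shared key."""
        rng = random.Random(key)
        return [rng.randrange(CHANNELS) for _ in range(hops)]

    sender = hop_sequence(SHARED_KEY, 6)
    receiver = hop_sequence(SHARED_KEY, 6)
    print(sender)               # e.g. a sequence of six channel numbers
    print(sender == receiver)   # True: both sides hop in lockstep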

O'Clair

Grace Hopper

07/1944

Grace Hopper was a mathematics genius and computer pioneer. She is especially well known for pioneering “user friendly” computer software to make computers more accessible. Her best known contribution to computing was the invention of the “compiler”, an intermediate program that translated English language instructions into the language of the target computer.
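
As a toy illustration of what a compiler does (a hypothetical sketch; this is not Hopper's A-0 system, and the opcodes are invented), a few lines of Python can translate English-like instructions into machine-style codes:

    # Toy "compiler": maps English-like verbs onto invented numeric opcodes.
    OPCODES = {"ADD": "01", "SUBTRACT": "02", "PRINT": "03"}

    def compile_line(line):
        """Translate one English-like instruction into a machine-style code."""
        verb, *operands = line.upper().split()
        return OPCODES[verb] + " " + " ".join(operands)

    for line in ["add x y", "subtract y z", "print z"]:
        print(compile_line(line))   # 01 X Y / 02 Y Z / 03 Z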
J. Jacobson

J.C.R. Licklider

1950 - 1965

J.C.R. Licklider's contribution to the development of the Internet consists of ideas, not inventions. He foresaw the need for networked computers with easy user interfaces. His ideas foretold graphical computing, point-and-click interfaces, digital libraries, e-commerce, online banking, and software that would exist on a network and migrate to wherever it was needed.
Scarborough

Jack Kilby

1958

While working as a new engineer at Texas Instruments, Jack Kilby built the first integrated circuit in 1958. These new integrated circuits were more reliable and less costly to produce. They also allowed many more transistors to be contained in a small area. Kilby created his original integrated circuit using germanium. Robert Noyce improved on Kilby’s design by making an integrated circuit using silicon.
Doles

Moore's Law

1965

In 1965, Moore predicted that the number of transistors on each circuit would double every year. In 1975, he refined his view to say that this "doubling" would occur every two years after seeing a “limit to the contribution of circuit device cleverness of another factor of four in component density”. According to Wikipedia, Moore’s Law is “the observation that over the history of computing hardware, the number of transistors on integrated circuits doubles approximately every two years”.

Moore’s Law still holds true today, as researchers are still finding ways to make chip components smaller and more powerful. Many people feel that Moore’s Law will hold for only approximately ten more years, due to the engineering challenges of getting chip features to measure as small as 5 nanometers, or .005 microns. To illustrate just how small this is, the width of one human hair measures approximately 100 microns!
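
A short sketch of the arithmetic behind the law (taking the Intel 4004's roughly 2,300 transistors in 1971 as an assumed baseline):

    # Moore's Law projection: transistor counts double every two years.
    # Baseline assumed here: Intel 4004 (1971), ~2,300 transistors.
    def projected_transistors(year, base_year=1971, base_count=2300):
        doublings = (year - base_year) / 2
        return base_count * 2 ** doublings

    for year in (1971, 1985, 2000, 2015):
        print(year, f"{projected_transistors(year):,.0f}")
    # 1971 -> 2,300; 2015 -> about 9.6 billion, the right order of magnitude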
J. Jacobson

Gordon E. Moore

1965

Gordon E. Moore observed that the number of components in integrated circuits had doubled every year from the invention of the integrated circuit in 1958 until 1965. He predicted that this trend would continue "for at least ten years."
This trend has, indeed, continued for more than 50 years. Researchers still argue about whether the trend will continue.
Martin

Clarence Ellis

1969

In 1969, Clarence Ellis became the first African American to receive a Ph.D. in Computer Science. Ellis was introduced to computers in 1958 when, at the age of 15, he got a part-time job as a security guard for an insurance company, guarding the company’s new and expensive computer. Although he was not allowed to operate the computer, he did read all of the operating manuals, which empowered him to play the hero one day when the computer technicians ran out of punch cards to complete an important project: Ellis was able to show them how to reuse old punch cards. He later worked on the ILLIAC IV, one of the world’s first supercomputers.

Shropshire

Elizabeth “Jake” Feinler

1972 - 1989

Elizabeth “Jake” Feinler pioneered and managed the ARPANET and the Defense Data Network (DDN) network information centers under the Department of Defense. Her group developed the first Internet “yellow-“ and “white-page” servers as well as the first query-based network host name and address servers. Basically, Elizabeth was the “go-to” person when you wanted a domain name. Her group also managed the Host Naming Registry for the Internet from 1972 – 1989 and developed the top-level domain naming scheme of .com, .edu, .gov, .mil, .org, and .net that we still use today.
J. Jacobson

Dr. Barowy starts programming

02/19/1972

Used a PDP-8/e and programmed in BASIC. Saved programs to paper tape to be loaded later. Data was encoded as zeros and ones -- a one was represented by a hole in the tape that light could pass through.
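
A hypothetical sketch of that encoding (real paper-tape formats differed; this just shows the zeros-and-ones idea, with '*' standing for a hole):

    # Render text as "punched tape": '*' marks a hole (1), '.' no hole (0).
    # Simplified illustration; real paper-tape formats differed.
    def punch(text):
        for ch in text:
            bits = format(ord(ch), "08b")        # 8 channels per row
            row = "".join("*" if b == "1" else "." for b in bits)
            print(row, ch)

    punch("HI")
    # .*..*... H
    # .*..*..* I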

Barowy

Nancy Hafkin

1975 - 2000

Nancy Hafkin is a pioneer and innovator in the areas of networking, development information, and electronic communications. Most of her work has been in Africa, where she helped build the continent’s ICT framework. She also contributed to the Association for Progressive Communications (APC), which gave over 10 African countries access to email during the early 1990s, before the Internet reached Africa.
J. Jacobson

H. Edward Roberts

1975

Creator of the first popular personal computer, the MITS Altair 8800. The MITS Altair had a 2.0 MHz Intel 8080 processor with 256 bytes of RAM (64K max), used paper tape, cassette, or floppy drive storage, and ran the CP/M operating system and BASIC.
Shaffer

Dr. Mark Dean

1980 - 2013

Dr. Mark Dean started working at IBM in 1980 and was instrumental in the invention of the Personal Computer (PC). He holds three of IBM's original nine PC patents and currently holds more than 20 total patents. The famous African-American inventor never thought the work he was doing would end up being so useful to the world, but he has helped IBM make instrumental changes in areas ranging from the research and application of systems technology circuits to operating environments. One of his most recent computer inventions occurred while leading the team that produced the 1-Gigahertz chip, which contains one million transistors and has nearly limitless potential.

Shropshire

Michael Dell

1984

Intrigued by the expanding world of computers and gadgetry, Dell purchased an early Apple computer at the age of 15 for the strict purpose of taking it apart to see how it worked. It was in college that Dell found the niche that would become his boom: the PC world was still young, and Dell realized that no company had tried selling directly to customers.
Scarborough

Tim Berners-Lee

1989

Tim Berners-Lee was born in 1955. Berners-Lee is a computer scientist who, in 1989, proposed the World Wide Web and was the first to have an HTTP client communicate with a server over the internet. He is also the founder of the World Wide Web Foundation and is known as the inventor of the World Wide Web.

Mongerson

Marina Umaschi Bers, PhD

2005

My personal favorite woman who has achieved fame in computing is Marina Bers, PhD. Professor Bers is adjunct faculty at Tufts University and works in the Eliot-Pearson Department of Child Development. In 2005 she received the Presidential Early Career Award for Scientists and Engineers. She landed two NSF grants at the same time, which is nearly unheard of. She has published two books, Blocks to Robots and Designing Digital Experiences for Positive Youth Development: From Playpen to Playground. For those interested in robotics, Dr. Bers has created an early childhood version of LEGO WeDo.

O'Clair

Hardware

Programmed Punch Cards

1801

Joseph-Marie Jacquard created the first hardware-programming punch cards, used to control weaving looms making fabric. The cards guided the patterns in the fabric.
Merchant

Relays

1835

Relays are essentially used to minimize the use of heavy-gauge wiring and large, expensive high-current switches. The relay was invented in 1835 by American scientist Joseph Henry in order to improve his version of the electrical telegraph, developed earlier in 1831. Konrad Zuse created the V1 (Z1), which was the first computer to use relays. This machine weighed about 2,204 lb. and consisted of 20,000 parts. There are many different types of relays; a typical relay measures about 4 cm.

Side note- The first computer bug was literally a moth, stuck in a relay of the Harvard Mark II, that kept it from functioning.

Garret Talarczyk

First use of vacuum tube

02/1906

Flowers Vacuum Tube

1934 - 1935

Flowers was successful in developing a system of vacuum tubes in which, when one failed, others could be switched on to keep the machine working. This made running the computer more reliable. The first setup had 3,000 vacuum tubes; by the time he developed Colossus, 1,500. This was the first time vacuum tubes worked together to power the same machine.--Martinez

Z1

1936

The Z1, the first binary computer, was used to explore several groundbreaking technologies in calculator development: floating-point arithmetic, high-capacity memory, and modules or relays operating on the yes/no principle. The Z1 occupied about 650 cubic feet.
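
The Z1 used its own 22-bit floating-point format, which is not reproduced here; as a stand-in, this Python sketch shows the same sign/exponent/mantissa idea using the modern 32-bit IEEE 754 layout:

    import struct

    def float_bits(x):
        """Show the 32-bit IEEE 754 pattern of x as sign | exponent | mantissa."""
        (raw,) = struct.unpack(">I", struct.pack(">f", x))
        bits = f"{raw:032b}"
        return f"{bits[0]} | {bits[1:9]} | {bits[9:]}"

    print(float_bits(6.5))   # 0 | 10000001 | 10100000000000000000000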

Shropshire

ENIAC

1943 - 1945

ENIAC (Electronic Numerical Integrator and Computer) was the first computer to use vacuum tubes. It was built because of a need for quick and accurate firing tables for military artillery targets during WWII. John Mauchly at the University of Pennsylvania had the idea to use vacuum tubes to make numerical calculations faster. Mauchly and his team were commissioned to build ENIAC. It was finished after WWII was over, but the computer was used to make calculations for other government projects during the Cold War.
ENIAC was very large and used over 17,000 vacuum tubes, 70,000 resistors, and 6,000 manual switches. ENIAC weighed about 25 tons and took up about 680 square feet. The cost to build the computer in 1945 was about $500,000. ENIAC could do about 5,000 addition calculations in one second. It was also estimated that more complex problems that might take a person up to 20 hours could be solved by ENIAC in about 15 minutes.
Doles

EDVAC

1944 - 1961

Computer engineers Mauchly and Eckert were joined by John von Neumann to improve on ENIAC by making plans for EDVAC (Electronic Discrete Variable Automatic Computer). This computer also used vacuum tubes. As with ENIAC, the military paid to have this computer built. EDVAC was superior to ENIAC in a few ways: EDVAC was a binary computer, and it was one of the first computers designed to keep its program stored in memory. ENIAC had to be reprogrammed manually.
EDVAC had about 6,000 vacuum tubes and it took up about 490 square feet. This computer weighed about 17,300 pounds. The cost to build EDVAC was about $500,000.
EDVAC was used by the Ballistic Research Laboratory from 1951 – 1961, with some improvements made to it during its life.

Doles

Invention of the transistor

1947

Bell Laboratories engineers John Bardeen and Walter Brattain invented the transistor in 1947 and announced the invention in 1948. The junction transistor was invented by their associate, William Shockley, a few months later, and all three jointly shared the Nobel Prize in Physics in 1956 for their invention.

Transistors were also more reliable than vacuum tubes. Changing from the vacuum tube to the transistor paved the way for making smaller, faster, and more efficient computers. There have been many different designs of transistors, but they are primarily made of different layers of materials deposited on a silicon substrate, hence the name Silicon Valley for the concentration of manufacturers in California.

Miniaturization is still continuing to this day. The University of Manchester's Transistor Computer, believed to be the first of its kind, was built in 1953 with a CPU speed of 58 MHz. Some of the computers of the Second Generation were:
1. IBM 1620: smaller than First Generation computers (about the size of a refrigerator) and mostly used for scientific purposes.

2. IBM 1401: small to medium in size, used for business applications.

3. CDC 3600: large in size, used for scientific purposes.
Luckhoff

Transistor

1947

The invention of the transistor was one of the most important developments leading to the personal computer revolution. The transistor was invented in 1947 and announced in 1948 by Bell Laboratory engineers John Bardeen and Walter Brattain. Because the transistor was so much smaller and consumed significantly less power, a computer system built with transistors was also much smaller, faster, and more efficient than a computer system built with vacuum tubes. Transistors are used in today’s computers, cell phones, and tablets.
For scale, 45 nm is the line width of the conduction channel in a modern MOSFET; about 1.5 billion modern transistors could fit inside a single vacuum tube from the ENIAC.

Garret Talarczyk

Invention of the Integrated Circuit (chip)

1958

Jack Kilby, an American engineer working at Texas Instruments, and Robert Noyce are credited with realizing the integrated circuit. Noyce built a circuit similar to Kilby's a few months later, and the two are generally credited as co-inventors. Kilby received the Nobel Prize in Physics in 2000. The sizes of computers started to shrink even more drastically than with 2nd generation computers. The integrated circuit led to the invention of the microprocessor. The chips were so small that they could be built into hand-held devices. For example, the Intel 8742 is an 8-bit microcontroller that includes a CPU running at 12 MHz, 128 bytes of RAM (random access memory), 2048 bytes of EPROM (erasable programmable read-only memory), and I/O on the same chip. This led to the invention of the microcomputer: small, low-cost computers that could be afforded by individuals as well as small businesses. After 1980, PCs became valuable household necessities.
Luckhoff

First Computers to use ICs

1963

The first computer credited with using integrated circuits is the Apollo Guidance Computer, developed for the Apollo Program in the early 1960s. This computer weighed 70 lbs., with dimensions of 24 x 12.5 x 6.5 inches, and ran at 2.048 MHz. It provided computation and electronic interfaces for guidance, navigation, and control of the spacecraft.
Annmarie Mazzocchi

CDC 6600

1964

This was a mainframe computer used by UC Berkeley for high-energy nuclear physics research. It was the fastest computer until 1969, performing about one megaflop (one million floating-point operations per second). It was roughly the size of a table.

Mongerson

First School Computers

1965 - 1967

In 1965, school administrators first got computers.
In 1967, the first vocational programs appeared for students to learn computer maintenance. Did most schools use them for student scheduling like my high school did?
Merchant

Moore's Law

1965

Moore's law is the observation that, over the history of computing hardware, the number of transistors on integrated circuits doubles approximately every two years. The law is named after Intel co-founder Gordon E. Moore, who described the trend in his 1965 paper and predicted that it would continue for at least ten years. (It did.)
Martin

Software

VisiCalc spreadsheet

1978

The Apple II was purchased in the thousands by finance workers on the strength of this program. You could use VisiCalc to balance your checkbook, keep track of credit card purchases, calculate your net worth, and do your taxes. It came along in 1978.
Scarborough

WordStar

1979

WordStar was the most popular word processor during much of the 80s.
Martin

VisiCalc

1979

VisiCalc was the first spreadsheet computing program. It was created for the Apple II platform and propelled the Apple from being a toy to a useful tool in business. (It came out two years before the IBM PC).
Martin

WordStar

1980

WordStar was the most popular word processor during much of the 1980s. WordStar was the first commercially successful word processing software program produced for microcomputers and the best selling software program of the early eighties.
Scarborough

MS-DOS

June 1980

IBM shipped the PC running MS-DOS in 1981. MS-DOS ran effectively, but its "cryptic language" made things difficult to understand.
Merchant

Lotus 1-2-3

01/26/83

Lotus 1-2-3 was a spreadsheet created for the IBM PC that significantly contributed to IBM's success in the corporate environment. It was the IBM PC's first "Killer Application."
Martin

Mac OS

1984 - 2013

The Mac Operating System was a graphical user interface (GUI) initially released with the first Mac in 1984. The intent of the GUI was to make the computer more "intuitive" and easy for a user to navigate, rather than requiring the user to learn a complicated command-line system. The GUI consisted of a system of folders and a main command program called Finder. Mac OS X and above are more advanced versions.--Martinez

Windows 1.0

Nov. 20, 1985

The first Windows release allowed running several programs at a time. There were drop-down menus, scroll bars, icons, and dialog boxes. This program was easier to learn and use.
Merchant

Killer App

1989

A program so useful or desired that purchasing the actual computer or operating system becomes of utmost importance. For example, I bought a Nintendo 3DS because it is the only platform where I can play Luigi's Mansion: Dark Moon (when it is released later this month, that is).--Martinez

Adobe Photoshop

1990

When a program is so instrumental that its name becomes a verb meaning "to manipulate or fabricate an image," it earns distinction as a killer app.

O'Clair

Killer Application

1990

A Killer Application, or Killer App, is a piece of software so desirable by itself that it makes the hardware needed to run it irresistible. For example, one might have said, "I love playing Space Invaders so much, I'm going to buy an Atari." Typically the hardware drives the need for and use of the software, but not with a Killer App. This term was used most frequently in the 1990s in reference to video gaming.
O'Clair

Microsoft Windows 95

08/24/1995

Microsoft Windows 95 was released on August 24, 1995, and sold more than 1 million copies within 4 days.

Final Fantasy VII Killer APP for PlayStation

01/31/1997

Killer app for the PlayStation game console, alongside the Metal Gear Solid series.
Aviles

GoldenEye Killer APP for N64

08/25/1997

GoldenEye shared the title of killer app for the Nintendo 64 system with The Legend of Zelda: Ocarina of Time.

Aviles

Linux "Sun Solaris" Operating System

11/1998

The Sun Solaris 7 operating system, a Unix system from Sun Microsystems, was released; it was used for workstations and servers.

Mac OS X

1999

Mac OS X, designed exclusively for Apple servers and desktops, is based on a Unix kernel. This operating system provided a GUI for Mac servers starting in 1999 and for desktops in 2001. By 2002, all Mac computers were shipping with Mac OS X; however, there was still an option to start up with the Classic OS 9. All Mac OS X releases are named after big cats: Mountain Lion, Snow Leopard, and so on.
Annmarie Mazzocchi

Windows XP

10/25/2001

Microsoft Windows XP was released on October 25, 2001. It was for the PC.

HALO Killer APP for Xbox Game Console

11/15/2001

The Halo series is considered the killer app of the Xbox and Xbox 360. The game was so popular that consumers bought the Xbox and Xbox 360 just to play it. Killer apps did not originate in the gaming community; spreadsheets and word processors were the first killer apps. Halo was used for entertainment as a sci-fi FPS (first-person shooter). All hail Master Chief!
Aviles

Android

2008

Android is a Linux-based OS found on smartphones and tablets.
Annmarie Mazzocchi