1st woman programmer: Ada Lovelace created instructional routines to feed the computer. She understood that Charles Babbage's machine could use steam power to perform math with no mistakes. Merchant
Charles Babbage was born in England in 1791. He studied mathematics at Cambridge and received an honorary degree in 1814. Babbage had an idea for a "Difference Engine" in 1822. This machine would be able to do basic polynomial calculations. The government gave him over £17,000 to build this machine, but it was never finished.
Babbage also had an idea for an "Analytical Engine," which would be able to do more complicated calculations. He created designs for this machine in 1835 and continued to work on its plans through 1842. He collaborated with the Countess of Lovelace on the programming of this machine. The machine was never built.
After creating the designs for the Analytical Engine, Babbage wrote new plans for a Difference Engine between 1846 and 1849. This new design was smaller and used fewer parts. A working Difference Engine built from this second set of plans was completed in 1991 and is now at the London Science Museum. It can calculate numbers up to 31 digits long.
Jean Sammet was a mathematician and computer scientist who developed the FORmula MAnipulation Compiler (FORMAC) programming language during her 27 years at IBM. FORMAC was the first widely used computer language for symbolic manipulation of mathematical formulas. She was also a member of the subcommittee that created the COmmon Business-Oriented Language (COBOL).
Turing was born in Britain and came to Princeton in 1936 to study mathematics and work toward his doctorate. He focused on problem solving using binary numbers and Boolean logic, and he applied that work to programming computers to carry out many different jobs based on instructions (or software) provided to the machine. These instructions were written in binary code and translated into holes punched into paper tape, which the computer read to carry out the required job.
Turing wrote the paper "Computing Machinery and Intelligence" in 1950, which suggested that computers could mimic human intelligence in the future.
While working for Bell Labs, Stibitz invented the "Model K," named for the kitchen table where he built it. He was the first to use binary circuits to perform arithmetic operations.
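The Model K was a one-bit binary adder built from relays. A minimal sketch of the same Boolean logic in software (an illustration, not Stibitz's actual relay circuit):

```python
# Sketch of binary addition from Boolean logic, the operation relay
# circuits like the Model K performed in hardware.

def full_adder(a, b, carry_in):
    """Add two bits plus a carry; returns (sum_bit, carry_out)."""
    sum_bit = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return sum_bit, carry_out

def add_binary(x, y, width=8):
    """Ripple-carry addition of two integers, one bit at a time."""
    carry = 0
    result = 0
    for i in range(width):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(add_binary(5, 3))  # 8
```

Each relay acts as a yes/no switch, so chaining a few of them gives exactly the AND/OR/XOR operations above.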
Hedy Lamarr was an actress from an early age, after her parents sent her away to boarding school. She married an arms manufacturer who was possessive and controlling, which led to the failure of their marriage. By keeping Hedy from acting, though, her husband may have given her a very big gift: time to develop another love, science. Hedy is responsible for patenting a Secret Communication System that made it impossible for enemies to intercept the radio signals guiding launched torpedoes, by quickly switching frequencies. The Navy didn't take advantage of this technology until after the patent had expired, so Hedy received neither fame nor fortune for her invention. The Electronic Frontier Foundation later uncovered the patent and presented Hedy Lamarr, then in her eighties, with its Pioneer Award. She died three years later. WiFi and most wireless connections are possible because of this discovery, now referred to as spread spectrum communication technology.
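The core idea of frequency hopping can be sketched in a few lines: sender and receiver share a secret seed, so both generate the same hop sequence, while an eavesdropper cannot predict which channel carries each symbol. (The 88 channels echo the patent's piano-roll mechanism; the seeded random generator is a modern stand-in for it.)

```python
import random

CHANNELS = 88  # the 1942 patent used 88 frequencies, like piano keys

def hop_sequence(seed, length):
    """Pseudo-random channel schedule; same seed -> same schedule."""
    rng = random.Random(seed)
    return [rng.randrange(CHANNELS) for _ in range(length)]

message = [1, 0, 1, 1, 0]
tx_hops = hop_sequence(seed=42, length=len(message))

# "Air": each symbol exists only on its hop channel in its time slot.
air = {(slot, ch): bit for slot, (ch, bit) in enumerate(zip(tx_hops, message))}

# A receiver with the shared seed regenerates the schedule and listens
# on the right channel at the right time.
rx_hops = hop_sequence(seed=42, length=len(message))
received = [air[(slot, ch)] for slot, ch in enumerate(rx_hops)]
print(received == message)  # True
```

Without the seed, a jammer would have to cover all 88 channels at once, which is the whole point of spreading the signal.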
Grace Hopper was a mathematics genius and computer pioneer. She is especially well known for pioneering “user friendly” computer software to make computers more accessible. Her best known contribution to computing was the invention of the “compiler”, an intermediate program that translated English language instructions into the language of the target computer.
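A toy illustration of the translation idea behind Hopper's compilers. This is an invented mini-language, not the syntax of her actual A-0 or FLOW-MATIC systems:

```python
# Toy "compiler": translate English-like statements into lower-level
# expressions, the way Hopper's compilers turned business English into
# machine instructions. The mini-language here is made up.

def compile_line(line):
    words = line.split()
    table = {"ADD": "+", "SUBTRACT": "-", "MULTIPLY": "*"}
    # e.g. "ADD 2 AND 3" -> "2 + 3"
    op = table[words[0]]
    return f"{words[1]} {op} {words[3]}"

program = ["ADD 2 AND 3", "MULTIPLY 4 AND 5"]
for line in program:
    expr = compile_line(line)
    print(expr, "=", eval(expr))  # 2 + 3 = 5, 4 * 5 = 20
```

The key insight, then and now, is the lookup table: the human-friendly word is just an index into the machine-friendly operation.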
J.C.R. Licklider's contribution to the development of the Internet consists of ideas, not inventions. He foresaw the need for networked computers with easy user interfaces. His ideas foretold graphical computing, point-and-click interfaces, digital libraries, e-commerce, online banking, and software that would exist on a network and migrate to wherever it was needed.
While working as a new engineer at Texas Instruments, Jack Kilby built the first integrated circuit in 1958. These new integrated circuits were more reliable and less costly to produce. They also allowed many more transistors to be contained in a small area. Kilby created his original integrated circuit using germanium. Robert Noyce improved on Kilby’s design by making an integrated circuit using silicon. Doles
In 1965, Moore predicted that the number of transistors on each circuit would double every year. In 1975, he refined his view to say that this "doubling" would occur every two years after seeing a “limit to the contribution of circuit device cleverness of another factor of four in component density”. According to Wikipedia, Moore’s Law is “the observation that over the history of computing hardware, the number of transistors on integrated circuits doubles approximately every two years”.
Yes, Moore's Law still holds true today, as researchers are still finding ways to make chip components smaller and more powerful. Many people feel that Moore's Law will only hold for approximately ten more years, due to the engineering challenges of getting chip features as small as 5 nanometers, or .005 microns. To illustrate just how small this is, a single human hair is roughly 75 microns wide, about 15,000 times the width of such a feature!
Gordon E. Moore observed that the number of components in integrated circuits had doubled every year from the invention of the integrated circuit in 1958 until 1965. He predicted that this trend would continue "for at least ten years."
This trend has, indeed, continued for more than 50 years. Researchers disagree about whether it will continue.
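Moore's Law is simple enough to compute directly. A sketch projecting transistor counts from a 2,300-transistor chip in 1971 (roughly the Intel 4004; the baseline is just an illustrative starting point):

```python
# Moore's Law as arithmetic: component counts double every two years.

def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count under a strict doubling schedule."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1981, 1991, 2001, 2011):
    print(year, f"{transistors(year):,.0f}")
# 1981 -> 73,600   1991 -> 2,355,200   2011 -> 2,411,724,800
```

The projected 2011 figure of about 2.4 billion is in the same ballpark as real high-end chips of that era, which is why the "law" held up so well.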
In 1969, Clarence Ellis became the first African American to receive a Ph.D. in Computer Science. Ellis was introduced to computers in 1958 when, at the age of 15, he got a part-time job as a security guard for an insurance company, guarding the company's new and expensive computer. Although he was not allowed to operate the computer, he did read all of the operating manuals. That knowledge let him play the hero one day when the technicians ran out of punch cards in the middle of an important project: Ellis showed them how to reuse old punch cards. He later worked on the ILLIAC IV, one of the world's first supercomputers.
Elizabeth “Jake” Feinler pioneered and managed the ARPANET and the Defense Data Network (DDN) network information centers under the Department of Defense. Her group developed the first Internet “yellow-“ and “white-page” servers as well as the first query-based network host name and address servers. Basically, Elizabeth was the “go-to” person when you wanted a domain name. Her group also managed the Host Naming Registry for the Internet from 1972 – 1989 and developed the top-level domain naming scheme of .com, .edu, .gov, .mil, .org, and .net that we still use today.
Used a PDP-8/e and programmed in BASIC. Saved programs to paper tape to be loaded later. Data was encoded as zeros and ones: a one was represented by a hole in the tape that light could pass through.
Nancy Hafkin is a pioneer and innovator in the areas of networking, development information, and electronic communications. Most of her work has been in Africa, where she helped build the continent's ICT framework. She also contributed to the Association for Progressive Communications (APC), which gave more than 10 African countries access to email during the early 1990s, before the Internet reached Africa.
Creator of the first popular personal computer, the MITS Altair 8800. The Altair had a 2.0 MHz Intel 8080 with 256 bytes of RAM (64K max), used paper tape, cassette, or floppy drive storage, and ran CP/M or Altair BASIC.
Dr. Mark Dean started working at IBM in 1980 and was instrumental in the invention of the Personal Computer (PC). He holds three of IBM's original nine PC patents and currently holds more than 20 total patents. The famous African-American inventor never thought the work he was doing would end up being so useful to the world, but he has helped IBM make instrumental changes in areas ranging from the research and application of systems technology circuits to operating environments. One of his most recent computer inventions occurred while leading the team that produced the 1-Gigahertz chip, which contains one million transistors and has nearly limitless potential.
Intrigued by the expanding world of computers and gadgetry, Dell purchased an early Apple computer at the age of 15 for the sole purpose of taking it apart to see how it worked. It was in college that Dell found the niche that would become his boom: the PC world was still young, and Dell realized that no company had tried selling directly to customers.
Tim Berners-Lee was born in 1955. Berners-Lee is a computer scientist who, in 1989, was the first to have an HTTP client communicate with a server over the internet. He is also the founder of the World Wide Web Foundation and is known as the inventor of the World Wide Web (not the internet itself, with which the Web is often confused).
My personal favorite woman who has achieved fame in computing is Marina Bers, PhD. Professor Bers is adjunct faculty at Tufts University and works in the Eliot-Pearson Department of Child Development. In 2005 she received the Presidential Early Career Award for Scientists and Engineers. She landed two NSF grants at the same time, which is unheard of. She has published two books, Blocks to Robots and Designing Digital Experiences for Positive Youth Development: From Playpen to Playground. For those interested in robotics, Dr. Bers has created an early childhood version of LEGO WeDo.
The first hardware programming: punch cards created by Joseph-Marie Jacquard for weaving looms. The cards guided the patterns woven into the fabric.
Relays are essentially used to minimize the use of heavy-gauge wiring and large, expensive high-current switches. The relay was invented in 1835 by American scientist Joseph Henry to improve his version of the electrical telegraph, developed earlier in 1831. Konrad Zuse created the V1 (Z1), a mechanical binary computer; his later Z3 (1941) was the first working computer built almost entirely from relays. The Z1 weighed about 2,204 lb. and consisted of 20,000 parts. There are many different types of relays; 4 cm is the average size of a relay.
Side note: the first computer bug was literally a moth, stuck in a relay of the Harvard Mark II, that kept it from functioning.
Flowers was successful in developing a system of vacuum tubes so that when one failed, others could be switched on and keep working. This made running the computer more reliable. The first setup had 3,000 vacuum tubes; by the time he developed Colossus, 1,500. This was the first time vacuum tubes worked together on that scale to power a single machine.--Martinez
The Z1, the first binary computer, was used to explore several groundbreaking technologies in calculator development: floating-point arithmetic, high-capacity memory, and switching modules operating on the yes/no principle. The Z1 occupied about 650 cubic feet.
ENIAC (Electronic Numerical Integrator and Computer) was the first computer to use vacuum tubes. It was built because of the need for quick and accurate firing tables for military artillery targets during WWII. John Mauchly at the University of Pennsylvania had the idea to use vacuum tubes to make calculations faster, and Mauchly and his team were commissioned to build ENIAC. It was finished after WWII was over, but the computer was used to make calculations for other government projects during the Cold War.
ENIAC was very large and used over 17,000 vacuum tubes, 70,000 resistors, and 6,000 manual switches. ENIAC weighed about 25 tons and took up about 680 square feet. The cost to build the computer in 1945 was about $500,000. ENIAC could do about 5,000 addition calculations in one second. It was also estimated that more complex problems that might take a person up to 20 hours could be solved by ENIAC in about 15 minutes.
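The speedup claimed in the paragraph above is worth checking with the paragraph's own figures:

```python
# Checking the ENIAC speedup claim: a 20-hour human calculation
# finished in about 15 minutes.

human_minutes = 20 * 60   # 20 hours of hand calculation
eniac_minutes = 15
print(human_minutes / eniac_minutes)  # 80.0, i.e. an 80x speedup
```

An 80x speedup sounds modest today, but for 1945 it turned week-long firing-table projects into an afternoon's work.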
Computer engineers Mauchly and Eckert were joined by John von Neumann to improve on ENIAC by making plans for EDVAC (Electronic Discrete Variable Automatic Computer). This computer also used vacuum tubes, and like ENIAC it was paid for by the military. EDVAC was superior to ENIAC in a few ways: it was a binary computer, and it was the first design to store its program in memory, whereas ENIAC had to be reprogrammed manually.
EDVAC had about 6,000 vacuum tubes and it took up about 490 square feet. This computer weighed about 17,300 pounds. The cost to build EDVAC was about $500,000.
EDVAC was used by the Ballistic Research Laboratory from 1951 – 1961, with some improvements made to it during its life.
Bell Laboratories physicists John Bardeen and Walter Brattain invented the transistor in 1947 and announced the invention in 1948. The junction transistor was invented by their colleague, William Shockley, a few months later, and all three jointly shared the Nobel Prize in Physics in 1956 for their invention.
Transistors were also more reliable than vacuum tubes. Changing from the vacuum tube to the transistor paved the way for making smaller, faster, and more efficient computers. There have been many different designs of transistors, but they are primarily made of layers of materials deposited on a silicon substrate, hence the name Silicon Valley for the concentration of manufacturers in California.
Miniaturization is still continuing to this day. The University of Manchester's Transistor Computer, built in 1953, is believed to be the first transistorized computer. Some of the computers of the second generation were:
1. IBM 1620: smaller than first-generation computers (about the size of a refrigerator) and mostly used for scientific purposes.
2. IBM 1401: small to medium in size, used for business applications.
3. CDC 3600: large, used for scientific purposes.
The invention of the transistor was one of the most important developments leading to the personal computer revolution. The transistor was invented in 1947 and announced in 1948 by Bell Laboratory engineers John Bardeen and Walter Brattain. Because the transistor was so much smaller and consumed significantly less power, a computer system built with transistors was also much smaller, faster, and more efficient than a computer system built with vacuum tubes. Transistors are used in today’s computers, cell phones, and tablets.
45 nm is a typical line width for the conduction channel of a modern MOSFET. Roughly 1.5 billion modern transistors could fit inside a single vacuum tube from ENIAC.
Jack Kilby, an American engineer working at Texas Instruments, and his counterpart Robert Noyce are credited with realizing the integrated circuit. Noyce built a similar circuit a few months after Kilby, and the two are generally credited as co-inventors; Kilby received the Nobel Prize in Physics in 2000. The size of computers started to shrink even more drastically than with second-generation machines, and the integrated circuit led to the invention of the microprocessor. The chips were so small that they could be built into hand-held devices. For example, the Intel 8742 is an 8-bit microcontroller that includes a CPU running at 12 MHz, 128 bytes of RAM (random access memory), 2048 bytes of EPROM (erasable programmable read-only memory), and I/O on the same chip. This led to the microcomputer: small, low-cost computers that could be afforded by individuals as well as small businesses. After 1980, PCs became valuable household necessities.
The first computer credited with using integrated circuits is the Apollo Guidance Computer, developed for the Apollo Program in the early 1960s. This computer weighed 70 lbs. with dimensions of 24x12.5x6.5 inches and ran at 2.048 MHz. It provided computation and electronic interfaces for guidance, navigation, and control of the spacecraft.
The CDC 6600 was a mainframe computer used by UC Berkeley for high-energy nuclear physics research. It was the fastest computer until 1969, performing about one megaFLOPS. It was about the size of a table.
In 1965, school administrators first got computers.
In 1967, the first vocational programs appeared for students to learn computer maintenance. Did most schools use them for student scheduling like my high school did?
Moore's law is the observation that over the history of computing hardware, the number of transistors on integrated circuits doubles approximately every two years. He predicted that this trend would continue for at least ten years. (It did.)
The law is named after Intel co-founder Gordon E. Moore, who described the trend in his 1965 paper.
The Apple II was purchased in the thousands by finance workers on the strength of this program. You could use VisiCalc to balance your checkbook, keep track of credit card purchases, calculate your net worth, and do your taxes. It came along in 1979.
VisiCalc was the first spreadsheet computing program. It was created for the Apple II platform and propelled the Apple from being a toy to a useful business tool. (It came out two years before the IBM PC.)
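The idea that made VisiCalc a killer app, cells holding formulas over other cells that recalculate when inputs change, can be sketched in a few lines. This is a toy model, not VisiCalc's actual design, and the cell names and 50% tax rate are made up:

```python
# Toy spreadsheet: cells hold values or formulas over other cells.
# A real spreadsheet tracks dependencies; this sketch just re-evaluates
# formulas on demand.

cells = {
    "A1": 100,                            # income
    "A2": 40,                             # expenses
    "A3": lambda c: c("A1") - c("A2"),    # profit formula
    "A4": lambda c: c("A3") * 0.5,        # tax formula (hypothetical 50% rate)
}

def get(name):
    """Return a cell's value, evaluating its formula if it has one."""
    value = cells[name]
    return value(get) if callable(value) else value

print(get("A4"))  # 30.0
cells["A2"] = 60  # change one input...
print(get("A4"))  # 20.0  ...and every dependent cell updates
```

That last step, changing one number and watching the whole model update, is exactly what sold the Apple II to accountants.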
WordStar was the most popular word processor during much of the 1980s: the first commercially successful word processing program produced for microcomputers, and the best-selling software program of the early eighties.
IBM shipped the PC running MS-DOS in 1981. MS-DOS ran efficiently, but its "cryptic language" made things difficult to understand. Merchant
Lotus 1-2-3 was a spreadsheet created for the IBM PC that significantly contributed to IBM's success in the corporate environment. It was the IBM PC's first "killer application."
The Mac Operating System was a graphical user interface (GUI) initially released with the first Mac in 1984. The intent of the GUI was to make the computer more "intuitive" and easy to navigate, rather than requiring the user to learn a complicated command-line system. The interface was a system of folders plus a main command program called Finder. Mac OS X and later are more advanced versions.--Martinez
The first version of Microsoft Windows allowed running several programs at a time. There were drop-down menus, scroll bars, icons, and dialog boxes, which made the program easier to learn and use. Merchant
A program so useful or desired that purchasing the actual computer or operating system becomes of utmost importance. For example, I bought a Nintendo 3DS because it is the only platform where I can play Luigi's Mansion: Dark Moon (when it is released later this month, that is).--Martinez
When a program is so instrumental that its name becomes a verb meaning "to manipulate or fabricate an image," it earns distinction as a killer app.
A Killer Application, or Killer App, is a piece of software so desirable by itself that it makes the hardware needed to run it irresistible. For example, one might have said, "I love playing Space Invaders so much, I'm going to buy an Atari." Typically the hardware drives the need for the software, but not with a Killer App. The term was used most frequently in the 1990s in reference to video gaming. O'Clair
Microsoft Windows 95 was released August 24, 1995, and sold more than 1 million copies within 4 days.
A killer app for the PlayStation game console, alongside the Metal Gear Solid series.
GoldenEye shared the title of killer app for the Nintendo 64 system with The Legend of Zelda: Ocarina of Time.
The Sun Solaris 7 operating system was released in 1998. It was used for workstations and servers.
• Mac OS X, designed exclusively for Apple servers and desktops, is based on a Unix kernel. This operating system provided a GUI for Mac servers starting in 1999 and for desktops in 2001. By 2002, all Mac computers were shipping with Mac OS X, though there was still an option to start up with the Classic OS 9. Mac OS X releases are named after big cats: Mountain Lion, Snow Leopard, and so on.
Microsoft Windows XP was released October 25, 2001. It was for the PC.
The Halo series is considered the killer app of the Xbox and Xbox 360. The game was so popular that consumers bought the Xbox and Xbox 360 just to play it. Killer apps did not originate in the gaming community: spreadsheets and word processors were the first killer apps. Halo itself is a sci-fi FPS (first-person shooter) used for entertainment. All hail Master Chief!
Android is a Linux-based OS found on smartphones and tablets.