History of Computers: 1946 to 1996

 
 

In 1946 ENIAC (Electronic Numerical Integrator And Computer) was unveiled in Philadelphia. Although ENIAC represented only a stepping stone towards the true computer, Eckert and Mauchly completed its construction knowing that the machine was not the ultimate in state-of-the-art technology. ENIAC was programmed by rewiring the interconnections between its various components, and it included the capability of parallel computation. ENIAC was later modified into a stored program machine, but not before other machines had laid claim to being the first true computer. Later that year Eckert and Mauchly, in a patent dispute with the University of Pennsylvania, left the University to establish the first computer company - the Electronic Control Company - with a plan to build the UNIVAC (Universal Automatic Computer). After many crises they built the BINAC for Northrop Aircraft, and were taken over by Remington-Rand before the UNIVAC was completed. At the same time ERA (Engineering Research Associates) was incorporated in St. Paul, Minnesota, and took its knowledge of computing devices to create a line of computers; later ERA was also assimilated into Remington-Rand.

In 1947 William Shockley, John Bardeen, and Walter Brattain invented the 'transfer resistance' device, later to be known as the transistor, that would revolutionise the computer. In the following year, work on a stored program computer was ongoing in at least four locations: at the University of Pennsylvania on the construction of EDVAC (Electronic Discrete Variable Automatic Computer), with John von Neumann at the Institute for Advanced Study in Princeton on the IAS machine, with Maurice Wilkes at Cambridge University on EDSAC (Electronic Delay Storage Automatic Computer), and at the University of Manchester.

Max Newman, one of the leaders of the code-breaking activity at Bletchley Park, England, during World War II, had created the Royal Society Computing Laboratory at Manchester and was looking for a means to build a computer. On 21st June 1948 the laboratory's prototype machine, the Baby, was operated for the first time. The world had moved from the domain of calculators to the domain of computers. Freddie Williams, Tom Kilburn, and Newman continued on to build a full scale machine they designated the Manchester Mark I. Ferranti Ltd. took the design and began a line of computers that formed one of the major parts of the British computer industry.

In 1949, just a year after the Manchester Baby machine became the first operating stored program machine in the world, the first large-scale, fully functional, stored program electronic digital computer was completed by Maurice Wilkes and the staff of the Mathematical Laboratory at Cambridge University. It was named EDSAC; its primary storage system was a set of mercury-filled tubes through which acoustic pulses, generated and regenerated at each end, represented the bits of data. Wilkes had attended the 1946 Summer School at the University of Pennsylvania and had come home with the basic plans for a machine in his mind. In the United States, the National Bureau of Standards began work on two machines. The Bureau had been made responsible for managing the contract for delivery of the UNIVAC (Universal Automatic Computer) to the Census Bureau, but knew that it needed computational facilities for its own work. Not having a large budget, the Bureau decided to build its own machines.

Alan Turing had joined the staff of the National Physical Laboratory at Teddington, England, in 1945 with plans to build his own computer. His design for the ACE (Automatic Computing Engine) was completed by 1946, but because the Physics section, rather than the Mathematics division where Turing resided, became responsible for its construction, Turing left NPL to take up a position with his war-time colleague Max Newman at the University of Manchester. Work on a prototype machine based on Turing's plans, named the Pilot ACE, was begun under Harry Huskey in 1948; it ran its first program in May 1950.

In 1951, after five years of work and several different instantiations of the first computer company established by Eckert and Mauchly, the UNIVAC computer was delivered to the Census Bureau, just in time to begin work on the decennial census. The project was somewhat over budget, and the hope of the Remington-Rand Corporation was that it could produce enough copies to recover the losses on the 1946 fixed-price contract with the government; eventually 46 copies were built. Maurice Wilkes had realized quickly after the completion of the work on EDSAC at Cambridge University that "a good part of the remainder of [his] life was going to be spent in finding errors in ... programs". With Stanley Gill and David Wheeler he developed the concept of subroutines in programs to create re-usable modules.
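The subroutine idea survives essentially unchanged in every modern language. As a minimal illustrative sketch (in C, with invented function names), a module is written once and then invoked from as many call sites as needed:

    #include <stdio.h>

    /* A subroutine in the Wilkes/Wheeler/Gill sense: a re-usable module,
       written and debugged once, then invoked wherever it is needed. */
    double average(double a, double b)
    {
        return (a + b) / 2.0;
    }

    int main(void)
    {
        printf("%.1f\n", average(1.0, 2.0));   /* first call site */
        printf("%.1f\n", average(10.0, 40.0)); /* second call site re-uses the module */
        return 0;
    }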

Grace Hopper, now an employee of Remington-Rand and working on the UNIVAC, took up the concept of reusable software in her 1952 paper entitled "The Education of a Computer" (Proc. ACM Conference; reprinted Annals of the History of Computing, Vol. 9, No. 3-4, pp. 271-281), in which she described the techniques by which a computer could be used to select (or compile) pre-written code segments and assemble them into programs corresponding to codes written in a high level language - thus describing the concept of a compiler, and more generally of language translation. By the end of 1952 UNIVAC had become the common name for a computer, just as Hoover and Xerox became synonyms for vacuum cleaners and paper copiers.
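Hopper's scheme amounted to a lookup: each code in the high level program selected a pre-written segment to be copied into the output. A toy sketch in C of that table-driven idea (the operation codes and segments here are invented for illustration, not taken from Hopper's actual system):

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical library of pre-written code segments, keyed by op code. */
    struct segment { const char *code; const char *expansion; };

    static const struct segment library[] = {
        { "ADD", "load a; load b; add; store r" },
        { "SUB", "load a; load b; subtract; store r" },
    };

    /* "Compile" a program by selecting the segment that matches each code. */
    static void compile(const char *codes[], int n)
    {
        for (int i = 0; i < n; i++)
            for (size_t j = 0; j < sizeof library / sizeof library[0]; j++)
                if (strcmp(codes[i], library[j].code) == 0)
                    printf("%s\n", library[j].expansion);
    }

    int main(void)
    {
        const char *program[] = { "ADD", "SUB" };
        compile(program, 2);
        return 0;
    }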

In 1953 IBM contributed to the war effort in Korea by providing a 'Defense Calculator' that was in fact its first true entry into the computer business. The IBM Type 701 EDPM was built as a result of the conviction of T.J. Watson, Jr. that IBM had to take a step into this field. The 700 series of machines, including the 704, 709, and eventually the 7090 and 7094, dominated the large mainframe market for the next decade, and brought IBM from computer obscurity to first place in that same time period. While many universities in the US and other countries were building their own computers, the Cambridge University EDSAC was the first to be commercialised. With considerable foresight, J. Lyons & Company, Ltd., purveyors of confectionery and operators of "corner teahouses" throughout Great Britain, took the EDSAC design and converted it for their own business applications. The result, called LEO (Lyons Electronic Office), so impressed other companies with the same kind of business processing needs that the in-house development project was turned into a new computer company, LEO Computers, Ltd. It was eventually purchased by English Electric Company, and together they became part of ICL (International Computers Ltd.), the major builder of British computers through the 1970s.

Since the 1930s IBM had built a series of calculators in the 600 series that contributed to the versatility of the card processing equipment that was its major product. The early IBM computers (701 and 702) were incompatible with the punched card processing equipment, but in 1954 the IBM Type 650 EDPM, a natural extension of the 600 series, used the same card processing peripherals, thus making it upwardly compatible for many existing IBM customers. A decimal, drum memory machine, the 650 was the first computer to be mass produced; IBM leased a thousand of them in the first year after its announcement, far more than it had ever expected. Following the example set by Grace Hopper, and a successful implementation of a digital code interpreter for the IBM 701 named Speedcoding, John Backus proposed the development of a programming language that would allow users to express their problems in commonly understood mathematical formulae - later to be named FORTRAN.

In 1955 IBM began work on its contribution to the national effort by producing a machine that was promised to be 100 times faster than the fastest machine of the day. This machine was to stretch the state of the art, and thus was named STRETCH.

In 1956 Sperry-Rand, the successor to Remington-Rand, took up the challenge to create a supercomputer on behalf of LLNL (Lawrence Livermore National Laboratory) that was to be named LARC (Livermore Automatic Research Computer). Work also began in the United Kingdom on a supercomputer project. The Atlas project was a joint venture between the University of Manchester and Ferranti Ltd. with Tom Kilburn as the principal architect.

The early computers had small internal memories and slow external memories, relying primarily on magnetic tape. Internal memories had been upgraded to magnetic drums and then to core memory; the next logical step for external storage was the disk, with movable read/write heads to provide a semi-random access capability and a storage capacity similar to that of magnetic tape. In 1957 the IBM 305 RAMAC became the first disk memory system.

In 1958, Jack St. Clair Kilby conceived and proved his idea of integrating a transistor with resistors and capacitors on a single semiconductor chip - a monolithic integrated circuit (IC). His idea of a monolithic IC, together with the planar technology of Dr. Jean Hoerni and Robert Noyce's idea of "junction isolation" for planar interconnections, underpins the great progress of today's semiconductor ICs and the microelectronics based upon them. The technology has allowed the innovation of numerous applications in computers and communications, which have changed our lifestyles dramatically.

While there was a movement towards supercomputers in many companies in 1959, IBM announced the availability of two machines for the small user - the IBM 1401 for the business user and the IBM 1620 for the scientist. The 1401 became the most popular business data processing machine, and for small universities and colleges, the 1620 became the first computer experience for many students.

Since 1952 Grace Murray Hopper had been developing a series of programming languages that increasingly used natural language-like phrases to express the operations of business data processing; FLOW-MATIC was the last of these. Others had also taken on the challenge, including IBM, which had produced a language named COMMERCIAL TRANSLATOR. From these bases an industry-wide team - the Conference on Data Systems Languages (CODASYL) - led by Joe Wegstein of NBS (now NIST) developed a new language in a very short time and created the first standardised business computer programming language, COBOL (Common Business Oriented Language). For the next 20 years more programs were written in COBOL than in any other single language. 1960 also marked the end of the first generation of computers, as machines driven by vacuum tubes (electronic valves) gave way to the second generation built with transistors.

The work on integrated circuits by Jack Kilby and Robert Noyce came to fruition in 1961 when the Fairchild Corporation delivered the first commercially available integrated circuits. From this date forward computers would incorporate ICs instead of individual transistors or other components.

In 1962 the Atlas computer at the University of Manchester in Great Britain became operational. It was the first machine to use virtual memory and paging; its instruction execution was pipelined, and it contained separate fixed- and floating-point arithmetic units, capable of approximately 200 kFLOPS.
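Paging works by splitting every address into a page number and an offset, with a table mapping pages to wherever they currently live in physical store. A minimal illustrative sketch in C (the page size and table contents are invented for illustration; Atlas itself used 512-word pages):

    #include <stdio.h>

    #define PAGE_SIZE 4096u  /* illustrative only */

    /* Hypothetical page table: virtual page number -> physical frame number. */
    static const unsigned frame_of[4] = { 7, 2, 5, 0 };

    unsigned translate(unsigned vaddr)
    {
        unsigned page   = vaddr / PAGE_SIZE;  /* which page the address is in */
        unsigned offset = vaddr % PAGE_SIZE;  /* position within that page */
        return frame_of[page] * PAGE_SIZE + offset;
    }

    int main(void)
    {
        /* Virtual address 0x1234 lies in page 1, which maps to frame 2. */
        printf("virtual 0x%x -> physical 0x%x\n", 0x1234u, translate(0x1234u));
        return 0;
    }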

By 1963 the standardisation of the elements of the industry was becoming prevalent, and among the first standards was a code for information interchange (ASCII). For the first time there was a means for computers to interchange information, but it would take almost 15 years before this became commonplace.
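The point of ASCII is that every conforming machine agrees on the same numeric code for each character, so text written on one computer is readable on another. In C, for example, character literals simply are their ASCII codes:

    #include <stdio.h>

    int main(void)
    {
        /* On any ASCII machine these values are identical. */
        printf("'A' = %d\n", 'A');  /* 65 */
        printf("'a' = %d\n", 'a');  /* 97 */
        printf("'0' = %d\n", '0');  /* 48 */
        return 0;
    }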

Starting in 1959 Douglas Engelbart launched the SRI Augmentation Research Center to pioneer the modern interactive working environment. NLS (oN-Line System) was built during the mid-1960s to develop and experiment with software and hardware tools that would make people more productive using computers. Among the original ideas developed and implemented in NLS were the first hypertext system, outline processor, and video conferencing. In 1964 Engelbart developed the 'mouse', to be followed by two-dimensional editing, the concept of windows, cross-file editing, uniform command syntax, a remote procedure-call protocol, mixed text-graphic files, structured document files, idea processing, and many more developments. Like the work of almost any pioneer, Engelbart's was not recognized immediately; the mouse waited until the development of the personal computer, fifteen years later, to find its niche. To many, the world of computing changed radically on April 7, 1964 when IBM announced System/360, the first IBM family of compatible machines. While there was at least one other compatible family, from GE, IBM's commitment to an upwards compatible family and its merging of the scientific and business lines of machines had a profound effect on the way many businesses thought about computers.

While some companies were developing bigger and faster machines, Digital Equipment Corporation introduced the PDP-8 in 1965, the first true minicomputer. The PDP-8 had a minuscule instruction set, a primitive micro-language, and excellent interface capability.

1966 was the year when a joint project between IBM and the SHARE users' group developed a new programming language, PL/I, intended to combine scientific and business data processing just as the System/360 machines had combined the two lines of hardware. The language was also intended to serve as a high level system development language.

Six years after the Fairchild Corporation had delivered the first commercial integrated circuits, the third generation of computers began in 1967 with the delivery of the first machines using that technology.

During the development of the programming language FORTRAN, Harlan Herrick had introduced the high level language equivalent of a 'jump' instruction in the form of a "GO TO" statement. In 1968 Edsger Dijkstra laid the foundation stone in the march towards creating structure in the domain of programming by writing, not a scholarly paper on the subject, but instead a letter to the editor entitled "GO TO Statement Considered Harmful".
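Dijkstra's objection is easiest to see side by side. In a small C fragment (the loop itself is invented for illustration), the jump-based version forces the reader to trace labels to discover that the code is merely a loop, while the structured form makes the shape of the control flow visible at a glance:

    #include <stdio.h>

    int main(void)
    {
        /* Jump-based flow of the kind Dijkstra objected to. */
        int i = 0;
    again:
        printf("%d\n", i);
        i++;
        if (i < 3) goto again;

        /* The structured equivalent says what it does directly. */
        for (int j = 0; j < 3; j++)
            printf("%d\n", j);

        return 0;
    }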

In 1969 work on the ARPANET began. Disillusioned by the work on Multics and continuing problems with the GE 600 series machines, Bell Telephone Laboratories withdrew from Project MAC. Dennis Ritchie and Ken Thompson began work on their own operating system which, instead of being targeted at multiple users, would concentrate on the single user; in a play on the name Multics, it was named UNIX.

Computer-to-computer communication expanded in 1970 when the Department of Defense established four nodes on the ARPANET: the University of California Santa Barbara, UCLA, SRI International, and the University of Utah. Viewed as a comprehensive resource-sharing network, ARPANET's designers set out with several goals: direct use of distributed hardware services; direct retrieval from remote, one-of-a-kind databases; and the sharing of software subroutines and packages not available on the users' primary computer due to incompatibility of hardware or languages.

The world of personal computing has its roots in 1971, with two important products - the first commercially available microprocessor and the first floppy disk. The Japanese company Busicom required a chip for a calculator; Marcian Hoff decided that it would be easier to use a 'computer on a chip' for this purpose than to custom-develop a calculator chip, and the recently founded Intel Corporation produced the Intel 4004 for Busicom, giving birth to a family of 'processors on a chip'. At IBM, a team under Alan Shugart introduced the 8 inch floppy (magnetic storage) diskette, initially used for loading microcode into IBM's mainframes.

1972 saw the first digital microcomputer available for personal use: the MITS (Micro Instrumentation and Telemetry Systems) 816. Though not equipped with a display or keyboard, the 816 was of interest to the amateur enthusiast who was seeking a personal computer.

By 1973 the concept of a wide area network had been effectively developed as part of the ARPANET project. The basis for the 'local area network' was Ethernet, created at Xerox PARC by Robert Metcalfe. In some ways Metcalfe invented Ethernet three times: first as part of his doctoral dissertation, written while he was working with MIT's Project MAC; then at Xerox PARC; and then again later at 3Com, a company he founded to exploit his invention.

The March 1974 issue of QST magazine contained the first formally advertised personal computer - the Scelbi ("SCientific, ELectronic, and BIological"), developed by Scelbi Computer Consulting of Milford, Connecticut. At almost the same time, Jonathan Titus produced a widely marketed personal computer kit named the Mark-8. Intel introduced the 8080, intended for purposes such as controlling traffic lights, but it was later to be used as the processor for the Altair. Also in 1974 Gary Kildall introduced CP/M, the first operating system to run almost independently of its platform; John Cocke designed the first RISC machine for IBM Research; the first automatic teller machines appeared; and Zilog, Inc. was founded to compete with Intel in the production of microprocessors, later producing the Z80.

By 1975 the market for the personal computer was demanding a product that did not require an electrical engineering background to use, so the first mass produced and marketed personal computer (available both as a kit and assembled) was welcomed. Developers Edward Roberts, William Yates and Jim Bybee had spent 1973-1974 developing the MITS Altair 8800. The price was $375; it contained 256 bytes of memory (not 256K bytes), and had no keyboard, no display and no auxiliary storage device. Later, Bill Gates and Paul Allen wrote their first product for the Altair - a BASIC interpreter. (The Altair itself had been named after the destination of the starship Enterprise in a Star Trek episode.) 1975 was also the year in which IBM produced its first personal computer, the 5100. Seymour Cray, formerly the principal architect for CDC, started the trend toward modern supercomputers and computational architectures; in 1975 a Cray machine was, as it remains today, the standard by which super performance is judged. There is even a public domain version of the Cray operating system.

A year after the Altair was produced, Steve Jobs and Steve Wozniak produced the Apple I, and went on to design the Apple II, which came fully assembled and complete with its own keyboard and monitor. It was successful, priced within the reach of the enthusiast and supporting some basic software applications. The Apple II was used in schools and colleges and was the basis of many early "microprocessor" courses. Apple was incorporated in 1976, a year after Microsoft had been founded.

1977 saw the opening of the first West Coast Computer Faire in San Francisco, where the Apple II (costing $1298) and the Commodore PET ($795) were on display. That same year Radio Shack introduced the TRS-80 microcomputer, later given the derogatory handle of "Trash-80". The first Computerland store opened in Morristown, New Jersey, under the name Computer Shack.

While most microprocessors had quickly been supported by a BASIC interpreter and some primitive games, VisiCalc, introduced by Dan Bricklin and Bob Frankston, was a major breakthrough in application software for this level of machinery. Following its 1978 debut, the spreadsheet program enjoyed unprecedented success.

MicroPro International released WordStar in 1979, which, like VisiCalc, would set the standard for word processing systems.

Alan Shugart had left IBM to found his own company, Shugart Associates, and later co-founded Seagate Technology, which in 1980 brought Winchester hard drive technology to the personal computer, revolutionizing its storage capabilities. Personal computers were now freed from their tiny internal memories and slow external cassette tapes or diskettes. The personal computer moved from being a microcomputer limited by its storage capabilities to compete effectively with the power of many mainframe systems, and certainly with the majority of minicomputers.

After waiting for the opposition to soften up the market, IBM entered the field in 1981 with the IBM PC, supported by the Disk Operating System (DOS), developed under an agreement that gave Microsoft the profits from the software in exchange for Microsoft having borne the development costs. Disregarding CP/M, which had been the choice for earlier machines, IBM chose to go in a radically different direction on the marketing assumption that the purchasers of the PC were a different breed from those who were prepared to build their own system from a kit. IBM attracted a community of users who wanted the machine for its usefulness rather than its intrinsic engineering appeal. Osborne Computer Corporation began marketing the first self-contained, portable microcomputer in 1981, complete with a monitor, disk drives and carrying case - the Osborne 1. Though initially successful, Osborne went bankrupt two years later. That same year Commodore introduced the VIC-20, and quickly sold 1 million units!

By 1982 the computer had become a prime tool in the movie industry and Disney Studios completed a movie where the characters existed inside a computer - Tron - and where the special effects were computer generated.

Software development exploded with the introduction of the PC; standard applications included not only spreadsheets and word processors, but also graphics packages and communications systems. Games were also prolific. In 1983 Mitch Kapor introduced Lotus 1-2-3, which took over spreadsheet supremacy from VisiCalc.

1984, the year to be dreaded according to the novel by George Orwell, opened with Apple Computer's broadcast of an advertisement parodying the trodden-down masses, subservient to the IBM PC, as the lead-up to the announcement of the Macintosh. According to Jobs, the Macintosh was the result of his having "seen the light" when viewing the Alto system at Xerox PARC. The mouse and the icon became the major tools for computer interaction.

In 1985 computers in various locations were being attacked by what the press came to label 'hackers', much to the dismay of legitimate hackers at institutions of higher learning. Using personal computers, young people were travelling cyberspace and tapping into the resources of corporate systems. A break-in to a computer at the Los Alamos National Laboratory was tracked back to a group of teenagers in Milwaukee, Wisconsin, who became known as the '414' hackers, 414 being their telephone area code.

Starting from the 8088 chip used in the IBM PC, Intel Corporation continually developed new chips to support the ever increasing demand for processing power; in 1986 Intel released the 386 chip - the intermediate stage between the 8086 of 1978 and the Pentium of 1993. At the other end of the computer family scale the CRAY X-MP with 4 processors achieved a processing speed of 713 MFLOPS (against a peak of 840 MFLOPS) on a 1000x1000 LINPACK benchmark. In thirty years the supercomputer had achieved an improvement of five orders of magnitude. If the motor car had seen the same degree of improvement in the past 100 years it would have a petrol consumption of one thimbleful per hundred miles, travel at 3,000,000 miles per hour, and be cheaper to replace than to pay the in-town parking fee!

IBM introduced its PS/2 machines in 1987, making the 3 1/2-inch floppy disk drive and Video Graphics Array (VGA) standard for IBM computers. The PS/2 machines were the first IBMs to include Intel's 80386 chip; the company had made more than 1 million units by the end of the year. IBM released a new operating system, OS/2, at the same time, allowing the use of a mouse with IBM machines for the first time.

In 1988 Apple co-founder Steve Jobs, who had left Apple to form his own company, unveiled the NeXT. The computer failed commercially but was recognised as an important innovation. At a base price of $6,500, the NeXT ran too slowly to be popular; its significance rested in its place as the first personal computer to incorporate an optical storage disk drive, a built-in digital signal processor that allowed voice recognition, and object-oriented languages to simplify programming. The NeXT offered a Motorola 68030 microprocessor, 8 megabytes of RAM, and a 256-megabyte read/write optical disk.

It was in 1989 that Tim Berners-Lee proposed the World Wide Web project to CERN (European Council for Nuclear Research). Intel’s 80486 chip with 1.2 million transistors was introduced in April. Seymour Cray founded Cray Computer Corporation and began developing the Cray 3 using gallium arsenide chips.

In 1990 Microsoft introduced Windows 3.0 in May, intensifying its legal dispute with Apple over the software’s 'look and feel' resemblance to the Macintosh operating system. Hewlett-Packard and IBM both announced RISC-based computers that year. Intel’s i486 and iPSC/860, and Motorola’s 68040, became available. Berners-Lee wrote the initial prototype for the World Wide Web, which used his other creations: URLs, HTML, and HTTP.

In 1991 Cray Research unveiled the Cray Y-MP C90 with 16 processors and a speed of 16 GFLOPS. The IBM, Motorola, and Apple PowerPC alliance was announced on July 30, 1991.

In 1992 DEC introduced the first chip to implement its 64-bit RISC Alpha architecture.

Intel’s Pentium was introduced in March 1993. Students and staff at the University of Illinois’ National Center for Supercomputing Applications created a graphical user interface for Internet navigation called NCSA Mosaic.

In April 1994, Jim Clark and Marc Andreessen founded Netscape Communications (originally Mosaic Communications). Netscape’s first browser became available in September and created a rapidly growing body of Web surfers.

In 1995 Toy Story became the first completely computer-generated full-length feature movie. Windows 95 was launched on August 24 with great fanfare. The Java programming language, unveiled in May, enabled platform-independent application development; “Duke” was the first applet.

The Intel Pentium Pro was announced in 1996.

 

Interesting point

In 1947, when Grace Murray Hopper was working on the Mark II computer in a temporary World War I-era building at Harvard University, she found the first actual computer "bug": a moth dead between the contacts of a relay. She taped it into the logbook of the computer, and thereafter whenever the machine stopped they told Howard Aiken that they were "debugging" the computer.

 

Further information may be obtained from the Institute of Electrical and Electronics Engineers Computer Society and The Computer History Museum.
