Thursday, November 17, 2022

COMPUTER TIMELINE

The computer was born not for entertainment or email, but out of the need to solve a serious number-crunching crisis. By 1880, the US population had grown so large that it took more than seven years to tabulate the results of the US Census. The government sought a faster way to do the job, giving rise to punch-card computers that filled entire rooms. Today, we have more computing power in our smartphones than was available in those early machines. The following brief history of computing is a timeline of how computers evolved from these humble beginnings into the machines of today, which surf the Internet, play games, and stream multimedia in addition to crunching numbers.



1801: In France, Joseph Marie Jacquard invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.


1822: English mathematician Charles Babbage conceives of a steam-powered calculating machine that could compute tables of numbers. The project, financed by the English government, is a failure. More than a century later, however, the world's first computers were built.


1890: Herman Hollerith designs a punch card system to tabulate the 1890 census, accomplishing the task in just three years and saving the government $5 million. He establishes a company that would eventually become IBM.


1936: Alan Turing presents the notion of a universal machine, later called the Turing machine, capable of computing anything that is computable. The central concept of the modern computer was based on his ideas.
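Turing's universal machine reads and writes symbols on a tape according to a fixed transition table. The following is a minimal illustrative sketch, not anything Turing wrote: the simulator and the toy "flip every bit" machine below are assumptions made up for this example.

```python
def run_turing_machine(tape, transitions, state="start", accept="halt"):
    """Simulate a one-tape Turing machine on a string of symbols."""
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    while state != accept:
        symbol = tape.get(head, "_")  # "_" is the blank symbol
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    # Read back the non-blank cells in tape order.
    return "".join(tape[i] for i in sorted(tape) if tape[i] != "_")

# Toy machine: invert a binary string, halting at the first blank.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run_turing_machine("1011", flip))  # -> 0100
```

Despite its simplicity, this table-plus-tape model can express any computation a modern computer performs, which is why Turing's idea is considered the central concept of the modern machine.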


1937: J.V. Atanasoff, a professor of physics and mathematics at Iowa State University, attempts to build the first computer without gears, cams, belts, or shafts.


1939: Hewlett-Packard is founded by David Packard and Bill Hewlett in a garage in Palo Alto, California, according to the Computer History Museum.


1941: Atanasoff and his graduate student Clifford Berry design a computer that can solve 29 equations simultaneously. This marks the first time that a computer can store information in its main memory.


1943-1944: Two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, build the Electronic Numerical Integrator and Computer (ENIAC). Considered the grandfather of digital computers, it fills a room 20 feet by 40 feet and has 18,000 vacuum tubes.


1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.



1947: William Shockley, John Bardeen, and Walter Brattain of Bell Laboratories invent the transistor. They figured out how to make an electrical switch out of solid materials and without the need for a vacuum.


1953: Grace Hopper develops the first computer language compiler; her work eventually leads to COBOL. Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives of the IBM 701 EDPM to help the United Nations keep track of Korea during the war.

1954: The FORTRAN programming language, an acronym for FORmula TRANslation, is developed by a team of programmers at IBM led by John Backus, according to the University of Michigan.


1958: Jack Kilby and Robert Noyce reveal the integrated circuit, known as the computer chip. Kilby was awarded the Nobel Prize in Physics in 2000 for his work.


1964: Douglas Engelbart demonstrates a prototype of the modern computer, complete with a mouse and a graphical user interface (GUI). This marks the evolution of the computer from a specialized machine for scientists and mathematicians to a technology more accessible to the general public.

1969: A group of developers at Bell Labs produce UNIX, an operating system that addresses compatibility issues. Later rewritten in the C programming language, UNIX was portable across multiple platforms and became the operating system of choice among mainframes in large companies and government entities. Due to the slow nature of the system, it never gained traction among home PC users.

1970: Intel introduces the Intel 1103, the first dynamic random-access memory (DRAM) chip.


1971: Alan Shugart leads a team of IBM engineers who invent the "floppy disk," allowing data to be shared between computers.



1973: Robert Metcalfe, a member of the Xerox research staff, develops Ethernet to connect various computers and other hardware.



1974-1977: A series of personal computers hit the market, including the Scelbi, the Mark-8, the Altair, the IBM 5100, Radio Shack's TRS-80 (affectionately known as the "Trash 80"), and the Commodore PET.


1975: The January issue of Popular Electronics magazine features the Altair 8800, described as the "world's first minicomputer kit to rival commercial models." Two "computer geeks," Paul Allen and Bill Gates, offer to write software for the Altair, using the new BASIC language. On April 4, after the success of this first effort, the two childhood friends form their own software company, Microsoft.


1976: Steve Jobs and Steve Wozniak start Apple Computer on April Fool's Day and release the Apple I, the first computer with a single circuit board, according to Stanford University.


The TRS-80, introduced in 1977, was one of the first machines whose documentation was intended for non-geeks.


1977: Radio Shack's initial production run of the TRS-80 was just 3,000 units. It sold like crazy. For the first time, non-geeks could write programs and make a computer do what they wanted.



1977: Jobs and Wozniak incorporate Apple and display the Apple II at the first West Coast Computer Faire. It offers color graphics and incorporates an audio cassette drive for storage.


1978: Accountants rejoice with the introduction of VisiCalc, the first computerized spreadsheet program.


1979: Word processing becomes a reality as MicroPro International launches WordStar. "The defining change was adding margins and word wrap," creator Rob Barnaby said in an email to Mike Petrie in 2000. "Additional changes included getting rid of the command mode and adding a print function. I was the technical brain: I figured out how to do it, and did it, and documented it."


1981: IBM's first personal computer, codenamed "Acorn," is introduced. It uses Microsoft's MS-DOS operating system. It has an Intel chip, two floppy disk drives, and an optional color monitor. Sears & Roebuck and Computerland sell the machines, marking the first time a computer is available through third-party dealers. It also popularizes the term PC.


1983: Apple's Lisa is the first personal computer with a GUI. It also features drop-down menus and icons. It flops but eventually evolves into the Macintosh. The Gavilan SC is the first portable computer with the familiar flip form factor and the first to be marketed as a "laptop."


1985: Microsoft announces Windows, according to the Encyclopedia Britannica. This was the company's response to Apple's graphical user interface. Commodore introduces the Amiga 1000, which features advanced audio and video capabilities.


1985: The first dot-com domain name is registered on March 15, years before the World Wide Web marked the formal start of Internet history. The Symbolics Computer Company, a small Massachusetts computer manufacturer, registers Symbolics.com. More than two years later, only 100 dot-coms had been registered.


1986: Compaq brings the Deskpro 386 to market. Its 32-bit architecture provides speed comparable to that of mainframes.


1990: Tim Berners-Lee, a researcher at CERN, the high-energy physics laboratory in Geneva, develops Hypertext Markup Language (HTML), giving birth to the World Wide Web.


1993: The Pentium microprocessor advances the use of graphics and music in PCs.


1994: PCs become gaming machines as "Command & Conquer," "Alone in the Dark 2," "Theme Park," "Magic Carpet," "Descent" and "Little Big Adventure" are among the games to go to market.


1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.



1997: Microsoft invests $150 million in Apple, which was struggling at the time, ending Apple's court case against Microsoft alleging that Microsoft had copied the "look and feel" of its operating system.


1999: The term Wi-Fi becomes part of computer parlance and users begin to connect to the Internet without wires.


2001: Apple introduces the Mac OS X operating system, which provides protected memory architecture and preemptive multitasking, among other benefits. Not to be outdone, Microsoft rolls out Windows XP, which has a significantly redesigned GUI.


2003: AMD's Athlon 64, the first 64-bit processor for the consumer market, becomes available.


2004: Mozilla's Firefox 1.0 challenges Microsoft's Internet Explorer, the dominant web browser. Facebook, a social networking site, launches.


2005: YouTube, a video-sharing service, is founded. Google acquires Android, a Linux-based operating system for mobile phones.


2006: Apple introduces the MacBook Pro, its first Intel-based dual-core mobile computer, as well as an Intel-based iMac. Nintendo's Wii game console hits the market.


2007: The iPhone brings many computer features to the smartphone.


2009: Microsoft releases Windows 7, which offers the ability to pin apps to the taskbar and advances in handwriting and touch recognition, among other features.


2010: Apple introduces the iPad, changing the way consumers view media and boosting the dormant tablet segment.


2011: Google launches the Chromebook, a laptop running the Google Chrome operating system.


2012: Facebook reaches 1 billion users on October 4.


2015: Apple launches the Apple Watch. Microsoft releases Windows 10.





2016: The first reprogrammable quantum computer is created. "Until now, there hasn't been any quantum computing platform that had the ability to program new algorithms into its system. Typically, each one is designed to attack a particular algorithm," said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.





















