Wednesday, December 11, 2019

History Of The Computer Industry In America (2416 words) Essay Example For Students

Only once in a lifetime will a new invention come about that touches every aspect of our lives. Such a device, one that changes the way we work, live, and play, is a special one indeed. A machine that has done all this and more now exists in nearly every business in the U.S. and in one out of every two households (Hall, 156). This incredible invention is the computer. The electronic computer has been around for over half a century, but its ancestors have been around for 2,000 years. However, only in the last 40 years has it changed American society. From the first wooden abacus to the latest high-speed microprocessor, the computer has changed nearly every aspect of people's lives for the better.

The very earliest ancestor of the modern-day computer is the abacus, which dates back almost 2,000 years. It is simply a wooden rack holding parallel wires on which beads are strung. When these beads are moved along the wires according to programming rules that the user must memorize, all ordinary arithmetic operations can be performed (Soma, 14). The next innovation in computers took place in 1642, when Blaise Pascal invented the first "digital calculating machine." It could only add numbers, and they had to be entered by turning dials. It was designed to help Pascal's father, who was a tax collector (Soma, 32).

In the early 1800s, a mathematics professor named Charles Babbage designed an automatic calculating machine. It was steam-powered and could store up to 1,000 50-digit numbers. Built into his machine were operations that included everything a modern general-purpose computer would need. It was programmed by, and stored data on, cards with holes punched in them, appropriately called "punch cards."
His inventions were failures for the most part because of the lack of precision machining techniques available at the time and the lack of demand for such a device (Soma, 46).

After Babbage, people began to lose interest in computers. However, between 1850 and 1900 there were great advances in mathematics and physics that began to rekindle that interest (Osborne, 45). Many of these new advances involved complex calculations and formulas that were very time-consuming for human calculation. The first major use for a computer in the U.S. was during the 1890 census. Two men, Herman Hollerith and James Powers, developed a new punched-card system that could automatically read information on cards without human intervention (Gulliver, 82). Since the population of the U.S. was increasing so fast, the computer was an essential tool in tabulating the totals.

These advantages were noted by commercial industries and soon led to the development of improved punch-card business-machine systems by International Business Machines (IBM), Remington-Rand, Burroughs, and other corporations. By modern standards the punched-card machines were slow, typically processing from 50 to 250 cards per minute, with each card holding up to 80 digits. At the time, however, punched cards were an enormous step forward; they provided a means of input, output, and memory storage on a massive scale. For more than 50 years following their first use, punched-card machines did the bulk of the world's business computing and a good portion of the computing work in science (Chposky, 73).

By the late 1930s, punched-card machine techniques had become so well established and reliable that Howard Hathaway Aiken, in collaboration with engineers at IBM, undertook construction of a large automatic digital computer based on standard IBM electromechanical parts. Aiken's machine, called the Harvard Mark I, handled 23-digit numbers and could perform all four arithmetic operations. It also had special built-in programs to handle logarithms and trigonometric functions.
The Mark I was controlled from prepunched paper tape. Output was by card punch and electric typewriter. It was slow, requiring 3 to 5 seconds for a multiplication, but it was fully automatic and could complete long computations without human intervention (Chposky, 103).

The outbreak of World War II produced a desperate need for computing capability, especially for the military. New weapons systems were produced that needed trajectory tables and other essential data. In 1942, John P. Eckert, John W. Mauchly, and their associates at the University of Pennsylvania decided to build a high-speed electronic computer to do the job. This machine became known as ENIAC, for Electronic Numerical Integrator And Computer. It could multiply two numbers at the rate of 300 products per second by finding the value of each product in a multiplication table stored in its memory. ENIAC was thus about 1,000 times faster than the previous generation of computers (Dolotta, 47).

ENIAC used 18,000 standard vacuum tubes, occupied 1,800 square feet of floor space, and consumed about 180,000 watts of electricity. It used punched-card input and output. ENIAC was very difficult to program because one essentially had to re-wire it to perform each new task. It was, however, efficient in handling the particular programs for which it had been designed. ENIAC is generally accepted as the first successful high-speed electronic digital computer and was used in many applications from 1946 to 1955 (Dolotta, 50).

The mathematician John von Neumann was very interested in ENIAC. In 1945 he undertook a theoretical study of computation that demonstrated that a computer could have a very simple, fixed structure and yet be able to execute any kind of computation effectively by means of properly programmed control, without the need for any changes in hardware. Von Neumann came up with incredible ideas for methods of building and organizing practical, fast computers.
Many companies, some new to the computer field, introduced programmable minicomputers supplied with software packages in the 1970s. The size-reduction trend continued with the introduction of personal computers, which are programmable machines small enough and inexpensive enough to be purchased and used by individuals (Rogers, 153).

One of the first such machines was introduced in January 1975. Popular Electronics magazine provided plans that would allow any electronics wizard to build his own small, programmable computer for about $380 (Rose, 32). The computer was called the "Altair 8800." Programming it involved pushing buttons and flipping switches on the front of the box. It didn't include a monitor or keyboard, and its applications were very limited (Jacobs, 53). Even so, many orders came in for it, and several famous founders of computer and software manufacturing companies got their start in computing through the Altair. For example, Steve Jobs and Steve Wozniak, founders of Apple Computer, built a much cheaper, yet more productive, version of the Altair and turned their hobby into a business (Fluegelman, 16).

After the introduction of the Altair 8800, the personal computer industry became a fierce battleground of competition. IBM had been the computer industry standard for well over a half-century. It held its position as the standard when it introduced its first personal computer, the IBM PC, in 1981 (Chposky, 156). However, the newly formed Apple Computer company was releasing its own personal computer, the Apple II (the Apple I was the first computer designed by Jobs and Wozniak in Wozniak's garage, but it was not produced on a wide scale). Software was needed to run the computers as well. Microsoft developed a Disk Operating System (MS-DOS) for the IBM computer, while Apple developed its own software system (Rose, 37). Because Microsoft had now set the software standard for IBMs, every software manufacturer had to make their software compatible with Microsoft's.
This would lead to huge profits for Microsoft (Cringley, 163).

The main goal of the computer manufacturers was to make the computer as affordable as possible while increasing speed, reliability, and capacity. Nearly every computer manufacturer accomplished this, and computers popped up everywhere. Computers were in businesses keeping track of inventories. Computers were in colleges aiding students in research. Computers were in laboratories making complex calculations at high speeds for scientists and physicists. The computer had made its mark everywhere in society and built up a huge industry (Cringley, 174).

The future is promising for the computer industry and its technology. The speed of processors is expected to double every year and a half in the coming years. As manufacturing techniques are further perfected, the prices of computer systems are expected to steadily fall. However, since microprocessor technology will keep advancing, its higher costs will offset the drop in price of older processors. In other words, the price of a new computer will stay about the same from year to year, but its technology will steadily improve (Zachary, 42).

Since the end of World War II, the computer industry has grown from a standing start into one of the biggest and most profitable industries in the United States. It now comprises thousands of companies, making everything from multi-million-dollar high-speed supercomputers to printout paper and floppy disks. It employs millions of people and generates tens of billions of dollars in sales each year (Malone, 192). Surely, the computer has impacted every aspect of people's lives. It has affected the way people work and play. It has made everyone's life easier by doing difficult work for people. The computer truly is one of the most incredible inventions in history.

Works Cited

Chposky, James. Blue Magic. New York: Facts on File Publishing, 1988.

Cringley, Robert X. Accidental Empires. Reading, MA: Addison-Wesley Publishing, 1992.

Dolotta, T.A. Data Processing: 1940-1985. New York: John Wiley & Sons, 1985.

Fluegelman, Andrew. "A New World," MacWorld. San Jose, CA: MacWorld Publishing, February 1984 (premiere issue).

Gulliver, David. Silicon Valley and Beyond. Berkeley, CA: Berkeley Area Government Press, 1981.

Hall, Peter. Silicon Landscapes. Boston: Allen & Unwin, 1985.

Hazewindus, Nico. The U.S. Microelectronics Industry. New York: Pergamon Press, 1988.

Jacobs, Christopher W. "The Altair 8800," Popular Electronics. New York: Popular Electronics Publishing, January 1975.

Malone, Michael S. The Big Score: The U.S. Computer Industry. Garden City, NY: Doubleday & Co., 1985.

Osborne, Adam. Hypergrowth. Berkeley, CA: Idthekkethan Publishing Company, 1984.

Rogers, Everett M. Silicon Valley Fever. New York: Basic Books, Inc., 1984.

Rose, Frank. West of Eden. New York: Viking Publishing, 1989.

Shallis, Michael. The Silicon Idol. New York: Schocken Books, 1984.

Soma, John T. The History of the Computer. Toronto: Lexington Books, 1976.

Zachary, William. "The Future of Computing," Byte. Boston: Byte Publishing, August 1994.
