The Linguistic Culture-5 (USA-1)
Within a few years, California became a booming industrial state and the military center of the USA. After World War II, the Stanford Research Institute (SRI) was founded to provide the industry with more skilled specialists and to increase the number of companies in Santa Clara County. More firms - among them Hewlett-Packard, one of the first residents - settled their departments in this park. During the Korean War the US government awarded Stanford a great many projects, which led more and more electronics companies (among them IBM and Lockheed) to open R&D departments in Santa Clara County.
Military funding for high-tech products was responsible for the rapid growth of Silicon Valley. Such firms as FMC, GTE, Varian Associates, Westinghouse, and finally Lockheed opened R&D departments in the Stanford Research Park, and Lockheed started its Missiles and Space Company (LMSC) in Sunnyvale. They were to become the core of the early explosive growth of Silicon Valley. Lockheed's move to Northern California (the company now has 24,000 employees) was crucial for the developments in Santa Clara County.

The invention of the microprocessor in the early 1970s represented the next step towards the modern way of computing, providing the basis for the subsequent personal computer revolution. The first microprocessor was designed at Intel Corporation (short for Integrated Electronics) and became the key to modern personal computers.
With its logic and memory chips, the company started providing the basic components for microcomputers. Intel, the most successful semiconductor company, is regarded as Silicon Valley's flagship, owing its worldwide leading role to perpetually high spending on research and development (R&D).

The corporation was founded in 1968 by Bob Noyce together with Gordon Moore and Andy Grove. Their aim was to embark on a new venture and "to regain the satisfaction of research and development". After Bob Noyce had developed a new photochemical process, the three engineers developed the idea of integrating many transistors on a chip of silicon.
Initially they focused on building the first semiconductor chips used for computer memory, which could replace the dominant memory storage technology of the time, called "magnetic core". The young company started with 12 employees and, with its first two products, gained the technological lead in the field of memory chips.

Within a year, Intel developed its first product - the 3101 Schottky bipolar 64-bit static random access memory (SRAM) - which was soon followed by the 1101.
This chip (the 1101) was a 256-bit SRAM developed with Intel's new "silicon gate metal-oxide semiconductor (MOS) process".

Intel's first really successful product was the 1103 dynamic random access memory (DRAM), which was manufactured in the MOS process. Introduced in 1970, this chip was the "first merchant market LSI (large-scale integrated) DRAM", and it received broad acceptance because it was superior to magnetic core memories. By the end of 1971, the 1103 had become the world's largest-selling semiconductor device and provided the capital for Intel's early growth.

To this day, the semiconductor industry has adhered to Moore's Law, which was framed by the cofounder of Fairchild and Intel when the first commercial DRAMs appeared in the early 1970s. This law predicts that the price per bit drops by 30% every year; it implies that one will receive 30% more power (speed/capacity) at the same price. Moore's Law, which could be applied to both memory chips and microprocessors, reflected the unprecedentedly rapid progress in microelectronics.
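As a rough worked illustration of the compounding this implies (assuming only the 30% annual drop in price per bit quoted above), the cost of a bit after n years of such declines is:

$$
p_n = p_0 \cdot (1 - 0.30)^n = p_0 \cdot 0.7^{\,n}, \qquad p_{10} = p_0 \cdot 0.7^{10} \approx 0.028\, p_0 .
$$

In other words, a decade of 30% annual declines leaves a bit of memory costing only about 3% of its original price, which is why the progress felt so dramatic.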
Intel's revenues surpassed operating expenses for the first time in 1971. That year the company introduced a new memory chip, the EPROM ("erasable, programmable read-only memory").
Invented by Intel's Dov Frohman, the new memory could store data permanently, yet it could also be erased simply by a beam of ultraviolet light and used again. The invention of the microprocessor marked a turning point in Intel's history. It showed the real significance of the EPROM, which could be used by original equipment manufacturer customers to store microprocessor programs in a "flexible and low-cost way". The unexpected synergy between the EPROM and the microprocessor resulted in a growing market for both chips and contributed a great deal to Intel's early success.

The story of a further technological breakthrough began in 1969, when a Japanese calculator manufacturer, Busicom, asked Intel to design a set of chips for a family of programmable calculators.
Marcian "Ted" Hoff, a young and very bright ex-Stanford research associate who had joined Intel as employee number 12, was put in charge of this project. However, Ted Hoff did not like the Japanese design, which called for 12 custom chips, each assigned a distinct task. Hoff thought that designing so many different chips would make the calculators very expensive. His idea was to develop a four-chip set with a general-purpose logic device at its center, which could be programmed by instructions stored on a semiconductor memory chip. With the help of a new employee, Stan Mazor, Hoff perfected the design of what would become the 4004 arithmetic chip.
After Busicom had accepted Hoff's chip set, Federico Faggin, one of the best chip design experts, began transforming the design into silicon. The 4004 microprocessor, a 4-bit chip (it processes 4 bits - a string of four ones or zeroes - of information at a time), contained 2,300 MOS transistors and was as powerful as the legendary first electronic computer, the ENIAC.
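To make the "4-bit" figure concrete, here is a minimal, purely illustrative sketch in Python (it is not related to Intel's actual instruction set): a 4-bit processor handles values that fit in four binary digits, i.e. 0 through 15, and anything larger has to be processed in 4-bit pieces.

```python
# Illustrative only: what "4 bits at a time" means for a chip like the 4004.
WORD_BITS = 4
MAX_VALUE = 2 ** WORD_BITS - 1          # 4 bits can hold the values 0..15

def to_nibbles(value: int) -> list[str]:
    """Split a non-negative integer into 4-bit chunks ("nibbles"),
    most significant first - the way a wider number would have to be
    handled piece by piece on a 4-bit processor."""
    if value <= MAX_VALUE:
        return [format(value, "04b")]
    nibbles = []
    while value > 0:
        nibbles.append(format(value & MAX_VALUE, "04b"))  # keep the low 4 bits
        value >>= WORD_BITS                               # then drop them
    return list(reversed(nibbles))

print(to_nibbles(9))      # ['1001']                  - fits in one 4-bit word
print(to_nibbles(2300))   # ['1000', '1111', '1100']  - needs three words
```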
Soon after the first 4004s had been delivered, Intel realized the market potential of the chip and successfully renegotiated with the Japanese to regain the exclusive rights, which had been sold to Busicom. In November 1971, Intel introduced the 4004 to the public in Electronic News ads. The ads announced not just a new product but "a new era of integrated electronics" - a microprogrammable computer on a chip.

The microprocessor is - as Gordon Moore called it - "one of the most revolutionary products in the history of mankind", and it ranks as one of 12 milestones of American technology in a survey by "US News and World Report" (1982). The introduction of the microprocessor made possible the creation of the microcomputer.

Today, Intel supplies the computing and communications industries with chips, boards and systems building blocks that are the "ingredients" of computers, servers, and networking and communications products.
Industry members use these products to create advanced computing and communications systems. Intel's mission is to be the preeminent building block supplier to the worldwide Internet economy.

Communications building blocks for next-generation networks and Internet data centers are offered at various levels of integration. These products are used in communications servers, network appliances and computer telephony integration equipment. Component-level building blocks include communications silicon such as network processors and other board-level components, software and embedded control chips. These products are integrated into communications hardware such as hubs, routers, switches and servers for local and wide area networking applications. Embedded control chips are also used in laser printers, automotive systems and other applications.

Intel's measures resulted in a remarkable technological lead over its competitors. The most significant consequence, and a landmark in the company's development, was IBM's decision to rely on the Intel 8088 microprocessor for its PCs in 1980.

IBM (short for International Business Machines) has been the world's leading company in big mainframe computers since the 1950s.
Due to its dominance, it was often compared to a giant and referred to as "Big Blue". Because of IBM's dominance and worldwide reputation, its PCs soon became the industry standard and penetrated the office market. Other established computer companies followed and introduced their own PCs - the so-called "clones" - which were compatible with IBM's models. To maintain compatibility, all these manufacturers were forced to rely on Intel's microprocessors, which were thus bootstrapped to industry standard, too.
MS-DOS was chosen as the IBM PC's operating system and became the industry standard, essential to every IBM-compatible PC.

The Apple company provides one of Silicon Valley's most famous stories. It shows features that are typical of most start-up firms in the valley; however, it is unique, and its early success and its contribution to the personal computer field are unmatched.

Apple's history starts with the story of two young and exceptional people, the "two Steves", who began building a computer in their garage and launched the microcomputer revolution, changing our daily life in many respects.

Stephen G. Wozniak was a typical Silicon Valley child. Born in 1950, he grew up with the electronics industry in Silicon Valley and became intrigued by electronics from the very start, since his father was an electronics engineer.
Wozniak, known to his friends as "Woz", was an electronics genius. At the age of 13, he won the highest award at a local science fair for his addition-subtraction machine. His electronics teacher at Homestead High School recognized Woz's outstanding talent and arranged a job for him at a local company, where Steve could work with computers once a week. It was there that Wozniak saw the capabilities of a computer (it was a DEC PDP-8 minicomputer).

In 1971, Wozniak built his first computer with his high-school friend Bill Fernandez. This computer (they called it the Cream Soda Computer) was developed in his friend's garage. Bill introduced Woz to a friend of his, named Steven P. Jobs.
Jobs' parents were - like most other people in Silicon Valley - blue-collar workers. Growing up in an environment full of electronics, Steve came into contact with this fascinating technology and was caught by it. Jobs was a loner, and his character can be described as brash, very ambitious and unshakably self-confident. With his directness and persistence he managed to persuade most people. He had the ability to convey his notions and visions to other people quite well. And he was not afraid to talk to famous people until they gave in and did what he wanted.

In 1972, Steve Jobs went to Reed College in Oregon, but he dropped out a year later and returned to Silicon Valley, where he took a job with a young video game company, Atari, which at that time planned to develop a new game called "Breakout". Jobs boasted he could design it quicker and better than anyone else. Jobs told his friend Woz about it, and the two designed the game in record time, working for four days and nights, and were paid the promised $700 for it.