Intel’s revenues surpassed operating expenses for the first time in 1971. That year the company introduced a new memory chip, the EPROM (“erasable, programmable read-only memory”). Invented by Intel’s Dov Frohman, the new memory could store data permanently, yet could also be erased simply by a beam of ultraviolet light and used again. The invention of the microprocessor marked a turning point in Intel’s history. It showed the real significance of the EPROM, which could be used by original equipment manufacturer customers to store microprocessor programs in a “flexible and low-cost way”. The unexpected synergy between the EPROM and the microprocessor resulted in a growing market for both chips and contributed a great deal to Intel’s early success.
The story of the next technological breakthrough began in 1969, when the Japanese calculator manufacturer Busicom asked Intel to design a set of chips for a family of programmable calculators. Marcian “Ted” Hoff, a young and very bright ex-Stanford research associate who had joined Intel as employee number 12, was put in charge of this project. However, Hoff did not like the Japanese design, which called for 12 custom chips, each assigned a distinct task. He thought that designing so many different chips would make the calculators very expensive. His idea was to develop a four-chip set with a general-purpose logic device at its center, which could be programmed by instructions stored on a semiconductor memory chip. With the help of new employee Stan Mazor, Hoff perfected the design of what would become the 4004 arithmetic chip. After Busicom had accepted Hoff’s chip set, Federico Faggin, one of the best chip design experts, began transforming the design into silicon. The 4004 microprocessor, a 4-bit chip (it processes 4 bits of information, a string of four ones or zeroes, at a time), contained 2,300 MOS transistors and was as powerful as the legendary first electronic computer, the ENIAC.
Soon after the first 4004s had been delivered, Intel realized the market potential of the chip and successfully renegotiated with the Japanese to regain the exclusive rights, which had been sold to Busicom. In November 1971, Intel introduced the 4004 to the public in Electronic News ads. They announced not just a new product but “a new era of integrated electronics”: a micro-programmable computer on a chip. The microprocessor is, as Gordon Moore called it, “one of the most revolutionary products in the history of mankind”, and it ranked as one of 12 milestones of American technology in a 1982 survey by U.S. News and World Report. The introduction of the microprocessor made possible the creation of the microcomputer.
Today, Intel supplies the computing and communications industries with chips, boards and systems building blocks that are the “ingredients” of computers, servers, and networking and communications products. Industry members use these products to create advanced computing and communications systems. Intel’s mission is to be the preeminent building block supplier to the worldwide Internet economy.
Communications building blocks for next-generation networks and Internet data centers are offered at various levels of integration. These products are used in communications servers, network appliances and computer telephony integration equipment.
Component-level building blocks include communications silicon such as network processors and other board-level components, software and embedded control chips. These products are integrated in communications hardware such as hubs, routers, switches and servers for local and wide area networking applications. Embedded control chips are also used in laser printers, automotive systems and other applications.
Intel’s measures gave it a remarkable technological lead over its competitors. The most significant consequence, a landmark in the company’s development, was IBM’s decision in 1980 to rely on the Intel 8088 microprocessor for its PCs.
IBM (short for International Business Machines) had been the world’s leading maker of big mainframe computers since the 1950s. Owing to its dominance, it was often compared to a giant and referred to as “Big Blue”. Because of IBM’s dominance and worldwide reputation, its PCs soon became the industry standard and penetrated the office market. Other established computer companies followed and introduced their own PCs, the so-called “clones”, which were compatible with IBM’s models. To maintain compatibility, all these manufacturers were forced to rely on Intel’s microprocessors, which were thus bootstrapped into an industry standard too. MS-DOS was chosen as the IBM PC’s operating system and also became an industry standard, essential to every IBM-compatible PC.
The Apple company provides one of Silicon Valley’s most famous stories. It shows features that are typical of most start-up firms in the valley; however, it is unique, and its early success and its contribution to the personal computer field are unmatched.
Apple’s history starts with the story of two young and exceptional people, the “two Steves”, who began building a computer in their garage and launched the microcomputer revolution, changing our daily life in many respects.
Stephen G. Wozniak was a typical Silicon Valley child. Born in 1950, he grew up with the electronics industry in Silicon Valley and was intrigued by electronics from the very start, since his father was an electronics engineer. Wozniak, known to his friends as “Woz”, was an electronics genius. At the age of 13, he won the highest award at a local science fair for his addition-subtraction machine. His electronics teacher at Homestead High School recognized Woz’s outstanding talent and arranged a job for him at a local company, where Steve could work with computers once a week. It was there that Wozniak saw the capabilities of a computer (a DEC PDP-8 minicomputer).
In 1971, Wozniak built his first computer with his high-school friend Bill Fernandez. This computer (they called it the Cream Soda Computer) was developed in his friend’s garage. Bill introduced Woz to a friend of his named Steven P. Jobs. Jobs’s parents were, like most other people in Silicon Valley, blue-collar workers. Growing up in an environment full of electronics, Steve came into contact with this fascinating technology and was captivated by it. Jobs was a loner, and his character can be described as brash, very ambitious and unshakably self-confident. With his directness and persistence he managed to persuade most people. He had the ability to convey his notions and visions to other people quite well. And he was not afraid to talk to famous people until they gave in and did what he wanted.
In 1972, Steve Jobs went to Reed College in Oregon, but dropped out a year later and returned to Silicon Valley, where he took a job with the young video game company Atari, which at that time planned to develop a new game called “Breakout”. Jobs boasted he could design it quicker and better than anyone else. He told his friend Woz about it, and the two designed the game in record time, working four days and nights, and were paid the promised $700 for it. This experience showed them that they could work together on a tough project and succeed.
When the Homebrew Computer Club came into existence, Wozniak began attending its meetings. There he met people who shared his love for computers and exchanged technical expertise with them. Soon after, Chuck Peddle at MOS Technology released his new 6502 microprocessor chip for only $20, a sensation compared with the usual price of $400. Woz saw his chance and decided to write the first BASIC for the chip, BASIC being the most widespread programming language at the time. After finishing the BASIC, he built a computer for it to run on. The other hobbyists at Homebrew were impressed by Wozniak’s kit, which was actually a board with chips and interfaces for a keyboard and a video monitor.
The breakthrough for the two Steves came in July 1976, when Paul Terrell ordered 50 Apples for his Byte Shop, on condition, however, that the computers were fully assembled in a case and equipped with a cassette interface to enable external data storage. Working hard in Jobs’s parents’ garage, they managed to construct the 50 Apples within 30 days.
The Apple I was continuously refined by Wozniak, and its sales made the young company known, partly because the company’s name appeared at the top of the lists of computers that electronics magazines published in alphabetical order. By the time the first Apples were being sold, Steve Wozniak had already begun working on another computer, the Apple II. This machine had several special features that had not appeared in any microcomputer before and would make it the milestone product that ushered in the personal computer age.
Steve Jobs’s persistence persuaded Wozniak to build up a company. In 1979, Daniel Fylstra, a programmer from Boston, released VisiCalc for the Apple II. This spreadsheet was a novelty in computer software. It simplified business calculations considerably and could be used for financial forecasting. It was the first application that made personal computers a practical tool for people who did not know how to write their own programs. VisiCalc was very successful and contributed to the skyrocketing sales of the Apple II.
The same year Mike Markkula made another decision that was important for Apple’s future growth. His idea was to create a new market in the field of education and schools. The Apple Education Foundation was established, which granted complete Apple systems equipped with learning software to schools. This market would account for a major part of the company’s sales in the subsequent years, since the Apple II soon became the most popular machine for students.

Apple remains the second-biggest personal computer manufacturer after IBM and has released innovative products such as QuickTime, easy-to-use multimedia software combining sound, video and animation. A further development is the Newton, a personal digital assistant (PDA), which serves as an electronic notepad and “integrates advanced hand-writing recognition, communication and data-management technologies”.
At practically the same time, a graduate student at Stanford University, Andy Bechtolsheim, conceived and designed the Sun workstation for the Stanford University Network communication project. In February 1982 he, together with Vinod Khosla and Scott McNealy, founded Sun Microsystems (SUN stands for Stanford University Network); the company made its initial public offering in 1986 under the stock symbol SUNW, which was changed in 2007 to JAVA.
At present Sun holds the rights to the widely used Java programming language and offers certification and support to the Java development community. The company makes network computing products such as workstations, servers, storage systems, network switches, software and microprocessors, and provides associated services and support, with its mission to connect everyone, everywhere via Sun solutions.
William Henry Gates, the founder and leader of Microsoft Corporation, was born into the family of an upper-middle-class businessman in Seattle. He went to Lakeside Prep School, where he was first introduced to computers. At that time, computers were still too bulky and expensive for the school to purchase its own, but the school made agreements with various companies that allowed its students to use their computers. Bill Gates, his friend Paul Allen and a handful of other students took up computing. They read books on computers and tried to write programs, hack the systems, and alter and crash the files. Soon Bill and his friends were invited by the computer company to find bugs and explore weaknesses in the system. According to Gates, “the boys used their time eating, drinking, and breathing computers”. When the company that had hired the group went out of business in 1970, the boys were soon hired by Information Sciences Inc. to write a payroll program. Later they were also contracted by other computer firms to find bugs and fix them.
In 1973 Gates enrolled at Harvard University as a prelaw student, though he spent most of his time programming in the campus computer center. A year later his friend Paul Allen showed Bill the picture of the first personal microcomputer on the cover of the magazine “Popular Electronics”, along with a lengthy article. They both realized that their “star time” had come: the home PC business was about to explode and needed software for the machines. Gates arranged a meeting with the Altair’s manufacturers, and by the time of the appointment Gates and Allen had already got their BASIC interpreter, the result of their feverish night work. They sold the program and licensed it to their first customer, MITS.
After Bill Gates had dropped out of Harvard, Paul Allen also left MITS (where he had been invited to the position of Director of Software) to devote his time completely to their new joint company, Microsoft (1977). The company went through some rough first years, coming out with its second programming language, FORTRAN, and its third, COBOL. In 1980 Microsoft released the Z-80 SoftCard and announced XENIX OS, an interactive, multi-user, multi-tasking system compatible with programs written for the UNIX OS.
In 1981 Microsoft became a corporation in the State of Washington, with Bill Gates as President and Chairman of the Board and Paul Allen as Executive Vice President, and went on to introduce its multi-feature word processing program for the PC, Microsoft Word for MS-DOS 1.0, and the Microsoft Mouse.
In 1984 Microsoft took the leading role in developing software for the Apple Macintosh, created a new Hardware and Peripheral Division and announced software for IBM’s new personal computer, the IBM PC AT. When Microsoft celebrated its 10th anniversary in August 1985, it already employed 1,442 workers and had expanded its growing empire to Europe.