Computer Science. The English Language Perspective
This was actually a series of 14 models that offered successively greater memory capacity and processing speed while maintaining compatibility, so that programs developed on a smaller, cheaper model would also run on the more expensive machines.

By the mid-1960s, however, a new market segment had come into being: the minicomputer, pioneered by Digital Equipment Corporation (DEC). Architecturally, the mini usually had a shorter data word length than the mainframe, and used indirect addressing for flexibility in accessing memory.
Minis were practical for uses in offices and research labs that could not afford a mainframe.

In programming, the main innovation of the 1960s was the promulgation of the first widely used, high-level programming languages, COBOL (for business) and FORTRAN (for scientific and engineering calculations). The new higher-level languages made it easier for professionals outside the computer field to learn to program and made the programs themselves more readable, and thus easier to maintain. The invention of the compiler was yet another fruit of the stored program concept.

The 1970s saw minis becoming more powerful and versatile. Meanwhile, at the high end, Seymour Cray left CDC to form Cray Research, a company that would produce the world's fastest supercomputer, the compact, Freon-cooled Cray-1. In the mainframe mainstream, IBM's 370 series maintained that company's dominant market share in business computing.

The most striking innovation of the decade, however, was the microcomputer.
The microcomputer combined three basic ideas: an integrated circuit so compact that it could be laid on a single silicon chip, the design of that circuit to perform the essential addressing and arithmetic functions required for a computer, and the use of microcode to embody the fundamental instructions. Intel's 4004, introduced in late 1971, was originally designed to sell to a calculator company. When that deal fell through, Intel started distributing the microprocessor in developer's kits to encourage innovators to design computers around it.

Word of the microprocessor spread through the electronics hobbyist community and was given a boost by the January 1975 issue of Popular Electronics, which featured the Altair computer kit, available from an Albuquerque company called MITS for about $400.

The Altair was hard to build and had very limited memory, but it was soon joined by ready-to-use microcomputer systems designed and marketed by other companies; these machines soon became known as personal computers (PCs).
By 1980, entries in the field included Apple (Apple II), Commodore (PET), and Radio Shack (TRS-80). These computers shared certain common features: a microprocessor, memory in the form of plug-in chips, read-only memory chips containing a rudimentary operating system and a version of the BASIC language, and an expansion bus to which users could connect peripherals such as disk drives or printers.

Meanwhile, programming and the art of software development did not stand still. Innovations of the 1970s included the philosophy of structured programming. New languages such as Pascal and C supported structured programming design to varying degrees.
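As a rough illustration (not from the text) of what the structured approach meant in practice, here is a small sketch in Python rather than the Pascal or C of the era: the program is broken into single-purpose routines with explicit loops and branches instead of unstructured jumps. The function names and sample data are invented for the example.

# A minimal sketch of structured programming: one task per function,
# control flow expressed with loops and conditionals rather than goto.

def read_scores(lines):
    """Convert raw text lines into a list of numeric scores."""
    scores = []
    for line in lines:
        line = line.strip()
        if line:                      # skip blank lines
            scores.append(float(line))
    return scores

def average(scores):
    """Return the arithmetic mean, or 0.0 for an empty list."""
    if not scores:
        return 0.0
    return sum(scores) / len(scores)

if __name__ == "__main__":
    sample = ["72", "88", "", "95"]   # stand-in for a data file
    print("Average:", average(read_scores(sample)))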
Programmers on college campuses also had access to UNIX, a powerful operating system containing a relatively simple kernel, a shell for interaction with users, and a growing variety of utility programs that could be connected together to solve data processing problems. It was in this environment that the government-funded ARPANET developed protocols for communicating between computers and allowing remote operation of programs. Along with this came e-mail, the sharing of information in newsgroups (Usenet), and a growing web of links between networks that would eventually become the Internet.
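The idea of connecting utility programs together can be sketched with a short, illustrative driver. The example below is not from the text; it assumes a Unix-like system with the standard sort and uniq commands and chains them from Python so the data flow is explicit, equivalent to the shell pipeline sort | uniq -c.

# Illustrative sketch: the output of one utility becomes the input of the next.
import subprocess

text = "banana\napple\nbanana\ncherry\n"

sort_proc = subprocess.Popen(["sort"], stdin=subprocess.PIPE,
                             stdout=subprocess.PIPE, text=True)
uniq_proc = subprocess.Popen(["uniq", "-c"], stdin=sort_proc.stdout,
                             stdout=subprocess.PIPE, text=True)

sort_proc.stdin.write(text)    # feed the first stage
sort_proc.stdin.close()        # signal end-of-input to sort
sort_proc.stdout.close()       # let uniq see end-of-file when sort exits

print(uniq_proc.communicate()[0], end="")   # counted, de-duplicated lines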
In the 1980s, the personal computer came of age. IBM broke from its methodical corporate culture and allowed a design team to come up with a PC that featured an open, expandable architecture. Other companies such as Compaq legally created compatible systems (called “clones”), and “PC-compatible” machines became the industry standard. Under the leadership of Bill Gates, Microsoft gained control of the operating system market and also became the dominant competitor in applications software (particularly office software suites).

Although unable to gain market share comparable to the PC and its clones, Apple's innovative Macintosh, introduced in 1984, adapted research from the Xerox PARC laboratory in user interface design. At a time when PC compatibles were still using Microsoft's text-based MS-DOS, the Mac sported a graphical user interface featuring icons, menus, and buttons, controlled by a mouse.
Microsoft responded by developing the broadly similar Windows operating environment, which started out slowly but had become competitive with Apple's by the end of the decade.

The 1980s also saw great growth in networking. University computers running UNIX were increasingly linked through what was becoming the Internet, while office computers increasingly used local area networks (LANs) such as those based on Novell's NetWare system. Meanwhile, PCs were also being equipped with modems, enabling users to dial up a growing number of on-line services.

In the programming field, a new paradigm, object-oriented programming (OOP), was offered by languages such as Smalltalk and C++, a variant of the popular C language. The federal government adopted the Ada language with its ability to precisely manage program structure and data operations.
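To give a concrete, hedged sense of the object-oriented style mentioned above, here is a minimal sketch in Python rather than Smalltalk or C++; the class names and values are invented for the illustration, not drawn from the text.

# A minimal sketch of object-oriented programming: data and the operations
# on that data are bundled into classes, and a subclass specializes behavior.

class Account:
    def __init__(self, owner, balance=0.0):
        self.owner = owner
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount

    def describe(self):
        return f"{self.owner}: {self.balance:.2f}"

class SavingsAccount(Account):           # inheritance: a specialized Account
    def __init__(self, owner, balance=0.0, rate=0.03):
        super().__init__(owner, balance)
        self.rate = rate

    def add_interest(self):
        self.deposit(self.balance * self.rate)

acct = SavingsAccount("Ada", 100.0)
acct.add_interest()
print(acct.describe())                   # prints: Ada: 103.00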
By the 1990s, the PC was a mature technology dominated by Microsoft's Windows operating system. UNIX, too, had matured and become the system of choice for university computing and the worldwide Internet, which, at the beginning of the decade, was far from friendly for the average consumer user.

This changed when Tim Berners-Lee, a researcher at Geneva's CERN physics lab, adapted hypertext (a way to link documents together) to the Internet protocol to implement the World Wide Web. By 1994, Web browsing software that could display graphics and play sounds was available for Windows-based and other computers.

In the office, the intranet (a LAN based on the Internet TCP/IP protocol) began to supplant earlier networking schemes. Belatedly recognizing the threat and potential posed by the Internet, Bill Gates plunged Microsoft into the Web server market, included the free Internet Explorer browser with Windows, and vowed that all Microsoft programs would work seamlessly with the Internet.

Moore's Law, the dictum that computer power roughly doubles every 18 months, continued to hold true as PCs went from clock rates of a few tens of MHz to more than 1 GHz.
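The doubling claim is easy to check with a little arithmetic. The sketch below uses an assumed starting clock of 25 MHz in 1990 and the 18-month doubling period quoted above, so the exact figures are illustrative rather than taken from the text.

# Rough check of Moore's Law as stated above: doubling every 18 months.
start_mhz = 25.0          # a typical early-1990s PC clock rate (assumed)
months = 10 * 12          # 1990 -> 2000
doublings = months / 18   # number of 18-month periods

end_mhz = start_mhz * 2 ** doublings
print(f"{doublings:.1f} doublings -> about {end_mhz / 1000:.1f} GHz")
# prints roughly: 6.7 doublings -> about 2.5 GHz, consistent with
# clock rates passing 1 GHz around the year 2000.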
Beyond 2000

The new millennium began with great hopes, particularly for the Web and multimedia “dot-coms”. By 2005 the computing industry in many ways was stronger than ever. On the Web, new software approaches are changing the way services and even applications are delivered. The integration of search engines, mapping, local content, and user participation is changing the relationship between companies and their customers.

Moore's Law is now expressed not through faster single processors but through processors with two, four, or more processing “cores,” challenging software designers. Mobile computing is one of the strongest areas of growth, with devices combining voice phone, text messaging, e-mail, and Web browsing.

Computer hardware has evolved from big and bulky to compact and efficient. The monitor, for example, progressed from the large beige cathode ray tube, or CRT, monitors of the 1990s to slimmer, widescreen liquid crystal display, or LCD, monitors, which surpassed CRT monitors in sales starting in 2007.
Toward the end of the first decade of the 21st century, developments in computer technology supported dual monitors and 3D LCD monitors. Optical mice also began replacing mechanical ball mice at the beginning of the century. Essentially, developments in hardware design have increased the accessibility and ease of use of computers.

On the first day of 2001, Microsoft announced that Windows 95, its first operating system that was a commercial success, had become a legacy item and that the company would no longer sell it. This propelled the production of the later Windows operating systems, which control almost 90 percent of the market. Apple, however, has made its way back into the market, not so much because of its easy-to-use operating systems as because of its line of multimedia devices, starting with the iPod in 2001, the iPhone in 2007 and the iPad in 2010.
However, both software giants face an emerging trend toward the use of freeware, as free software such as Adobe Reader and OpenOffice becomes increasingly available.

Although the start of the decade saw Internet users keying numbers into a dial-up modem to gain access, by the end of the decade most households had high-speed broadband Internet, and the rise of Wi-Fi has made wireless Internet access possible through desktop computers, laptops and mobile phones. Social networking took the Internet by storm in the first decade.

The industry continues to face formidable challenges ranging from mitigating environmental impact to the shifting of manufacturing and even software development to rapidly growing countries such as India and China.

The Top Ten Computer Trends for the 21st Century

1. Computers will become powerful extensions of human beings designed to augment intelligence, learning, communications, and productivity.
2. Computers will become intuitive - they will “learn,” “recognize,” and “know” what we want, who we are, and even what we desire.
3. Computer chips will be everywhere, and they will become invisible - embedded in everything from brains and hearts, to clothes and toys.
4. Computers will manage essential global systems, such as transportation and food production, better than humans will.
5. Online computer resources will enable us to download applications on demand via wireless access anywhere and anytime.
6. Computers will become voice-activated, networked, video-enabled, and connected together over the Net, linked with each other and humans.
7. Computers will have digital senses - speech, sight, smell, hearing - enabling them to communicate with humans and other machines.
8. Neural networks and other forms of artificial intelligence will make computers both as smart as humans, and smarter for certain jobs.
9. Human and computer evolution will converge. Synthetic intelligence will greatly enhance the next generations of humans.
10. As computers surpass humans in intelligence, a new digital species and a new culture will evolve that is parallel to ours.

Notes:
UNIVAC - an American company, a division of the Remington Rand corporation.