Copyright © 1984, 1994 J C & M M Nash, Nash Information Services Inc., 1975 Bel Air Drive, Ottawa, ON K2C 0X1 Canada
SCIENTIFIC COMPUTING WITH PCs. Copy for: Dr. Dobb's Journal

We should choose not to proceed in the following situations:

• The total amount of data required is too large to be stored on our mass storage devices in any reasonable way. For example, attempts to solve problems requiring all the data in a national census with a PC having only floppy disks would generally be regarded as foolish. However, the Senegal census was conducted in the mid-1980s with a few such machines, aided by one or two with fixed disks, in a well-designed data entry and tabulation effort.

• The amount of data in the problem is considerably larger than the memory available on the target PC system. (Note that we do not even consider working storage or program space yet, as these are determined by the algorithm.)

• Conversion of data from its external form to computational form, e.g., from small integers in a data file to floating-point numbers, will overload memory. Careful programming may overcome such difficulties.

• The anticipated amount of output is such that peripherals cannot cope in a reasonable time, e.g., the printer will take all weekend to print the results.

• A "fast" PC (in comparison with the target PC) has required considerable time to compute solutions to problems similar to the one at hand. We must be honest and scrupulous in estimating the time requirements for the proposed solution.

• We need reliable, high-precision answers, for example in a navigation or spacecraft tracking problem, yet our programming languages and hardware cannot guarantee this precision. Fortunately, good tools do exist on PCs, but they may not be available to us.

In all the above, the aim is to avoid impossible tasks. In the past, we have been accused of just this offense. However, there is a distinction to be made between testing the limits of computational systems — where they are rarely used effectively — and trying to solve problems that strain PC limits in a production mode. The "try-it-and-see" exercise is valuable in providing data upon which the decision discussed in this section can be based. If our goal is to solve problems effectively, we do not want to be constantly risking memory overflow or similar disasters in mid-solution, after much human and PC work has been performed.

20.2 Acquiring the Right Computing Environments

Most workers performing scientific computations now have a choice of several computing environments for their work.
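Several of the feasibility tests in Section 20.1 come down to simple arithmetic on data sizes. The sketch below shows how such an estimate might be made before committing to a machine; the figures, function names and the use of Python are our own illustration, not the book's.

```python
# Rough feasibility check: will the data still fit in memory once it
# is converted from small integers on disk to double-precision floats?
# All figures below are illustrative assumptions, not measurements.

BYTES_PER_FLOAT = 8   # typical double-precision value


def in_memory_bytes(n_records, fields_per_record,
                    bytes_per_value=BYTES_PER_FLOAT):
    """Memory needed once every value is held as a float."""
    return n_records * fields_per_record * bytes_per_value


def fits(n_records, fields_per_record, ram_bytes):
    """True if the converted data leaves headroom in RAM."""
    # Keep half of RAM free for the program, working storage and the OS.
    return in_memory_bytes(n_records, fields_per_record) <= ram_bytes // 2


# Example: 100,000 census records of 20 small-integer fields.
# On disk as 2-byte integers: about 4 MB; as doubles in memory: 16 MB.
print(in_memory_bytes(100_000, 20))            # 16000000 bytes
print(fits(100_000, 20, ram_bytes=8 * 2**20))  # False on an 8 MB machine
```

A calculation of this sort, done on paper or in a few lines of code, answers the data-size and conversion questions before any money is spent.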
Our recommendation:

Choose the computing environment that will do the whole job with the least amount of bother for the users — us!

If computing resources that can do the job already exist, we must decide which to use. The questions in Section 20.1 are relevant in eliminating candidate machines that cannot do the job. We now wish to select the best PC available to us. If an already available system has software suited to our problems, it will be the preferred candidate if it has adequate features, speed and storage resources. Whereas in Section 20.1 we considered whether a particular PC would be adequate, we now wish to put a scale on the answers to those questions and decide how well it will work for us.

An important consideration is the burden the operation of a particular machine places on the user. For example, will it require tedious or difficult responses or commands, many disk exchanges, or the memorization of complicated sequences of events that must occur in a definite and precise order?

Issues of special interfacing of equipment or devices are best left to the specialists, not the salespersons. Being the first person to try a particular interface nearly always implies expenditure of time and money. We consider some cases that deserve caution by way of examples.

Any interface where there are time constraints on signals or movement of data can be troublesome, since minor changes in the way our system works may result in annoying failures.
This can be illustrated by the (non-scientific) example of our fax-modem. This inexpensive but important part of one of our PCs functioned without problems until we connected its host PC (an aging XT-class MS-DOS PC) to our other PCs in a small local area network. We noticed our incoming fax messages were of poor quality, and modem communications seemed to suffer from "line noise". Finally, some experiments involving the sending and receiving of faxes with a friend showed that timing delays imposed on disk storage were apparently the source of the problem. That is, the PC was not fast enough to save all the "dots" in the incoming fax message, so we lost some information. We ran this PC off-line from the network. It was eventually replaced with a faster unit that can handle both the LAN and fax messages simultaneously.

It may not be possible to operate all installed devices that require interrupt allocation to work. The IBM PC is particularly notorious for having too few interrupt request lines (IRQs). Thus vendors are happy to sell equipment "guaranteed to work with your PC". However, they may neglect to mention that to use your CD-ROM drive you must disconnect your printer! We have observed a situation in a government office where two secretaries were to have their PCs connected to a single laser printer via a printer-sharing box.
The installer arrived, duly hooked up the box with appropriate cables, and departed with the words "Oh, your mouse might not work." (It did not.)

If PCs must be obtained, then we want to minimize the acquisition effort. While cash can be saved by careful shopping for PC system components from multiple suppliers, our feeling is that this activity demands much time from the user. We prefer instead to buy a complete system from a single supplier.
In case of trouble, we can insist that a single supplier find and repair it, usually under warranty. We can also arrange for a fully configured system before delivery, saving our time for research work. While we prefer to deal with the same supplier over time, this is proving more difficult as PCs become commodity merchandise and specialty shops give way to merchandising chains. Of course, even specialty PC suppliers are rarely knowledgeable in scientific computing.

Our policy is to overbuy in some respects. In particular, we recommend that one obtain the maximum in fixed storage ("hard disk") capacity, main memory (RAM) capacity and processing speed given the money available for the purchase. Generally we have found ourselves best served, in terms of performance/price ratios, by machines that have just been superseded as the most powerful in their class. By buying late in the product cycle, we feel we get reliable, well-tested technology at a reasonable price. Our own MS-DOS PCs were all bought in this way. We bought an XT-class (Intel 8088/NEC V20) machine as the AT class (Intel 80286) was introduced, a "fast" AT-clone as 80386 PCs were appearing, a high-end 80386/80387 machine as 80486s were delivered, and a similar notebook configuration when 80486 notebooks were first widely offered.

We recommend against the use of multiple computing platforms.
Our own experiences of trying to mix PCs or operating systems have been uniformly unsatisfactory. We managed to get the job done, but were not happy in so doing (see Sections 1-5, 2-6, and 11-8).

20.3 Tools and Methods

As with choosing a PC on which to compute solutions to a problem, selecting software or programming language(s) is often Hobson's choice. The machine we must use is on our desk. If we want to use equipment belonging to others, then we must generally put up with the facilities provided. Many PCs are not used for technical or research work, so they lack the software most suitable to such tasks.

If the method chosen to solve the problem is not already developed within available software, then we must program it. This may mean:

• Writing a program "from scratch" in a traditional programming language such as FORTRAN, C, BASIC or PASCAL;

• Working within a system such as MATLAB or True BASIC where a range of typical scientific computations is already available, so that our "program" becomes a sequence of calls to macro commands or functions. Symbolic manipulation packages such as DERIVE, Maple or Mathematica offer considerable computational power along with the capability to manipulate mathematical expressions;

• If our problem is statistical in nature, linking commands for a statistical package into scripts that can be executed automatically may be a reasonable approach. We have used such ideas extensively to present examples to students in university statistics courses, where the calculation can be made to evolve on the screen as the student watches and interacts. The tools with which we have worked — and this is a very restricted list — are MINITAB, Stata and SYSTAT (or its educational version MYSTAT);

• Developing macros or worksheet templates for a spreadsheet processor such as Lotus 1-2-3, QUATTRO or Excel. We may also be able to take advantage of special Add-In software to enhance the capabilities of the spreadsheet package to carry out scientific computations. We have reviewed such Add-Ins for optimization computations (Nash J C 1991b). Our experience is such that we caution against the development of any but the simplest of macros. They are, in our opinion, too difficult to develop and revise and much too easy to prepare incorrectly;

• Where problems involve extensive data manipulation, writing command scripts for a database language such as that of dBase III/IV or PARADOX.

If we decide to write a conventional program, then we need a compiler or interpreter for the language in which we choose to write the program.
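The option of working within a system such as MATLAB or True BASIC, where the "program" becomes a sequence of calls to ready-made routines, can be sketched briefly. We use Python's standard statistics module here purely as a modern stand-in for such macro commands, and the data values are invented for illustration.

```python
# A "program" as a sequence of calls to prebuilt routines rather
# than code written from scratch.  Python's statistics module
# stands in for the macro commands of a system like MATLAB.
import statistics

# Invented measurements, for illustration only.
data = [4.1, 3.9, 4.3, 4.0, 4.2, 3.8]

n = len(data)
mean = statistics.mean(data)    # one ready-made routine per line
sdev = statistics.stdev(data)

print(f"n = {n}  mean = {mean:.3f}  s.d. = {sdev:.4f}")
```

The point is that each line invokes a tested routine, so both the programming effort and the opportunity for error shrink compared with coding the formulas ourselves.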