FORTRAN | C | C++ | etc.
------------------------
Why don't you guys grow up and use real languages?
==================================================
The best way to answer this question is first to determine which languages
the questioner uses (such questioners are sometimes called 'language bigots').
What's a 'real' language?
This is a topic guaranteed to get yawns from the experienced folk;
you will only argue with newbies.
In two words, many of the existing application programs are:
"Dusty decks."
You remember what a 'card deck' was, right? These programs are non-trivial:
thousands and sometimes millions of lines of code whose authors have sometimes
retired and are not kept on retainer.
A missing key concept is "conversion." Users don't want to convert their
programs (rewrite, etc.) to use other languages.
Incentives.
See also: Statement: Supercomputers are too important to run
interactive operating systems,
text editors, etc.
Don't language converters like f2c help?
----------------------------------------
No.
Problems fall into several categories:
1) Implementation-specific features:
you have a software architecture that takes advantage of certain
hardware-specific features (they don't have to be vectors;
it could be I/O, for instance). A delicate tradeoff
exists between using said features and not using them,
for reasons like portability and long-term
program life. E.g., Control Data Q8xxxxxx-based
subprogram calls, while having proper legal FORTRAN syntax,
involved calls to hardware and software which didn't
exist on other systems. Some of these calls could be
replaced with non-vector code, but why? You impulse-purchased
the machine for its speed to solve immediate problems.
2) Some language features don't have precisely matching/
corresponding semantics. E.g., dynamic vs. static memory use.
3) Etc.
These little "gotchas" are very annoying and frequently compound into
serious labor.
What's wrong with FORTRAN? What are its problems for parallel computing?
--------------------------------------------------------------------------
The best non-language specific explanation of the parallel computing problem
was written in 1980 by Anita Jones on the Cm* Project
Paraphrasing:
1) Lack of facilities to protect and ensure the consistency of results.
[Determinism and consistency.]
2) Lack of adequate communication facilities.
[What's wrong with READ and WRITE?]
3) Lack of synchronization (explicit or implicit) facilities.
[Locks, barriers, and all those things.]
4) Exception handling (miscellaneous things).
The problems she cited were: consistency, deadlock, and starvation.
FORTRAN's (from 1966 to current) problems:
Side effects (mixed blessing: re: random numbers)
GOTOs (the classic software engineering reason)
Relatively rigid poor data structures
Relatively static run time environment semantics
68. If we believe in data structures, we must believe in
independent (hence simultaneous) processing. For why else
would we collect items within a structure? Why do we
tolerate languages that give us the one without the other?
--Alan Perlis (Epigrams)
9. It is better to have 100 functions operate on one data
structure than 10 functions on 10 data structures.
--Alan Perlis (Epigrams)
A few people (Don Knuth included) would argue that the definition of an
algorithm contradicts certain aspects regarding parallelism. Fine.
We can speak of parallel (replicated) data structures, but the problem of
programming languages and architectures covers more than education and math.
Programming language types (people) tend either to develop specialized
languages for parallelism or to add operating system features.
The issue is assuming determinism and consistency during a computation.
If you don't mind the odd inconsistent error, then you are lucky.
Such a person must clearly write perfect code every time. The rest of
us must debug.
"Drop in" parallel speed-up is the Holy Grail of high performance computing.
The Holy Grail of programming and software engineering has been
"automatic programming." If you believe we have either, then I have a
big bridge to sell you.
Attempts to write parallel languages fall into two categories:
completely new languages: with new semantics in some cases
e.g., APL, VAL, ID, SISAL, etc.
add ons to old languages: with new semantics and hacked on syntax.
The latter fall into two types:
OS like constructs like semaphores, monitors, etc.
which tend not to scale. ("Oh, yeah you want
concurrency, well, let me help you with these....")
Starting with Concurrent Pascal, Modula, etc.
Constructs for message passing or barriers thought up
by numerical analysts (actually these are two
vastly different subtypes (oversimplified)).
Starting with "meta-adjective" FORTRAN.
Compilers and architectures ARE an issue (can be different):
One issue is programmability or ease of programming:
Two camps:
parallel programming is no different than any other programming.
[Jones is an early ref.]
and
Bull shit! It's at least comparable in difficulty to
"systems" programming.
[Grit and McGraw is an early ref.]
Take a look at the use of the full-empty bit on Denelcor HEP memory
(and soon Tera). This stuff is weird if you have never encountered it.
I'm going to use this as one example feature, but keep in mind that
other features exist. You can find "trenches" war stories (mine fields for
Tera to avoid [they know it]). Why? Because the programmers are very
confident they (we) know what they (we) are doing. BUZZT!
We (I mean Murphy) screw up.
The difficulty comes (side effects) when you deal with global storage
(to varying degrees if you have ever seen TASK COMMON). You have
difficulty tracing the scope. Architecture issues.
I like to see serial codes which have dead-lock and other problems.
I think we should collect examples (including error messages) put them
on display as warnings (tell that to the govt. ha!).
The use of atomic full-empty bits might be the parity bits of the future
(noting that the early supercomputers didn't have parity).
How consistent do you like your data? Debug any lately?
Don't get fooled that message passing is any safer.
See the Latest Word on Message Passing.
You can get just as confused.
Ideally, the programmer would LOVE to have all this stuff hidden.
I wonder when that will happen?
What makes us think that as we scale up processors, that we won't make
changes in our memory systems? Probably because von Neumann memories
are so easily made.
Communication: moving data around consistently is tougher than most
people give it credit for, and it's not parallelism. Floating point gets
too much attention.
Solutions (heuristic): education. I think we need to make emulations of
older machine designs like the HEP available (public domain for schools).
The problem is that I don't trust some of those emulators,
because I think we really need to run them on parallel machines,
and many are proprietary and written for sequential machines.
The schools potentially have a complex porting job.
I fear that old emulations have timing gotchas which never