Interprocedural Optimization: Eliminating Unnecessary Recompilation (1993)
Interprocedural Optimization: Eliminating Unnecessary Recompilation

MICHAEL BURKE, IBM Research
and
LINDA TORCZON, Rice University

While efficient new algorithms for interprocedural data-flow analysis have made these techniques practical for use in production compilation systems, a new problem has arisen: collecting and using interprocedural information in a compiler introduces subtle dependences among the procedures of a program. If the compiler depends on interprocedural information to optimize a given module, a subsequent editing change to another module in the program may change the interprocedural information and necessitate recompilation. To avoid having to recompile every module in a program in response to a single editing change to one module, we have developed techniques to more precisely determine which compilations have actually been invalidated by a change to the program's source. This paper presents a general recompilation test to determine which procedures must be compiled in response to a series of editing changes. Three different implementation strategies, which demonstrate the fundamental tradeoff between the cost of analysis and the precision of the resulting test, are also discussed.

Categories and Subject Descriptors: D.2.6 [Software Engineering]: Programming Environments; D.3.4 [Programming Languages]: Processors—compilers, optimization

General Terms: Algorithms, Languages

Additional Key Words and Phrases: Interprocedural analysis and optimization, data-flow analysis, recompilation analysis
1. INTRODUCTION

Traditional optimizing compilers have advanced to the point where they do an excellent job of optimizing code within a single procedure or compilation unit. Accordingly, code optimization research has begun to focus on interprocedural analysis and optimization. Recent work has included both faster algorithms for some interprocedural data-flow problems and efficacy studies of some interprocedural transformations [7, 8, 9, 13, 15, 16, 24, 25, 33].

This research has been supported by the IBM Corporation, by the Advanced Research Projects Agency, and by the National Science Foundation through grants MCS 81-21844 and MCS 83-03638.
Authors' addresses: M. Burke, IBM T. J. Watson Research Center, P.O. Box 704, Yorktown Heights, NY 10598; email: burkem@watson.ibm.com. L. Torczon, Computer Science Dept., Rice University, Houston, TX 77251-1892; email: torczon@rice.edu.
Permission to copy without fee all or part of this material is granted provided that the copies are not made or distributed for direct commercial advantage, the ACM copyright notice and the title of the publication and its date appear, and notice is given that copying is by permission of the Association for Computing Machinery. To copy otherwise, or to republish, requires a fee and/or specific permission.
© 1993 ACM 0164-0925/93/0700-0367 $01.50
ACM Transactions on Programming Languages and Systems, Vol. 15, No. 3, July 1993, pages 367-399.
These techniques are also being applied in compilers that try to automatically restructure programs to expose parallelism. In each of these areas, the goal is to improve the efficiency of code generated for a whole program by giving the compiler more context over which to optimize.

Unfortunately, interprocedural optimization directly conflicts with one of the most treasured features of traditional ALGOL-like programming languages: separate compilation. Interprocedural data-flow analysis gives the compiler facts about the naming environment in which the code will execute at run-time and about the side effects of procedures that will be called at run-time. Using such information makes the correctness of compile-time decisions for one procedure dependent on the source text for other procedures. Cross-procedural optimizations, like interprocedural register allocation and inline substitution, have a similar effect, although they may rely on information derived even later in the compilation process, like the specific mapping of names to storage locations.
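As a concrete illustration of this dependence, consider a compiler that consults an interprocedural MOD set (the set of variables a procedure may modify) to decide whether a value can stay in a register across a call. The sketch below is ours, not the paper's; all names and the MOD-set representation are invented for illustration:

```python
# Illustrative sketch (not from the paper): a compile-time decision that
# depends on another procedure's MOD set, i.e., the variables it may modify.
MOD = {"p": {"y"}}  # hypothetical summary: procedure p modifies only y

def can_keep_in_register(var, callee, mod_sets):
    """A value cached in a register stays valid across a call to `callee`
    only if the callee cannot modify `var`."""
    return var not in mod_sets.get(callee, set())

# While compiling a caller, x may safely stay in a register across a call to p.
assert can_keep_in_register("x", "p", MOD)

# If p is later edited so that it assigns x, MOD(p) grows, and the caller's
# earlier compilation is no longer correct: it must be recompiled.
MOD["p"].add("x")
assert not can_keep_in_register("x", "p", MOD)
```

Here the correctness of the caller's object code depends on source text the compiler never saw while compiling the caller, which is exactly the dependence that must be tracked.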
As soon as information from other procedures is used to make compile-time decisions, the object code produced by the compiler becomes a function of those other procedures. In such a system, editing changes made to one procedure can invalidate prior compilations of other procedures.

To produce a practical system that performs interprocedural analysis and optimization will require a mechanism for tracking such recompilation dependences, detecting when they are violated, and automatically recompiling the necessary parts of the program.
Of course, the compiler could adopt the naive approach and recompile the entire program after a change to any single procedure—sacrificing any possible benefit of separate compilation. The alternative is to perform a recompilation analysis to determine, at compile-time, the set of procedures that may need to be recompiled in response to editing changes to one or more procedures in the program. The power (and success) of such an analysis should be measured by the number of spurious recompilations that it avoids—procedures that would otherwise be recompiled unnecessarily.

This paper examines the recompilation problems introduced by the use of interprocedural data-flow information as a basis for compile-time decisions.
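A minimal version of such a recompilation analysis can be sketched as a comparison between the interprocedural sets assumed at each procedure's last compilation and the sets recomputed after the latest edits. This sketch is ours and deliberately coarse (it flags any change at all); more precise tests would ask whether a change can actually invalidate an optimization decision, which is where the cost/precision tradeoff arises. All names are illustrative:

```python
# Hypothetical sketch of a coarse recompilation test: recompile a procedure
# whenever the interprocedural information it was compiled under has changed.
def procedures_to_recompile(old_info, new_info):
    """old_info[p]: interprocedural sets assumed when p was last compiled;
    new_info[p]: sets recomputed after the latest editing changes."""
    return {p for p in old_info if old_info[p] != new_info.get(p)}

old = {"a": frozenset({"x"}), "b": frozenset(), "c": frozenset({"y"})}
new = {"a": frozenset({"x"}), "b": frozenset({"z"}), "c": frozenset({"y"})}

# Only b's information changed, so only b must be recompiled.
assert procedures_to_recompile(old, new) == {"b"}
```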
It extends the work presented by Cooper et al. [17] and Torczon [32]. We present a general approach to recompilation analysis and three specific techniques for implementing it. The general framework is based on observing the nature of the interprocedural sets themselves and the ways in which an optimizer can use them. The different implementation techniques produce recompilation tests of successively greater precision, with a concomitant increase in the expense of the test. Each of the techniques represents a significant improvement over recompiling the entire program.

This problem has not received much attention in the literature, primarily because few compilers have actually computed and used interprocedural information. For example, the PL/I Optimizing Compiler trivializes the problem by limiting its analysis to a single compilation unit [28].
Other systems, like the ECS project at IBM Research, appear to recompile the entire program for each executable [3]. Some systems ignore the problem completely—for example, the Cray FORTRAN compiler will perform inline substitution, but does not provide the user with support for managing the resulting recompilation problems.

There are systems that are related to this work. Feldman's make system is an ancestor of our system.
It pioneered the idea of automatic reconstruction based on an analysis of the internal dependences of a system [20]. However, make requires that the programmer explicitly specify the compilation dependences, while our method derives them analytically. The system proposed by Tichy [31] and Tichy and Baker [30] analyzes the recompilation dependences introduced through the use of include files. For each module that uses a specific include file, it records those definitions that are actually referenced in the module. Using this information, it can determine which modules must be recompiled when an include file is changed.
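The include-file analysis described above can be sketched in the same hypothetical style: record, for each module, which definitions of each include file it actually references, and recompile only the modules whose referenced definitions changed. File and definition names below are invented for illustration:

```python
# Hypothetical sketch of include-file dependence tracking in the style of
# Tichy and Baker: per module, the definitions it references in each header.
referenced = {
    "m1.c": {"limits.h": {"MAX_PATH"}},
    "m2.c": {"limits.h": {"BUF_SIZE"}},
}

def modules_to_recompile(include_file, changed_defs, refs):
    """A module needs recompilation only if a changed definition in
    `include_file` is among the definitions it actually references."""
    return {m for m, uses in refs.items()
            if changed_defs & uses.get(include_file, set())}

# Changing BUF_SIZE forces recompilation of m2.c but not m1.c.
assert modules_to_recompile("limits.h", {"BUF_SIZE"}, referenced) == {"m2.c"}
```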
Schwanke and Kaiser suggest that the number of routines selected for recompilation by the Tichy-Baker approach could be further reduced if harmless inconsistencies between separately compiled modules were not considered when determining which routines to recompile [27]. Although these systems are similar in flavor to our approach, they do not determine if changes in the interprocedural data-flow information for a procedure make recompilation of that procedure necessary.

Although the implementations of this work have been in the context of systems that analyze FORTRAN, the techniques are applicable across a wide variety of languages.
They work directly with data-flow information produced by a compiler; the complications introduced by specific language features are thus folded into computing the base information on which our methods work.

The remainder of this paper is subdivided into eight sections. Section 2 describes our model of the compilation system.