The information provided should be sufficient for an independent reconstruction of the driver.

8.3.8.2 If an implementation-specific layer is used, then a detailed description of how it performs its functions, how its various components interact, and any product functionalities or environmental settings on which it relies must be disclosed. All related source code, scripts, and configuration files must be reported in the supporting files archive. The information provided should be sufficient for an independent reconstruction of the implementation-specific layer.

8.3.8.3 If profile-directed optimization as described in Clause 5.2.9 is used, such use must be disclosed. In particular, the procedure and any scripts used to perform the optimization must be reported in the supporting files archive.

TPC Benchmark™ H Standard Specification Revision 2.17.1

8.3.9 Clause 9 - Audit Related Items

8.3.9.1 The auditor's agency name, address, phone number, and attestation letter with a brief audit summary report indicating compliance must be included in the full disclosure report.
A statement should be included specifying whom to contact in order to obtain further information regarding the audit process.

8.4 Executive Summary

The executive summary is meant to be a high-level overview of a TPC-H implementation. It should provide the salient characteristics of a benchmark execution (metrics, configuration, pricing, etc.) without the exhaustive detail found in the FDR. When the TPC-Energy optional reporting is selected by the test sponsor, the additional requirements and format of TPC-Energy related items in the executive summary are included in the TPC Energy Specification, located at www.tpc.org.

The executive summary has three components:

- Implementation Overview
- Pricing Spreadsheet
- Numerical Quantities

8.4.1 Page Layout

Each component of the executive summary should appear on a page by itself.
Each page should use a standard header and format, including:

- 1/2 inch margins, top and bottom;
- 3/4 inch left margin, 1/2 inch right margin;
- 2 pt. frame around the body of the page. All interior lines should be 1 pt.;
- Sponsor identification and System identification, each set apart by a 1 pt. rule, in 16-20 pt. Times Bold font;
- TPC-H, TPC-Pricing, TPC-Energy (if reported) with three-tier versioning (e.g., 1.2.3), and report date, separated from other header items and each other by a 1 pt. rule, in 9-12 pt. Times font.

Comment 1: It is permissible to use or include company logos when identifying the sponsor.

Comment 2: The report date must be disclosed with a precision of 1 day. The precise format is left to the test sponsor.

Comment 3: Appendix E contains a sample executive summary. It is meant to help clarify the requirements in section 8.4 and is provided solely as an example.

8.4.2 Implementation Overview

The implementation overview page contains six sets of data, each laid out across the page as a sequence of boxes using 1 pt. rule, with a title above the required quantity. Both titles and quantities should use a 9-12 pt. Times font unless otherwise noted.

8.4.2.1 The first section contains the results that were obtained from the reported run of the Performance test.

Table 13: Implementation Overview Information

Title | Quantity | Precision | Units | Font
Total System Cost | 3-yr. cost of ownership (see Clause 7) | 1 | $ | 16-20 pt. Bold
TPC-H Composite Query per Hour Metric | QphH (see Clause 5.4.3) | 0.1 | QphH@nGB | 16-20 pt. Bold
Price/Performance | $/QphH (see Clause 5.4.4) | 1 | $/QphH@nGB | 16-20 pt. Bold

8.4.2.2 The next section details the system configuration.

Table 14: System Configuration Information

Title | Quantity | Precision | Units | Font
Database Size | Raw data size of test database (see Clause 4.1.3 and Clause 8.3.6.7) | 1 | GB | 9-12 pt. Times
DBMS Manager | Brand, software version of DBMS used | | | 9-12 pt. Times
Operating System | Brand, software version of OS used | | | 9-12 pt. Times
Other Software | Brand, software version of other software components | | | 9-12 pt. Times
System Availability Date | The Availability Date of the system, defined in Clause 0 of the TPC Pricing Specification | 1 day | | 9-12 pt. Times

Comment: The Software Version must uniquely identify the orderable software product referenced in the Priced Configuration (e.g., RALF/2000 4.2.1).

8.4.2.3 This section is the largest in the implementation overview, and contains a graphic representation of the reported query times.
Each query and refresh function executed during the benchmark should be listed in the graph, with any query variants clearly identified. In addition:

- All labels and scales must use a 10 point Courier font, except for the legend and the graph title, which must use a Times font;
- All line sizes must be 1 point;
- The legend must be reproduced as depicted in the example, and must be placed where needed to avoid overlapping any portion of the graph;
- The query time axis must be labeled with no more than 8 values, including the zero origin;
- Each pair of bars must be separated by a gap of 50% of the bar's width;
- A zero-based linear scale must be used for the query times;
- The upper bound of the time scale must be no greater than 120% of the longest query timing interval;
- The bars used for the power test must be sized based on the measured (i.e., without the adjustment defined in Clause 5.4.1.4) query timing intervals of the power test, and must be solid white;
- The bars used for the throughput test must be sized based on the arithmetic mean by query type of the measured query timing intervals of the throughput test, and must be solid black;
- The geometric mean of the power test components must be computed using unadjusted timings of queries and refresh functions and must be placed on the graph as a dashed line labeled on top with its value.
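The scale and dashed-line rules above can be sketched as follows. This is a minimal illustration, not part of the specification: the helper names and all timing values are hypothetical, and a real implementation would feed the measured, unadjusted power-test timings into a charting tool.

```python
import math

def query_time_axis(timings, max_labels=8):
    """Zero-based linear scale for the query-time axis: the upper bound
    is 120% of the longest timing interval (the permitted maximum), and
    the axis carries at most `max_labels` values including the zero origin."""
    upper = 1.2 * max(timings)
    step = upper / (max_labels - 1)
    return [round(i * step, 1) for i in range(max_labels)]

def power_geometric_mean(query_times, refresh_times):
    """Geometric mean of the unadjusted power-test components
    (queries plus refresh functions), used for the dashed line."""
    components = list(query_times) + list(refresh_times)
    log_sum = sum(math.log(t) for t in components)
    return math.exp(log_sum / len(components))

# Hypothetical unadjusted power-test timings, in seconds:
queries = [12.4, 3.1, 9.8, 2.2, 7.5, 1.9, 8.3, 6.0, 25.7, 4.4, 5.2,
           3.8, 18.6, 2.9, 6.7, 3.3, 11.2, 14.9, 7.1, 10.5, 16.0, 2.5]
refreshes = [4.0, 5.5]

print(query_time_axis(queries))                 # axis labels, zero first
print(power_geometric_mean(queries, refreshes)) # dashed-line value
```

Note that the geometric mean is computed in log space to avoid overflow when multiplying many timings; the result is identical to the n-th root of the product.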
- The geometric mean must be expressed using the same format and precision as TPC-H Power specified in Clause 5;
- The arithmetic mean of the throughput test must be calculated using unadjusted timings with the following computation:

  Arithmetic Mean = (1 / (S × 22)) × Σ_{s=1..S} Σ_{i=1..22} QI(i,s)

  where QI(i,s) is defined in Clause 5.3.7.2, and S is defined in Clause 5.1.2.3;
- A solid line representing the mean must be placed on the graph intersecting only the queries and must be labeled on top with its value. The arithmetic mean of the throughput test must be expressed with the same format and precision as TPC-H Throughput specified in Clause 5;
- All query numbers must be followed by a variant letter when a variant was used in the tests.

8.4.2.4 This section contains the database load and sizing information.

Table 15: Database Load and Sizing Information

Title | Quantity | Precision | Units | Font
Database Load Time | Load Time (see Clause 4.3) | 1 sec. | hh:mm:ss | 9-12 pt. Times
Total Disk/Database Size | Data Storage Ratio (see Clause 8.3.6.7) | 0.01 | Ratio | 9-12 pt. Times
Memory/Database Size Percentage | Size percentage (see Clause 8.3.6.10) | 0.1 | | 9-12 pt. Times
Load includes backup | Y/N (see Clause 4.3.6) | N/A | N/A | 9-12 pt. Times
Data Redundancy mechanisms used for (Base tables only) | Y/N (see Clause 8.3.6.4) | N/A | N/A | 9-12 pt. Times
Data Redundancy mechanisms used for (Base tables and auxiliary data structures) | Y/N (see Clause 8.3.6.4) | N/A | N/A | 9-12 pt. Times
Data Redundancy mechanisms used for (Everything) | Y/N (see Clause 8.3.6.4) | N/A | N/A | 9-12 pt. Times
Data Redundancy Level (see Clause 8.3.6.4) | | N/A | N/A | 9-12 pt. Times Bold
Base Tables | [0..3] (see Clause 8.3.6.4) | N/A | N/A | 9-12 pt. Times
Auxiliary Structures | [0..3] (see Clause 8.3.6.4) | N/A | N/A | 9-12 pt. Times
DBMS Temporary Space | [0..3] (see Clause 8.3.6.4) | N/A | N/A | 9-12 pt. Times
OS and DBMS Software | [0..3] (see Clause 8.3.6.4) | N/A | N/A | 9-12 pt. Times

8.4.2.5 The next section of the Implementation Overview should contain a synopsis of the SUT's major system components, including:

- total number of nodes used / total number of processors used with their types and speeds in GHz / total number of cores used / total number of threads used;
- main and cache memory sizes;
- network and I/O connectivity;
- disk quantity and geometry.

If the implementation used a two-tier architecture, front-end and back-end systems should be detailed separately.

8.4.2.5.1 The term "main memory" as referenced in Clause 8.4.2.5 refers to the memory of the host system or server/client components of the SUT in Clause 6.2.1 that perform database and query logic processing.
The main memory size to be disclosed in Clause 8.4.2.5 is the amount of memory that is directly addressable by the processors/cores/threads of each component and accessible to store data or instructions.

8.4.2.6 The final section of the Implementation Overview should contain a note stating:

"Database Size includes only raw data (e.g., no temp, index, redundant storage space, etc.)."

8.4.3 Pricing Spreadsheet

The major categories in the Price Spreadsheet, as appropriate, are:

- Server Hardware
- Server Storage
- Server Software
- Discounts (may optionally be included with above major category subtotal calculations)

8.4.4 Numerical Quantities Summary

The Numerical Quantities Summary page contains three sets of data, presented in tabular form, detailing the execution timings for the reported execution of the performance test. Each set of data should be headed by its given title and clearly separated from the other tables.

8.4.4.1 The first section contains measurement results from the benchmark execution.

Section Title: Measurement Results

Item Title | Precision | Notes
Database Scale Factor | 1 |
Total Data Storage/Database Size | 0.01 |
Start of Database Load | yyyy-mm-dd hh:mm:ss |
End of Database Load | yyyy-mm-dd hh:mm:ss |
Database Load Time | hh:mm:ss |
Query Streams for Throughput Test | 1 |
TPC-H Power | 0.1 |
TPC-H Throughput | 0.1 |
TPC-H Composite Query-per-Hour Metric (QphH@Size) | 0.1 |
Total System Price Over 3 Years | $1 | (1)
TPC-H Price Performance Metric ($/QphH@Size) | $0.01 | (1)

(1) Depending on the currency used for publication, this sign has to be exchanged with the ISO currency symbol.

8.4.4.2 The second section contains query and query stream timing information.

Section Title: Measurement Intervals

Item Title | Precision | Notes
Measurement Interval in Throughput Test (Ts) | 1 second |
Duration of Stream Execution | | (1)
Stream | 1 |
Seed | 1 |
Start Date/Time | mm/dd/yy hh:mm:ss |
End Date/Time | mm/dd/yy hh:mm:ss |
Total Time | hh:mm:ss |
Refresh Start Date/Time | mm/dd/yy hh:mm:ss |
Refresh End Date/Time | mm/dd/yy hh:mm:ss |

(1) The remaining items in this section should be reported as a sub-table, with one entry for each stream executed during the performance test.

8.4.4.3 The final section, titled Timing Intervals (in Sec.), contains individual query and refresh function timings.
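As a worked illustration of how the reported quantities in the Measurement Results section relate to one another, the sketch below derives the composite metric (the square root of the product of Power and Throughput, per Clause 5.4.3), the price/performance metric (Clause 5.4.4), and the Database Load Time from the load start/end timestamps. All figures are invented and the helper names are illustrative, not part of the specification.

```python
from datetime import datetime

def composite_qph(power, throughput):
    """TPC-H Composite Query-per-Hour metric (Clause 5.4.3):
    sqrt(Power x Throughput), reported with a precision of 0.1."""
    return round((power * throughput) ** 0.5, 1)

def price_performance(total_price, qphh):
    """TPC-H Price/Performance metric (Clause 5.4.4), $/QphH@Size,
    reported with a precision of 0.01 in the publication currency."""
    return round(total_price / qphh, 2)

def load_time_hhmmss(start, end):
    """Database Load Time as hh:mm:ss from load start/end timestamps."""
    secs = int((end - start).total_seconds())
    h, rem = divmod(secs, 3600)
    m, s = divmod(rem, 60)
    return f"{h:02d}:{m:02d}:{s:02d}"

# Hypothetical figures for an SF=1000 (1000 GB) result:
power, throughput = 120000.0, 90000.0
qphh = composite_qph(power, throughput)
print(qphh)
print(price_performance(2_500_000, qphh))
print(load_time_hhmmss(datetime(2024, 1, 5, 21, 0, 0),
                       datetime(2024, 1, 6, 1, 30, 15)))
```

Because the composite metric is a geometric mean of the two primary metrics, a result cannot compensate for a weak throughput run with a strong power run alone; both factors weigh equally.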