Survey and Analysis of the National Security High Performance Computing Architectural Requirements
4 June 2001
Richard Games, Ph.D.
Are current high performance computers (HPCs) that use commodity microprocessors (“commodity HPCs”) adequate for national security applications? Or is there a critical need for traditional vector supercomputers? These were the overarching questions of a quick-reaction survey conducted by the author during the first two weeks of April 2001. The Deputy Under Secretary of Defense for Science and Technology commissioned the survey to help understand the national security issues associated with Cray Inc.’s request to the United States Department of Commerce to lift the import duty on Japanese vector supercomputers. This report contains the analysis of the survey results along with recommendations.
There are no simple answers to these two questions. At a minimum, the answers depend on the specific application area. The author conducted face-to-face interviews with multiple users and software developers in 10 high performance computing application areas spanning support to research and development, acquisition, and operations. Operational support ranged from “off-line” predictive analysis for planning purposes to “on-line” applications such as weather prediction, surveillance, and reconnaissance. Based on the interviews, the overall assessment was that commodity HPCs are providing useful capability in every area surveyed except cryptanalysis. Over the last five years the national security community has made a major investment in retooling legacy vector-supercomputer software to run on commodity HPCs, and as a result these systems are in high demand. Although commodity HPCs are producing useful results, significant issues with their use were identified in almost every case. These issues included the negative impact that difficult programming environments have on researchers and system developers, and the inefficiency of much of the actual processing caused by a serious processor-memory communications bottleneck. The real value of this survey came from the insights gained in discussing the challenges that national security users and software developers face because of the current HPC technology base.
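The processor-memory bottleneck the interviewees described is easiest to see in a kernel with very little arithmetic per byte moved. The following C sketch is a minimal, hypothetical illustration (not drawn from the report; the array size and scalar are arbitrary): a STREAM-style triad whose sustained speed on cache-based commodity microprocessors is set by memory bandwidth rather than by the processor's peak floating-point rate.

/* Minimal sketch of a memory-bandwidth-bound kernel (STREAM-style triad).
 * Illustrative only; N and s are hypothetical values, not from the survey. */
#include <stdio.h>
#include <stdlib.h>

#define N (1 << 22)   /* 4M elements per array (hypothetical size) */

int main(void)
{
    double *a = malloc(N * sizeof *a);
    double *b = malloc(N * sizeof *b);
    double *c = malloc(N * sizeof *c);
    if (!a || !b || !c) return 1;

    for (size_t i = 0; i < N; i++) { b[i] = 1.0; c[i] = 2.0; }

    const double s = 3.0;
    /* Each iteration does 2 floating-point operations but moves 24 bytes
     * (read b[i], read c[i], write a[i]).  With so little arithmetic per
     * byte, throughput is limited by memory bandwidth, not peak flops --
     * the kind of bottleneck reported on commodity systems. */
    for (size_t i = 0; i < N; i++)
        a[i] = b[i] + s * c[i];

    printf("a[0] = %f\n", a[0]);   /* keep the result live */
    free(a); free(b); free(c);
    return 0;
}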
The bottom line for the national security community reduces to the interrelated issues of productivity and affordability. How productive are the researchers and developers who write the high performance software? How productive (efficient) are the HPCs that run the software? What does it cost? The total cost includes the facilities and the operational risks associated with the reliability of larger, less-efficient installations. Perhaps the hardest cost to quantify is the “opportunity lost” when a domain researcher spends time on complicated computer programming rather than on creating new science. A number of recommendations are made to increase the flexibility and performance of future national security HPC options. These include assessing the impact of current state-of-the-art Japanese vector supercomputers, promulgating the software best practices identified during the survey, and initiating a pragmatic R&D program to improve the productivity of HPCs for national security applications.