NSF Awards Grant for Bringing HPC to Data-Intensive Environments

Texas Tech gets $500,000 to create a supercomputing prototype for dealing with massive data sets.

The US National Science Foundation (NSF) has awarded a $500,000 grant to Texas Tech University to develop a supercomputer prototype optimized for data-intensive applications. Yong Chen, assistant professor of computer science and Director of the Data-Intensive Scalable Computing Laboratory, will lead the effort.
Supercomputers were originally developed for computation-intensive applications, and computational efficiency remains the focus of most high-performance computing systems. The era of Big Data, however, is shifting attention toward data-intensive problems. This evolving emphasis on data means that many HPC systems "spend a majority of their time manipulating data, rather than doing actual computing. The amount of computing time is significantly less than the data access movement time."
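To make the data-movement point concrete, here is a minimal Python timing sketch (not part of the Texas Tech prototype; the file name, array size, and workload are hypothetical) that compares the time spent loading a large array from disk with the time spent computing a simple reduction over it:

import time

import numpy as np

N = 10_000_000  # roughly 80 MB of float64 values (hypothetical size)
path = "dataset.npy"  # hypothetical file

# One-time setup: write the dataset to disk.
np.save(path, np.random.rand(N))

# Data movement: load the array from storage into memory.
t0 = time.perf_counter()
data = np.load(path)
io_time = time.perf_counter() - t0

# Computation: a simple reduction over the loaded data.
t0 = time.perf_counter()
total = data.sum()
compute_time = time.perf_counter() - t0

print(f"data access: {io_time:.3f} s, compute: {compute_time:.3f} s")
# Results vary with hardware and OS caching (a freshly written file may still
# sit in the page cache), but when data must actually travel from storage,
# the access time typically dwarfs the compute time.

On most systems the load step takes far longer than the sum, which is the imbalance the quoted observation describes and the kind of bottleneck a data-intensive HPC design aims to reduce.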
The goal of the research is to provide software solutions that increase the speed and efficiency of data-intensive applications while still supporting high-performance computing techniques for solving large, complex problems in science and industry.