Summary: There is a growing demand for the computational power that supercomputers offer researchers in the United States, but access to them is being stifled by a number of factors, according to a new report by the Center for Data Innovation.
Original author and publication date: Yasmin Tadjdeh – January 28, 2021
Futurizonte Editor’s Note: The question is: What will happen once the demand is met? How many supercomputers will there be?
From the article:
Supercomputers are a subset of high-performance computing, or HPC, which refers to systems that can solve difficult computational problems, according to the report, “How the United States Can Increase Access to Supercomputing.” They can be harnessed for a number of different research areas and are particularly important in the development of artificial intelligence systems.
Japan’s Fugaku system is considered the No. 1 supercomputer in the world, and U.S.-based Oak Ridge National Laboratory’s Summit is the No. 2.
The demand for the platforms in the United States is growing rapidly, but the government is not investing enough money in the technology, said Hodan Omaar, a policy analyst at the Center for Data Innovation and author of the report.
“There isn’t enough funding in HPC to support the acquisition of systems and software that can support AI researchers,” she said during an online event in December.
This inhibits the ability of AI researchers to develop new products that are vital to maintaining U.S. competitiveness, the report said.
In the United States, funding for high-performance computing comes from both the Department of Energy — which oversees 17 national labs across the country — and the National Science Foundation, she noted.
The Energy Department typically invests in the most powerful systems available, but those often only support a small number of researchers, Omaar said. The NSF is responsible for systems that are not quite as powerful but are used by the majority of researchers.
The Energy Department has increased its investment in large-scale HPC resources over the last decade by about 90 percent, from $277 million in 2010 to $538 million in 2019 in constant 2010 dollars, according to the report. Meanwhile, the NSF has decreased its funding by 50 percent, from $325 million in 2010 to $167 million in 2019.
“This discrepancy has led to a U.S. HPC portfolio weighted toward very powerful systems that can only support a smaller number of researchers,” the report said. “However, both funding sources fail to meet current demand.”