Machine Learning, Analytics Play Growing Role in US Exascale Efforts
Exascale computing promises to bring significant changes both to the high-performance computing space and, eventually, to enterprise datacenter infrastructures. The systems, which are being developed in multiple countries around the globe, promise 50 times the performance of the 20-petaflop systems that are now among the fastest in the world, along with corresponding improvements in areas such as energy efficiency and physical footprint. The systems need to be powerful enough to run the increasingly complex applications used by engineers and scientists, but they can't be so expensive to acquire or operate that only a handful of organizations can use them.

At the same time, the emergence of high-level data analytics and machine learning is forcing changes in the exascale efforts in the United States, changes that play a role in everything from the software stacks being developed for the systems to the competition with Chinese companies that are also aggressively pursuing exascale computing.

During a talk last week at the OpenFabrics Workshop in Austin, Texas, Al Geist of Oak Ridge National Laboratory, CTO of the Exascale Computing Project (ECP), outlined the work the ECP is doing to develop exascale-capable systems within the next few years.
Apr-4-2017, 13:31:36 GMT