Exascale Day Webcast and Panel Discussion - Cray

#artificialintelligence

You're invited to celebrate the first annual Exascale Day on October 18 (10^18, of course!) with the US Department of Energy's Exascale Computing Project; global supercomputing leader Cray, a Hewlett Packard Enterprise company; and the DOE laboratories that will house the nation's first three exascale supercomputers: Argonne, Oak Ridge and Lawrence Livermore. Together we are hosting a panel discussion on how the advanced technology of the Exascale Era will change the face of computational science and on the advances it will foster. Exascale supercomputers like Aurora, Frontier and El Capitan will arm scientists and researchers to push the boundaries of research well beyond today's capabilities in our quest for medical cures, energy sources, and nuclear security, and to delve deeper into mysteries such as the birth of the universe. There are myriad possibilities for what visionaries will be able to do with a quintillion computations per second.


Beyond Super - Cray

#artificialintelligence

Exascale is more than a speed milestone or a system size. Exascale is new workloads brought on by new questions intersecting with new compute capabilities to create a major technological shift. To understand this shift, you have to understand some macro trends in research and enterprise. First, uncontrolled data growth is driving organizations of all sizes to data-intensive computing. Second, digital transformation has become a business imperative.


U.S. efforts to build next-gen supercomputer take shape

PCWorld

For decades, the U.S. took for granted a thousand-fold increase in supercomputing power roughly every 10 years, riding on Moore's Law. But once a petascale system was reached in 2008, it gradually became clear that the next leap -- a system 1,000 times more powerful -- would be difficult. Initially, some believed such a system -- an exascale computer -- was possible in 10 years, or by 2018. But it took too much power, and it required new approaches to applications to exploit an almost unimaginable level of parallelism involving hundreds of millions of cores. Another problem to solve was the need for resilience: the ability to keep working around the multiple ongoing hardware failures expected in a system of this size.
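
As a rough back-of-the-envelope sketch of what those numbers imply (the 10 GFLOPS-per-core figure below is an illustrative assumption, not a specification of any planned machine):

```python
# Back-of-envelope arithmetic behind the petascale-to-exascale leap.
PETAFLOPS = 1e15   # floating-point operations per second at petascale
EXAFLOPS = 1e18    # one quintillion operations per second at exascale

print(f"Exascale is {EXAFLOPS / PETAFLOPS:,.0f}x petascale")   # 1,000x

# Assumed per-core throughput (illustrative only): ~10 GFLOPS.
flops_per_core = 10e9
print(f"~{EXAFLOPS / flops_per_core:,.0f} cores needed")       # ~100,000,000
```

At that scale, some component is statistically likely to be failing at any given moment, which is why resilience gets called out alongside power and parallelism.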


EU Pledges 20 Million Euros for New Exascale Computing Project

#artificialintelligence

A new European exascale computing project, known as EuroEXA, kicked off at the Barcelona Supercomputing Center this week during a meeting that brought together the 16 organizations involved in the effort. EuroEXA is the latest in a series of exascale investments by the European Union (EU), which will contribute €20 million to the project over the next three and a half years. It consolidates the research efforts of a number of separate projects initiated under the EU's Horizon 2020 program, including ExaNeSt (exascale interconnects, storage, and cooling), EcoScale (exascale heterogeneous computing) and ExaNoDe (exascale processor and node design). The €20 million is just a down payment toward the total €50 million investment that the European Commission will eventually contribute to the EuroEXA work. As reflected in the consolidated Horizon 2020 efforts, the project will include R&D money for exascale system software, server hardware, networking, storage, cooling and datacenter technologies.


Machine Learning, Analytics Play Growing Role in US Exascale Efforts - AI Trends

#artificialintelligence

Exascale computing promises to bring significant changes to both the high-performance computing space and, eventually, enterprise datacenter infrastructures. The systems, which are being developed in multiple countries around the globe, promise 50 times the performance of the current 20-petaflop-capable systems that are now among the fastest in the world, along with corresponding improvements in such areas as energy efficiency and physical footprint. The systems need to be powerful enough to run the increasingly complex applications being used by engineers and scientists, but they can't be so expensive to acquire or run that only a handful of organizations can use them. At the same time, the emergence of high-level data analytics and machine learning is forcing some changes in the exascale efforts in the United States, changes that play a role in everything from the software stacks being developed for the systems to the competition with Chinese companies that are also aggressively pursuing exascale computing. During a talk last week at the OpenFabrics Workshop in Austin, Texas, Al Geist of Oak Ridge National Laboratory, CTO of the Exascale Computing Project (ECP), outlined the work the ECP is doing to develop exascale-capable systems within the next few years.
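
A quick sanity check on the performance figure quoted above, using the round numbers from these articles (a sketch, not a statement about any specific machine):

```python
# 50x the performance of a 20-petaflop system lands at one exaflop,
# i.e. the quintillion (1e18) operations per second mentioned earlier.
current_pflops = 20      # today's ~20-petaflop leadership-class systems
speedup = 50             # factor quoted for exascale systems
target_pflops = current_pflops * speedup
print(target_pflops)              # 1000 petaflops = 1 exaflop
print(target_pflops * 1e15)       # 1e+18 operations per second
```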