Results



Communications of the ACM

On the other hand, some HPC systems run highly exotic hardware and software stacks. This means that, aside from all of the normal reasons any network-connected computer might be attacked, HPC computers have their own distinct systems, resources, and assets that an attacker might target, as well as their own distinctive attributes that make securing such systems somewhat different from securing other types of computing systems. As a result, although I discuss confidentiality, a typical component of the "C-I-A" triad, because even in open science data leakage is certainly an issue and a threat, this article focuses more on integrity-related threats [31, 32], including alteration of code or data or misuse of computing cycles, and on availability-related threats, including disruption or denial of service against HPC systems or the networks that connect them. The diagram at top shows a typical workflow for data analysis in HPC; the middle diagram shows a typical workflow for modeling and simulation; and the bottom diagram shows a coupled, interactive compute-visualization workflow.
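One common way to detect the integrity threats called out above, alteration of code or data, is to compare staged inputs and binaries against known-good digests before a job runs. The following is only a minimal, generic sketch of that idea in Python, not a mechanism described in the article; the file paths and the manifest contents are hypothetical.

    import hashlib
    from pathlib import Path

    def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
        """Return the SHA-256 digest of a file, read in chunks so large HPC datasets fit in memory."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def verify_manifest(manifest: dict, base_dir: Path) -> list:
        """Compare files under base_dir against previously recorded digests.

        Returns the relative paths whose contents no longer match, i.e.
        candidates for accidental corruption or deliberate tampering.
        """
        mismatches = []
        for rel_path, expected in manifest.items():
            if sha256_of(base_dir / rel_path) != expected:
                mismatches.append(rel_path)
        return mismatches

    # Hypothetical manifest recorded when the inputs were first staged.
    manifest = {"inputs/simulation_params.yaml": "<sha256 recorded at staging time>"}
    # altered = verify_manifest(manifest, Path("/scratch/my_project"))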


Zooming in on climate predictions

@machinelearnbot

"Rather than using machine learning to get people to click ads or maximize page views, I decided solving problems in climate science was a better use of my skills and time." "Rather than using machine learning to get people to click ads or maximize page views, I decided solving problems in climate science was a better use of my skills and time," Vandal said. "These downscaled datasets will be of immense value to climate researchers and eco-climatic modelers who want to study anything from the impact of ecosystems to changes in climate for future warming scenarios," said Sangram Ganguly, one of the study's co-authors and a senior research scientist at the Bay Area Environmental Research Institute at the NASA Ames Research Center. "The computer science field changes really fast," Auroop Ganguly said.


Intel and DARPA look to AI and machine learning to boost graph analytics in big data - TechRepublic

#artificialintelligence

Graph analytics focuses on the many-to-many relationships in data, giving insight into multi-layered, indirect relationships in those datasets, the release said. Specifically, Intel's Data Center Group (DCG), Platform Engineering Group (PEG), and Intel Labs are the divisions that will be working with DARPA. Big data is also a computing trend that is continuing to drive advances in AI, as well as creating new industries for practices such as data cleansing. As big data continues to make its way through the enterprise, some have even suggested that 2017 will be the year that data science hits a turning point in providing even more value to businesses.
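As a rough illustration of what "multi-layered, indirect relationships" means here, the sketch below walks a small adjacency-list graph and reports every entity reachable within two hops of a starting node. It is plain Python over made-up data, not Intel's or DARPA's tooling.

    from collections import deque

    # Toy adjacency list; the entities and edges are invented for illustration.
    graph = {
        "alice":    ["acct_17", "device_3"],
        "acct_17":  ["alice", "acct_92"],
        "acct_92":  ["acct_17", "bob"],
        "device_3": ["alice", "bob"],
        "bob":      ["acct_92", "device_3"],
    }

    def neighbors_within(graph, start, max_hops):
        """Breadth-first search returning every node reachable from `start`
        in at most `max_hops` edges, with its hop count."""
        seen = {start: 0}
        queue = deque([start])
        while queue:
            node = queue.popleft()
            if seen[node] == max_hops:
                continue
            for nxt in graph.get(node, []):
                if nxt not in seen:
                    seen[nxt] = seen[node] + 1
                    queue.append(nxt)
        seen.pop(start)
        return seen

    # Indirect (multi-hop) relationships around "alice":
    print(neighbors_within(graph, "alice", max_hops=2))
    # e.g. {'acct_17': 1, 'device_3': 1, 'acct_92': 2, 'bob': 2}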


British kid finds NASA mistake: when too many cooks don't spoil anything

Christian Science Monitor

March 24, 2017 -- The days when a chemist's assistant like Michael Faraday or a friar like Gregor Mendel could single-handedly revolutionize a field of science may seem long gone, but one British student is showing the world that anyone can play a role in research. This week NASA is grateful for the sharp eyes of 17-year-old Miles Soloman of Sheffield, England, who helped uncover a faulty sensor on board the International Space Station (ISS) when he noticed some wacky readings in a data spreadsheet. His findings add to a long history of amateurs making real contributions to science, a phenomenon many researchers are eager to encourage. Miles's physics teacher, James O'Neill, had no idea what was going to happen when he enrolled his class in the TimPix project from the Institute for Research in Schools (IRIS), an initiative that provides classes with data collected from a radiation detector on board the ISS. By studying the data sets, students can learn about energy and "contribute to research that will improve our understanding of radiation in space," IRIS writes on its website.


Wanted: Toolsmiths

Communications of the ACM

"As we honor the more mathematical, abstract, and scientific' parts of our subject more, and the practical parts less, we misdirect the young and brilliant minds away from a body of challenging and important problems that are our peculiar domain, depriving these problems of the powerful attacks they deserve." I have the privilege of working at the Defense Advanced Research Projects Agency (DARPA) and currently serve as the Acting Deputy Director of the Defense Sciences Office (DSO). Our goal at DARPA is to create and prevent technological surprise through investments in science and engineering, and our history and contributions are well documented. The DSO is sometimes called "DARPA's DARPA," because we strive to be at the forefront of all of science--on the constant lookout for opportunities to enhance our national security and collective well-being, and our projects are very diverse. One project uses cold atoms to measure time with 10 18th precision; another is creating amazing composite materials that can change the way in which we manufacture.


TEDx Manchester: AI & The Future of Work

#artificialintelligence

Quantum Computing "Today's conventional computing technology is compared to reading every book in a library, one by one. Set a reminder in 90 minutes... Is Jeff bezos a nice guy? Is arsenal winning the league? It's not immigrants, it's automation! Ride the train, don't jump in front of it Thank You! @vhirsch me@vhirsch.com


DARPA's latest idea could put today's Turing-era computers at risk

#artificialintelligence

The U.S. Defense Advanced Research Projects Agency (DARPA) has come up with some crazy ideas in the past, and its latest idea is to create computers that are always learning and adapting, much like humans. However, this isn't a far-fetched idea. Mobile devices, computers, and gadgets already have artificial intelligence features, with notable examples being Apple's Siri, Microsoft's Cortana, and Amazon's Alexa. But these devices can only learn and draw conclusions within the scope of the information pre-programmed into their systems. Existing machine-learning techniques don't allow computers to think outside the box, so to speak, or to reason dynamically as situations and circumstances change.


DARPA's latest idea could put today's Turing-era computers at risk

PCWorld

The U.S. Defense Advanced Research Projects Agency (DARPA) has come up with some crazy ideas in the past, and its latest idea is to create computers that are always learning and adapting, much like humans. DARPA's aptly named Lifelong Learning Machine (L2M) program has the ambitious goal to create technology for "new AI systems that learn online, in the field, and based on what they encounter -- without having to be taken offline for reprogramming or retraining for new conditions," according to a document published Thursday detailing the program. An adaptive computer that draws on experience to make decisions has been a "long-standing" goal, said Hava Siegelmann, program manager for the L2M project at DARPA. Giving computers this kind of biological-style intelligence will involve developing new computer architectures and new machine-learning techniques.
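The contrast L2M draws is with the usual train-offline-then-deploy workflow. As a loose illustration only, and not DARPA's approach, the sketch below updates a classifier incrementally on a stream of small batches using scikit-learn's partial_fit, so the model keeps adapting without a separate offline retraining step; the data stream is simulated.

    import numpy as np
    from sklearn.linear_model import SGDClassifier

    rng = np.random.default_rng(0)
    classes = np.array([0, 1])

    # A linear model updated one small batch at a time, rather than
    # being retrained offline on the full dataset.
    model = SGDClassifier(loss="log_loss")

    def stream_of_batches(n_batches=50, batch_size=32):
        """Simulated 'in the field' data: two noisy Gaussian blobs."""
        for _ in range(n_batches):
            y = rng.integers(0, 2, size=batch_size)
            X = rng.normal(loc=y[:, None] * 2.0, scale=1.0, size=(batch_size, 2))
            yield X, y

    first = True
    for X, y in stream_of_batches():
        # partial_fit folds in just this batch; the system can keep
        # serving predictions between updates.
        model.partial_fit(X, y, classes=classes if first else None)
        first = False

    print(model.predict(np.array([[2.0, 2.0], [0.0, 0.0]])))  # likely [1 0]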


Poll: Where readers stand on artificial intelligence, cloud computing and population health

#artificialintelligence

When IBM CEO Ginni Rometty delivered the opening keynote at HIMSS17, she effectively set the stage for artificial intelligence, cognitive computing and machine learning to be prevalent themes throughout the rest of the conference. Some 70 percent of respondents are either actively planning or researching artificial intelligence, cognitive computing and machine learning technologies, while 7 percent are rolling them out and 1 percent have already completed an implementation. It's not entirely surprising that more respondents, 30 percent, are either rolling out or have completed a rollout of population health technologies, while 50 percent are either researching or actively planning to do so. The overarching themes at the pre-conference HIMSS and Healthcare IT News Cloud Computing Forum on Sunday were, first, that security is not a core competency of hospitals and health systems, so many cloud providers can protect health data better than they can, and second, that the ability to spin up server, storage and compute resources on Amazon, Google or Microsoft is enabling a whole new era of innovation that simply is not possible when hospitals have to invest in their own infrastructure to run proofs-of-concept and pilot programs.


The machines that learned to listen

#artificialintelligence

"We don't want to look things up in dictionaries – so I wanted to build a machine to translate speech" – Alexander Waibel

At the 1962 World's Fair, IBM showcased its "Shoebox" machine, able to understand 16 spoken English words. In 1990, Dragon released the first consumer speech recognition product, Dragon Dictate, for a whopping $9,000. "Before that time, speech recognition products were limited to discrete speech, meaning that they could only recognise one word at a time," says Peter Mahoney, senior vice president and general manager of Dragon at Nuance Communications. In the last 10 years or so, machine learning techniques loosely based on the workings of the human brain have allowed computers to be trained on huge datasets of speech, enabling excellent recognition across many people using many different accents.
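For readers curious what "machine learning techniques loosely based on the workings of the human brain" look like in code, here is a deliberately tiny sketch: a small feed-forward network that maps acoustic feature frames to phoneme-like classes. The feature size, class count, and random data are invented for illustration; production recognizers are vastly larger and trained on real labelled speech corpora.

    import torch
    from torch import nn

    # Hypothetical setup: each example is a 13-dimensional acoustic feature
    # frame (e.g. MFCCs) labelled with one of 40 phoneme-like classes.
    n_features, n_classes = 13, 40

    model = nn.Sequential(
        nn.Linear(n_features, 128),
        nn.ReLU(),
        nn.Linear(128, 128),
        nn.ReLU(),
        nn.Linear(128, n_classes),
    )
    loss_fn = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    # Random stand-in data; a real corpus would supply frames and labels.
    X = torch.randn(2048, n_features)
    y = torch.randint(0, n_classes, (2048,))

    for epoch in range(5):
        optimizer.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        optimizer.step()
        print(f"epoch {epoch}: loss {loss.item():.3f}")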