A hyperdimensional computing system that performs all core computations in-memory

#artificialintelligence

Hyperdimensional computing (HDC) is an emerging computing approach inspired by patterns of neural activity in the human brain. This type of computing can allow artificial intelligence systems to retain memories and process new information based on data or scenarios they previously encountered. Most HDC systems developed in the past perform well only on specific tasks, such as natural language processing (NLP) or time-series problems. In a paper published in Nature Electronics, researchers at IBM Research – Zurich and ETH Zurich presented a new HDC system that performs all core computations in-memory and that could be applied to a variety of tasks. "Our work was initiated by the natural fit between the two concepts of in-memory computing and hyperdimensional computing," Abu Sebastian and Abbas Rahimi, the two lead researchers behind the study, told TechXplore.
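As a rough illustration of the general HDC idea (not the authors' in-memory implementation), the approach represents items as very high-dimensional random vectors, combines them with simple elementwise operations, and retrieves them by similarity. A minimal sketch, assuming bipolar hypervectors and the standard bind/bundle/similarity operations:

```python
# Minimal hyperdimensional computing (HDC) sketch -- illustrates the general
# concept only, not the in-memory hardware system described in the article.
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality; random pairs are quasi-orthogonal

def rand_hv():
    """Random bipolar (+1/-1) hypervector."""
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    """Binding (elementwise multiply) associates two hypervectors."""
    return a * b

def bundle(*hvs):
    """Bundling (elementwise majority) superposes hypervectors into one."""
    return np.sign(np.sum(hvs, axis=0))

def sim(a, b):
    """Cosine similarity: near 0 for unrelated vectors, high for related."""
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Encode two key-value records and superpose them into a single memory vector.
k1, v1, k2, v2 = rand_hv(), rand_hv(), rand_hv(), rand_hv()
memory = bundle(bind(k1, v1), bind(k2, v2))

# Unbinding the memory with a key recovers a noisy copy of its value.
recovered = bind(memory, k1)
print(sim(recovered, v1))  # high similarity: v1 is retrieved
print(sim(recovered, v2))  # near zero: v2 looks like noise under key k1
```

Because the vectors are so high-dimensional, the superposed memory still yields a clearly higher similarity for the correct value than for unrelated ones, which is the property that lets HDC systems "retain memories" of previously seen patterns.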


Unique material design for brain-like computations

#artificialintelligence

Researchers at the U.S. Army Combat Capabilities Development Command's Army Research Laboratory say this may be changing as they endeavor to design computers inspired by the human brain's neural structure. As part of a collaboration with Lehigh University, Army researchers have identified a design strategy for the development of neuromorphic materials. "Neuromorphic materials is a name given to the material categories or combination of materials that provide both computing and memory capabilities in devices," said Dr. Sina Najmaei, a research scientist and electrical engineer with the laboratory. Najmaei and his colleagues published a paper, "Dynamically reconfigurable electronic and phononic properties in intercalated Hafnium Disulfide (HfS2)," in the May 2020 issue of Materials Today. The neuromorphic computing concept is an in-memory solution that promises orders-of-magnitude reductions in power consumption over conventional transistors, and is suitable for complex data classification and processing.


How the AI hardware market will emerge stronger from 2020 - TechHQ

#artificialintelligence

The semiconductor industry is looking towards recovery strategies. Software has been "the star of high-tech" over the years, but hardware is the core enabler of innovation. As businesses and consumers alike latch on to the advantages of AI applications, whether it's virtual assistants or facial recognition systems, there is a resurging need for advanced hardware. Deloitte describes semiconductors as "essential technology enablers" that power many of the cutting-edge digital devices we use today. By providing next-generation accelerator architectures, semiconductor companies can increase computational efficiency or facilitate the transfer of large data sets through memory or storage, which is crucial for machine learning and AI development.


Fiber: Distributed Computing for AI Made Simple

#artificialintelligence

Jeff Clune is the former Loy and Edith Harris Associate Professor in Computer Science at the University of Wyoming, a Senior Research Manager and founding member of Uber AI Labs, and currently a Research Team Leader at OpenAI. Jeff focuses on robotics and training neural networks via deep learning and deep reinforcement learning. He has also researched open questions in evolutionary biology using computational models of evolution, including studying the evolutionary origins of modularity, hierarchy, and evolvability. Prior to becoming a professor, he was a Research Scientist at Cornell University, received a PhD in computer science and an MA in philosophy from Michigan State University, and received a BA in philosophy from the University of Michigan. More about Jeff's research can be found at JeffClune.com.


Artificial Intelligence, Augmented Reality & Automation: Technology For Change

#artificialintelligence

Melvin Greer is Chief Data Scientist, Americas, at Intel Corporation. He is responsible for building Intel's data science platform through graph analytics, machine learning and cognitive computing to accelerate the transformation of data into a strategic asset for public-sector and commercial enterprises. His systems and software engineering experience has resulted in patented inventions in cloud computing, synthetic biology and IoT biosensors for edge analytics. His work significantly advances the body of knowledge in basic research and in critical, highly advanced engineering and scientific disciplines. Mr. Greer is a member of the American Association for the Advancement of Science (AAAS) and of the U.S. National Academies of Sciences, Engineering, and Medicine's Government-University-Industry Research Roundtable (GUIRR).


Artificial intelligence – the greatest opportunity for women? - Welcome to the WISE Campaign

#artificialintelligence

Sarah Burnett, Vice President at Everest Group, one of Computer Weekly's top 30 most influential women in IT, chair of BCSWomen and founder of the BCSWomen AI Accelerator for WISE members: "In March this year, I launched the new Artificial Intelligence (AI) Accelerator for WISE members, BCSWomen, to make AI more relevant to women and encourage more women into computing. Just …"


Neuromorphic Computing: The Next-Level Artificial Intelligence

#artificialintelligence

Can AI function like a human brain? Armed with neuromorphic computing, researchers are now ready to show the world that this dream can change the world for the better. As we unearth its benefits, the success of our machine learning and AI quest seems to depend to a great extent on the success of neuromorphic computing. The technologies of the future, like autonomous vehicles and robots, will need access to and utilization of an enormous amount of data and information in real time. Today, to a limited extent, this is done by machine learning and AI that depend on supercomputer power.


After this COVID winter comes an AI spring

#artificialintelligence

During boom times, companies focus on growth. In tough times, they seek to improve efficiency. History shows us that after every major economic downturn since the 1980s, businesses relied on digital technology and, specifically, innovations in software technology to return to full productivity with fewer repetitive jobs and less bloat. The years I've spent as a VC have convinced me that this is the best time to start an AI-first enterprise, not despite the recession, but because of it. The next economic recovery will both be driven by artificial intelligence and accelerate its adoption.


Robotics in business: Everything humans need to know

#artificialintelligence

One kind of robot has endured for the last half-century: the hulking one-armed Goliaths that dominate industrial assembly lines. These industrial robots have been task-specific -- built to spot weld, say, or add threads to the end of a pipe. They aren't sexy, but in the latter half of the 20th century they transformed industrial manufacturing and, with it, the low- and medium-skilled labor landscape in much of the US, Asia, and Europe. You've probably been hearing a lot more about robots and robotics over the last couple of years. That's because, for the first time since the 1961 debut of GM's Unimate, regarded as the first industrial robot, the field is once again transforming world economies. Only this time the impact is going to be broader. That's particularly true in light of the COVID-19 pandemic, which has helped advance automation adoption across a variety of industries as manufacturers, fulfillment centers, retail, and restaurants seek to create durable, hygienic operations that can withstand evolving disruptions and regulations.