Microsoft sends a new kind of AI processor into the cloud

#artificialintelligence

Microsoft rose to dominance during the '80s and '90s thanks to the success of its Windows operating system running on Intel's processors, a cosy relationship nicknamed "Wintel". Now Microsoft hopes that another hardware–software combo will help it recapture that success--and catch rivals Amazon and Google in the race to provide cutting-edge artificial intelligence through the cloud. Microsoft hopes to extend the popularity of its Azure cloud platform with a new kind of computer chip designed for the age of AI. Starting today, Microsoft is providing Azure customers with access to chips made by the British startup Graphcore. Graphcore, founded in Bristol, UK, in 2016, has attracted considerable attention among AI researchers--and several hundred million dollars in investment--on the promise that its chips will accelerate the computations required to make AI work.


Top Machine Learning Frameworks For AI Development Company [2020]

#artificialintelligence

It's a fact that artificial intelligence technology is increasingly making our lives easier. If we think about it, every other device now ships with some sort of machine learning tool that lets it operate with minimal human interference. AI is transforming every facet of our lives, so machine learning is growing at an ever faster pace, and so are the innovations of artificial intelligence development companies. Transportation has evolved well beyond traditional commuting methods, and mobile services now assist with clients' communication needs. Customers are gradually becoming accustomed to handling complex tasks from their mobile phones.


Intel flexes AI processing muscle

#artificialintelligence

Cloud and datacenter architects searching for new ways to pack more artificial intelligence horsepower into already constrained spaces will want to take a close look at Intel's new Nervana Neural Network Processors. Depending on the application, the processors may offer four times the performance or one-fifth the power draw of commercially available alternatives. The new processors are Intel's first ASIC offerings tailored specifically for deep learning workloads. The company announced last week that the processors are shipping now. In addition to the NNP-T1000 for training and the NNP-I1000 for inference, Intel also announced the coming generation of the Movidius Myriad Vision Processing Unit, which is designed for AI vision and inference processing at the edge.


Top 11 Hot Chips For Machine Learning

#artificialintelligence

Though machine learning has been around for more than three decades, it took a long time for hardware to catch up with the demands of these power-hungry algorithms. With each passing year, chipset manufacturers have tried to make the hardware lighter and faster. Today, over 100 companies are working on building next-generation chips and hardware architectures to match the capabilities of the algorithms. These chips are capable of enabling deep learning applications on smartphones and other edge computing devices. Intel recently revealed new details of its upcoming high-performance artificial intelligence accelerators: the Intel Nervana neural network processors.


Asus Hooks Up With Google to Create Tinker Board for AI

#artificialintelligence

Asus Japan announced this week that it'll show off two new single-board computers at the upcoming ET & IoT Technology 2019 event kicking off November 20 in Yokohama, Japan. The latest Tinker Edge T and Tinker Edge R are designed specifically for IoT (Internet of Things) and edge AI applications. The Tinker Edge T measures 85 x 56 mm, which is around the size of a credit card. Both single-board computers depend on a small heatsink with an accompanying cooling fan to stay cool during operation. The system also relies on the Vivante GC7000 Lite 3D graphics engine and Google's Coral Edge tensor processing unit (TPU), which is optimized for TensorFlow Lite and boasts performance up to 4 tera operations per second (TOPS).
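To make the Edge TPU's role concrete, here is a minimal sketch of running a TensorFlow Lite model through the Edge TPU delegate from Python. It assumes the tflite_runtime package and the Edge TPU runtime (libedgetpu) are installed on the board; the model filename "model_edgetpu.tflite" is an illustrative placeholder for a model compiled with Google's Edge TPU compiler, not anything shipped with the Tinker Edge boards.

```python
# Minimal sketch: TensorFlow Lite inference on a Coral Edge TPU.
# Assumes tflite_runtime and libedgetpu are installed; the model path
# is a hypothetical placeholder for an Edge-TPU-compiled model.
import numpy as np
import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(
    model_path="model_edgetpu.tflite",
    experimental_delegates=[tflite.load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input matching the model's expected shape and dtype.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

scores = interpreter.get_tensor(output_details[0]["index"])
print(scores)
```

The key design point is that the delegate offloads supported operations to the TPU, so the same TensorFlow Lite API works whether or not the accelerator is present.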


Can China Grow Its Own AI Tech Base?

#artificialintelligence

Last December, China's top AI scientists gathered in Suzhou for the annual Wu Wenjun AI Science and Technology Award ceremony. They had every reason to expect a feel-good appreciation of China's accomplishments in AI. Yet the mood was decidedly downbeat. "After talking about our advantages, everyone mainly wants to talk about the shortcomings of Chinese AI capabilities in the near-term--where are China's AI weaknesses," said Li Deyi, the president of the Chinese Association for Artificial Intelligence. More than two years after the release of the New Generation Artificial Intelligence Development Plan (AIDP), China's top AI experts worry that Beijing's AI push will not live up to the hype.


A List of Chip/IP for Deep Learning

#artificialintelligence

Machine learning, especially deep learning technology, is driving the evolution of artificial intelligence (AI). In the beginning, deep learning was primarily a software play. Starting in 2016, the need for more efficient hardware acceleration of AI/ML/DL was recognized in academia and industry. This year, we saw more and more players jump into the race, including the world's top semiconductor companies, a number of startups, and even tech giants such as Google. I believe it could be very interesting to look at them together. So I built this list of AI/ML/DL ICs and IPs on GitHub and keep it updated. If you have any suggestions or new information, please let me know. The companies and products in the list are organized into five categories, as shown in the following table. Intel purchased Nervana Systems, which was developing both a GPU/software approach and its Nervana Engine ASIC. Intel is also planning to integrate the technology into the Phi platform via a Knights Crest project.


How Parallel Processing Solves Our Biggest Computational Problems

#artificialintelligence

Take all the help you can get. If parallel computing has a central tenet, that might be it. Some of the crazy-complex computations asked of today's hardware are so demanding that the compute burden must be borne by multiple processors, effectively "parallelizing" whatever task is being performed. Perhaps the most notable push toward parallelism happened around 2006, when tech hardware powerhouse Nvidia approached Wen-mei Hwu, a professor of electrical and computer engineering at the University of Illinois at Urbana-Champaign. Nvidia was designing graphics processing units (GPUs) -- which, thanks to large numbers of threads and cores, had far higher memory bandwidth than traditional central processing units (CPUs) -- as a way to process huge numbers of pixels.
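As an illustration of that divide-and-conquer idea, here is a minimal Python sketch that spreads a stand-in per-pixel computation across CPU cores with a process pool; a GPU applies the same strategy at far larger scale with thousands of lightweight threads. The shade function is a hypothetical placeholder, not anything from the article.

```python
# Minimal sketch: parallelizing a per-pixel workload across CPU cores.
from concurrent.futures import ProcessPoolExecutor

def shade(pixel: int) -> int:
    # Hypothetical stand-in for an expensive per-pixel computation.
    return (pixel * pixel) % 256

if __name__ == "__main__":
    pixels = range(1_000_000)

    # Serial version: one core does all the work.
    serial = [shade(p) for p in pixels]

    # Parallel version: chunks of pixels are shaded on separate cores,
    # the same split-the-work strategy a GPU uses, just with far fewer lanes.
    with ProcessPoolExecutor() as pool:
        parallel = list(pool.map(shade, pixels, chunksize=10_000))

    # Both strategies compute the same result; only the scheduling differs.
    assert serial == parallel
```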


(PDF) Killer Robots - A Psychologist's Personal Reflections on AI in Warfare

#artificialintelligence

Pointing fingers is missing the point. Machines do not see the world as humans do. Among the questions I am asked most frequently: What if it would help to kill only the bad guys? What if AI would make war more humane? Machines Decide (How to react?) and Act (Do something, a response).


Intelligence community laying foundation for AI data analysis Federal News Network

#artificialintelligence

Artificial intelligence is a concept that seems tailor-made for the intelligence community. The ability to sort through massive amounts of data, seeking out patterns large and small and anomalies that warrant further investigation: that's what intelligence analysts already do. Imagine what they could achieve when augmented by AI. Dean Souleles, chief technology advisor for the Office of the Director of National Intelligence, said on Agency in Focus – Intelligence Community that the IC is working now to lay the foundation for adopting AI. "You cannot build a house without a solid foundation. The foundation of AI is data and computational technology," Souleles said. "The intelligence community has spent much of the last decade on a program we call ICITE, the information technology enterprise of the IC. And that's been about modernizing the technology infrastructure. And that is about getting cloud technology throughout the community, making basic computational capability available to our technologists just as it is in the private sector. But that's not good enough, because the new era of computation requires sophisticated kinds of computing. We talk about GPUs, graphical processing units, or tensor processing units (TPUs), or neuromorphic chips or field programmable gate arrays, or any of the wide variety of things that are the specialized computation that enable AI computation. And we need to make the investments in those things."