What is AI, really? – AI-First Design – Medium

#artificialintelligence

This is the first chapter in Element AI's Foundations Series on AI-First Design (AI1D). Each chapter aims to define the component parts of AI1D in order to create a common language with which to explore this new era of design. You can read the intro to the series here, and sign up to stay tuned for the next chapter here. As a designer, why should you need to understand artificial intelligence? It's a term being bandied about so much in media and tech circles lately, a kind of catchall that could describe anything from virtual personal assistants and robots to sci-fi characters or the latest deep learning algorithm. Perhaps you work in AI and have a more nuanced understanding of these distinct fields, or maybe you just sense that your work will be affected in some way by AI in the coming years, but you're not quite sure how.


The Deep (Learning) Transformation of Mobile and Embedded Computing

IEEE Computer

Mobile and embedded devices increasingly rely on deep neural networks to understand the world, a feat that would have overwhelmed their system resources only a few years ago. Further integration of machine learning and embedded/mobile systems will require additional breakthroughs in efficient learning algorithms that can function under fluctuating resource constraints, giving rise to a field that straddles computer architecture, software systems, and artificial intelligence. N. D. Lane and P. Warden, "The Deep (Learning) Transformation of Mobile and Embedded Computing," in Computer, vol.
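
The abstract doesn't include code, but one common way to fit a trained network into an embedded device's memory and compute budget is post-training quantization. A minimal sketch using TensorFlow Lite (the saved-model path is a placeholder):

    import tensorflow as tf

    # Convert a trained SavedModel to TensorFlow Lite format.
    converter = tf.lite.TFLiteConverter.from_saved_model("path/to/saved_model")

    # Post-training quantization: store weights in 8 bits instead of 32,
    # shrinking the model roughly 4x and speeding up on-device inference.
    converter.optimizations = [tf.lite.Optimize.DEFAULT]

    tflite_model = converter.convert()
    with open("model.tflite", "wb") as f:
        f.write(tflite_model)

Quantization is only one of the efficiency techniques the authors allude to; pruning and compact architectures such as MobileNet attack the same resource constraint from other angles.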



Automated Machine Learning on the Cloud in Python – Towards Data Science

#artificialintelligence

This article will cover a brief introduction to these topics and show how to implement them, using Google Colaboratory to do automated machine learning on the cloud in Python. Originally, all computing was done on a mainframe: you logged in via a terminal and connected to a central machine that many users shared simultaneously. Then along came microprocessors and the personal computer revolution, and everyone got their own machine. Laptops and desktops work fine for routine tasks, but with the recent growth in the size of datasets and the computing power needed to run machine learning models, taking advantage of cloud resources is a necessity for data science.
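
The snippet doesn't name a specific library; TPOT is one popular open-source choice for automated machine learning in Python, and it runs unmodified in a Colaboratory notebook. A minimal sketch on a toy dataset:

    # pip install tpot  (in a Colaboratory cell: !pip install tpot)
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from tpot import TPOTClassifier

    X_train, X_test, y_train, y_test = train_test_split(
        *load_digits(return_X_y=True), test_size=0.25, random_state=42)

    # TPOT uses genetic programming to search over preprocessing + model pipelines.
    tpot = TPOTClassifier(generations=5, population_size=20,
                          verbosity=2, random_state=42)
    tpot.fit(X_train, y_train)
    print(tpot.score(X_test, y_test))

    # Export the best pipeline it found as ordinary scikit-learn code.
    tpot.export("best_pipeline.py")

Because the search itself is compute-heavy, running it on free cloud hardware rather than a laptop is exactly the trade the article describes.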


Explosive growth in AI compute shows enterprises must get smart about strategy

#artificialintelligence

Artificial intelligence research organization OpenAI recently released a report showing that the amount of compute used in the largest machine learning training runs has increased 300,000-fold since 2012. Because machine learning results improve when given additional computing resources, we'll likely see even greater demand for silicon infrastructure to drive better results. Enterprises are increasingly using machine learning to automate complex problems and analytical tasks. But OpenAI's research shows there's a key challenge ahead: how can enterprises build the infrastructure they need to produce the business results they want when the technical requirements keep changing? First off, enterprises should try to find the least complicated algorithm that solves the business problem at hand.
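
One practical way to follow that advice is to score a cheap baseline before reaching for anything compute-hungry; a hypothetical sketch with scikit-learn:

    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    X, y = load_breast_cancer(return_X_y=True)

    # A simple, interpretable baseline that trains in seconds on a laptop.
    baseline = LogisticRegression(max_iter=5000)
    print(cross_val_score(baseline, X, y, cv=5).mean())

Only if the baseline falls short of the business target is it worth paying for deeper models and the infrastructure they demand.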


Microsoft CEO Nadella: The whole world is now a computer

ZDNet

Thanks to cloud computing, the Internet of Things and artificial intelligence, we should start to think of the planet as one giant computer, according to Microsoft chief executive Satya Nadella. "Digital technology, pervasively, is getting embedded in every place: every thing, every person, every walk of life is being fundamentally shaped by digital technology -- it is happening in our homes, our work, our places of entertainment," said Nadella, speaking in London. "It's amazing to think of a world as a computer. I think that's the right metaphor for us as we go forward." For some time, Nadella has been refocusing Microsoft on higher-growth areas such as cloud, machine learning and AI.


Machine Learning Treats Brain Disorders Where They Most Often Occur

#artificialintelligence

These neuropsychiatric disorders are prevalent in low- to middle-income countries due to various factors. Around 80 percent of the world's epilepsy cases occur in low- to middle-income countries, but only 20 percent of affected people get treatment. The physician-to-patient ratio can be as low as one for every 20,000 people in those countries, with even fewer psychiatrists and neurologists, causing a so-called treatment gap [1]. However, timely diagnosis and treatment of epilepsy is possible and can make a difference [2]. Last fall, partnering with Nanyang Technological University (NTU) in Singapore, we took the first steps in tackling this challenge in our Science for Social Good program. Our team included a Social Good Fellow from Columbia University, several machine learning and cloud computing researchers from IBM Research, and collaborators from NTU. Together, we came up with a cloud-based automated machine learning approach to provide decision support for non-specialist physicians in electroencephalography (EEG) analysis and interpretation.
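
The post doesn't describe IBM's actual pipeline; as a purely illustrative sketch of the kind of building block such decision support rests on, here is a classic EEG recipe: spectral band-power features fed to a simple classifier (the sampling rate, band edges, and random data below are placeholders):

    import numpy as np
    from scipy.signal import welch
    from sklearn.linear_model import LogisticRegression

    FS = 256  # sampling rate in Hz (assumed)
    BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

    def band_powers(segment):
        """Mean spectral power per EEG band for one single-channel segment."""
        freqs, psd = welch(segment, fs=FS, nperseg=FS * 2)
        return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in BANDS.values()]

    # Placeholder data: 100 ten-second single-channel segments with binary labels.
    rng = np.random.default_rng(0)
    X_raw = rng.standard_normal((100, FS * 10))
    y = rng.integers(0, 2, 100)

    X = np.array([band_powers(s) for s in X_raw])
    clf = LogisticRegression().fit(X, y)

A production system would of course use clinician-labeled recordings, multi-channel montages, and rigorous validation before supporting any diagnosis.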


How machine learning can drive clinical and operational improvements

#artificialintelligence

As the healthcare industry pivots toward enhancing the delivery of care, improving the efficiency of its business operations and advancing the quality of scientific discovery, many organizations are increasingly embracing machine learning as part of their overall clinical and financial strategy. Enthusiasm for advanced cognitive computing in healthcare is on the rise as providers recognize the need for analytical tools that yield predictive and preventive insights. By using machine learning algorithms, healthcare organizations can glean patterns in patient data and diagnostic imaging that help them diagnose and treat patients with greater accuracy, map care pathways and processes, reduce the cost of care, and improve outcomes. Yet implementing machine learning projects comes with challenges; for one, they can be expensive.


AI Weekly: Computing power is shaping the future of AI

#artificialintelligence

This week, OpenAI published an analysis documenting an explosion in compute power over the past six years that is driving advances in artificial intelligence. Compute used in the largest AI training runs, the piece found, has doubled every 3.5 months since 2012. The breakdown of the compute needed to create well-known AI systems like ResNets and AlphaGo Zero provides some of the clearest metrics available to show why AI is advancing so quickly and proliferating to all corners of society. Together with big data and improvements to algorithms, this boom in compute is carving paths to the future for both businesses and the rest of the world. This week, the international community met in Geneva for the AI for Good Global Summit to discuss how AI can be used to make progress toward the United Nations' Sustainable Development Goals, such as zero hunger, no poverty, and good health.
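
A quick back-of-the-envelope check shows the two headline numbers in these pieces are consistent with each other:

    import math

    factor = 300_000       # total growth in training compute OpenAI reports
    doubling_months = 3.5  # doubling period from the same analysis

    doublings = math.log2(factor)            # about 18.2 doublings
    print(doublings * doubling_months / 12)  # about 5.3 years, matching "since 2012"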


Intel Makes a Move into Vision Intelligence IoT Business

@machinelearnbot

Intel plays a lot of roles in the IT business besides making processors with microscopic transistors for servers, PCs, the internet of things, and mobile devices. It also makes security hardware and software, memory, programmable enterprise solutions, 5G connectivity hardware and software, and a list of other products too long to note here. But one of the greenest fields coming into the venerable chipmaker's view here in mid-2018 is what's called "the edge": that mysterious, nebulous and more distributed area outside the data center where a lot of computing is starting to happen, and where more and more will happen as time goes on. We're hearing a lot about this lately, largely because the devices on the fringes of centralized systems (smartphones, laptops, tablets, IoT devices) can hold much more information and do more with it than in years past. Intel wants to build more and more of the infrastructure for these devices and systems.