

DARPA digs into the details of practical quantum computing -- GCN

#artificialintelligence

Quantum computing promises enough computational power to solve problems far beyond the capabilities of the fastest digital computers, so the Defense Advanced Research Projects Agency is laying the groundwork for applying the technology to real-world problems. In a request for information, DARPA is asking how quantum computing can enable new capabilities for solving science and technology problems, such as understanding complex physical systems, optimizing artificial intelligence and machine learning, and enhancing distributed sensing. Noting that it is not interested in solving cryptology issues, DARPA is asking the research community to help address challenges of scale, environmental interactions, connectivity, and memory, and to suggest "hard" science and technology problems the technology could be leveraged to solve. Areas of interest include establishing the fundamental limits of quantum computing: how problems should be framed, when a model's scale requires a quantum-based solution, how to manage connectivity and errors, how large the potential speed gains are, and whether large problems can be broken into smaller pieces that map to several quantum platforms. Another area is improving machine learning by leveraging a hybrid quantum/classical computing approach to decrease the time required to train machine learning models.
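
A hybrid quantum/classical approach, as mentioned above, typically pairs a classical optimizer with a parameterized quantum circuit. The toy sketch below simulates that loop for a single qubit in plain NumPy; the circuit, cost function, and parameter values are illustrative assumptions and are not part of DARPA's request.

```python
# Toy hybrid quantum/classical loop: a classical optimizer tunes the parameter of a
# simulated one-qubit circuit. Purely illustrative; a real system would query hardware.
import numpy as np

def expectation_z(theta):
    # Expectation of Z after RY(theta) is applied to |0>; analytically this is cos(theta).
    return np.cos(theta)

def loss(theta, target=-1.0):
    # Classical cost: drive the measured expectation toward a target value.
    return (expectation_z(theta) - target) ** 2

theta, lr, target = 0.1, 0.2, -1.0
for _ in range(100):
    # Parameter-shift rule: the exact gradient of the expectation value comes from
    # two additional circuit evaluations.
    d_exp = (expectation_z(theta + np.pi / 2) - expectation_z(theta - np.pi / 2)) / 2
    grad = 2 * (expectation_z(theta) - target) * d_exp  # chain rule for the squared loss
    theta -= lr * grad  # classical update step

print(f"theta={theta:.3f}, loss={loss(theta):.6f}")  # theta approaches pi; loss approaches 0
```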


AI Vision IoT

#artificialintelligence

This uses the camera's live view. For the width and height, 1280 x 720 worked great for me, but you can play around with the dimensions to see what fits your needs. I set the frame rate to 30; the higher you set the number, the more computing power it requires. You can experiment to find the right value for your setup, but 30 has worked great for me.
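
As a concrete illustration, here is a minimal sketch of those settings using OpenCV in Python. The 1280 x 720 resolution and the frame rate of 30 come from the paragraph above; the device index and the preview loop are assumptions about a typical USB or Pi camera setup, not details from the article.

```python
# Minimal camera-view sketch with OpenCV; resolution and FPS follow the article,
# everything else (device index 0, preview loop) is an illustrative assumption.
import cv2

cap = cv2.VideoCapture(0)                  # first attached camera
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)    # width: adjust to fit your needs
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)    # height
cap.set(cv2.CAP_PROP_FPS, 30)              # higher values need more computing power

while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("camera view", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```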


Novel synaptic architecture for brain inspired computing

#artificialintelligence

The findings are an important step toward building more energy-efficient computing systems that are also capable of learning and adapting in the real world. They were published last week in a paper in the journal Nature Communications. The researchers, Bipin Rajendran, an associate professor of electrical and computer engineering, and S. R. Nandakumar, a graduate student in electrical engineering, have been developing brain-inspired computing systems that could be used for a wide range of big data applications. Over the past few years, deep learning algorithms have proven highly successful at solving complex cognitive tasks such as controlling self-driving cars and understanding language. At the heart of these algorithms are artificial neural networks -- mathematical models of the neurons and synapses of the brain -- that are fed huge amounts of data so that their synaptic strengths are autonomously adjusted to learn the intrinsic features and hidden correlations in these data streams.
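
To make that last point concrete, the sketch below shows the core idea of autonomously adjusting synaptic strengths from data, using plain gradient descent on a tiny linear model; the data, learning rate, and variable names are illustrative and are not taken from the paper.

```python
# Minimal sketch of "synaptic strengths adjusted from data": gradient descent on a
# tiny linear model. Data and hyperparameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))            # incoming data stream
true_w = np.array([0.5, -1.0, 2.0])      # hidden correlation the model should discover
y = X @ true_w

w = np.zeros(3)                          # synaptic strengths, initially untrained
lr = 0.1
for _ in range(500):
    error = X @ w - y                    # how far the current output is from the target
    w -= lr * (X.T @ error) / len(X)     # adjust the strengths to reduce the error

print(w)                                 # converges toward true_w
```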


A Quick History of Modern Robotics

#artificialintelligence

General Motors deployed the first mechanical-arm robot to operate one of its assembly lines as early as 1959. Since that time, robots have been employed to perform numerous manufacturing tasks such as welding, riveting, and painting. This first generation of robots was inflexible, could not readily respond to errors, and required individual programming specific to the tasks they were designed to perform. These robots were governed and inspired by logic -- a series of programs coded into their operating systems. Now, the next wave of intelligent robotics is taking advantage of a different kind of learning, predicated on experience rather than logical instruction, to learn how to perform tasks in much the same way that a child would.


How Blockchain and AI Integration is Changing Business

#artificialintelligence

Artificial intelligence has fascinated the human imagination since the term first started appearing in sci-fi books. Computer science is developing rapidly, and nowadays intelligent computers are no longer fiction -- they are a reality. Blockchain technology was first described in 2008 by the anonymous inventor of Bitcoin, Satoshi Nakamoto. Nobody knows anything about this person or group of people, and Mr. Nakamoto left the project in 2010. Yet his (or their) brainchild is still alive and kicking, and is implemented in innovative projects all over the world.


How Artificial Intelligence in Healthcare Can Improve Patient Outcomes

#artificialintelligence

When Benjamin Franklin said, "An ounce of prevention is worth a pound of cure," he was talking about fire safety. Nevertheless, the axiom works just as well when taken literally. In fact, Franklin's advice anticipated hundreds of years of healthcare best practices. Spotting and preventing medical problems early on is far cheaper and more efficient than catching them late. The problem for overworked physicians is that issues are not always easy for human eyes to detect.


Artificial Intelligence Boosts UAE GDP by $96 Billion by 2030

#artificialintelligence

Rapid adoption of artificial intelligence (AI) solutions will increase the UAE's GDP by USD 96 billion by 2030, enabling organizations to better predict and meet customer and citizen trends and drive digital business innovation. As the UAE Strategy for AI guides nationwide transformation, AI and machine learning are entering the mainstream. PwC predicts that AI will contribute USD 96 billion to the UAE's GDP by 2030. By industry, Accenture says finance (USD 37 billion), healthcare (USD 22 billion), and transport and storage (USD 19 billion) will see the biggest growth by 2035. "Artificial intelligence solutions can enable new innovations that can augment the existing workforce, optimizing costs, efficiency, and innovation."


Big Data and Robotics - DZone AI

#artificialintelligence

The last few months have witnessed a rise in the attention given to artificial intelligence (AI) and robotics. Robots have already become a part of society; in fact, they are now an integral part of it. Big data is also definitely a buzzword today. Enterprises worldwide generate a huge amount of data. That data doesn't come in any specified format.


Novel synaptic architecture for brain inspired computing

#artificialintelligence

The brain, with all its magnificent capabilities, is powered by less than 20 watts. Stop to think about that for a second. As I write this blog, my laptop is using about 80 watts; yet at only a fourth of that power, our brain outperforms state-of-the-art supercomputers by several orders of magnitude in energy efficiency and volume. For this reason, it shouldn't be surprising that scientists around the world are seeking inspiration from the human brain as a promising avenue towards the development of next-generation AI computing systems. While the IT industry has made significant progress in the past several years, particularly in using machine learning for computer vision and speech recognition, current technology is hitting a wall when it comes to deep neural networks matching the power efficiency of their biological counterpart. This could be about to change. As reported last week in Nature Communications, my colleagues and I at IBM Research and collaborators at EPFL and the New Jersey Institute of Technology have developed and experimentally tested an artificial synapse architecture using 1 million devices -- a significant step towards realizing large-scale and energy-efficient neuromorphic computing technology.
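
As a purely hypothetical illustration, the sketch below models a device-like synapse whose weight changes only in coarse, noisy conductance steps, a constraint commonly discussed for analog synaptic hardware; none of the parameters come from the paper.

```python
# Hypothetical device-like synapse: updates land as coarse, noisy conductance pulses
# rather than exact floating-point steps. All parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)

class DeviceSynapse:
    def __init__(self, levels=32, noise=0.02):
        self.g = 0.5                     # normalized conductance ("weight") in [0, 1]
        self.step = 1.0 / levels         # granularity of one programming pulse
        self.noise = noise               # pulse-to-pulse programming noise

    def update(self, delta):
        pulses = round(delta / self.step)             # quantize the requested change
        self.g += pulses * self.step + rng.normal(0.0, self.noise)
        self.g = float(np.clip(self.g, 0.0, 1.0))

syn = DeviceSynapse()
syn.update(+0.10)    # a potentiation request lands as whole pulses plus noise
syn.update(-0.01)    # a request smaller than half a pulse is lost entirely
print(syn.g)
```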


How Artificial Intelligence Could Help Us Live Longer

#artificialintelligence

What if we could generate novel molecules to target any disease, overnight, ready for clinical trials? Imagine leveraging machine learning to accomplish with 50 people what the pharmaceutical industry can barely do with an army of 5,000. It's a multibillion-dollar opportunity that can help billions. The worldwide pharmaceutical market, one of the slowest monolithic industries to adapt, surpassed $1.1 trillion in 2016. In 2018, the top 10 pharmaceutical companies alone are projected to generate over $355 billion in revenue.