If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Artificial Intelligence (AI) has always been a topic of debate--is it good for us? Are we walking towards a better future or an inevitable doom? According to an ongoing research program by the McKinsey Global Institute, every occupation comprises multiple types of activities, and each activity has a different potential for automation. Almost all occupations could be partially automated, and so almost half of all the work done by humans could eventually be taken over by a highly capable computer.
Say the word "exosuit" and superheroes come to mind -- somebody like Tony Stark from Marvel Comics, whose fancy suit enables him to become Iron Man. A Harvard research team's prototype of a portable exosuit is more modest: it is made of cloth components worn at the waist and thighs. A computer built into the shorts runs an algorithm that can sense when the user shifts between a walking gait and a running gait.
Deep learning is a subset of machine learning, the branch of artificial intelligence that configures computers to perform tasks through experience. An advanced technique, it has become increasingly popular in the past few years, thanks to abundant data and increased computing power. It's the main technology behind many of the applications we use every day, including online language translation and automated face-tagging in social media. This technology has also proved useful in healthcare: Earlier this year, computer scientists at the Massachusetts Institute of Technology (MIT) used deep learning to create a new computer program for detecting breast cancer. Classic models had required engineers to manually define the rules and logic for detecting cancer, but for this new model, the scientists gave a deep-learning algorithm 90,000 full-resolution mammogram scans from 60,000 patients and let it find the common patterns between scans of patients who ended up with breast cancer and those who didn't.
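The shift described above -- from rules an engineer writes by hand to rules a program learns from labeled examples -- can be illustrated with a deliberately tiny sketch. This is not the MIT model; the feature name, the data, and the threshold search are all invented for illustration, and a real system would learn millions of parameters from raw images rather than one cutoff from one number.

```python
# Toy illustration (hypothetical data): hand-coded rule vs. a rule
# learned from labeled examples.

# Each "scan" is reduced to one invented feature (say, mean tissue
# density), paired with a label: 1 = cancer later developed, 0 = not.
labeled_scans = [
    (0.20, 0), (0.35, 0), (0.40, 0), (0.55, 1),
    (0.60, 1), (0.45, 0), (0.70, 1), (0.65, 1),
]

def hand_coded_rule(density):
    """Classic approach: an engineer picks the cutoff by hand."""
    return 1 if density > 0.50 else 0

def learn_threshold(data):
    """Learned approach: search for the cutoff that best fits the labels."""
    best_t, best_correct = 0.0, -1
    for t in sorted(d for d, _ in data):
        correct = sum(1 for d, label in data
                      if (1 if d > t else 0) == label)
        if correct > best_correct:
            best_t, best_correct = t, correct
    return best_t

threshold = learn_threshold(labeled_scans)
```

The point of the sketch is the division of labor: in the first function a human supplies the decision rule, while in the second the rule is whatever setting best explains the labeled data -- the same principle, scaled up enormously, that lets a deep network discover patterns no engineer specified.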
Artificial Intelligence, or AI for short, has become quite the public buzzword. Companies and investors are pouring money into the field. Universities -- even high schools -- are rushing to start new degree programs or colleges dedicated to AI. Civil society organizations are scrambling to understand the impact of AI technology on humanity, and governments are competing to encourage or regulate AI research and deployment. One country, the United Arab Emirates, even boasts a minister for AI. At the same time, the world's militaries are developing AI-based weaponry to defeat their enemies, police agencies are experimenting with AI as a surveillance tool to identify or interrogate suspects, and companies are testing its ability to replace humans in menial or more meaningful jobs -- all of which may change the equation of life for all of the world's people.
Communication has been an essential part of human evolution. Communication enabled us to form groups, devise tactics, plan ahead and, essentially, survive the wild. So the importance of communication cannot be overstated. It was only natural that humans would want to build such communicating abilities into the machines they made. With the advancements in A.I., each new generation of systems could hold a longer conversation with humans than the previous one.
Machine learning, introduced 70 years ago, is based on evidence of the dynamics of learning in our brain. Using the speed of modern computers and large data sets, deep learning algorithms have recently produced results comparable to those of human experts in a variety of fields, but with characteristics that are distant from current knowledge of learning in neuroscience. Using advanced experiments on neuronal cultures and large-scale simulations, a group of scientists at Bar-Ilan University in Israel has demonstrated a new type of ultrafast artificial intelligence algorithm -- based on the very slow dynamics of the brain -- which outperforms the learning rates achieved to date by state-of-the-art learning algorithms. In an article published in the journal Scientific Reports, the researchers rebuild the bridge between neuroscience and advanced artificial intelligence algorithms that has been left virtually unused for almost 70 years. "The current scientific and technological viewpoint is that neurobiology and machine learning are two distinct disciplines that advanced independently," said the study's lead author, Prof. Ido Kanter, of Bar-Ilan University's Department of Physics and Gonda (Goldschmied) Multidisciplinary Brain Research Center.
This is part of a series of stories examining how artificial intelligence is disrupting industries. Can artificial intelligence (AI) make health care smarter? Anytime a patient breaks a bone, sprains an ankle or hits their head, the radiology industry goes to work, using X-rays, CT scanners, MRI machines and other tools and techniques to take a closer look inside the human body without the need for surgery. Radiology technicians take the pictures and radiologists examine them to determine the extent of the injury. What if a computer could do the analysis?
In Padang, West Sumatra, San Francisco-based non-profit organisation Rainforest Connection is mounting used cellphones on trees to detect sounds that originate from chainsaws or trucks belonging to illegal loggers. Rangers, villagers and law enforcement agencies are then alerted to the illegal activities and can take action. In Singapore, DBS Bank is predicting when employees will quit, so management can intervene and retain staff. In Taipei, Taiwan's performing arts centre, the National Theatre and Concert Hall, is using technology to provide automatic subtitling so that people with hearing disabilities can also enjoy performances. What unites the three cities in their cutting-edge exploits is a frontier technology known as artificial intelligence (AI).
Imagine if you had a version of Amazon's Alexa or Google Assistant inside your head, capable of feeding you external information whenever you required it, without you needing to say a single word and without anyone else hearing what it had to say back to you. An advanced version of this idea is the basis for future tech-utopian dreams like Elon Musk's Neuralink, a kind of connected digital layer above the cortex that will let our brains tap into hitherto unimaginable machine intelligence. Arnav Kapur, a postdoctoral student with the MIT Media Lab, has a similar idea. And he's already shown it off. The current AlterEgo device prototype looks a bit like one of those pop-star Britney mics, as imagined by the designers of the Star Trek: The Next Generation TV show.
By carrying out advanced experiments on neuronal cultures and large-scale simulations, a group of scientists from Bar-Ilan University in Israel claims to have created a new type of ultra-fast artificial intelligence algorithm. This algorithm is based on the dynamics of the human brain, which, despite its neurons firing at a much slower rate than modern computers, learns with remarkable speed and efficiency. In an article published today in the journal Scientific Reports, the researchers claim to be rebuilding the bridge between neuroscience and advanced artificial intelligence algorithms that, they say, has taken a backseat for almost 70 years. "The current scientific and technological viewpoint is that neurobiology and machine learning are two distinct disciplines that advanced independently," the study's lead author, Prof. Ido Kanter, of Bar-Ilan University's Department of Physics and Gonda (Goldschmied) Multidisciplinary Brain Research Center, said in a press release. "The absence of expectedly reciprocal influence is puzzling."