If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Years before smart homes became a thing, I replaced all the switches in our house with computerized switches. At first, it was just a way to add wall switches without pulling new wire. Over time, I got more ambitious. The system runs a timer routine when it detects no one is home, turns on the basement light when you open the door, and lights up rooms in succession on well-worn paths such as bedroom to kitchen. Other members of the family are less enthusiastic. A light might fail to turn on or might go out for lack of motion, or maybe for lack of any discernible reason. The house seems to have a mind of its own.
The nature of consciousness seems to be unique among scientific puzzles. Not only do neuroscientists have no fundamental explanation for how it arises from physical states of the brain, but we are not even sure whether they ever will. Astronomers wonder what dark matter is, geologists seek the origins of life, and biologists try to understand cancer: all difficult problems, of course, yet at least we have some idea of how to go about investigating them and rough conceptions of what their solutions could look like. Our first-person experience, on the other hand, lies beyond the traditional methods of science. Following the philosopher David Chalmers, we call it the hard problem of consciousness. But perhaps consciousness is not uniquely troublesome. Going back to Gottfried Leibniz and Immanuel Kant, philosophers of science have struggled with a lesser-known, but equally hard, problem of matter. What is physical matter in and of itself, behind the mathematical structure described by physics?
Learning difficulties are not linked to differences in particular brain regions, but to differences in how the brain is wired, research suggests. According to figures from the Department for Education, 14.9% of all pupils in England – about 1.3 million children – had special educational needs in January 2019, with 271,200 having difficulties that required support beyond typical special needs provision. Dyslexia, attention deficit hyperactivity disorder (ADHD), autism and dyspraxia are among conditions linked to learning difficulties. Now experts say different learning difficulties are not specific to particular diagnoses, nor are they linked to particular regions of the brain – as has previously been thought. Instead the team, from the University of Cambridge, say learning difficulties appear to be associated with differences in the way connections in the brain are organised.
Here is our annual list of technological advances that we believe will make a real difference in solving important problems. We avoid the one-off tricks, the overhyped new gadgets. Instead we look for those breakthroughs that will truly change how we live and work. We're excited to announce that with this year's list we're also launching our very first editorial podcast, Deep Tech, which will explore the people, places, and ideas featured in our most ambitious journalism. Later this year, Dutch researchers will complete a quantum internet between Delft and The Hague. An internet based on quantum physics will soon enable inherently secure communication. A team led by Stephanie Wehner, at Delft University of Technology, is building a network connecting four cities in the Netherlands entirely by means of quantum technology.
The global affective computing market is expected to see strong growth on the back of the rising deployment of human-machine interaction technologies. With enabling technologies already being adopted across a range of industry verticals, it could be said that the market has started to evolve. Strong demand for facial feature extraction software in recent years is expected to drive the deployment of cameras in affective computing systems. Detection of psychological disorders, facial expression recognition for dyslexia, autism, and other conditions in children with special needs, and various other applications could increase the use of affective computing technology. Life sciences and healthcare are forecast to show a promising rise in demand for affective computing.
FirstWord MedTech's Digital Ten is a fortnightly round-up of the 10 most read and noteworthy headlines related to digital health, including industry deals, alliances, collaborations, innovations and R&D news. Insulet, the company behind the Omnipod tubeless wearable insulin delivery system, is partnering with Abbott to integrate the latter's Freestyle Libre continuous glucose monitoring (CGM) sensor with its new-generation Omnipod Horizon automated insulin delivery (AID) system onto a digital platform. The companies will make their respective technologies compatible so they can be paired and share CGM and insulin dosing data on a digital platform. Abbott has similar partnerships with Novo Nordisk and Sanofi, in which the CGM tech will be developed to share data with the drug companies' connected insulin pens. Abbott also counts Bigfoot Biomedical and Tandem Diabetes Care among its insulin delivery partners.
Intel's Neural Compute Stick 2 is an example of machine learning hardware for edge devices. Analyzing large amounts of data based on complex machine learning algorithms requires significant computational capabilities. Therefore, much processing of data takes place in on-premises data centers or cloud-based infrastructure. However, with the arrival of powerful, low-energy consumption Internet of Things devices, computations can now be executed on edge devices such as robots themselves. This has given rise to the era of deploying advanced machine learning methods such as convolutional neural networks, or CNNs, at the edges of the network for "edge-based" ML.
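The workhorse of these edge-deployed CNNs is the convolution operation itself: a small learned filter is slid across the input and a weighted sum is computed at each position. As a minimal illustration (not any particular edge framework's API), here is a plain-NumPy sketch of a single valid-mode convolution layer, applied with a hypothetical hand-picked edge-detection kernel rather than learned weights:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation: the core operation of a CNN layer."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.empty((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            # Weighted sum of the kernel-sized window at (i, j)
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A tiny 4x4 "image" with a vertical edge down the middle,
# and a 2x2 kernel that responds to left-to-right intensity jumps.
image = np.array([
    [0., 0., 1., 1.],
    [0., 0., 1., 1.],
    [0., 0., 1., 1.],
    [0., 0., 1., 1.],
])
kernel = np.array([
    [-1., 1.],
    [-1., 1.],
])
response = conv2d(image, kernel)
print(response)  # strongest response where the edge sits
```

Real CNNs stack many such layers with learned kernels and nonlinearities; edge accelerators like the Neural Compute Stick exist precisely because this operation, repeated millions of times, dominates the compute cost.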
In order to train the machine learning algorithms, researchers gathered data on patients from the Indiana Network for Patient Care. The models used information on prescriptions and diagnoses, which are structured fields, as well as medical notes, which are free text, to predict the onset of dementia. Researchers found that the free-text notes were the most valuable to help identify people at risk of developing the disease. The research team, which also included scientists from Georgia State, Albert Einstein College of Medicine and Solid Research Group, recently published its findings on two different machine learning approaches. The paper published in the Journal of the American Geriatrics Society analyzed the results of a natural language processing algorithm, which learns rules by analyzing examples, and the Artificial Intelligence in Medicine article shared the results from a random forest model, which is built using an ensemble of decision trees. Both methods showed similar accuracy at predicting the onset of dementia within one and three years of diagnosis.
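To make the random forest approach concrete: a random forest trains many decision trees on random subsets of the data and features, then votes their predictions together. The sketch below is purely illustrative — it uses synthetic features standing in for structured fields like prescriptions and diagnoses, not the study's actual data or pipeline — and shows the basic fit/evaluate loop with scikit-learn:

```python
# Illustrative only: synthetic data, not the Indiana Network for Patient
# Care dataset; feature names and label are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Five synthetic numeric features standing in for coded clinical fields
X = rng.normal(size=(n, 5))
# Synthetic binary outcome driven by two of the features plus noise
y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An ensemble of 100 decision trees, each fit on a bootstrap sample
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)
print(f"held-out accuracy: {accuracy:.2f}")
```

Held-out evaluation like this is what makes the reported one- and three-year prediction accuracies meaningful: the model is scored on patients it never saw during training.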
In 2019, the number of published papers related to AI and machine learning was nearly 25,000 in the U.S. alone, up from roughly 10,000 in 2015. And NeurIPS 2019, one of the world's largest machine learning and computational neuroscience conferences, featured close to 2,000 accepted papers from thousands of attendees. There's no question that the momentum reflects an uptick in publicity and funding -- and correspondingly, competition -- within the AI research community. But some academics suggest the relentless push for progress might be causing more harm than good. In a recent tweet, Zachary Lipton, an assistant professor at Carnegie Mellon University, jointly appointed in the Tepper School of Business and the machine learning department, proposed a one-year moratorium on papers for the entire community, which he said might encourage "thinking" without "sprinting/hustling/spamming" toward deadlines.