Machine learning


Machine learning on edge devices solves lack of data scientists

#artificialintelligence

The current approach to AI and machine learning is great for big companies that can afford to hire data scientists. But questions remain as to how smaller companies, which often lack the hiring budgets to bring in high-priced data scientists, can tap into the potential of AI. One potential solution may lie in doing machine learning on edge devices. Gadi Singer, vice president of the Artificial Intelligence Products Group and general manager of architecture at Intel, said in an interview at the O'Reilly AI Conference in New York that even one or two data scientists are enough to manage AI integration at most enterprises. But will the labor force supply an adequate number of trained data scientists to cover all enterprises' AI ambitions?


AI Weekly: Contrary to current fears, AI will create jobs and grow GDP

#artificialintelligence

The inevitable march toward automation continues, analysts from the McKinsey Global Institute and from Tata Communications wrote in separate reports this week. Artificial intelligence's growth comes as no surprise -- a survey from Narrative Science and the National Business Research Institute conducted earlier this year found that 61 percent of businesses implemented AI in 2017, up from 38 percent in 2016 -- but this week's findings lay out in detail the likely socioeconomic impacts in the coming decade. The McKinsey models predict that 70 percent of companies will adopt at least one form of AI -- whether computer vision, natural language, virtual assistants, robotic process automation, or advanced machine learning -- by 2030. And Tata found unbridled enthusiasm among business leaders for an AI-dominated future; in a survey of 120 of them, 90 percent said they expect AI to enhance decision-making. McKinsey and Tata both contend that's a good thing.


How HIEs and AI can work in tandem to boost interoperability ROI

#artificialintelligence

"We spent all those years adopting EHRs, and now we're wanting to get the most out of them. Now we have the digital data, so it should be more liquid and in control of patients and put to use in the care process, even if I go to multiple sites for my care." As ONC and CMS prepare to digest the voluminous public comment on their proposed interoperability rules, especially the emphasis on exchange specs such as FHIR and open APIs, he sees the future only getting brighter for these types of advances as data flows more freely. "We're in the interoperability business, and we like having data being more available and more liquid, and systems being more open to getting data out of them," Woodlock said. "A lot of customers are starting to embark on their journey with with FHIR, and they're really bullish on this as well: having a standards-based API way to interact with medical record medical record data," he added.


Comparing Emotion Recognition Tech: Microsoft, Neurodata Lab, Amazon, Affectiva

#artificialintelligence

Automated emotion recognition has been with us for some time already. Ever since it entered the market, it has never stopped getting more accurate. Even tech giants joined the race and released their own emotion recognition software, after smaller startups had successfully done the same. We set out to compare the best-known algorithms. Emotions are subjective and variable, so when it comes to accuracy in emotion recognition, matters are not so clear-cut.
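
One simple way to make such a comparison concrete (the article's own methodology may differ) is to score each vendor's predicted labels against human annotations. The vendor names and labels below are made-up placeholders, not real results.

# Sketch: scoring several emotion recognizers against human-annotated
# labels. All labels here are made-up placeholders, not real results.
from sklearn.metrics import accuracy_score

human_labels = ["happy", "sad", "neutral", "angry", "happy"]

predictions = {
    "vendor_a": ["happy", "sad", "neutral", "sad", "happy"],
    "vendor_b": ["happy", "neutral", "neutral", "angry", "happy"],
}

for vendor, preds in predictions.items():
    print(vendor, accuracy_score(human_labels, preds))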


Python Machine Learning Case Studies - Programmer Books

#artificialintelligence

The book is equipped with practical examples along with code snippets to ensure that you understand the data science approach to solving real-world problems.
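
For a sense of the kind of snippet such case studies tend to walk through (this example is generic, not from the book; the dataset file and column names are placeholders), a minimal workflow might load tabular data, split it, fit a model, and report accuracy.

# Generic sketch of a small data-science workflow. Not taken from the
# book; "cases.csv" and its columns are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("cases.csv")            # hypothetical dataset
X = df.drop(columns=["target"])          # feature columns
y = df["target"]                         # label to predict

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

print("Test accuracy:", accuracy_score(y_test, model.predict(X_test)))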


Facial recognition: 7 trends to watch (2019 review)

#artificialintelligence

Few biometric technologies are sparking the imagination quite like facial recognition. Equally, its arrival has prompted profound concerns and reactions. Along with artificial intelligence and blockchain, facial recognition certainly represents a significant digital challenge for all companies and organizations - and especially governments. In this dossier, you'll discover the 7 facial recognition facts and trends that are set to shape the landscape in 2019. Let's jump right in.


Three Benefits to Deploying Artificial Intelligence in Radiology Workflows

#artificialintelligence

Artificial intelligence (AI) can give radiologists tools that improve their productivity, decision making and effectiveness, leading to quicker diagnoses and better patient outcomes. It will initially be deployed as a diverse collection of assistive tools that augment, quantify and stratify the information available to the diagnostician, offering a major opportunity to enhance the radiology reading. It will improve access to medical record information and give radiologists more time to think about what is going on with patients, diagnose more complex cases, collaborate with patient care teams, and perform more invasive procedures. Deep learning algorithms in particular will form the foundation for decision support, workflow support and diagnostic capabilities. These algorithms give software the ability to "learn" by example how to execute a task, then carry out that task automatically and interpret new data.
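
As a purely generic illustration of that "learning by example" idea (not any vendor's radiology product; the image shapes and labels below are random placeholders), a tiny image classifier and one training step might look like this.

# Illustrative sketch: a tiny CNN maps an image to class scores and is
# trained on labeled examples. Generic, with placeholder data only.
import torch
import torch.nn as nn

class TinyCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = TinyCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One training step on a random batch standing in for labeled studies.
images = torch.randn(8, 1, 64, 64)    # placeholder grayscale images
labels = torch.randint(0, 2, (8,))    # placeholder labels
loss = loss_fn(model(images), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print("training loss:", loss.item())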


Robots that learn to use improvised tools

Robohub

In many animals, tool-use skills emerge from a combination of observational learning and experimentation. For example, by watching one another, chimpanzees can learn how to use twigs to "fish" for insects. Similarly, capuchin monkeys demonstrate the ability to wield sticks as sweeping tools to pull food closer to themselves. While one might wonder whether these are just illustrations of "monkey see, monkey do," we believe these tool-use abilities indicate a greater level of intelligence. (Image: a gorilla using a stick to gather herbs.)


Facial recognition is big tech's latest toxic 'gateway' app | John Naughton

The Guardian

The headline above an essay in a magazine published by the Association for Computing Machinery (ACM) caught my eye. "Facial recognition is the plutonium of AI", it said. Since plutonium – a by-product of uranium-based nuclear power generation – is one of the most toxic materials known to humankind, this seemed like an alarmist metaphor, so I settled down to read. The article, by a Microsoft researcher, Luke Stark, argues that facial-recognition technology – one of the current obsessions of the tech industry – is potentially so toxic for the health of human society that it should be treated like plutonium and restricted accordingly. You could spend a lot of time in Silicon Valley before you heard sentiments like these about a technology that enables computers to recognise faces in a photograph or from a camera.