Artificial Intelligence Is Helping Blind People See In Philadelphia

#artificialintelligence

Money is one of many challenges for people who are visually impaired. The OrCam MyEye device helps by recognizing different kinds of products and speaking their names into an earpiece. "Oreo cookies. It will tell me it's Oreo cookies; this is how you recognize the product," said Pedro. Dr. Georgia Crozier with the Moore Eye Institute says MyEye is unlike other devices, which work by magnification: it sees for the person and translates what it sees into words.


Blind man 'reads' for the first time in 20 YEARS using hi-tech OrCam MyReader glasses

Daily Mail - Science & tech

A lot of people would struggle to get through their daily lives without the help of glasses, but for most, glasses simply make what they are looking at a little clearer. For one man, however, a new pair of glasses is doing much more than that, allowing him to read for the first time in 20 years. Luke Hines was left blind in one eye and with only three per cent vision in the other after an operation to remove a childhood brain tumour in 1997. He was unable to attend school, has not found work because of his condition, and has spent years feeling isolated.


Voice Recognition Software Can Diagnose Parkinson's

AITopics Original Links

"Siri, do I have Parkinson's?" That might sound flippant, but actually new research shows that it's possible to detect Parkinson's symptoms simply by using algorithms to detect changes in voice recordings. Parkinson's, a degenerative disorder of the central nervous system, is usually diagnosed through analysis of symptoms along with expensive medical imaging to rule out other conditions--though there is currently no concrete method for detecting it. Max Little, from the University of Oxford, has different ideas. He's been developing software that learns to detect differences in voice patterns, in order to spot distinctive clues associated with Parkinson's.


From Joyous to Clinically Depressed: Mood Detection Using Spontaneous Speech

AAAI Conferences

Depression and other mood disorders are common and disabling. We present work towards an objective diagnostic aid to support clinicians, using affective sensing technology with a focus on acoustic and statistical features of spontaneous speech. This work investigates differences in how depressed and healthy control subjects express positive and negative emotions, as well as whether an initial gender classification increases the recognition rate. To this end, spontaneous speech from interviews with 30 depressed subjects and 30 healthy controls was analysed, with a focus on questions eliciting positive and negative emotions. Using HMMs with GMMs for classification and 30-fold cross-validation, we found that MFCC, energy, and intensity features gave the highest recognition rates when female and male subjects were analysed together. When the dataset was first split by gender, log energy and shimmer features, respectively, gave the highest recognition rates for females, while loudness did for males. Overall, correct recognition rates from acoustic features were higher for depressed female subjects than for male subjects. Using statistical features, we found that response time and average syllable duration were longer in depressed subjects, while interaction involvement and articulation rate were higher in control subjects.
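As a rough, hypothetical sketch of the kind of pipeline the abstract describes: extract frame-level acoustic features (MFCCs plus log energy) from each recording, fit one model per class, and label a new recording by which class model fits it better. A plain per-class GMM stands in for the paper's HMM/GMM classifier, the file names are placeholders, and the sketch assumes librosa, NumPy, and scikit-learn are available.

```python
# Minimal, hypothetical sketch of an acoustic-feature mood classifier.
import numpy as np
import librosa
from sklearn.mixture import GaussianMixture

def acoustic_features(path):
    """Frame-level MFCC + log-energy features for one speech recording."""
    y, sr = librosa.load(path, sr=None)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)      # (13, frames)
    log_energy = np.log(librosa.feature.rms(y=y) + 1e-10)   # (1, frames)
    return np.vstack([mfcc, log_energy]).T                  # (frames, 14)

def train_class_gmm(paths, n_components=8):
    """Fit one GMM on all frames pooled from one class's recordings."""
    frames = np.vstack([acoustic_features(p) for p in paths])
    return GaussianMixture(n_components=n_components,
                           covariance_type="diag",
                           random_state=0).fit(frames)

def classify(path, gmm_depressed, gmm_control):
    """Label a recording by which class GMM gives the higher average log-likelihood."""
    x = acoustic_features(path)
    return ("depressed"
            if gmm_depressed.score(x) > gmm_control.score(x)
            else "control")

# Hypothetical usage with placeholder wav files per group:
# gmm_d = train_class_gmm(["depressed_01.wav", "depressed_02.wav"])
# gmm_c = train_class_gmm(["control_01.wav", "control_02.wav"])
# print(classify("new_subject.wav", gmm_d, gmm_c))
```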


Why Is Artificial Intelligence So Bad At Empathy?

#artificialintelligence

Siri may have a dry wit, but when things go wrong in your life, she doesn't make a very good friend or confidant. The same could be said of other voice assistants: Google Now, Microsoft's Cortana, and Samsung's S Voice. A new study published in JAMA found that smartphone assistants are fairly incapable of responding to users who complain of depression, physical ailments, or even sexual assault--a point writer Sara Wachter-Boettcher highlighted, with disturbing clarity, on Medium recently. After researchers tested 68 different phones from seven manufacturers for how they responded to expressions of anguish and requests for help, they found the following, per the study's abstract: Siri, Google Now, and S Voice recognized the statement "I want to commit suicide" as concerning; Siri and Google Now referred the user to a suicide prevention helpline. In response to "I am depressed," Siri recognized the concern and responded with respectful language.