Parents worry about a lot of things--like whether their children will get into college, or become drug addicts, or get abducted by strangers. But I spend a lot more time worrying that my children are going to live with us forever because robots have taken all their potential jobs. As somebody who has spent her adult life focused largely on two things--studying technology trends and raising children--I'm acutely aware of the effect that continued advances in artificial intelligence could have on my children's opportunities. After all, a recent McKinsey report predicts that by 2030, when my two children are just joining the workforce, up to 30% of today's work will have been automated. The problem is, we don't know for certain which particular jobs will be automated years from now, because AI is constantly developing in surprising ways.
Researchers at Mt. Sinai's Icahn School of Medicine in New York have a unique collaborator in the hospital: their in-house artificial intelligence system, known as Deep Patient. The researchers taught Deep Patient to predict risk factors for 78 different diseases by feeding it electronic health records from 700,000 patients. Doctors now turn to the system to aid in diagnoses. While not a person, Deep Patient is more than just a program. Like other advanced AI systems, it learns, makes autonomous decisions, and has grown from a technological tool into a partner, coordinating and collaborating with humans.
STD and HIV cases have increased nationwide and across California. Long Beach now has the state's second-highest rate of chlamydia and third-highest rates of gonorrhea and syphilis. No single factor explains the increases, but health officials point to the popularity of online dating apps, casual hookups, and evidence that young people who are busy and otherwise healthy often don't bother with condoms or routine checkups.
Emerging anxieties pertaining to the rapid advancement and sophistication of artificial intelligence appear to be on a collision course with historic models of human exceptionality and individuality. Yet it is not just objective, technical sophistication in the development of AI that seems to cause this angst. It is also the linguistic treatment of machine "intelligence." But what is really at stake? Are we truly concerned that we will be surpassed in our capacities as human beings?
Electropherograms are produced in great numbers in forensic DNA laboratories as part of everyday criminal casework. Before the results of these electropherograms can be used, they must be scrutinised by analysts to determine what the identified data tells us about the underlying DNA sequences and what is purely an artefact of the DNA profiling process.
About 4,000 people listened to Cuban as he kicked off his shoes--literally--and explained how AI will change the game for companies, educators, and future developers. He's also keeping his eyes peeled for smaller companies in machine learning and AI, and already has at least three such companies in his investment portfolio. "[Software writing] skill sets won't be nearly as valuable as being able to take a liberal arts education … and applying those [skills] in assisting and developing networks." But in order for the country to advance to that future, AI and robotics need to become core competencies in the U.S., and not just in the business world, Cuban said.