Machine learning expert Jordan bemoans use of AI as catch-all term - AI News

#artificialintelligence

A pioneer in machine learning has argued that the technology is best placed to augment human intelligence and bemoaned 'confusion' over the meaning of artificial intelligence (AI). Michael I. Jordan, a professor in the department of electrical engineering and computer science, and department of statistics, at the University of California, Berkeley, told the IEEE that while science-fiction discussions around AI were 'fun', they were also a 'distraction.' "There's not been enough focus on the real problem, which is building planetary-scale machine learning-based systems that actually work, deliver value to humans, and do not amplify inequities," said Jordan, in an article from IEEE Spectrum author Kathy Pretz. Jordan, whose awards include the IEEE John von Neumann Medal, awarded last year for his contributions to machine learning and data science, wrote an article entitled 'Artificial Intelligence: The Revolution Hasn't Happened Yet', first published in July 2019 but last updated at the start of this year. With various contributors thanked at the foot of the article – including one Jeff Bezos – Jordan outlined the rationale for caution.


Machine Learning In The Real World

#artificialintelligence

Over the past few decades, machine learning has emerged as the real-world face of what is often mistakenly called "artificial intelligence." It is establishing itself as a mainstream technology tool for companies, enabling them to improve productivity, planning, and ultimately, profits. Michael Jordan, professor of Computer Science and Statistics at the University of California, Berkeley, noted in a recent Medium post: "Most of what is being called 'AI' today, particularly in the public sphere, is what has been called 'machine learning' for the past several decades." Jordan argues that unlike much that is mislabeled "artificial intelligence," ML is the real thing. He maintains that it was already clear in the early 1990s that ML would grow to have massive industrial relevance.