What the History of Math Can Teach Us About the Future of AI

#artificialintelligence

Whenever an impressive new technology comes along, people rush to imagine the havoc it could wreak on society, and they overreact. Today we see this happening with artificial intelligence (AI). I was at South by Southwest last month, where crowds were buzzing about Elon Musk's latest hyperbolic claim that AI poses a far greater danger to humanity than nuclear weapons. Some economists have similarly sounded alarms that automation will put nearly half of all jobs in the U.S. at risk by 2030. The drumbeat of doomsaying has people spooked: a Gallup/Northeastern study published in March found that about three out of four Americans are convinced that AI will destroy more jobs than it creates.


Announcing Ursa Labs: an innovation lab for open source data science

@machinelearnbot

Funding open source software development is a complicated subject. I'm excited to announce that I've founded Ursa Labs (https://ursalabs.org), an independent development lab with the mission of innovation in data science tooling. I am initially partnering with RStudio and Two Sigma to assist me in growing and maintaining the lab's operations, and to align engineering efforts on creating interoperable, cross-language computational systems for data science, all powered by Apache Arrow. In this post, I explain the rationale for forming Ursa Labs and what to expect in the future. In recent years, the world's businesses have become more dependent than ever on open source software ("OSS", henceforth).
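
To make "interoperable, cross-language" concrete: Arrow defines a single in-memory columnar format, so a table produced in one language can be consumed in another without a conversion step. The sketch below is a minimal Python illustration using pyarrow; the file name is a placeholder, and the same Feather file can be opened from R with the arrow package.

```python
# Minimal sketch of the cross-language interoperability Apache Arrow enables:
# a table written from Python can be read by R, Java, etc. without conversion.
# Assumes pyarrow is installed; the file name is illustrative.
import pyarrow as pa
import pyarrow.feather as feather

# Build an Arrow table directly from native Python data
table = pa.table({
    "city": ["Lisbon", "Austin", "Nashville"],
    "temp_c": [21.5, 27.0, 18.2],
})

# Feather v2 is the Arrow IPC format on disk; R's arrow package
# (read_feather) can open this file with no conversion step.
feather.write_feather(table, "shared_data.feather")

roundtrip = feather.read_table("shared_data.feather")
print(roundtrip.schema)
```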


March of the robots: how artificial intelligence can enhance our lives

#artificialintelligence

The death of Stephen Hawking has robbed the world of one of its greatest minds – an intellect that would have been locked away and extinguished much sooner had he been born a generation earlier. The severely disabled scientist was one of the greatest beneficiaries, and also one of the most outspoken cautionary voices, of automation and artificial intelligence. "The development of full artificial intelligence could spell the end of the human race," said Hawking in 2014, ironically through a speech synthesiser that used machine learning to anticipate how the professor thought and suggest words he might want to use next. Full artificial intelligence is likely to emerge from a more advanced version of this machine learning, which encourages computers to use something akin to intuition to advance beyond their basic programming. It's a potentially dangerous endeavour if it is developed unchecked.
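
The word-suggestion idea behind that synthesiser can be illustrated with a toy model. The sketch below is a minimal bigram predictor, assuming nothing about the actual system Hawking used; it simply counts which word tends to follow which and suggests the most frequent continuations.

```python
# Minimal sketch of next-word prediction: a toy bigram model that suggests
# likely continuations. Illustration only, not Hawking's actual system.
from collections import Counter, defaultdict

corpus = ("the development of full artificial intelligence could spell "
          "the end of the human race").split()

# Count which word follows which
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def suggest(word, k=3):
    """Return the k most frequent next words after `word`."""
    return [w for w, _ in follows[word].most_common(k)]

print(suggest("the"))  # e.g. ['development', 'end', 'human']
```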


What depressed robots can teach us about mental health | Zachary Mainen

#artificialintelligence

Depression seems a uniquely human way of suffering, but surprising new ways of thinking about it are coming from the field of artificial intelligence. Worldwide, over 350 million people have depression, and rates are climbing. The success of today's generation of AI owes much to studies of the brain. Might AI return the favour and shed light on mental illness? The central idea of computational neuroscience is that similar issues face any intelligent agent – human or artificial – and therefore call for similar sorts of solutions.


How to Develop AI on a Raspberry Pi With Google Colaboratory

#artificialintelligence

Last year Google partnered with the Raspberry Pi Foundation to survey users on what would be most helpful in bringing Google's artificial intelligence and machine learning tools to the Raspberry Pi. Now those efforts are paying off. Thanks to Colaboratory – a new open-source project from Google – engineers, researchers, and makers can now build and run machine learning applications on a simple single-board computer. Google has officially opened up its machine learning and data science workflow – making learning about machine learning or data analytics as easy as using a notebook and a Raspberry Pi. Google's Colaboratory is a research and education tool that can easily be shared via Google's Chrome web browser.
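
A typical workflow this pairing enables: train a model in a Colab notebook, where free compute is available, then ship a compact version of it to the Pi for inference. The sketch below is one hypothetical way to do that with TensorFlow and its Lite runtime; the toy data and file names are placeholders, not part of Google's published tutorial.

```python
# Minimal sketch: train a tiny model in a Colab notebook, then run it on a
# Raspberry Pi. Assumes TensorFlow in Colab; on the Pi itself you would
# typically install the lighter tflite_runtime package instead.
import tensorflow as tf
import numpy as np

# --- In Colab: train a tiny classifier on toy data ---
x = np.random.rand(200, 4).astype("float32")
y = (x.sum(axis=1) > 2.0).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.fit(x, y, epochs=5, verbose=0)

# Convert to TensorFlow Lite, a format compact enough for the Pi
converter = tf.lite.TFLiteConverter.from_keras_model(model)
with open("model.tflite", "wb") as f:
    f.write(converter.convert())

# --- On the Raspberry Pi: load the converted model and run inference ---
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.random.rand(1, 4).astype("float32"))
interpreter.invoke()
print(interpreter.get_tensor(out["index"]))
```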


Use of Machine Learning Technology in Mobile Apps

#artificialintelligence

The technology that holds the greatest promise for bringing computing machines closer to human intelligence is machine learning. Machine learning is an area within the field of artificial intelligence that can solve a broad range of computational tasks and significantly improve the user experience. As computers become smarter and more efficient at interacting with humans, we are slowly entering a new era in which computational machines will have more influence over human actions than ever before. Machine learning may still be a young idea, but we already benefit from it every day.


Could artificial intelligence get depressed and have hallucinations?

#artificialintelligence

A hallucinating artificial intelligence might see something like the dream-like images produced by Google's Deep Dream algorithm. As artificial intelligence (AI) allows machines to become more like humans, will they experience similar psychological quirks such as hallucinations or depression? And might this be a good thing? Last month, New York University in New York City hosted a symposium called Canonical Computations in Brains and Machines, where neuroscientists and AI experts discussed overlaps in the way humans and machines think. Zachary Mainen, a neuroscientist at the Champalimaud Centre for the Unknown, a neuroscience and cancer research institute in Lisbon, speculated that we might expect an intelligent machine to suffer some of the same mental problems people do.
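
Deep Dream's "hallucinations" come from a simple mechanism: run gradient ascent on the input image so that whatever patterns a chosen network layer weakly detects get amplified until they dominate the picture. Below is a minimal sketch of that idea in TensorFlow; the layer name and step size are illustrative choices, not Google's published settings.

```python
# Minimal sketch of the Deep Dream idea: gradient ascent on the input image
# to amplify whatever features a chosen layer already detects.
import tensorflow as tf
import numpy as np

base = tf.keras.applications.InceptionV3(include_top=False, weights="imagenet")
layer = base.get_layer("mixed3").output      # illustrative layer choice
dream_model = tf.keras.Model(inputs=base.input, outputs=layer)

# Start from noise; in practice you would start from a photograph
img = tf.Variable(np.random.uniform(-1, 1, (1, 299, 299, 3)).astype("float32"))

for step in range(50):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(dream_model(img))  # how strongly the layer fires
    grads = tape.gradient(loss, img)
    grads /= tf.math.reduce_std(grads) + 1e-8    # normalise the step size
    img.assign_add(0.01 * grads)                 # ascend rather than descend
```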


2001: A Space Odyssey Predicted The Future--50 Years Ago

WIRED

The space race was in full swing. For the first time, a space probe had recently landed on another planet (Venus). And I was eagerly studying everything I could to do with space. Then on April 2, 1968 (May 15 in the UK), the movie 2001: A Space Odyssey was released--and I was keen to see it. So in the early summer of 1968 there I was, the first time I'd ever been in an actual cinema (yes, it was called that in the UK). I'd been dropped off for a matinee, and was pretty much the only person in the theater. And to this day, I remember sitting in a plush seat and eagerly waiting for the curtain to go up, and the movie to begin.

It started with an impressive extraterrestrial sunrise. But then what was going on? Those were landscapes, and animals. I was confused, and frankly a little bored. But just when I was getting concerned, there was a bone thrown in the air that morphed into a spacecraft, and pretty soon there was a rousing waltz--and a big space station turning majestically on the screen.

The next two hours had a big effect on me. It wasn't really the spacecraft (I'd seen plenty of them in books by then, and in fact made many of my own concept designs). But what was new and exciting for me in the movie was the whole atmosphere of a world full of technology--and the notion of what might be possible there, with all those bright screens doing things, and, yes, computers driving it all. It would be another year before I saw my first actual computer in real life. But those two hours in 1968 watching 2001 defined an image of what the computational future could be like, that I carried around for years.

I think it was during the intermission to the movie that some seller of refreshments--perhaps charmed by a solitary kid so earnestly pondering the movie--gave me a "cinema program" about the movie. Half a century later I still have that program, complete with a food stain, and faded writing from my 8-year-old self, recording (with some misspelling) where and when I saw the movie.

A lot has happened in the past 50 years, particularly in technology, and it's an interesting experience for me to watch 2001 again--and compare what it predicted with what's actually happened. Of course, some of what's actually been built over the past 50 years has been done by people like me, who were influenced in larger or smaller ways by 2001. When Wolfram Alpha was launched in 2009--showing some distinctly HAL-like characteristics--we paid a little homage to 2001 in our failure message (needless to say, one piece of notable feedback we got at the beginning was someone asking: "How did you know my name was Dave?!").

One very obvious prediction of 2001 that hasn't panned out, at least yet, is routine, luxurious space travel. But like many other things in the movie, it doesn't feel like what was predicted was off track; it's just that--50 years later--we still haven't got there yet. Well, they have lots of flat-screen displays, just like real computers today.


Buzzword Convergence: Making Sense of Quantum Neural Blockchain AI--Stephen Wolfram Blog

#artificialintelligence

What happens if you take four of today's most popular buzzwords and string them together? Does the result mean anything? Given that today is April 1 (as well as being Easter Sunday), I thought it'd be fun to explore this. Think of it as an Easter egg… from which something interesting just might hatch. And to make it clear: while I'm fooling around in stringing the buzzwords together, the details of what I'll say here are perfectly real. But before we can really launch into talking about the whole string of buzzwords, let's discuss some of the background to each of the buzzwords on their own. Saying something is "quantum" sounds very modern. But actually, quantum mechanics is a century old. And over the course of the past century, it's been central to understanding and calculating lots of things in the physical sciences. But even after a century, "truly quantum" technology hasn't arrived. Yes, there are things like lasers and MRIs and atomic force microscopes that rely on quantum phenomena, and needed quantum mechanics in order to be invented. But when it comes to the practice of engineering, what's done is still basically all firmly classical, with nothing quantum about it. Today, though, there's a lot of talk about quantum computing, and how it might change everything.

