New Greylock venture partner Mustafa Suleyman is looking for AI's next best thing – TechCrunch

#artificialintelligence

Mustafa Suleyman has been working in artificial intelligence for 12 years, trying to figure out how to use machine learning systems and AI to do important things in the world and have impact at scale. "And over the years, I've been lucky enough to be at the forefront of a lot of cutting-edge applications of AI," he told TechCrunch. "Over the years, that experience has given me a really good intuition for when a piece of AI is ready for the real world, and when it's not. The projects that I've seen fail are mostly because people overestimate how good AI is. People think that AI is this silver bullet and it can solve all your problems, but actually you have to craft the environment and the application problem correctly."


DeepMind co-founder Mustafa Suleyman launches new AI venture

#artificialintelligence

DeepMind co-founder Mustafa Suleyman has joined two other high-profile industry figures in launching a new venture called Inflection AI. LinkedIn co-founder Reid Hoffman is joining Suleyman on the venture. "Reid and I are excited to announce that we are co-founding a new company, Inflection AI," wrote Suleyman in a statement. "Inflection will be an AI-first consumer products company, incubated at Greylock, with all the advantages and expertise that come from being part of one of the most storied venture capital firms in the world." Dr Karén Simonyan, another former DeepMind AI expert, will serve as Inflection AI's chief scientist and its third co-founder.


The identity of the people on Google's artificial intelligence ethics board is still a mystery

#artificialintelligence

On Monday, DeepMind co-founder Mustafa Suleyman once again refused to say who sits on Google's mysterious AI ethics board, despite having previously said he wants to disclose it to the public. The board was quietly created in 2014 when Google acquired the London artificial intelligence lab. It was established in a bid to ensure that the self-thinking software DeepMind and Google are developing remains safe and of benefit to humanity. Speaking at the TechCrunch Disrupt conference, Suleyman said: "So look, I've said many times, we want to be as innovative and progressive and open with our governance as we are with our technology. It's no good for us to just be technologists in a vacuum, independently of the social and political consequences, build technologies that we think may or may not be useful, while we throw them over the wall."


DeepMind's Mustafa Suleyman joins Google AI

#artificialintelligence

DeepMind and co-founder Mustafa Suleyman have decided to go their separate ways. Earlier this year there were disputed reports that the two were arguing; some even suggested he had been placed on leave. But now it seems he has actually left the UK-based enterprise, writing: "Can't wait to get going! More in Jan as I start the new job!"


DeepMind's Mysterious Ethics Board Will Reportedly 'Control' AGI If It's Ever Created

#artificialintelligence

DeepMind's ethics board has been a closely guarded secret ever since the artificial intelligence company was acquired by Google in 2014. But a new report from Hal Hodson at The Economist sheds some light on how the ethics board came about and what it's there for. In the year leading up to the acquisition, DeepMind and Google signed an agreement drawn up by lawyers in London, called the "Ethics and Safety Review Agreement", according to the report. This agreement states that if DeepMind ever succeeds in its core mission of building artificial general intelligence (AGI) -- sometimes described as a machine that can successfully complete any intellectual task that a human can, and widely thought of as the holy grail in AI -- then control of that machine will lie with a governing panel known as the Ethics Board. The Ethics Board essentially allows DeepMind to legally maintain a degree of control over the technology it creates, no matter how valuable or dangerous it becomes, according to the report.