Results


University of Huddersfield - University of the Year 2013

#artificialintelligence

Professor of Artificial Intelligence Wolfgang Faber comments on Google's announcement that its AlphaZero artificial intelligence program triumphed at chess against world-leading specialist software within hours of teaching itself the game from scratch, and considers where humans will start losing their jobs to intelligent computers and machines. "'Google's "superhuman" DeepMind AI claims chess crown' was a recent BBC headline. What does it mean, and are our jobs, or even our lives, in danger? First, let us look at what caused this headline: a few days ago, a group around David Silver, Thomas Hubert, and Julian Schrittwieser of the London-based, Google (or rather Alphabet)-owned DeepMind uploaded a manuscript to arXiv describing the system AlphaZero and reporting very impressive results in learning to play three traditional board games (chess, shogi, and Go) well. The setup allowed very successful (superhuman) strategies to be learned in only a few hours."
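The self-play idea behind AlphaZero can be illustrated at toy scale. The sketch below is only an illustration of the general self-play principle, not DeepMind's actual method (which uses deep neural networks and tree search); all names and parameters here are assumptions. It learns the game of Nim (take 1-3 objects; whoever takes the last one wins) purely by playing against itself with a tabular value estimate:

```python
import random

WIN = 1.0

def train(heap=10, episodes=5000, alpha=0.5, eps=0.1, seed=0):
    """Tabular self-play learner for Nim (take 1-3; last to take wins)."""
    rng = random.Random(seed)
    Q = {}  # (heap_size, move) -> estimated value for the player to move
    for _ in range(episodes):
        n, history = heap, []
        while n > 0:
            moves = [m for m in (1, 2, 3) if m <= n]
            if rng.random() < eps:              # explore occasionally
                m = rng.choice(moves)
            else:                               # otherwise play greedily
                m = max(moves, key=lambda mv: Q.get((n, mv), 0.0))
            history.append((n, m))
            n -= m
        # The player who took the last object wins; propagate the outcome
        # back through the game, flipping sign at every ply (self-play).
        reward = WIN
        for state in reversed(history):
            old = Q.get(state, 0.0)
            Q[state] = old + alpha * (reward - old)
            reward = -reward
    return Q

Q = train()
# Query the learned values for a heap of 6 (no guaranteed output claimed).
best_from_6 = max((1, 2, 3), key=lambda mv: Q.get((6, mv), 0.0))
```

With enough episodes the learned values should come to favour taking 2 from a heap of 6, leaving the opponent a multiple of 4, which is the known optimal Nim strategy; the point of the sketch is only that a useful policy emerges from self-play alone, with no human examples.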


Risk is for real if not Artificial Intelligence

#artificialintelligence

Artificial Intelligence is the future of growth. There is sure to be at least one article daily, in newspapers, on the internet, or on blogs, about revolutionary advances in Artificial Intelligence or one of its subfields disrupting established industries such as fintech, banking, and law. In the banking domain, the digital banking teams of all modern banks are planning to transform the customer experience with AI-based, chat-driven intelligent virtual assistants, i.e. bots. AI promises benefits, but it also poses urgent challenges (not threats, please make a note) that cut across almost all industries and businesses of any nature: software development, technical support, customer care, medicine, law, or factory and manufacturing work. The need of the hour is to upgrade our skill sets to exploit AI rather than compete with it.


Machine Learning Thursdays: Evolution in the Age of Accelerations

#artificialintelligence

Our existential conundrum moving forward is: how do humans and organizations evolve and remain relevant in this age of accelerations? We must adapt, and that requires acknowledging our inherent limitations and accepting new challenges. One thing we can be confident about is that all businesses are subject to digital disruption in the age of accelerations. The cost of entry for many industries is significantly lower due to the emerging cloud infrastructure.


The AI Revolution Is Eating Software: NVIDIA Is Powering It NVIDIA Blog

#artificialintelligence

It's great to see the two leading teams in AI computing racing each other while we collaborate deeply across the board, tuning TensorFlow performance and accelerating the Google cloud with NVIDIA CUDA GPUs. Dennard scaling, whereby reducing transistor size and voltage allowed designers to increase transistor density and speed while maintaining power density, is now limited by device physics. Such leaps in performance have drawn innovators from every industry, with the number of startups building GPU-driven AI services growing more than 4x over the past year to 1,300. Just as convolutional neural networks gave us the computer vision breakthrough needed to tackle self-driving cars, reinforcement learning and imitation learning may be the breakthroughs we need to tackle robotics.


The Artificial Activist Investor (AAI)

#artificialintelligence

Deep learning can screen social media behaviour on Twitter and Facebook, as well as news stories, to connect data points and make predictions. To figure this out, in 2014 NASA, the Universities Space Research Association, and Google jointly launched the Quantum Artificial Intelligence Lab. Eurekahedge, an independent data provider and alternative investment research firm that specialises in hedge fund databases, stated that its Eurekahedge AI/Machine Learning Hedge Fund Index has outperformed both traditional quant and more generalized hedge funds since 2010. The Guardian: Google's DeepMind makes AI program that can learn like a human.



#artificialintelligence

The world's top researchers are pushing the boundaries of artificial intelligence at the NVIDIA AI Labs, known as NVAIL, located at 20 top universities around the globe. At the University of Toronto, Raquel Urtasun is developing affordable self-driving cars. But because genomic data is highly complex, researchers must develop more effective deep learning techniques, said Adriana Romero, a post-doctoral fellow at the Montreal Institute for Learning Algorithms, Université de Montréal. "Right now you see robots in factories or other settings where they repeat the same thing over and over again," said Chelsea Finn, a doctoral student working in the University of California, Berkeley's AI lab, which was one of the first to receive an NVIDIA DGX-1.


The Era of AI Computing - Fedscoop

#artificialintelligence

Powering Through the End of Moore's Law: As Moore's law slows down, GPU computing performance, powered by improvements in everything from silicon to software, surges. Dennard scaling, whereby reducing transistor size and voltage allowed designers to increase transistor density and speed while maintaining power density, is now limited by device physics. The NVIDIA GPU Cloud platform gives AI developers access to our comprehensive deep learning software stack wherever they want it, on PCs, in the data center, or via the cloud. Just as convolutional neural networks gave us the computer vision breakthrough needed to tackle self-driving cars, reinforcement learning and imitation learning may be the breakthroughs we need to tackle robotics.


Navigating the AI ethical minefield without getting blown up

#artificialintelligence

That's the alarm bell sounding in the most thought-provoking report on AI to appear recently – Artificial Intelligence and Robotics, a 56-page white paper published by UK-RAS, the umbrella body for British robotics research. As AI, big data, and the related fields of machine learning, deep learning, and computer vision/object recognition rise, buyers and sellers are rushing to include AI in everything, from enterprise CRM to national surveillance programmes. In such a febrile environment, the risk is that the twin problems of confirmation bias in research and human prejudice in society become an automated pandemic: systems that are designed to tell people exactly what they want to hear; or software that perpetuates profound social problems. So it's a sobering thought that AI software with no common sense and probable bias, and which can't understand human emotions, behaviour, or social contexts, is being tasked with trawling context-free communications data (and even body art) pulled from human society in order to expose criminals, as they are defined by career politicians.


How to tell if AI or machine learning is real

#artificialintelligence

It refers specifically to software designed to detect patterns and observe outcomes, then use that analysis to adjust its own behavior or guide people to better results. Machine learning doesn't require the kind of perception and cognition that we associate with intelligence; it simply requires really good, really fast pattern matching and the ability to apply those patterns to its behavior and recommendations. Still, both can play a role in machine learning or AI systems (really, AI precursor systems), so it's not the use of the terms that's a red flag, but their flippant use. This is how Apple's Siri, Microsoft's Cortana, and Google Now work: They send your speech to the cloud, which translates it and figures out a response, then sends it back to your phone.
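The article's working definition of machine learning, detecting patterns in observed outcomes and using them to adjust future behaviour, can be sketched in a few lines. The class and method names below are illustrative assumptions, not any product's API; it is a toy recommender that reorders its suggestions by observed acceptance rate:

```python
from collections import defaultdict

class OutcomeLearner:
    """Toy 'pattern matching + behaviour adjustment' recommender."""

    def __init__(self, items):
        self.items = list(items)
        self.shown = defaultdict(int)     # how often each item was offered
        self.accepted = defaultdict(int)  # how often it was accepted

    def recommend(self):
        # Behaviour adjusts to the patterns observed so far: rank items
        # by smoothed acceptance rate (Laplace smoothing avoids div-by-zero
        # and keeps unseen items in the middle rather than at the bottom).
        def score(item):
            return (self.accepted[item] + 1) / (self.shown[item] + 2)
        return sorted(self.items, key=score, reverse=True)

    def observe(self, item, accepted):
        # Record one outcome; this is the "learning" step.
        self.shown[item] += 1
        if accepted:
            self.accepted[item] += 1

learner = OutcomeLearner(["news", "music", "weather"])
for _ in range(5):
    learner.observe("weather", accepted=True)
    learner.observe("news", accepted=False)
# After observing outcomes, "weather" rises to the top of the ranking.
assert learner.recommend()[0] == "weather"
```

No perception or cognition is involved, which is exactly the article's point: fast bookkeeping over outcomes is enough to change behaviour, and the cloud assistants it mentions apply the same loop at vastly larger scale.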

