We Almost Gave Up On Building Artificial Brains - The Crux

#artificialintelligence

Today artificial neural networks are making art, writing speeches, identifying faces and even driving cars. It feels as if we're riding the wave of a novel technological era, but the current rise of neural networks is actually a renaissance of sorts. It may be hard to believe, but researchers were already seeing promise in mathematical models of neural networks as early as World War II. Yet by the 1970s, the field was ready to give up on them entirely. "[T]here were no impressive results until computers grew up, that is until the past 10 years," says Patrick Henry Winston, a professor at MIT who specializes in artificial intelligence.


In Pursuit of Virtual Life

Communications of the ACM

At first glance, the creature known as Caenorhabditis elegans--commonly referred to as C. elegans, a type of roundworm--seems remarkably simple; it is composed of only 959 cells, just 302 of which are neurons. In contrast, the human body contains somewhere around 100 trillion cells, with about 100 billion neurons in the brain. Yet decoding this worm's genome and digitally reproducing the organism--something that could spur enormous advances in our understanding of life and how organisms work--is a challenge for the ages. "The project will take years to complete. It involves enormous time and resources," says Stephen Larson, project coordinator for the OpenWorm Foundation.


Building a better brain

#artificialintelligence

The human brain weighs three pounds and is made up of more than 100 billion nerve cells that allow us to remember birthdays, recognize and evade danger, compose symphonies, build bridges, and design super-smart machines to take over the tasks we find too difficult, too dirty, or too boring.


How artificial intelligence can benefit India

#artificialintelligence

China is investing $2.1 billion in an AI research park. It is time to ensure that India is not left behind in this important field.


Separating science fact from science hype: How far off is the singularity?

#artificialintelligence

The term "artificial intelligence" was only just coined about 60 years ago, but today, we have no shortage of experts pondering the future of AI. Chief amongst the topics considered is the technological singularity, a moment when machines reach a level of intelligence that exceeds that of humans.


The development of AI ethics must keep pace with innovation

#artificialintelligence

The ability of artificial intelligence to make ethically sound decisions is a hot topic in debates around the world. The issue is particularly prominent in discussions about the future of autonomous cars, but it extends to ethical conundrums like those depicted in sci-fi films such as Blade Runner.


Artificial Intelligence Fundamentals: Making Machines Intelligent

#artificialintelligence

We human beings are the most sophisticated living machines on Earth, with an intellect of our own that makes decisions and that allowed us to rule over every other living creature on the planet. We first acquired the skills necessary for our survival, but once survival was assured we began to explore further; our intelligence, which knows no boundaries, wanted more. We invented tools to save time and to ensure greater safety and security, and gradually we ventured to build machines that could serve as extensions of our brains, memorising more information and multitasking for us.


What If AI Succeeds?

AI Magazine

Within the time of a human generation, computer technology will be capable of producing computers with as many artificial neurons as there are neurons in the human brain. Within two human generations, intelligists (AI researchers) will have discovered how to use such massive computing capacity in brainlike ways. This situation raises the likelihood that twenty-first-century global politics will be dominated by the question, Who or what is to be the dominant species on this planet? This article discusses rival political and technological scenarios about the rise of the artilect (artificial intellect, ultraintelligent machine) and launches a plea that a world conference be held on the so-called "artilect debate." Many years ago, while reading my first book on molecular biology, I realized not only that living creatures, including human beings, are biochemical machines, but also that one day humanity would understand the principles of life well enough to reproduce life artificially (Langton 1989) and even create a creature more intelligent than we are.


How 5 of the Most Innovative Tech Companies Are Using AI In 2017

#artificialintelligence

For the past couple of years, AI has turned from a "meh" kind of topic into one of the leading trends in almost every industry. Large corporations are buying AI-focused startups as fast as they can. At the same time, the market is witnessing an unprecedented amount of investment in AI; Toyota, for example, launched a $100m fund to back AI startups. The technology has become so popular that you can find it in places where you would expect it least. One former Google engineer has even founded a religion that worships artificial intelligence. Zack Thoutt, a developer and a Game of Thrones fan, was so impatient for the show's new season that he built a neural network that wrote five chapters of the next book in the A Song of Ice and Fire saga. That said, let's go through some of the most notable AI implementations by the world's largest companies and startups.


Artificial Intelligence Is a Game Changer for Virtually Every Business

#artificialintelligence

On February 9, 2017, two technology market leaders made announcements: SAP unveiled its next-generation intelligent ERP system, and Nvidia announced that demand for artificial intelligence (AI) applications was driving demand for its graphics platform. On the face of it, these announcements were business as usual--routine sound bites that proliferate in the tech news landscape. Look a bit deeper, though, and you realize that this day marked a profound shift in both the way businesses use technology and the implications for the rest of us.