How Brain Waves Surf Sound Waves to Process Speech - Facts So Romantic

Nautilus

Reprinted with permission from Quanta Magazine's Abstractions blog. When he talks about where his fields of neuroscience and neuropsychology have taken a wrong turn, David Poeppel of New York University doesn't mince words. "There's an orgy of data but very little understanding," he said to a packed room at the American Association for the Advancement of Science annual meeting in February. He decried the "epistemological sterility" of experiments that do piecework measurements of the brain's wiring in the laboratory but are divorced from any guiding theories about behaviors and psychological phenomena in the natural world. It's delusional, he said, to think that simply adding up those pieces will eventually yield a meaningful picture of complex thought.


Self-navigating AI learns to take shortcuts: study

#artificialintelligence

A computer programme modelled on the human brain learnt to navigate a virtual maze and take shortcuts, outperforming a flesh-and-blood expert, its developers said Wednesday. While artificial intelligence (AI) programmes have recently made great strides in imitating human brain processing--everything from recognising objects to playing complicated board games--spatial navigation has remained a challenge. It requires recalculating one's position, after each step taken, relative to the starting point and destination--even when travelling a never-before-taken route. Navigation is considered a complex behavioural task, and in animals it is partly controlled by a sort of onboard GPS driven by "grid cells" in the entorhinal cortex, part of the brain's hippocampal formation. These cells have been observed firing in a regular pattern as mammals explore a new environment.
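
The core computational idea in that excerpt is path integration (dead reckoning): accumulate each step into a running estimate of position relative to the starting point, so the displacement back to the start or towards a goal can be computed at any time. Below is a minimal sketch of that idea with made-up step data; it is an illustration of the concept, not the DeepMind agent or its grid-cell network.

```python
import math

def integrate_path(steps):
    """Path integration / dead reckoning: accumulate (heading, distance) steps
    into an estimated (x, y) position relative to the starting point."""
    x, y = 0.0, 0.0
    for heading_rad, distance in steps:
        x += distance * math.cos(heading_rad)
        y += distance * math.sin(heading_rad)
    return x, y

# Example: three hypothetical steps through a virtual maze.
steps = [(0.0, 2.0), (math.pi / 2, 1.5), (math.pi, 0.5)]
pos = integrate_path(steps)
print("estimated position:", pos)
print("straight-line distance back to start:", math.hypot(*pos))
```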


This Machine Learning System Thinks About Music Like You Do -- NOVA Next

#artificialintelligence

If you've ever let Spotify DJ your party and then found yourself asking, half an hour in, "Spotify, what are you thinking?"--well, it actually may be thinking a lot like you. Scientists at the Massachusetts Institute of Technology reported in a new study that they've created a machine-learning system that processes sound just like humans do, whether it's discerning the meaning of a word or classifying music by genre. It's the first artificial system to mimic the way the brain interprets sounds--and it rivals humans in its accuracy. The research, published today in the journal Neuron, offers a tantalizing new way to study the brain. The researchers' model was based on what's called a deep neural network, a system whose structure is loosely inspired by neurons, or brain cells.
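
As a rough illustration of what such a deep network looks like, here is a toy 1-D convolutional classifier over raw audio. The layer sizes, class count, and input length are placeholders chosen for the sketch; this is not the architecture from the Neuron paper.

```python
import torch
import torch.nn as nn

class ToyAudioNet(nn.Module):
    """A toy convolutional network over raw waveforms: stacked filters extract
    features over time, then a linear layer maps them to genre logits."""
    def __init__(self, n_genres=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=64, stride=8), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=32, stride=4), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),          # pool features over time
        )
        self.classifier = nn.Linear(32, n_genres)

    def forward(self, waveform):              # waveform: (batch, 1, samples)
        h = self.features(waveform).squeeze(-1)
        return self.classifier(h)             # logits over genres

# One second of fake 16 kHz audio, just to show the tensor shapes.
logits = ToyAudioNet()(torch.randn(1, 1, 16000))
print(logits.shape)  # torch.Size([1, 10])
```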


Renault is using artificial intelligence to run its influencer marketing campaigns

#artificialintelligence

Renault is letting machine learning power the decisions it makes about influencer marketing. The automaker is testing an AI social listening tool in a number of European markets that allows it to segment audiences easily and identify the influencers those audiences trust. It's part of a wider shift by the brand away from mass reach and towards one-to-one communication on social, a strategy led by Francois-Xavier Pierrel, corporate director of data, CRM and social. Pierrel was poached from Facebook last year to fill the newly created role, and admitted it was "scary" that the car marque didn't have anyone in this position with a firm grip on its data until then. Now that he's settled in, it's clear he's trying to break down internal data silos and realign the company's social strategy to both engage and retain customers.


Will AI Ever Become Conscious?

#artificialintelligence

One example of a sci-fi struggle to define AI consciousness is AMC's "Humans." At this point in the series, human-like machines called Synths have become self-aware; as they band together in communities to live independent lives and define who they are, they must also battle for acceptance and survival against the hostile humans who created and used them. But what exactly might "consciousness" mean for artificial intelligence (AI) in the real world, and how close is AI to reaching that goal? Philosophers have described consciousness as having a unique sense of self coupled with an awareness of what's going on around you. And neuroscientists have offered their own perspective on how consciousness might be quantified, through analysis of a person's brain activity as it integrates and interprets sensory data.


Evolution of pallium, hippocampus, and cortical cell types revealed by single-cell transcriptomics in reptiles

Science

Just how related are reptilian and mammalian brains? Tosches et al. used single-cell transcriptomics to study turtle, lizard, mouse, and human brain samples. They assessed how the mammalian six-layered cortex might be derived from the reptilian three-layered cortex. Despite a lack of correspondence between layers, mammalian astrocytes and adult neural stem cells shared evolutionary origins. General classes of interneuron types were represented across the evolutionary span, although subtypes were species-specific.


Imitation Learning in Unity: The Workflow – Unity Blog

#artificialintelligence

With the release of ML-Agents v0.3 Beta, there are lots of new ways to use Machine Learning in your projects. Whether you're working on games, simulations, academic research, or any other sort of project, your work can benefit from the use of neural networks in the virtual environment. If you've been using ML-Agents before this latest release, you will already be familiar with Reinforcement Learning. If not, I wrote a beginner's guide to get you started. This blog post will help you get up to speed with one of the major new features, an alternative to Reinforcement Learning: Imitation Learning.
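
In its simplest form, Imitation Learning is behavioral cloning: instead of optimising a reward signal, a policy is fit to recorded (observation, action) pairs from a demonstrator. The sketch below shows that idea with made-up data and a linear least-squares "policy"; it is a conceptual illustration, not the ML-Agents API or workflow.

```python
import numpy as np

# Behavioral cloning in miniature: fit a linear policy to demonstration data.
# Observations and actions here are random placeholders, not real recordings.
rng = np.random.default_rng(0)
demo_obs = rng.normal(size=(500, 8))          # recorded observations
true_policy = rng.normal(size=(8, 2))
demo_actions = demo_obs @ true_policy         # the demonstrator's actions

# "Training": solve for weights mapping observations to demonstrated actions.
weights, *_ = np.linalg.lstsq(demo_obs, demo_actions, rcond=None)

# The cloned policy can now act on observations it has never seen.
new_obs = rng.normal(size=(1, 8))
print("cloned action:", new_obs @ weights)
```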


Bitcoin explained by AI will melt your brain

#artificialintelligence

Hello, I have watched this video and I will never be the same again. By now I'm assuming we all have a fundamental understanding of what Bitcoin is: digital money regulated and distributed with encryption techniques, stored on a ledger independent of a centralised bank. Maybe go here for a better explanation. That's how a human being might explain Bitcoin. Here's how an AI might try to explain it.
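
To ground the "ledger secured with encryption techniques" part of that human-style explanation, here is a toy sketch of a hash-chained ledger. It is deliberately minimal; real Bitcoin adds proof-of-work, digital signatures, and a peer-to-peer network on top of this basic structure.

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents, including the previous block's hash,
    so tampering with any earlier entry breaks the whole chain."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

# A toy ledger: each block records transactions plus the previous block's hash.
genesis = {"prev": None, "txs": ["alice pays bob 1"]}
block2 = {"prev": block_hash(genesis), "txs": ["bob pays carol 0.5"]}

print("genesis hash:", block_hash(genesis))
print("block2 links to it via:", block2["prev"])
```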


Give the Robots Electronic Tongues

WIRED

Humans live their lives trapped in a glass cage of perception. You can only see a limited range of visible light, taste a limited range of tastes, and hear a limited range of sounds. But machines can, in a sense, leapfrog over the limitations of natural selection. By creating advanced robots, humans have invented a new kind of being, one that can theoretically sense a far greater range of stimuli. That presents roboticists with some fascinating challenges, not only in creating artificial senses of touch and taste, but in figuring out what robots should ignore in a human world.


We may have got the evolution of our big brains entirely wrong

New Scientist

Why are our brains six times as large as those of other mammals with bodies of a similar size? The leading hypothesis has been that our brain expansion was driven by social pressures, by the need to cooperate or compete with others. But instead the key factor may have been "ecological" challenges like finding food and lighting fires. "We were expecting social challenges to be a strong promoter of brain size," says Mauricio González-Forero of the University of St Andrews in the UK. He has developed a mathematical model of human brain evolution with his colleague Andy Gardner.