5 personal care innovations that lived up to the hype in 2024

Popular Science

Plenty of personal care products--the treatments and gadgets that fill our medicine cabinets, home gyms, and vanities--promise innovation. Companies that craft cosmetics, supplements, fitness tools, and other wellness aids tend to go hard on buzzwords without putting in the research to make something truly new. That doesn't mean there aren't worthwhile, forward-thinking personal care products available, though, and this year brought some notable offerings. From high-tech sleep and activity trackers that make peak performance possible to cutting-edge hair dryers that give your scalp a break from burns, these five beauty and wellness products actually back up their big promises. Be sure to read the full list of the 50 greatest innovations of 2024.


Temporary scalp tattoo can be used to record brain activity

New Scientist

Tattoos printed onto a person's scalp can detect electrical activity in the brain and carry signals to a recording device. Analysing brainwaves could be made easier by printing a temporary tattoo onto a person's head. Electroencephalography (EEG) is a way of measuring electrical activity in the brain via electrodes placed on the scalp. It can be used to test patients for neurological conditions such as epilepsy, tumours or injury from stroke or traumatic impacts to the head. Because people's skulls vary in size and shape, technicians have to spend considerable amounts of time measuring and marking the scalp to get accurate readings. A gel helps the electrodes detect brain signals, but it stops working well as it dries.
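The kind of analysis an EEG recording feeds into often starts with band power: how much signal energy sits in a frequency range like alpha (8-13 Hz). A minimal sketch of that step, using a synthetic signal in place of real scalp recordings (the sampling rate and band edges are illustrative assumptions, not values from the study):

```python
import numpy as np

# Synthetic stand-in for one EEG channel: a 10 Hz "alpha" oscillation
# plus noise. Real data would come from scalp electrodes.
fs = 256  # sampling rate in Hz (a common EEG choice)
t = np.arange(0, 4, 1 / fs)  # 4 seconds of samples
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)

def band_power(x, fs, lo, hi):
    """Average spectral power of x between lo and hi Hz."""
    freqs = np.fft.rfftfreq(x.size, d=1 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / x.size
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].mean()

alpha = band_power(signal, fs, 8, 13)   # alpha band
delta = band_power(signal, fs, 0.5, 4)  # delta band
```

For this synthetic signal the alpha-band power dominates, which is exactly the kind of contrast a clinician or classifier would look for in a real recording.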


Our brains have a built-in GPS! Scientists pinpoint a 'neural compass' that prevents us from getting lost

Daily Mail - Science & tech

For many of us, navigating the world seems like an impossible task without our smartphone. But a new study suggests humans are more adept at making our way from A to B than we might have realised. Scientists have found we have an 'internal neural compass' in our brains that lets us orientate ourselves and navigate through an environment. This compass – which takes the form of an electrical signal transmitted by nerve cells – tells us we're about to head in a new direction. What's more, once we've reoriented ourselves, it lets us know that we're travelling along a new path – so eastwards instead of northwards, for example.


China Has a Controversial Plan for Brain-Computer Interfaces

WIRED

At a tech forum in Beijing last week, a Chinese company unveiled a "homegrown" brain-computer interface that allowed a monkey to seemingly control a robotic arm just by thinking about it. In a video shown at the event, a monkey with its hands restrained uses the interface to move a robotic arm and grasp a strawberry. The system, developed by NeuCyber NeuroTech and the Chinese Institute for Brain Research, involves soft electrode filaments implanted in the brain, according to state-run news media outlet Xinhua. Researchers in the US have tested similar systems in paralyzed people to allow them to control robotic arms, but the demonstration underscores China's progress in developing its own brain-computer interface technology and vying with the West. Brain-computer interfaces, or BCIs, collect and analyze brain signals, often to allow direct control of an external device, such as a robotic arm, keyboard, or smartphone.


The Papers: 'Lockdown at Palace' and 'AI claims first scalp'

BBC News

Artificial intelligence has claimed its first scalp, according to the Financial Times. It says shares in the education sector fell sharply on Tuesday after US company Chegg, which provides online study guides, said that a "significant spike in student interest" in AI tool ChatGPT was harming its customer growth. The paper says it marks "one of the first instances of a company acknowledging a hit to its finances as a direct result of advances" in the technology.


Brain-Computer Interface Enables Mind Control of Robot Dog

#artificialintelligence

A new peer-reviewed study published in ACS Applied Nano Materials demonstrates a new type of AI-enabled brain-machine interface (BMI) featuring noninvasive biosensor nanotechnology and augmented reality that enables humans to use thoughts to control robots with a high degree of accuracy. Brain-machine interfaces (BMIs) are hands-free and voice-command-free communication systems that allow an individual to operate external devices through brain waves, with vast potential for future robotics, bionic prosthetics, neurogaming, electronics, and autonomous vehicles. The artificial intelligence (AI) renaissance, with the improved pattern-recognition capabilities of deep neural networks, is contributing to the acceleration of advances in brain-machine interfaces, also known as brain-computer interfaces (BCIs). AI deep learning helps find the relevant signals in the noisy brain activity data. The neural activity of the human brain is recorded using sensors.
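The pattern-recognition step described above can be sketched in miniature: separating two imagined-command classes from noisy signals. This is a hypothetical toy, not the study's method; real BMIs use multichannel data and deep networks, while here a nearest-centroid classifier on a single frequency feature stands in for the idea:

```python
import numpy as np

# Toy illustration: two "command" classes, each a noisy oscillation at a
# different dominant frequency. Values are illustrative assumptions.
fs, dur = 128, 2.0
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(1)

def trial(freq):
    """One noisy trial dominated by the given oscillation frequency."""
    return np.sin(2 * np.pi * freq * t) + 0.5 * rng.standard_normal(t.size)

def feature(x):
    """Dominant frequency of the trial, found as the FFT peak."""
    freqs = np.fft.rfftfreq(x.size, d=1 / fs)
    return freqs[np.argmax(np.abs(np.fft.rfft(x)))]

# "Training": class 0 trials centred near 10 Hz, class 1 near 22 Hz.
train = {0: [feature(trial(10)) for _ in range(20)],
         1: [feature(trial(22)) for _ in range(20)]}
centroids = {c: np.mean(v) for c, v in train.items()}

def classify(x):
    """Assign a new trial to the class with the nearest centroid."""
    f = feature(x)
    return min(centroids, key=lambda c: abs(centroids[c] - f))
```

The deep networks mentioned in the article replace the hand-picked frequency feature with learned ones, but the pipeline shape (record, extract features, classify, act) is the same.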


Artificial intelligence to accelerate economical energy transition, WEF says

#artificialintelligence

Artificial intelligence has "tremendous potential" to support and accelerate a reliable and lowest-cost energy transition, a new report by the World Economic Forum has revealed. Through its high-tech applications, AI can integrate renewable energy resources into the power grid, support an autonomous electricity distribution system and open up new revenue streams for demand-side flexibility, WEF said in its Harnessing Artificial Intelligence to Accelerate Energy Transition report compiled in collaboration with BloombergNEF and Deutsche Energie-Agentur (dena) – the German energy agency. AI can create substantial value for the global energy transition, the report said. Every 1 per cent of additional efficiency in demand will create $1.3 trillion in value between 2020 and 2050 due to reduced investment needs, according to BloombergNEF's net-zero scenario modelling. This could be achieved by enabling greater energy efficiency and flexing demand. "In energy, we are only seeing the beginning ...


AI scans your brain and draws what you see

#artificialintelligence

Russian researchers have used a non-invasive technique that visualizes the brain activity of a person, recreating surprisingly accurate moving images of what our eyes actually see. The method could someday be employed in cognitive disorder treatment or post-stroke rehabilitation devices that are controlled by a patient's thoughts. This is not the first time that scientists have decoded people's brain activity patterns to generate images. Such methods, however, typically rely on functional MRI or surgically implanted neurons, which can be invasive and cumbersome, thereby limiting the potential for everyday applications. The new technique developed by researchers at the Moscow Institute of Physics and Technology and Russian corporation Neurobotics is much more versatile.


Mind-Reading Neural Network Uses Brain Waves to Recreate Human Thoughts

#artificialintelligence

It has long been the stuff of science fiction, but now mind-reading machines may actually be here, and they may not be invasive. Researchers from the Russian corporation Neurobotics and the Moscow Institute of Physics and Technology have found a way to visualize a person's brain activity as actual images without the use of invasive brain implants. The work has the potential to enable new non-invasive post-stroke rehabilitation devices controlled by brain signals as well as novel cognitive disorder treatments. In order to achieve such applications, neurobiologists need to understand how the brain encodes information by studying it in real time, such as when a person is watching a video. This is where the new brain-computer interface developed by the researchers comes in. Using artificial neural networks and electroencephalography, or EEG, a technique for recording brain waves via electrodes placed noninvasively on the scalp, the team was able to visualize what test subjects were looking at in videos in real time.


Robots can read your mind to fix their mistakes

#artificialintelligence

Imagine a robot stacking boxes in a warehouse when it suddenly sees that one box is in the wrong stack. It goes back and puts the container in the right place. How did the machine know it had made a mistake? The robot's human boss didn't punch any codes into a computer to have the robot correct its mistake. The boss didn't say a word.