Where is Memory Information Stored in the Brain?

arXiv.org Artificial Intelligence

Within the scientific research community, memory information in the brain is commonly believed to be stored in the synapse - a hypothesis famously attributed to psychologist Donald Hebb. However, there is a growing minority who postulate that memory is stored inside the neuron at the molecular (RNA or DNA) level - an alternative postulation known as the cell-intrinsic hypothesis, coined by psychologist Randy Gallistel. In this paper, we review a selection of key experimental evidence from both sides of the argument. We begin with Eric Kandel's studies on sea slugs, which provided the first evidence in support of the synaptic hypothesis. Next, we touch on experiments in mice by John O'Keefe (declarative memory and the hippocampus) and Joseph LeDoux (procedural fear memory and the amygdala). Then, we introduce the synapse as the basic building block of today's artificial intelligence neural networks. After that, we describe David Glanzman's study on dissociating memory storage and synaptic change in sea slugs, and Susumu Tonegawa's experiment on reactivating retrograde amnesia in mice using a laser. From there, we highlight Germund Hesslow's experiment on conditioned pauses in ferrets, and Beatrice Gelber's experiment on conditioning in single-celled organisms without synapses (Paramecium aurelia). This is followed by a description of David Glanzman's experiment on transplanting memory between sea slugs using RNA. Finally, we provide an overview of Brian Dias and Kerry Ressler's experiment on DNA transfer of fear in mice from parents to offspring. We conclude with some potential implications for the wider field of psychology.
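Since the review brings up the synapse as the basic building block of today's artificial neural networks, a generic single-neuron sketch may help make the analogy concrete. The function artificial_neuron and its numbers below are purely illustrative and do not come from the paper: the learned weights simply stand in for synaptic strengths.

```python
# Minimal illustration of the analogy above: in an artificial neural network,
# learned weights play the role of synaptic strengths. Generic example only,
# not code from the reviewed paper.
import numpy as np

def artificial_neuron(inputs, weights, bias):
    """Weighted sum of inputs (the 'synapses') passed through a sigmoid nonlinearity."""
    return 1.0 / (1.0 + np.exp(-(np.dot(weights, inputs) + bias)))

x = np.array([0.2, 0.8, 0.5])    # incoming activity from other neurons
w = np.array([1.5, -0.7, 0.3])   # 'synaptic' weights, adjusted during learning
print(artificial_neuron(x, w, bias=0.1))
```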


Analyzing hierarchical multi-view MRI data with StaPLR: An application to Alzheimer's disease classification

arXiv.org Machine Learning

Multi-view data refers to a setting where features are divided into feature sets, for example because they correspond to different sources. Stacked penalized logistic regression (StaPLR) is a recently introduced method that can be used for classification and automatically selecting the views that are most important for prediction. We show how this method can easily be extended to a setting where the data has a hierarchical multi-view structure. We apply StaPLR to Alzheimer's disease classification where different MRI measures have been calculated from three scan types: structural MRI, diffusion-weighted MRI, and resting-state fMRI. StaPLR can identify which scan types and which MRI measures are most important for classification, and it outperforms elastic net regression in classification performance.
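As a rough sketch of the stacking idea behind StaPLR (an illustration under assumptions, not the authors' implementation), each view gets its own penalized logistic base learner, the out-of-fold predicted probabilities become meta-features, and a sparsity-penalized meta-learner decides which views contribute. The function stacked_view_selection and all hyperparameters below are hypothetical; the hierarchical extension described in the abstract would presumably amount to applying the same construction recursively, treating groups of views as higher-level views.

```python
# Illustrative sketch of view stacking in the spirit of StaPLR
# (not the authors' implementation).
import numpy as np
from sklearn.linear_model import LogisticRegressionCV
from sklearn.model_selection import cross_val_predict

def stacked_view_selection(views, y, cv=10):
    """views: list of (n_samples, n_features_v) arrays, one feature set per view."""
    meta_features = []
    for X_v in views:
        base = LogisticRegressionCV(cv=cv, penalty="l2", max_iter=5000)
        # Out-of-fold probabilities keep the meta-learner from overfitting
        # to the base learners' training predictions.
        z_v = cross_val_predict(base, X_v, y, cv=cv, method="predict_proba")[:, 1]
        meta_features.append(z_v)
    Z = np.column_stack(meta_features)  # one meta-feature (column) per view
    # A sparsity-inducing penalty at the meta level drives the weights of
    # uninformative views to zero, i.e. performs view selection.
    meta = LogisticRegressionCV(cv=cv, penalty="l1", solver="saga", max_iter=5000)
    meta.fit(Z, y)
    return meta, meta.coef_.ravel()  # nonzero coefficients mark selected views
```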


Experimental drug can reverse age-related cognitive decline within THREE DAYS, new mouse study finds

Daily Mail - Science & tech

Scientists believe they may have created the first drug that reverses age-related cognitive decline in just days. It works by rebooting the protein production machinery of the cells in our brains after it gets smothered by a stress response from our body. Research conducted on aged mice showed the compound restored memory and mental flexibility as well as rejuvenating brain and immune cells. The team, from the University of California, San Francisco (UCSF), says the findings shed fresh light on age-related brain diseases and could lead to a treatment that stops the progression of symptoms. An integrated stress response (ISR) can halt the protein production machinery of cells if it notices something is wrong, but can also damage cells in our brains if left 'on'. In the past, the team had been studying the integrated stress response, which can be triggered within a cell due to certain conditions.


AI model uses retinal scans to predict Alzheimer's disease

#artificialintelligence

A form of artificial intelligence designed to interpret a combination of retinal images was able to successfully identify a group of patients who were known to have Alzheimer's disease, suggesting the approach could one day be used as a predictive tool, according to an interdisciplinary study from Duke University. The novel computer software looks at retinal structure and blood vessels on images of the inside of the eye that have been correlated with cognitive changes. The findings, appearing last week in the British Journal of Ophthalmology, provide proof-of-concept that machine learning analysis of certain types of retinal images has the potential to offer a non-invasive way to detect Alzheimer's disease in symptomatic individuals. "Diagnosing Alzheimer's disease often relies on symptoms and cognitive testing," said senior author Sharon Fekrat, M.D., retina specialist at the Duke Eye Center. "Additional tests to confirm the diagnosis are invasive, expensive, and carry some risk. Having a more accessible method to identify Alzheimer's could help patients in many ways, including improving diagnostic precision, allowing entry into clinical trials earlier in the disease course, and planning for necessary lifestyle adjustments."


Artificial Intelligence, speech and language processing approaches to monitoring Alzheimer's Disease: a systematic review

arXiv.org Artificial Intelligence

Language is a valuable source of clinical information in Alzheimer's Disease, as it declines concurrently with neurodegeneration. Consequently, speech and language data have been extensively studied in connection with its diagnosis. This paper summarises current findings on the use of artificial intelligence, speech and language processing to predict cognitive decline in the context of Alzheimer's Disease, detailing current research procedures, highlighting their limitations and suggesting strategies to address them. We conducted a systematic review of original research between 2000 and 2019, registered in PROSPERO (reference CRD42018116606). An interdisciplinary search covered six databases spanning engineering (ACM and IEEE), psychology (PsycINFO), medicine (PubMed and Embase) and Web of Science. Bibliographies of relevant papers were screened until December 2019. From 3,654 search results, 51 articles were selected against the eligibility criteria. Four tables summarise their findings: study details (aim, population, interventions, comparisons, methods and outcomes), data details (size, type, modalities, annotation, balance, availability and language of study), methodology (pre-processing, feature generation, machine learning, evaluation and results) and clinical applicability (research implications, clinical potential, risk of bias and strengths/limitations). While promising results are reported across nearly all 51 studies, very few have been implemented in clinical research or practice. We concluded that the main limitations of the field are poor standardisation, limited comparability of results, and a degree of disconnect between study aims and clinical applications. Attempts to close these gaps should support translation of future research into clinical practice.


EEG-based Brain-Computer Interfaces (BCIs): A Survey of Recent Studies on Signal Sensing Technologies and Computational Intelligence Approaches and their Applications

arXiv.org Artificial Intelligence

Brain-Computer Interface (BCI) is a powerful communication tool between users and systems, which enhances the capability of the human brain to communicate and interact with the environment directly. Advances in neuroscience and computer science in the past decades have led to exciting developments in BCI, thereby making BCI a top interdisciplinary research area in computational neuroscience and intelligence. Recent technological advances such as wearable sensing devices, real-time data streaming, machine learning, and deep learning approaches have increased interest in electroencephalographic (EEG) based BCI for translational and healthcare applications. Many people benefit from EEG-based BCIs, which facilitate continuous monitoring of fluctuations in cognitive states under monotonous tasks in the workplace or at home. In this study, we survey the recent literature on EEG signal sensing technologies and computational intelligence approaches in BCI applications, filling the gap left by the absence of a systematic summary of the past five years (2015-2019). Specifically, we first review the current status of BCI and its significant obstacles. Then, we present advanced signal sensing and enhancement technologies to collect and clean EEG signals, respectively. Furthermore, we demonstrate state-of-the-art computational intelligence techniques, including interpretable fuzzy models, transfer learning, deep learning, and their combinations, to monitor, maintain, or track human cognitive states and operating performance in prevalent applications. Finally, we present several innovative BCI-inspired healthcare applications and discuss future research directions in EEG-based BCIs.
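To ground the kind of pipeline this survey covers, the sketch below shows a deliberately minimal EEG classification baseline: band-pass filtering, log band-power features, and a linear discriminant classifier. The band edges, sampling rate, and function names are assumptions for illustration only; the survey itself discusses far richer approaches (interpretable fuzzy models, transfer learning, deep learning).

```python
# Illustrative minimal EEG classification baseline (not a method taken from
# the survey): band-pass filter, log band-power features, linear classifier.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def bandpass(trials, fs, low=8.0, high=30.0, order=4):
    """trials: (n_trials, n_channels, n_samples) raw EEG epochs."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, trials, axis=-1)

def log_bandpower(trials):
    # Variance of a band-passed signal approximates power in that band.
    return np.log(np.var(trials, axis=-1) + 1e-12)  # (n_trials, n_channels)

def fit_bci(trials, labels, fs=250.0):
    """Fit a linear classifier on log band-power features."""
    X = log_bandpower(bandpass(trials, fs))
    clf = LinearDiscriminantAnalysis()
    clf.fit(X, labels)
    return clf
```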


Breathing through your nose BOOSTS your memory

Daily Mail - Science & tech

Breathing through your nose boosts your memory, according to new research. It improves the transfer of the events we experience in our daily lives to our long-term memory bank, say scientists. In the study, participants exposed to certain odours were better at recalling them if their mouths had been taped over. The findings add to a growing body of evidence that inhaling through the nose rather than the mouth enhances cognition. Intriguingly, recent studies have also suggested a fading sense of smell is one of the first signs of Alzheimer's disease.


A brain signature highly predictive of future progression to Alzheimer's dementia

arXiv.org Machine Learning

Early prognosis of Alzheimer's dementia is hard. Mild cognitive impairment (MCI) typically precedes Alzheimer's dementia, yet only a fraction of MCI individuals will progress to dementia, even when screened using biomarkers. We propose here to identify a subset of individuals who share a common brain signature highly predictive of oncoming dementia. This signature was composed of brain atrophy and functional dysconnectivity and discovered using a machine learning model in patients suffering from dementia. The model recognized the same brain signature in MCI individuals, 90% of which progressed to dementia within three years. This result is a marked improvement on the state-of-the-art in prognostic precision, while the brain signature still identified 47% of all MCI progressors. We thus discovered a sizable MCI subpopulation which represents an excellent recruitment target for clinical trials at the prodromal stage of Alzheimer's disease.
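The reported numbers describe a trade-off between positive predictive value (90% of flagged MCI individuals progressed) and coverage (the flagged subgroup contains 47% of all progressors). The sketch below, using a hypothetical risk threshold and synthetic data rather than anything from the study, shows how such a high-confidence subgroup could be scored.

```python
# Illustrative only: score a high-confidence subgroup by thresholding
# predicted risk, then report positive predictive value and the fraction
# of true progressors captured. Threshold and data are not from the study.
import numpy as np

def score_subgroup(risk_scores, progressed, threshold=0.6):
    """risk_scores: predicted risk per MCI individual; progressed: boolean array."""
    flagged = risk_scores >= threshold
    ppv = progressed[flagged].mean() if flagged.any() else float("nan")
    recall = (flagged & progressed).sum() / progressed.sum()
    return ppv, recall

# Synthetic example: raising the threshold pushes PPV up at the cost of recall.
rng = np.random.default_rng(0)
progressed = rng.random(500) < 0.3
risk = np.clip(0.3 * progressed + 0.7 * rng.random(500), 0, 1)
print(score_subgroup(risk, progressed, threshold=0.6))
```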


Violent games like Call of Duty change people's brains

Daily Mail - Science & tech

Violent games often have a bad reputation with many claiming they fuel aggressive and anti-social behaviour.


Human brains 'file' irrelevant thoughts of the past into a 'trash folder'

Daily Mail - Science & tech

Vital clues about how the brain erases long-term memories have been uncovered by researchers. The study reveals how forgetting can be the result of an 'active deletion process' - similar to moving a computer file to a virtual bin - rather than a failure to remember. And the findings may help point towards new ways of tackling memory loss associated with conditions such as Alzheimer's disease and other types of dementia. The findings could also help scientists to understand why some unwanted memories are so long-lasting - such as those of people suffering from post-traumatic stress disorder.