Artificial neural networks can predict how different areas in the brain respond to words

Can artificial intelligence (AI) help us understand how the brain understands language? Can neuroscience help us understand why AI and neural networks are effective at predicting human perception?

Research from Alexander Huth and Shailee Jain from The University of Texas at Austin (UT Austin) suggests both are possible.

In a paper presented at the 2018 Conference on Neural Information Processing Systems (NeurIPS), the scholars described the results of experiments that used artificial neural networks to predict with greater accuracy than ever before how different areas in the brain respond to specific words.

“As words come into our heads, we form ideas of what someone is saying to us, and we want to understand how that comes to us inside the brain,” said Huth, assistant professor of Neuroscience and Computer Science at UT Austin. “It seems like there should be systems to it, but practically, that’s just not how language works. Like anything in biology, it’s very hard to reduce down to a simple set of equations.”

The work employed a type of recurrent neural network called long short-term memory (LSTM) that includes in its calculations the relationships of each word to what came before to better preserve context.
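A single LSTM step can be sketched in a few lines. The toy cell below uses scalar weights purely for illustration (real models use large weight matrices and vector-valued states); the point is only that every gate mixes the current word's input with the previous hidden state, which is how context from earlier words is carried forward.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    # Each gate combines the current input x with the previous hidden
    # state h_prev -- this recurrence is what preserves context.
    i = sigmoid(w["wi"] * x + w["ui"] * h_prev + w["bi"])    # input gate
    f = sigmoid(w["wf"] * x + w["uf"] * h_prev + w["bf"])    # forget gate
    o = sigmoid(w["wo"] * x + w["uo"] * h_prev + w["bo"])    # output gate
    g = math.tanh(w["wg"] * x + w["ug"] * h_prev + w["bg"])  # candidate memory
    c = f * c_prev + i * g        # memory cell blends old and new information
    h = o * math.tanh(c)          # hidden state passed to the next step
    return h, c

# Toy scalar weights, invented for this sketch.
w = {k: 0.5 for k in ("wi", "ui", "bi", "wf", "uf", "bf",
                      "wo", "uo", "bo", "wg", "ug", "bg")}
h, c = 0.0, 0.0
for x in [1.0, -0.5, 0.25]:       # a "sentence" of scalar word codes
    h, c = lstm_step(x, h, c, w)
```

After the loop, `h` summarizes the whole sequence, not just the last word — the property the researchers exploit.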

“If a word has multiple meanings, you infer the meaning of that word for that particular sentence depending on what was said earlier,” said Jain, a PhD student in Huth’s lab at UT Austin. “Our hypothesis is that this would lead to better predictions of brain activity because the brain cares about context.”

It sounds obvious, but for decades neuroscience experiments considered the response of the brain to individual words without a sense of their connection to chains of words or sentences. (Huth describes the importance of doing “real-world neuroscience” in a March 2019 paper in the Journal of Cognitive Neuroscience.)

In their work, the researchers ran experiments to test, and ultimately predict, how different areas in the brain would respond when listening to stories (specifically, the Moth Radio Hour). They used data collected from fMRI (functional magnetic resonance imaging) machines that capture changes in the blood oxygenation level in the brain based on how active groups of neurons are. These measurements serve as a proxy for where language concepts are “represented” in the brain.

Using powerful supercomputers at the Texas Advanced Computing Center (TACC), they trained a language model using the LSTM method so it could effectively predict what word would come next – a task akin to Google’s autocomplete suggestions, and one the human mind is particularly adept at.
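The next-word prediction task itself can be illustrated with a toy count-based model. This is not the authors' LSTM (which learns the statistics rather than counting them); the sketch, with an invented two-sentence corpus, only shows what "predict the next word from context" means.

```python
from collections import defaultdict, Counter

def train_next_word(corpus, context=2):
    """Count which word follows each window of `context` preceding words."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.split()
        for i in range(context, len(words)):
            key = tuple(words[i - context:i])
            counts[key][words[i]] += 1
    return counts

def predict(counts, *context_words):
    """Return the most frequent continuation of the given context."""
    follow = counts.get(tuple(context_words))
    return follow.most_common(1)[0][0] if follow else None

corpus = ["the dog chased the cat", "the dog chased the ball"]
model = train_next_word(corpus, context=2)
print(predict(model, "dog", "chased"))  # "the" follows "dog chased" in both sentences
```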

“In trying to predict the next word, this model has to implicitly learn all this other stuff about how language works,” said Huth, “like which words tend to follow other words, without ever actually accessing the brain or any data about the brain.”

Based on both the language model and fMRI data, they trained a system that could predict how the brain would respond when it hears each word in a new story for the first time.
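The mapping from language-model features to brain responses is commonly framed as a linear encoding model. The minimal sketch below fits a single voxel with ordinary least squares in pure Python; the feature and voxel numbers are invented for illustration, and the actual study fit thousands of voxels from high-dimensional LSTM features rather than one scalar predictor.

```python
def fit_voxel(features, responses):
    """Least-squares slope and intercept mapping one language feature
    (per time point) to one voxel's measured fMRI response."""
    n = len(features)
    mx = sum(features) / n
    my = sum(responses) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(features, responses))
            / sum((x - mx) ** 2 for x in features))
    return beta, my - beta * mx

# Hypothetical numbers: one LSTM-derived feature per time point,
# and one voxel's signal that happens to track it with an offset.
feature = [0.1, 0.4, 0.35, 0.8]
voxel   = [0.2, 0.5, 0.45, 0.9]
beta, intercept = fit_voxel(feature, voxel)

# Once fitted, the model predicts the voxel's response to new stimuli.
predicted = [beta * x + intercept for x in feature]
```

Prediction quality on held-out stories — stories the model never saw during fitting — is what tells the researchers whether a brain area tracks the feature.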

Past efforts had shown that it is possible to localize language responses in the brain effectively. The new research showed, however, that adding a contextual element – in this case, up to 20 preceding words – significantly improved predictions of brain activity. Predictions improved even with the smallest amount of context, and accuracy rose as more context was provided.

“Our analysis showed that if the LSTM incorporates more words, then it gets better at predicting the next word,” said Jain, “which means that it must be including information from all the words in the past.”

The research went further. It explored which parts of the brain were more sensitive to the amount of context included. They found, for instance, that concepts that seem to be localized to the auditory cortex were less dependent on context.

“If you hear the word dog, this area doesn’t care what the 10 words were before that, it’s just going to respond to the sound of the word dog,” Huth explained.

On the other hand, brain areas that deal with higher-level thinking were easier to pinpoint when more context was included. This supports theories of hierarchical language comprehension in the brain.

“There was a really nice correspondence between the hierarchy of the artificial network and the hierarchy of the brain, which we found interesting,” Huth said.

Natural language processing (NLP) has taken great strides in recent years. But when it comes to answering questions, holding natural conversations, or analyzing sentiment in written texts, NLP still has a long way to go. The researchers believe their LSTM-based language model can help in these areas.

The LSTM (and neural networks in general) works by assigning each component (here, each word) a vector of values in a high-dimensional space, so that every component is defined by its relationships to thousands of other components.
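That idea can be illustrated with toy word vectors: each word becomes a point in a space, and similarity between words falls out of the geometry. The vectors below are invented 4-dimensional examples; real models use hundreds or thousands of dimensions learned from data.

```python
import math

def cosine(u, v):
    """Cosine similarity: 1.0 for parallel vectors, near 0 for unrelated ones."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical 4-dimensional embeddings, invented for this sketch.
emb = {
    "dog": [0.9, 0.1, 0.0, 0.3],
    "cat": [0.8, 0.2, 0.1, 0.3],
    "law": [0.0, 0.9, 0.8, 0.1],
}

# Related words end up closer together than unrelated ones.
print(cosine(emb["dog"], emb["cat"]) > cosine(emb["dog"], emb["law"]))  # True
```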

The researchers trained the language model by feeding it tens of millions of words drawn from Reddit posts. Their system then made predictions for how thousands of voxels (three-dimensional pixels) in the brains of six subjects would respond to a second set of stories that neither the model nor the individuals had heard before. Because they were interested in the effects of context length and of individual layers in the neural network, they essentially tested 60 different factors (20 context lengths crossed with three different network layers) for each subject.

All of this leads to computational problems of enormous scale, requiring massive amounts of computing power, memory, storage, and data retrieval. TACC’s resources were well suited to the problem. The researchers used the Maverick supercomputer, which contains both GPUs and CPUs for the computing tasks, and Corral, a storage and data management resource, to preserve and distribute the data. By parallelizing the problem across many processors, they were able to run the computational experiment in weeks rather than years.

“To develop these models effectively, you need a lot of training data,” Huth said. “That means you have to pass through your entire dataset every time you want to update the weights. And that’s inherently very slow if you don’t use parallel resources like those at TACC.”

If it sounds complex, well — it is.

This is leading Huth and Jain to consider a more streamlined version of the system, where instead of developing a language prediction model and then applying it to the brain, they develop a model that directly predicts brain response. They call this an end-to-end system, and it is where Huth and Jain hope to go in their future research. Such a model would improve its performance directly on brain responses: a wrong prediction of brain activity would feed back into the model and spur improvements.

“If this works, then it’s possible that this network could learn to read text or intake language similarly to how our brains do,” Huth said. “Imagine Google Translate, but it understands what you’re saying, instead of just learning a set of rules.”

With such a system in place, Huth believes it is only a matter of time until a mind-reading system that can translate brain activity into language is feasible. In the meantime, they are gaining insights into both neuroscience and artificial intelligence from their experiments.

“The brain is a very effective computation machine and the aim of artificial intelligence is to build machines that are really good at all the tasks a brain can do,” Jain said. “But, we don’t understand a lot about the brain. So, we try to use artificial intelligence to first question how the brain works, and then, based on the insights we gain through this method of interrogation, and through theoretical neuroscience, we use those results to develop better artificial intelligence.

“The idea is to understand cognitive systems, both biological and artificial, and to use them in tandem to understand and build better machines.”

Source:

https://www.tacc.utexas.edu/-/brain-inspired-ai-inspires-insights-about-the-brain-and-vice-versa-
