A $40.5 million collaborative research center headquartered at Seattle’s Allen Institute aims to create high-resolution maps of brains ravaged by Alzheimer’s disease, to trace new paths to early diagnosis and treatment.
Previous versions of the atlas were rendered as lower-resolution 3-D maps. The latest high-resolution maps are fine enough to pinpoint the locations of individual brain cells — which is crucial for interpreting datasets that span thousands or millions of individual cells.
“In the old days, people would define different regions of the brain by eye. As we get more and more data, that manual curation doesn’t scale anymore,” Lydia Ng, senior director of technology at the Seattle-based Allen Institute for Brain Science, explained in a news release. “Just as we have a reference genome sequence, you need a reference anatomy.”
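To make the “reference anatomy” idea concrete, here is a minimal, hypothetical sketch of what a common-coordinate atlas enables: looking up which named brain region each detected cell falls in, rather than delineating regions by eye. The region names, IDs, voxel size and coordinates below are invented for illustration and are not the Allen atlas’s actual data format.

```python
# Toy annotation volume: voxel index -> region ID (anything absent = 0).
# In a real atlas this would be a dense 3-D labeled volume.
annotation = {}
for x in range(2, 5):
    for y in range(2, 5):
        for z in range(2, 5):
            annotation[(x, y, z)] = 101   # pretend primary visual cortex
annotation[(7, 6, 8)] = 202               # pretend motor-cortex voxel

region_names = {0: "outside", 101: "VISp", 202: "MOp"}  # illustrative only

def region_of(cell_um, voxel_size_um=100.0):
    """Map a cell's (x, y, z) position in micrometers to a region name."""
    voxel = tuple(int(c // voxel_size_um) for c in cell_um)
    return region_names[annotation.get(voxel, 0)]

print(region_of((250.0, 300.0, 400.0)))  # lands in the VISp block
print(region_of((700.0, 650.0, 820.0)))  # lands in the MOp voxel
```

The point of the sketch is scale: once cell positions and region labels share one coordinate frame, assigning millions of cells to regions is a lookup, not manual curation.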
Seattle’s Allen Institute is heading into a new phase of research into neuroscience — a phase that includes reorganizing its current activities as well as adding new ones.
The Allen Institute for Brain Science, which is the largest division under the institute’s umbrella, was established by Microsoft co-founder Paul Allen in 2003 and has continued on its mission since Allen’s death in 2018. It’s grown to more than 300 scientists and staff members who work in two broad research areas.
One program, known as Cell Types, focuses on mapping out a “periodic table” of brain cells. The Allen Institute’s new 16-year plan calls for the Allen Institute for Brain Science to focus solely on studying brain cell types and neural connectivity.
The second program, known as MindScope, seeks to understand how the brain’s neural circuits produce the sense of vision. That field of study, along with the Allen Brain Observatory, will transition out of the Allen Institute for Brain Science to become a separate program at the Allen Institute.
A new division, due for launch in 2022, will focus on research related to neural computation and dynamics.
For years, neuroscientists have been monitoring the brain activity of mice as they looked at a wide range of images — including the film-noir classic “Touch of Evil” — in hopes of discovering deep insights about the workings of the visual system. Now they’ve come upon a plot twist worthy of director Orson Welles himself.
The latest results, reported today in the journal Nature Neuroscience by researchers at Seattle’s Allen Institute for Brain Science and at the University of Washington, suggest that more than 90% of the neurons in the visual cortex don’t work the way scientists thought.
“We thought that there are simple principles according to which these neurons process visual information, and those principles are all in the textbooks,” Christof Koch, the brain institute’s chief scientist and president, said in a news release. “But now that we can survey tens of thousands of cells at once, we get a more subtle — and much more complicated — picture.”
Not that there’s anything wrong with that.
“To me, that’s the business. In some sense, that’s the exciting thing,” Michael Buice, an associate investigator at the Allen Institute and one of the study’s lead authors, told GeekWire. “We’re in a more interesting place than we thought.”
Researchers at Seattle’s Allen Institute say a new and improved map of the mouse brain reveals not only how different regions are connected, but how those connections are ordered in a hierarchical way.
They add that the mapping techniques behind their study, which was published today by the journal Nature, could shed light on how diseases like Alzheimer’s, Parkinson’s or schizophrenia tangle up connections in the human brain.
Like that earlier version of the Allen Mouse Brain Connectivity Atlas, the newly published map was created by injecting glow-in-the-dark viruses into the brains of mice, and then tracing how the fluorescent labels traveled along connections between different types of brain cells.
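Tracer experiments like these are typically summarized as a directed connectivity matrix: how strongly each source region projects to each target region. As a toy illustration of how an ordering could fall out of such a matrix, the sketch below ranks regions by net outgoing minus incoming projection strength. The published study derived its hierarchy from more detailed anatomical criteria (such as layer-specific projection patterns); the region names and weights here are invented.

```python
regions = ["VISp", "VISl", "VISam"]           # hypothetical region names
strength = {                                   # made-up projection weights
    ("VISp", "VISl"): 0.9, ("VISl", "VISp"): 0.2,
    ("VISl", "VISam"): 0.7, ("VISam", "VISl"): 0.1,
    ("VISp", "VISam"): 0.5, ("VISam", "VISp"): 0.1,
}

def hierarchy_scores(regions, strength):
    """Net projection score per region: total outgoing minus incoming."""
    out_total = {r: 0.0 for r in regions}
    in_total = {r: 0.0 for r in regions}
    for (src, dst), w in strength.items():
        out_total[src] += w
        in_total[dst] += w
    return {r: out_total[r] - in_total[r] for r in regions}

scores = hierarchy_scores(regions, strength)
# Regions that mostly send projections rank low in the hierarchy;
# regions that mostly receive rank high.
ordered = sorted(regions, key=scores.get, reverse=True)
print(ordered)
```

With these toy weights, the primary area (a net sender) sits at the bottom of the hierarchy and the downstream areas stack above it.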
The Neuropixels system, developed by an international collaboration that includes the Allen Institute, could be adapted to record brain activity in human patients as well, said Josh Siegle, a senior scientist at the institute who works with the probes.
“The application I’m most interested in is decoding the communication patterns of the brain, and really understanding how information is transmitted between regions,” Siegle told GeekWire. “What are the transmission protocols?”
Neuropixels has already produced insights into the brain’s inner workings, Siegle said. This week, the institute is due to publish findings on the bioRxiv preprint server that confirm hierarchical patterns of connectivity in the brain.
Do animals possess consciousness? Can consciousness be uploaded into a computer? Can we measure objectively whether someone is conscious or not?
Those may sound like deep, imponderable questions — but in a newly published book, “The Feeling of Life Itself,” neuroscientist Christof Koch actually lays out some answers: Yes, no … and yes, scientists are already testing a method for measuring consciousness, with eerie implications.
A study led by researchers at Seattle’s Allen Institute for Brain Science lays out a “parts list” for the brain, including a detailed look at the differences between the parts for human brains and mouse brains.
They say the genetic results, published today in the journal Nature, suggest that relying on mice to study how human brains work could lead neuroscientists down blind alleys.
“The answer may be that you have to go to species that are more similar to humans,” Ed Lein, an investigator at the Allen Institute who’s also affiliated with the University of Washington, told GeekWire.
It’s not that the basic parts list is all that different: The researchers found that most of the 75 different cell types identified in the human brain, based on genetic makeup, are found in the mouse brain as well.
That commonality applies even to cells that the scientists had previously thought might be uniquely human, such as the “rosehip neurons” discovered last year.
But there are significant differences in the way those genes are expressed — differences that have developed over 75 million years of evolution. “The genes themselves haven’t really changed, but their regulation can change a lot,” Lein said.
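The distinction Lein draws — same genes, different regulation — is usually quantified by comparing expression levels for matched cell types across species. The sketch below computes a log2 fold change for a couple of genes; the gene names and transcript counts are invented purely to illustrate that a conserved gene can still be expressed at very different levels.

```python
import math

# Mean transcript counts for a matched cell type (toy numbers).
expression = {
    "human": {"GENE_A": 120.0, "GENE_B": 3.0},
    "mouse": {"GENE_A": 10.0,  "GENE_B": 2.5},
}

def log2_fold_change(gene, expr):
    """log2(human / mouse) expression ratio for one gene."""
    return math.log2(expr["human"][gene] / expr["mouse"][gene])

for gene in ("GENE_A", "GENE_B"):
    print(gene, round(log2_fold_change(gene, expression), 2))
```

Here GENE_A would show a roughly twelve-fold (log2 ≈ 3.58) difference in expression despite being present in both species, while GENE_B is expressed at nearly identical levels — the kind of regulatory divergence the study describes.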
Neuroscientists have demonstrated a computerized system that can determine in real time what’s being said, based on brain activity rather than actual speech.
The technology is being supported in part by Facebook Reality Labs, which is aiming to create a non-invasive, wearable brain-to-text translator. But in the nearer term, the research is more likely to help locked-in patients communicate through thought.
“They can imagine speaking, and then these electrodes could maybe pick this up,” said Christof Koch, chief scientist and president of the Seattle-based Allen Institute for Brain Science, who was not involved in the study.
“Real-time processing of brain activity has been used to decode simple speech sounds, but this is the first time this approach has been used to identify spoken words and phrases,” UCSF postdoctoral researcher David Moses, the study’s principal investigator, said in a news release.
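At its simplest, decoding of this kind means matching an incoming neural feature vector against learned per-word templates. The toy sketch below uses nearest-template matching by Euclidean distance; the actual UCSF system used far more sophisticated statistical models, and the words, feature vectors and dimensions here are invented.

```python
import math

# Mean neural feature vector per word, learned from training data (toy).
templates = {
    "yes":   [0.9, 0.1, 0.2],
    "no":    [0.1, 0.8, 0.3],
    "hello": [0.2, 0.2, 0.9],
}

def decode(features, templates):
    """Return the word whose template is closest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda w: dist(features, templates[w]))

# A new recording arrives; classify it against the templates.
print(decode([0.85, 0.15, 0.25], templates))
```

In a real-time setting this classification step would run continuously on a stream of features extracted from the electrode recordings, which is what lets the system identify words and phrases as they are (imagined to be) spoken.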