
QLS Seminar Series - Carsen Stringer

Tuesday, October 3, 2023, 12:00 to 13:00

Making sense of large-scale neural and behavioral data

Carsen Stringer, Howard Hughes Medical Institute
Tuesday, October 3, 12-1 pm
Zoom Link:
In Person: 550 Sherbrooke, Room 189

Abstract: Advances in protein engineering and microscopy have enabled routine recordings of over 50,000 neurons simultaneously in the mouse cortex at a sampling rate of 3 Hz, or ~8,000 neurons at a rate of 30 Hz. What properties does this large-scale activity have? One popular hypothesis is that this neural activity is “simple” and low-dimensional, and that we can summarize even 50,000-neuron recordings with just a few numbers at any one time. Many analytical tools and theories have been developed based on this assumption. However, in our large-scale recordings we found that neural responses to visual stimuli were high-dimensional, exploring many diverse patterns of activity that could not be reduced to a few numbers. This high-dimensional structure cannot be easily captured by existing data visualization methods. We therefore developed an embedding algorithm called Rastermap, which captures complex temporal and highly nonlinear relationships between neurons and provides useful visualizations by assigning each neuron to a location in the embedding space. Within neural datasets from virtual reality tasks, we found unique subpopulations of neurons encoding abstract elements of decision-making, the environment, and behavioral states. Further, we found that ongoing “spontaneous” activity in cortex was high-dimensional, representing the moment-to-moment behaviors of the mouse. To interrogate behavioral representations in the mouse brain, we developed the fast Facemap network for tracking 13 distinct points on the mouse face recorded from arbitrary camera angles. Using Facemap, we found that the neuronal activity clusters that were highly driven by behaviors were more spatially spread out across cortex. We also found that the deep keypoint features inferred by the model had time-asymmetrical state dynamics that were not apparent in the raw keypoint data. In summary, Facemap provides a stepping stone towards understanding the function of brainwide neural signals and their relation to behavior.
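
Readers who want to try Rastermap on their own recordings can use the open-source Python package from the Stringer lab (github.com/MouseLand/rastermap). The sketch below is a minimal, illustrative workflow only: the parameter names and values follow the package's documented example but are not taken from the talk, and the data file name is a placeholder.

    import numpy as np
    import matplotlib.pyplot as plt
    from scipy.stats import zscore
    from rastermap import Rastermap

    # Load a neurons-by-timepoints activity matrix (placeholder file name).
    spks = np.load("spks.npy").astype("float32")
    spks = zscore(spks, axis=1)  # z-score each neuron across time

    # Fit a 1D Rastermap embedding; parameter values here are illustrative,
    # following the package's example usage rather than the talk itself.
    model = Rastermap(n_PCs=200, n_clusters=100,
                      locality=0.75, time_lag_window=5).fit(spks)
    isort = model.isort  # ordering of neurons along the embedding

    # Visualize the raster with neurons sorted by their embedding position.
    plt.figure(figsize=(12, 5))
    plt.imshow(spks[isort], vmin=0, vmax=1.5, cmap="gray_r", aspect="auto")
    plt.xlabel("time points")
    plt.ylabel("neurons (sorted by Rastermap)")
    plt.show()

Sorting the raster by the learned embedding places neurons with similar, possibly nonlinearly related activity next to each other, which is what makes structure in large recordings visible by eye.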
