Research Gallery

The Burwell Lab uses electrophysiology, neuroanatomy, optogenetics, and behavior to understand how structures in the rodent hippocampal system contribute to memory and other cognitive functions. Panel A, left, shows a schematic of a rat about to make a correct choice in a two-dimensional object discrimination task in the Floor Projection Maze, a novel apparatus in which visual stimuli are back-projected onto the maze floor. The spatial plot to the right and the perievent histogram below show a postrhinal neuron that fires only when the animal is about to make a correct right choice.
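
A perievent histogram like the one in Panel A averages a neuron's spike counts in time bins aligned to a behavioral event. Below is a minimal sketch of that standard computation, assuming spike and event times in seconds; the function name and parameters are illustrative, not the lab's actual analysis code.

```python
import numpy as np

def perievent_histogram(spike_times, event_times, window=(-1.0, 1.0), bin_size=0.05):
    """Average spike counts around each event into a firing-rate histogram."""
    n_bins = int(round((window[1] - window[0]) / bin_size))
    edges = np.linspace(window[0], window[1], n_bins + 1)
    counts = np.zeros(n_bins)
    for t in event_times:
        aligned = spike_times - t                              # spikes relative to this event
        in_window = aligned[(aligned >= window[0]) & (aligned < window[1])]
        counts += np.histogram(in_window, bins=edges)[0]
    rate = counts / (len(event_times) * bin_size)              # mean firing rate in spikes/s
    return edges[:-1], rate
```

For a choice-selective cell like the one shown, such a histogram would peak only around the relevant event (here, correct right choices) and stay flat elsewhere.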

Panel B shows channelrhodopsin-2 (ChR2) labeling of fibers and DAPI labeling of neurons in the postrhinal cortex. We have successfully modulated object discrimination using light stimulation of ChR2-transduced cells in behaving rats.

When people blame another person, they interpret the person’s behavior in light of a system of norms. Researchers in Prof. Bertram Malle's Social Cognitive Science Research Center examine these behavior interpretations and especially the inferences about the person’s mental states before, during, and even after performing the behavior.

According to the researchers' theory of blame, a social perceiver engages in many processing steps before blaming another person: The perceiver first detects a negative event, determines whether an agent caused it, and judges whether the agent acted intentionally. This intentionality judgment bifurcates subsequent processing: If the behavior seems intentional, the perceiver considers the agent’s reasons (which may be justifying or aggravating); if the behavior seems unintentional, the perceiver considers whether the agent should have and could have prevented the outcome. Thus, blame is far more than anger; it includes a complex analysis of the other person's mind and behavior.
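
The processing steps above amount to a branching decision procedure. The toy sketch below makes that control flow explicit; the field names and blame categories are invented for illustration and are not the researchers' actual model.

```python
def blame_judgment(event):
    """Staged blame sketch: detect negativity, then causality, then intentionality."""
    if not event["negative"]:
        return "no blame"                      # step 1: no negative event detected
    if event.get("agent_caused") is not True:
        return "no blame"                      # step 2: no agent caused it
    if event["intentional"]:                   # step 3: intentionality bifurcates processing
        # step 4a (intentional): weigh the agent's reasons
        return "reduced blame" if event.get("justified_reasons") else "full blame"
    # step 4b (unintentional): obligation and capacity to prevent the outcome
    if event.get("should_have_prevented") and event.get("could_have_prevented"):
        return "blame for negligence"
    return "no blame"
```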

How do we integrate higher-order cognitive processes with actions? Using a combination of techniques including behavioral (psychophysics) investigations and neuroimaging, Joo-Hyun Song’s lab investigates how higher-order cognitive processes such as visual memory, attention, target selection, and unconscious representations interact with visually guided actions such as saccadic eye movements and reaching movements. The figure shows curved reach trajectories (red, blue, and green) in an odd-colored target search task (inset), contrasted with those in no-distractor cases (cyan). Because the target and distractors compete during visual search, reach trajectories often swerve toward a distractor and are then quickly redirected to the target. Changes in the direction of curved reach trajectories reveal the current locus of attention as well as the time course of target-distractor competition. Close examination of reach trajectories thus offers insight into a wide range of dynamic internal representations. From Joo-Hyun Song’s Perception, Action, and Cognition Lab.
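
One simple way to quantify the swerve described above is the maximum perpendicular deviation of the reach path from the straight start-to-target line. The sketch below uses that metric as an assumption of ours; it is not necessarily the lab's own measure.

```python
import numpy as np

def max_deviation(trajectory):
    """Largest signed perpendicular deviation of a 2-D reach path
    from the straight start-to-end line (an index of distractor pull)."""
    traj = np.asarray(trajectory, dtype=float)
    start, end = traj[0], traj[-1]
    line = end - start
    line /= np.linalg.norm(line)               # unit vector along the direct path
    rel = traj - start
    # signed perpendicular distance via the 2-D cross product
    dev = rel[:, 0] * line[1] - rel[:, 1] * line[0]
    return dev[np.argmax(np.abs(dev))]
```

A trajectory that bows toward a distractor before correcting yields a large deviation; a direct, no-distractor reach yields a value near zero.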

How do we search our memory for the knowledge we need, when we need it? Analogous to searching the web or a large bibliographic database, searching memory requires generating effective retrieval plans, while also sorting through what has been retrieved to locate the most relevant information. The frontal lobes are among the brain structures critical for this type of strategic memory search. Research in David Badre’s lab investigates the mechanisms by which the frontal cortex helps us retrieve information and make use of it when planning, making decisions, and learning about our world.

The image is a stylized detail from a map of brain activity in the human frontal lobe during strategic memory search. The depicted division between mid-ventrolateral prefrontal cortex (VLPFC) and anterior VLPFC reflects a putative functional division of labor between regions that guide memory retrieval and regions that select relevant information for further processing. From the Badre Lab.

Bill Warren's Virtual Environment Navigation Lab (VENLab) puts people in a large immersive virtual environment to test theories of perception and action.

Which variables influence control over learning and action? When given a cue to induce a decision among multiple conflicting alternatives, people often hesitate, and then either choose the option that has worked best for them in the past, or explore alternatives to determine whether they might produce a better outcome. Research in the Frank Lab studies the temporal dynamics of these processes and individual differences in them. The image depicts brain wave activity over different parts of frontal cortex (mapped onto the scalp) and how it evolves over time as participants are presented with a cue, make a response, and receive outcome feedback. Bright colors signify increased oscillatory activity at a particular time and frequency; these signals are quantified and linked to particular decision computations. From Michael Frank's Laboratory of Neural Computation and Cognition.
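
Oscillatory power at a particular time and frequency is commonly quantified by convolving the signal with complex Morlet wavelets. Below is a minimal sketch of that standard decomposition; it is not the lab's actual pipeline, and the sampling rate, cycle count, and normalization are illustrative choices.

```python
import numpy as np

def morlet_power(signal, fs, freqs, n_cycles=5):
    """Time-frequency power via complex Morlet wavelets (one row per frequency)."""
    power = np.empty((len(freqs), len(signal)))
    for i, f in enumerate(freqs):
        sigma = n_cycles / (2 * np.pi * f)            # wavelet width in seconds
        t = np.arange(-4 * sigma, 4 * sigma, 1 / fs)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma**2))
        wavelet /= np.sum(np.abs(wavelet))            # crude amplitude normalization
        conv = np.convolve(signal, wavelet, mode="same")
        power[i] = np.abs(conv) ** 2                  # instantaneous power at frequency f
    return power
```

Plotting such a power matrix with time on one axis and frequency on the other produces exactly the kind of time-frequency image described above, with bright colors where power is high.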

How does brain injury provide insights into the processes involved in speaking and understanding? Aphasia studies show that lexical and speech processing recruits a broad neural system reflecting multiple stages of processing, including acoustic-phonetic analysis, mapping sound to lexical form and meaning, selecting appropriate words, and integrating and modulating these multiple sources of information. Research in Sheila Blumstein’s lab uses behavioral and functional neuroimaging paradigms with neurologically healthy individuals and individuals with aphasia to examine the mechanisms involved in speech and lexical processing and the neural systems underlying them.

The figure shows a lesion in the right cerebellum that paradoxically led to resolution of foreign accent syndrome (FAS), a disorder in which a patient speaks with what is perceived as a foreign accent, typically following a left frontal lesion. The deficit typically affects the prosody and rhythm of speech, not the articulation of sound segments, providing important clues about the processing stages involved in speech production. From Sheila Blumstein's Lab.

How do we make decisions and learn from experience? Research in Michael Frank's lab uses computational modeling to understand the neurocognitive dynamics involved in learning, decision making, and cognitive control, and how these are altered by disease and treatment. The image depicts a patient with Parkinson's disease treated with a stimulator surgically implanted in the basal ganglia. Surrounding the patient is a computer model of the associated neural circuit, with a snapshot of neuronal firing rates reflected by height and color. The model simulates the effects of the neurochemical dopamine (DA), which is deficient in the disease, and of deep brain stimulation (DBS). The models are tested and refined via synergistic experiments using neuropsychological, pharmacological, genetic, and electrophysiological tools. From Michael Frank's Laboratory of Neural Computation and Cognition.

Everyday tasks can be planned at different levels of abstraction. For example, consider the task of making a cup of coffee. “Making coffee” is an abstract goal that could be performed in different ways and in different environments. However, one could also plan this task in terms of a set of more concrete subgoals, like “grind beans” and “pour water”. Indeed, an individual instance of making coffee could even be described in terms of a specific sequence of movements. Research in David Badre’s lab explores how the brain can internally control thought and action at different levels of abstraction in order to achieve a desired outcome.
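
The levels of abstraction described above can be pictured as a goal hierarchy that expands, level by level, into concrete actions. The toy sketch below uses an invented coffee-making plan; the structure and names are illustrative only.

```python
def expand(plan, node):
    """Recursively expand an abstract goal into its concrete leaf actions."""
    children = plan.get(node)
    if children is None:
        return [node]                 # a leaf: a specific movement or response
    actions = []
    for child in children:
        actions.extend(expand(plan, child))
    return actions

# A hypothetical two-level decomposition of the "make coffee" goal.
coffee = {
    "make coffee": ["grind beans", "pour water"],
    "grind beans": ["open grinder", "press button"],
    "pour water": ["fill kettle", "tilt kettle"],
}
```

Calling `expand(coffee, "make coffee")` flattens the abstract goal into its ordered concrete steps, mirroring the rostral-to-caudal gradient of control the lab studies.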

The image depicts brain activation going from rostral (in the front) prefrontal cortex back toward motor cortex as control over action goes from highly abstract to more concrete rules that indicate specific responses. From the Badre Lab.

For decades, carefully logging data about how mice go through the motions of their daily routines has been a tedious staple of behavioral and neuroscience research:

  Hour 2, minute 27: mouse 4 is sleeping
  Hour 3, minute 12: mouse 7 is eating

… and so on. It’s a task most people would happily cede to automation. Thomas Serre and a team of colleagues at the Massachusetts Institute of Technology and the California Institute of Technology have created a new computer system that is as accurate as people in identifying mouse behaviors in videos. What’s more, the team is making the fully customizable open-source software available for free. Given standard camcorder footage of a mouse, the software automatically identifies the mouse’s behavior frame by frame. From the Serre Lab.
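
A system that labels behavior frame by frame typically needs temporal cleanup, because raw per-frame classifier output is noisy. The actual system is considerably more sophisticated; the toy sketch below shows only the simpler idea of majority-vote smoothing over a sliding window, with an invented function name and labels.

```python
from collections import Counter

def smooth_labels(frame_labels, window=15):
    """Replace each frame's label with the most common label in a
    sliding window around it, suppressing single-frame blips."""
    half = window // 2
    smoothed = []
    for i in range(len(frame_labels)):
        lo, hi = max(0, i - half), min(len(frame_labels), i + half + 1)
        smoothed.append(Counter(frame_labels[lo:hi]).most_common(1)[0][0])
    return smoothed
```

A one-frame "sleep" blip inside a long run of "eat" frames, for instance, gets voted away by its neighbors.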