Predictive Processing

The brain is not a passive receiver of sensory information. Rather, our brains guide us through the world by estimating, moment by moment, the sensory stimulation they will receive. As we begin our day at work, we know the force with which to turn the office door handle and where the computer will be positioned on the desk. Evidence from action, perception and social cognition shows that the brain achieves this prediction using contextual expectations based on internal models. Our research uses functional magnetic resonance imaging (fMRI) of primary visual cortex (V1) to investigate how the brain predicts the environment in perception.

High-Resolution 7T fMRI

Laminar fMRI in humans offers a spatial resolution that allows us to ask novel questions about cortical function. In the cortex, feedforward, lateral, and feedback projections terminate in distinct cortical layers. We measure processing in sensory areas to understand how cortical feedback pathways carry the brain's predictions about the world.

Multiscale, Multimethod Neuroscience

In our Human Brain Project collaboration, we integrate measurements of human brain function at the spatial resolution of cortical layers achievable at 7T with complementary animal and modelling work, gaining insights into brain organization and function that only such a multiscale approach makes possible.

Artificial Intelligence

We develop approaches for modelling early visual cortex by testing how well recent AI models fit brain activity. For example, we recently published a paper, “A self-supervised deep neural network for image completion resembles early visual cortex fMRI activity patterns for occluded scenes”, in which we compared a recent AI method to fMRI data and showed that architectures trained with less supervision matched the fMRI data better than supervised models. In addition, we develop new brain imaging tools with the goal of automating fMRI data analysis. We have released two tools for MRI segmentation, at 3T and 7T, with related publications.
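One common way to compare a network's representations to fMRI activity patterns is representational similarity analysis (RSA). The sketch below is illustrative only, with random placeholder data standing in for measured voxel responses and model-layer activations; the array shapes and names are assumptions, not the published pipeline.

```python
# Minimal RSA sketch: compare the representational geometry of a model
# layer to fMRI voxel patterns. All data here are random placeholders.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_stimuli = 40

# Hypothetical data, one row per stimulus (e.g. per occluded scene):
fmri_patterns = rng.standard_normal((n_stimuli, 500))    # stimuli x voxels
model_features = rng.standard_normal((n_stimuli, 2048))  # stimuli x units

# Representational dissimilarity matrices (RDMs): pairwise correlation
# distance between stimulus-evoked patterns, as condensed vectors.
fmri_rdm = pdist(fmri_patterns, metric="correlation")
model_rdm = pdist(model_features, metric="correlation")

# Second-order comparison: Spearman correlation between the two RDMs.
# A higher rho means the model's representational geometry is more
# similar to that of the measured brain region.
rho, p = spearmanr(fmri_rdm, model_rdm)
print(f"model-brain RSA: rho = {rho:.3f}")
```

Because the comparison is made between dissimilarity structures rather than raw activations, RSA sidesteps the need for a voxel-to-unit mapping, which is why it is widely used for model-brain comparisons.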

Multisensory Processing

We are interested in how the brain integrates information from different sensory systems into multimodal representations of the world. For example, we have demonstrated that “the capture of auditory motion by vision is represented by an activation shift from auditory to visual motion cortex”, and that abstract information extracted from auditory stimuli is processed in visual cortex.

Reorganisation of Retinotopic Maps

We were fortunate to work with a 10-year-old girl born with half a brain to investigate how she is able to see normally through one eye. The girl, from Germany, has both fields of vision in one eye and is the only known case of its kind in the world. See here for the original publication and here for news coverage.