Student Speakers

We are very excited to announce the student speakers for this year’s symposium! The four student presentations cover a wide range of topics using diverse neuroimaging techniques, and all capture the advancements in this rapidly changing field. The student presentations will be organized into two parallel sessions, so check out the short introductions below and note down the talks you want to attend!

*presenting authors are bolded

Session 1: 11:10-11:30

Decoding meaning composition during natural listening

Ryan Law, Hugo Weissbart, Andrea E. Martin

We can immediately imagine what the phrase “pink banana” describes despite having never seen a pink banana before. This unique capacity of language to generate infinitely many more complex expressions by flexibly combining finite word meanings is a hallmark of human semantic competence. Our research aims to extend our understanding of the neural bases of this capacity by using naturalistic stimuli.

Session 2: 11:40-12:00

Transcranial ultrasonic stimulation of human amygdala to modulate fear learning

Sjoerd Meijer, Tinke van Buijtene, Jesse Lam, Benjamin Kop, Linda de Voogd, Karin Roelofs, and Lennart Verhagen

Tinke works in the Brain Stimulation Lab at the Donders Centre for Cognition and is part of a research project involving Transcranial Ultrasound Stimulation (TUS). TUS is a novel non-invasive neuromodulation technique that can achieve focal modulation of deep brain structures such as the amygdala. Combining TUS with well-validated fear conditioning procedures, this project aims to investigate the role of the amygdala in threat learning processes.

Domain-specific and domain-general neural network engagement during human-robot interactions

Ann Hogenhuis, Ruud Hortensius

We aimed to map the similarities and differences in domain-specific and domain-general neural network engagement during real, embodied, and recursive interactions with a human or robotic agent. Employing whole-brain, region-of-interest, and functional connectivity analyses, we mapped functional activity and connectivity across networks associated with domain-general and domain-specific, as well as social and non-social, cognitive processes. Our results suggest that interactions with an artificial agent such as a social robot might lead to only subtle differences in response profiles of neural networks at a perceptual but not cognitive level.

The goal of human vision changes rapidly when viewing scenes

Vanshika Bawa, Agnessa Karapetian, Kshitij Dwivedi, Radoslaw M. Cichy, Raphael Leuner, Martin Pflaum, Gemma Roig

As you look around, your brain first registers changes in contrast and brightness, and within the next few hundred milliseconds you have already inferred objects and their relationships to one another. To understand how these inferences are formed so rapidly from the retinal image, it is vital to uncover what happens in between. EEG makes this possible, and I will show you how.