Research interests

Dynamics of emotional experiences in the brain

Humans uniquely experience intense feelings in response to music, films, and narratives, such as nostalgia, pleasurable sadness, and chills. Using a combination of neuroimaging, psychophysiological measures, and continuous self-reports, my work aims to uncover the brain patterns associated with these emotions and how those patterns change over time.


To study emotional responses in a more ecologically valid manner, I use naturalistic stimuli (full-length pieces of music and films) together with model-free, data-driven analytical techniques to quantify patterns of response (see Sachs et al., 2019). As an example, on the right is a video of changes in brain signal over time as a person listens to a piece of music in an fMRI scanner.
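
To give a concrete flavor of this kind of model-free analysis, here is a minimal sketch in Python of a sliding-window correlation between a brain signal and a continuous self-report rating. The variables roi_signal and ratings are hypothetical stand-ins for a preprocessed fMRI region-of-interest time series and a valence rating resampled to one value per TR, not data from the studies above.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical inputs: a preprocessed fMRI ROI time series and a
# continuous self-report rating, both resampled to one value per TR.
rng = np.random.default_rng(0)
roi_signal = rng.standard_normal(300)   # e.g., auditory cortex, 300 TRs
ratings = rng.standard_normal(300)      # continuous valence ratings

def sliding_correlation(x, y, window=20):
    """Pearson correlation between two time series in sliding windows,
    a simple model-free way to track how stimulus-driven coupling
    changes over the course of a piece of music."""
    n = len(x) - window + 1
    return np.array([pearsonr(x[i:i + window], y[i:i + window])[0]
                     for i in range(n)])

dynamic_r = sliding_correlation(roi_signal, ratings)
print(dynamic_r.shape)  # one correlation value per window
```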


Development of and individual variation in neural network structure

How do early childhood experiences influence the structure and function of the human brain? Using both resting-state and task-based fMRI as well as structural imaging, we can assess how particular brain networks related to cognitive and social functioning are enhanced by early childhood training, such as musical training (see Sachs et al., 2017), and how they account for individual variation in affective experience.


In particular, using diffusion tensor imaging, I have shown that differences in white matter connectivity between sensory processing and emotion processing areas of the brain can account for individual differences in reward sensitivity to music (see Loui et al., 2017, and Sachs et al., 2016).
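
As an illustration of the individual-differences logic, the minimal sketch below correlates a tract-wise fractional anisotropy (FA) measure with a musical reward sensitivity score. The variable names and simulated values are assumptions for demonstration, not the actual data or pipeline from Loui et al. (2017) or Sachs et al. (2016).

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical data: mean fractional anisotropy (FA) of a white matter
# tract connecting auditory and reward-related regions, extracted per
# participant from diffusion imaging, plus each participant's musical
# reward sensitivity score (e.g., a questionnaire total).
rng = np.random.default_rng(1)
tract_fa = rng.uniform(0.3, 0.6, size=40)               # 40 participants
reward_scores = 50 + 20 * tract_fa + rng.normal(0, 2, 40)

# Individual-differences analysis: does tract microstructure track
# how rewarding participants find music?
r, p = pearsonr(tract_fa, reward_scores)
print(f"r = {r:.2f}, p = {p:.3f}")
```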


Upcoming work will assess how these networks are altered in clinical populations and in children with neurodevelopmental disorders.

[Figure: white matter connectivity and musical anhedonia (see Loui et al., 2017)]

Musical emotions and machine learning

Machine learning algorithms applied to neuroimaging data allow researchers to predict various mental states and decode representational content from spatially distributed patterns of fMRI signal. Using such approaches, I work to uncover the neural representations of emotions and how these representations vary based on different structural and featural components of the stimulus. The figure on the right shows the results of a multivoxel pattern analysis (MVPA) that identified spatial patterns representing auditory emotions across music and the human voice (see Sachs et al., 2018).
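
The cross-decoding step can be sketched as follows: train a classifier on emotion labels from one stimulus type and test it on the other, so that above-chance accuracy implies a shared spatial code. Everything here (trial counts, voxel counts, the simulated arrays) is a hypothetical stand-in rather than the actual pipeline from Sachs et al. (2018).

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical multivoxel data: trials x voxels activation patterns
# with emotion labels, split by stimulus type (music vs. voice).
rng = np.random.default_rng(2)
X_music = rng.standard_normal((60, 500))   # 60 music trials, 500 voxels
y_music = rng.integers(0, 3, 60)           # 3 emotion categories
X_voice = rng.standard_normal((60, 500))
y_voice = rng.integers(0, 3, 60)

# Cross-decoding: train on music, test on voice. Above-chance accuracy
# would suggest a spatial code for emotion shared across stimulus types.
clf = make_pipeline(StandardScaler(), LinearSVC())
clf.fit(X_music, y_music)
print("cross-decoding accuracy:", clf.score(X_voice, y_voice))
```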


In addition, in collaboration with the SAIL Lab at the USC Viterbi School of Engineering, I work on developing new and improved models for predicting affective experiences in response to music using multimodal datasets of sensory, behavioral, psychophysiological, and neural signals (see Greer et al., 2019, and Ma et al., 2019).
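
A minimal sketch of this kind of multimodal prediction, assuming simple early fusion of feature streams and a regularized linear baseline; the feature matrices and ratings below are simulated placeholders, not the models or data from Greer et al. (2019) or Ma et al. (2019).

```python
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_score

# Hypothetical feature matrices, one row per time point of a listening
# session: acoustic features (e.g., loudness, spectral flux) and
# psychophysiological features (e.g., skin conductance, heart rate).
rng = np.random.default_rng(3)
audio_feats = rng.standard_normal((500, 10))
physio_feats = rng.standard_normal((500, 4))
X = np.hstack([audio_feats, physio_feats])   # simple early fusion
y = rng.standard_normal(500)                 # continuous affect ratings

# Regularized linear model as a baseline for predicting moment-to-
# moment affective responses from the fused multimodal features.
model = RidgeCV(alphas=[0.1, 1.0, 10.0])
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("mean cross-validated R^2:", scores.mean())
```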

[Figure: decoding emotions from musical and vocal bursts (see Sachs et al., 2018)]