Dr. Lauren Fink is an Assistant Professor in the Department of Psychology, Neuroscience & Behaviour, and a member of the McMaster Institute for Music & the Mind, the Neuroscience Graduate Program, and the School of Computational Science & Engineering. She is also the director of the Beat Lab.
Having progressed from a music conservatory degree in Percussion Performance to a PhD in Neuroscience, Dr. Fink centres most of her research on timing. From using computational models to predict listeners’ attention to music over time, to building assistive devices that help people move together in time, her lab is interested in understanding the physiological, psychological, and interpersonal changes induced by engaging with music.
Dr. Fink’s lab conceives of music as multi-modal (auditory, visual, tactile, motor, etc.) and often incorporates methods such as eye-tracking. Currently, one major focus is multi-person mobile eye-tracking, used to better understand attention and immersion in social, musical contexts. For example, we can have up to 30 audience members wearing mobile eye-tracking glasses while watching a performance, or an ensemble of musicians wearing glasses while performing together. Though our questions tend to centre on music, most of our methods offer interesting possibilities in other domains, such as speech. We are looking forward to synergies with other ARiEAL members!
Affiliations
Assistant Professor, Department of Psychology, Neuroscience & Behaviour, McMaster University
Member, McMaster Institute for Music & the Mind, McMaster University
Member, Neuroscience Graduate Program, McMaster University
Member, School of Computational Science & Engineering, McMaster University
* indicates co-first or co-last authorship (equal contribution)
+ indicates student mentee
Journal Articles:
Czepiel, A.+, Fink, L., Seibert, C., Scharinger, M., & Kotz, S. (2023). Aesthetic and physiological effects of naturalistic multimodal music listening. Cognition, 239, 105537. https://doi.org/10.1016/j.cognition.2023.105537
Saxena, S.+, Fink, L.*, Lange, E.* (2023). Deep learning models for webcam eye-tracking in online experiments. Behavior Research Methods. https://doi.org/10.3758/s13428-023-02190-6
Lange, E., & Fink, L. (2023). Eye-blinking, musical processing, and subjective states – A methods account. Psychophysiology, 00(e14350). https://doi.org/10.1111/psyp.14350
Fink, L., Simola, J., Tavano, A., Lange, E., Wallot, S., & Laeng, B. (2023). From pre-processing to advanced dynamic modeling of pupil data. Behavior Research Methods. https://doi.org/10.3758/s13428-023-02098-1
Coretta, S., Casillas, J.V., […] Fink, L., […] & Roettger, T. B. (2023). Multidimensional signals and analytic flexibility: Estimating degrees of freedom in human speech analyses. Advances in Methods and Practices in Psychological Science, 6(3). https://doi.org/10.1177/25152459231162567
Fink, L., Alexander, P. & Janata, P. (2022). The Groove Enhancement Machine (GEM): A multi-person adaptive metronome to manipulate sensorimotor synchronization and subjective enjoyment. Frontiers in Human Neuroscience 16:916551. https://doi.org/10.3389/fnhum.2022.916551
Wittstock, S., Sperber, L., Kirk, G., McCarty, K., de Sola-Smith, K., Wade, J., Simon, M., Fink, L. (2022). Making what we know explicit: Perspectives from graduate writing consultants on supporting graduate writers. Praxis: A Writing Center Journal, 19(2). https://www.praxisuwc.com/192-wittstock-et-al
Czepiel, A.+, Fink, L.K., Fink, L.T., Wald-Fuhrmann, M., Tröndle, M., & Merrill, J. (2021). Synchrony in the periphery: inter-subject correlation of physiological responses during live music concerts. Scientific Reports 11, 22457. https://doi.org/10.1038/s41598-021-00492-3
Fink, L.*, Warrenburg, L. A.*, Howlin, C., Randall, W. M., Hansen, N. C., & Wald-Fuhrmann, M. (2021). Viral Tunes: Changes in musical behaviours and interest in coronamusic predict socio-emotional coping during COVID-19 lockdown. Humanities & Social Sciences Communications, 8(120). https://doi.org/10.1057/s41599-021-00858-y
Durojaye, C.*, Fink, L.*, Roeske, T., Wald-Fuhrmann, M., & Larrouy-Maestri, P. (2021). Perception of Nigerian talking drum performances as speech-like vs. music-like: the role of familiarity and acoustic cues. Frontiers in Psychology 12:652673. https://doi.org/10.3389/fpsyg.2021.652673
Sharma, N., Krishnamohan, V., Ganapathy, S., Gangopadhayay, A. & Fink, L. (2020). Acoustic and linguistic features influence talker change detection. JASA Express Letters 147(5). https://doi.org/10.1121/10.0002462
Fink, L., Lange, E., & Groner, R. (2019). The application of eye-tracking in music research. Journal of Eye Movement Research, 11(2):1. https://doi.org/10.16910/jemr.11.2.1
Fink, L., Hurley, B., Geng, J. & Janata, P. (2018). A linear oscillator model predicts dynamic temporal attention and pupillary entrainment to rhythmic musical patterns. Journal of Eye Movement Research, 11(2):12. https://doi.org/10.16910/jemr.11.2.12
Hurley, B., Fink, L., & Janata, P. (2018). Mapping the dynamic allocation of attention in musical patterns. Journal of Experimental Psychology: Human Perception & Performance, 44(11), 1694-1711. https://doi.org/10.1037/xhp0000563
Fink, L. (2017). Chance operations in neuroscience. In Lane, J. and L. Fink (Eds.), Allen Otte Folio, 17-20. https://mediapressmusic.com/allen-otte-folio-various/
Fink, L. (2016). The Greatest. Pulse Special Issue of Ethnomusicology Review/Sounding Board. https://ethnomusicologyreview.ucla.edu/content/greatest
Conference Papers (peer-reviewed)
Fink, L. (2023). Eye movement patterns when playing from memory: Examining consistency across repeated performances and the relationship between eyes and audio. Proceedings of the 17th International Conference on Music Perception and Cognition, Aug. 24-28, Tokyo, Japan. https://psyarxiv.com/tecdv
Saxena, S.+, Lange, E. & Fink, L. (2022). Towards efficient calibration for webcam eye-tracking in online experiments. In 2022 Symposium on Eye Tracking Research and Applications (ETRA ’22), June 08–11, 2022, Seattle, WA, USA. https://doi.org/10.1145/3517031.3529645
Fink, L. (2021). Computational models of temporal expectations. Proceedings of the Future Directions of Music Cognition International Conference, pp. 208-213. https://doi.org/10.18061/FDMC.2021.0041
Sharma, N., Krishnamohan, V., Ganapathy, S., Gangopadhayay, A. & Fink, L. (2020). On the impact of language familiarity in talker change detection. Proceedings of the 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), Barcelona, Spain, pp. 6249 – 6253. https://doi.org/10.1109/ICASSP40776.2020.9054294
Edited Volumes
Fink, L. & Lange, E., Eds. (2018). Special Issue on Music & Eye-Tracking. Journal of Eye Movement Research. Vol. 11(2). https://bop.unibe.ch/JEMR/issue/view/793
Fink, L., Ed. (2017). Explorations: The UC Davis Undergraduate Research Journal (vol. 19). The Regents of the University of California. http://explorations.ucdavis.edu/2017/index.html
Lane, J., & Fink, L., Eds. (2017). Allen Otte Folio. A collection of percussion pieces, distributed by Media Press Inc. https://mediapressmusic.com/allen-otte-folio-various/
Conference Presentations
- Schlichting, J.+, Saxena, S.+, Flannery, M.+, & Fink, L. (2023, Oct.). Social justice advocacy through music performance: Testing the effect of performance context and audience physiological responses. 19th Annual Neuromusic Conference, Hamilton, Ontario, Canada. https://www.neuromusic.ca/posters-2023/social-justice-advocacy-through-music-performance-testing-the-influence-of-performance-context-and-audience-physiological-responses-2/
- Flannery, M.+, & Fink, L. (2023, Oct.). Automating music stimuli creation and analyses: A music synthesis algorithm for producing ground truth data. 19th Annual Neuromusic Conference, Hamilton, Ontario, Canada. https://www.neuromusic.ca/posters-2023/automating-music-stimuli-creation-and-analyses-a-music-synthesis-algorithm-for-producing-ground-truth-data/
- Saxena, S.+ & Fink, L. (2023, Oct.). Synchronized multi-person eye-tracking in dynamic scenes. Poster presented at the 19th Annual Neuromusic Conference, Hamilton, Ontario, Canada. https://www.neuromusic.ca/posters-2023/synchronized-multi-person-eye-tracking-in-dynamic-scenes/
- Saxena, S.+, Fiehn, H.+, Shi, J.+, & Fink, L. (2023, Aug.). Cross-modal correspondence between contemporary art and music: from perception to aesthetic evaluation. Talk presented at the 17th International Conference on Music Perception & Cognition (ICMPC17-APSCOM7), Tokyo, Japan.
- Flannery, M.+, Woolhouse, M., & Fink, L. (2023, Aug.). Models trained on procedurally generated stimuli predict human judgments of Music Acoustic Features in real-world music. Poster presented at the 17th International Conference on Music Perception & Cognition (ICMPC17-APSCOM7), Tokyo, Japan.
- Czepiel, A.+, Fink, L., Seibert, C., Scharinger, M., Wald-Fuhrmann, M., & Kotz, S. (2023, Aug.). Cardiorespiratory synchrony to music and among audience members during a live concert. Talk presented at the 17th International Conference on Music Perception & Cognition (ICMPC17-APSCOM7), Tokyo, Japan.
- Fink, L. (2023, Aug.). Eye movement patterns when playing from memory: Examining consistency across repeated performances and the relationship between eyes and audio. Talk presented at the 17th International Conference on Music Perception & Cognition (ICMPC17-APSCOM7), Tokyo, Japan. https://psyarxiv.com/tecdv/
- Damsma, A., Bouwer, F., Fink, L., Cannon, J., Doelling, K., Grahn, J., Honing, H., & Kaplan, T. (2023, Aug.). Modelling rhythm perception beyond the beat. Symposium presented at the 17th International Conference on Music Perception & Cognition (ICMPC17-APSCOM7), Tokyo, Japan.
- Fink, L., Hörster, M., Poeppel, D., Wald-Fuhrmann, M., & Larrouy-Maestri, P. (2022, Sept.). Western listeners’ perception of music and speech is reflected in acoustic and semantic descriptors. Poster (virtual) presented at the Biology-culture relationships in the evolution of language and music workshop at the Joint Conference on Language Evolution, Kanazawa, Japan.
- Saxena, S.+, Fink, L., & Lange, E. (2022, Aug.). An online experiment with deep learning models for tracking eye movements via webcam. Accepted talk at the European Conference on Eye Movements, Leicester, UK.
- Linna, J., Kushan, M., Beck, J., Fink, L., & Margulis, L. (2022, Aug.). Using pupillometry to investigate the effect of meditation on musical listening. Poster presented at the Society for Music Perception & Cognition conference, Portland, OR.
- Lange, E., & Fink, L. (2022, July). Eyeblinks as indices of subjective states during music listening: Methodological considerations. Talk presented at the Conference on Music & Eye-Tracking, Frankfurt am Main, Germany. https://vimeo.com/728532868/5c8f91824d