Lab Activity
Loyola University Chicago UX & Biometrics Lab Announces 2026 Lab Fellows
The UX & Biometrics Lab at Loyola University Chicago is pleased to announce its 2026 Lab Fellows, an accomplished cohort of Information Systems & Analytics students and early-career researchers from the Quinlan School of Business. Following a review of 18 research proposals, the 2026 Fellows have been organized into five teams. Each team will pursue an original, NeuroIS-focused human-subjects research project exploring emerging questions at the intersection of user experience, neuroscience-informed information systems, and biometric analytics.

These projects will be developed and conducted through the UX & Biometrics Research course, an intensive, research-driven experience designed to integrate coursework directly with the lab’s ongoing mission and methods. This year’s cohort represents the next generation of researchers advancing the frontiers of NeuroIS and applied biometric research. Over the coming year, we look forward to the ideas, insights, and innovations that will emerge from their work.

Congratulations to all of our 2026 Lab Fellows; we’re excited to see what you discover.

1/27/2026 - UX & Biometrics Lab - Chicago, IL

Lab's Conference Paper Named Among iMotions’ Top 5 Publications of 2025!
“Short-Form Videos: An Exploratory Study on the Impact of Subtitles and ASMR Split-Screen Format Options Using Eyegaze and Facial Expression Data,” authored by 2024 Lab Fellows Nate Pascale, Omar Tinawi, João Vítor Moraes Barreto, Adnan Aldaas, and Dr. Dinko Bačić, has been named one of the Top 5 Publications of 2025 by iMotions.

Each year, researchers across disciplines use the iMotions platform to study how people think, feel, and behave, and 2025 produced an especially strong set of publications spanning neuroscience, marketing, human–computer interaction, and healthcare. Our paper was selected as one of these standout contributions, reflecting key trends in contemporary behavioral research such as multimodal measurement, emotionally intelligent technologies, digital media consumption, and the growing role of AI.

Using eye tracking and facial expression analysis, the study shows that subtitles and split-screen formats significantly alter visual attention patterns, while ASMR-enhanced split screens increase positive emotional engagement. Importantly, while these emotional gains do not improve recall, neither do they degrade it, highlighting the nuanced relationship between attention, emotion, and memory in short-form media.

1/13/2026 - UX & Biometrics Lab - Chicago, IL