Lauren Fink had hard news to break to Shreshth Saxena a year after they started working together in Germany.
Fink was on the move. She’d accepted a tenure-track position with the Department of Psychology, Neuroscience & Behavior at McMaster University. With the appointment came the keys to LIVELab, a state-of-the-art and one-of-a-kind research-based concert hall.
“This was a once-in-a-career opportunity that was too good to pass up,” says Fink, who at the time was in her third year as a postdoctoral researcher at the Max Planck Institute for Empirical Aesthetics. Founded in 2013, the institute investigates why and how people create art and how they perform, experience, and evaluate it.
Assistant professor Lauren Fink and PhD student Shreshth Saxena.
Fink had one big regret about the move – Saxena would likely prove to be a once-in-a-career hire. He’d been working in the private sector as a principal computer vision engineer, developing and deploying deep learning models on mobile devices, when Fink recruited him as a student researcher.
“Shreshth’s a unicorn with his skill set and background. I knew it would be a huge loss to go to McMaster without him.” One of Fink’s future colleagues suggested Saxena could transfer to Mac as a grad student. “I didn’t realize that was even an option,” says Fink.
After Fink broke the news and broached the idea of relocating to McMaster, Saxena wasted no time weighing his options. “I’d already moved from India to Germany so what was one more move to Canada?”
Fink is now an assistant professor and the principal investigator of BEATLab, a research group that studies musical engagement to learn about fundamental dynamics of the nervous system and applies cognitive neuroscientific methods to create more immersive musical experiences. Saxena became Fink’s first PhD student and the lab’s inaugural member.
To create those experiences, the pair set out to revolutionize eye-tracking research by attempting something that had never been done, and in the process became co-founders of SocialEyes, a software start-up.
Eye-tracking research began in the late 1800s, when French ophthalmologist Louis Émile Javal studied reading patterns by watching readers’ eyes in mirrors and listening to their eye muscles. The first practical eye tracker – a specialized and less-than-comfortable contact lens – was created by psychologist Edmund Huey in 1908. By the 1920s, education researcher Guy Thomas Buswell at the University of Chicago had opened one of the first dedicated eye-tracking laboratories, recording beams of light reflected from readers’ eyes onto film.
Today, researchers use specialized eye-tracking hardware and software to collect objective, real-time data on what people look at, for how long and in what order. That data can help researchers better understand behaviors that are often expressed subconsciously, from decision-making to attention, fatigue and memory.
“Our eyes don’t lie,” says Fink. “Eye movements are a window into human behavior.”
What’s seen through those windows is driving advances in our understanding of cognitive processes and supporting medical diagnosis and research for conditions including autism, Alzheimer’s, Parkinson’s and ophthalmic diseases. Eye-tracking research is also being used to optimize user experiences and create gaze-driven human-computer interfaces like monitoring systems that can detect drowsiness while we’re driving and alert us to pull over before an accident happens.
Yet much of today’s eye-tracking research continues to be done in labs much like the ones Buswell worked in nearly a century ago. Studies are conducted one participant at a time using expensive and immobile equipment. Fink and Saxena believed advances in low-cost, high-tech glasses could take eye-tracking out of controlled lab environments and into the wilds of multi-person social settings like concerts and art galleries. “No one else was doing eye-tracking research on the scale we were considering,” says Fink.
They quickly realized why. “I enjoy solving engineering problems,” says Saxena. “What Lauren was proposing was an incredibly complex problem that had yet to be solved.”
Undeterred, they set out to find a way to synchronize, record, store and analyze gaze data streaming in real time from an audience wearing eye-tracking glasses, watching the same performance from different vantage points while swinging and swaying in their seats.
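A core piece of that puzzle is getting every device onto a shared clock: each pair of glasses stamps its gaze samples with its own internal time, so streams from different wearers can’t be compared until they’re mapped onto one common timeline. As a rough illustration of the general idea – a sketch, not the SocialEyes implementation – the Python snippet below shifts each device’s timestamps by a measured clock offset and resamples the gaze onto a shared analysis grid. The wearer names, 100 Hz sampling rate and 40-millisecond offset are all hypothetical:

```python
# Illustrative sketch only: aligning gaze streams from multiple pairs of
# eye-tracking glasses onto one shared timeline. Wearer names, sampling
# rate and clock offsets are hypothetical, not SocialEyes internals.
import numpy as np

def align_stream(timestamps, gaze_xy, clock_offset_s, common_t):
    """Shift one device's timestamps onto the common clock, then linearly
    interpolate its gaze coordinates at the shared time points."""
    t = np.asarray(timestamps) + clock_offset_s   # device clock -> common clock
    gaze_xy = np.asarray(gaze_xy)
    x = np.interp(common_t, t, gaze_xy[:, 0])
    y = np.interp(common_t, t, gaze_xy[:, 1])
    return np.column_stack([x, y])

# Two hypothetical wearers whose device clocks disagree by about 40 ms.
rng = np.random.default_rng(0)
t_a = np.arange(0, 10, 0.01)                      # ~100 Hz stream
t_b = np.arange(0, 10, 0.01) + 0.04               # same events, later stamps
gaze_a = rng.random((len(t_a), 2))                # normalized (x, y) gaze
gaze_b = rng.random((len(t_b), 2))

common_t = np.arange(0.5, 9.5, 0.01)              # shared analysis timeline
aligned = {
    "wearer_a": align_stream(t_a, gaze_a, 0.0, common_t),
    "wearer_b": align_stream(t_b, gaze_b, -0.04, common_t),
}
# With every wearer sampled at identical moments, collective measures,
# such as how tightly the audience's gaze clusters during a solo, can be
# computed frame by frame.
```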
Solving that challenge would put Fink and Saxena at the leading edge of psychology, cognitive neuroscience and computer science. Fink had the science background to push methodological boundaries while Saxena, who’d graduated with a master’s degree in computer science from the University of Delhi, had the technical know-how to make his supervisor’s vision a reality.
PhD student Shreshth Saxena modelling eye-tracking glasses with SocialEyes software running in the background at LIVELab. Saxena and assistant professor Lauren Fink have created a software solution that can synchronize, record, store and analyze gaze data streaming in real time from large audiences.
Together, they started building a platform for real-world mobile eye-tracking. “What would’ve taken me a decade to figure out on my own took Shreshth just under a year to code,” says Fink, who adds none of their work would’ve happened without the move to Mac and access to LIVELab.
They’ve already successfully run a series of research studies in LIVELab with up to 30 people at a time wearing eye-tracking glasses. Those studies have generated more than 500 hours of video – equal to around eight terabytes of data. Fink and Saxena can analyze individual and collective gaze and identify when, where and for how long attention is fixed and when it wanders.
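Pinning down when attention is fixed and when it wanders usually comes down to a fixation-detection algorithm. One textbook approach – offered here as an illustration, not necessarily the method behind these studies – is dispersion-threshold identification (I-DT): gaze counts as a fixation whenever it stays inside a small spatial window for long enough. A minimal Python sketch, with placeholder threshold values:

```python
# Illustrative dispersion-threshold (I-DT) fixation detector. The
# dispersion and duration thresholds below are placeholders, not values
# from the SocialEyes pipeline.
import numpy as np

def detect_fixations(t, x, y, max_dispersion=0.02, min_duration=0.1):
    """Return (start, end) times where gaze stays within a small spatial
    window (x-range + y-range <= max_dispersion, in normalized screen
    units) for at least min_duration seconds."""
    t, x, y = map(np.asarray, (t, x, y))
    fixations, i, n = [], 0, len(t)
    while i < n:
        j = i
        # Grow the window one sample at a time while gaze stays clustered.
        while j + 1 < n:
            wx, wy = x[i:j + 2], y[i:j + 2]
            if (wx.max() - wx.min()) + (wy.max() - wy.min()) > max_dispersion:
                break
            j += 1
        if t[j] - t[i] >= min_duration:
            fixations.append((t[i], t[j]))   # attention was fixed here
            i = j + 1                        # resume after the fixation
        else:
            i += 1                           # gaze was wandering; slide on
    return fixations
```

Per-wearer fixations like these can then be pooled across the audience to ask when a whole crowd locks onto the same moment of a performance.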
They’re now looking to commercialize their software. Their SocialEyes start-up has received pre-seed funding from the Faculty of Science and a Business Strategy Internship (BSI) grant from Mitacs and Lab2Market. They’ve begun the validation phase to see whether their idea has real potential and a big enough market of prospective buyers. Early reaction to their research suggests there’s a whole lot of interest in what they’ve created.
Fink originally envisioned using mobile eye-tracking to study collective behaviour and attention in concerts and art galleries. It’s since expanded to include classrooms and campaign trails. Word’s gotten out thanks to published papers, talks at international conferences and media coverage. Fink and Saxena are already collaborating with researchers at the Rotman Research Institute at Baycrest, the University of Toronto, and Boston College.
During last year’s federal election, LIVELab and the Digital Society Lab at McMaster joined forces to monitor eye movements and heart rates of undecided voters during the English-language leaders’ debate. Their collaboration set off a wave of widely shared media coverage. That coverage in turn generated interest from researchers from around the world, with requests for more collaborations and consultations.
The SocialEyes co-founders are also fielding fee-for-service requests to set up and run eye-tracking experiments in LIVELab. “We’re happy to support other researchers, but if the demand continues at this pace it’s going to be a real challenge for us to conduct our own research,” says Fink.
An audience takes SocialEyes for a test drive during a live performance in LIVELab.
While their platform is open source, they’ve started the process of working with McMaster to protect their intellectual property. Fink and Saxena say they want to ensure their eye-tracking solution benefits the international research community and isn’t co-opted for covert surveillance. “Protecting everyone’s personal privacy is paramount,” says Fink.
“We’ve watched audience members in LIVELab forget that they’re wearing eye tracking glasses. They start scrolling through their smartphones. And like our eyes, our smartphones reveal more about us than we realize.”
Ever the problem-solver and troubleshooter, Saxena’s already found a way to delete any video footage of gazes that drift down to the dopamine machines in our hands.
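How that works under the hood isn’t spelled out, but the gist might look something like the sketch below: drop any video frame in which the wearer’s gaze lands inside a region detected as a phone screen. Everything here is hypothetical, including detect_phone_bbox, which stands in for a real object detector:

```python
# Hypothetical sketch of the privacy filter described above; not the
# actual SocialEyes implementation. detect_phone_bbox is a stand-in for
# a real object detector that returns a phone's bounding box, or None.
def redact_phone_frames(frames, gaze_points, detect_phone_bbox):
    """Keep only frames where the wearer's gaze is NOT on a phone screen.

    frames: sequence of video frames (e.g., numpy arrays)
    gaze_points: one (x, y) pixel coordinate per frame
    """
    kept = []
    for frame, (gx, gy) in zip(frames, gaze_points):
        bbox = detect_phone_bbox(frame)
        if bbox is not None:
            x0, y0, x1, y1 = bbox
            if x0 <= gx <= x1 and y0 <= gy <= y1:
                continue        # gaze is on the phone: discard this frame
        kept.append(frame)
    return kept
```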
For more about SocialEyes, read the paper, check out the open-source code repository or watch a short explainer video.