2025 Ignite Tech Grant Recipients Announced

LIVE is thrilled to announce the winners of the latest cycle of its Ignite Tech Development Grant, showcasing the diverse potential of emerging learning technology innovation. This year’s competition attracted a strong and diverse applicant pool from across Vanderbilt’s schools and all academic levels, from undergraduates to faculty. The winning proposals focus on learning that spans the spectrum from math to empathy to AI ethics. Recipients were chosen based on their projects’ novelty, field significance, feasibility, scalability, sustainability, and future research potential. Over the award period, LIVE’s research engineer, Albert Na, will collaborate with the awardees to transform their ideas into tangible, practical, and impactful tools, pushing the boundaries of learning technology development. Learn more about the Ignite Tech Grant and the application guidelines on the Ignite Tech Grant webpage.


The 2025 Ignite Tech Grant Recipients

Reality Remix: Integrating Technology & Emotional Insight

Eric Park, Assistant Professor of Marketing | Owen Graduate School of Business 

Owen Graduate School of Business faculty member Eric Park has explored augmented reality (AR) through the novel theory of egocentric spatial repositioning (ESR), which proposes that virtual content can be uniquely embedded in a user’s personal space. While preliminary evidence suggests that ESR may reduce psychological distance and enhance emotional engagement, its underlying mechanisms remain underexplored. In collaboration with LIVE, Park will rigorously test ESR as a psychological process, investigating its potential to enhance empathy, improve perspective-taking, and promote social responsibility. The project seeks to establish a robust theoretical foundation for AR interventions in education and civic engagement by systematically validating ESR’s psychological implications.


Math Mysteries: Decoding Learning’s Hidden Curriculum

Joanne Golann, Associate Professor of Leadership, Policy, & Organizations and Bethany Rittle-Johnson, Professor of Psychology | Peabody College 


Peabody College faculty members Joanne Golann and Bethany Rittle-Johnson will analyze over 11,000 hours of in-home video recordings from 21 families, capturing children’s daily lives over two weeks. The project will apply cutting-edge AI tools to a sample of the video data to detect and interpret mathematical learning activities in home settings. Working with LIVE, the team hopes to discover new insights into how children engage with math in informal learning environments. Building upon the team’s work from Vanderbilt’s Scaling Success Grant and integrating those findings through the Ignite Tech Grant, the project introduces a significant methodological breakthrough in educational research, with potential applications across multiple disciplines. The team plans to release the dataset on a restricted-use basis, promoting broader scholarly engagement and diverse perspectives in early childhood education research. This approach offers a transformative method for understanding how children learn outside formal educational settings, potentially providing evidence-based strategies to support learning across home and school contexts.


Eyes on Companions: Tracking Empathy Through Gaze

Peter Chesney, Collaborative Humanities Postdoctoral Program (CHPP) Fellow | College of Arts and Science

Postdoctoral scholar and historian Peter Chesney is teaming up with visual artist Yael Vishnizki-Levi to create an interactive installation with LIVE’s Interactive Gaze Experience, which is housed in the LIVE space. Collaborating with LIVE’s research engineer, the team will integrate documentary imagery with an advanced eye-tracking system to analyze viewer attention. The project will track the eye movements of participants as they view photographs and video footage of guide dogs with blind persons.

By generating “gaze traces,” the project aspires to create attentional replays that capture viewer experiences. The methodology involves showing flickering versions of images to compare how viewer attention is distributed between dogs and humans, providing insights into visual perception and engagement with human-canine interactions. The project draws inspiration from Morris Frank, a 1929 Vanderbilt alumnus who pioneered America’s first guide dog training program, highlighting the historical significance of human-canine partnerships.


Exploring AI Ethics in the Classroom with Agents of ViTAL: Ethics Mission

Namrata Srivastava & Sarah Burriss | Research Scientists with the Engage AI Institute Lab, Department of Teaching & Learning and Department of Computer Science | College of Connected Computing and Peabody College


Research Scientists Namrata Srivastava and Sarah Burriss aim to transform an existing educational game for middle school students, Agents of ViTAL: Ethics Missions, developed by the NSF-supported EngageAI Institute, enhancing its approach to exploring AI ethics by incorporating a specially designed conversational agent. Expanding on the current game, the new effort will actively engage students through an innovative LLM-powered EthicsBot. The agent’s creation will be a collaborative effort, co-designed with middle and high school students and built in partnership with LIVE’s research engineer. At its core, the project integrates an interactive AI tool designed to achieve multiple educational objectives: scaffolding ethical reasoning, facilitating collaborative learning, and providing a dynamic platform for exploring ethical considerations within a virtual school environment. By combining student insights, technical expertise, and advanced AI capabilities, the project seeks to create a more immersive and interactive learning experience that moves beyond traditional AI education, empowering students to critically engage with the ethics of advanced technologies.
