

How VR and AI could help the next generation grow kinder and more connected

Image: VR with haptic gloves. Credit: Julia M Cameron from Pexels

Empathy is not just a "nice-to-have" soft skill—it is a foundation of how children and adults regulate emotions, build friendships and learn from one another.

Between the ages of 6 and 9, children begin shifting from being self-centered to noticing the emotions and perspectives of others. This makes early childhood one of the most important periods for developing empathy and other social-emotional skills.

Traditionally, pretend play has been a natural way to practice empathy. Many adults can remember acting out scenes as doctor and patient, or using sticks and leaves as imaginary currency. Those playful moments were not just entertainment—they were early lessons in empathy and taking someone else's perspective.

But as children spend more time with technology and less in pretend play, these opportunities are shrinking. Some educators worry that technology is making the problem worse. Yet research in affective computing—systems that recognize emotions, simulate them or both—suggests that technology can also become part of the solution.

Virtual reality, in particular, can create immersive environments where children interact with characters who display emotions as vividly as real humans. I'm a researcher who studies social-emotional learning in the context of how people use technology. Used thoughtfully, the combination of VR and artificial intelligence could help reshape social-emotional learning practices and serve as a new kind of "empathy classroom" or "emotional regulation simulator."

Game of emotions

As part of my doctoral studies at the University of Florida, in 2017 I began developing a VR Empathy Game framework that combines insights from developmental psychology, affective computing and participatory design with children. At the University of Maryland, I worked with the KidsTeam program, where children ages 7 to 11 served as design partners, helping us imagine what an empathy-focused VR game should feel like.

In 2018, 15 master's students at the University of Central Florida and I built the game. It is based on a Russian folktale and introduces four characters, each representing a core emotion: Baba Yaga embodies anger, Goose represents fear, the Older Sister shows happiness and the Younger Sister expresses sadness.

Unlike most games, it does not reward players with points or badges. Instead, children can progress only by getting to know the characters, listening to their stories and practicing empathic actions. For example, they can look at the game's world through a character's glasses, revisit their memories or even hug Baba Yaga to comfort her. This design choice reflects a core idea of social-emotional learning: Empathy is not about external rewards but about pausing, reflecting and responding to the needs of others.

My colleagues and I have been refining the game since then and using it to study children and empathy.

Different paths to empathy

We tested the game with children individually. After asking general questions and giving an empathy survey, we invited children to play the game. We observed them while they were playing and discussed their experience afterward.

Our most important discovery was that children interacted with the VR characters following the social patterns humans usually follow when interacting with each other. Some children displayed cognitive empathy, meaning they understood the characters' emotional states. They listened thoughtfully to the characters, tapped their shoulders to get their attention and attempted to help them. At the same time, they were not completely absorbed in the VR characters' feelings.

Others expressed affective empathy, directly mirroring the characters' emotions, sometimes becoming so distressed by fear or sadness that they stopped the game. A few other children did not connect with the characters at all, focusing mainly on exploring the virtual environment. All three behaviors also occur in real life when children interact with their peers.

These findings highlight both the promise and the challenge. VR can indeed evoke powerful empathic responses, but it also raises questions about how to design experiences that support children with different temperaments—some need more stimulation, and others need gentler pacing.

AI eye on emotions

The current big question for us is how to effectively incorporate this type of empathy game into everyday life. In classrooms, VR will not replace real conversations or traditional role-play, but it can enrich them. A teacher might use a short VR scenario to spark discussion, encouraging students to reflect on what they felt and how it connects to their real friendships. In this way, VR becomes a springboard for dialogue, not a stand-alone tool.

We are also exploring adaptive VR systems that respond to a child's emotional state in real time. A headset might detect if a child is anxious or scared—through facial expressions or gaze—and adjust the experience by scaling down the characters' expressiveness or offering supportive prompts. Such a responsive "empathy classroom" could give children safe opportunities to gradually strengthen their emotional regulation skills.

This is where AI becomes essential. AI systems can make sense of the data collected by VR headsets, such as eye gaze, heart rate or body movement, and use it to adapt the experience in real time. For example, if a child looks anxious or avoids eye contact with a sad character, the AI could gently slow down the story, provide encouraging prompts or reduce the emotional intensity of the scene. On the other hand, if the child appears calm and engaged, the AI might introduce a more complex scenario to deepen their learning.
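As a rough illustration, the sketch below shows what such an adaptive loop could look like in code. The signal names, thresholds and scene parameters are hypothetical placeholders, not a description of any real headset's sensors or of our actual system.

```python
# A minimal sketch of an adaptive loop for an empathy game.
# All signals, thresholds and adjustments here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ChildState:
    gaze_on_character: float  # fraction of recent time looking at the character (0-1)
    facial_distress: float    # estimated distress from facial expression analysis (0-1)
    heart_rate_delta: float   # change from the child's resting heart rate (beats per minute)

@dataclass
class SceneSettings:
    story_pace: float = 1.0             # 1.0 = normal narration speed
    emotion_intensity: float = 1.0      # how expressive the characters are
    show_supportive_prompt: bool = False
    unlock_complex_scenario: bool = False

def adapt_scene(state: ChildState, scene: SceneSettings) -> SceneSettings:
    """Adjust the VR scene based on a simple reading of the child's state."""
    anxious = state.facial_distress > 0.6 or state.heart_rate_delta > 15
    avoiding = state.gaze_on_character < 0.3

    if anxious or avoiding:
        # Slow the story, soften the characters and offer encouragement.
        scene.story_pace = max(0.5, scene.story_pace - 0.1)
        scene.emotion_intensity = max(0.4, scene.emotion_intensity - 0.2)
        scene.show_supportive_prompt = True
    elif state.facial_distress < 0.2 and state.gaze_on_character > 0.7:
        # Calm and engaged: restore intensity and open a richer scenario.
        scene.story_pace = min(1.0, scene.story_pace + 0.1)
        scene.emotion_intensity = min(1.0, scene.emotion_intensity + 0.1)
        scene.show_supportive_prompt = False
        scene.unlock_complex_scenario = True
    return scene

# Example: a child who is looking away and showing signs of distress
scene = adapt_scene(
    ChildState(gaze_on_character=0.2, facial_distress=0.7, heart_rate_delta=18),
    SceneSettings(),
)
print(scene)  # slower pace, softer characters, supportive prompt shown
```

In a real system, the thresholds would need to be tuned to each child rather than fixed, since resting heart rate and expressiveness vary widely between children.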

In our current research, we are exploring how the game can serve as a measurement tool itself, tracking moment-to-moment emotional responses during gameplay to provide educators with better insight into how empathy develops.

Future work and collaboration

As promising as I believe this work is, it raises big questions. Should VR characters express emotions at full intensity, or should we tone them down for sensitive children? If children treat VR characters as real, how do we make sure those lessons carry over to the playground or dinner table? And with headsets still costly, how do we ensure empathy technology doesn't widen digital divides?

These are not just research puzzles but ethical responsibilities. This vision requires collaboration among educators, researchers, designers, parents and children themselves. Computer scientists design the technology, psychologists ensure the experiences are emotionally healthy, teachers adapt them to the curriculum, and children co-create the games to make them engaging and meaningful.

Together, we can shape technologies that not only entertain but also nurture empathy, emotional regulation and deeper connection in the next generation.

Provided by The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.
