

Interactive VR Nursing Assistant Training: youtube.com/watch?v=Tkxp-5ejvEE
Practicing patient-centered care
Video by Ally Quinn and Sam Kmiec | Photos courtesy of the Barmaki lab | March 13, 2025
Computer science students design interactive virtual reality training tool
“Hi, Julie,” called Rana Tuncer, a senior computer science major at the University of Delaware.
No, Tuncer was not greeting a peer. She was interacting with a computer-generated patient while demonstrating the capabilities of a virtual reality training program that she and other students helped create in Assistant Professor Leila Barmaki's Human-Computer Interaction Laboratory.
To her left, UD sophomore Gael Lucero-Palacios controlled the avatar’s responses.
Tuncer described the digital interface as a two-way system that can help nurse trainees build their communication skills and learn to provide patient-centered care across a variety of situations. In this case, users would rehearse their bedside manner with expectant mothers before ever encountering a pregnant patient in person.
“The training helps aspiring nurses practice more difficult and sensitive conversations they might have with patients. Our tool is targeted to midwifery patients,” Lucero-Palacios said. “Learners can practice these conversations in a safe environment. It’s multilingual, too. We currently offer English or Turkish, and we’re working on a Spanish demo.”
This type of judgment-free rehearsal environment has the potential to remove language barriers to care, because an avatar's language settings can be changed. The idea is that the "practitioner" could speak in one language on one interface and be heard in the patient's native language on the other. The patient avatar can also be customized to represent different health stages and populations, giving learners a varied experience.
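The two-interface idea can be sketched in a few lines of Python. Everything below is illustrative: the function names and the canned translation table are stand-ins for whatever speech-recognition and machine-translation services the lab's system actually uses, which are not described in this article.

```python
# Hypothetical sketch of the cross-language relay: an utterance captured on
# the practitioner's interface is translated before it is delivered to the
# patient's interface, so each side hears its own language.

# Stub table standing in for a real machine-translation service.
TRANSLATIONS = {
    ("en", "tr"): {
        "How are you feeling today?": "Bugün kendinizi nasıl hissediyorsunuz?",
    },
    ("tr", "en"): {
        "Bugün kendinizi nasıl hissediyorsunuz?": "How are you feeling today?",
    },
}

def translate(text: str, src: str, dst: str) -> str:
    """Look up a canned translation; a real system would call an MT service."""
    if src == dst:
        return text
    return TRANSLATIONS.get((src, dst), {}).get(text, text)

def relay(utterance: str, speaker_lang: str, listener_lang: str) -> str:
    """Deliver the speaker's utterance in the listener's native language."""
    return translate(utterance, speaker_lang, listener_lang)

print(relay("How are you feeling today?", "en", "tr"))
# The practitioner speaks English; the patient interface receives Turkish.
```

In a deployed system the stub table would be replaced by speech-to-text, machine translation and text-to-speech stages, but the relay structure is the same.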
Last December, Tuncer took the project on the road, piloting the virtual reality training program for faculty members in the Department of Midwifery at Ankara University in Ankara, Turkey. With technical support from Lucero-Palacios back in the United States, she ran a demo for the Ankara team, showcasing the capabilities of the UD-developed system's interactive rehearsal environment.
Meanwhile, for Tuncer, Lucero-Palacios and the other students involved in the Human-Computer Interaction Laboratory, developing the VR training tool offered the opportunity to enhance their computer science, data science and artificial intelligence skills outside the classroom.
“There were lots of interesting hurdles to overcome, like figuring out a lip-sync tool to match the words to the avatar’s mouth movements and figuring out server connections and how to get the languages to switch and translate properly,” Tuncer said.
Lucero-Palacios was fascinated by developing the text-to-speech capabilities and by the potential for technology to improve patient care.
“If a nurse is well-equipped to answer difficult questions, then that helps the patient,” said Lucero-Palacios.

The project is an ongoing research effort in the Barmaki lab that has involved many students over time. Significant progress came during the summer of 2024, when undergraduate researchers Tuncer and Lucero-Palacios contributed to the project with funding support from the National Science Foundation (NSF), though work began before that summer and continued well beyond it. UD senior Gavin Caulfield provided foundational support in developing the program's virtual environment and contributed to its text-to-speech/speech-to-text capabilities. Doctoral students Fahim Abrar and Behdokht Kiafar in computer and information sciences, along with Pinar Kullu, a postdoctoral fellow in the lab, used multimodal data collection and analytics to quantify the participant experience.
“Interestingly, we found that participants showed more positive emotions in response to patient vulnerabilities and concerns,” said Kullu.
The work builds on previous research that Barmaki, an assistant professor of computer and information sciences and resident faculty member in the Data Science Institute, completed with colleagues at the New Jersey Institute of Technology and the University of Central Florida in an NSF-funded project on empathy training for health care professionals using a virtual elderly patient. In that project, Barmaki employed machine learning tools to analyze a nursing trainee's body language, gaze, and verbal and nonverbal interactions, capturing micro-expressions (fleeting facial expressions) and the presence or absence of empathy.
“There is a huge gap in communication when it comes to caregivers working in geriatric care and maternal-fetal medicine,” said Barmaki. “Both disciplines have high turnover and challenges with lack of caregiver attention to delicate situations.”
When these human-to-human interactions go wrong, for whatever reason, the consequences can extend beyond a single patient visit. For instance, a pregnant woman who has a negative health care experience might decide not to continue routine pregnancy care.
Beyond the project’s potential to improve the field readiness of health care professionals, Barmaki was keen to note the real-world workforce development benefits for her students.
“Perceptions still exist that computer scientists work in isolation with their computers and rarely interact, but this is not true,” Barmaki said, pointing to the multi-faceted team members involved in this project. “Teamwork is very important. We have a nice culture in our lab where people feel comfortable asking their peers or more established students for help.”

Barmaki also pointed to potential applications of these types of training environments, enabled by virtual reality, artificial intelligence and natural language processing, beyond health care. With the framework in place, she said, the idea could be adapted for other types of training involving human-to-human interaction, whether in education, cybersecurity or even emerging technologies such as artificial intelligence (AI). Keeping people at the center of any design or application of this work is critical, particularly as uses for AI continue to expand.
“As data scientists, we see things as spreadsheets and numbers in our work, but it’s important to remember that the data is coming from humans,” Barmaki said.
While this project leverages computer vision and AI as a teaching tool for nursing assistants, Barmaki explained this type of system can also be used to train AI and to enable more responsible technologies down the road. She gave the example of using AI to study empathic interactions between humans and to recognize empathy.
“This is the most important area where I’m trying to close the loop, in terms of responsible AI or more empathy-enabled AI,” Barmaki said. “There is a whole area of research exploring ways to make AI more natural, but we can’t work in a vacuum; we must consider the human interactions to design a good AI system.”
Asked whether she has concerns about the future of artificial intelligence, Barmaki was positive.
“I believe AI holds great promise for the future, and, right now, its benefits outweigh the risks,” she said.