Peer Reviewed Journals

Effects of virtual human appearance fidelity on emotion contagion in affective inter-personal simulations

M. Volonte, S. V. Babu, H. Chaturbedi, N. Newsome, E. Ebrahimi, T. Roy, S. Daily, T. Fasolino
IEEE Transactions on Visualization and Computer Graphics
Abstract: Realistic versus stylized depictions of virtual humans in simulated inter-personal situations, and their ability to elicit emotional responses in users, has been an open question for artists and researchers alike. We empirically evaluated the effects of near visually realistic vs. non-realistic stylized appearance of virtual humans on the emotional response of participants in a medical virtual reality system that was designed to educate users in recognizing the signs and symptoms of patient deterioration. In a between-subjects experiment protocol, participants interacted with one of three different appearances of a virtual patient, namely visually realistic, cartoon-shaded, and charcoal-sketch-like conditions in a mixed reality simulation. Emotional impact was measured via a combination of quantitative objective measures gathered using skin Electrodermal Activity (EDA) sensors, and quantitative subjective measures such as the Differential Emotion Survey (DES IV), Positive and Negative Affect Schedule (PANAS), and Social Presence questionnaire. The emotional states of the participants were analyzed across four distinct time steps during which the medical condition of the virtual patient deteriorated (an emotionally stressful interaction), and were contrasted to a baseline affective state. Objective EDA results showed that in all three conditions, male participants exhibited greater levels of arousal as compared to female participants. We found that negative affect levels were significantly lower in the visually realistic condition, as compared to the stylized appearance conditions. Furthermore, in the emotional dimensions of interest-excitement, surprise, anger, fear, and guilt, participants in all conditions responded similarly. However, in the social emotional constructs of shyness, presence, perceived personality, and enjoyment-joy, we found that participants responded differently in the visually realistic condition as compared to the cartoon and sketch conditions. Our study suggests that virtual human appearance can affect not only critical emotional reactions in affective inter-personal training scenarios, but also users' perceptions of the personality and social characteristics of the virtual interlocutors.
[pdf]
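
A rough sense of the objective arousal analysis mentioned in this abstract (phasic EDA activity per deterioration phase, contrasted with a baseline) can be given with a short sketch. This is a minimal illustration only, assuming a raw skin conductance trace and simple skin conductance response (SCR) peak counting per segment; the sampling rate, segment boundaries, and threshold below are hypothetical and not taken from the paper.

    # Minimal sketch: per-segment SCR peak counts as a crude arousal index.
    # Assumes a 1-D skin conductance signal (microsiemens) sampled at fs Hz;
    # window boundaries and the prominence threshold are illustrative.
    import numpy as np
    from scipy.signal import find_peaks

    def scr_peaks_per_segment(eda, fs, segment_bounds_s, min_prominence_us=0.05):
        """Count skin conductance response peaks in each time segment."""
        win = max(1, int(0.5 * fs))                      # ~0.5 s moving average
        smoothed = np.convolve(eda, np.ones(win) / win, mode="same")
        counts = []
        for start_s, end_s in segment_bounds_s:
            seg = smoothed[int(start_s * fs):int(end_s * fs)]
            peaks, _ = find_peaks(seg, prominence=min_prominence_us)
            counts.append(len(peaks))
        return counts

    if __name__ == "__main__":
        fs = 32  # Hz, hypothetical sampling rate
        t = np.arange(0, 300, 1 / fs)
        eda = 2.0 + 0.3 * np.sin(0.05 * t) + 0.05 * np.random.randn(t.size)
        # A baseline window plus four later phases, mirroring the study's four time steps.
        segments = [(0, 60), (60, 120), (120, 180), (180, 240), (240, 300)]
        print(scr_peaks_per_segment(eda, fs, segments))

A higher count of SCR peaks in a segment would then be read as higher phasic arousal for that phase relative to the baseline segment.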

Interaction with proactive and reactive agents in box manipulation tasks in virtual environments.

K.Y. Liu, M. Volonte, Y.C. Hsu, S.V. Babu, S.K. Wong
Computer Animation and Virtual Worlds, 30(3-4), p. e1881
Abstract: This paper studies the user collaboration experience with proactive and reactive agents in transporting boxes in virtual environments. Two main characters, the avatar and the agent, are controlled by a user and a controller, respectively. The user and the agent communicate with each other by voice. The agent can be proactive or reactive. The user follows the instruction issued by the proactive agent, whereas the user instructs the reactive agent to perform actions. The goal is to transport boxes to goal positions with orientation constraints. We conducted a user study to analyze the behaviors of participants in several aspects, including task completion time, path length, control experience, and co-presence experience. We report our findings and make suggestions for future development.
[pdf]

Peer Reviewed Conference Papers

Empirical Evaluation of Virtual Human Conversational and Affective Animations on Visual Attention in Inter-Personal Simulations

M. Volonte, A. Robb, A.T. Duchowski, S.V. Babu
Proceedings of IEEE VR (3DUI). (Conference acceptance rate of 20.6%)
Abstract: Creating realistic animations of virtual humans remains comparatively complex and expensive. This research explores the degree to which animation fidelity affects users' gaze behavior when interacting in virtual reality training simulations that include virtual humans. Participants were randomly assigned to one of three conditions, wherein the virtual patient either: 1) was not animated; 2) played idle animations; or 3) played idle animations, looked at the participant when speaking, and lip-synced speech and facial gestures when conversing with the participant. Each participant's gaze was recorded in an inter-personal interactive patient surveillance simulation. Results suggest that conversational and passive animations elicited visual attention in a similar manner, as compared to the no-animation condition. Results also suggest that when participants face critical situations in inter-personal medical simulations, visual attention towards the virtual human decreases while gaze towards goal-directed activities increases.
[pdf]
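
Since the gaze analysis in this abstract rests on comparing how long participants look at the virtual human versus goal-directed task areas, a small dwell-time computation illustrates the idea. This is a hedged sketch: the sample format, area-of-interest labels, and the assumption that gaze samples are already mapped to areas of interest are illustrative, not the paper's actual pipeline.

    # Minimal sketch: dwell-time proportions per area of interest (AOI)
    # from timestamped gaze samples already labeled with an AOI.
    from collections import defaultdict

    def dwell_proportions(samples):
        """samples: time-ordered list of (timestamp_s, aoi_label) tuples."""
        totals = defaultdict(float)
        for (t0, aoi), (t1, _next_aoi) in zip(samples, samples[1:]):
            totals[aoi] += t1 - t0       # attribute the interval to the current AOI
        grand = sum(totals.values()) or 1.0
        return {aoi: dur / grand for aoi, dur in totals.items()}

    # Hypothetical labels for the virtual patient versus task-related targets.
    gaze = [(0.0, "virtual_human"), (0.3, "virtual_human"),
            (0.6, "patient_monitor"), (1.1, "task_object"),
            (1.6, "virtual_human"), (2.0, "virtual_human")]
    print(dwell_proportions(gaze))

Comparing such proportions across the three animation conditions, and across calm versus critical phases of the scenario, is the kind of contrast the results above describe.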

Effects of Stereoscopic Viewing and Haptic Feedback, Sensory-Motor Congruence and Calibration on Near-Field Fine Motor Perception-Action Coordination in Virtual Reality.

D. Brickler, M. Volonte, J. W. Bertrand, A.T. Duchowski, S. V. Babu
2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 28-37). IEEE
Abstract: We present an empirical evaluation of how stereoscopic viewing and haptic feedback differentially affect fine motor perception-action coordination in a pick-and-place task in Virtual Reality (VR). The factors considered were stereoscopic viewing, haptic feedback, sensory-motor congruence and mismatch, and calibration on perception-action coordination in near-field fine motor task performance in VR. Quantitative measures of placement error, distance, collisions, and time to complete trials were recorded and analyzed. Overall, we found that participants' manual dexterous task performance was enhanced in the presence of both stereoscopic viewing and haptic feedback. However, we found that time to complete the task was greatly enhanced by the presence of haptic feedback, and that economy and efficiency of movement of the end effector as well as the manipulated object were enhanced by the presence of both haptic feedback and stereoscopic viewing. In contrast, the number of collisions and placement accuracy were greatly enhanced by the presence of stereoscopic viewing in near-field fine motor perception-action coordination. Our research additionally shows that mismatch in sensory-motor stimuli can detrimentally affect the number of collisions and the efficiency of end effector and object movements in near-field fine motor activities, and that these can be further negatively affected by the absence of haptic feedback and stereoscopic viewing. In spite of reduced cue situations in VR, and the absence or presence of stereoscopic viewing and haptic feedback, we found that participants tend to calibrate or adapt their perception-action coordination rapidly within a set of at least 5 trials.
[pdf]

Effects of a Virtual Human Appearance Fidelity Continuum on Visual Attention in Virtual Reality.

M. Volonte, A.T. Duchowski, S. V. Babu
IVA '19 Proceedings of the 19th ACM International Conference on Intelligent Virtual Agents, pages 141-147
Abstract: In this contribution we studied how different rendering styles of a virtual human impacted users' visual attention in an interactive medical training simulator. In a mixed design experiment, 78 participants interacted with a virtual human whose rendering represented a sample from the non-photorealistic (NPR) to photorealistic (PR) rendering continuum. We presented five rendering style scenarios, namely low fidelity All Pencil Shaded (APS), low to mid fidelity Pencil Shader on the virtual patient (VP) only (PS), mid fidelity All Cartoon Shaded (ACT), mid to high fidelity Cartoon Shader on the VP only (CT), and relatively high fidelity Human Like (HL) appearance, and compared how visual attention differed between groups of users. For this study, we employed an eye tracking system for collecting and analyzing users' gaze during interaction with the virtual human in a failure-to-rescue medical training simulation. Results suggest that users in conditions toward the non-realistic end of the fidelity continuum may spend more time on parts of the simulation that do not necessarily involve interaction with the virtual human. However, users preferred visually attending to the virtual human in the mid and high fidelity visual appearance conditions when engaging the virtual human in simulated social face-to-face dialogue, as compared to the other conditions.
[pdf]

Empirical Evaluation of the Interplay of Emotion and Visual Attention in Human-Virtual Human Interaction.

M. Volonte, R. G. Anaraky, B. P. Knijnenburg, A.T. Duchowski, S. V. Babu
SAP '19: ACM Symposium on Applied Perception (in press)
Abstract: We examined the effect of rendering style and the interplay between attention and emotion in users during interaction with a virtual patient in a medical training simulator. The virtual simulation was rendered to represent a sample from the photorealistic to non-photorealistic continuum, namely Near-Realistic, Cartoon, or Pencil-Shaded. In a mixed design study, we collected 45 participants' emotional responses and gaze behavior using surveys and an eye tracker while they interacted with a virtual patient who was medically deteriorating over time. We used a cross-lagged panel analysis of attention and emotion to understand their reciprocal relationship over time. We also performed a mediation analysis to compare the extent to which the virtual agent's appearance and his affective behavior impacted users' emotional and attentional responses. Results showed an interplay between participants' visual attention and emotion over time, and also showed that attention was a stronger variable than emotion during the interaction with the virtual human.
[pdf]
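
The cross-lagged panel analysis named in this abstract has a fairly standard two-wave form: each later measurement is regressed on both earlier measurements, so the cross paths (earlier emotion predicting later attention, and vice versa) can be compared. The sketch below shows that structure only; the column names, data layout, and use of plain OLS via statsmodels are assumptions for illustration, not the paper's exact model.

    # Minimal sketch of a two-wave cross-lagged panel analysis with OLS.
    # Variable names and the toy data frame are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    def cross_lagged(df):
        """df columns: attention_t1, emotion_t1, attention_t2, emotion_t2."""
        # Each wave-2 variable is predicted by both wave-1 variables, so the
        # cross paths (emotion_t1 -> attention_t2 and attention_t1 -> emotion_t2)
        # can be compared directly.
        att_model = smf.ols("attention_t2 ~ attention_t1 + emotion_t1", data=df).fit()
        emo_model = smf.ols("emotion_t2 ~ emotion_t1 + attention_t1", data=df).fit()
        return att_model, emo_model

    if __name__ == "__main__":
        # Toy input purely to show the expected shape; values are not study data.
        df = pd.DataFrame({
            "attention_t1": [0.42, 0.61, 0.50, 0.73, 0.38, 0.66],
            "emotion_t1":   [2.1, 3.0, 2.4, 3.5, 2.0, 3.1],
            "attention_t2": [0.48, 0.70, 0.44, 0.80, 0.35, 0.69],
            "emotion_t2":   [2.3, 2.8, 2.6, 3.2, 2.2, 3.0],
        })
        att_model, emo_model = cross_lagged(df)
        print(att_model.params, emo_model.params, sep="\n")

Comparing the two cross paths (attention_t1 in the emotion model versus emotion_t1 in the attention model), typically after standardizing the variables, is how one would read off which construct leads the other, in the spirit of the abstract's conclusion that attention was the stronger variable.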

Peer Reviewed Poster Abstracts

Towards Standardization of Medical Trials Using Virtual Experimenters.

Z. J. Inks, M. Volonte, S. Beadle, B. Horing, A. C. Robb, S. V. Babu
2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 585-586). IEEE.
Abstract: We describe a system for experiment standardization and medication distribution in medical trials using a virtual human. In our system, we employed a virtual experimenter that explains the experiment, medication, procedure, and risks involved through a large screen display. The participant is able to interact with and ask questions of the virtual experimenter through a touch screen interface on an additional monitor display. During the interaction, the virtual experimenter presents the participant with a pill. The pill is physically distributed by a custom-made Arduino-based dispenser. We conducted an initial user evaluation of the system using a placebo response protocol and a perceived pain scale. In the study, participants submerged their hand in a hot water bath before and after interacting with the system and reported their perceived pain response. The system used either a virtual human or a text interface, which either dispensed a pill described as an analgesic or did not provide any medication. Through this system, we propose the potential use of virtual humans as a method to provide a consistent and standardized interaction between a participant and experimenter, while maintaining the benefits of social interaction in medication trials.
[pdf]

Towards an Immersive Driving Simulator to Study Factors Related to Cybersickness.

R. Venkatakrishnan, M. Volonte, A. Bhargava, H. Solini, R. Venkatakrishnan, A. C. Robb, S. V. Babu, K. M. Lucaites, C. Pagano
2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)
Abstract: The commercialization of Virtual Reality (VR) devices has made it easier for everyday users to experience VR from the comfort of their living rooms. This recent uptake in VR has also increased reported incidents of cybersickness. Cybersickness refers to the discomfort experienced by an individual while experiencing virtual environments. The symptoms are similar to those of motion sickness but are more disorienting in nature, resulting in dizziness, blurred vision, etc. Cybersickness is currently one of the biggest hurdles to the widespread adoption of VR, and it is therefore critical to explore the factors that influence its onset. Towards this end, we present a proof-of-concept simulation to study cybersickness in highly realistic immersive virtual environments.
[pdf]