

VRScout: Towards Real-Time, Autonomous Testing of Virtual Reality Games

Wu, Yurun, Sun, Yousong, Wunsche, Burkhard, Wang, Jia, Wen, Elliott

arXiv.org Artificial Intelligence

Abstract--Virtual Reality (VR) has rapidly become a mainstream platform for gaming and interactive experiences, yet ensuring the quality, safety, and appropriateness of VR content remains a pressing challenge. Traditional human-based quality assurance is labor-intensive and cannot scale with the industry's rapid growth. While automated testing has been applied to traditional 2D and 3D games, extending it to VR introduces unique difficulties due to high-dimensional sensory inputs and strict real-time performance requirements. VRScout learns from human demonstrations using an enhanced Action Chunking Transformer that predicts multi-step action sequences. This enables our agent to capture higher-level strategies and generalize across diverse environments. To balance responsiveness and precision, we introduce a dynamically adjustable sliding horizon that adapts the agent's temporal context at runtime. We evaluate VRScout on commercial VR titles and show that it achieves expert-level performance with only limited training data, while maintaining real-time inference at 60 FPS on consumer-grade hardware. These results position VRScout as a practical and scalable framework for automated VR game testing, with direct applications in both quality assurance and safety auditing.
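The dynamically adjustable sliding horizon described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the chunk size, the dummy policy, the error signal, and the one-step adjustment rule are all assumptions made for the sketch.

```python
# Hedged sketch of action chunking with a dynamically adjustable
# sliding horizon. The policy predicts a chunk of CHUNK future
# actions; at runtime only the first `horizon` of them are executed
# before re-planning, shrinking the horizon when predictions drift
# and growing it when they stay stable.

CHUNK = 8  # actions predicted per forward pass (hypothetical value)

def predict_chunk(obs):
    # Stand-in for the Action Chunking Transformer: a dummy policy
    # that derives each action in the chunk from the observation.
    return [obs + i for i in range(CHUNK)]

def adjust_horizon(horizon, prediction_error, lo=1, hi=CHUNK):
    # Shrink the executed horizon when recent predictions were poor
    # (forcing more frequent re-planning), grow it when accurate.
    if prediction_error > 0.5:
        return max(lo, horizon - 1)
    return min(hi, horizon + 1)

def run_episode(observations, init_horizon=4):
    executed, horizon, t = [], init_horizon, 0
    while t < len(observations):
        chunk = predict_chunk(observations[t])
        n = min(horizon, len(observations) - t)
        executed.extend(chunk[:n])
        t += n
        # Toy error signal: pretend later steps in a chunk drift more.
        horizon = adjust_horizon(horizon, prediction_error=n / CHUNK)
    return executed, horizon

actions, final_h = run_episode(list(range(12)))
print(len(actions), final_h)  # 12 actions executed across 3 re-plans
```

The key trade-off the sketch illustrates: a longer executed horizon means fewer (cheaper) forward passes per second, while a shorter one means faster reaction to scene changes.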


Exploring Emotions in Multi-componential Space using Interactive VR Games

Somarathna, Rukshani, Mohammadi, Gelareh

arXiv.org Artificial Intelligence

Emotion understanding is a complex process that involves multiple components. The ability to recognise emotions not only leads to new context-awareness methods but also enhances the effectiveness of system interaction by perceiving and expressing emotions. Despite the attention given to discrete and dimensional models, neuroscientific evidence supports the view that emotions are complex and multi-faceted. One framework that resonates well with such findings is the Component Process Model (CPM), a theory that captures the complexity of emotions with five interconnected components: appraisal, expression, motivation, physiology and feeling. However, the relationship between the CPM and discrete emotions has not yet been fully explored. Therefore, to better understand the processes underlying emotions, we operationalised a data-driven approach using interactive Virtual Reality (VR) games and collected multimodal measures (self-reports, physiological and facial signals) from 39 participants. We used Machine Learning (ML) methods to identify the unique contributions of each component to emotion differentiation. Our results showed the role of different components in emotion differentiation, with the model including all components making the most significant contribution. Moreover, we found that at least five dimensions are needed to represent the variation of emotions in our dataset. These findings also have implications for using VR environments in emotion research and highlight the role of physiological signals in emotion recognition within such environments.


VR.net: A Real-world Dataset for Virtual Reality Motion Sickness Research

Wen, Elliott, Gupta, Chitralekha, Sasikumar, Prasanth, Billinghurst, Mark, Wilmott, James, Skow, Emily, Dey, Arindam, Nanayakkara, Suranga

arXiv.org Artificial Intelligence

Researchers have used machine learning approaches to identify motion sickness in VR experiences. These approaches demand an accurately-labeled, real-world, and diverse dataset for high accuracy and generalizability. As a starting point to address this need, we introduce `VR.net', a dataset offering approximately 12 hours of gameplay video from ten real-world games across ten diverse genres. For each video frame, a rich set of motion sickness-related labels, such as camera/object movement, depth field, and motion flow, are accurately assigned. Building such a dataset is challenging since manual labeling would require an infeasible amount of time. Instead, we utilize a tool to automatically and precisely extract ground truth data from 3D engines' rendering pipelines without accessing VR games' source code. We illustrate the utility of VR.net through several applications, such as risk factor detection and sickness level prediction. We continuously expand VR.net and envision its next version offering 10X more data than the current form. We believe that the scale, accuracy, and diversity of VR.net can offer unparalleled opportunities for VR motion sickness research and beyond.
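The risk-factor-detection application the abstract mentions could, in its simplest form, combine per-frame motion labels into a score. The following is a hedged sketch only: the field names, weights, and threshold are illustrative assumptions and not VR.net's actual schema or method.

```python
# Hedged sketch: flag frames as potential motion-sickness risk
# factors by combining two motion cues of the kind the dataset
# labels per frame (camera rotation speed and mean optical-flow
# magnitude). Field names, weights, and the 0.5 threshold are
# hypothetical.

def risk_score(frame):
    # Weighted sum of normalized camera rotation and flow magnitude.
    return 0.6 * frame["cam_rot_deg_s"] / 90.0 + 0.4 * frame["flow_mag"]

frames = [
    {"cam_rot_deg_s": 10.0, "flow_mag": 0.1},   # gentle motion
    {"cam_rot_deg_s": 120.0, "flow_mag": 0.9},  # fast spin, high flow
    {"cam_rot_deg_s": 45.0, "flow_mag": 0.3},   # moderate motion
]

risky = [i for i, f in enumerate(frames) if risk_score(f) > 0.5]
print(risky)  # only the fast-spin frame exceeds the threshold
```

A real detector would presumably learn such weights from the dataset's sickness labels rather than hand-tune them.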


Virtual reality games can be used as a tool in personnel assessment -- ScienceDaily

#artificialintelligence

Virtual reality gamers who finished a VR game faster than their fellow gamers also had higher levels of general intelligence and processing capacity. This was the result of a study conducted by the University of Cologne, the University of Liechtenstein and Vorarlberg University of Applied Sciences. The results also indicate that virtual reality games can be useful supplementary human resource management tools in companies for predicting the job performance of an applicant. The study "Intelligence at play: game-based assessment using a virtual-reality application" by Markus Weinmann of the University of Cologne and his fellow scientists was published in the journal Virtual Reality. Several studies have already shown that video games may indicate or even help to develop intellectual and cognitive abilities.


Nvidia's DLSS AI-upscaling SDK now supports VR

#artificialintelligence

It's also now listed in the DLSS Unreal Engine changelog. Deep Learning Super Sampling (DLSS) uses the Tensor cores in GeForce RTX graphics cards to power a detail-enhancing neural network. Nvidia claims the result is superior to native 4K rendering. "AI upscaling" algorithms have become popular in the last few years, with some websites even letting users upload any image on their PC or phone to be upscaled. Given enough training data, they can produce a significantly more detailed output than traditional upscaling, though the algorithm is technically only "hallucinating" what it expects the missing detail should look like.


Apple buys a VR company that put real faces on virtual avatars

Engadget

Apple has acquired Spaces, a VR company that offered both VR experiences and, after the pandemic hit, a way of bringing your virtual avatar into Zoom meetings. Protocol quotes an unnamed Apple spokesperson offering the usual boilerplate confirmation, saying that it had nothing else to add. Spaces is, or was, a company that started out offering free-roam VR experiences, similar to what The Void offered in those heady pre-pandemic times. It was spun out of DreamWorks, and its first project was a Terminator-themed VR game for up to four players. But the most interesting thing about it was the facial tracking it used to try and make its VR games more immersive than the competition.


High-speed 5G network seen as ready to give big boost to online gaming

The Japan Times

CHIBA – At this year's Tokyo Game Show, the big draw was next-generation 5G networking -- setting pulses racing with the prospect of a radically more immersive gaming experience. Offering data transmission speeds around 100 times faster than 4G, 5G is expected to enable more seamless imagery, with lower latency, more vivid images and sharper motion. Industry experts say it will dramatically improve the quality of augmented and virtual reality games. "It was very smooth, responsive and consistent," said Omar Alshiji, a 23-year-old game designer from Bahrain, after trying out the fighting game Tekken at the NTT Docomo Inc. booth. The major mobile carrier installed 5G base stations at its booth this year, making the high-speed network available at the show. The four-day industry event, held in the city of Chiba, ended Sunday.


How Do Players’ Eye Movements Relate to Their Excitement in a VR Adaptive Game?

Abdessalem, Hamdi Ben (University of Montreal) | Chaouachi, Maher (University of Montreal) | Boukadida, Marwa (University of Montreal) | Frasson, Claude (University of Montreal)

AAAI Conferences

Interaction with games can induce emotional reactions which could have an impact on players’ game experience and performance. Physiological sensors such as EEG and eye tracking represent an important means of tracking these emotional reactions. In addition, virtual reality isolates the players from the external environment, strengthening the emotional measures. In this paper, we present an exploratory study of the use of eye tracking for game adaptation according to the players’ excitement. Results showed that there exists a relationship between the modification of the game’s speed and the EEG excitement index, as well as a correlation between eye movement and excitement. These results suggest that eye tracking could be a valid support for, or replacement of, EEG data in game adaptation.
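The correlation the abstract reports between eye movement and excitement is the standard Pearson kind; the following is a self-contained sketch of that computation, with made-up numbers standing in for the study's data (the variable names and values are hypothetical, not taken from the paper).

```python
# Hedged sketch: Pearson correlation between a per-window
# eye-movement rate and an EEG excitement index. All data below
# are illustrative, not the study's measurements.
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

eye_rate = [0.2, 0.4, 0.5, 0.7, 0.9]       # saccades/s (hypothetical)
excitement = [0.1, 0.3, 0.45, 0.8, 0.95]   # EEG index (hypothetical)
r = pearson(eye_rate, excitement)
print(round(r, 3))  # close to 1.0 for these monotone toy series
```

A value of r near 1 (or -1) would support using the cheaper eye-tracking signal in place of EEG, which is the substitution the study proposes.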


The future of driving: Audi puts VR games in every car as autonomous vehicles are unveiled at CES

Daily Mail - Science & tech

At the Consumer Electronics Show in Las Vegas, attendees were given a firsthand look at what the future of driving will look like. Everything from autonomous 'people-movers' to a VR experience that lets users battle Iron Man from the backseat of a car was on display at the world's largest tech trade show. Major companies including BMW, Honda and Mercedes had concept cars to show off, many complete with facial recognition, flashy OLED touchscreens and futuristic pod-like designs - you'd be forgiven if you thought many of them didn't even look like cars. Audi and Disney grabbed attendees' attention with their debut of a new 'Holoride' system that aims to bring virtual reality to every car, including Ford, Mercedes and other models.