ISA15 - Influence of a HMD on UX and performance in a VR-based sports application
1. Influence of a HMD on UX and performance in a VR-based sports application
Pedro Kayatt, Ricardo Nakamura
Escola Politécnica da Universidade de São Paulo
2. Introduction
• The concept of Head-Mounted Displays (HMDs) was introduced in the
1960s
• Unfortunately, they were not widely adopted for virtual environments
• We believe that this is due mainly to discomfort.
Sutherland, I. E. (1968). "A head-mounted three dimensional display". Proceedings of AFIPS 68, pp. 757-764
The Eyephone from VPL Research
3. Motivation
• Due to the smartphone “resolution race”, we have seen some exciting
launches in the HMD area.
• Low-cost devices with fast response times and high resolution.
4. Our Question
• The main objective of our work is to evaluate the impact of a
current-generation HMD on the experience and performance of users of
a virtual reality application for ball sports.
• The results may then be used to discuss the effect of technology
improvements in these devices on their applicability.
• As HMDs are not widely available compared to displays and
projectors, we also expected that a “novelty factor” might affect
users’ experience, so our experiment was designed to determine
whether this would happen.
5. Related Work
• S. H. Creem-Regehr, P. Willemsen, A. A. Gooch, and W. B.
Thompson. The influence of restricted viewing conditions on
egocentric distance perception: Implications for real and
virtual environments. Perception, 34(2):191–204, 2005.
• S. Davis, K. Nesbitt, and E. Nalivaiko. Comparing the onset of
cybersickness using the oculus rift and two virtual roller
coasters. In Proceedings of the 11th
• H. C. Miles, S. R. Pop, S. J. Watt, G. P. Lawrence, and N. W.
John. A review of virtual environments for training in ball
sports. Computers & Graphics, 36(6):714–726, 2012.
6. Methodology
• To perform the experiment, we created a game called RiftSoccer,
based on the rules of soccer.
• The experiment involved ten test subjects divided into a test group
and a control group. The five members of the test group performed
the test wearing the Oculus Rift DK2 HMD, while the five members of
the control group performed the test using a regular 19” LED display.
All the users were undergraduate students with ages ranging from 23
to 29 years.
7. RiftSoccer
• The game is viewed in first person in a virtual
environment that simulates a soccer field. A ball is
placed at the penalty mark and, after an
arbitrary delay, launched at a random angle
towards the goal. At this point, the player must
react with a leaping movement to catch the ball.
• Two versions of the same game were built: one using
the stereoscopic camera and reading the HMD’s
sensors, the other with a conventional camera.
• No HUD or interface elements were shown, leaving
the subject’s view completely free.
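The kick mechanic described above can be sketched as follows. This is a hypothetical illustration, not the authors' code: the function name, the delay bounds, and the angle range are all our own assumptions.

```python
import random
import time

# Hypothetical sketch of one RiftSoccer penalty kick (names and
# numeric ranges are our assumptions, not the original implementation).
GOAL_HALF_WIDTH_DEG = 30.0  # assumed angular span of the goal mouth


def run_kick(delay=None):
    """Wait an arbitrary time, then launch the ball at a random angle.

    Returns the launch angle and the side the keeper must leap to.
    """
    if delay is None:
        delay = random.uniform(0.5, 2.0)  # "arbitrary time" before launch
    time.sleep(delay)
    angle = random.uniform(-GOAL_HALF_WIDTH_DEG, GOAL_HALF_WIDTH_DEG)
    correct_side = "left" if angle < 0 else "right"
    return angle, correct_side
```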
8. The test
• The test subject was presented with the rules of the game, and three
trial kicks were performed to ensure that the game controls were
understood and could be operated correctly. After that, the subject
tried to catch 10 kicks.
• The game software automatically recorded two performance
parameters: the reaction time and whether the test subject had
selected the correct direction to catch the incoming ball.
• After finishing the game session, the test subject was asked to fill
out the questionnaire.
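The automatic recording of the two performance parameters could look like the minimal sketch below. This is our own illustration; the class and method names are assumptions, not the study's software.

```python
import time


class KickLogger:
    """Hypothetical sketch of the automatic logging: reaction time is
    measured from ball launch to the subject's leap input, together
    with whether the chosen direction matched the ball's side."""

    def __init__(self):
        self.records = []

    def ball_launched(self):
        # High-resolution monotonic timestamp at the moment of launch.
        self._t0 = time.perf_counter()

    def subject_leaped(self, chosen_side, correct_side):
        reaction_s = time.perf_counter() - self._t0
        self.records.append(
            {"reaction_s": reaction_s, "correct": chosen_side == correct_side}
        )

    def summary(self):
        """Mean reaction time and number of correctly defended kicks."""
        n = len(self.records)
        caught = sum(r["correct"] for r in self.records)
        mean_rt = sum(r["reaction_s"] for r in self.records) / n
        return mean_rt, caught
```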
9. Questionnaire
• Q1. How much did you feel immersed* in the application?
(*According to your own definition of immersion).
• 1. Not immersed; 2. a little immersed; 3. moderately immersed; 4. quite
immersed; 5. very immersed.
• Q2. How would you rate your performance**? (**Reaction time).
• 1. Awful; 2. bad; 3. average; 4. good; 5. great.
• Q3. How many penalty kicks do you think you defended?
• 1. None; 2. a few; 3. about half; 4. many; 5. most.
• Q4. Did you have fun?
• 1. No; 2. a little; 3. average; 4. much; 5. a lot.
11. Results (Quantitative)
• The average response time for all test subjects of the test group was 0.33s, while
the average response time for the subjects of the control group was 0.37s.
• Using a single-factor ANOVA, we could establish that the control and
test group means were significantly different (F = 6.33 > Fcrit = 5.31).
• On the other hand, there was no impact of the HMD in the number of defenses –
that is, the number of times the task was performed correctly (over 10 kicks, an
average of 5.8 kicks were caught by both groups).
[Tables: ANOVA single-factor summaries for the two groups]
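The F statistic reported above comes from a single-factor ANOVA over the two groups' reaction times. A pure-Python sketch of that computation follows; the function names are ours, and any sample values passed to it are illustrative, not the study's raw data.

```python
def _mean(xs):
    return sum(xs) / len(xs)


def one_way_anova_f(group_a, group_b):
    """F statistic for a single-factor ANOVA with two groups.

    F = (SS_between / df_between) / (SS_within / df_within),
    with df_between = k - 1 and df_within = N - k for k groups.
    """
    groups = (group_a, group_b)
    all_vals = group_a + group_b
    grand = _mean(all_vals)
    # Between-group variation: each group mean vs. the grand mean.
    ss_between = sum(len(g) * (_mean(g) - grand) ** 2 for g in groups)
    # Within-group variation: each sample vs. its own group mean.
    ss_within = sum((x - _mean(g)) ** 2 for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)
```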
12. Results (Qualitative)
[Charts: questionnaire responses for the test group and for the control group]
• In general, qualitative assessments were higher for the test group
than the control group, markedly so in question 1, regarding
immersion, and question 4, regarding fun.
• Part of the difference might be explained by the novelty of the
experience of use of the HMD.
13. Conclusion
• The qualitative results, combined with the quantitative evaluation
that indicates a gain of performance in reaction time for the test
group, allow us to conclude that the use of new generation HMDs is
advantageous for virtual environments for ball sports training.
• We believe that the results of this experiment motivate further
research into the use of new generation HMDs for other sports
applications of virtual reality. New experiments may also be
performed in other application domains in which previous studies
had found that HMDs were still unsuitable for actual use due to user
discomfort or cybersickness.
14. Future Works
• Introduction of Natural User Interfaces (NUIs, i.e. motion controllers),
creating a four-condition (2 × 2) test:
• Using NUI and HMD
• Not using NUI and using HMD
• Not using NUI and not using HMD
• Using NUI and not using HMD
• More subjects and other ball sports.
15. Remarks
• R. Yao, T. Heath, A. Davies, T. Forsyth, N. Mitchell, and P. Hoberman.
Oculus VR Best Practices Guide. Oculus VR, 2015.
• Do not break the VR “guidelines”: doing so leads to cybersickness
and can produce misleading conclusions.
• Great changes have occurred over the last two years, and upcoming
launches could show similar improvements. If the technology is
improving this fast, we must evaluate it just as fast to keep our
knowledge current.
16. ACKNOWLEDGMENTS
• CAPES – Brazilian Federal Agency for Support and Evaluation of
Graduate Education within the Ministry of Education of Brazil.
• Interlab – Escola Politécnica da Universidade de São Paulo