LECTURE 2: VR
TECHNOLOGY
COMP 4010 - Virtual Reality
Semester 5 - 2017
Mark Billinghurst, Bruce Thomas
University of South Australia
Overview
•Presence in VR
•Perception and VR
•Human Perception
•VR Technology
PRESENCE
Presence ..
“The subjective experience of being in one place or
environment even when physically situated in another”
Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A presence
questionnaire. Presence: Teleoperators and virtual environments, 7(3), 225-240.
Immersion vs. Presence
• Immersion: describes the extent to which technology is
capable of delivering a vivid illusion of reality to the senses
of a human participant.
• Presence: a state of consciousness, the (psychological)
sense of being in the virtual environment.
• So Immersion, defined in technical terms, is capable of
producing a sensation of Presence
• Goal of VR: Create a high degree of Presence
• Make people believe they are really in Virtual Environment
Slater, M., & Wilbur, S. (1997). A framework for immersive virtual environments (FIVE):
Speculations on the role of presence in virtual environments. Presence: Teleoperators and
virtual environments, 6(6), 603-616.
How to Create Strong Presence?
• Use Multiple Dimensions of Presence
• Create rich multi-sensory VR experiences
• Include social actors/agents that interact with user
• Have environment respond to user
• What Influences Presence
• Vividness – ability to provide rich experience (Steuer 1992)
• Using Virtual Body – user can see themselves (Slater 1993)
• Internal factors – individual user differences (Sadowski 2002)
• Interactivity – how much users can interact (Steuer 1992)
• Sensory, Realism factors (Witmer 1998)
Example: UNC Pit Room
• Key Features
• Training room and pit room
• Physical walking
• Fast, accurate, room scale tracking
• Haptic feedback – feel edge of pit, walls
• Strong visual and 3D audio cues
• Task
• Carry object across pit
• Walk across or walk around
• Dropping virtual balls at targets in pit
• http://wwwx.cs.unc.edu/Research/eve/walk_exp/
Typical Subject Behaviour
• Note – from another pit experiment
• https://www.youtube.com/watch?v=VVAO0DkoD-8
Benefits of High Presence
• Leads to greater engagement, excitement and satisfaction
• Increased reaction to actions in VR
• People more likely to behave like in the real world
• E.g. people scared of heights in real world will be scared in VR
• More natural communication (Social Presence)
• Use same cues as face to face conversation
• Note: The relationship between Presence and Performance is
unclear – still an active area of research
Measuring Presence
• Presence is very subjective so there is a lot of debate
among researchers about how to measure it
• Subjective Measures
• Self report questionnaire
• University College London Questionnaire (Slater 1999)
• Witmer and Singer Presence Questionnaire (Witmer 1998)
• ITC Sense Of Presence Inventory (Lessiter 2000)
• Continuous measure
• Person moves slider bar in VE depending on Presence felt
• Objective Measures
• Behavioural
• reflex/flinch measure, startle response
• Physiological measures
• change in heart rate, skin conductance, skin temperature
Presence Slider
Relevant Papers
• Slater, M., & Usoh, M. (1993). Representation systems, perceptual positions, and presence
in immersive virtual environments. Presence, 2:221–233.
• Slater, M. (1999). Measuring presence: A response to the Witmer and Singer Presence
Questionnaire. Presence, 8:560–565.
• Steuer, J. (1992). Defining virtual reality: Dimensions determining telepresence. Journal of
Communication, 42(4):72–93.
• Sadowski, W. J. and Stanney, K. M. (2002) Measuring and Managing Presence in Virtual
Environments. In: Handbook of Virtual Environments: Design, implementation, and
applications.http://vehand.engr.ucf.edu/handbook/
• Schuemie, M. J., Van Der Straaten, P., Krijn, M., & Van Der Mast, C. A. (2001). Research on
presence in virtual reality: A survey. CyberPsychology & Behavior, 4(2), 183-201.
• Lee, K. M. (2004). Presence, explicated. Communication theory, 14(1), 27-50.
• Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A
presence questionnaire. Presence: Teleoperators and virtual environments, 7(3), 225-240.
• Lessiter, J., Freeman, J., Keogh, E., & Davidoff, J. (2000). Development of a new cross-media
presence questionnaire: The ITC Sense of Presence Inventory. Paper at the Presence 2000
Workshop, March 27–28, Delft.
PERCEPTION AND VR
What is Reality?
How do We Perceive Reality?
• We understand the world through
our senses:
• Sight, Hearing, Touch, Taste, Smell
(and others..)
• Two basic processes:
• Sensation – Gathering information
• Perception – Interpreting information
Simple Sensing/Perception Model
Goal of Virtual Reality
“.. to make it feel like you’re actually in
a place that you are not.”
Palmer Luckey
Co-founder, Oculus
Creating the Illusion of Reality
• Fooling human perception by using
technology to generate artificial sensations
• Computer generated sights, sounds, smell, etc
Reality vs. Virtual Reality
• In a VR system there are input and output devices
between human perception and action
Example: Birdly - http://www.somniacs.co/
• Create illusion of flying like a bird
• Multisensory VR experience
• Visual, audio, wind, haptic
Birdly Demo
• https://www.youtube.com/watch?v=gHE6H62GHoM
“Virtual Reality is a synthetic sensory experience which may
one day be indistinguishable from the real physical world.”
-Roy Kalawsky (1993)
Today
Tomorrow
HUMAN PERCEPTION
Motivation
• Understand: In order to create a strong sense of Presence
we need to understand the Human Perception system
• Stimulate: We need to be able to use technology to provide
real world sensory inputs, and create the VR illusion
VR Hardware Human Senses
Senses
• How an organism obtains information for perception:
• Sensation part of Somatic Division of Peripheral Nervous System
• Integration and perception requires the Central Nervous System
• Five major senses:
• Sight (Ophthalmoception)
• Hearing (Audioception)
• Taste (Gustaoception)
• Smell (Olfacoception)
• Touch (Tactioception)
Other Lesser-Known Senses..
• Proprioception = sense of body position
• what is your body doing right now
• Equilibrium = balance
• Acceleration
• Nociception = sense of pain
• Temperature
• Satiety (the quality or state of being fed or gratified to or beyond capacity)
• Thirst
• Micturition
• Amount of CO2 and Na in blood
Relative Importance of Each Sense
• Percentage of neurons in
brain devoted to each sense
• Sight – 30%
• Touch – 8%
• Hearing – 2%
• Smell - < 1%
• Over 60% of brain involved
with vision in some way
VR System Overview
• Simulate output
• Map output to devices
• Use devices to
stimulate the senses
Visual Simulation pipeline: 3D Graphics → HMD → Vision System → Brain
Example: Visual Simulation
Human-Machine Interface
Sight
The Human Visual System
• Purpose is to convert visual input to signals in the brain
The Human Eye
• Light passes through cornea and lens onto retina
• Photoreceptors in retina convert light into electrochemical signals
Photoreceptors – Rods and Cones
• Retina photoreceptors come in two types, Rods and Cones
Rods vs. Cones
• RODS
• 125 million cells in retina
• Concentrated on
periphery of retina
• No color detection
• Most sensitive to light
• Scotopic (night) vision
• Provide peripheral vision,
motion detection
• CONES
• 4.5-6 million in retina
• Responsible for color
vision
• Sensitive to red, blue,
green light
• Work best in more
intense light
Colour Perception
• Humans only perceive small part of electromagnetic spectrum
Horizontal and Vertical FOV
• Humans can see ~135° vertical (60° above, 75° below)
• See up to ~210° horizontal FOV, ~115° stereo overlap
• Colour/stereo in centre, Black & White/mono in periphery
Types of Visible Perception Possible
• As we move further from the fovea, vision becomes more limited
Dynamic Range
• Rods respond to low Luminance light, Cones to bright light
Comparing to Displays
• Human vision has far higher dynamic range than any
available display technology
• 40 f-stops, cf 17 f-stops for HDR display
Vergence + Accommodation
• Vergence: the eyes rotate toward each other to converge on an object
• Accommodation: the eye's lens changes shape to focus at the object's distance
Vergence/Accommodation Demo
• https://www.youtube.com/watch?v=p_xLO7yxgOk
Vergence-Accommodation Conflict
• Looking at real objects, vergence and focal distance match
• In Virtual Reality, vergence and accommodation can mismatch
• Eyes converge on the virtual object, but accommodate to the fixed distance of the HMD screen
Visual Acuity
Visual Acuity Test Targets
• Ability to resolve details
• Several types of visual acuity
• detection, separation, etc
• Normal eyesight can see a 50 cent coin at 80m
• Corresponds to 1 arc min (1/60th of a degree)
• Max acuity = 0.4 arc min
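As a quick sanity check on the coin figure (my arithmetic, not from the slide), the visual angle subtended by an object of width s at distance d is:

\[
\theta = 2\arctan\!\left(\frac{s}{2d}\right)
\]

For a ~31.65 mm Australian 50 cent coin at 80 m this gives θ ≈ 3.96 × 10⁻⁴ rad ≈ 1.4 arc min, i.e. on the order of the 1 arc min acuity figure quoted above.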
Resolution of the Eye
• Decreases away from the fovea
• Maximum resolution of 1 arcmin – a spot of ~6×10⁻⁶ m on the retina
Stereo Perception/Stereopsis
• Eyes separated by IPD
• Inter pupillary distance
• 5 – 7.5 cm (average 6.5 cm)
• Each eye sees diff. image
• Separated by image parallax
• Images fused to create 3D
stereo view
Depth Perception
• The visual system uses a range of different Stereoscopic
and Monocular cues for depth perception
Stereoscopic cues:
• eye convergence angle
• disparity between left and right images
• diplopia
Monocular cues:
• eye accommodation
• perspective
• atmospheric artifacts (fog)
• relative sizes
• image blur
• occlusion
• motion parallax
• shadows
• texture
Parallax can be more important for depth perception!
Stereoscopy is important for size and distance evaluation
More Depth Cues
Example: Perspective Cues
More Examples
Occlusion
Texture Gradient, Shadows
Depth Perception Distances
Fooling Depth Perception
• https://www.youtube.com/watch?v=p-eZcHPp7Go
Properties of the Human Visual System
• visual acuity: 20/20 is ~1 arc min
• field of view: ~200° monocular, ~120° binocular, ~135° vertical
• resolution of eye: ~576 megapixels
• temporal resolution: ~60 Hz (depends on contrast, luminance)
• dynamic range: instantaneous 6.5 f-stops, adapt to 46.5 f-stops
• colour: everything in CIE xy diagram
• depth cues in 3D displays: vergence, focus, (dis)comfort
• accommodation range: ~8cm to ∞, degrades with age
The Perfect Retina Display
• A HMD capable of creating images indistinguishable from
reality would need to match the properties of the eye:
• FOV: 200-220° x 135° needed (both eyes)
• 120° stereo overlap
• Acuity: ~0.4 arc min (1 pixel/0.4 arc min)
• Pixel Resolution: ~30,000 x 20,000 pixels
• 200*60°/0.4 = 30,000, 135*60°/0.4 = 20,250
• Pixels/inch: > 2190 PPI @ 100mm (depends on distance to screen)
• Update rate: 60 Hz
• The biggest challenge: bandwidth
• compress and transmit huge amount of data
• drive and operate display pixels
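A minimal sketch of the resolution arithmetic above, using the FOV and acuity figures from this slide:

```python
# Pixels needed so that one pixel subtends the eye's acuity limit:
# convert the FOV to arc minutes, then divide by arc minutes per pixel.

def required_pixels(fov_deg, acuity_arcmin_per_pixel=0.4):
    return fov_deg * 60 / acuity_arcmin_per_pixel

print(required_pixels(200))  # horizontal: 30000.0
print(required_pixels(135))  # vertical:   20250.0
```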
Comparison between Eyes and HMD
Human Eyes vs. HTC Vive:
• FOV: 200° x 135° vs. 110° x 110°
• Stereo Overlap: 120° vs. 110°
• Resolution: 30,000 x 20,000 vs. 2,160 x 1,200
• Pixels/inch: >2190 (at 100 mm to screen) vs. 456
• Update rate: 60 Hz vs. 90 Hz
See http://doc-ok.org/?p=1414
http://www.clarkvision.com/articles/eye-resolution.html
http://wolfcrow.com/blog/notes-by-dr-optoglass-the-resolution-of-the-human-eye/
Hearing
Anatomy of the Ear
How the Ear Works
• https://www.youtube.com/watch?v=pCCcFDoyBxM
Sound Frequency and Amplitude
• Frequency determines the pitch of the sound
• Amplitude relates to intensity of the sound
• Loudness is a subjective measure of intensity
High frequency =
short period
Low frequency =
long period
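A minimal NumPy sketch of these two parameters; frequency sets the period (pitch), amplitude sets the intensity:

```python
import numpy as np

def tone(freq_hz, amplitude, duration_s=1.0, sample_rate=44100):
    """Sine tone: freq_hz controls pitch, amplitude controls intensity."""
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    return amplitude * np.sin(2 * np.pi * freq_hz * t)

low = tone(220, 0.5)    # low frequency = long period
high = tone(1760, 0.5)  # high frequency = short period
```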
Distance to Listener
• Relationship between sound intensity and distance to the
listener
Inverse-square law
• The intensity varies inversely with the square of the distance from the
source. So if the distance from the source is doubled (increased by a
factor of 2), then the intensity is quartered (decreased by a factor of 4).
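Written as an equation (standard acoustics, not from the slide), for a point source radiating acoustic power P:

\[
I = \frac{P}{4\pi r^2}, \qquad \frac{I_2}{I_1} = \left(\frac{r_1}{r_2}\right)^2
\]

so doubling the distance, r₂ = 2r₁, gives I₂ = I₁/4.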
Auditory Thresholds
• Humans hear frequencies from 20 – 22,000 Hz
• Most everyday sounds from 80 – 90 dB
Sound Localization
• Humans have two ears
• localize sound in space
• Sound can be localized
using 3 coordinates
• Azimuth, elevation,
distance
Sound Localization
• Azimuth Cues
• Difference in time of sound reaching two ears
• Interaural time difference (ITD)
• Difference in sound intensity reaching two ears
• Interaural level difference (ILD)
• Elevation Cues
• Monaural cues derived from the pinna (ear shape)
• Head related transfer function (HRTF)
• Range Cues
• Difference in sound relative to range from observer
• Head movements (otherwise ITD and ILD are same)
Sound Localization
• https://www.youtube.com/watch?v=FIU1bNSlbxk
Sound Localization (Azimuth Cues)
Interaural Time Difference
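A common closed-form approximation of the ITD is Woodworth's spherical-head model (not from the slides): with head radius a ≈ 8.75 cm, speed of sound c ≈ 343 m/s, and azimuth θ,

\[
\mathrm{ITD}(\theta) \approx \frac{a}{c}\left(\theta + \sin\theta\right)
\]

which peaks at roughly 0.66 ms for a source at θ = 90°.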
HRTF (Elevation Cue)
• Pinna and head shape affect frequency intensities
• Sound intensities measured with microphones in ear and
compared to intensities at sound source
• Difference is HRTF, gives clue as to sound source location
Accuracy of Sound Localization
• People can locate sound
• Most accurately in front of them
• 2-3° error in front of head
• Least accurately to sides and behind head
• Up to 20° error to side of head
• Largest errors occur above/below elevations and behind head
• Front/back confusion is an issue
• Up to 10% of sounds presented in the front are perceived coming
from behind and vice versa (more in headphones)
Butean, A., Bălan, O., Negoi, I., Moldoveanu, F., & Moldoveanu, A. (2015). Comparative research
on sound localization accuracy in the free-field and virtual auditory displays. In Conference
Proceedings of eLearning and Software for Education (eLSE) (No. 01, pp. 540-548).
Universitatea Nationala de Aparare Carol I.
Touch
Touch
• Mechanical/Temp/Pain stimuli transduced into Action
Potentials (AP)
• Transducing structures are specialized nerves:
• Mechanoreceptors: Detect pressure, vibrations & texture
• Thermoreceptors: Detect hot/cold
• Nociceptors: Detect pain
• Proprioceptors: Detect spatial awareness
• This triggers an AP which then travels to various
locations in the brain via the somatosensory nerves
Haptic Sensation
• Somatosensory System
• complex system of nerve cells that responds to changes to
the surface or internal state of the body
• Skin is the largest organ
• 1.3-1.7 square m in adults
• Tactile: Surface properties
• Receptors not evenly spread
• Most densely populated area is the tongue
• Kinesthetic: Muscles, Tendons, etc.
• Also known as proprioception
Somatosensory System
• Map of somatosensory
areas of the brain,
showing more area for
regions with more
receptors
Anatomy of the Skin
Cutaneous System
• Skin – heaviest organ in the body
• Epidermis outer layer, dead skin cells
• Dermis inner layer, with four kinds of mechanoreceptors
Mechanoreceptors
• Cells that respond to pressure, stretching, and vibration
• Slowly Adapting (SA), Rapidly Adapting (RA)
• Type I at surface – light discriminate touch
• Type II deep in dermis – heavy and continuous touch
Receptor | Type | Stimulus Frequency | Receptive Field | Detection Function
Merkel Discs | SA-I | 0 – 10 Hz | Small, well defined | Edges, intensity
Ruffini corpuscles | SA-II | 0 – 10 Hz | Large, indistinct | Static force, skin stretch
Meissner corpuscles | RA-I | 20 – 50 Hz | Small, well defined | Velocity, edges
Pacinian corpuscles | RA-II | 100 – 300 Hz | Large, indistinct | Acceleration, vibration
Spatial Resolution
• Sensitivity varies greatly
• Two-point discrimination
Body Site | Threshold Distance
Finger | 2-3 mm
Cheek | 6 mm
Nose | 7 mm
Palm | 10 mm
Forehead | 15 mm
Foot | 20 mm
Belly | 30 mm
Forearm | 35 mm
Upper Arm | 39 mm
Back | 39 mm
Shoulder | 41 mm
Thigh | 42 mm
Calf | 45 mm
http://faculty.washington.edu/chudler/chsense.html
Proprioception/Kinaesthesia
• Proprioception (joint position sense)
• Awareness of movement and positions of body parts
• Due to nerve endings and Pacinian and Ruffini corpuscles at joints
• Enables us to touch nose with eyes closed
• Joints closer to body more accurately sensed
• Users know their hand position to within ~8 cm without looking
• Kinaesthesia (joint movement sense)
• Sensing muscle contraction or stretching
• Cutaneous mechanoreceptors measuring skin stretching
• Helps with force sensation
Smell
Olfactory System
• Human olfactory system. 1: Olfactory bulb 2: Mitral cells 3: Bone 4: Nasal
epithelium 5: Glomerulus 6: Olfactory receptor neurons
How the Nose Works
• https://www.youtube.com/watch?v=zaHR2MAxywg
Smell
• Smells are sensed by olfactory sensory neurons in the
olfactory epithelium
• ~10 cm² of epithelium, with hundreds of different types of olfactory receptors
• Humans can detect at least 10,000 different odors
• Some researchers say trillions of odors
• Sense of smell closely related to taste
• Both use chemo-receptors
• Olfaction + taste contribute to flavour
• The olfactory system is the only sense that bypasses the
thalamus and connects directly to the forebrain
Taste
Sense of Taste
• https://www.youtube.com/watch?v=FSHGucgnvLU
Basics of Taste
• Sensation produced when a substance in the mouth
reacts chemically with taste receptor cells
• Taste receptors mostly on taste buds on the tongue
• 2,000 – 5,000 taste buds on the tongue, each with 100+ receptors
• Five basic tastes:
• sweetness, sourness, saltiness, bitterness, and umami
• Flavour influenced by other senses
• smell, texture, temperature, “coolness”, “hotness”
Taste Trivia
VR TECHNOLOGY
Using Technology to Stimulate Senses
• Simulate output
• E.g. simulate real scene
• Map output to devices
• Graphics to HMD
• Use devices to
stimulate the senses
• HMD stimulates eyes
Visual Simulation pipeline: 3D Graphics → HMD → Vision System → Brain
Example: Visual Simulation
Human-Machine Interface
Key Technologies for VR System
• Visual Display
• Stimulate visual sense
• Audio/Tactile Display
• Stimulate hearing/touch
• Tracking
• Changing viewpoint
• User input
• Input Devices
• Supporting user interaction
Mapping Between Input and Output
Input
Output
VISUAL DISPLAY
Creating an Immersive Experience
•Head Mounted Display
•Immerse the eyes
•Projection/Large Screen
•Immerse the head/body
•Future Technologies
•Neural implants
•Contact lens displays, etc
HMD Basic Principles
• Use display with optics to create illusion of virtual screen
Key Properties of HMDs
• Lens
• Focal length, Field of View
• Occularity, Interpupillary distance
• Eye relief, Eye box
• Display
• Resolution, contrast
• Power, brightness
• Refresh rate
• Ergonomics
• Size, weight
• Wearability
Simple Magnifier HMD Design
[Diagram: display (image source) at distance p behind the eyepiece (one or more lenses); the eye sees a magnified virtual image at distance q]
1/p + 1/q = 1/f where
p = object distance (distance from image source to eyepiece)
q = image distance (distance of image from the lens)
f = focal length of the lens
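A small sketch of this thin-lens relation, solving for the image distance q given p and f (the numbers are illustrative, not from a real HMD):

```python
def image_distance(p, f):
    """Solve 1/p + 1/q = 1/f for q (all in the same length units)."""
    return 1 / (1 / f - 1 / p)

# Display 4 cm from a 5 cm focal-length eyepiece (p < f):
q = image_distance(p=4.0, f=5.0)
print(q)  # -20.0: a virtual image, 20 cm away on the display side
```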
Field of View
Monocular FOV is the angular subtense
(usually expressed in degrees) of the
displayed image as measured from the
pupil of one eye.
Total FOV is the total angular size of the displayed image
visible to both eyes.
Binocular (or stereoscopic) FOV refers to
the part of the displayed image visible to
both eyes.
FOV may be measured horizontally,
vertically or diagonally.
Ocularity
• Monocular - HMD
image goes to only
one eye.
• Biocular - Same
HMD image to both
eyes.
• Binocular
(stereoscopic) -
Different but
matched images to
each eye.
Interpupillary Distance (IPD)
• IPD is the horizontal distance between a user's eyes.
• IPD is the distance between the two optical axes in a binocular view system.
Distortion in Lens Optics
[Image: a rectangular grid maps to a pincushion-distorted grid through the lens]
Example Distortion
Oculus Rift DK2 HTC Vive
To Correct for Distortion
• Must pre-distort image
• This is a pixel-based
distortion
• Graphics rendering uses
linear interpolation!
• Too slow on most systems
• Use shader programming
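A minimal sketch of such a pixel-based pre-distortion; the radial polynomial and the k1/k2 coefficients are illustrative assumptions (a real renderer would apply this per fragment in a shader):

```python
def predistort(u, v, k1=0.22, k2=0.24):
    """Map an output pixel (centered coords in [-1, 1]) to the source
    texture coordinate it should sample. The radial barrel pre-warp
    cancels the pincushion distortion added by the HMD lens."""
    r2 = u * u + v * v
    scale = 1 + k1 * r2 + k2 * r2 * r2
    return u * scale, v * scale

# A pixel near the edge samples further out than one near the centre:
print(predistort(0.8, 0.0))  # (~0.99, 0.0)
```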
HMD Design Trade-offs
• Resolution vs. field of view
• As FOV increases, resolution decreases for fixed pixels
• Eye box vs. field of view
• Larger eye box limits field of view
• Size, Weight and Power vs. everything else
Oculus Rift
• Cost: $599 USD
• FOV: 110° Horizontal
• Refresh rate: 90 Hz
• Resolution 1080x1200/eye
• 3 DOF orientation tracking
• 3 axis positional tracking
Inside an Oculus Rift
Comparison Between HMDs
Computer Based vs. Mobile VR Displays
Google Cardboard
• Released 2014 (Google 20% project)
• >5 million shipped/given away
• Easy to use developer tools
[Images: smartphone + Cardboard viewer = low-cost VR display]
Multiple Mobile VR Viewers Available
Projection/Large Display Technologies
• Room Scale Projection
• CAVE, multi-wall environment
• Dome projection
• Hemisphere/spherical display
• Head/body inside
• Vehicle Simulator
• Simulated visual display in windows
CAVE
• Developed in 1992 at EVL, University of Illinois at Chicago
• Multi-walled stereo projection environment
• Head tracked active stereo
Cruz-Neira, C., Sandin, D. J., DeFanti, T. A., Kenyon, R. V., & Hart, J. C. (1992). The CAVE: audio
visual experience automatic virtual environment. Communications of the ACM, 35(6), 64-73.
Typical CAVE Setup
• 4 sides, rear projected stereo images
Demo Video – Wisconsin CAVE
• https://www.youtube.com/watch?v=mBs-OGDoPDY
CAVE Variations
Stereo Projection
• Active Stereo
• Active shutter glasses
• Time synced signal
• Brighter images
• More expensive
• Passive Stereo
• Polarized images
• Two projectors (one/eye)
• Cheap glasses (powerless)
• Lower resolution/dimmer
• Less expensive
Caterpillar Demo
• https://www.youtube.com/watch?v=r9N1w8PmD1E
Allosphere
• Univ. California Santa Barbara
• One of a kind facility
• Immersive Spherical display
• 10 m diameter
• Inside 3 story anechoic cube
• Passive stereoscopic projection
• 26 projectors
• Visual tracking system for input
• See http://www.allosphere.ucsb.edu/
Kuchera-Morin, J., Wright, M., Wakefield, G., Roberts,
C., Adderton, D., Sajadi, B., ... & Majumder, A. (2014).
Immersive full-surround multi-user system
design. Computers & Graphics, 40, 10-21.
Allosphere Demo
• https://www.youtube.com/watch?v=25Ch8eE0vJg
Vehicle Simulators
• Combine VR displays with vehicle
• Visual displays on windows
• Motion base for haptic feedback
• Audio feedback
• Physical vehicle controls
• Steering wheel, flight stick, etc
• Full vehicle simulation
• Emergencies, normal operation, etc
• Weapon operation
• Training scenarios
Demo: Boeing 787 Simulator
• https://www.youtube.com/watch?v=3iah-blsw_U
HAPTIC/TACTILE DISPLAYS
Haptic Feedback
• Greatly improves realism
• When is it needed?
• Other cues occluded/obstructed
• Required for task performance
• High bandwidth!
• Hands and wrist are most important
• High density of touch receptors
• Two kinds of feedback
• Touch Feedback
• information on texture, temperature, etc.
• Does not resist user contact
• Force Feedback
• information on weight and inertia
• Actively resists contact motion
Haptic Devices
• Pin arrays for the finger(s)
• Force-feedback "arms"
• "Pager" motors
• Particle brakes
• Passive haptics
• Many devices are application specific
• Like surgical devices
Active Haptics
• Actively resists contact motion
• Dimensions
• Force resistance
• Frequency Response
• Degrees of Freedom
• Latency
• Intrusiveness
• Safety
• Comfort
• Portability
Example: Phantom Omni
• Combined stylus input/haptic output
• 6 DOF haptic feedback
Phantom Omni Demo
• https://www.youtube.com/watch?v=REA97hRX0WQ
Immersion Cybergrasp
• Haptic feedback on Glove
• Combined with glove input
CyberMotion (MPI Tübingen)
https://www.youtube.com/watch?v=shdS-hynLHg
CableRobot (MPI Tübingen)
https://www.youtube.com/watch?v=cJCsomGwdk0&t=3s
Passive Haptics
• Not controlled by system
• Pros
• Cheap
• Large scale
• Accurate
• Cons
• Not dynamic
• Limited use
UNC Being There Project
Passive Haptic Paddle
• Using physical props to provide haptic feedback
• http://www.cs.wpi.edu/~gogo/hive/
Tactile Feedback Interfaces
• Goal: Stimulate skin tactile receptors
• Using different technologies
• Air bellows
• Jets
• Actuators (commercial)
• Micropin arrays
• Electrical (research)
• Neuromuscular stimulations (research)
Vibrotactile Cueing Devices
• Vibrotactile feedback has been incorporated into many
devices
• Can we use this technology to provide scalable, wearable
touch cues?
Vibrotactile Feedback Projects
• Navy TSAS Project
• TactaBoard and TactaVest
Example: CyberTouch Glove
• Immersion Corporation
• Expensive - $15000
• Six Vibrotactile actuators
• Back of finger
• Palm
• Off-centered actuator motor
• Rotation speed = frequency of vibration (0-125 Hz)
• When tracked virtual hand
intersects with virtual object,
send signal to glove to vibrate
http://www.cyberglovesystems.com/cybertouch/
AUDIO DISPLAYS
Audio Displays
• Spatialization vs. Localization
• Spatialization is the processing of sound signals to make
them emanate from a point in space
• This is a technical topic
• Localization is the ability of people to identify the source
position of a sound
• This is a human topic, i.e., some people are better at it than others.
Audio Display Properties
Presentation Properties
• Number of channels
• Sound stage
• Localization
• Masking
• Amplification
Logistical Properties
• Noise pollution
• User mobility
• Interface with tracking
• Environmental requirements
• Integration
• Portability
• Throughput
• Cumber
• Safety
• Cost
Channels & Masking
• Number of channels
• Stereo vs. mono vs. quadraphonic
• 2.1, 5.1, 7.1
• Two kinds of masking
• Louder sounds mask softer ones
• We have too many things vying for our audio attention these days!
• Physical objects mask sound signals
• Happens with speakers, but not with headphones
Audio Displays: Head-worn
• Ear buds
• On ear
• Open back
• Closed back
• Bone conduction
Spatialized Audio Effects
• Naïve approach
• Simple left/right shift for lateral position
• Amplitude adjustment for distance
• Easy to produce using consumer hardware/software
• Does not give us "true" realism in sound
• No up/down or front/back cues
• We can use multiple speakers for this
• Surround the user with speakers
• Send different sound signals to each one
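A minimal sketch of this naïve approach (constant-power pan plus inverse-distance gain; the parameter choices are my assumptions):

```python
import numpy as np

def naive_spatialize(mono, azimuth_rad, distance_m):
    """Pan a mono signal left/right and attenuate it with distance.
    azimuth_rad: -pi/2 = hard left, 0 = centre, +pi/2 = hard right."""
    pan = azimuth_rad / np.pi + 0.5        # map azimuth to 0..1
    gain = 1.0 / max(distance_m, 1.0)      # crude inverse-distance rolloff
    left = mono * gain * np.cos(pan * np.pi / 2)
    right = mono * gain * np.sin(pan * np.pi / 2)
    return np.stack([left, right])         # no up/down or front/back cues
```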
Head-Related Transfer Functions
• A.k.a. HRTFs
• A set of functions that model how sound from a source at
a known location reaches the eardrum
Constructing HRTFs
• Small microphones placed into ear canals
• Subject sits in an anechoic chamber
• Can use a mannequin's head instead
• Sounds played from a large number of known locations
around the chamber
• Functions are constructed for this data
• Sound signal is filtered through inverse functions to place
the sound at the desired source
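Rendering with a measured HRTF then reduces to convolving the dry signal with the left/right head-related impulse responses; here hrir_l and hrir_r are assumed to come from a measurement like the one described above:

```python
import numpy as np

def apply_hrtf(mono, hrir_l, hrir_r):
    """Place a mono source at the location where the HRIR pair was
    measured by filtering it through the left/right ear responses."""
    return np.stack([np.convolve(mono, hrir_l),
                     np.convolve(mono, hrir_r)])
```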
OSSIC 3D Audio Headphones
• https://www.ossic.com/3d-audio/
OSSIC Demo
• https://www.youtube.com/watch?v=WjvofhhzTik
Measuring HRTFs
• Putting microphones in mannequin or human ears
• Playing sound from fixed positions
• Record response
More About HRTFs
• Functions take into account, for example,
• Individual ear shape
• Slope of shoulders
• Head shape
• So, each person has his/her own HRTF!
• Need to have a parameterizable HRTFs
• Some sound cards/APIs allow you to specify an HRTF
• Check Wikipedia or Google for more info!
TRACKING
Tracking in Virtual Reality
• Registration
• Positioning virtual objects with respect to real/virtual objects
• Tracking
• Continually locating the user's viewpoint/input
• Position (x,y,z), Orientation (roll, pitch, yaw)
Tracking Technologies
• Active
• Mechanical, Magnetic, Ultrasonic
• GPS, Wifi, cell location
• Passive
• Inertial sensors (compass, accelerometer, gyro)
• Computer Vision
• Marker based, Natural feature tracking
• Hybrid Tracking
• Combined sensors (e.g. Vision + Inertial)
Mechanical Tracker
•Idea: mechanical arms with joint sensors
•++: high accuracy, haptic feedback
•-- : cumbersome, expensive
Microscribe Sutherland
Magnetic Tracker
• Idea: difference between a magnetic
transmitter and a receiver
• ++: 6DOF, robust
• -- : wired, sensitive to metal, noisy, expensive
• -- : error increases with distance
Flock of Birds (Ascension)
Example: Razer Hydra
• Developed by Sixense
• Magnetic source + 2 wired controllers
• Short range (1-2 m)
• Precision of 1 mm and 1°
• $600 USD
Razer Hydra Demo
• https://www.youtube.com/watch?v=jnqFdSa5p7w
Inertial Tracker
• Idea: measuring linear acceleration and angular rate
(accelerometer/gyroscope)
• ++: no transmitter, cheap, small, high frequency, wireless
• -- : drift, hysteresis, only 3DOF
IS300 (Intersense)
Wii Remote
Optical Tracker
• Idea: Image Processing and Computer Vision
• Specialized
• Infrared, Retro-Reflective, Stereoscopic
• Monocular-Based Vision Tracking
ART Hi-Ball
Outside-In vs. Inside-Out Tracking
Example: Vive Lighthouse Tracking
• Outside-in tracking system
• 2 base stations
• Each with 2 laser scanners, LED array
• Headworn/handheld sensors
• 37 photo-sensors in HMD, 17 in hand
• Additional IMU sensors (500 Hz)
• Performance
• Tracking server fuses sensor samples
• Sampling rate 250 Hz, 4 ms latency
• 2mm RMS tracking accuracy
• See http://doc-ok.org/?p=1478
Lighthouse Components
Base station
- IR LED array
- 2 x scanned lasers
Head Mounted Display
- 37 photo sensors
- 9 axis IMU
Lighthouse Setup
How Lighthouse Tracking Works
• Position tracking using IMU
• 500 Hz sampling
• But drifts over time
• Drift correction using optical tracking
• IR synchronization pulse (60 Hz)
• Laser sweep between pulses
• Photo-sensors recognize sync pulse, measure time to laser
• Know when sensor hit and which sensor hit
• Calculate position of sensor relative to base station
• Use 2 base stations to calculate pose
• Use IMU sensor data between pulses (500Hz)
• See http://xinreality.com/wiki/Lighthouse
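A sketch of the core timing-to-angle step, assuming a 180° sweep between successive 60 Hz sync pulses (the constants follow the description above; the rest is illustrative):

```python
import math

SWEEP_PERIOD_S = 1.0 / 60.0  # one laser sweep per 60 Hz sync pulse

def sweep_angle_rad(t_sync, t_hit):
    """Angle of a photo-sensor as seen from the base station, from the
    delay between the IR sync flash and the laser sweeping past it."""
    return math.pi * (t_hit - t_sync) / SWEEP_PERIOD_S

# A hit 4.2 ms after the sync pulse is ~45 degrees into the sweep:
print(math.degrees(sweep_angle_rad(0.0, 0.0042)))  # ~45.4
```

Two such angles per base station (from the horizontal and vertical sweeps), measured from two base stations, constrain each sensor's 3D position; the IMU fills in motion between optical updates.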
Lighthouse Tracking
Base station scanning
https://www.youtube.com/watch?v=avBt_P0wg_Y
https://www.youtube.com/watch?v=oqPaaMR4kY4
Room tracking
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au
Ensuring Technical Readiness For Copilot in Microsoft 3652toLead Limited
 
DevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache MavenDevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache MavenHervé Boutemy
 

Dernier (20)

Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
 
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
 
The State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptxThe State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptx
 
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
 
SALESFORCE EDUCATION CLOUD | FEXLE SERVICES
SALESFORCE EDUCATION CLOUD | FEXLE SERVICESSALESFORCE EDUCATION CLOUD | FEXLE SERVICES
SALESFORCE EDUCATION CLOUD | FEXLE SERVICES
 
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptxA Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project Setup
 
Take control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test SuiteTake control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test Suite
 
unit 4 immunoblotting technique complete.pptx
unit 4 immunoblotting technique complete.pptxunit 4 immunoblotting technique complete.pptx
unit 4 immunoblotting technique complete.pptx
 
Dev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebDev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio Web
 
"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko
 
The Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsThe Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and Cons
 
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptxMerck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
 
DMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special EditionDMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special Edition
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdf
 
Scanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL CertsScanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL Certs
 
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptx
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptxPasskey Providers and Enabling Portability: FIDO Paris Seminar.pptx
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptx
 
"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365
 
DevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache MavenDevoxxFR 2024 Reproducible Builds with Apache Maven
DevoxxFR 2024 Reproducible Builds with Apache Maven
 

VR LECTURE: UNDERSTANDING HUMAN PERCEPTION

  • 11. Relevant Papers • Slater, M., & Usoh, M. (1993). Representation systems, perceptual positions, and presence in immersive virtual environments. Presence, 2:221–233. • Slater, M. (1999). Measuring presence: A response to the Witmer and Singer Presence Questionnaire. Presence, 8:560–565. • Steuer, J. (1992). Defining virtual reality: Dimensions determining telepresence. Journal of Communication, 42(4):72–93. • Sadowski, W. J., & Stanney, K. M. (2002). Measuring and managing presence in virtual environments. In: Handbook of Virtual Environments: Design, implementation, and applications. http://vehand.engr.ucf.edu/handbook/ • Schuemie, M. J., Van Der Straaten, P., Krijn, M., & Van Der Mast, C. A. (2001). Research on presence in virtual reality: A survey. CyberPsychology & Behavior, 4(2), 183-201. • Lee, K. M. (2004). Presence, explicated. Communication Theory, 14(1), 27-50. • Witmer, B. G., & Singer, M. J. (1998). Measuring presence in virtual environments: A presence questionnaire. Presence: Teleoperators and Virtual Environments, 7(3), 225-240. • Lessiter, J., Freeman, J., Keogh, E., & Davidoff, J. (2000). Development of a new cross-media presence questionnaire: The ITC-Sense of Presence. Paper at the Presence 2000 Workshop, March 27–28, Delft.
  • 14. How do We Perceive Reality? • We understand the world through our senses: • Sight, Hearing, Touch, Taste, Smell (and others..) • Two basic processes: • Sensation – Gathering information • Perception – Interpreting information
  • 16. Goal of Virtual Reality “.. to make it feel like you’re actually in a place that you are not.” Palmer Luckey Co-founder, Oculus
  • 17. Creating the Illusion of Reality • Fooling human perception by using technology to generate artificial sensations • Computer generated sights, sounds, smell, etc
  • 18. Reality vs. Virtual Reality • In a VR system there are input and output devices between human perception and action
  • 19. Example Birdly - http://www.somniacs.co/ • Create illusion of flying like a bird • Multisensory VR experience • Visual, audio, wind, haptic
  • 21. ‘Virtual Reality is a synthetic sensory experience which may one day be indistinguishable from the real physical world.” -Roy Kalawsky (1993) Today Tomorrow
  • 23. Motivation • Understand: In order to create a strong sense of Presence we need to understand the Human Perception system • Stimulate: We need to be able to use technology to provide real world sensory inputs, and create the VR illusion VR Hardware Human Senses
  • 24. Senses • How an organism obtains information for perception: • Sensation is part of the Somatic Division of the Peripheral Nervous System • Integration and perception require the Central Nervous System • Five major senses: • Sight (Ophthalmoception) • Hearing (Audioception) • Taste (Gustaoception) • Smell (Olfacoception) • Touch (Tactioception)
  • 25. Other Lesser-Known Senses • Proprioception = sense of body position • what is your body doing right now • Equilibrium = balance • Acceleration • Nociception = sense of pain • Temperature • Satiety (the quality or state of being fed or gratified to or beyond capacity) • Thirst • Micturition • Amount of CO2 and Na in blood
  • 26. Relative Importance of Each Sense • Percentage of neurons in brain devoted to each sense • Sight – 30% • Touch – 8% • Hearing – 2% • Smell - < 1% • Over 60% of brain involved with vision in some way
  • 27. VR System Overview • Simulate output • Map output to devices • Use devices to stimulate the senses • Example (visual simulation): 3D Graphics → HMD → Vision System → Brain, across the Human-Machine Interface
  • 28. Sight
  • 29. The Human Visual System • Purpose is to convert visual input to signals in the brain
  • 30. The Human Eye • Light passes through cornea and lens onto retina • Photoreceptors in retina convert light into electrochemical signals
  • 31. Photoreceptors – Rods and Cones • Retina photoreceptors come in two types, Rods and Cones
  • 32. Rods vs. Cones • RODS • 125 million cells in retina • Concentrated on periphery of retina • No color detection • Most sensitive to light • Scotopic (night) vision • Provide peripheral vision, motion detection • CONES • 4.5-6 million in retina • Responsible for color vision • Sensitive to red, blue, green light • Work best in more intense light
  • 33. Colour Perception • Humans only perceive small part of electromagnetic spectrum
  • 34. Horizontal and Vertical FOV • Humans can see ~135° vertically (60° above, 75° below) • See up to ~210° horizontal FOV, ~115° stereo overlap • Colour/stereo in centre, black & white/mono in periphery
  • 35. Types of Visible Perception Possible • As we move further from the fovea, vision becomes more limited
  • 36. Dynamic Range • Rods respond to low Luminance light, Cones to bright light
  • 37. Comparing to Displays • Human vision has far higher dynamic range than any available display technology • 40 f-stops, cf 17 f-stops for HDR display
  • 40. Vergence-Accommodation Conflict • Looking at real objects, vergence and focal distance match • In Virtual Reality, vergence and accommodation can mismatch • Eyes accommodate (focus) on the HMD screen, but converge on the virtual object behind the screen
  • 41. Visual Acuity • Ability to resolve details • Several types of visual acuity • detection, separation, etc • Normal eyesight can see a 50 cent coin at 80m • Corresponds to 1 arc min (1/60th of a degree) • Max acuity = 0.4 arc min • (Figure: visual acuity test targets)
  • 42. Resolution of the Eye • Decreases away from the fovea • Maximum resolution of 1 arcmin – a spot of about 6×10⁻⁶ m on the retina
  • 43. Stereo Perception/Stereopsis • Eyes separated by IPD • Inter-pupillary distance • 5 – 7.5 cm (average 6.5 cm) • Each eye sees a different image • Separated by image parallax • Images fused to create 3D stereo view
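To get a feel for how IPD and distance set the stereo cue, the vergence angle for a point straight ahead at distance d is 2·atan(IPD/2d). A minimal Python sketch (the function name and the distances in the loop are illustrative, not from the slides):

    import math

    def vergence_angle_deg(ipd_m, distance_m):
        # Angle between the two eyes' lines of sight for a point straight ahead.
        return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

    for d in (0.5, 1.0, 2.0, 10.0):
        # Average IPD of 6.5 cm, per the slide above.
        print("%5.1f m -> %.2f deg vergence" % (d, vergence_angle_deg(0.065, d)))

The angle shrinks rapidly with distance, which is one reason stereopsis is mainly a near-field depth cue.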
  • 45. Depth Perception • The visual system uses a range of different Stereoscopic and Monocular cues for depth perception • Stereoscopic cues: eye convergence angle, disparity between left and right images, diplopia • Monocular cues: eye accommodation, perspective, atmospheric artifacts (fog), relative sizes, image blur, occlusion, motion parallax, shadows, texture • Parallax can be more important for depth perception! Stereoscopy is important for size and distance evaluation
  • 50. Fooling Depth Perception • https://www.youtube.com/watch?v=p-eZcHPp7Go
  • 51. Properties of the Human Visual System • visual acuity: 20/20 is ~1 arc min • field of view: ~200° monocular, ~120° binocular, ~135° vertical • resolution of eye: ~576 megapixels • temporal resolution: ~60 Hz (depends on contrast, luminance) • dynamic range: instantaneous 6.5 f-stops, adapt to 46.5 f-stops • colour: everything in CIE xy diagram • depth cues in 3D displays: vergence, focus, (dis)comfort • accommodation range: ~8cm to ∞, degrades with age
  • 52. The Perfect Retina Display • An HMD capable of creating images indistinguishable from reality would need to match the properties of the eye: • FOV: 200-220° x 135° needed (both eyes) • 120° stereo overlap • Acuity: ~0.4 arc min (1 pixel/0.4 arc min) • Pixel Resolution: ~30,000 x 20,000 pixels • 200*60°/0.4 = 30,000, 135*60°/0.4 = 20,250 • Pixels/inch: > 2190 PPI @ 100mm (depends on distance to screen) • Update rate: 60 Hz • The biggest challenge: bandwidth • compress and transmit a huge amount of data • drive and operate display pixels
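The slide's resolution arithmetic is easy to reproduce: divide the field of view (in arc minutes) by the angular size of one pixel. A minimal sketch of that calculation (the function name is mine):

    def pixels_needed(fov_deg, acuity_arcmin=0.4):
        # One pixel per `acuity_arcmin` of visual angle across the whole FOV.
        return round(fov_deg * 60 / acuity_arcmin)

    print(pixels_needed(200))  # 30000 pixels horizontally
    print(pixels_needed(135))  # 20250 pixels vertically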
  • 53. Comparison between Eyes and HMD • FOV: human eyes 200° × 135° vs. HTC Vive 110° × 110° • Stereo overlap: 120° vs. 110° • Resolution: 30,000 × 20,000 vs. 2,160 × 1,200 • Pixels/inch: >2,190 (at 100 mm to screen) vs. 456 • Update: 60 Hz vs. 90 Hz • See http://doc-ok.org/?p=1414 http://www.clarkvision.com/articles/eye-resolution.html http://wolfcrow.com/blog/notes-by-dr-optoglass-the-resolution-of-the-human-eye/
  • 56. How the Ear Works • https://www.youtube.com/watch?v=pCCcFDoyBxM
  • 57. Sound Frequency and Amplitude • Frequency determines the pitch of the sound • Amplitude relates to intensity of the sound • Loudness is a subjective measure of intensity High frequency = short period Low frequency = long period
  • 58. Distance to Listener • Relationship between sound intensity and distance to the listener • Inverse-square law: the intensity varies inversely with the square of the distance from the source, so if the distance from the source is doubled (increased by a factor of 2), then the intensity is quartered (decreased by a factor of 4).
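In code, the inverse-square law is a one-liner; expressed in decibels it is the familiar "about −6 dB per doubling of distance". A minimal sketch (the 1 m reference distance is an assumption):

    import math

    def relative_intensity(distance_m, ref_distance_m=1.0):
        # Inverse-square law: intensity relative to that at the reference distance.
        return (ref_distance_m / distance_m) ** 2

    print(relative_intensity(2.0))                   # 0.25 -> quartered at 2x distance
    print(10 * math.log10(relative_intensity(2.0)))  # about -6.0 dB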
  • 59. Auditory Thresholds • Humans hear frequencies from 20 – 22,000 Hz • Most everyday sounds from 80 – 90 dB
  • 60. Sound Localization • Humans have two ears • localize sound in space • Sound can be localized using 3 coordinates • Azimuth, elevation, distance
  • 61. Sound Localization • Azimuth Cues • Difference in time of sound reaching two ears • Interaural time difference (ITD) • Difference in sound intensity reaching two ears • Interaural level difference (ILD) • Elevation Cues • Monaural cues derived from the pinna (ear shape) • Head related transfer function (HRTF) • Range Cues • Difference in sound relative to range from observer • Head movements (otherwise ITD and ILD are same)
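The azimuth cues can be made concrete with Woodworth's spherical-head model of ITD, a standard approximation that is not given on the slide itself: ITD = (r/c)(θ + sin θ) for head radius r, speed of sound c, and source azimuth θ. A sketch with an assumed average head radius:

    import math

    def itd_seconds(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
        # Woodworth spherical-head approximation of interaural time difference.
        theta = math.radians(azimuth_deg)
        return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

    for az in (0, 30, 60, 90):
        print("%3d deg -> %6.1f microseconds" % (az, itd_seconds(az) * 1e6))

At 90° this gives roughly 650 µs, the order of magnitude usually quoted for the maximum human ITD.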
  • 63. Sound Localization (Azimuth Cues) Interaural Time Difference
  • 64. HRTF (Elevation Cue) • Pinna and head shape affect frequency intensities • Sound intensities measured with microphones in ear and compared to intensities at sound source • Difference is HRTF, gives clue as to sound source location
  • 65. Accuracy of Sound Localization • People can locate sound • Most accurately in front of them • 2-3° error in front of head • Least accurately to sides and behind head • Up to 20° error to side of head • Largest errors occur above/below elevations and behind head • Front/back confusion is an issue • Up to 10% of sounds presented in the front are perceived coming from behind and vice versa (more in headphones) • Butean, A., Bălan, O., Negoi, I., Moldoveanu, F., & Moldoveanu, A. (2015). Comparative research on sound localization accuracy in the free-field and virtual auditory displays. In Conference Proceedings of eLearning and Software for Education (eLSE) (No. 01, pp. 540-548). Universitatea Nationala de Aparare Carol I.
  • 66. Touch
  • 67. Touch • Mechanical/temperature/pain stimuli are transduced into Action Potentials (AP) • Transducing structures are specialized nerves: • Mechanoreceptors: Detect pressure, vibrations & texture • Thermoreceptors: Detect hot/cold • Nociceptors: Detect pain • Proprioceptors: Detect spatial awareness • This triggers an AP which then travels to various locations in the brain via the somatosensory nerves
  • 68. Haptic Sensation • Somatosensory System • complex system of nerve cells that responds to changes to the surface or internal state of the body • Skin is the largest organ • 1.3–1.7 m² in adults • Tactile: Surface properties • Receptors not evenly spread • Most densely populated area is the tongue • Kinesthetic: Muscles, Tendons, etc. • Also known as proprioception
  • 69. Somatosensory System • Map of somatosensory areas of the brain, showing more area for regions with more receptors
  • 71. Cutaneous System • Skin – heaviest organ in the body • Epidermis outer layer, dead skin cells • Dermis inner layer, with four kinds of mechanoreceptors
  • 72. Mechanoreceptors • Cells that respond to pressure, stretching, and vibration • Slow Acting (SA), Rapidly Acting (RA) • Type I at surface – light discriminative touch • Type II deep in dermis – heavy and continuous touch • Merkel discs (SA-I, 0–10 Hz, small well-defined receptive field): edges, intensity • Ruffini corpuscles (SA-II, 0–10 Hz, large indistinct field): static force, skin stretch • Meissner corpuscles (RA-I, 20–50 Hz, small well-defined field): velocity, edges • Pacinian corpuscles (RA-II, 100–300 Hz, large indistinct field): acceleration, vibration
  • 73. Spatial Resolution • Sensitivity varies greatly • Two-point discrimination thresholds by body site: finger 2–3 mm, cheek 6 mm, nose 7 mm, palm 10 mm, forehead 15 mm, foot 20 mm, belly 30 mm, forearm 35 mm, upper arm 39 mm, back 39 mm, shoulder 41 mm, thigh 42 mm, calf 45 mm • http://faculty.washington.edu/chudler/chsense.html
  • 74. Proprioception/Kinaesthesia • Proprioception (joint position sense) • Awareness of movement and positions of body parts • Due to nerve endings and Pacinian and Ruffini corpuscles at joints • Enables us to touch nose with eyes closed • Joints closer to body more accurately sensed • Users know hand position to within 8cm without looking at them • Kinaesthesia (joint movement sense) • Sensing muscle contraction or stretching • Cutaneous mechanoreceptors measuring skin stretching • Helps with force sensation
  • 75. Smell
  • 76. Olfactory System • Human olfactory system. 1: Olfactory bulb 2: Mitral cells 3: Bone 4: Nasal epithelium 5: Glomerulus 6: Olfactory receptor neurons
  • 77. How the Nose Works • https://www.youtube.com/watch?v=zaHR2MAxywg
  • 78. Smell • Smells are sensed by olfactory sensory neurons in the olfactory epithelium • 10 cm² with hundreds of different types of olfactory receptors • Humans can detect at least 10,000 different odors • Some researchers say trillions of odors • Sense of smell closely related to taste • Both use chemo-receptors • Olfaction + taste contribute to flavour • The olfactory system is the only sense that bypasses the thalamus and connects directly to the forebrain
  • 79. Taste
  • 80. Sense of Taste • https://www.youtube.com/watch?v=FSHGucgnvLU
  • 81. Basics of Taste • Sensation produced when a substance in the mouth reacts chemically with taste receptor cells • Taste receptors mostly on taste buds on the tongue • 2,000 – 5,000 taste buds on tongues/100+ receptors each • Five basic tastes: • sweetness, sourness, saltiness, bitterness, and umami • Flavour influenced by other senses • smell, texture, temperature, “coolness”, “hotness”
  • 84. Using Technology to Stimulate Senses • Simulate output • E.g. simulate real scene • Map output to devices • Graphics to HMD • Use devices to stimulate the senses • HMD stimulates eyes • Example (visual simulation): 3D Graphics → HMD → Vision System → Brain, across the Human-Machine Interface
  • 85. Key Technologies for VR System • Visual Display • Stimulate visual sense • Audio/Tactile Display • Stimulate hearing/touch • Tracking • Changing viewpoint • User input • Input Devices • Supporting user interaction
  • 86. Mapping Between Input and Output Input Output
  • 88. Creating an Immersive Experience •Head Mounted Display •Immerse the eyes •Projection/Large Screen •Immerse the head/body •Future Technologies •Neural implants •Contact lens displays, etc
  • 89. HMD Basic Principles • Use display with optics to create illusion of virtual screen
  • 90. Key Properties of HMDs • Lens • Focal length, Field of View • Occularity, Interpupillary distance • Eye relief, Eye box • Display • Resolution, contrast • Power, brightness • Refresh rate • Ergonomics • Size, weight • Wearability
  • 91. Simple Magnifier HMD Design • An eyepiece (one or more lenses) forms a magnified virtual image of the display (image source) in front of the eye • Thin-lens equation: 1/p + 1/q = 1/f, where p = object distance (distance from image source to eyepiece), q = image distance (distance of image from the lens), f = focal length of the lens
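Solving the thin-lens equation for q shows why a display placed just inside the focal length appears as a large, distant virtual screen. A minimal sketch (the 4 cm / 5 cm numbers are illustrative, not from the slide):

    def image_distance(p, f):
        # Solve 1/p + 1/q = 1/f for q; negative q means a virtual image
        # on the same side as the display (the magnifier case).
        return 1.0 / (1.0 / f - 1.0 / p)

    q = image_distance(p=0.04, f=0.05)  # display 4 cm from a 5 cm focal-length lens
    print(q)              # -0.2 -> virtual image 20 cm from the lens
    print(abs(q) / 0.04)  # ~5x lateral magnification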
  • 92. Field of View • Monocular FOV is the angular subtense (usually expressed in degrees) of the displayed image as measured from the pupil of one eye. • Total FOV is the total angular size of the displayed image visible to both eyes. • Binocular (or stereoscopic) FOV refers to the part of the displayed image visible to both eyes. • FOV may be measured horizontally, vertically or diagonally.
  • 93. Ocularity • Monocular - HMD image goes to only one eye. • Biocular - Same HMD image to both eyes. • Binocular (stereoscopic) - Different but matched images to each eye.
  • 94. Interpupillary Distance (IPD) • IPD is the horizontal distance between a user's eyes. • IPD is the distance between the two optical axes in a binocular view system.
  • 95. Distortion in Lens Optics • Viewing the display through the lens distorts the image: a rectangle maps to a warped, non-rectangular shape
  • 97. To Correct for Distortion • Must pre-distort image • This is a pixel-based distortion • Graphics rendering uses linear interpolation! • Too slow on most systems • Use shader programming
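A common way to do this in a shader is a radial polynomial warp: barrel pre-distortion applied per fragment cancels the lens's pincushion distortion. A Python sketch of the per-pixel mapping (the k1, k2 coefficients are illustrative, not values for any particular HMD):

    def predistort(x, y, k1=0.22, k2=0.24):
        # Radial barrel warp of a normalized coordinate (origin at lens centre).
        # A fragment shader evaluates this per pixel to look up the source texel.
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        return x * scale, y * scale

    print(predistort(0.5, 0.5))  # points further from the centre are pushed out more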
  • 99. HMD Design Trade-offs • Resolution vs. field of view • As FOV increases, angular resolution decreases for a fixed number of pixels • Eye box vs. field of view • Larger eye box limits field of view • Size, weight and power vs. everything else
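The resolution/FOV trade-off is just pixels per degree: spreading a fixed panel over a wider angle lowers angular resolution. A quick sketch using the Rift-class numbers from the next slide:

    def pixels_per_degree(h_pixels, h_fov_deg):
        # Angular resolution of an HMD: fixed pixels spread over the FOV.
        return h_pixels / h_fov_deg

    print(pixels_per_degree(1080, 110))  # ~9.8 ppd
    print(pixels_per_degree(1080, 60))   # same panel, narrower FOV -> ~18 ppd

For comparison, 20/20 vision (~1 arc min per detail) corresponds to about 60 pixels per degree.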
  • 100. Oculus Rift • Cost: $599 USD • FOV: 110° horizontal • Refresh rate: 90 Hz • Resolution 1080x1200/eye • 3 DOF orientation tracking • 3-axis positional tracking
  • 103. Computer Based vs. Mobile VR Displays
  • 105. Google Cardboard • Released 2014 (Google 20% project) • >5 million shipped/given away • Easy to use developer tools
  • 106. Multiple Mobile VR Viewers Available
  • 107. Projection/Large Display Technologies • Room Scale Projection • CAVE, multi-wall environment • Dome projection • Hemisphere/spherical display • Head/body inside • Vehicle Simulator • Simulated visual display in windows
  • 108. CAVE • Developed in 1992, EVL University of Illinois Chicago • Multi-walled stereo projection environment • Head tracked active stereo Cruz-Neira, C., Sandin, D. J., DeFanti, T. A., Kenyon, R. V., & Hart, J. C. (1992). The CAVE: audio visual experience automatic virtual environment. Communications of the ACM, 35(6), 64-73.
  • 109. Typical CAVE Setup • 4 sides, rear projected stereo images
  • 110. Demo Video – Wisconsin CAVE • https://www.youtube.com/watch?v=mBs-OGDoPDY
  • 112. Stereo Projection • Active Stereo • Active shutter glasses • Time synced signal • Brighter images • More expensive • Passive Stereo • Polarized images • Two projectors (one/eye) • Cheap glasses (powerless) • Lower resolution/dimmer • Less expensive
  • 114. Allosphere • Univ. California Santa Barbara • One of a kind facility • Immersive Spherical display • 10 m diameter • Inside 3 story anechoic cube • Passive stereoscopic projection • 26 projectors • Visual tracking system for input • See http://www.allosphere.ucsb.edu/ Kuchera-Morin, J., Wright, M., Wakefield, G., Roberts, C., Adderton, D., Sajadi, B., ... & Majumder, A. (2014). Immersive full-surround multi-user system design. Computers & Graphics, 40, 10-21.
  • 116. Vehicle Simulators • Combine VR displays with vehicle • Visual displays on windows • Motion base for haptic feedback • Audio feedback • Physical vehicle controls • Steering wheel, flight stick, etc • Full vehicle simulation • Emergencies, normal operation, etc • Weapon operation • Training scenarios
  • 117. Demo: Boeing 787 Simulator • https://www.youtube.com/watch?v=3iah-blsw_U
  • 119. Haptic Feedback • Greatly improves realism • When is it needed? • Other cues occluded/obstructed • Required for task performance • High bandwidth! • Hands and wrist are most important • High density of touch receptors • Two kinds of feedback • Touch Feedback • information on texture, temperature, etc. • Does not resist user contact • Force Feedback • information on weight and inertia • Actively resists contact motion
  • 120. Haptic Devices • Pin arrays for the finger(s) • Force-feedback "arms" • "Pager" motors • Particle brakes • Passive haptics • Many devices are application specific • Like surgical devices
  • 121. Active Haptics • Actively resists contact motion • Dimensions • Force resistance • Frequency Response • Degrees of Freedom • Latency • Intrusiveness • Safety • Comfort • Portability
  • 122. Example: Phantom Omni • Combined stylus input/haptic output • 6 DOF haptic feedback
  • 123. Phantom Omni Demo • https://www.youtube.com/watch?v=REA97hRX0WQ
  • 124. Immersion Cybergrasp • Haptic feedback on Glove • Combined with glove input
  • 127. Passive Haptics • Not controlled by system • Pros • Cheap • Large scale • Accurate • Cons • Not dynamic • Limited use
  • 128. UNC Being There Project
  • 129. Passive Haptic Paddle • Using physical props to provide haptic feedback • http://www.cs.wpi.edu/~gogo/hive/
  • 130. Tactile Feedback Interfaces • Goal: Stimulate skin tactile receptors • Using different technologies • Air bellows • Jets • Actuators (commercial) • Micropin arrays • Electrical (research) • Neuromuscular stimulations (research)
  • 131. Vibrotactile Cueing Devices • Vibrotactile feedback has been incorporated into many devices • Can we use this technology to provide scalable, wearable touch cues?
  • 132. Vibrotactile Feedback Projects Navy TSAS Project TactaBoard and TactaVest
  • 133. Example: CyberTouch Glove • Immersion Corporation • Expensive - $15,000 • Six vibrotactile actuators • Back of finger • Palm • Off-centered actuator motor • Rotation speed = frequency of vibration (0-125 Hz) • When the tracked virtual hand intersects with a virtual object, send a signal to the glove to vibrate • http://www.cyberglovesystems.com/cybertouch/
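The control loop the slide describes — detect intersection, then drive an actuator — can be sketched in a few lines. Everything here (the sphere-shaped object, the linear depth-to-frequency mapping) is an illustrative assumption, not the CyberTouch API:

    def fingertip_vibration_hz(finger_pos, centre, radius, max_hz=125.0):
        # Map how far a tracked fingertip penetrates a virtual sphere
        # to an actuator frequency in the glove's 0-125 Hz range.
        d = sum((p - c) ** 2 for p, c in zip(finger_pos, centre)) ** 0.5
        penetration = max(0.0, radius - d)
        return min(max_hz, max_hz * penetration / radius)

    # Fingertip 1 cm inside a 10 cm sphere -> gentle 12.5 Hz buzz.
    print(fingertip_vibration_hz((0.0, 0.0, 0.09), (0.0, 0.0, 0.0), 0.1))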
  • 135. Audio Displays • Spatialization vs. Localization • Spatialization is the processing of sound signals to make them emanate from a point in space • This is a technical topic • Localization is the ability of people to identify the source position of a sound • This is a human topic, i.e., some people are better at it than others.
  • 136. Audio Display Properties • Presentation properties: number of channels, sound stage, localization, masking, amplification • Logistical properties: noise pollution, user mobility, interface with tracking, environmental requirements, integration, portability, throughput, cumber, safety, cost
  • 137. Channels & Masking • Number of channels • Stereo vs. mono vs. quadrophonic • 2.1, 5.1, 7.1 • Two kinds of masking • Louder sounds mask softer ones • We have too many things vying for our audio attention these days! • Physical objects mask sound signals • Happens with speakers, but not with headphones
  • 138. Audio Displays: Head-worn • Types: ear buds, on-ear, open-back, closed, bone conduction
  • 139. Spatialized Audio Effects • Naïve approach • Simple left/right shift for lateral position • Amplitude adjustment for distance • Easy to produce using consumer hardware/software • Does not give us "true" realism in sound • No up/down or front/back cues • We can use multiple speakers for this • Surround the user with speakers • Send different sound signals to each one
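A minimal sketch of that naïve approach — constant-power left/right panning plus inverse-square distance gain (the pan law and the clamping are my choices, not from the slide):

    import math

    def naive_pan(sample, azimuth_deg, distance_m):
        # Lateral cue: -90 deg = hard left, +90 deg = hard right.
        pan = (azimuth_deg + 90.0) / 180.0
        # Distance cue: inverse-square gain, clamped below 1 m to avoid blow-up.
        gain = 1.0 / max(distance_m, 1.0) ** 2
        left = sample * gain * math.cos(pan * math.pi / 2)
        right = sample * gain * math.sin(pan * math.pi / 2)
        return left, right

    print(naive_pan(1.0, -90, 2.0))  # hard left at 2 m -> (0.25, 0.0)

As the slide notes, this gives no up/down or front/back information — that is what HRTFs (next slides) add.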
  • 140. Head-Related Transfer Functions • A.k.a. HRTFs • A set of functions that model how sound from a source at a known location reaches the eardrum
  • 141. Constructing HRTFs • Small microphones placed into ear canals • Subject sits in an anechoic chamber • Can use a mannequin's head instead • Sounds played from a large number of known locations around the chamber • Functions are constructed for this data • Sound signal is filtered through inverse functions to place the sound at the desired source
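Once the HRTF pair for a source location has been measured (as head-related impulse responses, HRIRs), spatializing a sound is one convolution per ear. A minimal numpy sketch with stand-in data — real HRIRs come from measurements like those described above:

    import numpy as np

    def apply_hrtf(mono, hrir_left, hrir_right):
        # Convolve a mono signal with the left/right head-related impulse
        # responses measured at the desired source location.
        left = np.convolve(mono, hrir_left)
        right = np.convolve(mono, hrir_right)
        return np.stack([left, right], axis=1)  # (samples, 2) stereo buffer

    mono = np.random.randn(1000)                          # placeholder source signal
    hl = np.array([1.0, 0.5]); hr = np.array([0.3, 0.1])  # toy 2-tap HRIRs
    print(apply_hrtf(mono, hl, hr).shape)                 # (1001, 2)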
  • 142. OSSIC 3D Audio Headphones • https://www.ossic.com/3d-audio/
  • 144. Measuring HRTFs • Putting microphones in a mannequin's or human's ears • Playing sound from fixed positions • Recording the response
  • 145. More About HRTFs • Functions take into account, for example, • Individual ear shape • Slope of shoulders • Head shape • So, each person has his/her own HRTF! • Need to have parameterizable HRTFs • Some sound cards/APIs allow you to specify an HRTF • Check Wikipedia or Google for more info!
  • 147. Tracking in Virtual Reality • Registration • Positioning virtual objects with respect to real/virtual objects • Tracking • Continually locating the user's viewpoint/input • Position (x,y,z), Orientation (r,p,y)
  • 148. Tracking Technologies • Active • Mechanical, Magnetic, Ultrasonic • GPS, Wifi, cell location • Passive • Inertial sensors (compass, accelerometer, gyro) • Computer Vision • Marker based, Natural feature tracking • Hybrid Tracking • Combined sensors (e.g. Vision + Inertial, as sketched below)
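A classic way to combine an inertial sensor with an absolute reference (vision, magnetic, etc.) is a complementary filter: trust the fast gyro over short intervals and let the slow absolute measurement bleed out the accumulated drift. A textbook sketch, not any particular product's filter:

    def fuse_angle(angle_prev, gyro_rate, dt, vision_angle, alpha=0.98):
        # alpha near 1: gyro dominates short-term; the (1 - alpha) vision
        # term slowly corrects the drift the gyro accumulates.
        return alpha * (angle_prev + gyro_rate * dt) + (1 - alpha) * vision_angle

    angle = 0.0
    for _ in range(500):  # biased gyro reports 0.01 rad/s; vision says "still at 0"
        angle = fuse_angle(angle, 0.01, 0.002, 0.0)
    print(angle)  # stays bounded near 0 instead of drifting away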
  • 149. Mechanical Tracker • Idea: mechanical arms with joint sensors • ++: high accuracy, haptic feedback • --: cumbersome, expensive • Examples: Microscribe, Sutherland
  • 150. Magnetic Tracker • Idea: measure the field difference between a magnetic transmitter and a receiver • ++: 6DOF, robust • --: wired, sensitive to metal, noisy, expensive • --: error increases with distance • Example: Flock of Birds (Ascension)
  • 151. Example: Razer Hydra • Developed by Sixense • Magnetic source + 2 wired controllers • Short range (1-2 m) • Precision of 1 mm and 1° • $600 USD
  • 152. Razer Hydra Demo • https://www.youtube.com/watch?v=jnqFdSa5p7w
  • 153. Inertial Tracker • Idea: measure linear acceleration and angular rate (accelerometer/gyroscope) • ++: no transmitter, cheap, small, high frequency, wireless • --: drift, hysteresis, only 3DOF • Examples: IS300 (Intersense), Wii Remote
  • 154. Optical Tracker • Idea: image processing and computer vision • Specialized • Infrared, retro-reflective, stereoscopic • Monocular-based vision tracking • Examples: ART, Hi-Ball
  • 156. Example: Vive Lighthouse Tracking • Outside-in tracking system • 2 base stations • Each with 2 laser scanners, LED array • Headworn/handheld sensors • 37 photo-sensors in HMD, 17 in hand • Additional IMU sensors (500 Hz) • Performance • Tracking server fuses sensor samples • Sampling rate 250 Hz, 4 ms latency • 2mm RMS tracking accuracy • See http://doc-ok.org/?p=1478
  • 157. Lighthouse Components • Base station: IR LED array, 2 × scanned lasers • Head mounted display: 37 photo sensors, 9-axis IMU
  • 159. How Lighthouse Tracking Works • Position tracking using IMU • 500 Hz sampling • But drifts over time • Drift correction using optical tracking • IR synchronization pulse (60 Hz) • Laser sweep between pulses • Photo-sensors recognize the sync pulse, measure time to laser hit • Know when each sensor was hit and which sensor was hit • Calculate position of sensor relative to base station • Use 2 base stations to calculate pose • Use IMU sensor data between pulses (500 Hz) • See http://xinreality.com/wiki/Lighthouse
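The core optical measurement is just timing: the rotor sweeps the laser at a known rate, so the delay between the sync flash and a photodiode's hit maps linearly to an angle. A simplified sketch — the 180° linear sweep between 60 Hz pulses follows the slide's description, while a real base station's sweep geometry is more involved:

    def sweep_angle_deg(t_hit, t_sync, sweep_period=1.0 / 60.0):
        # Delay after the sync pulse, as a fraction of the sweep period,
        # gives the sensor's angle from the base station for this axis.
        return 180.0 * (t_hit - t_sync) / sweep_period

    # A hit ~4.17 ms after the sync pulse sits ~45 deg into the sweep.
    print(sweep_angle_deg(t_hit=0.004167, t_sync=0.0))

Two such angles (horizontal and vertical sweeps) per base station, across many sensors at known positions on the headset, are enough to solve for the full 6DOF pose.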
  • 160. Lighthouse Tracking • Base station scanning: https://www.youtube.com/watch?v=avBt_P0wg_Y • Room tracking: https://www.youtube.com/watch?v=oqPaaMR4kY4