Slides showing how to use Unity to build Google Cardboard Virtual Reality applications. From a series of lectures given by Mark Billinghurst from the University of South Australia.
2. Mark Billinghurst
▪ Director, Empathic Computing Lab
University of South Australia
▪ Past Director of HIT Lab NZ,
University of Canterbury
▪ PhD Univ. Washington
▪ Research on AR, mobile HCI,
Collaborative Interfaces
▪ More than 300 papers in AR, VR,
interface design
3. What You Will Learn
• Definitions of VR, Brief History of VR
• Introduction to Mobile VR/Google Cardboard
• Introduction to Unity3D
• Complete 7 projects
• 1 Building a Unity Scene
• 2 Immersive 360 Panorama
• 3 Creating a 3D VR Scene
• 4 Adding Movement
• 5 Gaze based interaction
• 6 Menu input
• 7 Moving Menus
• Cardboard interface design guidelines
• Resources for learning more
7. What is Virtual Reality?
Virtual reality is..
a computer technology that replicates an
environment, real or imagined, and simulates a
user's physical presence and environment to
allow for user interaction. (Wikipedia)
• Defining Characteristics
• Environment simulation
• Presence
• Interaction
8. Key Technologies
• Autonomy
• Head tracking, body input
• Intelligent systems
• Interaction
• User input devices, HCI
• Presence
• Graphics/audio/multisensory output
• Multisensory displays
• Visual, audio, haptic, olfactory, etc
17. Version 1.0 vs Version 2.0
• Version 1.0 – Android focused, magnetic switch, small phone
• Version 2.0 – Touch input, iOS/Android, fits many phones
25. Example Panorama Applications
• Within
• http://with.in
• High quality 360 VR content
• New York Times VR Experience
• NYTVR application
• Documentary experiences
• YouTube 360 Videos
• Collection of 360 videos
26. Google Cardboard App
• 7 default experiences
• Earth: Fly on Google Earth
• Tour Guide: Visit sites with guides
• YouTube: Watch popular videos
• Exhibit: Examine cultural artifacts
• Photo Sphere: Immersive photos
• Street View: Drive along a street
• Windy Day: Interactive short story
32. Tools for Non-Programmers
• Focus on Design, ease of use
• Visual Programming, content arrangement
• Examples
• Insta-VR – 360 panoramas
• http://www.instavr.co/
• Vizor – VR on the Web
• http://vizor.io/
• A-frame – HTML based
• https://aframe.io/
• ENTiTi – Both AR and VR authoring
• http://www.wakingapp.com/
• Eon Creator – Drag and drop tool for AR/VR
• http://www.eonreality.com/eon-creator/
33. Google VR SDK for Unity
Free Download
https://developers.google.com/vr/unity/download/
Features:
1. Lens distortion correction
2. Head tracking
3. 3D calibration
4. Side-by-side rendering
5. Stereo geometry configuration
6. User input event handling
7. VR emulation mode, etc..
Unity Google VR SDK
35. Unity Overview (see www.unity3d.com)
• Created in 2005
• Tool for creating games and 2D/3D applications
• Advanced graphics support
• Support for multiplayer, analytics, performance, ads, etc
• Cross Platform Game Engine
• One of the most popular (> 1.5 million developers)
• 27 platforms (iOS, Android, Windows, Mac, etc)
• Multiple license models
• Free for personal use/small business
• Large developer community
• Tutorials, support
• User generated content/assets
38. Download and Install (for Android)
• Go to unity3d.com/download
• Use Download Assistant – pick components you want
• Make sure to install Android components
• Also install Android studio (https://developer.android.com/studio/)
39. Getting Started
• First time running Unity you’ll be asked to create a project
• Specify project name and location
• Can pick asset packages (pre-made content)
42. Building Scenes
• Use GameObjects:
• Containers that hold different components
• Eg 3D model, texture, animation
• Use Inspector
• View and edit object properties and other settings
• Use Scene View
• Position objects, camera, lights, other GameObjects etc
• Scripting
• Adding interaction, user input, events, etc
43. GameObjects
• Every object in Scene is a GameObject
• GameObjects contain Components
• Eg Transform Component, Camera Components
• Clicking on object will show values in Inspector panel
44. Adding 3D Content
• Create 3D asset using modeling package, or download
• Fbx, Obj file format for 3D models
• Add file to Assets folder in Project
• When project opened 3D model added to Project View
• Drag mesh from Project View into Hierarchy or Scene View
• Creates a game object
46. Unity Prefabs
• When you download assets, you often get Prefabs (blue squares)
• Use by dragging and dropping into the scene hierarchy
• Prefab is a way of storing a game object with properties and
components already set
• Prefab is a template from which you can create new object
instances in the scene
• Changes to a prefab asset will change all instances in the scene
49. Making a Simple Scene - Key Steps
1. Create New Project
2. Create Game Object
3. Move main camera position
4. Add lights
5. Add more objects
6. Add physics
7. Change object materials
8. Add script behaviour
59. Example C# Script
GameObject Rotation
using UnityEngine;
using System.Collections;

public class spin : MonoBehaviour {

    // Use this for initialization
    void Start () {
    }

    // Update is called once per frame: rotate the object
    // 10 degrees per frame around the y axis
    void Update () {
        this.gameObject.transform.Rotate(Vector3.up * 10);
    }
}
60. Scripting C# Unity 3D
• void Awake():
• Is called when the first scene is loaded and the game object is active
• void Start():
• Called on first frame update
• void FixedUpdate():
• Called before physics calculations are made
• void Update():
• Called every frame before rendering
• void LateUpdate():
• Once per frame after update finished
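A minimal script touching each of these lifecycle methods might look like the sketch below; the class name and log messages are illustrative, not from the slides.

```csharp
using UnityEngine;

// Illustrative sketch of the MonoBehaviour lifecycle methods above
public class LifecycleDemo : MonoBehaviour {

    void Awake () {
        // Runs when the object loads, before Start()
        Debug.Log("Awake: cache component references here");
    }

    void Start () {
        // Runs on the first frame update
        Debug.Log("Start: initialise game state here");
    }

    void FixedUpdate () {
        // Runs at a fixed timestep before physics - use for Rigidbody movement
    }

    void Update () {
        // Runs every frame before rendering - use for input and game logic
    }

    void LateUpdate () {
        // Runs once per frame after all Update() calls - e.g. follow cameras
    }
}
```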
63. Key Steps
1. Create a new project
2. Load the Google VR SDK
3. Load a panorama image asset
4. Create a Skymap
5. Add to VR scene
6. Deploy to mobile phone
70. AutoPano (Kolor)
• Takes a set of overlapping photos and stitches them into a panorama
• http://www.kolor.com/autopano/
71. Add Image Asset to Project
• Assets -> Import Asset
• Select desired image
• In Inspector
• Set Texture Type to Cubemap
• Set mapping to Latitude-Longitude (Cylindrical)
• Hit Apply button
72. Create Skybox Material
• Assets -> Create -> Material
• Name material - e.g. 'Sky'
• Set Shader to Skybox -> Cubemap
• Drag texture to cubemap
73. Create Skybox
• Window -> Lighting
• A new window pops up
• Drag Skybox material into the Skybox field
75. One Last Thing..
• Check Clear Flags on Camera is set to Skybox
• Select Main Camera
• Look at Camera in Inspector
• Clear Flags -> Skybox
76. Test It Out
• Hit play button
• Use alt/option key + mouse to look around
77. Deploying to Phone (Android)
1. Plug phone into USB
• Put phone into debug mode
2. Open Build Settings
3. Change Target platform to Android
4. Resolution and Presentation
• Default Orientation -> Landscape Left
5. Under Player Settings
• Edit Bundle Identifier – eg com.UniSA.cubeTest
• Minimum API level
6. Build and Run
• Select .apk file name
78. Setting Path to Android
• You may need to tell Unity where the Android SDK is
• Set the path: Edit -> Preferences -> External Tools
80. Making Immersive Movie
• Create movie texture
• Convert 360 video to .ogg or .mp4 file
• Add video texture as asset
• Make Sphere
• Equirectangular UV mapping
• Inward facing normals
• Move camera to centre of sphere
• Texture map video to sphere
• Easy Movie Texture ($65)
• Apply texture to 3D object
• For 3D 360 video
• Render two Spheres
• http://bernieroehl.com/360stereoinunity/
82. Key Steps
1. Create a new project
2. Load Google VR SDK
3. Add GvrViewerMain to scene
4. Load 3D asset packages
5. Load a SkyDome
6. Add a plane floor
92. Moving Through VR Scenes
• Move through looking
• Look at target to turn on/off moving
• Button/tapping screen
• Being in a vehicle (e.g. Roller Coaster)
93. Adding Movement Through Looking
Goal: Move in direction user is looking when button
on VR display pressed or screen touched
• Key Steps
1. Start with static scene
2. Create player body
3. Create movement script
4. Add movement script to player body
94. Key Steps
1. Create New Project
2. Import GoogleVRforUnity Package
3. Create objects in scene
4. Add player body
5. Include collision detection
6. Add player movement script
95. Create New Project
• Include GoogleVRforUnity
• Assets->ImportPackage->Custom Package
96. Add GvrViewerMain to Project
• Drag GvrViewerMain into Hierarchy
• from Asset->GoogleVR->Prefabs
97. Add Ground Plane and Objects
• Create simple scene of Ground Plane and objects
• GameObject -> 3D Object -> Plane/Cube/Sphere/Cylinder
• Scale and position as you like, add materials
• Add rigidbody components to objects (not plane) to enable collisions
• Select object -> Add Component -> Rigidbody
• Fix position of object: Constraints -> Freeze Position -> check x,y,z (Freeze Rotation)
98. Add Player Body
• Select Main Camera
• Add Component -> Mesh Filter
• Click on circle icon on right -> Select Capsule mesh
99. Make the Body Visible
• Select Main Camera
• Add component -> Mesh Renderer
• Create a material and drag onto capsule mesh
100. Add Collision Detection
• Allow player to collide with objects
• Select Main Camera
• Add Component -> Capsule Collider
• Add Component -> RigidBody
• Fix player to ground
• In RigidBody component
• Uncheck “Use Gravity”
• Uncheck “Is Kinematic”
• Check Constraints -> Freeze Position -> Y axis
101. Add Movement Script
• Select Main Camera
• Create new script called PlayerMovement
• Add component -> New Script
• Key variables - speed, rigidbody

    public float speed = 3.0f;
    Rigidbody rbody;

    void Start () {
        // Cache the Rigidbody component added earlier
        rbody = GetComponent<Rigidbody>();
    }

• Define FixedUpdate movement function (move in direction looking)

    void FixedUpdate () {
        if (Input.touchCount > 0 || Input.GetMouseButton(0))
            rbody.MovePosition(transform.position +
                               transform.forward * Time.deltaTime * speed);
    }
104. Run Demo
• Use left mouse button to move in direction looking
• Button press/screen tap on mobile phone
105. Demo Problem
• Wait! I'm bouncing off objects
• Moving body hits fixed objects and gets
negative velocity
106. Stopping Camera Motion
• When camera collides it's given momentum
• velocity and angular velocity
• Need to set velocity and angular velocity to zero
• In player movement script
• Set rbody velocity components to zero
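The fix can be sketched as a small change to the PlayerMovement script (assuming, as in the earlier project, that rbody caches the Rigidbody on the Main Camera):

```csharp
using UnityEngine;

// Sketch of PlayerMovement with the collision fix: velocity and
// angular velocity are zeroed every physics step so the player
// no longer bounces off fixed objects.
public class PlayerMovement : MonoBehaviour {
    public float speed = 3.0f;
    Rigidbody rbody;

    void Start () {
        rbody = GetComponent<Rigidbody>();
    }

    void FixedUpdate () {
        // Cancel any momentum gained from bumping into objects
        rbody.velocity = Vector3.zero;
        rbody.angularVelocity = Vector3.zero;

        // Move in the gaze direction while the screen/mouse is pressed
        if (Input.touchCount > 0 || Input.GetMouseButton(0))
            rbody.MovePosition(transform.position +
                               transform.forward * Time.deltaTime * speed);
    }
}
```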
110. Gaze Interaction
• Cause events to happen when looking at objects
• E.g look at a target to shoot at it
111. Key Steps
1. Begin with VR scene from Project 4
2. Add physics ray caster
• Casts a ray from camera (gaze ray)
3. Add function to object to respond to gaze
• E.g. when gaze ray hits target cause particle effect
4. Add event trigger to target object
5. Add event system to target object
112. Adding Physics Raycaster
• Aim: To send a virtual ray from camera view
• Process
• Select Main Camera
• Add GvrPointerPhysicsRaycaster Component to Main
Camera
• Add component -> GvrPointerPhysicsRaycaster
113. Add Gaze Function
• Select target object (the cube model)
• Add component -> new script
• Call script CubeInteraction
• Add OnGazeEnter(), OnGazeExit() public functions
• Decide what happens when gaze enters/exits Cube model
• Complete this later
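One way to complete the CubeInteraction script later is sketched below; the particle effect is just one possible response, and the particles field is an assumed setup assigned in the Inspector.

```csharp
using UnityEngine;

// Sketch of the CubeInteraction script on the target Cube
public class CubeInteraction : MonoBehaviour {
    public ParticleSystem particles;   // assumed: assigned in the Inspector

    // Wired to the Pointer Enter event trigger
    public void OnGazeEnter () {
        if (particles != null) particles.Play();
    }

    // Wired to the Pointer Exit event trigger
    public void OnGazeExit () {
        if (particles != null) particles.Stop();
    }
}
```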
114. Add Event Trigger
• Select Target Object (Cube)
• Add component
• Event Trigger
• Add New Event Type -> Pointer Enter
• Add object to event
• Hit ‘+’ button
• Drag Cube object to box under Runtime Only
• Select Function to run
• Select function list -> scroll to CubeInteraction -> OnGazeEnter
• Repeat for OnGazeExit
115. Adding Event System
• Need to use Event System for trigger to work
• Looks for gaze events occurring with Cube object
• Add Event System to Hierarchy
• Game Object -> UI -> Event System
• Add gazeInputModule to Event System
• Add component -> Gaze Input Module
116. Add Collider to Object
• Need to detect when target object is being looked at
• Select target Object
• Add Collider (eg Box)
• Add component -> Box Collider
• Adjust position and size of Collider if needed
• Make sure it covers the target area
117. Making Gaze Point Visible
• In current system can't see user's Gaze point
• Add viewing reticle
• Drag GvrReticlePointer prefab onto main camera
• Assets -> GoogleVR -> Prefabs -> UI
• Reticle changes shape when on active object
• Change reticle material to make it more visible
• Set color in GvrReticleMaterial (e.g. to Red)
123. Menu Placement
• Different types of menu placement
• Screen aligned - always visible on screen
• World aligned - attached to object or location in VR scene
• Camera aligned - moves with the user
• This project shows a world aligned menu
124. Interacting with VR Menus
• Touch input
• Tap screen to select menu button
• Suitable for handheld applications
• Head/Gaze pointing
• Look at menu button, click to select
• Ideal for menus in VR display
125. Key Steps
1. Create New Scene and gaze support
2. Create User Interface menu object
3. Add buttons to user interface
4. Add button scripts
5. Add gaze interaction
6. Object interaction scripts
7. Make the menu disappear and reappear
126. Create New Scene
• Create scene with cube and plane
• Add materials
• Import GoogleVRforUnity package
• Drag GvrViewerMain into project hierarchy
127. Setup Gaze Pointing
• Drag GvrReticlePointer to Main Camera
• Assets -> GoogleVR -> Prefabs -> UI
• Add Gvr Pointer Physics Raycaster to Main Camera
• Add component -> GvrPointerPhysicsRaycaster
128. Menu Functionality
• Want to set up a menu that changes cube colour
• Menu fixed in space
• Located near the object it affects
• Two buttons (white/blue)
• Look at blue button to set cube colour to blue
• Look at white button to set cube colour to white
129. Menu Implementation
• Create a 2D canvas plane
• Place canvas in VR scene where it is needed
• Add buttons to the plane
• Add scripts to the buttons
• Triggered based on gaze input
130. Setting up Menu Canvas
• Create Empty Object name it UserInterface
• Create image object under UserInterface
• Right click UserInterface -> UI -> Image
• Set the canvas to world space
• Move image until visible and resize
• Change image colour
132. Add Buttons
• Add two buttons to UI image
• Colour one blue (Image script colour)
• Remove button scripts
• We'll add our own
• Add sphere collider same size as button
133. Add Button Scripts
• Create identical scripts for Blue and White buttons
• Different names
• BlueButton, WhiteButton
• Include OnLook() Function
• Gaze function
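A minimal sketch of one of the two button scripts is below; WhiteButton is identical apart from the class name. The colour-changing body of OnLook() is filled in at a later step.

```csharp
using UnityEngine;

// Sketch of the BlueButton script attached to the blue button
public class BlueButton : MonoBehaviour {

    // Gaze function, wired to the button's Pointer Enter event trigger
    public void OnLook () {
        Debug.Log("Blue button looked at");
        // Colour-changing behaviour is added in a later step
    }
}
```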
135. Add Event Triggers
• Add event triggers to each button
• Add component -> Event Trigger
• Event trigger type as Pointer Enter
• Set target object as button
• Set target function as OnLook()
• Add Event System to Hierarchy
• Add component Gaze Input Module
139. Add Gaze Behaviour
• Edit button scripts to add cube colour changing
• Add public CubeActions object
• public CubeActions m_cube;
• Call set colour function in OnLook function
• m_cube.SetColorBlue();
• Drag Cube object to script form
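A sketch of the CubeActions script attached to the Cube is below; it assumes the Cube uses a standard material whose colour can be set directly.

```csharp
using UnityEngine;

// Sketch of the CubeActions script on the Cube
public class CubeActions : MonoBehaviour {

    public void SetColorBlue () {
        GetComponent<Renderer>().material.color = Color.blue;
    }

    public void SetColorWhite () {
        GetComponent<Renderer>().material.color = Color.white;
    }
}
```

Each button script then declares a public CubeActions m_cube field, calls the matching set-colour function in OnLook(), and the Cube is dragged onto the field in the Inspector.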
141. Testing It Out
• Cube changes colour depending on button looked at
142. Making the Menu Disappear
• Don't want menu visible all the time
• Right click with mouse to appear/disappear
• Double tap with VR headset to appear/disappear
• Create menu script
• ToggleMenu function - turns menu on and off
• Note: Add script to User Interface object
• Add menu image as argument
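A sketch of such a menu script is below. The class and field names (MenuControl, menuImage) are illustrative; the script only handles the right-click case, with the headset double tap left as a comment.

```csharp
using UnityEngine;

// Sketch of a menu toggle script attached to the UserInterface object;
// menuImage is the menu's image GameObject, assigned in the Inspector.
public class MenuControl : MonoBehaviour {
    public GameObject menuImage;

    void Update () {
        // Right mouse button toggles the menu on desktop;
        // a double tap could trigger the same call on a headset
        if (Input.GetMouseButtonDown(1))
            ToggleMenu();
    }

    public void ToggleMenu () {
        // Turn the menu on or off
        menuImage.SetActive(!menuImage.activeSelf);
    }
}
```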
146. Moving a Menu with the User
• World aligned menus good for actions on objects
• e.g. select to change colour
• However you may want to move a menu with the user
• e.g. menu for user navigation
• This project shows how to add a menu to the camera
• Menu moves with the user as they move through the VR scene
147. Key Steps
1. Start with scene from Project 6
2. Create canvas object
3. Add button to canvas
4. Create player
5. Add player movement script
6. Add script for canvas movement
148. User Experience
• Have a walk button on the ground
• When player looks down they can toggle button on and off
• Look at walk button, click to toggle walking on and off
149. Create MoveButton Canvas
• Create canvas object
• UI->Canvas
• Set render mode to world space
• Resize and reposition
• Put flat on plane, a little in front of camera
150. Add Image to Canvas
• Create image on canvas
• Right click canvas
• UI -> image
• Set image to transparent
• Set image size to smaller than canvas
151. Add Button to Image
• Right click image
• UI -> button
• Resize and move to fill image
• Set colour and pressed colour
• Set text to “Walk”
• Expand button to see text object
152. Create Player Object
• Create empty object
• Rename it Player
• Create empty child
• Rename it LocalTrans
• Move Canvas under LocalTrans
• Move Main Camera under Player
153. Add PlayerMove Script
• Add script to Main Camera
• ToggleWalk function that toggles walking
• If walking on then move camera
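The PlayerMove script could be sketched as follows; the speed value and the simple transform-based movement are assumptions for illustration.

```csharp
using UnityEngine;

// Sketch of the PlayerMove script on the Main Camera: ToggleWalk()
// flips a flag, and while walking the camera moves in the gaze direction.
public class PlayerMove : MonoBehaviour {
    public float speed = 2.0f;   // assumed walking speed
    bool walking = false;

    // Wired to the Walk button's On Click () action
    public void ToggleWalk () {
        walking = !walking;
    }

    void Update () {
        // Move forward along the gaze direction while walking is on
        if (walking)
            transform.position += transform.forward * Time.deltaTime * speed;
    }
}
```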
155. Connect Player Moving to Button
• Select Button Object
• In the Button Script On Click () Action
• Set target object as Main Camera
• Set target function as ToggleWalk
• PlayerMove -> ToggleWalk
156. Event System
• Make sure project has event system
• Add at same level as Player
• GameObject -> UI -> EventSystem
• Add Gaze Input Module component
• Add Component -> Gaze Input Module
• Remove Standalone Input Module script
• or deactivate it by unchecking its checkbox
157. Testing
• Look at Walk button and click
• Player moves, but button doesn't !
158. Moving Menu with Camera
• Add a script to the LocalTrans object
• CanvasMovement script
• Script does the following:
• finds the current camera position
• sets LocalTrans to that position
• rotates LocalTrans about y axis the same as camera
• Outcome:
• Menu moves with camera.
• User can look down to click on button
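The three steps above can be sketched as a short script; the cam field (the Main Camera's transform) is an assumed name, assigned in the Inspector.

```csharp
using UnityEngine;

// Sketch of the CanvasMovement script attached to LocalTrans
public class CanvasMovement : MonoBehaviour {
    public Transform cam;   // assumed: the Main Camera's transform

    void LateUpdate () {
        // Follow the camera's current position each frame
        transform.position = cam.position;

        // Rotate about the y axis only, so the menu turns with the
        // user but stays level when they look up or down
        transform.rotation = Quaternion.Euler(0.0f, cam.eulerAngles.y, 0.0f);
    }
}
```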
162. Google Design Guidelines
• Google’s Guidelines for good VR experiences:
• Physiological Considerations
• Interactive Patterns
• Setup
• Controls
• Feedback
• Display Reticle
• From http://www.google.com/design/spec-vr/designing-for-google-cardboard/a-new-dimension.html
163. Physiological Considerations
• Factors to Consider
• Head tracking
• User control of movement
• Use constant velocity
• Grounding with fixed objects
• Brightness changes
164. Interactive Patterns - Setup
• Setup factors to consider:
• Entering and exiting
• Headset adaptation
• Full Screen mode
• API calls
• Indicating VR apps
165. System Control
• Issuing a command to change system state or mode
• Examples
• Launching application
• Changing system settings
• Opening a file
• Etc.
• Key points
• Make commands visible to user
• Support easy selection
168. Interactive Patterns - Feedback
• Use audio and haptic feedback
• Reduce visual overload
• Audio alerts
• 3D spatial sound
• Phone vibrations
169. Interactive Patterns - Display Reticle
• Easier for users to target objects with a display reticle
• Can display reticle only when near target object
• Highlight objects (e.g. with light source) that user can target
170. Use Ray-casting technique
• “Laser pointer” attached
to virtual hand or gaze
• First object intersected by
ray may be selected
• User only needs to control
2 DOFs
• Proven to perform well
for remote selection
• Variants:
• Cone casting
• Snap-to-object rays
171. Gaze Directed Steering
• Move in direction that you are looking
• Very intuitive, natural navigation
• Can be used on simple HMDs (e.g. Google Cardboard)
• But: Can’t look in different direction while moving
172. Cardboard Design Lab Application
• Use Cardboard Design Lab app to explore design ideas
177. Useful Resources
• Google Cardboard main page
• https://www.google.com/get/cardboard/
• Developer Website
• https://vr.google.com/cardboard/developers/
• Building a VR app for Cardboard
• http://www.sitepoint.com/building-a-google-cardboard-vr-app-in-unity/
• Creating VR game for Cardboard
• http://danielborowski.com/posts/create-a-virtual-reality-game-for-google-cardboard/
• Moving in VR space
• http://www.instructables.com/id/Prototyping-Interactive-Environments-in-Virtual-Re/
178. Resources
• Unity Main site
• http://www.unity3d.com/
• Holistic Development with Unity
• http://holistic3d.com
• Official Unity Tutorials
• http://unity3d.com/learn/tutorials
• Unity Coder Blog
• http://unitycoder.com