BUILDING AR AND VR
EXPERIENCES
Mark Billinghurst
mark.billinghurst@unisa.edu.au
May 10th – 13th, 2016
Xi’an, China
Mark Billinghurst
▪  University of South Australia
▪  Past Director of HIT Lab NZ,
University of Canterbury
▪  PhD Univ. Washington
▪  Research on AR, mobile HCI,
Collaborative Interfaces
▪  More than 300 papers in AR, VR,
interface design
AR/VR Course
• Lectures
•  2:30 - 4:30 pm every day
•  Lectures/hands-on
• Logistics
•  Bring your own laptop if possible
•  Use Android phone
•  Share computer/phone
• Material
•  All material available for download
What You Will Learn
• AR/VR fundamentals + history
• Basics of Unity Programming
• How to make Panorama VR Applications
• How to create VR Scenes
• How to add Interactivity to VR Applications
• Using the Vuforia AR tracking library
• Creating AR scenes
• Adding AR interactivity
• Design guidelines
• Research directions
Schedule
• Tuesday May 10th
•  Introduction, Learning Unity, Building 360 VR scenes
• Wednesday May 11th
•  Creating 3D scenes, adding interactivity, good design
• Thursday May 12th
•  Introduction to AR, Vuforia basics, building AR scenes
•  Field trip to ARA demo space
• Friday May 13th
•  Adding interactivity, advanced AR tracking, research
ARA Demos
Hololens
CoolGlass
HTC Vive
And more …
INTRODUCTION
A Brief History of Time
• Trend
•  smaller, cheaper, more functions, more intimate
• Technology becomes invisible
•  Intuitive to use
•  Interface over internals
•  Form more important than function
•  Human centered design
A Brief History of Computing
• Trend
• smaller, cheaper, faster, more intimate, intelligent objects
• Computers need to become invisible
• hide the computer in the real world
•  Ubiquitous / Tangible Computing
• put the user inside the computer
•  Virtual Reality
Making Interfaces Invisible
Rekimoto, J. and Nagao, K. 1995. The world through the computer: computer augmented
interaction with real world environments. In Proceedings of the 8th Annual ACM Symposium on
User interface and Software Technology. UIST '95. ACM, New York, NY, 29-36.
Graphical User Interfaces
• Separation between real and digital worlds
•  WIMP (Windows, Icons, Menus, Pointer) metaphor
Ubiquitous Computing
• Computing and sensing embedded in real world
•  Particle devices, RFID, motes, arduino, etc
Virtual Reality
• Immersive VR
• Head mounted display, gloves
• Separation from the real world
Augmented Reality
1977 – Star Wars
Augmented Reality Definition
• Defining Characteristics [Azuma 97]
• Combines Real and Virtual Images
• Both can be seen at the same time
• Interactive in real-time
• The virtual content can be interacted with
• Registered in 3D
• Virtual objects appear fixed in space
Azuma, R. T. (1997). A survey of augmented reality. Presence, 6(4), 355-385.
2008 - CNN
AR vs VR
From Reality to Virtual Reality
Ubiquitous Computing Augmented Reality Virtual Reality
Milgram’s Reality-Virtuality continuum
Mixed Reality
Reality - Virtuality (RV) Continuum
Real
Environment
Augmented
Reality (AR)
Augmented
Virtuality (AV)
Virtual
Environment
"...anywhere between the extrema of the virtuality continuum."
P. Milgram and F. Kishino, A Taxonomy of Mixed Reality Visual Displays,
IEICE Transactions on Information and Systems, E77-D(12), pp. 1321-1329, 1994.
VIRTUAL REALITY
Virtual Reality
Computer generated multi-sensory simulation of an
artificial environment that is interactive and immersive.
David Zeltzer’s AIP Cube
•  Autonomy – User can react to events and stimuli
•  Interaction – User can interact with objects and the environment
•  Presence – User feels immersed through sensory input and output channels
(AIP cube diagram: Autonomy, Interaction and Presence form the three axes, with VR where all three are high)
Zeltzer, D. (1992). Autonomy, interaction, and presence. Presence: Teleoperators
& Virtual Environments, 1(1), 127-132.
Key Technologies
• Autonomy
•  Head tracking, body input
•  Intelligent systems
• Interaction
•  User input devices, HCI
• Presence
•  Graphics/audio/multisensory output
•  Multisensory displays
•  Visual, audio, haptic, olfactory, etc
Early Experimenters (1950’s – 80’s)
Heilig 1956
Sutherland 1965
Furness 1970’s
Ivan Sutherland HMD
The First Wave (1980’s – 90’s)
NASA 1989
VPL 1990’s
Virtuality 1990’s
Jaron Lanier
•  Founded VPL, coined term “Virtual Reality”
Desktop VR - 1995
•  Expensive - $150,000+
•  2 million polys/sec
•  VGA HMD – 30 Hz
•  Magnetic tracking
Second Wave (2010 - )
• Palmer Luckey
•  HMD hacker
•  Mixed Reality Lab (MxR)
• Oculus Rift (2011 - )
•  2012 - $2.4 million kickstarter
•  2014 - $2B acquisition FaceBook
•  $350 USD, 110° FOV
Oculus Rift
Sony Morpheus
HTC/Valve Vive
2016 - Rise of Consumer HMDs
Desktop VR 2016
• Graphics Desktop
• $1,500 USD
• >4 Billion poly/sec
• $600 HMD
• 1080x1200, 90Hz
• Optical tracking
• Room scale
https://immersivelifeblog.files.wordpress.com/2015/04/vr_history.jpg
Market Size
Computer Based vs. Mobile VR
•  1998: SGI O2 – CPU: 300 MHz, HDD: 9 GB, RAM: 512 MB, Camera: VGA 30 fps, Graphics: 500K poly/sec
•  2008: Nokia N95 (Mobile VR) – CPU: 332 MHz, HDD: 8 GB, RAM: 128 MB, Camera: VGA 30 fps, Graphics: 2M poly/sec
Mobile Phone AR & VR
• Mobile Phone AR
• Mobile phone
• Live camera view
• Sensor input (GPS, compass)
• Mobile Phone VR
• Mobile phone
• Sensor input (compass)
• Additional VR viewer
VR2GO (2013)
•  MxR Lab
•  3D print VR viewer for mobiles
•  Open source hardware + software
•  http://projects.ict.usc.edu/mxr/diy/vr2go/
Multiple Mobile VR Viewers Available
CARDBOARD VR
Google Cardboard
• Released 2014 (Google 20% project)
• >5 million shipped/given away
• Easy to use developer tools
•  Cardboard ($2)
•  Lenses ($10)
•  Magnets ($6)
•  Velcro ($3)
•  Rubber Band (1¢)
Software Components
Assembling the Cardboard Viewer
Version 1.0 vs Version 2.0
•  Version 1.0 – Android focused, magnetic switch, small phone
•  Version 2.0 – Touch input, iOS/Android, fits many phones
Many Different Cardboard Viewers
SAMPLE CARDBOARD
APPLICATIONS
Cardboard App
• 7 default experiences
•  Earth: Fly on Google Earth
•  Tour Guide: Visit sites with guides
•  YouTube: Watch popular videos
•  Exhibit: Examine cultural artifacts
•  Photo Sphere: Immersive photos
•  Street View: Drive along a street
•  Windy Day: Interactive short story
100s of Google Play Cardboard apps
Sample Applications
Cardboard Camera
• Capture 360 panoramas
• Stitch together images on phone
• View in VR on Cardboard
Google Expeditions
• Teacher led VR experiences
• https://www.google.com/edu/expeditions/
Building Your Own Application
• Cardboard Viewer
•  https://www.google.com/get/cardboard/
• Smart phone
•  Android/iOS
• Cardboard SDK
•  iOS, Android, Unity
•  https://developers.google.com/cardboard/
• Unity game engine (optional)
•  https://unity3d.com
• Content
Cardboard SDK
Features:
1.  Lens distortion correction
2.  Head tracking
3.  3D calibration
4.  Side-by-side rendering
5.  Stereo geometry configuration
6.  User input event handling
Unity Cardboard SDK
INTRODUCTION TO UNITY
Unity 3D Game Editor
SETUP
Download and Install
•  Go to unity3d.com/download
•  Use Download Assistant – pick components you want
Getting Started
•  First time running Unity you’ll be asked to create a project
•  Specify project name and location
•  Can pick asset packages (pre-made content)
Unity Interface
•  Toolbar, Scene, Hierarchy, Project, Inspector
Customizable Interface
Building Scenes
• Use GameObjects:
•  Containers that hold different components
•  Eg 3D model, texture, animation
• Use Inspector
•  View and edit object properties and other settings
• Use Scene View
•  Position objects, camera, lights, other GameObjects etc
• Scripting
•  Adding interaction, user input, events, etc
GameObjects
•  Every object in Scene is a GameObject
•  GameObjects contain Components
•  Eg Transform Component, Camera Component
Adding 3D Content
•  Create 3D asset using modeling package, or download
•  FBX, OBJ file formats for 3D models
•  Add file to Assets folder in Project
•  When project opened 3D model added to Project View
•  Drag mesh from Project View into Hierarchy or Scene View
•  Creates a game object
Positioning/Scaling Objects
•  Click on object and choose transform
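The same transform changes can also be made from a script; a minimal sketch, with illustrative class name and values:

using UnityEngine;

// Sketch: position, scale and rotate a GameObject from code instead of the Inspector
public class PlaceObject : MonoBehaviour {
    void Start () {
        transform.position = new Vector3(0f, 1f, 2f);        // move to (x, y, z)
        transform.localScale = new Vector3(2f, 2f, 2f);      // double the size
        transform.rotation = Quaternion.Euler(0f, 45f, 0f);  // rotate 45 degrees around Y
    }
}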
Unity Asset Store
•  Download thousands of models, scripts, animations, etc
•  https://www.assetstore.unity3d.com/
UNITY BASICS
Making a Simple Scene
1.  Create New Project
2.  Create Game Object
3.  Moving main camera position
4.  Adding lights
5.  Adding more objects
6.  Adding physics
7.  Changing object materials
8.  Adding script behaviour
Create Project
•  Create new folder and project
New Empty Project
Create GameObject
•  Load a Sphere into the scene
•  GameObject -> 3D Object -> Sphere
Moving main camera
•  Select Main Camera
•  Select translate icon
•  Move camera
Add Light
•  GameObject -> Light -> Directional Light
•  Use inspector to modify light properties (colour, intensity)
Add Physics
•  Select Sphere
•  Add Rigidbody component
•  Add Component -> Physics -> RigidBody
•  or Component -> Physics -> RigidBody
•  Modify inspector properties (mass, drag, etc)
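The same Rigidbody setup can also be done from code; a small sketch (values illustrative):

using UnityEngine;

// Sketch: add and configure a Rigidbody at runtime
public class AddPhysics : MonoBehaviour {
    void Start () {
        Rigidbody rb = gameObject.AddComponent<Rigidbody>();
        rb.mass = 1f;         // Inspector "Mass"
        rb.drag = 0.5f;       // Inspector "Drag"
        rb.useGravity = true; // set to false for the floating cubes
    }
}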
Add More Objects
•  Add several cubes
•  GameObject -> 3D Object -> Cube
•  Move cube
•  Add Rigid Body component (uncheck gravity)
Add Material
•  Assets -> Create -> Material
•  Click Albedo colour box in inspector
•  Select colour
•  Drag asset onto object to apply
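As an alternative to dragging a material, an object's colour can be changed from a script; a minimal sketch (class name illustrative):

using UnityEngine;

// Sketch: tint the attached object's material (equivalent to editing the Albedo colour)
public class TintObject : MonoBehaviour {
    void Start () {
        GetComponent<Renderer>().material.color = Color.red;
    }
}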
Add Script
•  Assets -> Create -> C# script
•  Edit script using Mono
•  Drag script onto Game Object
Example C# Script
GameObject Rotation
using UnityEngine;
using System.Collections;

public class spin : MonoBehaviour {

    // Use this for initialization
    void Start () {
    }

    // Update is called once per frame
    void Update () {
        // Rotate the object 10 degrees per frame around its Y axis
        this.gameObject.transform.Rotate(Vector3.up * 10);
    }
}
Scripting C# Unity 3D
•  void Awake():
•  Called when the script instance is loaded (at scene load, if the game object is active)
•  void Start():
•  Called before the first frame update
•  void FixedUpdate():
•  Called at a fixed rate, before physics calculations are made
•  void Update():
•  Called every frame before rendering
•  void LateUpdate():
•  Called once per frame, after all Update() calls have finished
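A minimal sketch (class name illustrative) showing where these callbacks sit in a MonoBehaviour:

using UnityEngine;

public class LifecycleDemo : MonoBehaviour {
    void Awake ()       { Debug.Log("Awake: script instance loaded"); }
    void Start ()       { Debug.Log("Start: before first frame update"); }
    void FixedUpdate () { /* physics-rate logic, e.g. applying forces */ }
    void Update ()      { /* per-frame logic, e.g. reading input, moving objects */ }
    void LateUpdate ()  { /* after all Update() calls, e.g. camera follow */ }
}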
Final Spinning Cube Scene
BUILDING AR AND VR
EXPERIENCES
Mark Billinghurst
mark.billinghurst@unisa.edu.au
May 10th – 13th
Xi’an
LECTURE 2: VR SCENES
IMMERSIVE PANORAMAS
Types of VR Experiences
• Immersive Spaces
•  360 Panoramas/Movies
•  High visual quality
•  Limited interactivity
•  Changing viewpoint orientation
• Immersive Experiences
•  3D graphics
•  Lower visual quality
•  High interactivity
•  Movement in space
•  Interact with objects
Immersive Panorama
•  High quality 360 image or video surrounding user
•  User can turn head to see different views
•  Fixed position
Demo: Cardboard Camera
• Capture 360 panoramas
• Stitch together images on phone
• View in VR on Cardboard
Example Applications
• VRSE – Storytelling for VR
•  http://vrse.com/
•  High quality 360 VR content
• New York Times VR Experience
•  NYTVR application
•  Documentary experiences
• Vrideo
•  http://vrideo.com/
•  Streamed immersive movies
Vrideo Website – vrideo.com
Capturing Panorama
• Stitching photos together
•  Image Composite Editor (Microsoft)
•  AutoPano (Kolor)
• Using 360 camera
•  Ricoh Theta-S
•  Fly360
Image Composite Editor (Microsoft)
•  Free panorama stitching tool
•  http://research.microsoft.com/en-us/um/redmond/projects/ice/
AutoPano (Kolor)
•  Automatically finds overlapping images and stitches them into a panorama
•  http://www.kolor.com/autopano/
Steps to Make Immersive Panorama
1.  Create a new project
2.  Load the Cardboard SDK
3.  Load a panorama image asset
4.  Create a Skymap
5.  Add to VR scene
6.  Deploy to mobile phone
Need
•  Google Cardboard SDK Unity package
•  Android SDK to install on Android phone
New Project
Load Cardboard SDK
•  Assets -> Import Package -> Custom Package
•  Navigate to CardboardSDKForUnity.unitypackage
•  Uncheck iOS (for Android build)
Load Cardboard Main Camera
•  Drag CardboardMain prefab into Hierarchy
•  Assets -> Cardboard -> Prefab
•  Delete Main Camera
Panorama Image Asset
•  Find/create suitable panorama image
•  Ideally 2K or higher resolution image
•  Google “Panorama Image Cubemap”
Add Image Asset to Project
•  Assets -> Import Asset
•  Select desired image
•  Set Texture Type to Cubemap
•  Set Mapping to Latitude-Longitude (Cylindrical)
Create Skybox Material
•  Assets -> Create -> Material
•  Name material
•  Set Shader to Skybox -> Cubemap
•  Drag texture to cubemap
Create Skybox
•  Window -> Lighting
•  Drag the Skybox material into the Skybox field
Panorama Image in Unity
One Last Thing..
•  CardboardMain -> Head -> Main Camera
•  Set Clear Flags to Skybox
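For reference, the same skybox and camera setup can be scripted; a sketch, assuming the skybox material is assigned in the Inspector:

using UnityEngine;

// Sketch: set the scene skybox and camera clear flags from code
public class SetSkybox : MonoBehaviour {
    public Material panoMaterial;   // the skybox material created above (assign in Inspector)

    void Start () {
        RenderSettings.skybox = panoMaterial;             // same as the Window -> Lighting skybox slot
        Camera.main.clearFlags = CameraClearFlags.Skybox; // "Clear Flags" = Skybox
    }
}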
Test It Out
•  Hit play, use alt/option key + mouse to look around
Deploy to Mobile (Android)
1.  Plug phone into USB
• make sure the device is in USB debugging mode
2.  Set correct build settings
3.  Player settings
• Other settings
•  Set Bundle Identifier -> com.Company.ProductName
• Resolution and Presentation
•  Default Orientation -> Landscape Left
4.  Build and run
Deploying to Phone
1.  Plug phone into USB
2.  Open Build Settings
3.  Change Target platform to Android
4.  Select Player Settings
5.  Resolution and Presentation
•  Default Orientation -> Landscape Left
6.  Under Other Settings
•  Edit Bundle Identifier – eg com.UniSA.cubeTest
•  Minimum API level
7.  Build and Run
•  Select .apk file name
Running on Phone
•  Droid@Screen View on Desktop
Making Immersive Movie
•  Create movie texture
•  Convert 360 video to .ogg or .mp4 file
•  Add video texture as asset
•  Make Sphere
•  Equirectangular UV mapping
•  Inward facing normals
•  Move camera to centre of sphere
•  Texture map video to sphere
•  Easy Movie Texture ($65)
•  Apply texture to 3D object
•  For 3D 360 video
•  Render two Spheres
•  http://bernieroehl.com/360stereoinunity/
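For desktop testing, Unity's built-in MovieTexture can drive the sphere material; a sketch (MovieTexture is not supported on mobile, which is why a plugin such as Easy Movie Texture is suggested above):

using UnityEngine;

// Sketch: play a 360 video assigned as the sphere material's texture (desktop only)
public class PlayPanoVideo : MonoBehaviour {
    void Start () {
        MovieTexture movie = (MovieTexture)GetComponent<Renderer>().material.mainTexture;
        movie.loop = true;
        movie.Play();
    }
}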
BUILDING AR AND VR
EXPERIENCES
Mark Billinghurst
mark.billinghurst@unisa.edu.au
May 10th – 13th
Xi’an
LECTURE 3: 3D SCENES
CREATING 3D
ENVIRONMENTS
3D Virtual Environments
• Viewing content in true 3D
• Moving/interacting with scene
Example: Cardboard Tuscany Drive
• Place viewer inside 3D scene
• Navigate by head pointing
Key Steps
1.  Creating a new project
2.  Load Cardboard SDK
3.  Replace camera with CardboardMain
4.  Loading in 3D asset packages
5.  Loading a SkyDome
6.  Adding a plane floor
New Project
•  Main Camera replaced with CardboardMain
Download Model Package
•  Magic Lamp from 3dFoin
•  Search on Asset store
Load Asset + Add to Scene
•  Assets -> Import Package -> Custom Package
•  Look for MagicLamp.unitypackage (If not installed already)
•  Drag MagicLamp_LOD0 to Hierarchy
•  Position and rotate
Import SkySphere package
•  SkySphere Volume1 on Asset store
•  Import SkySphere package
Add SkySphere to Scene
•  Drag Skyball_WithoutCap into Hierarchy
•  SkySphere_V1 -> Meshes
•  Rotate and Scale as needed
Add Ground Plane
•  GameObject -> 3D Object -> Plane
•  Set Scale X to 2.0, Z to 2.0
Testing View
•  Use alt/option key plus mouse to rotate view
Adding More Assets
•  Load from Asset store – look for free assets
•  Assets -> Import Package -> Custom Package
Final Scene
ADDING INTERACTIVITY
Moving through space
• Move by looking
•  Look at a target to turn moving on/off
• Button/tapping screen
• Being in a vehicle (e.g. Roller Coaster)
Adding Movement
Goal: Move in the direction the user is looking when the
Cardboard button is pressed or the screen is touched
• Key Steps
1.  Start with static screen
2.  Create movement script
3.  Add movement script to Camera head
4.  Deploy to mobile
Static Scene
Create Movement Script
•  Add new script object
•  Assets -> Create -> C# Script
•  Edit the script in Mono (a sketch of one possible movement script is shown below)
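One possible version of such a movement script, moving the Head forward while the screen is touched (class name and speed are illustrative):

using UnityEngine;

// Sketch: attach to the Head object; moves in the gaze direction while the
// screen is touched (the Cardboard v2 button registers as a screen touch)
public class MoveWithGaze : MonoBehaviour {
    public float speed = 2f;   // metres per second

    void Update () {
        if (Input.GetMouseButton(0)) {
            // transform.forward is the direction the Head is looking
            transform.position += transform.forward * speed * Time.deltaTime;
        }
    }
}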
Add Script to Scene
•  Drag Script onto Head object
•  CardboardMain -> Head
•  Uncheck Track Position Box
•  Adjust movement speed
Gaze Interaction
• Cause events to happen when looking at objects
•  Eg look at a target to shoot it
Steps
• Add physics ray caster
• Casts a ray from camera
• Add function to object to respond to gaze
• Eg particle effect
• Add event trigger to object
• Add event system to scene
• Add collider object to target object
Adding Physics Raycaster
• Select Main Camera
•  CardboardMain -> Head -> Main Camera
• Add Physics Raycaster Component
•  Add component -> Physics Raycaster
Add Gaze Function
•  Select target object (Lamp)
•  Add component -> script
•  Add stareAtLamp() public function
Add Event Trigger
•  Select Target Object (Lamp)
•  Add component
•  Event Trigger
•  Add New Event Type -> PointerExit
•  Add object to event
•  Hit the ‘+’ button
•  Drag the Lamp object into the box under Runtime Only
•  Select Function to run
•  Select function list -> scroll to stareAtLamp
Adding Event System
• Need Event System for trigger to work
• Select Lamp object
•  UI -> Event System
• Add gazeInputModule
•  Add component -> Cardboard -> Gaze Input Module
Add Collider to Object
•  Need to detect when the target is being looked at
•  Select Lamp Object
•  Add Sphere Collider
•  Add component -> Sphere Collider (type in “Sphere”)
•  Adjust position and radius of Sphere Collider if needed
Add Gaze Event
• Particles triggered when looking at the lamp
• Add particle system
•  Add Component -> Particle System
•  Pick colour
•  Set Emission rate to 0
• Add code to stareAtLamp() function
•  GetComponent<ParticleSystem>().Emit(10);
•  Turns the particle system on when the lamp is looked at (see the sketch below)
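A sketch of the lamp's script, using the stareAtLamp() function named above (the Event Trigger on the Lamp calls it when the gaze event fires):

using UnityEngine;

// Sketch: emit a burst of particles each time the gaze event calls stareAtLamp()
public class LampGaze : MonoBehaviour {
    public void stareAtLamp () {
        GetComponent<ParticleSystem>().Emit(10);
    }
}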
Gaze Demo
•  Particles launched from Lamp when looked at
Adding More Interactivity
•  Load Cardboard Demo application
•  Assets -> Import Package -> Custom Package
•  Load CardboardDemoForUnity.unitypackage
•  Launch Demo Scene
•  Assets -> Cardboard -> DemoScene
Features Shown
•  Gaze reticle + selection
•  Viewpoint teleportation
•  Menu panel overlay
•  Audio feedback
•  Event system
DESIGN GUIDELINES
Google Design Guidelines
• Google’s Guidelines for good VR experiences:
•  Physiological Considerations
•  Interactive Patterns
•  Setup
•  Controls
•  Feedback
•  Display Reticle
•  From http://www.google.com/design/spec-vr/designing-
for-google-cardboard/a-new-dimension.html
Physiological Considerations
• Factors to Consider
•  Head tracking
•  User control of movement
•  Use constant velocity
•  Grounding with fixed objects
•  Brightness changes
Interactive Patterns - Setup
• Setup factors to consider:
• Entering and exiting
• Headset adaptation
• Full Screen mode
• API calls
• Indicating VR apps
Interactive Patterns - Controls
• Use fuse buttons (gaze and hold) for selection in VR
Interactive Patterns - Feedback
• Use audio and haptic feedback
• Reduce visual overload
• Audio alerts
• 3D spatial sound
• Phone vibrations
Interactive Patterns - Display Reticle
•  Easier for users to target objects with a display reticle
•  Can display reticle only when near target object
•  Highlight objects (e.g. with light source) that user can target
Cardboard Design Lab Application
•  Use Cardboard Design Lab app to explore design ideas
BUILDING AR AND VR
EXPERIENCES
Mark Billinghurst
mark.billinghurst@unisa.edu.au
May 10th – 13th
Xi’an
LECT. 4: AUGMENTED REALITY
1977 – Star Wars – Augmented Reality
Augmented Reality Definition
• Defining Characteristics [Azuma 97]
• Combines Real and Virtual Images
• Both can be seen at the same time
• Interactive in real-time
• The virtual content can be interacted with
• Registered in 3D
• Virtual objects appear fixed in space
Azuma, R. T. (1997). A survey of augmented reality. Presence, 6(4), 355-385.
2008 - CNN
Augmented Reality Examples
Where Can You Use AR/VR?
Milgram’s Reality-Virtuality continuum
Mixed Reality
Reality - Virtuality (RV) Continuum
Real
Environment
Augmented
Reality (AR)
Augmented
Virtuality (AV)
Virtual
Environment
"...anywhere between the extrema of the virtuality continuum."
P. Milgram and F. Kishino, A Taxonomy of Mixed Reality Visual Displays,
IEICE Transactions on Information and Systems, E77-D(12), pp. 1321-1329, 1994.
Summary
• Augmented Reality has three key features
• Combines Real and Virtual Images
• Interactive in real-time
• Registered in 3D
• AR can be classified alongside other technologies
• Milgram’s Mixed Reality continuum
TECHNOLOGY
Augmented Reality Definition
• Defining Characteristics
• Combines Real and Virtual Images
• Display Technology
• Interactive in real-time
• Interaction Technology
• Registered in 3D
• Tracking Technology
DISPLAY
Display Technologies
" Types (Bimber/Raskar 2003)
" Head attached
•  Head mounted display/projector
" Body attached
•  Handheld display/projector
" Spatial
•  Spatially aligned projector/monitor
TRACKING
Objects Registered in 3D
• Registration
• Positioning virtual object wrt real world
• Tracking
• Continually locating the user’s viewpoint
•  Position (x,y,z), Orientation (r,p,y)
Tracking Technologies
• Active
•  Mechanical, Magnetic, Ultrasonic
•  GPS, Wifi, cell location
• Passive
•  Inertial sensors (compass, accelerometer, gyro)
•  Computer Vision
•  Marker based, Natural feature tracking
• Hybrid Tracking
•  Combined sensors (eg Vision + Inertial)
Tracking Types
• Mechanical Tracker
• Magnetic Tracker
• Inertial Tracker
• Ultrasonic Tracker
• Optical Tracker
•  Specialized Tracking
•  Marker-Based Tracking
•  Markerless Tracking
•  Edge-Based Tracking
•  Template-Based Tracking
•  Interest Point Tracking
INTERACTION
• Interface Components
• Physical components
• Display elements
• Visual/audio
• Interaction metaphors
AR Interface Elements
•  Input: Physical Elements
•  Interaction Metaphor
•  Output: Display Elements
AR Design Space
•  A continuum from Reality (Physical Design) to Virtual Reality (Virtual Design), with Augmented Reality in between
AR APPLICATIONS
•  Web based AR
•  Flash, HTML 5 based AR
•  Marketing, education
•  Outdoor Mobile AR
•  GPS, compass tracking
•  Viewing Points of Interest in real world
•  Eg: Junaio, Layar, Wikitude
•  Handheld AR
•  Vision based tracking
•  Marketing, gaming
•  Location Based Experiences
•  HMD, fixed screens
•  Museums, point of sale, advertising
Typical AR Experiences
CityViewAR Application
•  Visualize Christchurch before the earthquakes
User Experience
• Multiple Views
• Map View, AR View, List View
• Multiple Data Types
• 2D images, 3D content, text, panoramas
Warp Runner
• Puzzle solving game
• Deform real world terrain
Demo: colAR
• Turn colouring books pages into AR scenes
• Markerless tracking, use your own colours..
• Try it yourself: http://www.colARapp.com/
What Makes a Good AR Experience?
• Compelling
• Engaging, ‘Magic’ moment
• Intuitive, ease of use
• Uses existing skills
• Anchored in physical world
• Seamless combination of real and digital
USING VUFORIA
Mark Billinghurst
mark.billinghurst@unisa.edu.au
What you will learn
•  Introduction to Vuforia
•  Platform and features
•  How to install/set-up Vuforia
•  Vuforia Basics
•  Marker Tracking, Object tracking
•  Deploying to Mobile Device
•  Android, iOS
OVERVIEW
Vuforia Overview
•  Platform for Mobile Computer Vision
•  https://www.qualcomm.com/products/vuforia
•  Released by Qualcomm in 2010, acquired by PTC 2015
•  Used by over 200K developers, >20K applications
•  Main Features:
•  Recognition
•  Image, text, object recognition
•  Tracking
•  Image, marker, scene, object
Vuforia Provides
•  Device SDK: Android, iOS, Unity Extension
•  Tools & Services: Target Management System, App Development Guide, Vuforia Web Services
•  Support Forum: Dedicated technical support engineers, thousands of posts
Vuforia Features
•  Tracking Targets: Image, Object, Environment
•  Developer Tools: Target Manager, Cloud Services
Platform Anatomy
User Experiences Enabled
INSTALLATION
Download Vuforia for Unity SDK
•  https://developer.vuforia.com/downloads/sdk
Download Samples
•  https://developer.vuforia.com/downloads/samples
Installing Vuforia Unity Extension
• Create new Unity Project
• Import the Vuforia Unity Extension
•  Double-click the *.unitypackage file
•  Eg vuforia-unity-5-0-6.unitypackage
•  Or manually install the package
•  Assets -> Import Package -> Custom Package
• The extension archive will self-install
•  folders, plugins, libraries, etc
Imported Vuforia Assets
Unity Asset Structure
•  Editor - Contains the scripts required to
interact with Target data in the Unity editor
•  Plugins - Contains Java and native binaries
that integrate the Vuforia AR SDK with the
Unity Android or Unity iOS application
•  Vuforia - Contains the prefabs and scripts
required to bring AR to your application
•  Streaming Assets / QCAR - Contains the
Device Database configuration XML and
DAT files from the online Target Manager
USING VUFORIA
Setting up a Vuforia Project
• Register as Developer
• Create a Project
• Obtain a License Key
• Load Vuforia package into Unity
• Add license key to AR Camera
• Add Tracking Targets
• Move ImageTarget into Scene
• Add sample object to ImageTarget
Register as Developer
•  https://developer.vuforia.com/user/register
Download Vuforia Packages
•  Go to download URL – log in
•  https://developer.vuforia.com/downloads/sdk
•  Download Current SDK for Unity
•  vuforia-unity-5-5-9.unitypackage
•  Download Core Features Sample
•  vuforia-samples-core-unity-5-5-9.zip
Create License Key
•  https://developer.vuforia.com/targetmanager/licenseManager/licenseListing
Obtain a License Key
•  Vuforia 5.5 apps utilize a license key that uniquely identifies
each app. License keys are created in the License Manager
•  The steps to create a new license key are:
•  Choose a SDK
•  Choose a licensing option based on your requirements
•  Provide your Billing Information if you've chosen to use a paid license
•  Obtain your license Key
License Key Generated – Save This
Load Vuforia Package
•  Open Unity
•  Load package
•  Assets -> Import Package -> Custom Package
•  Load vuforia-unity-5-5-9 (or current version)
•  Note:
•  On Windows, Vuforia only works with the 32-bit version of Unity
•  You may need to download the 32-bit Unity version
Add License Key to Vuforia Project
•  Open ARCamera Inspector in Vuforia
•  Assets -> Vuforia -> Prefabs
•  Move AR Camera to scene hierarchy (Delete Main Camera)
•  Paste License Key
Adding Tracking Targets
•  Create a target on the Target Manager
•  https://developer.vuforia.com/targetmanager/
•  OR - Use existing targets from other projects
Which Type of Database
•  Device Database vs. Cloud Database?
•  Device: local, Cloud: online
Creating a Target
•  Create a database
•  Add targets
Selecting Target Type
Sample Tracking Images
Loaded Image Target
• Rating indicates how good a target is for tracking
• Download Dataset -> creates a Unity package
• Eg StoneImage.unitypackage
Loading the Tracking Image
•  Import tracking dataset package
•  Assets -> Import Package -> Custom Package
•  Drag ImageTarget prefab into Scene Hierarchy
•  Select ImageTarget, pick Data Set then Image Target
•  Set image width
•  On AR Camera load target database and activate
•  Database Load Behaviour
ImageTarget Loaded
Testing the Camera View
Add 3D Content
•  As a test, create a simple Cube object
•  GameObject > Create Other > Cube
•  Add the cube as a child of the ImageTarget object by
dragging it onto the ImageTarget item.
•  Move the cube until it is centered on the Image Target.
AR Test View
DEPLOYING TO MOBILE
APPLICATION
• Unity
•  Creating the Application
•  Configure the export settings and build the Application
Building for Android
• Open Build Settings
• Change Target platform to Android
• Switch Platform
• Under Player Settings
•  Edit Bundle Identifier – eg com.UniSA.cubeTest
•  Minimum API level
• Build and Run
•  Select .apk file name
ADDING INTERACTION
Adding Interaction
•  Look at Vuforia core samples
•  Virtual Buttons
•  Create virtual buttons on the page that trigger actions
•  Eg touching a button changes the object colour (see the sketch below)
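A sketch of a virtual button handler, based on the IVirtualButtonEventHandler pattern used in the Vuforia core samples (class name, field and colours are illustrative):

using UnityEngine;
using Vuforia;

// Sketch: change a cube's colour when a Vuforia virtual button is pressed/released.
// Attach to the ImageTarget that contains the virtual button(s).
public class VirtualButtonColour : MonoBehaviour, IVirtualButtonEventHandler {
    public GameObject cube;   // the object whose colour changes

    void Start () {
        // Register for events from every virtual button under this target
        foreach (var vb in GetComponentsInChildren<VirtualButtonBehaviour>())
            vb.RegisterEventHandler(this);
    }

    public void OnButtonPressed (VirtualButtonAbstractBehaviour vb) {
        cube.GetComponent<Renderer>().material.color = Color.green;
    }

    public void OnButtonReleased (VirtualButtonAbstractBehaviour vb) {
        cube.GetComponent<Renderer>().material.color = Color.white;
    }
}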
Virtual Buttons
RESOURCES
Books
• Unity Virtual Reality Projects
•  Jonathan Linowes
• Holistic Game Development
with Unity
•  Penny de Byl
Cardboard Resources
•  Google Cardboard main page
•  https://www.google.com/get/cardboard/
•  Developer Website
•  https://www.google.com/get/cardboard/developers/
•  Building a VR app for Cardboard
•  http://www.sitepoint.com/building-a-google-cardboard-vr-app-in-unity/
•  Creating VR game for Cardboard
•  http://danielborowski.com/posts/create-a-virtual-reality-game-for-google-cardboard/
•  Moving in VR space
•  http://www.instructables.com/id/Prototyping-Interactive-Environments-in-Virtual-Re/
Vuforia Resources
•  Vuforia Product Page
https://www.qualcomm.com/products/vuforia
•  Vuforia Developer Page
https://developer.vuforia.com
•  SDK Download Page
https://developer.vuforia.com/downloads/sdk
•  Installing Vuforia for Unity extension
http://developer.vuforia.com/library/articles/Solution/Installing-the-Unity-Extension
•  Tutorials
https://developer.vuforia.com/resources/tutorials
Unity Resources
• Unity Main site
• http://www.unity3d.com/
• Holistic Development with Unity
• http://holistic3d.com
• Official Unity Tutorials
• http://unity3d.com/learn/tutorials
• Unity Coder Blog
• http://unitycoder.com
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au

Contenu connexe

Tendances

Seminar ppt on google cardboard
Seminar ppt on google cardboardSeminar ppt on google cardboard
Seminar ppt on google cardboardPankaj Kushwaha
 
Empathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseEmpathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseMark Billinghurst
 
Empathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseEmpathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseMark Billinghurst
 
Research Directions in Transitional Interfaces
Research Directions in Transitional InterfacesResearch Directions in Transitional Interfaces
Research Directions in Transitional InterfacesMark Billinghurst
 
2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VRMark Billinghurst
 
COMP 4010 - Lecture 1: Introduction to Virtual Reality
COMP 4010 - Lecture 1: Introduction to Virtual RealityCOMP 4010 - Lecture 1: Introduction to Virtual Reality
COMP 4010 - Lecture 1: Introduction to Virtual RealityMark Billinghurst
 
Comp4010 lecture3-AR Technology
Comp4010 lecture3-AR TechnologyComp4010 lecture3-AR Technology
Comp4010 lecture3-AR TechnologyMark Billinghurst
 
Lecture 2 Presence and Perception
Lecture 2 Presence and PerceptionLecture 2 Presence and Perception
Lecture 2 Presence and PerceptionMark Billinghurst
 
Comp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsComp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsMark Billinghurst
 
Comp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsComp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsMark Billinghurst
 
Augmented Reality
Augmented Reality Augmented Reality
Augmented Reality Kiran Kumar
 
2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR PrototypingMark Billinghurst
 
Comp 4010 2021 - Snap Tutorial-1
Comp 4010 2021 - Snap Tutorial-1Comp 4010 2021 - Snap Tutorial-1
Comp 4010 2021 - Snap Tutorial-1Mark Billinghurst
 
Virtual reality - Google Cardboard
Virtual reality - Google CardboardVirtual reality - Google Cardboard
Virtual reality - Google CardboardKarthik G N
 
2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR InteractionMark Billinghurst
 
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationTalk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationMark Billinghurst
 
Augmented reality ppt
Augmented reality pptAugmented reality ppt
Augmented reality pptDark Side
 

Tendances (20)

Seminar ppt on google cardboard
Seminar ppt on google cardboardSeminar ppt on google cardboard
Seminar ppt on google cardboard
 
Empathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader MetaverseEmpathic Computing: Designing for the Broader Metaverse
Empathic Computing: Designing for the Broader Metaverse
 
Augmented Reality
Augmented RealityAugmented Reality
Augmented Reality
 
Empathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole MetaverseEmpathic Computing: Developing for the Whole Metaverse
Empathic Computing: Developing for the Whole Metaverse
 
Research Directions in Transitional Interfaces
Research Directions in Transitional InterfacesResearch Directions in Transitional Interfaces
Research Directions in Transitional Interfaces
 
426 lecture2: AR Technology
426 lecture2: AR Technology426 lecture2: AR Technology
426 lecture2: AR Technology
 
2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR2022 COMP 4010 Lecture 7: Introduction to VR
2022 COMP 4010 Lecture 7: Introduction to VR
 
COMP 4010 - Lecture 1: Introduction to Virtual Reality
COMP 4010 - Lecture 1: Introduction to Virtual RealityCOMP 4010 - Lecture 1: Introduction to Virtual Reality
COMP 4010 - Lecture 1: Introduction to Virtual Reality
 
Comp4010 lecture3-AR Technology
Comp4010 lecture3-AR TechnologyComp4010 lecture3-AR Technology
Comp4010 lecture3-AR Technology
 
Lecture 2 Presence and Perception
Lecture 2 Presence and PerceptionLecture 2 Presence and Perception
Lecture 2 Presence and Perception
 
Comp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsComp4010 lecture11 VR Applications
Comp4010 lecture11 VR Applications
 
Comp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research DirectionsComp4010 Lecture12 Research Directions
Comp4010 Lecture12 Research Directions
 
Augmented Reality
Augmented Reality Augmented Reality
Augmented Reality
 
2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping2022 COMP4010 Lecture5: AR Prototyping
2022 COMP4010 Lecture5: AR Prototyping
 
Lecture 9 AR Technology
Lecture 9 AR TechnologyLecture 9 AR Technology
Lecture 9 AR Technology
 
Comp 4010 2021 - Snap Tutorial-1
Comp 4010 2021 - Snap Tutorial-1Comp 4010 2021 - Snap Tutorial-1
Comp 4010 2021 - Snap Tutorial-1
 
Virtual reality - Google Cardboard
Virtual reality - Google CardboardVirtual reality - Google Cardboard
Virtual reality - Google Cardboard
 
2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction2022 COMP4010 Lecture4: AR Interaction
2022 COMP4010 Lecture4: AR Interaction
 
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote CollaborationTalk to Me: Using Virtual Avatars to Improve Remote Collaboration
Talk to Me: Using Virtual Avatars to Improve Remote Collaboration
 
Augmented reality ppt
Augmented reality pptAugmented reality ppt
Augmented reality ppt
 

Similaire à Building AR and VR Experiences

Developing AR and VR Experiences with Unity
Developing AR and VR Experiences with UnityDeveloping AR and VR Experiences with Unity
Developing AR and VR Experiences with UnityMark Billinghurst
 
Create Your Own VR Experience
Create Your Own VR ExperienceCreate Your Own VR Experience
Create Your Own VR ExperienceMark Billinghurst
 
Building VR Applications For Google Cardboard
Building VR Applications For Google CardboardBuilding VR Applications For Google Cardboard
Building VR Applications For Google CardboardMark Billinghurst
 
Mobile AR Lecture1-introduction
Mobile AR Lecture1-introductionMobile AR Lecture1-introduction
Mobile AR Lecture1-introductionMark Billinghurst
 
2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: PerceptionMark Billinghurst
 
Cardboard VR: Building Low Cost VR Experiences
Cardboard VR: Building Low Cost VR ExperiencesCardboard VR: Building Low Cost VR Experiences
Cardboard VR: Building Low Cost VR ExperiencesMark Billinghurst
 
Virtual Reality: Sensing the Possibilities
Virtual Reality: Sensing the PossibilitiesVirtual Reality: Sensing the Possibilities
Virtual Reality: Sensing the PossibilitiesMark Billinghurst
 
COMP 4010 - Lecture 7: Introduction to Augmented Reality
COMP 4010 - Lecture 7: Introduction to Augmented RealityCOMP 4010 - Lecture 7: Introduction to Augmented Reality
COMP 4010 - Lecture 7: Introduction to Augmented RealityMark Billinghurst
 
COMP 4010: Lecture8 - AR Technology
COMP 4010: Lecture8 - AR TechnologyCOMP 4010: Lecture8 - AR Technology
COMP 4010: Lecture8 - AR TechnologyMark Billinghurst
 
2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XRMark Billinghurst
 
Comp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-PerceptionComp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-PerceptionMark Billinghurst
 
2016 AR Summer School Lecture2
2016 AR Summer School Lecture22016 AR Summer School Lecture2
2016 AR Summer School Lecture2Mark Billinghurst
 
Mobile AR Lecture 2 - Technology
Mobile AR Lecture 2 - TechnologyMobile AR Lecture 2 - Technology
Mobile AR Lecture 2 - TechnologyMark Billinghurst
 
Augmented Reality: The Next 20 Years
Augmented Reality: The Next 20 YearsAugmented Reality: The Next 20 Years
Augmented Reality: The Next 20 YearsMark Billinghurst
 
Comp 4010 2021 Lecture1-Introduction to XR
Comp 4010 2021 Lecture1-Introduction to XRComp 4010 2021 Lecture1-Introduction to XR
Comp 4010 2021 Lecture1-Introduction to XRMark Billinghurst
 

Similaire à Building AR and VR Experiences (20)

Developing AR and VR Experiences with Unity
Developing AR and VR Experiences with UnityDeveloping AR and VR Experiences with Unity
Developing AR and VR Experiences with Unity
 
Create Your Own VR Experience
Create Your Own VR ExperienceCreate Your Own VR Experience
Create Your Own VR Experience
 
AR-VR Workshop
AR-VR WorkshopAR-VR Workshop
AR-VR Workshop
 
Mobile AR Tutorial
Mobile AR TutorialMobile AR Tutorial
Mobile AR Tutorial
 
Easy Virtual Reality
Easy Virtual RealityEasy Virtual Reality
Easy Virtual Reality
 
Building VR Applications For Google Cardboard
Building VR Applications For Google CardboardBuilding VR Applications For Google Cardboard
Building VR Applications For Google Cardboard
 
Mobile AR Lecture1-introduction
Mobile AR Lecture1-introductionMobile AR Lecture1-introduction
Mobile AR Lecture1-introduction
 
2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception2022 COMP4010 Lecture2: Perception
2022 COMP4010 Lecture2: Perception
 
Cardboard VR: Building Low Cost VR Experiences
Cardboard VR: Building Low Cost VR ExperiencesCardboard VR: Building Low Cost VR Experiences
Cardboard VR: Building Low Cost VR Experiences
 
Virtual Reality: Sensing the Possibilities
Virtual Reality: Sensing the PossibilitiesVirtual Reality: Sensing the Possibilities
Virtual Reality: Sensing the Possibilities
 
COMP 4010 - Lecture 7: Introduction to Augmented Reality
COMP 4010 - Lecture 7: Introduction to Augmented RealityCOMP 4010 - Lecture 7: Introduction to Augmented Reality
COMP 4010 - Lecture 7: Introduction to Augmented Reality
 
COMP 4010: Lecture8 - AR Technology
COMP 4010: Lecture8 - AR TechnologyCOMP 4010: Lecture8 - AR Technology
COMP 4010: Lecture8 - AR Technology
 
Virtual Reality 2.0
Virtual Reality 2.0Virtual Reality 2.0
Virtual Reality 2.0
 
2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR2022 COMP4010 Lecture1: Introduction to XR
2022 COMP4010 Lecture1: Introduction to XR
 
Lecture 4: VR Systems
Lecture 4: VR SystemsLecture 4: VR Systems
Lecture 4: VR Systems
 
Comp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-PerceptionComp4010 2021 Lecture2-Perception
Comp4010 2021 Lecture2-Perception
 
2016 AR Summer School Lecture2
2016 AR Summer School Lecture22016 AR Summer School Lecture2
2016 AR Summer School Lecture2
 
Mobile AR Lecture 2 - Technology
Mobile AR Lecture 2 - TechnologyMobile AR Lecture 2 - Technology
Mobile AR Lecture 2 - Technology
 
Augmented Reality: The Next 20 Years
Augmented Reality: The Next 20 YearsAugmented Reality: The Next 20 Years
Augmented Reality: The Next 20 Years
 
Comp 4010 2021 Lecture1-Introduction to XR
Comp 4010 2021 Lecture1-Introduction to XRComp 4010 2021 Lecture1-Introduction to XR
Comp 4010 2021 Lecture1-Introduction to XR
 

Plus de Mark Billinghurst

Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsMark Billinghurst
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024Mark Billinghurst
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented RealityMark Billinghurst
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesMark Billinghurst
 
Empathic Computing: Delivering the Potential of the Metaverse
Empathic Computing: Delivering  the Potential of the MetaverseEmpathic Computing: Delivering  the Potential of the Metaverse
Empathic Computing: Delivering the Potential of the MetaverseMark Billinghurst
 
Empathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseEmpathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseMark Billinghurst
 
2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR SystemsMark Billinghurst
 
Novel Interfaces for AR Systems
Novel Interfaces for AR SystemsNovel Interfaces for AR Systems
Novel Interfaces for AR SystemsMark Billinghurst
 
2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR Technology2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR TechnologyMark Billinghurst
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsMark Billinghurst
 
Comp4010 Lecture13 More Research Directions
Comp4010 Lecture13 More Research DirectionsComp4010 Lecture13 More Research Directions
Comp4010 Lecture13 More Research DirectionsMark Billinghurst
 
Comp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsComp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsMark Billinghurst
 
Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Mark Billinghurst
 
Comp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface DesignComp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface DesignMark Billinghurst
 
Comp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and SystemsComp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and SystemsMark Billinghurst
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARMark Billinghurst
 

Plus de Mark Billinghurst (18)

Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
 
Future Research Directions for Augmented Reality
Future Research Directions for Augmented RealityFuture Research Directions for Augmented Reality
Future Research Directions for Augmented Reality
 
Evaluation Methods for Social XR Experiences
Evaluation Methods for Social XR ExperiencesEvaluation Methods for Social XR Experiences
Evaluation Methods for Social XR Experiences
 
Empathic Computing: Delivering the Potential of the Metaverse
Empathic Computing: Delivering  the Potential of the MetaverseEmpathic Computing: Delivering  the Potential of the Metaverse
Empathic Computing: Delivering the Potential of the Metaverse
 
Empathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the MetaverseEmpathic Computing: Capturing the Potential of the Metaverse
Empathic Computing: Capturing the Potential of the Metaverse
 
2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems2022 COMP4010 Lecture 6: Designing AR Systems
2022 COMP4010 Lecture 6: Designing AR Systems
 
ISS2022 Keynote
ISS2022 KeynoteISS2022 Keynote
ISS2022 Keynote
 
Novel Interfaces for AR Systems
Novel Interfaces for AR SystemsNovel Interfaces for AR Systems
Novel Interfaces for AR Systems
 
2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR Technology2022 COMP4010 Lecture3: AR Technology
2022 COMP4010 Lecture3: AR Technology
 
Empathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive AnalyticsEmpathic Computing and Collaborative Immersive Analytics
Empathic Computing and Collaborative Immersive Analytics
 
Metaverse Learning
Metaverse LearningMetaverse Learning
Metaverse Learning
 
Comp4010 Lecture13 More Research Directions
Comp4010 Lecture13 More Research DirectionsComp4010 Lecture13 More Research Directions
Comp4010 Lecture13 More Research Directions
 
Comp4010 lecture11 VR Applications
Comp4010 lecture11 VR ApplicationsComp4010 lecture11 VR Applications
Comp4010 lecture11 VR Applications
 
Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality Grand Challenges for Mixed Reality
Grand Challenges for Mixed Reality
 
Comp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface DesignComp4010 Lecture10 VR Interface Design
Comp4010 Lecture10 VR Interface Design
 
Comp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and SystemsComp4010 Lecture9 VR Input and Systems
Comp4010 Lecture9 VR Input and Systems
 
Advanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise ARAdvanced Methods for User Evaluation in Enterprise AR
Advanced Methods for User Evaluation in Enterprise AR
 

Dernier

Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Enterprise Knowledge
 
Gen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdfGen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdfAddepto
 
Leverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage Cost
Leverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage CostLeverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage Cost
Leverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage CostZilliz
 
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek SchlawackFwdays
 
SAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxSAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxNavinnSomaal
 
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks..."LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...Fwdays
 
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)Wonjun Hwang
 
Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Commit University
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Patryk Bandurski
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationRidwan Fadjar
 
Developer Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLDeveloper Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLScyllaDB
 
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024BookNet Canada
 
Search Engine Optimization SEO PDF for 2024.pdf
Search Engine Optimization SEO PDF for 2024.pdfSearch Engine Optimization SEO PDF for 2024.pdf
Search Engine Optimization SEO PDF for 2024.pdfRankYa
 
WordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your BrandWordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your Brandgvaughan
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfAlex Barbosa Coqueiro
 
Powerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time ClashPowerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time Clashcharlottematthew16
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyAlfredo García Lavilla
 
"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr BaganFwdays
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Mattias Andersson
 

Dernier (20)

Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024
 
Gen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdfGen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdf
 
Leverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage Cost
Leverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage CostLeverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage Cost
Leverage Zilliz Serverless - Up to 50X Saving for Your Vector Storage Cost
 
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
"Subclassing and Composition – A Pythonic Tour of Trade-Offs", Hynek Schlawack
 
SAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxSAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptx
 
DMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special EditionDMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special Edition
 
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks..."LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
 
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
Bun (KitWorks Team Study 노별마루 발표 2024.4.22)
 
Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 Presentation
 
Developer Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLDeveloper Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQL
 
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
 
Search Engine Optimization SEO PDF for 2024.pdf
Search Engine Optimization SEO PDF for 2024.pdfSearch Engine Optimization SEO PDF for 2024.pdf
Search Engine Optimization SEO PDF for 2024.pdf
 
WordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your BrandWordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your Brand
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdf
 
Powerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time ClashPowerpoint exploring the locations used in television show Time Clash
Powerpoint exploring the locations used in television show Time Clash
 
Commit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easyCommit 2024 - Secret Management made easy
Commit 2024 - Secret Management made easy
 
"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan"ML in Production",Oleksandr Bagan
"ML in Production",Oleksandr Bagan
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?
 

Building AR and VR Experiences

  • 1. BUILDING AR AND VR EXPERIENCES Mark Billinghurst mark.billinghurst@unisa.edu.au May 10th – 13th, 2016 Xi’an, China
  • 2. Mark Billinghurst ▪  University of South Australia ▪  Past Director of HIT Lab NZ, University of Canterbury ▪  PhD Univ. Washington ▪  Research on AR, mobile HCI, Collaborative Interfaces ▪  More than 300 papers in AR, VR, interface design
  • 3. AR/VR Course • Lectures •  2:30 - 4:30 pm everyday •  Lectures/hands-on • Logistics •  Bring your own laptop if possible •  Use Android phone •  Share computer/phone • Material •  All material available for download
  • 4. What You Will Learn • AR/VR fundamentals + history • Basics of Unity Programming • How to make Panorama VR Applications • How to create VR Scenes • How to add Interactivity to VR Applications • Using the Vuforia AR tracking library • Creating AR scenes • Adding AR interactivity • Design guidelines • Research directions
  • 5. Schedule • Tuesday May 10th •  Introduction, Learning Unity, Building 360 VR scenes • Wednesday May 11th •  Creating 3D scenes, adding interactivity, good design • Thursday May 12th •  Introduction to AR, Vuforia basics, building AR scenes •  Field trip to ARA demo space • Friday May 13th •  Adding interactivity, advanced AR tracking, research
• 8. A Brief History of Time • Trend •  smaller, cheaper, more functions, more intimate • Technology becomes invisible •  Intuitive to use •  Interface over internals •  Form more important than function •  Human centered design
  • 9. A Brief History of Computing • Trend • smaller, cheaper, faster, more intimate, intelligent objects • Computers need to become invisible • hide the computer in the real world •  Ubiquitous / Tangible Computing • put the user inside the computer •  Virtual Reality
  • 10. Making Interfaces Invisible Rekimoto, J. and Nagao, K. 1995. The world through the computer: computer augmented interaction with real world environments. In Proceedings of the 8th Annual ACM Symposium on User interface and Software Technology. UIST '95. ACM, New York, NY, 29-36.
  • 11. Graphical User Interfaces • Separation between real and digital worlds •  WIMP (Windows, Icons, Menus, Pointer) metaphor
  • 12. Ubiquitous Computing • Computing and sensing embedded in real world •  Particle devices, RFID, motes, arduino, etc
 • 13. Virtual Reality • Immersive VR • Head mounted display, gloves • Separation from the real world
 • 15. Augmented Reality Definition • Defining Characteristics [Azuma 97] • Combines Real and Virtual Images • Both can be seen at the same time • Interactive in real-time • The virtual content can be interacted with • Registered in 3D • Virtual objects appear fixed in space Azuma, R. T. (1997). A survey of augmented reality. Presence, 6(4), 355-385.
 • 18. From Reality to Virtual Reality Ubiquitous Computing Augmented Reality Virtual Reality
  • 19. Milgram’s Reality-Virtuality continuum Mixed Reality Reality - Virtuality (RV) Continuum Real Environment Augmented Reality (AR) Augmented Virtuality (AV) Virtual Environment "...anywhere between the extrema of the virtuality continuum." P. Milgram and A. F. Kishino, Taxonomy of Mixed Reality Visual Displays IEICE Transactions on Information and Systems, E77-D(12), pp. 1321-1329, 1994.
  • 21. Virtual Reality Computer generated multi-sensory simulation of an artificial environment that is interactive and immersive.
  • 22.
• 23. David Zeltzer’s AIP Cube • Autonomy – User can react to events and stimuli. • Interaction – User can interact with objects and environment. • Presence – User feels immersed through sensory input and output channels. Zeltzer, D. (1992). Autonomy, interaction, and presence. Presence: Teleoperators & Virtual Environments, 1(1), 127-132.
  • 24. Key Technologies • Autonomy •  Head tracking, body input •  Intelligent systems • Interaction •  User input devices, HCI • Presence •  Graphics/audio/multisensory output •  Multisensory displays •  Visual, audio, haptic, olfactory, etc
• 25. Early Experimenters (1950’s – 80’s) Heilig 1956 Sutherland 1965 Furness 1970’s
  • 27. The First Wave (1980’s – 90’s) NASA 1989 VPL 1990’s Virtuality 1990’s
  • 28. Jaron Lanier •  Founded VPL, coined term “Virtual Reality”
  • 29. Desktop VR - 1995 •  Expensive - $150,000+ •  2 million polys/sec •  VGA HMD – 30 Hz •  Magnetic tracking
• 30. Second Wave (2010 - ) • Palmer Luckey •  HMD hacker •  Mixed Reality Lab (MxR) • Oculus Rift (2011 - ) •  2012 - $2.4 million Kickstarter •  2014 - $2B acquisition by Facebook •  $350 USD, 110° FOV
• 32. 2016 - Rise of Consumer HMDs: Oculus Rift, Sony Morpheus, HTC/Valve Vive
  • 33. Desktop VR 2016 • Graphics Desktop • $1,500 USD • >4 Billion poly/sec • $600 HMD • 1080x1200, 90Hz • Optical tracking • Room scale
  • 36. Computer Based vs. Mobile VR
• 37. Mobile VR • 1998: SGI O2 – CPU: 300 MHz, HDD: 9 GB, RAM: 512 MB, Camera: VGA 30 fps, Graphics: 500K poly/sec • 2008: Nokia N95 – CPU: 332 MHz, HDD: 8 GB, RAM: 128 MB, Camera: VGA 30 fps, Graphics: 2M poly/sec
• 38. Mobile Phone AR & VR • Mobile Phone AR • Mobile phone • Live camera view • Sensor input (GPS, compass) • Mobile Phone VR • Mobile phone • Sensor input (compass) • Additional VR viewer
  • 39. VR2GO (2013) •  MxR Lab •  3D print VR viewer for mobiles •  Open source hardware + software •  http://projects.ict.usc.edu/mxr/diy/vr2go/
  • 40. Multiple Mobile VR Viewers Available
  • 44. Google Cardboard • Released 2014 (Google 20% project) • >5 million shipped/given away • Easy to use developer tools + =
  • 47. Version 1.0 vs Version 2.0 •  Version 1.0 – Android focused, magnetic switch, small phone •  Version 2.0 – Touch input, iOS/Android, fits many phones
  • 50. Cardboard App • 7 default experiences •  Earth: Fly on Google Earth •  Tour Guide: Visit sites with guides •  YouTube: Watch popular videos •  Exhibit: Examine cultural artifacts •  Photo Sphere: Immersive photos •  Street View: Drive along a street •  Windy Day: Interactive short story
  • 51. 100’s of Google Play Cardboard apps
  • 53. Cardboard Camera • Capture 360 panoramas • Stitch together images on phone • View in VR on Cardboard
  • 54. Google Expeditions • Teacher led VR experiences • https://www.google.com/edu/expeditions/
  • 55. Building Your Own Application • Cardboard Viewer •  https://www.google.com/get/cardboard/ • Smart phone •  Android/iOS • Cardboard SDK •  iOS, Android, Unity •  https://developers.google.com/cardboard/ • Unity game engine (optional) •  https://unity3d.com • Content
• 56. Cardboard SDK Features: 1.  Lens distortion correction. 2.  Head tracking. 3.  3D calibration. 4.  Side-by-side rendering. 5.  Stereo geometry configuration. 6.  User input event handling. Unity Cardboard SDK
  • 58.
  • 59. Unity 3D Game Editor
  • 60. SETUP
  • 61. Download and Install •  Go to unity3d.com/download •  Use Download Assistant – pick components you want
  • 62. Getting Started •  First time running Unity you’ll be asked to create a project •  Specify project name and location •  Can pick asset packages (pre-made content)
  • 63. Unity Interface •  Toolbar, Scene, Hierarchy, Project, Inspector
  • 65. Building Scenes • Use GameObjects: •  Containers that hold different components •  Eg 3D model, texture, animation • Use Inspector •  View and edit object properties and other settings • Use Scene View •  Position objects, camera, lights, other GameObjects etc • Scripting •  Adding interaction, user input, events, etc
  • 66. GameObjects •  Every object in Scene is a GameObject •  GameObjects contain Components •  Eg Transform Component, Camera Component
  • 67. Adding 3D Content •  Create 3D asset using modeling package, or download •  Fbx, Obj file format for 3D models •  Add file to Assets folder in Project •  When project opened 3D model added to Project View •  Drag mesh from Project View into Hierarchy or Scene View •  Creates a game object
  • 68. Positioning/Scaling Objects •  Click on object and choose transform
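The same positioning and scaling can also be done from code. Below is a minimal sketch (not from the original slides; the class name and values are made up for illustration) of a script attached to a GameObject that sets its Transform directly:

using UnityEngine;

// Hypothetical example: positioning, rotating and scaling a GameObject from a script
// instead of dragging the transform gizmos in the Scene View.
public class PlaceObject : MonoBehaviour {

    void Start () {
        // Place the object 2 units up and 5 units in front of the origin
        transform.position = new Vector3(0f, 2f, 5f);

        // Rotate 45 degrees around the vertical (Y) axis
        transform.rotation = Quaternion.Euler(0f, 45f, 0f);

        // Scale uniformly to half size
        transform.localScale = Vector3.one * 0.5f;
    }
}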
• 69. Unity Asset Store •  Download thousands of models, scripts, animations, etc •  https://www.assetstore.unity3d.com/
  • 71. Making a Simple Scene 1.  Create New Project 2.  Create Game Object 3.  Moving main camera position 4.  Adding lights 5.  Adding more objects 6.  Adding physics 7.  Changing object materials 8.  Adding script behaviour
• 72. Create Project •  Create new folder and project
  • 74. Create GameObject •  Load a Sphere into the scene •  GameObject -> 3D Object -> Sphere
  • 75. Moving main camera •  Select Main Camera •  Select translate icon •  Move camera
  • 76. Add Light •  GameObject -> Light -> Directional Light •  Use inspector to modify light properties (colour, intensity)
  • 77. Add Physics •  Select Sphere •  Add Rigidbody component •  Add Component -> Physics -> RigidBody •  or Component -> Physics -> RigidBody •  Modify inspector properties (mass, drag, etc)
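For reference, the same Rigidbody setup can be done from a script. This is a small illustrative sketch (the class name and property values are assumptions, not from the slides):

using UnityEngine;

// Hypothetical sketch: add and configure a Rigidbody in code, equivalent to
// Add Component -> Physics -> Rigidbody in the Inspector.
public class AddPhysics : MonoBehaviour {

    void Start () {
        Rigidbody rb = gameObject.AddComponent<Rigidbody>();
        rb.mass = 2f;          // heavier object
        rb.drag = 0.5f;        // a little air resistance
        rb.useGravity = true;  // set to false for the floating cubes on the next slide
    }
}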
• 78. Add More Objects •  Add several cubes •  GameObject -> 3D Object -> Cube •  Move cube •  Add Rigid Body component (uncheck gravity)
  • 79. Add Material •  Assets -> Create -> Material •  Click Albedo colour box in inspector •  Select colour •  Drag asset onto object to apply
  • 80. Add Script •  Assets -> Create -> C# script •  Edit script using Mono •  Drag script onto Game Object
• 81. Example C# Script – GameObject Rotation
using UnityEngine;
using System.Collections;

public class spin : MonoBehaviour {

    // Use this for initialization
    void Start () {

    }

    // Update is called once per frame
    void Update () {
        this.gameObject.transform.Rotate(Vector3.up * 10);
    }
}
  • 82. Scripting C# Unity 3D •  void Awake(): •  Is called when the first scene is loaded and the game object is active •  void Start(): •  Called on first frame update •  void FixedUpdate(): •  Called before physics calculations are made •  void Update(): •  Called every frame before rendering •  void LateUpdate(): •  Once per frame after update finished
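A minimal sketch of the call order described above (the class name and log messages are illustrative only):

using UnityEngine;

// Shows where the standard MonoBehaviour callbacks fit in the frame loop.
public class LifecycleDemo : MonoBehaviour {

    void Awake ()       { Debug.Log("Awake: scene loaded, object active"); }
    void Start ()       { Debug.Log("Start: first frame update"); }
    void FixedUpdate () { /* runs at the physics timestep, before physics calculations */ }
    void Update ()      { /* runs every frame, before rendering */ }
    void LateUpdate ()  { /* runs every frame, after all Update() calls have finished */ }
}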
  • 84. BUILDING AR AND VR EXPERIENCES Mark Billinghurst mark.billinghurst@unisa.edu.au May 10th – 13th Xi’an LECTURE 2: VR SCENES
  • 86.
• 87. Types of VR Experiences • Immersive Spaces •  360 Panoramas/Movies •  High visual quality •  Limited interactivity •  Changing viewpoint orientation • Immersive Experiences •  3D graphics •  Lower visual quality •  High interactivity •  Movement in space •  Interact with objects
  • 88. Immersive Panorama •  High quality 360 image or video surrounding user •  User can turn head to see different views •  Fixed position
  • 89. Demo: Cardboard Camera • Capture 360 panoramas • Stitch together images on phone • View in VR on Cardboard
  • 90. Example Applications • VRSE – Storytelling for VR •  http://vrse.com/ •  High quality 360 VR content • New York Times VR Experience •  NYTVR application •  Documentary experiences • Vrideo •  http://vrideo.com/ •  Streamed immersive movies
  • 91. Vrideo Website – vrideo.com
  • 92. Capturing Panorama • Stitching photos together •  Image Composite Editor (Microsoft) •  AutoPano (Kolor) • Using 360 camera •  Ricoh Theta-S •  Fly360
  • 93. Image Composite Editor (Microsoft) •  Free panorama stitching tool •  http://research.microsoft.com/en-us/um/redmond/projects/ice/
  • 94. AutoPano (Kolor) •  Finds image from panoramas and stitches them together •  http://www.kolor.com/autopano/
  • 95. Steps to Make Immersive Panorama 1.  Create a new project 2.  Load the Cardboard SDK 3.  Load a panorama image asset 4.  Create a Skymap 5.  Add to VR scene 6.  Deploy to mobile phone Need •  Google Cardboard SDK Unity package •  Android SDK to install on Android phone
  • 97. Load Cardboard SDK •  Assets -> Import Package -> Custom Package •  Navigate to CardboardSDKForUnity.unitypackage •  Uncheck iOS (for Android build)
  • 98. Load Cardboard Main Camera •  Drag CardboardMain prefab into Hierarchy •  Assets -> Cardboard -> Prefab •  Delete CameraMain
  • 99. Panorama Image Asset •  Find/create suitable panorama image •  Ideally 2K or higher resolution image •  Google “Panorama Image Cubemap”
• 100. Add Image Asset to Project •  Assets -> Import Asset •  Select desired image •  Set Texture Type to Cubemap •  Set mapping to Latitude-Longitude (Cylindrical)
  • 101. Create Skybox Material •  Assets -> Create -> Material •  Name material •  Set Shader to Skybox -> Cubemap •  Drag texture to cubemap
• 102. Create Skybox •  Window -> Lighting •  Drag Skybox material into the Skybox field
  • 104. One Last Thing.. •  CardboardMain -> Head -> Main Camera •  Set Clear Flags to Skybox
  • 105. Test It Out •  Hit play, use alt/option key + mouse to look around
• 106. Deploy to Mobile (Android) 1.  Plug phone into USB • make sure device is in debug mode 2.  Set correct build settings 3.  Player settings • Other settings •  Set Bundle Identifier -> com.Company.ProductName • Resolution and Presentation •  Default Orientation -> Landscape Left 4.  Build and run
  • 107. Deploying to Phone 1.  Plug phone into USB 2.  Open Build Settings 3.  Change Target platform to Android 4.  Select Player Settings 5.  Resolution and Presentation •  Default Orientation -> Landscape Left 6.  Under Other Settings •  Edit Bundle Identifier – eg com.UniSA.cubeTest •  Minimum API level 7.  Build and Run •  Select .apk file name
  • 108. Running on Phone •  Droid@Screen View on Desktop
• 109. Making Immersive Movie •  Create movie texture •  Convert 360 video to .ogg or .mp4 file •  Add video texture as asset •  Make Sphere •  Equirectangular UV mapping •  Inward facing normals •  Move camera to centre of sphere •  Texture map video to sphere •  Easy Movie Texture ($65) •  Apply texture to 3D object •  For 3D 360 video •  Render two Spheres •  http://bernieroehl.com/360stereoinunity/
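As a rough sketch of the video-on-a-sphere idea (desktop player only; Unity 5's built-in MovieTexture does not play on mobile, which is why the slide suggests a plugin such as Easy Movie Texture), a script on the inward-facing sphere might look like this. The class and field names are assumptions for illustration:

using UnityEngine;

// Sketch: map an imported video (MovieTexture) onto the sphere's material and loop it.
public class PlayPanoramaVideo : MonoBehaviour {

    public MovieTexture movie;   // assign the imported .ogg/.mp4 asset in the Inspector

    void Start () {
        GetComponent<Renderer>().material.mainTexture = movie;  // video becomes the sphere texture
        movie.loop = true;
        movie.Play();
    }
}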
  • 110. BUILDING AR AND VR EXPERIENCES Mark Billinghurst mark.billinghurst@unisa.edu.au May 10th – 13th Xi’an LECTURE 3: 3D SCENES
  • 112. 3D Virtual Environments • Viewing content in true 3D • Moving/interacting with scene
  • 113. Example: Cardboard Tuscany Drive • Place viewer inside 3D scene • Navigate by head pointing
  • 114. Key Steps 1.  Creating a new project 2.  Load Cardboard SDK 3.  Replace camera with CardboardMain 4.  Loading in 3D asset packages 5.  Loading a SkyDome 6.  Adding a plane floor
  • 115. New Project •  Camera replaced with CameraMain
  • 116. Download Model Package •  Magic Lamp from 3dFoin •  Search on Asset store
  • 117. Load Asset + Add to Scene •  Assets -> Import Package -> Custom Package •  Look for MagicLamp.unitypackage (If not installed already) •  Drag MagicLamp_LOD0 to Hierarchy •  Position and rotate
  • 118. Import SkySphere package •  SkySphere Volume1 on Asset store •  Import SkySphere package
  • 119. Add SkySphere to Scene •  Drag Skyball_WithoutCap into Hierarchy •  SkySphere_V1 -> Meshes •  Rotate and Scale as needed
  • 120. Add Ground Plane •  GameObject -> 3D Object -> Plane •  Set Scale X to 2.0, Z to 2.0
  • 121. Testing View •  Use alt/option key plus mouse to rotate view
  • 122. Adding More Assets •  Load from Asset store – look for free assets •  Assets -> Import Package -> Custom Package
• 125. Moving through space • Move by looking •  Look at a target to turn moving on/off • Button/tapping screen • Being in a vehicle (e.g. Roller Coaster)
• 126. Adding Movement Goal: Move in the direction the user is looking when the Cardboard button is pressed or the screen is touched • Key Steps 1.  Start with static scene 2.  Create movement script 3.  Add movement script to Camera head 4.  Deploy to mobile
  • 128. Create Movement Script •  Add new script object •  Assets -> Create -> C# Script •  Edit script in Mono
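A minimal movement script along these lines is sketched below. It assumes the script is attached to the Head object (with Track Position unchecked, as on the next slide) and uses a screen touch, which Unity also reports as mouse button 0, as a stand-in for the Cardboard trigger; the class name and speed value are illustrative:

using UnityEngine;

// Move in the gaze direction while the screen is touched (Cardboard v2 trigger).
public class MoveWithGaze : MonoBehaviour {

    public float speed = 2.0f;   // metres per second, tweak in the Inspector

    void Update () {
        if (Input.GetMouseButton(0)) {            // screen touch / trigger held down
            Vector3 forward = transform.forward;  // current gaze direction from head tracking
            forward.y = 0f;                       // keep movement on the ground plane
            transform.position += forward.normalized * speed * Time.deltaTime;
        }
    }
}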
• 129. Add Script to Scene •  Drag Script onto Head object •  CardboardMain -> Head •  Uncheck Track Position Box •  Adjust movement speed
• 130. Gaze Interaction • Cause events to happen when looking at objects •  Look at a target to shoot it
  • 131. Steps • Add physics ray caster • Casts a ray from camera • Add function to object to respond to gaze • Eg particle effect • Add event trigger to object • Add event system to scene • Add collider object to target object
  • 132. Adding Physics Raycaster • Select Main Camera •  CardboardMain -> Head -> Main Camera • Add Physics Raycaster Component •  Add component -> Physics Raycaster
  • 133. Add Gaze Function •  Select target object (Lamp) •  Add component -> script •  Add stareAtLamp() public function
• 134. Add Event Trigger •  Select Target Object (Lamp) •  Add component •  Event Trigger •  Add New Event Type -> PointerExit •  Add object to event •  Hit the ‘+’ button •  Drag Lamp object to box under Runtime Only •  Select Function to run •  Select function list -> scroll to stareAtLamp
  • 135. Adding Event System • Need Event System for trigger to work • Select Lamp object •  UI -> Event System • Add gazeInputModule •  Add component -> Cardboard -> Gaze Input Module
  • 136. Add Collider to Object •  Need to detect when target being looked at •  Select Lamp Object •  Add Sphere Collider •  Add component -> Sphere Collider (type in “Sphere”) •  Adjust position and radius of Sphere Collider if needed
• 137. Add Gaze Event • Particles triggered when looking at lamp • Add particle system •  Add Component -> Particle System •  Pick colour •  Set Emission rate to 0 • Add code to stareAtLamp() function •  GetComponent<ParticleSystem>().Emit(10); •  Turns particle system on when looked at
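Putting the gaze pieces together, the lamp's script might look like the sketch below (the class name is made up; stareAtLamp() matches the public function wired to the Event Trigger above):

using UnityEngine;

// Emits a burst of particles each time the gaze event calls stareAtLamp().
public class LampGaze : MonoBehaviour {

    public void stareAtLamp () {
        // Emission rate is set to 0 in the Inspector, so particles only appear
        // when we emit them explicitly here.
        GetComponent<ParticleSystem>().Emit(10);
    }
}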
  • 138. Gaze Demo •  Particles launched from Lamp when looked at
  • 139. Adding More Interactivity •  Load Cardboard Demo application •  Assets -> Import Package -> Custom Package •  Load CardboardDemoForUnity.unitypackage •  Launch Demo Scene •  Assets -> Cardboard -> DemoScene
  • 140. Features Shown •  Gaze reticle + selection •  Viewpoint teleportation •  Menu panel overlay •  Audio feedback •  Event system
• 142. Google Design Guidelines • Google’s Guidelines for good VR experiences: •  Physiological Considerations •  Interactive Patterns •  Setup •  Controls •  Feedback •  Display Reticle •  From http://www.google.com/design/spec-vr/designing-for-google-cardboard/a-new-dimension.html
  • 143. Physiological Considerations • Factors to Consider •  Head tracking •  User control of movement •  Use constant velocity •  Grounding with fixed objects •  Brightness changes
  • 144. Interactive Patterns - Setup • Setup factors to consider: • Entering and exiting • Headset adaptation • Full Screen mode • API calls • Indicating VR apps
• 145. Interactive Patterns - Controls • Use fuse buttons for selection in VR
  • 146. Interactive Patterns - Feedback • Use audio and haptic feedback • Reduce visual overload • Audio alerts • 3D spatial sound • Phone vibrations
  • 147. Interactive Patterns - Display Reticle •  Easier for users to target objects with a display reticle •  Can display reticle only when near target object •  Highlight objects (e.g. with light source) that user can target
  • 148. Cardboard Design Lab Application •  Use Cardboard Design Lab app to explore design ideas
  • 149. BUILDING AR AND VR EXPERIENCES Mark Billinghurst mark.billinghurst@unisa.edu.au May 10th – 13th Xi’an LECT. 4: AUGMENTED REALITY
  • 150.
• 151. 1977 – Star Wars – Augmented Reality
• 152. Augmented Reality Definition • Defining Characteristics [Azuma 97] • Combines Real and Virtual Images • Both can be seen at the same time • Interactive in real-time • The virtual content can be interacted with • Registered in 3D • Virtual objects appear fixed in space Azuma, R. T. (1997). A survey of augmented reality. Presence, 6(4), 355-385.
• 154. Augmented Reality Examples •  Put AR pictures here
  • 156. Milgram’s Reality-Virtuality continuum Mixed Reality Reality - Virtuality (RV) Continuum Real Environment Augmented Reality (AR) Augmented Virtuality (AV) Virtual Environment "...anywhere between the extrema of the virtuality continuum." P. Milgram and A. F. Kishino, Taxonomy of Mixed Reality Visual Displays IEICE Transactions on Information and Systems, E77-D(12), pp. 1321-1329, 1994.
 • 157. Summary • Augmented Reality has three key features • Combines Real and Virtual Images • Interactive in real-time • Registered in 3D • AR can be classified alongside other technologies • Milgram’s Mixed Reality continuum
 • 159. Augmented Reality Definition • Defining Characteristics • Combines Real and Virtual Images • Display Technology • Interactive in real-time • Interaction Technology • Registered in 3D • Tracking Technology
• 161. Display Technologies • Types (Bimber/Raskar 2003) • Head attached •  Head mounted display/projector • Body attached •  Handheld display/projector • Spatial •  Spatially aligned projector/monitor
• 163. Objects Registered in 3D • Registration • Positioning virtual object wrt real world • Tracking • Continually locating the user’s viewpoint •  Position (x,y,z), Orientation (r,p,y)
• 164. Tracking Technologies • Active •  Mechanical, Magnetic, Ultrasonic •  GPS, Wifi, cell location • Passive •  Inertial sensors (compass, accelerometer, gyro) •  Computer Vision •  Marker based, Natural feature tracking • Hybrid Tracking •  Combined sensors (eg Vision + Inertial)
• 167. AR Interface Elements • Interface Components •  Physical components •  Display elements •  Visual/audio •  Interaction metaphors • (Diagram: Input – Physical Elements – Interaction Metaphor – Display Elements – Output)
• 168. AR Design Space (diagram spanning Reality to Virtual Reality and Physical Design to Virtual Design, with Augmented Reality in between)
• 170. Typical AR Experiences •  Web based AR •  Flash, HTML 5 based AR •  Marketing, education •  Outdoor Mobile AR •  GPS, compass tracking •  Viewing Points of Interest in real world •  Eg: Junaio, Layar, Wikitude •  Handheld AR •  Vision based tracking •  Marketing, gaming •  Location Based Experiences •  HMD, fixed screens •  Museums, point of sale, advertising
• 172. User Experience • Multiple Views •  Map View, AR View, List View • Multiple Data Types •  2D images, 3D content, text, panoramas
  • 173. Warp Runner • Puzzle solving game • Deform real world terrain
• 174. Demo: colAR • Turn colouring book pages into AR scenes • Markerless tracking, use your own colours • Try it yourself: http://www.colARapp.com/
• 175. What Makes a Good AR Experience? • Compelling •  Engaging, ‘Magic’ moment • Intuitive, ease of use •  Uses existing skills • Anchored in physical world •  Seamless combination of real and digital
  • 177.
  • 178. What you will learn •  Introduction to Vuforia •  Platform and features •  How to install/set-up Vuforia •  Vuforia Basics •  Marker Tracking, Object tracking •  Deploying to Mobile Device •  Android, iOS
  • 180. Vuforia Overview •  Platform for Mobile Computer Vision •  https://www.qualcomm.com/products/vuforia •  Released by Qualcomm in 2010, acquired by PTC 2015 •  Used by over 200K developers, >20K applications •  Main Features: •  Recognition •  Image, text, object recognition •  Tracking •  Image, marker, scene, object
• 181. Vuforia Provides • Device SDK •  Android •  iOS •  Unity Extension • Tools & Services •  Target Management System •  App Development Guide •  Vuforia Web Services • Support Forum •  Dedicated technical support engineers •  Thousands of posts
  • 190. Download Vuforia for Unity SDK •  https://developer.vuforia.com/downloads/sdk
  • 192. Installing Vuforia Unity Extension • Create new Unity Project • Import the Vuforia Unity Extension •  Double clicking the *.unitypackage file •  Eg vuforia-unity-5-0-6.unitypackage •  Manually install package •  Assets -> Import Package -> Custom Package • The extension archive will self install •  folders, plugins and libraries, etc
  • 194. Unity Asset Structure •  Editor - Contains the scripts required to interact with Target data in the Unity editor •  Plugins - Contains Java and native binaries that integrate the Vuforia AR SDK with the Unity Android or Unity iOS application •  Vuforia - Contains the prefabs and scripts required to bring AR to your application •  Streaming Assets / QCAR - Contains the Device Database configuration XML and DAT files from the online Target Manager
  • 196. Setting up a Vuforia Project • Register as Developer • Create a Project • Obtain a License Key • Load Vuforia package into Unity • Add license key to AR Camera • Add Tracking Targets • Move ImageTarget into Scene • Add sample object to ImageTarget
  • 197. Register as Developer •  https://developer.vuforia.com/user/register
  • 198. Download Vuforia Packages •  Go to download URL – log in •  https://developer.vuforia.com/downloads/sdk •  Download Current SDK for Unity •  vuforia-unity-5-5-9.unitypackage •  Download Core Features Sample •  vuforia-samples-core-unity-5-5-9.zip
  • 199. Create License Key •  https://developer.vuforia.com/targetmanager/licenseManager/licenseListing
  • 200. Obtain a License Key •  Vuforia 5.5 apps utilize a license key that uniquely identifies each app. License keys are created in the License Manager •  The steps to creating a new license key are.. •  Choose a SDK •  Choose a licensing option based on your requirements •  Provide your Billing Information if you've chosen to use a paid license •  Obtain your license Key
  • 201. License Key Generated – Save This
  • 202. Load Vuforia Package •  Open Unity •  Load package •  Assets -> Import Package -> Custom Package •  Load vuforia-unity-5-5-9 (or current version) •  Note: •  On Windows Vuforia only works with 32 bit version of Unity •  You may need to download Unity 32 version to work
  • 203. Add License Key to Vuforia Project •  Open ARCamera Inspector in Vuforia •  Assets -> Vuforia -> Prefabs •  Move AR Camera to scene hierarchy (Delete Main Camera) •  Paste License Key
  • 204. Adding Tracking Targets •  Create a target on the Target Manager •  https://developer.vuforia.com/targetmanager/ •  OR - Use existing targets from other projects
  • 205. Which Type of Database •  Device Database vs. Cloud Database? •  Device: local, Cloud: online
  • 206. Creating a Target •  Create a database •  Add targets
• 209. Loaded Image Target • Rating indicates how good a target is • Download Dataset -> create unity package • Eg StoneImage.unitypackage
  • 210. Loading the Tracking Image •  Import tracking dataset package •  Assets -> Import Package -> Custom Package •  Drag ImageTarget prefab into Scene Hierarchy •  Select ImageTarget, pick Data Set then Image Target •  Set image width •  On AR Camera load target database and activate •  Database Load Behaviour
  • 213. Add 3D Content •  As a test, create a simple Cube object •  GameObject > Create Other > Cube •  Add the cube as a child of the ImageTarget object by dragging it onto the ImageTarget item. •  Move the cube until it is centered on the Image Target.
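By default the Vuforia samples show and hide the content under an ImageTarget as the marker is found and lost. The sketch below is modeled on that DefaultTrackableEventHandler pattern; exact class, namespace and enum names can differ between Vuforia SDK versions, so treat it as an outline rather than the library's verbatim API:

using UnityEngine;
using Vuforia;

// Enable the child renderers (e.g. the cube) while the Image Target is tracked.
public class TargetVisibility : MonoBehaviour, ITrackableEventHandler {

    void Start () {
        TrackableBehaviour tb = GetComponent<TrackableBehaviour>();
        if (tb != null) {
            tb.RegisterTrackableEventHandler(this);
        }
    }

    public void OnTrackableStateChanged (TrackableBehaviour.Status previousStatus,
                                         TrackableBehaviour.Status newStatus) {
        bool found = newStatus == TrackableBehaviour.Status.DETECTED ||
                     newStatus == TrackableBehaviour.Status.TRACKED ||
                     newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;

        foreach (Renderer r in GetComponentsInChildren<Renderer>(true)) {
            r.enabled = found;   // show content when the target is found, hide it when lost
        }
    }
}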
• 216. APPLICATION • Unity •  Creating the Application •  Configure the export settings and build the Application
  • 217. Building for Android • Open Build Settings • Change Target platform to Android • Switch Platform • Under Player Settings •  Edit Bundle Identifier – eg com.UniSA.cubeTest •  Minimum API level • Build and Run •  Select .apk file name
• 219. Adding Interaction •  Look at Vuforia core samples •  Virtual Buttons •  Create virtual buttons on the page that trigger actions •  Eg touching a button changes object colour
  • 222. Books • Unity Virtual Reality Projects •  Jonathan Linowes • Holistic Game Development with Unity •  Penny de Byl
• 223. Cardboard Resources •  Google Cardboard main page •  https://www.google.com/get/cardboard/ •  Developer Website •  https://www.google.com/get/cardboard/developers/ •  Building a VR app for Cardboard •  http://www.sitepoint.com/building-a-google-cardboard-vr-app-in-unity/ •  Creating VR game for Cardboard •  http://danielborowski.com/posts/create-a-virtual-reality-game-for-google-cardboard/ •  Moving in VR space •  http://www.instructables.com/id/Prototyping-Interactive-Environments-in-Virtual-Re/
• 224. Vuforia Resources •  Vuforia Product Page https://www.qualcomm.com/products/vuforia •  Vuforia Developer Page https://developer.vuforia.com •  SDK Download Page https://developer.vuforia.com/downloads/sdk •  Installing Vuforia for Unity extension http://developer.vuforia.com/library/articles/Solution/Installing-the-Unity-Extension •  Tutorials https://developer.vuforia.com/resources/tutorials
  • 225. Unity Resources • Unity Main site • http://www.unity3d.com/ • Holistic Development with Unity • http://holistic3d.com • Official Unity Tutorials • http://unity3d.com/learn/tutorials • Unity Coder Blog • http://unitycoder.com