All posts in "Virtual Reality"

VR Interaction – SteamVR Lab Interaction System – Throwables

Today, I’ll introduce you to VR Interaction using the Lab Interaction System.  If you’ve played The Lab, you already know it includes a variety of different interactions spread across a flurry of mini-games.  With a 98% positive review rating, it’s one of the most popular VR experiences, and the interaction system is one of my favorite parts.

Getting started with The Lab Interaction System only takes a couple quick steps.  In this guide, I’ll show you the setup and how to create your first grabbable & throwable object.

Setting Up The Lab Interaction System

Create an empty project

Import the SteamVR asset from the Asset Store

In your empty scene, delete the “Main Camera”

Add the [CameraRig] prefab from the SteamVR/Prefabs folder into your scene.

Create a new empty gameobject and name it Player

Reset the position of the Player in the Inspector

 

Right click on the “Player” GameObject in the hierarchy and create 2 Empty Children

Hands Setup

Name them LeftHand & RightHand

Add the Hand component to both the LeftHand & RightHand

Select the LeftHand

In the Inspector, drag the RightHand to the “Other Hand” field.

Select the RightHand

Assign the LeftHand to the “Other Hand” field

Change the “Starting Hand Type” to Right

Controller Prefab

Because we’re not assigning any actual hand models here, we’ll want to fill in the Controller Prefab field.  This will allow the hands to automatically create controller models much the same way the default [CameraRig] controllers do.

Select the left hand and assign the “BlankController” prefab to the “Controller Prefab” field.

Repeat this for the RIGHT HAND

The BlankController prefab can be found in the SteamVR/InteractionSystem/Core/Prefabs folder
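For reference, the same hand wiring can also be done from code.  This is just a hedged sketch using the field names from the SteamVR 1.x Interaction System (`otherHand`, `startingHandType`, `controllerPrefab`); later versions of the plugin changed this API, so check the `Hand` component source in your project.

```csharp
using UnityEngine;
using Valve.VR.InteractionSystem;

// Sketch: wiring the hands from code instead of the Inspector.
// Field names assume the SteamVR 1.x Interaction System Hand component.
public class HandSetup : MonoBehaviour
{
    [SerializeField] private Hand leftHand;
    [SerializeField] private Hand rightHand;
    [SerializeField] private GameObject blankControllerPrefab;

    private void Awake()
    {
        // Each hand needs a reference to the other.
        leftHand.otherHand = rightHand;
        rightHand.otherHand = leftHand;

        // The right hand must be told it's the right one (default is Left).
        rightHand.startingHandType = Hand.HandType.Right;

        // Without real hand models, let both hands spawn controller models.
        leftHand.controllerPrefab = blankControllerPrefab;
        rightHand.controllerPrefab = blankControllerPrefab;
    }
}
```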

Player Setup

Select the “Player” in the Hierarchy.

Add the Player Component to the “Player” GameObject

Drag the hands onto the “Hands” array.

Expand the [CameraRig] and select the “Camera (eye)”

Right click on the Camera (eye) and add a new 3D Object->Sphere as a child.

In the Inspector, change the Sphere Scale to 0.1, 0.1, 0.1

Uncheck (or remove) the Mesh Renderer component of the Sphere

Select the Player

Assign the newly created Sphere to the “Head Collider” field of the Player component

Drag the [CameraRig] to the “Rig Steam VR” field

Environment Setup

So far, we have the player all setup with hands and ready to go.  It’s time to create a little environment where we can grab and throw things!

Create a Plane

Reset the position

Create a Cube

Set the transform values to match the image above – position = (0, 0, 2.5) | scale = (1, 1, 4)

Your scene view should look similar to the image here.

Throwables

OK, it’s time to make something throwable.  Luckily, this is the easiest part.

Create a new Sphere

Rename it “Ball”

Set the Scale to (0.1, 0.1, 0.1)

Set the Position to (0, 0.6, 0.7)

Your ball should now be just above the Cube like this

Add the “Throwable” component to the “Ball”

You’ll immediately see a few other components get added to the ball automatically.
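The same setup can be sketched from code.  This is a hypothetical helper (the `BallSpawner` name and values are mine, taken from the steps above); adding `Throwable` pulls in its required components automatically, just like it does in the Inspector.

```csharp
using UnityEngine;
using Valve.VR.InteractionSystem;

// Sketch: creating the throwable ball from code instead of the editor.
public class BallSpawner : MonoBehaviour
{
    private void Start()
    {
        var ball = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        ball.name = "Ball";
        ball.transform.localScale = Vector3.one * 0.1f;
        ball.transform.position = new Vector3(0f, 0.6f, 0.7f);

        // Adding Throwable also adds the components it requires
        // (Interactable, Rigidbody, VelocityEstimator) via RequireComponent.
        ball.AddComponent<Throwable>();
    }
}
```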

That’s it!

Press Play

Pick up your ball (with the Trigger) and throw it!

Conclusions

The SteamVR Lab Interaction System is actually very full-featured, and this only covers the first of many possible interactions available.  The player and hand setup you have here will work as a great starting point for exploring the other interactions.  For a full demo of all the interactions, make sure you check out the “Interactions_Example” scene in the SteamVR/InteractionSystem/Samples/Scenes folder.  Also, if there’s a specific interaction type you’d like more info on, please comment below and let me know.

 

 


SteamVR Locomotion and Teleportation Movement

SteamVR Teleport

SteamVR ships with an easy-to-use teleport system that requires very little setup.  If you’ve played The Lab, you’ve seen the system in action.  It allows for quick teleportation to specified areas.

To setup the SteamVR teleportation system, there are a couple requirements.

Basic Setup

If you want to work along with the steps below, you’ll need the SteamVR plugin installed.  If you just want to read through, skip ahead to “Hands”.

Next, you’ll want to create a cube and scale it to (20, 1, 20).

Add some material to the floor; in my example below, you’ll see I went with a wood plank material I found somewhere.

Finally, add a [CameraRig] prefab from the SteamVR plugin prefabs folder.

Hands

The SteamVR teleport system is coupled with the lab’s interaction system.  As such, it requires the use of the Hand and player objects.

The way it’s done in the samples, and the way I prefer to do it, is to create two new empty children under the [CameraRig].

Rename them “Left Hand” & “Right Hand”

Add the “Hand” component to each of them.

Set the “Other Hand” field on both of them by dragging the other ‘hand’ into the field for each of them.

For the Right hand, make sure to set the “Starting Hand Type” to right (the default value is left).

Player

For teleport interactions to work, you’ll need to add the “Player” component to the [CameraRig].

Drag & add the Camera(eye) to the “Hmd Transforms” array.

Add the Left & Right hands to the “Hands” array.

Teleport Prefab

Next, you’ll need to add the “Teleporting” prefab to the scene.

You can find the “Teleporting” prefab in SteamVR/InteractionSystem/Teleport/Prefabs

Drag & Add the Teleporting prefab to the scene.

There are plenty of options you can adjust on the Teleport prefab, but for now, we’ll leave it at the defaults.

Teleport Points

Before we can teleport, we’ll need to tell the system where a valid teleport point is.  Again, the SteamVR system comes with a great prefab to get started.

Take the TeleportPoint prefab and place a couple instances in the scene above the ground.

Teleport Points – Testing

Press Play

Now use the trackpad to teleport to the different teleport points.

Teleport Points – Scene Loading

Though we won’t use it here, I also wanted to point out that the Teleport Point has a “Teleport Type” option on it.

This allows you to make the point load a different scene when the player moves there.  If you want to use this functionality, just switch the type and enter the scene name.
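From code, that option looks something like the sketch below.  The `teleportType` and `switchToScene` field names come from the Interaction System’s TeleportPoint source (verify against your plugin version), and the scene name here is purely hypothetical.

```csharp
using UnityEngine;
using Valve.VR.InteractionSystem;

// Sketch: configuring a TeleportPoint to load a scene instead of moving the player.
public class SceneTeleportSetup : MonoBehaviour
{
    [SerializeField] private TeleportPoint teleportPoint;
    [SerializeField] private string sceneName = "Level2"; // hypothetical scene name

    private void Start()
    {
        // Switch the point from "move to location" to "load a scene".
        teleportPoint.teleportType = TeleportPoint.TeleportPointType.SwitchToNewScene;
        teleportPoint.switchToScene = sceneName;
    }
}
```

Remember the target scene also needs to be added to the Build Settings or the load will fail.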

Teleport Area

The final step before we can teleport is to add a “Teleport Area”.

A teleport area works a lot like the teleport point, but allows you to specify a bigger ‘area’ where you can land.

One way to set this up would be to just add the “Teleport Area” component to the floor.  If you try this though, the ground material will change, and when you’re not teleporting, the ground will be disabled completely.

Instead, what we’ll do is clone the ground object, move it a bit, and make that our teleport area.

Duplicate the floor.

Make the duplicate a child of the floor

Rename it to “Floor – Teleport Area”

Set the Y position to 0.01

Add the “Teleport Area” component to it

As I mentioned above, you’ll notice that the material has automatically changed to “TeleportAreaVisible”.  Again, this is controlled by the “Teleporting” prefab in the scene and one of the reasons we can’t just add the component to the ground.

Teleport Area – Testing

With this in place, you should be able to hit play and teleport all around the area.  Also notice that the teleport points will work fine alongside the area.

Conclusions

The SteamVR interaction system used in The Lab is a great starting point for many projects.  If you’re looking to get the basics down, definitely check this system out.  It doesn’t cover every form of movement though, and is only a small sampling of the possible locomotion types you can add.  But if you just need good teleport movement, it’s easy to set up and works well.  If you’re interested in other locomotion systems, drop a comment below and let me know what you’d like to learn more about.


Building an Interactive Mobile 360 Video Player for GearVR

 


This post will run you through the basics of creating a 360 video player for the Oculus GearVR.  The same techniques will work for Google Cardboard, Daydream, HTC Vive, and Oculus Rift with minor changes.  You’ll learn how to play a video, bring up a quick menu, and switch to another video.

Note on 3D 360

This post covers 360 video but not stereo 3D video.

If you’re looking to play 3D 360 Video check out this post: https://unity3d.college/2017/07/31/how-to-play-stereoscopic-3d-360-video-in-vr-with-unity3d/

Sphere Setup

Before we can play 360 video, we need an inverted sphere.

In a previous post, I showed how to use an inverted sphere mesh.

Today, I’ll show you how to create one in Unity using a single editor script – original source here

Create a new folder named “Editor”

Create a script named InvertedSphere.cs in the “Editor” folder.

Replace the script contents with this.
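The original script isn’t reproduced here, so this is a reconstruction sketch: a small editor window that clones Unity’s built-in sphere mesh and flips its winding order and normals so it renders on the inside.  The menu path and button label match the steps below; the rest is my own implementation, not necessarily the original source.

```csharp
using UnityEngine;
using UnityEditor;

// Sketch of an InvertedSphere editor window (reconstruction, not the original).
public class InvertedSphere : EditorWindow
{
    private float size = 1f;

    [MenuItem("GameObject/Create Other/Inverted Sphere")]
    public static void ShowWindow()
    {
        GetWindow<InvertedSphere>("Inverted Sphere");
    }

    private void OnGUI()
    {
        size = EditorGUILayout.FloatField("Size", size);
        if (GUILayout.Button("Create Inverted Sphere"))
            CreateInvertedSphere();
    }

    private void CreateInvertedSphere()
    {
        var go = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        go.name = "Inverted Sphere";
        go.transform.localScale = Vector3.one * size;

        // Copy the built-in sphere mesh so we don't modify the shared asset.
        var source = go.GetComponent<MeshFilter>().sharedMesh;
        var mesh = Instantiate(source);

        // Reversing the winding order makes the faces render on the inside.
        var triangles = mesh.triangles;
        for (int i = 0; i < triangles.Length; i += 3)
        {
            int tmp = triangles[i];
            triangles[i] = triangles[i + 1];
            triangles[i + 1] = tmp;
        }
        mesh.triangles = triangles;

        // Flip the normals to point inward as well.
        var normals = mesh.normals;
        for (int i = 0; i < normals.Length; i++)
            normals[i] = -normals[i];
        mesh.normals = normals;

        go.GetComponent<MeshFilter>().sharedMesh = mesh;
    }
}
```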

You may need to hit play before continuing just to force the editor to compile your new editor script

In a new scene, right click the Hierarchy and select “Create Other”->“Inverted Sphere”

Set the size to 100.

Click “Create Inverted Sphere”

Select your “Inverted Sphere” in the Hierarchy

Add a “Video Player” component to the sphere.

Importing Videos

Create a “Videos” folder.

Download some sample 360 videos that you enjoy from YouTube or your favorite source.

Place them in the “Videos” folder.

Transcoding

Some videos will play fine on GearVR without transcoding them, but many won’t.  Luckily, transcoding is as easy as checking a box and waiting.

Select your videos.

In the inspector, check the “Transcode” box.

Click Apply

Wait (this took 15 minutes for me)

Video Selection Script

To manage our basic menu system, we’ll need two scripts.

Create a “Code” folder.

In the folder create these two scripts.

VideoSelection.cs

GazeInput.cs
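The two scripts aren’t shown in this version of the post, so here’s a hedged reconstruction of what they might look like based on the behavior described below (tap to bring up the menu, gaze at a cube and tap again to play its clip).  The class names match the post; the field names, the `Fire1` mapping for the GearVR touchpad tap, and the menu toggling logic are my assumptions.  In a project, each class would live in its own file.

```csharp
using UnityEngine;
using UnityEngine.Video;

// VideoSelection.cs (sketch): holds the clip a menu cube should play.
public class VideoSelection : MonoBehaviour
{
    [SerializeField] private VideoClip videoClip;

    public void Play(VideoPlayer player)
    {
        player.clip = videoClip;
        player.Play();
    }
}

// GazeInput.cs (sketch): on tap, show the menu or play the item being gazed at.
public class GazeInput : MonoBehaviour
{
    [SerializeField] private GameObject menu;
    [SerializeField] private VideoPlayer videoPlayer;

    private void Update()
    {
        // Assumption: a GearVR touchpad tap comes through as "Fire1".
        if (!Input.GetButtonDown("Fire1"))
            return;

        if (!menu.activeSelf)
        {
            menu.SetActive(true); // bring the cubes up
            return;
        }

        // Raycast straight out of the camera to find the gazed-at menu item.
        RaycastHit hit;
        if (Physics.Raycast(transform.position, transform.forward, out hit))
        {
            var selection = hit.collider.GetComponent<VideoSelection>();
            if (selection != null)
            {
                selection.Play(videoPlayer);
                menu.SetActive(false);
            }
        }
    }
}
```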

Menu Setup

For this sample, we’ll create a basic menu out of cubes.  Once it’s working, feel free to pretty them up however you like, or replace the cubes with something that fits your style better.

Create an empty gameobject in the Hierarchy

Name it “Menu”.

Reset the transform so it’s in the center of your scene.

Right click on the “Menu” object and create a cube as a child.

Rename your cube to be “Menu Item – yourvideoname” (for example, I used a bunny video, so mine looks like this)

Adjust the transform values so that X & Z are 1.

Add the VideoSelection component to the new Menu Item

Drag your first video clip onto the Video Clip field of the component.

The second Menu Item

To create another menu item, duplicate the existing one (ctrl-d).

Rename it

Change the X to -1

Swap out the video clip with a different one.

Camera Setup

The last thing we need to do is setup the camera.

Select the “Main Camera” object in your Hierarchy.

Reset the position.

Set the Position Y value to positive 1.

Add the GazeInput component to the “Main Camera”

 

Building to the phone

The project should be good to go now.  The final step is to put it onto a GearVR (or whatever platform you’re using).

If you already have an OSIG file, place it in the Assets/Plugins/Android/assets folder of your project (create this folder structure if it doesn’t exist).

If you don’t know what I’m talking about, go here and follow the instructions (only takes a minute) – https://dashboard.oculus.com/tools/osig-generator/

With the OSIG in place we’re good to go.

Save the scene and open the Build dialog

Add the saved scene.

Click Build and Run with the phone plugged in, and if all goes right, you’ll have your video player up and running.

**** IMPORTANT – Make sure your phone is plugged in via USB and UNLOCKED or the deployment will fail ****

Controls

The controls for this video player are very simple.

Tap the GearVR touchpad to bring up the 2 cubes.

Look at the cube you want to play from and tap again.  The video will start playing.

Just tap again to bring the menu (cubes) up again.

 

Conclusions & Extensions

This is not the prettiest video player, but should give you a good idea how to get started.  Once you get it working, the first thing I’d recommend doing is making the ‘menu’ a whole lot prettier, adding some text to show what the video names are, or some video preview on the items.

Overall though, the new video player functionality, along with the included Oculus SDK, makes it really easy to get going with your own video player now.  I’m excited to see all the variations people build, and may end up building a more full-featured one soon myself.


Avoid the VR waiting room with SteamVR Skybox

Valve’s SteamVR plugin makes getting started with VR in Unity quite a bit easier.  The [CameraRig] prefab is a perfect starting point for most projects, and the controller and input tracking is amazing.  But there’s quite a bit more to the package than just that.  The SteamVR_Skybox is one of the features in the SteamVR plugin that you should definitely know about and use.

What’s the SteamVR_Skybox do?

The SteamVR Skybox component will allow you to change what a player in the HMD sees when your game drops frames or is ‘not responding’.

Of course removing frame drops is the best solution, but sometimes you have cases where that’s just not an option.

In many games the most common time this happens is during scene loading.  Even using asynchronous methods to load your scene won’t resolve it for every case.

How do I use the SteamVR Skybox?

What I’ll usually do is create a child under the [CameraRig] and name it SteamVRSkybox.

I set the Y position of it to 1.5 and leave the X & Z alone.

This is to simulate a player of about 1.5 meters in height

Then I’ll add the SteamVR_Skybox component to it and click Take Snapshot.

What’s this do?

With this component in place, when the game loses a frame and would go back to the loading area, it instead shows what was visible when I took the snapshot.

Sometimes this is enough and the game will just work, but other times you may have a special ‘loading area’ or some intermission setup that you use.

In that case, just place your SteamVRSkybox in the proper place for your snapshot instead of under the CameraRig.

Any Downsides?

While this is a great solution for most cases, depending on your implementation, you may see some less than perfect stuff.

First is that the skybox only shows what’s visible when you take the snapshot, so any gameobjects you’ve spawned at runtime won’t be there.  So if the frame drops happen during gameplay, while this is a tiny bit better, it will still feel weird.

And when you use it to load levels, I recommend putting the player into a loading area, ideally some room where nothing changes at runtime, so your snapshot can match up correctly.

Also, because this is part of the SteamVR plugin, this method won’t work for games targeting the Oculus SDKs.  For Oculus, there’s another route you can follow that I’ll share some time in the future.

Conclusions

The SteamVR Skybox is only a tiny part of the awesome SteamVR package.  It’s easy to set up in just a couple minutes and on its own can make your game or experience feel a little bit more polished.  Again, it’s not a fix-all for bad performance, which should be addressed on its own, but it is a useful component and worth trying out.

 


Unity3D Spectator Mode in VR

If you’ve seen many VR games recently, you’ve probably come across ‘spectator mode’ in at least a few of them.  Games going back to Holopoint and big hits like Rick & Morty both implement their own spectator modes.  Luckily, creating a spectator mode is actually very simple, though there are some things to consider when you do it.

The Spectator Camera

The first thing you’ll need to do is create a new camera in your scene.

Next, take a look at the “Target Eye” field.

Set it to “None (Main Display)”

Now position the camera where you want the spectator view to be.

You can make it a child of the play area, set it in a static position, or create a script that allows the user to move the camera freely using mouse and keyboard.

Adding a generic mouse look/move script from the Asset Store would give you a free-move camera.

What do I show?

Most VR games without a spectator mode or multiplayer functionality don’t show the local player’s body.

The best thing to show, however, isn’t a full body.  Instead, I recommend you show the head and hands only.

As a sample, I created a basic head out of 3 spheres and attached it as a child of the CameraRig’s camera.

When I play now, the spectator camera looks like this.

 

A Problem – The Inside of the Head…

If you copy what I’ve done so far, you’ll quickly realize an issue… right when you put on the headset.

Two big black circles… it’s the eyes!

We have an easy fix though.  Set the head (or body if you use a body), to be on a different layer.

Now select your main VR camera and adjust the culling mask to deselect the PlayerHead layer.

Problem solved!

Now the head and eyes won’t be rendered by the VR camera.
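The two fixes above can also be sketched in code.  The camera properties (`stereoTargetEye`, `cullingMask`) are standard Unity API; the “PlayerHead” layer name is my assumption from the steps above and would need to exist in your project’s layer list.

```csharp
using UnityEngine;

// Sketch: spectator camera on the main display, VR camera skipping the head layer.
public class SpectatorSetup : MonoBehaviour
{
    [SerializeField] private Camera spectatorCamera;
    [SerializeField] private Camera vrCamera;

    private void Start()
    {
        // Equivalent of setting "Target Eye" to None (Main Display) in the Inspector.
        spectatorCamera.stereoTargetEye = StereoTargetEyeMask.None;

        // Remove the (assumed) "PlayerHead" layer from the VR camera's culling mask
        // so the player never sees the inside of their own head.
        vrCamera.cullingMask &= ~(1 << LayerMask.NameToLayer("PlayerHead"));
    }
}
```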

 

Other Considerations

It’s very important to note here that adding this spectator view comes at a performance cost.

You’re now rendering for another camera, and depending on your game, this could be a big performance hit.

In general, I’d recommend having spectator mode be OFF by default and allow it for people who want it, but give them a warning that it can impact performance. (Rick & Morty does a great job of this)

Also, if you do set up a spectator mode, go wild on what you allow the player to do: let them view the game from the sky, behind the player, maybe even as an ant or some select NPCs in your game.  Just make it fun to watch 🙂


360 Video in Unity 5.6

Unity 5.6 introduced a bunch of great new features, including a new video player.  If you tried playing video in Unity 5.5, you know it’s a bit of a pain and generally requires an expensive plugin.  I tried doing some Unity 360 video projects in the past and ran into nothing but pain.  I’m happy to say though that the latest version makes it much easier to render 360 video in Unity.  The contents of this guide should be applicable to just about any setup.

Note on 3D 360

This post covers 360 video but not stereo 3D video.

If you’re looking to play 3D 360 Video check out this post: https://unity3d.college/2017/07/31/how-to-play-stereoscopic-3d-360-video-in-vr-with-unity3d/

How to get started

Before you can render 360 video, you’ll need a surface to render it on.  Typically, we use a sphere with inverted normals (the sphere renders on the inside only).

You can create your own, search online, or download this one here: https://unity3dcollegedownloads.blob.core.windows.net/vrgames/invertedsphere.fbx

To create an inverted sphere, follow the steps outlined here – https://unity3d.college/2017/05/15/building-an-interactive-mobile-360-video-player-gearvr/

Setup

Create a new scene

Drag your inverted sphere to the Hierarchy.

Reset its position to (0, 0, 0), if it’s not already.

Set the sphere’s scale to (-3000, 3000, 3000).

If you’re using a different sphere than the one provided above, you may need a different scale value.

If the scale’s X value weren’t negative here, the video would appear backward.

Reset your camera’s position to (0, 0, 0)

Select a 360 video to play.

You can shoot your own or download one.

I used this video to test – https://www.youtube.com/watch?v=a5goYOaPzAo, but most 360 videos should work.

If you aren’t sure how to download videos from youtube, this is the site I use to do it: http://keepvid.com/

Drag the video onto the sphere in the scene view.

Select the Inverted Sphere

You should see the Video Player component added to the sphere.

You’ll also see the Video Clip assigned with your 360 video.

Because Play on Awake is selected, you can just hit play now and watch your video start playing.

Play Controls

Ready to add some basic controls to the player?

Let’s add keyboard controls to pause and play the video.

Create a script named VideoPlayerInput.cs

Replace the contents with this:
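The script body isn’t shown in this version of the post, so here’s a minimal sketch of what a VideoPlayerInput like the one described (space bar toggles pause/play) could look like.  The class name matches the post; the implementation details are mine.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Sketch of VideoPlayerInput.cs: space toggles the attached VideoPlayer.
[RequireComponent(typeof(VideoPlayer))]
public class VideoPlayerInput : MonoBehaviour
{
    private VideoPlayer videoPlayer;

    private void Awake()
    {
        videoPlayer = GetComponent<VideoPlayer>();
    }

    private void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space))
        {
            // Pause if playing, resume if paused.
            if (videoPlayer.isPlaying)
                videoPlayer.Pause();
            else
                videoPlayer.Play();
        }
    }
}
```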

Add the VideoPlayerInput component to the Sphere.

 

Press Play

Use the “Space” key to pause and resume playing of your video.

Conclusions

With Unity 5.6, playing video has become simple.  You can play 2D video, 360 video, and even render video on any material you want (we used an inverted sphere, but you could attach it to a fire truck or whatever other model you pick really).

Try out all the new functionality, if you build something interesting, share it below!


HTC Vive Tracker Unity3D / SteamVR Setup

Friday, my Vive trackers finally arrived.  When I opened them, I wasn’t sure what to expect: would they be cool, could I do something fun with them, or would they just sit on a shelf?  After just one day, I have the answer: they’re awesome!

So far, I’ve barely played with them, but I’ve already put them on my feet, stomped some civilians, and kicked some buildings.

And now I’ve strapped one on to a tennis racket and dropped my wife onto a baseball field to smack tennis balls.

How do you use them?

Since they don’t really require any special code to work, I wanted to give a quick demo of how to make them work in Unity with the SteamVR plugin.

Camera Rig Setup

The default camera rig gives you a left and right controller, and works fine as a starting point for many VR projects.

To add trackers, create new children under the [CameraRig] with the context menu.

I used cubes to start and scaled them to (0.1, 0.1, 0.1).

Once you create the objects, add a SteamVR_TrackedObject script to them.

Now select the [CameraRig] prefab and add the tracker(s) to the Objects array.

Turn on all the controllers & tracker(s).

I’ve noticed that the trackers don’t show up if both controllers aren’t on.  I haven’t yet dug into why, so if you already know, drop a comment at the bottom and share please 🙂

Hit Play.

That’s it… they’ll just work and track from there.

The Tennis Racket

One thing to note when you’re setting these trackers up is the position of the tracker on the object.

Unless you’re holding the tracker, it’s going to be offset a bit, and you need to take that into account when positioning things.

For example, with my tennis racket, the tracker is attached to the strings like this.

In-game, my tracker object has the racket as a child, and the object is lined up so that the racket’s pivot under the tracker matches closely with where the tracker is placed on the actual racket.

 

Conclusions & Other Games

I have to say I really love these things.  They’re pretty light, they track great just like the controllers, and for some reason I feel more comfortable attaching them to things (even though they’re almost the same price as a controller).

If you’re unsure about getting one, I’d say do it… they’re well worth the cost IMO even if you’re just playing around with them in the editor.

When it comes to in-game support, I think it’ll be a bit before they’re common, but I do expect to see games start adding capabilities over the next few months.  I know I really want to put support into a few games myself.

In conclusion though, I’m excited to see what people come up with, and to experiment and make fun things.  If you happen to have some ideas, please share below or drop me an email.


Pooled Decals for Bullet Holes

The Bullet Decal Pool System

Today’s article came out of necessity.  As you probably know, I’m wrapping up my long-awaited VR course, and one of the last things I needed to create is a decal setup for the game built in it.  To do decals properly, you’d want a full-fledged decal system, but for this course and post, we have a system that does exactly what we need and no more.

What is that?  Well, it’s a system to create a bullet hole decal where you shoot.  And to do it without creating and destroying a bunch of things at runtime.

It’s worth noting that I wrote this system for a VR game, but it is completely applicable to a normal game. This would work in a 3d game, mobile game, or anything else that needs basic decals.

The end result will look something like this

How do we build it?

Let’s take a look at the code
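The code block is missing from this version of the post, so what follows is a sketch reconstructed from the breakdown below.  The field and method names (bulletHoleDecalPrefab, maxConcurrentDecals, decalsInPool, decalsActiveInWorld, SpawnDecal, and friends) come from that breakdown; the class name and exact implementation details are my assumptions, and the editor-only queue-resizing code mentioned later is omitted.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of the bullet decal pool (reconstructed from the breakdown below).
public class BulletDecalPool : MonoBehaviour
{
    [SerializeField] private GameObject bulletHoleDecalPrefab;
    [SerializeField] private int maxConcurrentDecals = 10;

    private Queue<GameObject> decalsInPool;          // ready to be placed
    private Queue<GameObject> decalsActiveInWorld;   // oldest-first, for recycling

    private void Awake()
    {
        InitializeDecals();
    }

    private void InitializeDecals()
    {
        decalsInPool = new Queue<GameObject>();
        decalsActiveInWorld = new Queue<GameObject>();
        for (int i = 0; i < maxConcurrentDecals; i++)
            InstantiateDecal();
    }

    private void InstantiateDecal()
    {
        var decal = Instantiate(bulletHoleDecalPrefab, transform);
        decal.SetActive(false);
        decalsInPool.Enqueue(decal);
    }

    private GameObject GetNextAvailableDecal()
    {
        if (decalsInPool.Count > 0)
            return decalsInPool.Dequeue();

        // Pool is empty — recycle the oldest decal that's active in the world.
        if (decalsActiveInWorld.Count > 0)
            return decalsActiveInWorld.Dequeue();

        return null;
    }

    public void SpawnDecal(RaycastHit hit)
    {
        var decal = GetNextAvailableDecal();
        if (decal == null)
            return;

        decal.transform.position = hit.point;
        decal.transform.rotation = Quaternion.LookRotation(hit.normal);
        decal.SetActive(true);
        decalsActiveInWorld.Enqueue(decal);
    }
}
```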

Code Breakdown

Serialized Fields

We open with 2 serialized fields.

bulletHoleDecalPrefab – The first determines which decal prefab we’ll use.  If you’re building a more generic decal system, you may want to rename this.  Because it’s part of a VR course, I left the name as is, but if I were putting this in another game, it’d likely be more generic, or maybe even an array that’s randomly chosen from.

maxConcurrentDecals – This sets the maximum number of decals the system will show.  We do this primarily for performance, but also to avoid visual cluttering.  Having too many decals could cause a rendering hit; remember, each one is a transparent quad.  This number is variable in the editor though, so you can adjust it as you see fit for your game.

 

Private Fields

We have two private fields in this class.  They’re both using the Queue type to keep a first in first out collection of decals.

decalsInPool – This is where we’ll store the decals that are available and ready to be placed.

decalsActiveInWorld – These are the decals that we’ve placed in the world.  As our pool runs empty, we’ll start grabbing decals from here instead.

 

Awake

Calls our InitializeDecals() method.

 

Private Methods

InitializeDecals() – This is our setup.  Here, we create our queues, then we use a loop to create our initial pooled decals.

InstantiateDecal() – Here we do the actual creation of a single decal.  This is only called by InitializeDecals & a special editor only Update you’ll see soon.

GetNextAvailableDecal() – This method gets the next available decal… useful description, eh?  It actually just looks at the pool; if there’s at least one decal in it, the method returns the first one in the queue.  If there’s no decal in the pool, it returns the oldest decal that’s active in the world.

 

Public Methods

SpawnDecal(RaycastHit hit) – This is our only public method; it’s the one thing this class is responsible for doing.  In the code that calls it, we’re doing a raycast to determine where our bullet hits.  The raycast returns a RaycastHit, and we pass it into this method as the only parameter.

The method uses GetNextAvailableDecal(), and assuming a decal is available, it places that decal at the hit point, adjusts the rotation to the hit normal, and sets the decal to active.  The method ends by adding the decal to the decalsActiveInWorld queue.

 

#if UNITY_EDITOR

Everything else in this class is actually wrapped to only run in the editor.

This code has a single purpose, to update our queue size at runtime.

It’s absolutely not necessary for your decal system, but it’s a nice little thing I enjoy having 🙂

I won’t cover each method, but you should play with the queue size at run-time and watch as it keeps everything in sync.

 

 

 


Oculus Haptic Feedback

Oculus Haptic Feedback

Why you need it and how to get started quickly

Vive & Oculus haptic feedback systems are extremely important and often overlooked, or added in at the last second (I’m guilty of that myself).

Adding just a little haptic feedback in the right places gives a huge boost to immersion, and leaving it out gives the feeling that something just isn’t quite right.

Why’s it hard?

SteamVR haptics don’t work at all on Touch controllers (at least at the time of this writing).

Recently, Oculus changed their haptic system, breaking my old code when I upgraded the SDK.

So I’ve written a wrapper.  It handles haptics for either system, and makes it extremely easy.

For this post, I won’t dig into the entire cross platform wrapper, but will instead give you the basics to get started with Oculus haptics in a quick and easy way.

Prerequisites (Oculus)

For this to work, you’ll need the Oculus Utilities for Unity in your project.

https://developer3.oculus.com/downloads/game-engines/1.11.0/Oculus_Utilities_for_Unity_5/

Shared Code

In the more complete version of my projects, I have quite a bit of shared code and swap between implementations with #defines.

For this simplified sample though, we still have one piece of shared code.  That code is an enum which specifies how strong the feedback vibration effect should be.

The Code (Oculus)

The Oculus code is designed to bypass the need for custom audioclips.  While the clip system is pretty cool and could do quite a bit, it’s not cross platform, and much harder to get started with in my opinion.

 

In the code, we generate 3 OVRHapticsClips named clipLight, clipMedium, & clipHard.  As you may have guessed, these correspond to our enum values.
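Since the original wrapper isn’t shown here, this is a hedged sketch of what it could look like, including the shared strength enum (restated so the sketch is self-contained; the `VibrationForce` and `OculusHaptics` names and the amplitude values are my assumptions).  It leans on the OVRHaptics / OVRHapticsClip API from the Oculus Utilities, so double-check against the SDK version you import.

```csharp
using UnityEngine;

// Shared code sketch: how strong should the vibration be?
public enum VibrationForce
{
    Light,
    Medium,
    Hard
}

// Sketch of an Oculus haptics wrapper built on OVRHaptics (Oculus Utilities).
public class OculusHaptics : MonoBehaviour
{
    [SerializeField] private OVRInput.Controller controllerMask; // L Touch or R Touch

    private OVRHapticsClip clipLight;
    private OVRHapticsClip clipMedium;
    private OVRHapticsClip clipHard;

    private void Start()
    {
        // Fill each clip's sample buffer with a constant amplitude (0-255).
        clipLight = MakeClip(45);
        clipMedium = MakeClip(100);
        clipHard = MakeClip(255);
    }

    private OVRHapticsClip MakeClip(byte amplitude)
    {
        const int sampleCount = 10;
        var clip = new OVRHapticsClip(sampleCount);
        for (int i = 0; i < sampleCount; i++)
            clip.WriteSample(amplitude);
        return clip;
    }

    public void Vibrate(VibrationForce force)
    {
        var channel = controllerMask == OVRInput.Controller.LTouch
            ? OVRHaptics.LeftChannel
            : OVRHaptics.RightChannel;

        switch (force)
        {
            case VibrationForce.Light: channel.Preempt(clipLight); break;
            case VibrationForce.Medium: channel.Preempt(clipMedium); break;
            case VibrationForce.Hard: channel.Preempt(clipHard); break;
        }
    }
}
```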

Use (Oculus)

To use the haptics code, add the script to your controller object.

Assign the controller mask as L Touch or R Touch (whichever matches the controller it’s attached to).

Then call the Vibrate method when you want it to shake.

Demo (Oculus)

If you’re not quite sure how you’d use the code, here’s a real quick sample.  Ideally, you’d have something per controller, managing events for the controller, like a generic VR Controller script that ties these all together and works cross platform.

But to get you started quickly, here’s a simple sample that will vibrate the controllers when you press space.  (just remember to assign the 2 controllers)

What about the Vive?

I’ll cover Vive haptics soon, likely with a more generic implementation that you can use across both platforms.

If you’re really interested in Vive development though, I’m working on a quick weekend long guide that covers everything you need to know to get started, including haptics.  Just sign up (at the top of this post) and I’ll get you the info as soon as it’s ready.


Getting started with SteamVR Controller Input

Handling SteamVR Controller Input

I’ve talked to quite a few developers recently who weren’t really sure how to get started with input in their VR projects.  While it’s not too hard to get started, there are some things that are important to understand.  In this post, I’ll cover some of the fundamentals of the SteamVR controller inputs and interactions.

Project Setup

Create a new project and import the SteamVR plugin from the Asset Store.

Now find the [CameraRig] prefab in your project view and place it into your scene.


Next, delete the “Main Camera” that was in our scene by default.  The [CameraRig] already has a Camera component for us that tracks to our head, having another camera in here will just mess things up.

Expand the [CameraRig] and select the left & right controllers.

Add the “SteamVR Tracked Controller” component.

Play Time

Now save your scene and press play.

With the game in play mode, select just the left controller.

Now grab the controller and start pressing buttons.

TrackedController Input in Inspector

The TrackedController Script

The TrackedController script is great and actually gives us a bunch of events we can register for in our code.

Let’s take a look at those events.

Code Time

We’re going to hook into these events with our own class.

Create a new folder named “Code“, then create a new script named “PrimitiveCreator” in that folder.

Paste this code in for the class.
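The class itself is missing from this version of the post, so here’s a sketch reconstructed from the breakdown that follows.  The event names (TriggerClicked, PadClicked), ClickedEventArgs, and the method names match that breakdown; the exact bodies, scale values, and wrap-around logic are my assumptions, so the bracketed line references later in the post won’t line up exactly with this sketch.

```csharp
using UnityEngine;

// Sketch of PrimitiveCreator; expects a SteamVR_TrackedController on the same object.
public class PrimitiveCreator : MonoBehaviour
{
    private SteamVR_TrackedController _controller;
    private PrimitiveType _currentPrimitiveType = PrimitiveType.Sphere;

    private void OnEnable()
    {
        _controller = GetComponent<SteamVR_TrackedController>();
        _controller.TriggerClicked += HandleTriggerClicked;
        _controller.PadClicked += HandlePadClicked;
    }

    private void OnDisable()
    {
        // Deregister with the same syntax, minus instead of plus.
        _controller.TriggerClicked -= HandleTriggerClicked;
        _controller.PadClicked -= HandlePadClicked;
    }

    private void HandleTriggerClicked(object sender, ClickedEventArgs e)
    {
        SpawnCurrentPrimitiveAtController();
    }

    private void HandlePadClicked(object sender, ClickedEventArgs e)
    {
        // Upper part of the pad cycles forward, lower part cycles back.
        if (e.padY < 0)
            IncrementPrimitiveType();
        else
            DecrementPrimitiveType();
    }

    private void SpawnCurrentPrimitiveAtController()
    {
        var primitive = GameObject.CreatePrimitive(_currentPrimitiveType);
        primitive.transform.position = transform.position;
        primitive.transform.rotation = transform.rotation;

        // Shrink to a workable size; planes are huge, so shrink them even more.
        primitive.transform.localScale = _currentPrimitiveType == PrimitiveType.Plane
            ? Vector3.one * 0.05f
            : Vector3.one * 0.25f;
    }

    private void IncrementPrimitiveType()
    {
        int count = System.Enum.GetValues(typeof(PrimitiveType)).Length;
        _currentPrimitiveType = (PrimitiveType)(((int)_currentPrimitiveType + 1) % count);
    }

    private void DecrementPrimitiveType()
    {
        int count = System.Enum.GetValues(typeof(PrimitiveType)).Length;
        _currentPrimitiveType = (PrimitiveType)(((int)_currentPrimitiveType - 1 + count) % count);
    }
}
```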

Add the Script to your controllers

Before we go over the code, add the script to your controllers and try it out.

Press play and start placing primitives with the trigger.  Use the trackpad to select which type of primitive the controller should place.

How does it work?

Event Registration

We’re using the OnEnable & OnDisable methods of the MonoBehaviour here to cache our TrackedController and register for events.

The events we care about are TriggerClicked & PadClicked.  When the player clicks the Trigger we call into our HandleTriggerClicked() method [line 7], and when they click the Pad, we call HandlePadClicked() [line 8].

It’s important to pay attention to the OnDisable method though.  Since we’re registering for events when our gameobject is enabled, we need to be sure to deregister them when it’s disabled.  To deregister, we use the same syntax as registration, but with a minus instead of a plus.

Failing to deregister events is one of the few ways you can leak memory in a garbage-collected language.  In fact, it’s the most common leak I see in C# code.

 

Our Event Handlers – The Trigger

Our first event handler throws away the event arguments and calls into SpawnCurrentPrimitiveAtController().  We do this for a few reasons.  The main one is to have a method named for what it does and taking only the parameters it needs.

If you’re comfortable with linq, you could re-write the event registration to avoid this method, but for this example I wanted to keep it simple.

The SpawnCurrentPrimitiveAtController() method does what it says.  It uses CreatePrimitive, passing in the _currentPrimitiveType that we’ve previously set to PrimitiveType.Sphere. [line 9]

Then it adjusts the position and rotation to match our controller. [lines 10 & 11]  Finally, it adjusts the scale to be a size we can work with.  For planes, we need to get even smaller to avoid it covering our entire view. [lines 13-15]

Our Event Handlers – The Trackpad

The other event we registered for was PadClicked.  That event is registered to our HandlePadClicked() method.  In this method, we check the event argument’s padY variable.

padY represents the Y axis on your touchpad.  Its value can range from -1 to +1, with negative values being above the center and 0 being exactly in the middle.

We check the padY value to see if it’s less than 0, which means the player clicked on the upper part of the pad.  If they did, we increment our primitive type.  If not, we decrement our primitive type.  In each of these methods, we do a quick bounds check for wrapping, but the key part is that we’ve updated the _currentPrimitiveType value that was used when we pull the trigger.

 

Where to go from here


If you’ve already read my post on Getting Started with SteamVR, you may be considering some other options for your trigger click.  There are countless things you could do with just this project.  You could modify the system so that the initial press of the trigger spawns the primitive but keeps it at the controller’s position and rotation (perhaps by making it a child of the controller), then hook into the TriggerUnclicked event to drop it in its position.

Try hooking into a few of the other events and see what you can come up with!

If you really want to jumpstart your VR project though and get a game out quickly, I’m building a VR development course where we go over all the steps to build a full game using real world proper practices.

