All posts in "Virtual Reality"

Scaling your Virtual Reality player for fun

Scaling a player in VR opens up a world of options for game design and can be a ton of fun.  In this example, I'll show you how to quickly adjust the scale of your player and make them shrink and grow at will with a couple of lines of C# in your Unity3D project.  We'll use SteamVR and the CameraRig to make our player tiny, walk them through a little door they couldn't normally fit through, then scale them back up on the other side.  Recreate it yourself or download the project sample.


Project Source

Download the project source here:


Continue reading >

How to adjust player size & scale in Virtual Reality with Unity3D

Adjusting player scale in VR is easy if you know what to do, and it gives you the ability to create really amazing effects and interesting gameplay.  You can make your player into a giant… or shrink them down to the size of a mouse.  The technique in this article works across all devices.

Scaling the Player Up and Down

To adjust the player scale, you need an empty gameobject.

In your scene, create an empty gameobject, reset its transform, and name it "Player Root".

Move your main camera under the “Player Root” and reset the transform there as well.

Now simply scale the "Player Root".  As you scale it up, your player will feel like they're growing.  If you shrink it down small, they'll in turn feel small.
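As a minimal sketch of the idea (the component name and key bindings here are my own, just for testing), scaling the root from code can look like this:

```csharp
using UnityEngine;

// Attach to the "Player Root" object (the parent of your camera).
// Hypothetical test bindings: Up/Down arrows grow and shrink the player.
public class PlayerScaler : MonoBehaviour
{
    [SerializeField] private float scaleSpeed = 1f;
    [SerializeField] private float minScale = 0.1f;
    [SerializeField] private float maxScale = 10f;

    private void Update()
    {
        float change = 0f;
        if (Input.GetKey(KeyCode.UpArrow)) change = scaleSpeed * Time.deltaTime;
        if (Input.GetKey(KeyCode.DownArrow)) change = -scaleSpeed * Time.deltaTime;

        // Scale uniformly so the world grows/shrinks evenly around the player.
        float newScale = Mathf.Clamp(transform.localScale.x + change, minScale, maxScale);
        transform.localScale = Vector3.one * newScale;
    }
}
```

In a real game you'd swap the keyboard input for controller input or trigger the scale change from gameplay events.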

Automatically Resizing the Player

In some situations, you may need to automatically resize the player so their point of view is consistent.  If your game requires a certain height to be playable, this can expand your audience to shorter or taller players.  You can use a script similar to the one below to adjust the player's scale based on their height at the time it runs.  Ideally, you'd run this while a scene is loading, or give the player a 're-orient' option so they can adjust for their size.
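Since the script itself didn't survive in this excerpt, here's a sketch of the approach (field names are my own; it assumes the tracked camera sits under a "Player Root" as set up above):

```csharp
using UnityEngine;

// Scales the player root so the camera's tracked height matches a design-time target.
public class PlayerAutoResizer : MonoBehaviour
{
    [SerializeField] private Transform playerRoot;       // the "Player Root" object
    [SerializeField] private Transform hmdCamera;        // the tracked head camera
    [SerializeField] private float targetHeight = 1.7f;  // height the game was designed for, in meters

    private void Start()
    {
        Reorient();
    }

    // Call this again from a 're-orient' option if the player wants to recalibrate.
    public void Reorient()
    {
        float currentHeight = hmdCamera.localPosition.y;
        if (currentHeight > 0.1f) // ignore bogus readings before tracking kicks in
            playerRoot.localScale = Vector3.one * (targetHeight / currentHeight);
    }
}
```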

Video Version


Resizing the player opens up a variety of fun gameplay opportunities.  If you haven’t tried it before, jump in and play around.. new ideas may come up and you could build the next amazing experience using this simple technique.


Continue reading >

How to make Vive Trackers work without Controllers

A common issue I've seen people run into with the HTC Vive Trackers is that they can't track them without the controllers on.  This is caused by a single line of code in the SteamVR_ControllerManager.cs file and is easy to change.  All you need to do is find the section shown below and comment out the 2nd line there.  As of the time this is written, that line # is 255 (it may change with SteamVR updates).
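The code block didn't make it into this excerpt, so here's a rough paraphrase from memory of what you're looking for (the exact shape varies by SteamVR version, so search SteamVR_ControllerManager.cs for `ETrackedDeviceClass.Controller` rather than trusting the line number):

```csharp
// In SteamVR_ControllerManager.cs, inside the device-connected handling.
// The manager ignores any connected device that isn't a controller, which is
// what stops Trackers from being assigned. Commenting the check out lets them through:
var system = OpenVR.System;
if (system != null /* && system.GetTrackedDeviceClass((uint)index) != ETrackedDeviceClass.Controller */)
{
    // ... device gets assigned and tracked as usual
}
```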


Video Version

Continue reading >

Using the HTC Vive Tracker POGO Pins in Unity3D

If you have an HTC Vive Tracker, you may have already hooked it up and started tracking objects in Unity3D.  That part isn’t too hard, and it’s a lot of fun.  But you can also flip that Vive Tracker over and start using the POGO pins on the back.  The pins are pretty easy to use and give you access to the same buttons you’d have on a normal Vive wand and even include an output for haptic feedback.

HTC Vive Tracker Documentation

To get detailed info, HTC has a guide here:

Video Version

Setting up the HTC Vive Tracker in Unity3D

I’ve gone over it already and just wanted to share the steps involved to get going.

First, you need your tracker in-game.  To keep it simple, we’ll use the CameraRig prefab.

Drop a [CameraRig] into an empty scene (and delete the existing Main Camera).

Under the CameraRig, add an empty gameobject and name it “Tracker“.

Add the SteamVR_TrackedObject component to the “Tracker” you’ve just created.

Select the [CameraRig].

Drag the “Tracker” to the Objects array on the SteamVR_ControllerManager.

Turn on both controllers and press play.

Move the tracker around; if you see it move in-game, you're good to move on to the next part.

Reading Vive Tracker POGO Pins

To show how to read the inputs, I’ve created this example script.  Put it in your project, then add it to the “Tracker” object.
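The embedded script didn't survive in this excerpt, so here's a sketch of one way to read the pins (field names are my own; the pins report the same button IDs a Vive wand does, via the SteamVR 1.x `SteamVR_Controller` API):

```csharp
using UnityEngine;
using Valve.VR;

// Reads the Tracker's POGO pin "buttons". Add next to SteamVR_TrackedObject on the Tracker.
public class TrackerPinReader : MonoBehaviour
{
    public bool Trigger;
    public bool Grip;
    public bool Trackpad;
    public bool Menu;

    private SteamVR_TrackedObject trackedObject;

    private void Awake()
    {
        trackedObject = GetComponent<SteamVR_TrackedObject>();
    }

    private void Update()
    {
        // No device assigned yet (e.g. tracking hasn't started).
        if (trackedObject.index == SteamVR_TrackedObject.EIndex.None)
            return;

        var device = SteamVR_Controller.Input((int)trackedObject.index);
        Trigger  = device.GetPress(EVRButtonId.k_EButton_SteamVR_Trigger);
        Grip     = device.GetPress(EVRButtonId.k_EButton_Grip);
        Trackpad = device.GetPress(EVRButtonId.k_EButton_SteamVR_Touchpad);
        Menu     = device.GetPress(EVRButtonId.k_EButton_ApplicationMenu);
    }
}
```

With this on the Tracker, you can watch the public bool fields flip in the inspector as you short the pins.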

<script src=””></script>

Start playing again and make sure the Tracker is still moving (remember both controllers need to be on too).

Now let’s take a look at the image from the official documentation.

If you’re not familiar with electronics, don’t worry, this one’s pretty simple.

All you need to do is make a connection from GND (Pin 2) to whichever pin you want to trigger.

You can do this with a single wire, or ideally hook it up to a switch that’s attached to your physical device.

To test this, simply touch pin 2 & pin 4 with the same wire, and you’ll see the “Trigger” field set to true.


  • 2 – Ground
  • 3 – Grip
  • 4 – Trigger
  • 5 – Trackpad
  • 6 – Menu Button

Hooking up Hardware

If you’re not sure what to use, try digging out an old electric nerf gun like this: (or find a broken used one on craigslist for free/cheap)

Rip it apart and hook up the wires coming from the trigger to the tracker’s pogo pins.

There are a few different adapters out there you can order or print.  It'd probably be a good idea to have some sort of adapter in there to make pin access easier…




Continue reading >

How to play Stereoscopic 3D 360 Video in VR with Unity3D

If you've played 360 video in VR, you know it's kinda cool… but lately, 3D video is starting to take off.  With off-the-shelf cameras like the Vuze line, it's gotten easy to record your own without spending the cost of a new car to get started.  Before today, playing 360 3D video in VR with Unity3D was a bit complicated, but now, thanks to an open source project put out by Unity Technologies, it's getting easier.  Earlier today, I stumbled on a post and GitHub project they've put together to make 3D 360 video simple to implement.

Video Version

Prefer to watch video?  The entire process is available on youtube here.

Project Setup

To get going, you'll need a couple of things.  First, you need a 3D video.  For this article, I'm using an Over-Under video you can download from here:

Once you've downloaded your video, you'll need to grab the script and shader from this GitHub project:

You can download it or clone the repository, whichever you feel most comfortable doing.

Place the shader and script into your project along with the video file you want to play.

You’ll also need to visit your player settings and make sure the Virtual Reality Enabled box is checked.

Render Texture

To use this shader, we need a render texture.  Create a new render texture and name it “Panoramic Render Texture”

Select the RenderTexture and change the size to 2304 x 2304.

The render texture resolution should match your video resolution.

Change the depth buffer to “No depth buffer”.

Render textures are textures that can be rendered to. They can be used to implement image based rendering effects, dynamic shadows, projectors, reflections or surveillance cameras.

The Video Player

To create a video player, drag the video from the project view into the scene view.  A player will automatically be created with the video assigned to it.

Select the video player and look to the inspector.

Change the render mode to “Render Texture”.

Drag the render texture from the project view into the target texture field.
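If you'd rather wire this up from code instead of the inspector, the same two settings can be made with Unity's VideoPlayer API (the component and asset references here are assumed to be the ones created above):

```csharp
using UnityEngine;
using UnityEngine.Video;

// Routes a VideoPlayer's output into a render texture at runtime.
public class VideoToRenderTexture : MonoBehaviour
{
    [SerializeField] private VideoPlayer videoPlayer;
    [SerializeField] private RenderTexture panoramicRenderTexture;

    private void Start()
    {
        // Equivalent to setting Render Mode and Target Texture in the inspector.
        videoPlayer.renderMode = VideoRenderMode.RenderTexture;
        videoPlayer.targetTexture = panoramicRenderTexture;
        videoPlayer.isLooping = true;
        videoPlayer.Play();
    }
}
```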

The Material

Next, we need to create a material for the shader.

Create a new material, name it “Skybox”

Drag the render texture onto it.

Set the mapping type to “Latitude Longitude Layout”

Change the image type to 360 Degrees.

Set the 3D layout to “Over Under”

Skybox Setup

The last step is to assign our material to the skybox.

Open the Lighting window.

Drag the Skybox into the Skybox Material field.

All Done

That’s it, save your scene…

Then put on the headset and press play, the video should start playing in 3D.

What about other video types?

This shader appears to have support for a few different video formats.  In this article, we covered a simple 360 degree over under video, but you may have noticed the options for 180 & side by side.  I haven’t tried those yet, but if you’re interested in them, I’d recommend you check out the full documentation they’ve provided here:

Continue reading >

VR Dash movement / locomotion with SteamVR and Unity3D

Ready to add dash locomotion to your game?  It’s pretty simple to setup, but there are some important things to consider before you start. In this guide, I’ll show you how to set it up, what to watch out for, and a trick to minimize motion sickness.  All you need is SteamVR and Unity3D to get started.

Before you get started, you'll want to have some sort of environment set up.  Ideally you want something where it's obvious that you're moving.  The locomotion is FAST, and if you just have a flat white area or a single repeated texture, it's gonna be a whole lot harder to tell what's going on.

Here’s what my test scene looks like.

SteamVR / Oculus SDK

Once you have an environment set up, you'll need the SteamVR plugin.  The code here is written for SteamVR but could definitely be ported to Oculus with minimal effort; all we really care about in code is the button click.

After importing SteamVR, drop the [CameraRig] prefab into the middle of your scene.

Select the right controller and add the SteamVR_LaserPointer component to it.

We're only using the laser pointer to visualize where we'll dash.  There's no requirement to use the pointer for things to work; you can use it if you like, or choose any other visualization.

Set the color of the pointer to Green.


It’s time to add a little code.  This is the only code you’ll need for dashing to work, so take a good look at it.

The Code

RequireComponent- The first thing to notice is we’re requiring a SteamVR_TrackedController component to be on the GameObject.  We’ll be attaching this component to the “Controller (right)” and the tracked controller will get added automatically.

Serialized Fields – We have 4 fields that are editable in the inspector.

  • MinDashRange & MaxDashRange – We have a minimum and maximum dash range so we can prevent people from dashing across the entire map or 'dashing in place'.
  • DashTime – This is the amount of time it will take the player to move from their position to the end point.  In this setup it’s always the same time, no matter how far you go.
  • MaskAnimator – Here we're storing a reference to an animator we'll be creating soon.  It's used to add an effect that minimizes motion sickness.

Private Fields

  • trackedController – We’ll be using the tracked controller reference in a few places, so we’re caching it here to avoid calling GetComponent more than once.
  • cameraRigRoot – The same situation applies here, we’ll use the cameraRigRoot to move the player around, and we don’t want to call GetComponent multiple times.

Start() – Here we do the caching of our private fields, and we register for the PadClicked event on the controllers.  If you wanted to use a different button, you’d register for some other event, like GripClicked or TriggerClicked.

TryDash() – This is what we call when the player clicks the trackpad.  The method performs a raycast from the controller's position, aimed in its forward direction.  This is exactly what the laser pointer we added earlier does, so if the laser pointer hits something, we should hit the same thing, and our RaycastHit named "hit" will have a reference to it.  If the raycast does hit something, and the distance of that hit is within our allowed range, we start a coroutine called DoDash().

DoDash() – The DoDash method does the actual work here.  First, we set a bool named "Mask" on our maskAnimator to true.  Then we give it 1/10th of a second for the animation to play.  Next, we take note of the cameraRig's current position and save it off in startPoint.  We then go into a while loop that executes for the amount of time specified in our 'dashTime' field above (editable in the inspector).  In the loop, we use Vector3.Lerp to move the cameraRig from its startPoint to the endPoint (where the user aimed the laser).

Vector3.Lerp returns a vector at a point between two other Vector3s.  The 3rd value (elapsedPct) determines how far between them it should be.  A value of 0 is right at the start point, a value of 1 is at the end, and a value of 0.5 is right in the middle.

Once we leave the loop, we set the “Mask” parameter on our animator back to false and we’re done.
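The code block itself didn't survive in this excerpt, so here's a sketch reconstructed from the description above (the field and method names match the ones described; treat it as a starting point rather than the original script):

```csharp
using System.Collections;
using UnityEngine;

// Dash locomotion: aim the controller, click the trackpad, lerp the rig to the hit point.
[RequireComponent(typeof(SteamVR_TrackedController))]
public class DashController : MonoBehaviour
{
    [SerializeField] private float minDashRange = 1f;
    [SerializeField] private float maxDashRange = 20f;
    [SerializeField] private float dashTime = 0.25f;   // total travel time, regardless of distance
    [SerializeField] private Animator maskAnimator;    // plays the comfort-mask fade

    private SteamVR_TrackedController trackedController;
    private Transform cameraRigRoot;

    private void Start()
    {
        // Cache references so we don't call GetComponent more than once.
        trackedController = GetComponent<SteamVR_TrackedController>();
        cameraRigRoot = GetComponentInParent<SteamVR_ControllerManager>().transform;
        trackedController.PadClicked += (sender, e) => TryDash();
    }

    private void TryDash()
    {
        // Same ray the laser pointer draws: from the controller, along its forward.
        RaycastHit hit;
        if (Physics.Raycast(transform.position, transform.forward, out hit) &&
            hit.distance >= minDashRange && hit.distance <= maxDashRange)
        {
            StartCoroutine(DoDash(hit.point));
        }
    }

    private IEnumerator DoDash(Vector3 endPoint)
    {
        maskAnimator.SetBool("Mask", true);
        yield return new WaitForSeconds(0.1f); // let the fade-in play

        Vector3 startPoint = cameraRigRoot.position;
        float elapsed = 0f;
        while (elapsed < dashTime)
        {
            elapsed += Time.deltaTime;
            float elapsedPct = elapsed / dashTime;
            cameraRigRoot.position = Vector3.Lerp(startPoint, endPoint, elapsedPct);
            yield return null;
        }

        maskAnimator.SetBool("Mask", false);
    }
}
```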

Back to the Inspector

Attach the DashController to the Controller (right)

Play & Test

You should be able to play now and dash around.  Put on your headset, aim the controller and click the trackpad!

The Mask – Reducing motion sickness

Dashing around usually feels okay for most people, but motion sickness is still possible.

To help remove some of the sickness, we can add a cool masking effect.  And if you have some artistic skill, you can make it look much better.

Download this DashMask.png image

Select the Camera (eye).  Right click and add a Quad.

Drag the DashMask.png image onto the quad, a new material will be created automatically.

Select the quad and change the material’s render mode to “Fade”

Move and resize the quad to match this.  We need it close to our face as it’s going to be used to mask out the world.

Animating the Mask

If you play now, you'll have a mask sitting in your face (or not, depending on where you've dragged the alpha of the albedo color on that material).

What we want though is for the code above to be able to toggle the mask, with a smooth animated fade that we can customize later.

To do this, let’s create a simple animation for the alpha channel of our quad.

The Animation

Select the Quad

Open the Animation window.

Click create animation, name the animation FadeMask

Click record.

Open the Color picker for the Albedo of the material.

Slide the alpha around and settle it at 0.

Drag the Animator red line to the 1:00 mark.

Open the color picker again.

Slide the alpha to the far right.

The Animator

The animator will control our animations.  It handles the state of our animations and does the transitions between them.

Create a new animator for the quad.

The Mask Parameter

In the parameters section, add a new Boolean parameter and name it "Mask".

Adding the Animations

Drag the FadeMask animation you just made onto it.

Drag it on a second time, and rename this copy to "FadeMask Reverse".

Add transitions between the two and use the "Mask" parameter to control them.

Set the FadeMask speed to 5.

Select the FadeMask Reverse state and change its speed to -1.

We’re playing the animation in reverse here to fade the mask back out.  We could have a separate animation for this, but we don’t need it when we can just set the speed to -1.

Attaching the Animator

Add an animator to the Quad.

Assign the new Animator Controller to your animator.

The last thing we need to do is assign the MaskAnimator field of the Dash Controller.

Select the controller, drag the quad onto it, and you’re done..


The Result

Continue reading >

Oculus Rift + Touch or HTC Vive – Which should I get? – Rift vs Vive – UPDATED FOR NEW PRICING

Oculus Rift with Touch vs HTC Vive – The Differences

With the flurry of price drops from Oculus, it seems like a good time to revisit this topic.  If you've read my blog before, you may know that in general I prefer the Vive for my personal use and development, though I do own an Oculus Rift and love it.  If you don't own either, deciding which to get can be confusing, with arguments from people on both sides about why one is better than the other.  So here, instead of personal preferences, I'll try to share the actual differences I've experienced owning both for over a year.

The Elephant – Room Scale

From the start, Oculus said their system was for seated experiences.  Then it switched to standing, and eventually ‘full roomscale support’.  The Vive on the other hand was designed from the ground up to work great with room scale setups.

And make no mistake, there is a difference between the two.  I've used the Rift with Touch controllers for hundreds of hours, with 2 sensors and with 3.  In general, it 'works', but it does have some drawbacks.

Here are the biggest differences with roomscale:


Rift:

  • Cables are SHORT, so if your PC isn't in an ideal spot, you'll need active USB extension cables (around $20 on Amazon).
  • The HMD cable is short too.  I don't know why, but there is definitely less slack with the Oculus cables.
  • Mounting is harder.  The sensors aren't built for wall mounting first, so you need to get your own mounting solution OR put them on a flat surface where they won't get bumped or moved.  For my home setup, this is nearly impossible, but in the right office environment it's not too bad.
  • Tracking isn't quite as good.  It drops out occasionally (even with 3 sensors), but not often enough to be a huge deal.


Vive:

  • Lighthouse setup is simple if you have a screwdriver and a wall with power somewhere near.
  • The HMD cable is long, so long it sometimes hangs and gets tangled.  It's still much better to have a long cable though.

Winner: Vive

Visual Quality

Personally, I can’t tell any real difference between the two visually.  My eyesight is good (20/10), and they both look about the same.  I do notice the lenses a bit on them both, but neither bothers me.

Winner: Draw


Comfort

Here the Rift still wins out.  Even with the Deluxe Audio Strap, the Rift is lighter and a bit more comfortable to wear.  That's not to say the Vive is uncomfortable, but even a small weight difference can become noticeable if you wear it long enough (unless you happen to have giant neck muscles).

If you do go with the Vive though, the deluxe audio strap does help with weight distribution and comfort, especially for people like me with giant over-sized heads.

Winner: Rift


Controllers

With controllers, it's kind of a toss-up depending on the type of games you like to play.

A lot of people love the Oculus controllers, for good reason.  They’re pretty natural feeling and the touch detection is awesome.  For games where your controller represents your hand, the touch controllers are hands down the best choice.

When it comes to games where you hold a gun, paddle, bat, or something else though, I prefer the Vive wands.  They just feel a bit more like I’m holding the actual object.  And the new knuckles controllers look really promising, they could end up better than the touch controllers, but I can’t say for sure until I try them.

Winner: Draw


Content

In general, the content for the two devices is very similar.  Almost every game can be played on both devices, and there's no killer game that would lead me to buy one over the other.  While Oculus does have a few exclusives, there are ways around it, and they're not enough to sway me.

Winner: Draw


Development

This is for the programmers and game devs.  If you're wondering how hard it is to build a game for one system or the other… it's about the same.  Both devices work great with SteamVR, though for the Rift you do need to strip out SteamVR and use the native SDK.  That's the only real difference.  They're both a pleasure to develop for and I'd call this one a draw.

Winner: Draw


System Requirements

This is something people often overlook.  Both of these devices have the same GPU requirements, but they do differ on other requirements.  The Rift requires 3-4 compatible USB 3.0 ports.  I say compatible because many of the USB 3.0 ports I've tried on different systems do not work.  In fact, there's a list of compatible USB 3.0 add-on cards you can buy if your onboard ones won't work.

I’ve also had a few laptops that wouldn’t work with the Rift (USB compatibility or no video), while I’ve never had any problems with the Vive.

The Vive, by contrast, requires only a single USB 2.0 port.  It usually works fine in 3.0 ports too, but I have at least one system where the 3.0 ports are not compatible with the Vive.  Luckily there are plenty of spare 2.0 ports left.

Winner: Vive


Price

The massive shift in pricing is what drove me to reconsider my thoughts on this debate.  If you haven't seen it already, the Rift has had a price drop to $399 with Touch controllers INCLUDED!

That’s an amazing deal, and on prime day they even had a $100 amazon gift card included (looks like they killed that though).

If you want to do roomscale with the Rift though, you’ll wanna include another $99 for a 3rd sensor (and spare controllers).

The Vive also recently got a price drop.

On top of that, the deluxe audio strap is another $99, and knuckles controllers aren’t gonna be free either.

But when you add in the cost of the 3rd sensor, they're really not far apart anymore; it's only about a $100 difference.

Still, with the price drop the winner here is obvious: Oculus is way ahead.

Winner: Rift



If you go through the differences above, you’ll see it’s pretty even.  Rift wins in some places, the Vive wins in others.  This may leave you wondering, which is right for me?

First, make sure to check compatibility.  If you only have a laptop, make sure that laptop has a GPU that can handle it, AND the USB 3.0 ports for the Rift.  If you don’t have those ports, the Vive is definitely your choice.

Next, think hard about room scale.  Is it really important to you?  Are you going to play most of your games sitting down on a controller or standing in place?  How much room do you really have to play in?
If room scale isn’t your thing, the Rift is an obvious winner.

If you do have the room though, and really want to have experiences that span a bigger area, the Vive is a better choice.

Of course the difference in pricing can also be a big factor.  If you're just looking to get into VR, your system will work with the Rift, and money is a limited resource, take advantage of the Rift price drop and get started today.

As an Amazon Associate I earn from qualifying purchases.

Continue reading >

VR Movement – Ironman / Jetpack Flying with SteamVR

There are a lot of interesting movement systems for VR: arm swinging, dashing, teleportation, trackpad, and more.  In this guide, I'll show you another one I experimented with previously and really enjoyed.  We'll go over how to make a player fly like Iron Man, or at least like a person holding some jetpacks.


This article assumes you have SteamVR in your project, though converting it to OVR native should be a simple task.

Rigidbody on [CameraRig]

Start by adding the [CameraRig] prefab from the SteamVR package to your scene.

Add a Rigidbody component to the camerarig.

Freeze the rotation values.

Locking rotation is very important; if you don't, your player will end up spinning around, flipping over, and getting sick.

Now, add a box collider to the rig so you can land (this could be a capsule or any other collider that works well for your game).

Controller Setup

Expand the CameraRig and select both controllers.

Add the SteamVR_TrackedController component to them both.

JetpackBasic Script

Create a new script, name it “JetpackBasic”

Replace the contents with this.
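The script body didn't make it into this excerpt, so here's a sketch of the basic idea (thrust value and structure are my own; it assumes the Rigidbody setup from the previous step):

```csharp
using UnityEngine;

// Applies thrust along the controller's forward direction while the trigger is held.
// Attach to both controllers; expects a Rigidbody on the [CameraRig] parent.
[RequireComponent(typeof(SteamVR_TrackedController))]
public class JetpackBasic : MonoBehaviour
{
    [SerializeField] private float thrust = 12f;

    private SteamVR_TrackedController controller;
    private Rigidbody rigRigidbody;

    private void Start()
    {
        controller = GetComponent<SteamVR_TrackedController>();
        rigRigidbody = GetComponentInParent<Rigidbody>();
    }

    private void FixedUpdate()
    {
        // Aim the controller where you want to go and hold the trigger to fly.
        if (controller.triggerPressed)
            rigRigidbody.AddForce(transform.forward * thrust);
    }
}
```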

Your controller should look similar to this: (the sphere collider and audio source are optional and beyond the scope of this article)

Press Play

Aim the controllers where you want to go.

Pull the trigger.


Continue reading >

How to look inside or through a wall on HoloLens

If you're developing for the HoloLens, one of the coolest effects you can create is seeing through walls.  Of course you can't really see 'through' a wall, but you can place holograms in a way that makes them look like they're inside or behind the wall.  For this to work though, you need to be able to mask out the wall in the foreground.

As an example, I’ll show a breakable wall we built at the last HoloHack, where you could see the cheese inside.

The wall is made up of 4 parts:

  1. Plaster
  2. Bricks
  3. Cheese (inside)
  4. The Occluder (makes it all work)



Bricks behind the Plaster

Cheese behind the wall


It's hard to see in screenshots, but the 3D depth really shows when you look on the HoloLens.

What’s with the big black cutout?

That's the occluder.  It's actually just a mesh, cut out in the shape we wanted for the wall.  The material on it uses the Unlit/Color shader set to black.

On HoloLens, solid black is rendered as transparent.  With a mesh behind it, and the occluder on the wall, this makes the real-world wall invisible, allowing you to see 'inside' the wall.

I wish I could show a video of how this works, but recording on a HoloLens doesn't capture the occluder correctly (it looks right in the headset but not in the recording).

It's also very important that you use an unlit shader (either Unlit/Color or your own).  Any variance from true black (0, 0, 0) will prevent this from working.

Important Note: A quad or UI sprite will NOT work.  The object needs a mesh of some sort.  It does work fine with a primitive cube though.
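If you'd rather build the occluder material from code than by hand, the key settings are just the built-in Unlit/Color shader and a pure black color (a small sketch, assuming a MeshRenderer on your occluder object):

```csharp
using UnityEngine;

// Sets up an occluder material: unlit, pure black renders as transparent on HoloLens.
public class OccluderSetup : MonoBehaviour
{
    private void Start()
    {
        var meshRenderer = GetComponent<MeshRenderer>();
        var occluderMaterial = new Material(Shader.Find("Unlit/Color"));
        occluderMaterial.color = Color.black; // must be exactly (0, 0, 0)
        meshRenderer.material = occluderMaterial;
    }
}
```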

One of the best examples of this functionality I’ve seen is in Robo Raid.  You can see a quick video of it here.

Continue reading >

SteamVR Laser Pointer Menus – Updated for SteamVR 1.2.2

If you build a VR game or experience, there’s a good chance you’ll end up needing some menus.  There are a lot of great ways to build VR menus, ranging from basic laser pointers to some amazing interaction based systems.  Since laser pointers are one of the simplest and most common systems, this guide will focus on how to create them.  We’ll discuss how to use the SteamVR Laser Pointer system ( SteamVR_Laserpointer.cs ).  And we’ll make your standard Unity UGUI (4.6 UI) interface work with the laser pointers.

SteamVR Laser Pointer (steamvr_laserpointer.cs)

The SteamVR Laserpointer is included in the SteamVR asset pack.  Once you’ve imported the asset pack, you can see the script located in the SteamVR/Extras folder.

CameraRig & Setup

For this example, we’ll use the included [CameraRig] prefab and make a few minor modifications.

Create a new scene.

Delete the “MainCamera” from the scene.

Add the [CameraRig] prefab to the scene.

The CameraRig prefab is located in the SteamVR/Prefabs folder.

Select both the Controller (left) and Controller (right) children of the [CameraRig]

Remove the SteamVR TrackedObject component.

Add the SteamVR_TrackedController component

Add the SteamVR_LaserPointer component

Select a color for your pointers.  I’ve chosen RED for mine…


Because the laser pointer script doesn't handle input itself, we'll need to add a new script to tell our UI when we want to interact with it.

Create a new c# script.

Name it VRUIInput

Replace the contents with this.
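The script body didn't survive in this excerpt, so here's a sketch of one simple approach (redoing the pointer's raycast ourselves on trigger click; the class name matches, the internals are my own):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Fires a UI Button when the laser pointer is over it and the trigger is pulled.
[RequireComponent(typeof(SteamVR_TrackedController))]
public class VRUIInput : MonoBehaviour
{
    private SteamVR_TrackedController controller;

    private void Start()
    {
        controller = GetComponent<SteamVR_TrackedController>();
        controller.TriggerClicked += HandleTriggerClicked;
    }

    private void HandleTriggerClicked(object sender, ClickedEventArgs e)
    {
        // Same ray the laser pointer draws; the button's collider makes it hittable.
        RaycastHit hit;
        if (Physics.Raycast(transform.position, transform.forward, out hit))
        {
            var button = hit.collider.GetComponent<Button>();
            if (button != null)
                button.onClick.Invoke();
        }
    }
}
```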

Attach the VRUIInput component to both the Controller (left) and Controller (right).


Update: as of SteamVR 1.2.2, this is fixed and the workaround below is no longer needed.  Upgrade to 1.2.2 and skip this section! 🙂

Before your controllers will track, you'll need to add the SteamVR_UpdatePoses script to the camera.  This is a known bug in the latest SteamVR asset pack.

Select the Camera (eye) child of the [CameraRig]

Add the SteamVR_UpdatePoses component to it.

Halfway there!

If you press play now, you’ll see laser pointers beaming out of your controllers.  They won’t do much yet, but go ahead and check them out to make sure they’re visible.

The UI

It’s time to create a UI that we can interact with.


Create a new Canvas.

Set the RenderMode to "World Space".

Set the transform values to match these.

Scale x = 0.01
Scale y = 0.01
Scale z = 0.01
Width = 500
Height = 500
Position x = 0.0
Position y = 0.0
Position z = 8.0

Panel & Button

Under the Canvas, create a Panel.

Under the Panel, create a Button.


For our button to interact with the laser pointer, we need it to have a collider.

That collider's size & shape need to match our button.  We could set this manually, but to avoid having to resize the collider whenever the button changes, you can use this simple script.

Create a new c# Script.

Name it “VRUIItem”

Replace the contents with this.
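Since the script body didn't survive here, this sketch shows the idea (the class name matches, the internals are my own): keep a BoxCollider sized to the element's RectTransform so the laser's physics raycast can hit it.

```csharp
using UnityEngine;

// Keeps a BoxCollider sized to this UI element's RectTransform.
// RequireComponent is what adds the collider automatically.
[RequireComponent(typeof(BoxCollider))]
public class VRUIItem : MonoBehaviour
{
    private BoxCollider boxCollider;
    private RectTransform rectTransform;

    private void OnEnable()
    {
        rectTransform = GetComponent<RectTransform>();
        ValidateCollider();
    }

    // Called by Unity whenever the RectTransform's dimensions change.
    private void OnRectTransformDimensionsChange()
    {
        ValidateCollider();
    }

    private void ValidateCollider()
    {
        if (rectTransform == null) return;
        if (boxCollider == null)
            boxCollider = GetComponent<BoxCollider>();
        boxCollider.size = rectTransform.sizeDelta;
    }
}
```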

Attach the VRUIItem component to the button.

You should see a BoxCollider added automatically and scaled to the proper size.

Select the Button Component.

Change the Highlight color to something more obvious.. like green..

Add an event handler that changes the Text component to say something different (so we can tell if the click worked).


Here, I’ve duplicated the button 12 times and resized it a bit to show a bit more action.  Go ahead and try that yourself, or build a real menu for your game now. 🙂

The SteamVR Laser Pointer component, combined with a couple of simple scripts, can get you up and running in minutes.  From here, you can simply replace the OnClick events with any normal Unity UI click events you'd use in a non-VR game.

While I’m a big fan of unique and interesting menu systems for VR, laser pointers are definitely an easy to use and intuitive method for input.  And for some games or apps, they’re definitely the preferred choice.


It’s worth noting that another great way to setup UI interactions is via VRTK (VR Tool Kit).  VRTK is something I’ve used in the past and love.  It’s pretty easy to get started with and adds a ton of functionality beyond just laser pointers.  You can read more about VRTK here.


Continue reading >
Page 1 of 4