All posts in "Virtual Reality"

How to play Stereoscopic 3D 360 Video in VR with Unity3D

If you’ve played 360 video in VR, you know it’s pretty cool, but lately 3D video has started to take off.  With off-the-shelf cameras like the Vuze line, it’s gotten easy to record your own without spending the cost of a new car to get started.  Before today, playing 3D 360 video in VR with Unity3D was a bit complicated, but now, thanks to an open source project put out by Unity Technologies, it’s getting easier.  Earlier today, I stumbled on a post and GitHub project they’ve put together to make 3D 360 video simple to implement.

Video Version

Project Setup

To get going, you’ll need a couple of things.  First, you need a 3D video.  For this article, I’m using a side-by-side video you can download from here: https://www.youtube.com/watch?v=UsC6PxAwKNI

If you’re not sure how to download videos from YouTube, I’ve recently started using 4K Video Downloader and love it.  It’s free and seems to have no adware.

Once you’ve downloaded your video, you’ll need to grab the script and shader from this github project: https://github.com/Unity-Technologies/SkyboxPanoramicShader

You can download it or clone the repository, whatever you feel most comfortable doing.

Place the shader and script into your project along with the video file you want to play.

You’ll also need to visit your player settings and make sure the Virtual Reality Enabled box is checked.

Render Texture

To use this shader, we need a render texture.  Create a new render texture and name it “Panoramic Render Texture”

Select the RenderTexture and change the size to 4096×4096.

Change the depth buffer to “No depth buffer”.

Render textures are textures that can be rendered to. They can be used to implement image based rendering effects, dynamic shadows, projectors, reflections or surveillance cameras.

The Video Player

To create a video player, drag the video from the project view into the scene view.  A player will automatically be created with the video assigned to it.

Select the video player and look to the inspector.

Change the render mode to “Render Texture”.

Drag the render texture from the project view into the target texture field.
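If you’d rather wire the player up in code instead of the inspector, the steps above can be sketched like this.  The component name and serialized fields here are my own for illustration, not from the original post:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Sketch: the same inspector steps done in code. Assumes the clip and
// the "Panoramic Render Texture" asset are assigned in the inspector.
public class PanoramicVideoSetup : MonoBehaviour
{
    [SerializeField] private VideoClip clip;
    [SerializeField] private RenderTexture panoramicRenderTexture;

    private void Start()
    {
        // Create a video player and point it at the render texture,
        // exactly as the inspector steps above do.
        var player = gameObject.AddComponent<VideoPlayer>();
        player.clip = clip;
        player.renderMode = VideoRenderMode.RenderTexture;
        player.targetTexture = panoramicRenderTexture;
        player.isLooping = true;
        player.Play();
    }
}
```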

The Material

Next, we need to create a material for the shader.

Create a new material, name it “Skybox”

Drag the render texture onto it.

Set the mapping type to “Latitude Longitude Layout”

Change the image type to 180 Degrees (this video is 180).

Set the 3D layout to “Side by Side”

Skybox Setup

The last step is to assign our material to the skybox.

Open the Lighting window.

Drag the Skybox into the Skybox Material field.

All Done

That’s it, save your scene…

Then put on the headset and press play, the video should start playing in 3D.

What about other video types?

This shader appears to have support for a few different video formats.  In this article, we covered a simple 180 degree side by side video, but you may have noticed the options for 360 & over under.  I haven’t tried those yet, but if you’re interested in them, I’d recommend you check out the full documentation they’ve provided here: https://docs.google.com/document/d/1JjOQ0dXTYPFwg6eSOlIAdqyPo6QMLqh-PETwxf8ZVD8/edit#


VR Dash movement / locomotion with SteamVR and Unity3D

Ready to add dash locomotion to your game?  It’s pretty simple to setup, but there are some important things to consider before you start. In this guide, I’ll show you how to set it up, what to watch out for, and a trick to minimize motion sickness.  All you need is SteamVR and Unity3D to get started.

Before you get started, you’ll want to have some sort of environment setup.  Ideally you want something where it’s obvious that you’re moving.  The locomotion is FAST, and if you just have a flat white area or single repeated texture, it’s gonna be a whole lot harder to tell what’s going on.

Here’s what my test scene looks like.

SteamVR / Oculus SDK

Once you have an environment setup, you’ll need the SteamVR plugin.  The code here is setup for SteamVR but could definitely be ported to Oculus with minimal effort, all we really care about in code is the button click.

After importing SteamVR, drop the [CameraRig] prefab into the middle of your scene.

Select the right controller and add the SteamVR_LaserPointer component to it.

We’re only using the laser pointer to visualize where we’ll dash; there’s no requirement to use the pointer for things to work.  Use it if you like, or choose any other visualization of your choice.

Set the color of the pointer to Green.

DashController

It’s time to add a little code.  This is the only code you’ll need for dashing to work, so take a good look at it.

The Code
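The original code listing didn’t survive in this copy of the post, so here’s a sketch reconstructed from the description that follows.  The field defaults and the way I locate the rig root (via `GetComponentInParent<SteamVR_ControllerManager>`) are my assumptions; treat this as a starting point, not the author’s exact code:

```csharp
using System.Collections;
using UnityEngine;

// Reconstruction sketch of the DashController described below.
[RequireComponent(typeof(SteamVR_TrackedController))]
public class DashController : MonoBehaviour
{
    [SerializeField] private float minDashRange = 1f;   // assumed default
    [SerializeField] private float maxDashRange = 10f;  // assumed default
    [SerializeField] private float dashTime = 0.25f;    // assumed default
    [SerializeField] private Animator maskAnimator;

    private SteamVR_TrackedController trackedController;
    private Transform cameraRigRoot;

    private void Start()
    {
        // Cache components so we only call GetComponent once.
        trackedController = GetComponent<SteamVR_TrackedController>();
        cameraRigRoot = GetComponentInParent<SteamVR_ControllerManager>().transform;

        // Register for the trackpad click; swap for GripClicked or
        // TriggerClicked if you prefer another button.
        trackedController.PadClicked += HandlePadClicked;
    }

    private void HandlePadClicked(object sender, ClickedEventArgs e)
    {
        TryDash();
    }

    private void TryDash()
    {
        // Raycast from the controller in its forward direction - the same
        // ray the laser pointer visualizes.
        RaycastHit hit;
        if (Physics.Raycast(transform.position, transform.forward, out hit) &&
            hit.distance >= minDashRange &&
            hit.distance <= maxDashRange)
        {
            StartCoroutine(DoDash(hit.point));
        }
    }

    private IEnumerator DoDash(Vector3 endPoint)
    {
        // Fade the mask in and give the animation a moment to play.
        maskAnimator.SetBool("Mask", true);
        yield return new WaitForSeconds(0.1f);

        Vector3 startPoint = cameraRigRoot.position;
        float elapsedTime = 0f;
        while (elapsedTime < dashTime)
        {
            elapsedTime += Time.deltaTime;
            float elapsedPct = elapsedTime / dashTime;
            cameraRigRoot.position = Vector3.Lerp(startPoint, endPoint, elapsedPct);
            yield return null;
        }

        // Fade the mask back out.
        maskAnimator.SetBool("Mask", false);
    }
}
```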

RequireComponent – The first thing to notice is we’re requiring a SteamVR_TrackedController component to be on the GameObject.  We’ll be attaching this component to the “Controller (right)”, and the tracked controller will get added automatically.

Serialized Fields – We have 4 fields that are editable in the inspector.

  • MinDashRange & MaxDashRange – We have a minimum and maximum dash range so we can prevent people from dashing across the entire map or ‘dashing in place’.
  • DashTime – This is the amount of time it will take the player to move from their position to the end point.  In this setup it’s always the same time, no matter how far you go.
  • MaskAnimator – Here we’re storing a reference to an animator; we’ll be creating this animator soon.  It’s used to add an effect that will minimize motion sickness.

Private Fields

  • trackedController – We’ll be using the tracked controller reference in a few places, so we’re caching it here to avoid calling GetComponent more than once.
  • cameraRigRoot – The same situation applies here, we’ll use the cameraRigRoot to move the player around, and we don’t want to call GetComponent multiple times.

Start() – Here we do the caching of our private fields, and we register for the PadClicked event on the controllers.  If you wanted to use a different button, you’d register for some other event, like GripClicked or TriggerClicked.

TryDash() – This is what we call when the player clicks the trackpad.  The method performs a raycast from the controller’s position, aimed in its forward direction.  This is exactly what the laser pointer we added earlier does, so if the laser pointer hits something, we should hit the same thing.  And our RaycastHit named “hit” will have a reference to the thing they both hit.  If it does hit something, and the distance of that hit is within our allowed ranges, we start a coroutine called DoDash().

DoDash() – The DoDash method does the actual work here.  First, we set a bool on our maskAnimator named “Mask” to true.  Then we give it 1/10th of a second for the animation to play.  Next, we take note of the cameraRig’s current position and save it off in startPoint.  We then go into a while loop.  This loop will execute for the amount of time specified in our ‘dashTime’ field above (editable in the inspector).  In the loop, we use Vector3.Lerp to move the cameraRig from its startPoint to the endPoint (where the user aimed the laser).

Vector3.Lerp returns a vector at a point between two other Vector3s.  The 3rd value (elapsedPct) determines how far between them it should be.  A value of 0 is right at the start point, a value of 1 is at the end, and a value of 0.5 is right in the middle.
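A quick worked example of that midpoint behavior:

```csharp
// Lerp with t = 0.5 returns the point halfway between the two vectors.
Vector3 mid = Vector3.Lerp(new Vector3(0f, 0f, 0f), new Vector3(10f, 0f, 0f), 0.5f);
// mid == (5, 0, 0)
```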

Once we leave the loop, we set the “Mask” parameter on our animator back to false and we’re done.

Back to the Inspector

Attach the DashController to the Controller (right)

Play & Test

You should be able to play now and dash around.  Put on your headset, aim the controller and click the trackpad!

The Mask – Reducing motion sickness

Dashing around usually feels okay for most people, but motion sickness is still possible.

To help remove some of the sickness, we can add a cool masking effect.  And if you have some artistic skill, you can make it look much better.

Download this DashMask.png image

Select the Camera (eye).  Right click and add a Quad.

Drag the DashMask.png image onto the quad, a new material will be created automatically.

Select the quad and change the material’s render mode to “Fade”

Move and resize the quad to match this.  We need it close to our face as it’s going to be used to mask out the world.

Animating the Mask

If you play now, you’ll have a mask sitting in your face (or not, depending on where you drag the alpha of the albedo color on that material).

What we want though is for the code above to be able to toggle the mask, with a smooth animated fade that we can customize later.

To do this, let’s create a simple animation for the alpha channel of our quad.

The Animation

Select the Quad

Open the Animation window.

Click create animation, name the animation FadeMask

Click record.

Open the Color picker for the Albedo of the material.

Slide the alpha around and settle it at 0.

Drag the Animator red line to the 1:00 mark.

Open the color picker again.

Slide the alpha to the far right.

The Animator

The animator will control our animations.  It handles the state of our animations and does the transitions between them.

Create a new animator for the quad.

The Mask Parameter

In the parameters section, add a new Boolean parameter and name it “Mask”.

Adding the Animations

Drag the FadeMask animation you just made onto the animator.

Do it again; for the second FadeMask, rename it to “FadeMask Reverse”.

Add transitions between the two and use the “Mask” parameter to control them.

Set the FadeMask speed to 5.

Select the FadeMask Reverse and change the speed to -1.

We’re playing the animation in reverse here to fade the mask back out.  We could have a separate animation for this, but we don’t need it when we can just set the speed to -1.

Attaching the Animator

Add an animator to the Quad.

Assign the new Animator Controller to your animator.

The last thing we need to do is assign the MaskAnimator field of the Dash Controller.

Select the controller, drag the quad onto it, and you’re done.


The Result


Oculus Rift + Touch or HTC Vive – Which should I get? – Rift vs Vive

Oculus Rift with Touch vs HTC Vive – The Differences

With the flurry of price drops from Oculus, it seems like a good time to revisit this topic.  If you’ve read my blog before, you may know that in general, I prefer the Vive for my personal use and development, though I do own an Oculus Rift and love it.  If you don’t own either, deciding which to get can be confusing, with arguments from people on both sides about why one is better than the other.  So here, instead of personal preferences, I’ll try to share the actual differences that I’ve experienced owning both for over a year.

The Elephant – Room Scale

From the start, Oculus said their system was for seated experiences.  Then it switched to standing, and eventually ‘full roomscale support’.  The Vive on the other hand was designed from the ground up to work great with room scale setups.

And make no mistake, there is a difference between the two.  I’ve used the Rift with Touch controllers for hundreds of hours, with 2 sensors and with 3.  In general, it ‘works’, but it does have some drawbacks.

Here are the biggest differences with roomscale:

Rift

  • Cables are SHORT, so if your PC isn’t in an ideal spot, you’ll need active USB extension cables (around $20 on monoprice.com).
  • The HMD cables are short too.  I don’t know why, but there is definitely less slack with the Oculus cables.
  • Mounting is harder.  They’re not built for wall mounting first, so you need to get your own mounting solution OR put them on a flat surface where they won’t get bumped or moved.  For my home setup, this is nearly impossible, but in the right office environment it’s not too bad.
  • Tracking isn’t quite as good.  It drops out occasionally (even with 3), but not often enough to be a huge deal.

Vive

  • Lighthouse setup is simple if you have a screwdriver and a wall with power somewhere near.
  • HMD cable is long, so long it sometimes hangs and gets tangled.  It’s still much better to have a long cable though.

Winner: Vive

Visual Quality

Personally, I can’t tell any real difference between the two visually.  My eyesight is good (20/10), and they both look about the same.  I do notice the lenses a bit on them both, but neither bothers me.

Winner: Draw


Comfort

Here the Rift still wins out.  Even with the Deluxe Audio Strap, the Rift is lighter and a bit more comfortable to wear.  That’s not to say that the Vive is uncomfortable, but even a small weight difference can become noticeable if you wear it long enough (unless you happen to have giant neck muscles).

If you do go with the Vive though, the deluxe audio strap does help with weight distribution and comfort, especially for people like me with giant over-sized heads.

Winner: Rift


Controllers

With controllers, it’s kind of a toss up depending on the type of games you like to play.

A lot of people love the Oculus controllers, for good reason.  They’re pretty natural feeling and the touch detection is awesome.  For games where your controller represents your hand, the touch controllers are hands down the best choice.

When it comes to games where you hold a gun, paddle, bat, or something else though, I prefer the Vive wands.  They just feel a bit more like I’m holding the actual object.  And the new knuckles controllers look really promising, they could end up better than the touch controllers, but I can’t say for sure until I try them.

Winner: Draw


Content

In general, the content for the two devices is very similar.  Almost every game can be played across both devices, and there’s no killer game that would lead me to buy one over the other.  While Oculus does have a few exclusives, there are ways around that, and they’re not enough to sway me.

Winner: Draw


Development

This is for the programmers and game devs.  If you’re wondering how hard it is to build a game for one system or the other, it’s about the same.  Both devices work great with SteamVR, though to publish on the Oculus Store you do need to strip out SteamVR and use the native SDK.  That’s the only real difference.  They are both a pleasure to develop for, and I’d call this one a draw.

Winner: Draw


Compatibility

This is something people often overlook.  Both of these devices have the same GPU requirements, but they do differ on other requirements.  The Rift requires 3-4 compatible USB 3.0 ports.  I say compatible because many of the USB 3.0 ports I’ve tried on different systems do not work.  In fact, there’s a list of compatible USB 3.0 add-on cards that you can buy if your onboard ones won’t work.

I’ve also had a few laptops that wouldn’t work with the Rift (USB compatibility or no video), while I’ve never had any problems with the Vive.

The Vive does require a single USB 2.0 port though.  It usually works fine in 3.0 ports too, but I have at least one system where the 3.0 ports are not compatible with the Vive.  Luckily there are plenty of spare 2.0 ports left.

Winner: Vive


Price

The massive shift in pricing is what drove me to reconsider my thoughts on this debate.  If you haven’t seen it already, the Rift has had a price drop to $399 with Touch controllers INCLUDED!

That’s an amazing deal, and on prime day they even had a $100 amazon gift card included (looks like they killed that though).

If you want to do roomscale with the Rift though, you’ll wanna include another $99 for a 3rd sensor (and spare controllers).

The Vive on the other hand is still hovering at $799 ($699 on prime day).

On top of that, the deluxe audio strap is another $99, and knuckles controllers aren’t gonna be free either.

Here, the winner is obvious, Oculus is way ahead.

Winner: Rift


Recommendation

If you go through the differences above, you’ll see it’s pretty even.  Rift wins in some places, the Vive wins in others.  This may leave you wondering, which is right for me?

First, make sure to check compatibility.  If you only have a laptop, make sure that laptop has a GPU that can handle it, AND the USB 3.0 ports for the Rift.  If you don’t have those ports, the Vive is definitely your choice.

Next, think hard about room scale.  Is it really important to you?  Are you going to play most of your games sitting down on a controller or standing in place?  How much room do you really have to play in?
If room scale isn’t your thing, the Rift is an obvious winner.

If you do have the room though, and really want to have experiences that span a bigger area, the Vive is a better choice.

Of course, the giant gap in pricing can also be a big factor.  If you’re just getting into VR, your system will work with the Rift, and money is a limited resource, take advantage of the Rift price drop and get started today.


VR Movement – Ironman / Jetpack Flying with SteamVR

There are a lot of interesting movement systems for VR.  Arm swinging, dashing, teleportation, trackpad, and more.  In this guide, I’ll show you another one I experimented with previously and really enjoyed.  We’ll go over how to make a player fly like Ironman, or at least like a person holding some jetpacks.

Prerequisites

This article assumes you have SteamVR in your project, though converting it to OVR native should be a simple task.

Rigidbody on [CameraRig]

Start by adding the [CameraRig] prefab from the SteamVR package to your scene.

Add a Rigidbody component to the camerarig.

Freeze the rotation values.

Locking rotation is very important; if you don’t, your player will end up spinning around, flipping over, and getting sick.

Now, add a box collider to the rig so you can land (this could be a capsule or any other collider that works well for your game)

Controller Setup

Expand the CameraRig and select both controllers.

Add the SteamVR_TrackedController component to them both.

JetpackBasic Script

Create a new script, name it “JetpackBasic”

Replace the contents with this.
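The script contents were lost in this copy of the post.  As a stand-in, here’s a minimal sketch of what a basic jetpack script might look like, reconstructed from the setup above.  The thrust value and the use of the tracked controller’s `triggerPressed` flag are my assumptions:

```csharp
using UnityEngine;

// Sketch: applies thrust along the controller's forward direction
// while the trigger is held, pushing the rig's Rigidbody around.
[RequireComponent(typeof(SteamVR_TrackedController))]
public class JetpackBasic : MonoBehaviour
{
    [SerializeField] private float thrust = 12f; // assumed value, tune to taste

    private SteamVR_TrackedController controller;
    private Rigidbody rigRigidbody;

    private void Start()
    {
        controller = GetComponent<SteamVR_TrackedController>();
        // The Rigidbody we added to the [CameraRig] parent.
        rigRigidbody = GetComponentInParent<Rigidbody>();
    }

    private void FixedUpdate()
    {
        // Aim the controller where you want to go, pull the trigger, fly.
        if (controller.triggerPressed)
            rigRigidbody.AddForce(transform.forward * thrust);
    }
}
```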

Your controller should look similar to this: (the sphere collider and audio source are optional and beyond the scope of this article)

Press Play

Aim the controllers where you want to go.

Pull the trigger.

Fly!


How to look inside or through a wall on HoloLens

If you’re developing for the HoloLens, one of the coolest effects you can build is seeing through walls.  Of course you can’t really see ‘through’ a wall, but you can place holograms in a way that they look like they’re inside or behind the wall.  For this to work though, you need to be able to mask out the wall in the foreground.

As an example, I’ll show a breakable wall we built at the last HoloHack, where you could see the cheese inside.

The wall is made up of 4 parts..

  1. Plaster
  2. Bricks
  3. Cheese (inside)
  4. The Occluder (makes it all work)


Plaster

Bricks behind the Plaster

Cheese behind the wall


It’s hard to see in screenshots, but the 3d depth really shows when you look on the hololens.

What’s with the big black cutout?

That’s the occluder.  It’s actually just a mesh, cut out in the shape we wanted for the wall.  The material on it uses the Unlit/Color shader set to black.

On HoloLens, solid black is rendered transparent.  With a mesh behind it, and the occluder on the wall, this actually makes the real world wall not visible, allowing you the ability to see ‘inside’ the wall.

I wish I could show a video of how this works, but recording on a hololens doesn’t show the occluder right (it looks correct in the headset but not the recording).

It’s also very important that you use an unlit shader (either unlit/color, or your own).  Any variance from a true black (0, 0, 0), will prevent this from working.

Important Note: A quad or UI sprite will NOT work.  The object needs a mesh of some sort.  It does work fine with a primitive cube though.

One of the best examples of this functionality I’ve seen is in Robo Raid.  You can see a quick video of it here.


SteamVR Laser Pointer Menus – Updated for SteamVR 1.2.2

If you build a VR game or experience, there’s a good chance you’ll end up needing some menus.  There are a lot of great ways to build VR menus, ranging from basic laser pointers to some amazing interaction based systems.  Since laser pointers are one of the simplest and most common systems, this guide will focus on how to create them.  We’ll discuss how to use the SteamVR Laser Pointer system ( SteamVR_Laserpointer.cs ).  And we’ll make your standard Unity UGUI (4.6 UI) interface work with the laser pointers.

SteamVR Laser Pointer (steamvr_laserpointer.cs)

The SteamVR Laserpointer is included in the SteamVR asset pack.  Once you’ve imported the asset pack, you can see the script located in the SteamVR/Extras folder.

CameraRig & Setup

For this example, we’ll use the included [CameraRig] prefab and make a few minor modifications.

Create a new scene.

Delete the “MainCamera” from the scene.

Add the [CameraRig] prefab to the scene.

The CameraRig prefab is located in the SteamVR/Prefabs folder.

Select both the Controller (left) and Controller (right) children of the [CameraRig]

Remove the SteamVR TrackedObject component.

Add the SteamVR_TrackedController component

Add the SteamVR_LaserPointer component

Select a color for your pointers.  I’ve chosen RED for mine…

VRUIInput.cs

Because the laser pointer script doesn’t handle input itself, we’ll need to add a new script to tell our UI when we want to interact with it.

Create a new c# script.

Name it VRUIInput

Replace the contents with this.
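The VRUIInput listing didn’t survive in this copy.  Below is a hedged reconstruction: it uses the laser pointer’s PointerIn/PointerOut events to track which Button the laser is over, and fires that button’s onClick when the trigger is clicked.  The original implementation may differ in the details:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Reconstruction sketch: bridges the SteamVR laser pointer to Unity UI.
[RequireComponent(typeof(SteamVR_LaserPointer))]
[RequireComponent(typeof(SteamVR_TrackedController))]
public class VRUIInput : MonoBehaviour
{
    private SteamVR_LaserPointer laserPointer;
    private SteamVR_TrackedController trackedController;
    private Button hoveredButton; // the Button the laser is currently over

    private void OnEnable()
    {
        laserPointer = GetComponent<SteamVR_LaserPointer>();
        trackedController = GetComponent<SteamVR_TrackedController>();

        laserPointer.PointerIn += HandlePointerIn;
        laserPointer.PointerOut += HandlePointerOut;
        trackedController.TriggerClicked += HandleTriggerClicked;
    }

    private void HandlePointerIn(object sender, PointerEventArgs e)
    {
        hoveredButton = e.target.GetComponent<Button>();
        if (hoveredButton != null)
            hoveredButton.Select(); // shows the highlight color
    }

    private void HandlePointerOut(object sender, PointerEventArgs e)
    {
        var button = e.target.GetComponent<Button>();
        if (button != null && button == hoveredButton)
            hoveredButton = null;
    }

    private void HandleTriggerClicked(object sender, ClickedEventArgs e)
    {
        // Fire the hovered button's normal Unity UI click events.
        if (hoveredButton != null)
            hoveredButton.onClick.Invoke();
    }
}
```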

Attach the VRUIInput component to both the Controller (left) and Controller (right).

UpdatePoses

Update: as of SteamVR 1.2.2, this bug is fixed and the workaround below is no longer needed.  Upgrade to 1.2.2 and skip this section! 🙂

Before your controllers will track, you’ll need to add the SteamVR_Update poses script to the camera.  This is a known bug in the latest SteamVR asset pack.

Select the Camera (eye) child of the [CameraRig]

Add the SteamVR_UpdatePoses component to it.

Half way there!

If you press play now, you’ll see laser pointers beaming out of your controllers.  They won’t do much yet, but go ahead and check them out to make sure they’re visible.

The UI

It’s time to create a UI that we can interact with.

Canvas

Create a new Canvas.

Set the RenderMode to “World Space”.

Set the transform values to match these.

Scale x = 0.01
Scale y = 0.01
Scale z = 0.01
Width = 500
Height = 500
Position x = 0.0
Position y = 0.0
Position z = 8.0

Panel & Button

Under the Canvas, create a Panel.

Under the Panel, create a Button.

VRUIItem.cs

For our button to interact with the laser pointer, we need it to have a collider.

That collider’s size & shape need to match our button.  We could do this manually, but to avoid having to resize the collider whenever the button changes, you can use this simple script.

Create a new c# Script.

Name it “VRUIItem”

Replace the contents with this.
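The VRUIItem listing is missing from this copy.  Here’s a small sketch that does what’s described above: it adds a BoxCollider and keeps its size matched to the button’s RectTransform.  The exact original code may differ:

```csharp
using UnityEngine;

// Reconstruction sketch: keeps a BoxCollider sized to the UI element
// so the laser pointer's raycast can hit it.
[RequireComponent(typeof(RectTransform))]
public class VRUIItem : MonoBehaviour
{
    private BoxCollider boxCollider;
    private RectTransform rectTransform;

    private void OnEnable()
    {
        ValidateCollider();
    }

    private void OnValidate()
    {
        // Re-sync in the editor whenever the button is resized.
        ValidateCollider();
    }

    private void ValidateCollider()
    {
        rectTransform = GetComponent<RectTransform>();

        boxCollider = GetComponent<BoxCollider>();
        if (boxCollider == null)
            boxCollider = gameObject.AddComponent<BoxCollider>();

        // Match the collider to the RectTransform's width & height.
        boxCollider.size = rectTransform.sizeDelta;
    }
}
```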

Attach the VRUIItem component to the button.

You should see a BoxCollider added automatically and scaled to the proper size.

Select the Button Component.

Change the Highlight color to something more obvious.. like green..

Add an event handler that changes the Text component to say something different (so we can tell if the click worked).

Conclusions

Here, I’ve duplicated the button 12 times and resized things a bit to show more action.  Go ahead and try that yourself, or build a real menu for your game now. 🙂

The SteamVR Laser Pointer component, combined with a couple simple scripts, can get you up and running in minutes.  From here, you can simply replace the OnClick events with any normal Unity UI click events you’d use in a non-vr game.

While I’m a big fan of unique and interesting menu systems for VR, laser pointers are definitely an easy to use and intuitive method for input.  And for some games or apps, they’re definitely the preferred choice.

VRTK

It’s worth noting that another great way to setup UI interactions is via VRTK (VR Tool Kit).  VRTK is something I’ve used in the past and love.  It’s pretty easy to get started with and adds a ton of functionality beyond just laser pointers.  You can read more about VRTK here.



VR Interaction – SteamVR Lab Interaction System – Throwables

Today, I’ll introduce you to VR Interaction using the Lab Interaction System.  If you’ve played The Lab, you already know it includes a variety of different interactions spread across a flurry of mini-games.  With a 98% positive review rating, it’s one of the most popular VR experiences, and the interaction system is one of my favorite parts.

Getting started with The Lab Interaction System only takes a couple quick steps.  In this guide, I’ll show you the setup and how to create your first grabbable & throwable object.

Setup The Lab Interaction System

Create an empty project

Import the SteamVR asset from the Asset Store

In your empty scene, delete the “Main Camera”

Add the [CameraRig] prefab from the SteamVR/Prefabs folder into your scene.

Create a new empty gameobject and name it Player

Reset the position of the Player in the Inspector


Right click on the “Player” GameObject in the hierarchy and create 2 Empty Children

Hands Setup

Name them LeftHand & RightHand

Add the Hand component to both the LeftHand & RightHand

Select the LeftHand

In the Inspector, drag the RightHand to the “Other Hand” field.

Select the Right Hand

Assign the “Left Hand” to the “Other Hand” field

Change the “Starting Hand Type” to Right

Controller Prefab

Because we’re not assigning any actual hand models here, we’ll want to fill in the Controller Prefab field.  This will allow the hands to automatically create controller models much the same way the default [CameraRig] controllers do.

Select the left hand and assign the “BlankController” prefab to the “Controller Prefab” field.

Repeat this for the RIGHT HAND

The BlankController prefab can be found in the SteamVR/InteractionSystem/Core/Prefabs folder

Player Setup

Select the “Player” in the Hierarchy.

Add the Player Component to the “Player” GameObject

Drag the hands onto the “Hands” array.

Expand the [CameraRig] and select the “Camera (eye)”

Right click on the Camera (eye) and add a new 3d Object->Sphere as a child.

In the Inspector, change the Sphere Scale to 0.1, 0.1, 0.1

Uncheck (or remove) the Mesh Renderer component of the Sphere

Select the Player

Assign the newly created Sphere to the “Head Collider” field of the Player component

Drag the [CameraRig] to the “Rig Steam VR” field

Environment Setup

So far, we have the player all setup with hands and ready to go.  It’s time to create a little environment where we can grab and throw things!

Create a Plane

Reset the position

Create a Cube

Set the transform values to match the image above – position = (0, 0, 2.5) | scale = (1, 1, 4)

Your scene view should look similar to the image here.

Throwables

OK, it’s time to make something throwable.  Luckily, this is the easiest part.

Create a new empty Sphere

Rename it “Ball”

Set the Scale to (0.1, 0.1, 0.1)

Set the Position to (0, 0.6, 0.7)

Your ball should now be just above the Cube like this

Add the “Throwable” component to the “Ball”

You’ll immediately see a few other components get added to the ball automatically.

That’s it..

Press Play

Pickup your ball (with the Trigger) and throw it!

Conclusions

The SteamVR Lab Interaction System is actually very full-featured, and this only covers the first of many possible interactions available.  The player and hand setup you have here will work as a great starting point for exploring the other interactions.  For a full demo of all the interactions, make sure you check out the “Interactions_Example” scene in the SteamVR/InteractionSystem/Samples/Scenes folder.  Also, if there’s a specific interaction type you’d like more info on, please comment below and let me know.



SteamVR Locomotion and Teleportation Movement

SteamVR Teleport

SteamVR ships with an easy-to-use teleport system that requires very little setup.  If you’ve played The Lab, you’ve seen the system in action.  It allows for quick teleportation to specified areas.

To setup the SteamVR teleportation system, there are a couple requirements.

Basic Setup

If you want to work along with the steps below, you’ll need the SteamVR plugin installed.  If you just want to read though, skip ahead to “Hands”

Next, you’ll want to create a cube and scale it to (20, 1, 20) to act as the floor.

Add a material to the floor; in my example below, you’ll see I went with a wood plank material I found somewhere.

Finally, add a [CameraRig] prefab from the SteamVR plugin prefabs folder.

Hands

The SteamVR teleport system is coupled with The Lab’s interaction system.  As such, it requires the use of the Hand and Player objects.

The way it’s done in the samples, and the way I prefer to do it, is to create two new empty children under the [CameraRig].

Re-name them “Left Hand” & “Right Hand”

Add the “Hand” component to each of them.

Set the “Other Hand” field on both of them by dragging the other ‘hand’ into the field for each of them.

For the Right hand, make sure to set the “Starting Hand Type” to right (the default value is left).

Player

For teleport interactions to work, you’ll need to add the “Player” component to the [CameraRig].

Drag & add the Camera(eye) to the “Hmd Transforms” array.

Add the Left & Right hands to the “Hands” array.

Teleport Prefab

Next, you’ll need to add the “Teleporting” prefab to the scene.

You can find the “Teleporting” prefab in SteamVR/InteractionSystem/Teleport/Prefabs

Drag & Add the Teleporting prefab to the scene.

There are plenty of options you can adjust on the Teleport prefab, but for now, we’ll leave it at the defaults.

Teleport Points

Before we can teleport, we’ll need to tell the system where a valid teleport point is.  Again, the SteamVR system comes with a great prefab to get started.

Take the TeleportPoint prefab and place a couple instances in the scene above the ground.

Teleport Points – Testing

Press Play

Now use the trackpad to teleport to the different teleport points.

Teleport Points – Scene Loading

Though we won’t use it here, I also wanted to point out that the Teleport Point has a “Teleport Type” option on it.

This allows you to make the point load a different scene when the player moves there.  If you want to use this functionality, just switch the type and enter the scene name.

Teleport Area

The final step before we can teleport is to add a “Teleport Area”.

A teleport area works a lot like the teleport point, but allows you to specify a bigger ‘area’ where you can land.

One way to set this up would be to just add the “Teleport Area” component to the floor.  If you try this though, the ground material will change, and when you’re not teleporting, the ground will be disabled completely.

Instead, what we’ll do is clone the ground object, move it a bit, and make that our teleport area.

Duplicate the floor.

Make the duplicate a child of the floor

Re-name it to “Floor – Teleport Area”

Set the Y position to 0.01

Add the “Teleport Area” component to it

As I mentioned above, you’ll notice that the material has automatically changed to “TeleportAreaVisible”.  Again, this is controlled by the “Teleporting” prefab in the scene and one of the reasons we can’t just add the component to the ground.

Teleport Area – Testing

With this in place, you should be able to hit play and teleport all around the area.  Also notice that the teleport points will work fine alongside the area.

Conclusions

The SteamVR interaction system used in the lab is a great starting point for many projects.  If you’re looking to get the basics down, definitely check this system out.  It doesn’t cover every form of movement though and is only a small sampling of the possible locomotion types you can add.  But if you just need good teleport movement, it’s easy to setup and works well.  If you’re interested in other locomotion systems, drop a comment below and let me know what you’d like to learn more about.


Building an Interactive Mobile 360 Video Player for GearVR


This post will run you through the basics of creating a 360 video player for the Oculus GearVR.  The same techniques will work for Google Cardboard, Daydream, HTC Vive, and Oculus Rift with minor changes.  You’ll learn how to play a video, bring up a quick menu, and switch to another video.

Sphere Setup

Before we can play 360 video, we need an inverted sphere.

In a previous post, I showed how to use an inverted sphere mesh.

Today, I’ll show you how to create one in Unity using a single editor script – original source here

Create a new folder named “Editor”

Create a script named InvertedSphere.cs in the “Editor” folder.

Replace the script contents with this.
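The actual script contents come from the source linked above. As a rough idea of what such an editor script does, here's a hedged sketch (the menu path, window layout, and normal-flipping approach are my assumptions, not the original code):

```csharp
using UnityEditor;
using UnityEngine;

// Sketch of an inverted-sphere editor script: adds a
// "Create Other" -> "Inverted Sphere" entry with a size field and a
// create button. The generated sphere renders its inside faces.
public class InvertedSphere : EditorWindow
{
    private float size = 1f;

    [MenuItem("GameObject/Create Other/Inverted Sphere...")]
    private static void ShowWindow()
    {
        GetWindow<InvertedSphere>();
    }

    private void OnGUI()
    {
        size = EditorGUILayout.FloatField("Size", size);
        if (GUILayout.Button("Create Inverted Sphere"))
            CreateInvertedSphere();
    }

    private void CreateInvertedSphere()
    {
        // Start from Unity's built-in sphere primitive.
        var go = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        go.name = "Inverted Sphere";
        go.transform.localScale = Vector3.one * size;

        var meshFilter = go.GetComponent<MeshFilter>();
        var mesh = Object.Instantiate(meshFilter.sharedMesh);

        // Flip the normals so lighting works from the inside...
        var normals = mesh.normals;
        for (int i = 0; i < normals.Length; i++)
            normals[i] = -normals[i];
        mesh.normals = normals;

        // ...and reverse the triangle winding so faces point inward.
        var triangles = mesh.triangles;
        for (int i = 0; i < triangles.Length; i += 3)
        {
            int tmp = triangles[i];
            triangles[i] = triangles[i + 1];
            triangles[i + 1] = tmp;
        }
        mesh.triangles = triangles;

        meshFilter.sharedMesh = mesh;
    }
}
```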

You may need to hit play before continuing, just to force the editor to compile your new editor script.

In a new scene, right click the Hierarchy and select “Create Other” -> “Inverted Sphere”.

Set the size to 100.

Click “Create Inverted Sphere”

Select your “Inverted Sphere” in the Hierarchy

Add a “Video Player” component to the sphere.
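If you'd rather wire the Video Player up from code instead of the inspector, a sketch like this (the component name is mine) points it at the sphere's material:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Sketch: attach to the inverted sphere so the assigned clip renders
// directly onto the sphere's material.
[RequireComponent(typeof(VideoPlayer))]
public class SpherePlayerSetup : MonoBehaviour
{
    void Start()
    {
        var player = GetComponent<VideoPlayer>();
        player.renderMode = VideoRenderMode.MaterialOverride;
        player.targetMaterialRenderer = GetComponent<Renderer>();
        player.isLooping = true;
        player.Play();
    }
}
```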

Importing Videos

Create a “Videos” folder.

Download some sample 360 videos that you enjoy from youtube or your favorite source.

Place them in the “Videos” folder.

Transcoding

Some videos will play fine on GearVR without transcoding them, but many won’t.  Luckily, transcoding is as easy as checking a box and waiting.

Select your videos.

In the inspector, check the “Transcode” box.

Click Apply.

Wait (this took 15 minutes for me).

Video Selection Script

To manage our basic menu system, we’ll need two scripts.

Create a “Code” folder.

In the folder create these two scripts.

VideoSelection.cs

GazeInput.cs
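The contents of the original scripts aren't reproduced here, so here's a minimal sketch of what the two might look like (the field names, the tap-to-mouse-button mapping, and the menu toggle logic are my assumptions, not the original code):

```csharp
using UnityEngine;
using UnityEngine.Video;

// VideoSelection.cs (sketch): one per menu item; holds the clip this
// item should play on the scene's shared VideoPlayer.
public class VideoSelection : MonoBehaviour
{
    public VideoClip videoClip;

    public void Play()
    {
        VideoPlayer player = FindObjectOfType<VideoPlayer>();
        player.clip = videoClip;
        player.Play();
    }
}

// GazeInput.cs (sketch): attach to the camera. A GearVR touchpad tap
// comes through as mouse button 0; the first tap shows the menu, and a
// tap while looking at a menu item plays that item's clip.
public class GazeInput : MonoBehaviour
{
    public GameObject menu; // assign the "Menu" object in the inspector

    void Update()
    {
        if (!Input.GetMouseButtonDown(0))
            return;

        if (!menu.activeSelf)
        {
            menu.SetActive(true);
            return;
        }

        // Raycast straight out from the camera to see what we're gazing at.
        RaycastHit hit;
        if (Physics.Raycast(transform.position, transform.forward, out hit))
        {
            VideoSelection selection = hit.collider.GetComponent<VideoSelection>();
            if (selection != null)
            {
                selection.Play();
                menu.SetActive(false);
            }
        }
    }
}
```

In a real project these would live in two separate files, named to match the class names.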

Menu Setup

For this sample we’ll create a basic menu out of cubes.  Once it’s working, feel free to pretty them up however you like or replace the cubes with something that fits better to your style.

Create an empty gameobject in the Hierarchy.

Name it “Menu”.

Reset the transform so it’s in the center of your scene.

Right click on the “Menu” object and create a cube as a child.

Rename your cube to “Menu Item – yourvideoname” (for example, I used a bunny video).

Adjust the transform values so that X & Z are 1.

Add the VideoSelection component to the new Menu Item

Drag your first video clip onto the Video Clip field of the component.

The second Menu Item

To create another menu item, duplicate the existing one (ctrl-d).

Rename it.

Change the X to -1

Swap out the video clip with a different one.

Camera Setup

The last thing we need to do is set up the camera.

Select the “Main Camera” object in your Hierarchy.

Reset the position.

Set the Position Y value to positive 1.

Add the GazeInput component to the “Main Camera”

 

Building to the phone

The project should be ready to run now.  The final step is to put it onto a GearVR (or whatever platform you’re using).

If you already have an OSIG file, place it in the Assets/Plugins/Android/assets folder of your project (create this folder structure if it doesn’t exist).

If you don’t know what I’m talking about, go here and follow the instructions (only takes a minute) – https://dashboard.oculus.com/tools/osig-generator/

With the OSIG in place we’re good to go.

Save the scene and open the Build dialog

Add the saved scene.

Click Build and Run with the phone plugged in, and if all goes right, you’ll have your video player up and running.

**** IMPORTANT – Make sure your phone is plugged in via USB and UNLOCKED or the deployment will fail ****

Controls

The controls for this video player are very simple.

Tap the GearVR touchpad to bring up the 2 cubes.

Look at the cube you want to play from and tap again.  The video will start playing.

While a video is playing, tap again to bring the menu (cubes) back up.

 

Conclusions & Extensions

This is not the prettiest video player, but it should give you a good idea of how to get started.  Once you get it working, the first thing I’d recommend is making the ‘menu’ a whole lot prettier: add some text to show the video names, or a video preview on the items.

Overall though, the new video player functionality, along with the included Oculus SDK, makes it really easy to get going with your own video player now.  I’m excited to see all the variations people build, and I may end up building a more full-featured one myself soon.


Avoid the VR waiting room with SteamVR Skybox

Valve’s SteamVR plugin makes getting started with VR in Unity quite a bit easier.  The [CameraRig] prefab is a perfect starting point for most projects, and the controller and input tracking are amazing.  But there’s quite a bit more to the package than just that.  The SteamVR_Skybox is one of the features in the SteamVR plugin that you should definitely know about and use.

What’s the SteamVR_Skybox do?

The SteamVR Skybox component will allow you to change what a player in the HMD sees when your game drops frames or is ‘not responding’.

Of course removing frame drops is the best solution, but sometimes you have cases where that’s just not an option.

In many games the most common time this happens is during scene loading.  Even using asynchronous methods to load your scene won’t resolve it for every case.
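An asynchronous load like the one sketched below (a minimal example; the scene name and class name are placeholders) can still hitch when Unity activates the new scene, and that hitch is exactly what the skybox snapshot covers:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch: an asynchronous scene load. Even this can drop frames when
// Unity activates the new scene, which is when a SteamVR_Skybox
// snapshot keeps the player from seeing the default loading area.
public class SceneLoader : MonoBehaviour
{
    public IEnumerator LoadLevel(string sceneName)
    {
        AsyncOperation op = SceneManager.LoadSceneAsync(sceneName);
        while (!op.isDone)
            yield return null; // activation can still hitch here
    }
}
```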

How do I use the SteamVR Skybox?

What I’ll usually do is create a child under the [CameraRig] and name it SteamVRSkybox.

I set the Y position of it to 1.5 and leave the X & Z alone.

This is to simulate a player of about 1.5 meters in height

Then I’ll add the SteamVR_Skybox component to it and click Take Snapshot.

What’s this do?

With this component in place, when the game drops a frame and would normally fall back to the loading area, it instead shows what was visible when I took the snapshot.

Sometimes this is enough and the game will just work, but other times you may have a special ‘loading area’ or some intermission setup that you use.

In that case, just place your SteamVRSkybox in the proper place for your snapshot instead of under the CameraRig.

Any Downsides?

While this is a great solution for most cases, depending on your implementation, you may see some less than perfect stuff.

First, the skybox only shows what was visible when you took the snapshot, so any gameobjects you’ve spawned at runtime won’t be there.  If the frame drops happen during gameplay, this is a tiny bit better than nothing, but it will still feel weird.

When you use it to load levels, I recommend putting the player into a dedicated loading area: a room where nothing changes at runtime, so your snapshot matches up correctly.

Also, because this is part of the SteamVR plugin, this method won’t work for games targeting the Oculus SDKs.  For Oculus, there’s another route you can follow that I’ll share some time in the future.

Conclusions

The SteamVR Skybox is only a tiny part of the awesome SteamVR package.  It’s easy to set up in just a couple minutes and on its own can make your game or experience feel a little more polished.  Again, it’s not a fix-all for bad performance (that should be addressed on its own), but it is a useful component and worth trying out.

 
