Author's archive: Jason Weimann

Unity OnInspectorGUI – Custom Editors, Gizmos, and Spawning Enemies

By Jason Weimann / September 12, 2016

Creating games can be difficult and time consuming.  You have to code all kinds of systems, add and modify art and sound, and of course design levels.

As a programmer, I often found myself overlooking level design, and forgetting just how time consuming and frustrating it could be.

But I also know that as a programmer, there are things I can do to make it easier for myself (and any designers working on the games).

Today, I’ll show you one very useful technique that can drastically reduce the time you spend on design work, while making it a much more fun process.

The Example – Spawn Points

Enemies are a very common thing in video games, and in a large number of them, enemies spawn continuously throughout the game.

The GameObject spawning them can be simple: instantiate an enemy on a set interval.
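As a sketch (the class and field names here are my own, not from the article), a minimal interval spawner might look like this:

```csharp
using UnityEngine;

// Hypothetical minimal spawner: instantiates an enemy prefab on a fixed interval.
public class SimpleSpawner : MonoBehaviour
{
    [SerializeField] private GameObject _enemyPrefab;
    [SerializeField] private float _interval = 3f;

    private void Start()
    {
        // Spawn the first enemy immediately, then one every _interval seconds.
        InvokeRepeating("Spawn", 0f, _interval);
    }

    private void Spawn()
    {
        Instantiate(_enemyPrefab, transform.position, transform.rotation);
    }
}
```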

Before I show you my technique, let me show you how I used to create them.

Version 1 – A simple transform (very bad)

When I first started placing spawn points in a game, I did it by simply placing a transform.  The screenshot below is a step beyond what I used to do, because in this one I’ve enabled the Icon so you can see it.

Custom Editors - Spawn Point as Transform

If you haven’t used the Icons before, the selection dialog is just to the left of the Active checkbox in the inspector.

Custom Editors - Icon Selector

I quickly moved on from just placing a transform though because it got really hard to tell exactly where the spawn point was in the world.  If the transform is below the ground, I wouldn’t be able to tell without moving the camera all around.  The same goes for a spawn point that’s in a building, hovering over the ground, etc.

Version 2 – Using a cube (less bad)

The next evolution of my spawn points involved cubes.  Creating spawn points with a cube renderer mostly resolved the issue with not being able to easily see the position in the scene.

To make this work though, I needed my spawn points to disable the renderer in their Awake() call so I didn’t have random boxes showing in the world when the game was being played.

It also didn’t really solve the issue of spawning enemies on the ground, so I’d have to make my spawners do a raycast downward to the ground to get their spawn point before popping out an enemy.
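The Awake() trick and the downward raycast can be sketched together like this (the class and method names are mine, not from the original project):

```csharp
using UnityEngine;

// Hypothetical cube-based spawn point from "Version 2".
public class CubeSpawnPoint : MonoBehaviour
{
    private void Awake()
    {
        // Hide the placeholder cube as soon as the game runs.
        GetComponent<Renderer>().enabled = false;
    }

    private Vector3 GetGroundedSpawnPosition()
    {
        // Raycast straight down from the cube to find the actual ground point.
        RaycastHit hit;
        if (Physics.Raycast(transform.position, Vector3.down, out hit, 100f))
            return hit.point;

        return transform.position; // fall back to the cube's own position
    }
}
```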

I’d try to place the boxes just a bit over the ground, but found that I wasted a lot of time lining things up right, testing, making minor movements, testing, etc.

In addition to that, it felt ugly, but I used this technique for a very long time….

Custom Editors - Spawn Point as Cube

Version 3 – Custom Editors

After using the previous methods for way too long, I finally came up with a solution that solved my previous problems and made building levels much faster.

Custom Editors - Enemy Spawners Scene View

As you can see in the image, Version 3 looks drastically different.  There are colored spheres with lines connecting them.  There’s text over them instead of in an Icon, and that text carries a lot of info.

Before I show you how it’s done, let me explain what it is you’re seeing.

The Green spheres show actual spawn points for this game.  These are points where enemies will be instantiated.

The Blue spheres are waypoints.  Enemies spawn at the green spheres then walk to the blue ones.

The lines between them show which waypoints belong to each spawnpoint.

What’s that Text?

The text over the spawn point shows a few things.  Let’s examine the top left spawn point.

Custom Editors - Spawn Point Up Close

Intro 1 0:25-0:28 Spawn 2 [1/3] after 5(8s)

Intro 1 – This is the name of the wave/area this spawn point belongs to.  In this case, it’s the first introductory wave the player gets when they start the game.

0:25-0:28 – Here you see the time in the wave that this spawn point will be active.  This spawn point is active for a very short time, starting 25 seconds into the wave and ending only 3 seconds later.

Spawn 2 [1/3] – This tells us how many enemies will spawn from this point.  It’s going to spawn 2 zombies, one every three seconds (the [1/3] shows the count and interval).  The first one will spawn immediately, and the second after 3 seconds.

after 5 – This part isn’t visible on all spawn points, only on spawn points that delay their start.  You can see that in the Hierarchy, this spawn point is under a gameobject that enables after 20 seconds.  Each spawnpoint in a timer can have an additional delay added to it to avoid a large list of timers in the hierarchy.  The 5 second delay is what makes this spawner start at 0:25 instead of 0:20.

Custom Editors - Hierarchy

(8s) – The last thing you see just shows how long this spawnpoint is enabled.  For this one, after 8 seconds it will auto disable itself.  This is just time of the last spawn minus the time the spawn point becomes enabled (28 – 20 in this case). 

Snapping to the Terrain or Navmesh

One final benefit of this system that I want to show before getting into code is the ability to have your spawn points and waypoints automatically snap to the terrain or navmesh.  In the example below, you can see that when I move this waypoint around it will automatically find its place on the ground as soon as I release it.

This saves a ton of time and resolves that entire issue of lining things up.  Don’t do these things manually, have the editor do it for you.

Custom Editors - Waypoint Snapping

How It Works

To make my custom spawn points work like they do, I take advantage of two great features in Unity, Gizmos and Custom Inspectors.

Both parts do about half of the work required to get the full functionality.

Let’s start with this snippet from my EnemySpawner.cs script:
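As a sketch of what such an OnDrawGizmos method can look like (the Wave type, the selection check, and the Handles.Label call are my assumptions, reconstructed from the line-by-line description that follows):

```csharp
#if UNITY_EDITOR
private void OnDrawGizmos()
{
    // Find the Wave parent this spawner belongs to.
    var wave = GetComponentInParent<Wave>();

    // "Line 6": prepend the wave name to this spawner's name.
    string label = string.Format("{0} {1}", wave.name, name);

    // "Line 8": is the wave this gizmo belongs to currently selected?
    bool selected = UnityEditor.Selection.activeTransform != null &&
                    UnityEditor.Selection.activeTransform.IsChildOf(wave.transform);

    Gizmos.color = selected ? Color.green : Color.gray;

    // "Line 12": draw the sphere in whichever color we've chosen.
    Gizmos.DrawSphere(transform.position, 0.5f);

    // "Lines 14-15": draw the label above the sphere for the selected wave only.
    if (selected)
        UnityEditor.Handles.Label(transform.position + Vector3.up, label);
}
#endif
```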

The first thing we do here is get the Wave parent of this spawner.  This is the GameObject that all spawners and timers will be under for a specific wave or area of the game.

In the example above, you saw the green part “Intro 1“.  That part was just the name of the wave we find right here.

Line 6 takes this wave name and uses string.Format to combine it with the current spawner’s name, which is why “Intro 1” appears above the spawning details.

On Line 8, we check to see if the wave this gizmo is for is currently selected.  We then use that to determine if we want a green spawner gizmo or a gray one.  I do this so we can easily tell which spawners are related.  All spawners in a wave will be colored at the same time, and all the ones from other waves will just show up as gray.

Custom Editors - Disabled Spawners

Line 12 draws the sphere using Gizmos.DrawSphere, in whichever color we’ve chosen.

Lines 14-15 will draw the final text above the sphere if the spawner is in the selected wave.

The OnDrawGizmos code is pretty short, and on its own it does some really useful stuff, but there’s a lot missing.  It does show the spheres, and it places the name above the sphere with the wave name as a prefix, but there’s a lot more we want to happen.

For example, the label from line 15 has a lot of useful info, and we pull that from the name, but we don’t want to enter that info manually; we want it auto-generated and updated whenever we change things.

Overriding ToString()

To generate the name, with all the useful data, we override the ToString method of our EnemySpawner class.

If you’ve never overridden the ToString method, you may want to check out this description for a simpler sample of how it works: https://msdn.microsoft.com/en-us/library/ms173154.aspx

Every object in c# has a ToString method that you can override (the default return value for most objects is the name of the class/type).

In this example, we’re building up the rest of the label text.  While I won’t go into the details of each line, the end result of this method looks like this:

"0:25-0:28 Spawn 2 [1/3] after 5(8s)"
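A sketch of what that override might look like; every field name here is an assumption, since the article only shows the resulting string:

```csharp
public override string ToString()
{
    // Produces something like "0:25-0:28 Spawn 2 [1/3] after 5(8s)".
    float start = _timerStartTime + _additionalDelay;          // 20 + 5 = 25
    float end = start + (_spawnCount - 1) * _secondsPerSpawn;  // 25 + 3 = 28
    float enabledFor = end - _timerStartTime;                  // 28 - 20 = 8

    var result = string.Format("0:{0:00}-0:{1:00} Spawn {2} [1/{3}]",
        start, end, _spawnCount, _secondsPerSpawn);

    // "after X" only appears on spawn points that delay their start.
    if (_additionalDelay > 0)
        result += string.Format(" after {0}", _additionalDelay);

    return result + string.Format("({0}s)", enabledFor);
}
```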

The Custom Editor

To tie this all together, we use a custom editor for the EnemySpawner.

Before you see the bigger parts of the script, let’s start with the initial attribute that tells Unity this class is a custom editor.

The CustomEditor attribute allows you to tell the engine which MonoBehaviour you want the editor to be used for.  This is specified by giving it the type of the MonoBehaviour.  In this example it’s typeof(EnemySpawner).

Also remember to add the using UnityEditor statement and make your custom editor derive from the Editor base class.
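Put together, the skeleton looks like this (the editor class name is my own):

```csharp
using UnityEditor;
using UnityEngine;

// Tells Unity to use this editor whenever an EnemySpawner is inspected.
[CustomEditor(typeof(EnemySpawner))]
public class EnemySpawnerEditor : Editor
{
    // The OnInspectorGUI override goes here.
}
```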

The Editor class has one important method you need to override.  Check out this expanded version of the script and the OnInspectorGUI method that’s being overridden.

This method is called whenever the Inspector repaints while the window is visible and the object is selected.  If the Inspector is not visible, or is showing some other game object, this code won’t be called.
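Based on the breakdown that follows, the method is shaped roughly like this sketch (the helper methods, property names, and slider limits are my assumptions; the quoted line numbers refer to the original script):

```csharp
public override void OnInspectorGUI()
{
    // Cache the component this editor is inspecting ("line 12").
    var _enemySpawner = (EnemySpawner)target;

    // Let the base class draw everything we don't handle ourselves ("line 13").
    base.OnInspectorGUI();

    // Min/max movement speed as a single range slider ("lines 14-19").
    float min = _enemySpawner.MinMoveSpeed;
    float max = _enemySpawner.MaxMoveSpeed;
    EditorGUILayout.MinMaxSlider(new GUIContent("Move Speed"), ref min, ref max, 0f, 10f);
    _enemySpawner.MinMoveSpeed = min;
    _enemySpawner.MaxMoveSpeed = max;

    // Buttons that add waypoint children ("lines 21-24").
    if (GUILayout.Button("Add Random Waypoint"))
        AddWaypoint(_enemySpawner, random: true);
    if (GUILayout.Button("Add Static Waypoint"))
        AddWaypoint(_enemySpawner, random: false);

    DisableLeftoverComponents(_enemySpawner); // "line 28"
    StickToGround(_enemySpawner);             // "line 30"

    // Rename the gameobject using the ToString override ("line 33").
    _enemySpawner.name = _enemySpawner.ToString();
}
```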

Code Breakdown

The first thing we do in this OnInspectorGUI method is cache the component we’re working with.

On line 12, we cast target to an EnemySpawner and assign it to the _enemySpawner variable.

The variable target is defined by the Editor base class and references the object this editor is currently showing.

Line 13 calls the base Editor class version of OnInspectorGUI so it can handle anything we’re not dealing with.  Because we’re overriding OnInspectorGUI, nothing from the default inspector will draw unless we call the base version.

Lines 14-19 are a single method call to create a range slider that will fill the min and max movement speed.  I do this just to enforce the idea that the max must be greater than the minimum.  As a benefit, it also makes the value a little easier to visualize.


Lines 21-24 are there to add waypoints to the spawners.  I won’t cover in detail how they work, but these buttons essentially add a child object that will be used as a waypoint.  If it’s a random waypoint, my navigation code will select one at random; if they’re static, the enemies will walk the waypoints in order.  These also have their own gizmo and custom editor code to make them show up as blue in the scene view.

Line 28 just calls a method to disable any leftover colliders or renderers on the spawner.  Generally there aren’t any, but sometimes one gets created with a cube or sphere and I want to make sure that’s disabled right away.  I could just remove them here too, but disabling does the same job and feels safer.

Line 30 does one of the most important parts.  It calls the method to stick the spawner to the ground.  Sticking the spawner down is done with a raycast aimed downward from the spawner’s current position.  We get the hit point and update the spawner’s position.
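A sketch of that snapping method, under the same naming assumptions:

```csharp
private void StickToGround(EnemySpawner spawner)
{
    // Cast from a little above the spawner straight down,
    // then snap the spawner to whatever the ray hits.
    RaycastHit hit;
    Vector3 origin = spawner.transform.position + Vector3.up * 5f;

    if (Physics.Raycast(origin, Vector3.down, out hit, 1000f))
        spawner.transform.position = hit.point;
}
```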

Line 33 wraps it all up by updating the spawner’s name.  It uses the overridden ToString() method we created above to determine the object’s new name.

Auto Naming in Action


Important Note

For a custom editor to work, you need to place the script in a sub-folder named “Editor”.  This sub-folder can be anywhere in your project, and you can have multiple Editor folders, but only scripts in an Editor folder will work.

Custom Editors - EditorFolder

Custom Editors - EnemySpawner


Unity Interfaces

By Jason Weimann / September 4, 2016

Unity Interfaces – Getting Started

Lately, I’ve realized that many Unity developers have never programmed outside of Unity projects.
While there’s nothing wrong with that, it does seem to leave some holes in the average Unity developer’s skill set.
There are some great features and techniques that aren’t commonly used in Unity but are staples for typical c# projects.

That’s all fine, and they can be completely productive, but some of the things I see missing can really help, and I want to make sure to share those things with you.

Because of this, I’ve decided to write a few articles covering some core c# concepts that can really improve your code if you’re not using them already.

The first in this series will cover c# interfaces.

If you google c# interfaces, you’ll come across the msdn definition:

An interface contains definitions for a group of related functionalities that a class or a struct can implement.

Personally, I prefer to use an example to explain them though, so here’s one from an actual game.

The ICanBeShot interface

In Armed Against the Undead, you have guns and shoot zombies.
But you can also shoot other things like Ammo pickups, Weapon unlocks, Lights, etc.

Shooting things is done with a standard raycast from the muzzle of the gun.  Any objects on the correct layer and in range can be shot.

If you’ve used Physics.Raycast before, you’ll know that it returns a bool and outputs a RaycastHit object.

The RaycastHit has a .collider property that points to the collider your raycast found.

In Armed, the implementation of this raycast looks like this:

private bool TryHitEnvironment(Ray ray)
{
    RaycastHit hitInfo;

    if (Physics.Raycast(ray, out hitInfo, _weaponRange, LayerMask.GetMask("EnvironmentAndGround")) == false)
        return false;

    ICanBeShot shootable = hitInfo.collider.GetComponent<ICanBeShot>();

    if (shootable != null)
        shootable.TakeShot(hitInfo.point);

    return true;
}

Here you can see that we do a raycast on the EnvironmentAndGround layer (where I place things you can shoot that aren’t enemies).

If we find something, we attempt to get an ICanBeShot component.

That component is not a concrete class but rather an interface, which is implemented by a variety of components.

It’s also very simple with a single method named TakeShot defined on it as you can see here:

public interface ICanBeShot
{
    void TakeShot(Vector3 hitPosition);
}

If you’ve never used an interface before, it may seem a little strange that there’s no actual code or implementation.  In the interface, we only define how the methods look and not the implementation.  We leave that part to the classes implementing our interface.

How the Interface is used

So now that I have my interface, and I have a method that will search for components implementing that interface, let me show you some of the ways I’m using this interface.

Implementation #1 – Ammo Pickups

public class AmmoBox : MonoBehaviour, ICanBeShot
{
    public void TakeShot(Vector3 hitPosition)
    {
        if (_isSuperWeaponAmmo)
        {
            // add ammo for the super weapon
        }
        // otherwise, add ammo to the currently equipped weapon
    }
}

This ammo script is placed on an Ammo prefab.

Ammo Scene and Inspector

Ammo Scene and Inspector

Notice the box collider that will be found by the raycast in TryHitEnvironment above (line 5).


Ammo Inspector

Ammo Inspector

In the case of the AmmoBox, the TakeShot method will add ammo to the currently equipped weapon.  But an AmmoBox isn’t the only thing we want the player to shoot at.

Implementation #2 – Weapon Unlocks

public class WeaponUnlocker : MonoBehaviour, ICanBeShot
{
    public void TakeShot(Vector3 hitPosition)
    {
        PlayerNotificationPanel.Notify(string.Format("<color=red>{0}</color> UNLOCKED", _weaponToUnlock.name));

        if (_particle != null)
            Instantiate(_particle, transform.position, transform.rotation);
    }
}

Compare the AmmoBox to the WeaponUnlocker.  Here you see that we have a completely different implementation of TakeShot.  Instead of adding ammo to the player’s guns, we’re unlocking a weapon and notifying the player that they’ve unlocked it.

And remember, our code to deal with shooting things didn’t get any more complicated, it’s still just calling TakeShot.  This is one of the key benefits, we can add countless new implementations, without complicating or even editing the code to handle shooting.  As long as those components implement the interface, everything just works.

Implementation #3 – Explosive Boxes

These are crates that when shot will explode and kill zombies.
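No snippet is shown for these, but they follow the same pattern; a hypothetical version (the field names and the IDamageable interface are my assumptions) might look like:

```csharp
public class ExplosiveBox : MonoBehaviour, ICanBeShot
{
    [SerializeField] private float _blastRadius = 3f;
    [SerializeField] private GameObject _explosionEffect;

    public void TakeShot(Vector3 hitPosition)
    {
        Instantiate(_explosionEffect, transform.position, Quaternion.identity);

        // Kill any zombies caught in the blast.
        foreach (var hit in Physics.OverlapSphere(transform.position, _blastRadius))
        {
            var damageable = hit.GetComponent<IDamageable>();
            if (damageable != null)
                damageable.TakeDamage(100);
        }

        Destroy(gameObject);
    }
}
```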

Implementation #4 – Destructible Lights

In addition to everything else, the lights can also take a shot (in which case they explode and turn off the light source component)


Again to make the benefits of Unity interfaces clear, re-examine our code in TryHitEnvironment.

ICanBeShot shootable = hitInfo.collider.GetComponent<ICanBeShot>();

if (shootable != null)
    shootable.TakeShot(hitInfo.point);

We simply look for any collider on the right layer then search for the ICanBeShot interface.  We don’t need to worry about which implementation it is.  If it’s an ammo box, the ammo box code will take care of it.  If it’s a weapon unlock, that’s covered as well.  If we add a new object that implements the interface, we don’t need to touch our current code.

Other Benefits

While I won’t cover everything that’s great about interfaces in depth here, I feel I should at least point out that there are other benefits you can take advantage of.

  1. Unit Testing – If you ever do any unit testing, interfaces are a key component as they allow you to mock out dependencies when you write your tests.
  2. Better Encapsulation – When you code to interfaces, it becomes much more obvious what should be public, and your code typically becomes much better encapsulated.
  3. Loose Coupling – Your code no longer needs to rely on the implementations of the methods it calls, which usually leads to code that is more versatile and changeable.




Using Valve’s Lab Renderer for VR

If you’ve done any VR development, one of the key things you should know is that FRAMERATE is crucial!  Get ready to learn how the Lab Renderer can help you keep that framerate high.

Dipping under 90FPS will quickly make your game nauseating and ruin the fun.

The Lab Renderer - Low FPS Stats

Even if you have the most entertaining, exciting, innovative, other adjective game, if you can’t stay over the target frame rate consistently, your game will suck.

If you’re working in Unity, my original recommendation would have been to keep your vertex count low, bake as much lighting as possible, and limit yourself to 1 or 2 real-time lights where they really give a big payoff.

A few weeks ago though, Valve did something amazing and released their custom renderer for VR as a Unity plugin.

As the name implies, this is what they used to build the VR title “The Lab“, and I’d say was integral in making it look as nice as it does.

The Lab Renderer

The Lab Renderer - Asset Store Screenshot

The description on the asset store page doesn’t do it justice though, so I want to cover the performance gains I’ve seen by switching to this plugin for VR projects.


Real Time Lighting

With Unity in a typical VR game, most lighting is baked.  In fact, most games in general use quite a bit of light baking.

If you don’t know the difference yet between baked and real time lighting, I’d recommend you give this article a read:


It’s long but worth your time.

The reason most lighting is baked though is purely for performance.  Ideally, given unlimited performance, real time lighting for everything would be great.  With the lab renderer, you get much closer to this.

In my previous VR games and experiments, I’d come to accept a limit of 1-3 real time lights being active at a time.

The lab renderer however supports up to 18 real time lights with little to no performance hit.

In my most recent VR game Armed Against the Undead, prior to switching to the Lab Renderer, I couldn’t have more than 1 real-time light enabled at a time without my FPS dipping below 90.

When using baked lighting, I was stuck waiting for light baking every time there was a change, in some cases waiting for 15-60 minutes, which as you can imagine can completely break the flow of work.

Once I made the switch, I was able to immediately turn off all baking, enable many more lights, and even make the lights all destructible!

Anti Aliasing

This comes straight from the Lab Renderer page, because I don’t think I could state it better.  Essentially, you get fast MSAA:

   Single-Pass Forward Rendering and MSAA

Forward rendering is critical for VR, because it enables applications to use MSAA which is the best known anti-aliasing method for VR. Deferred renderers suffer from aliasing issues due to the current state of the art in image-space anti-aliasing algorithms.

Unity’s default forward renderer is multi-pass where geometry is rendered an additional time for each runtime spotlight or point light that lights each object. The Lab’s renderer supports up to 18 dynamic, shadowing lights in a single forward pass.


The key reason you should look into using the Lab Renderer is FPS.

In my experience, the FPS gain from the lab renderer is around 50%.  This could of course vary drastically from project to project, but the difference I’ve seen is huge.

For Armed Against the Undead, the FPS gain is enough to take the game well over 100fps on an NVidia 970.

To demonstrate the difference, I’ve copied the existing project and built a side by side comparison of the game using both the standard and lab renderers.

I’ve added screenshots of both, so you can see there’s no real difference visually, but the FPS gain is huge.  I’ll let the screenshots speak for themselves.

The Standard Shot

Using the Standard Renderer

Using the Standard Renderer (79FPS)

The Lab Shot

Lab Renderer - Using Lab 110fps

Using the Lab Renderer (110FPS)


While the Lab Renderer offers some great performance gains, there are a few downsides that you need to take into account before making the switch.  Some things just aren’t supported yet, and I don’t know when / if that will change.


To take advantage of the Lab Renderer, you need to use the “Valve/VR_Standard” shader.  If you have your own custom shaders, this may be an issue.  If your project is all set up with legacy shaders, you’ll need to manually convert them and make your assets look right again.  While I haven’t had a hard time doing this, I have to admit I’m not an artist and ‘close enough’ works fine for me.  But if you’re very picky about the art, this may be an extra time sink or in some cases a breaking change.  That said, if you’re using the standard shader for everything, the conversion is practically automatic and works great.

In addition to this, your general full screen shaders won’t work either.  If you have special full screen shaders you really need, you may be able to figure out a way to get them working, but in my experience none have worked so far.


The Unity terrain uses its own shaders, not the standard shader.  From what I’ve seen so far, there isn’t an easy way to make the lab renderer work right with terrains.  There are of course tools that convert terrains to meshes, and if you already have one that you like, that may be a good solution.  If your project makes heavy use of the Unity terrain system though, and you can’t easily swap that out, the Lab Renderer may not be right for it.

Getting Started

So now that you know the benefits, and can accept the drawbacks, it’s time to start implementing.

Luckily, doing this is easy and only takes a minute.


Commit your current project to source control!

If something goes wrong, or you find a drawback/issue you didn’t expect, you’ll want an easy way to revert.

GIT – You should be using it

If you’re not using any source control, read my article on Git and set it up before you do the conversion.  It will only take you 15 minutes and will prevent a ton of pain going forward.

Find your camera and add the “Valve Camera” script to it.

The Lab Renderer - Valve Camera

If you’re using the [CameraRig] prefab from the SteamVR Plugin, add this to the (eye), not the (head).

The Lab Renderer - CameraRig - Eye

Disable Shadows

Shadows are handled by the lab renderer, so you need to disable the built in system.  When you want to adjust shadow settings, look at the “Valve Camera” script you added above.

In your project settings, you need to turn off shadows.  To do this, open Edit->Project Settings->Quality

Set Shadows to “Disable Shadows”
Disable Shadows


 Update your materials to use the new shader

Use the menu item “Valve->Convert Active Shaders to Valve”

The Lab Renderer - Update Materials to Valve Shader


This will convert all of your materials using the Standard shader over to the “Valve/VR_Standard” shader, and could take a while if you have a large number of them.

Optionally, you can use “Convert All Materials to Valve shaders”, that will find everything in your project, not just your open scene.  I recommend the Active ones first because it’s usually a lot faster and will give you a good idea of how things are going to run.  If your system is fast though, just select the All option from the start so you don’t need to do it later.

Setup your Lights

Find all of your lights that you want to be realtime (In my case that was all of them), and add the “Valve Realtime Light” script to them.

When you add this, take a look at the options available.  You may not need to modify anything right now, but it’s good to know what you can do with them.

Lab Renderer - Valve Realtime Light Script



After you’re setup and running, you may find your performance isn’t as high as you expect.  This could be caused by some materials not being correct.

The automatic update of materials works great to catch things using the standard shader, but if you had some legacy ones you missed/forgot, they could be causing less than optimal performance.

Luckily, Valve already thought of this and added some helper options to the “Valve Camera” script.

With the game running, enable “Hide All Valve Materials” on your “Valve Camera”.

The Lab Renderer - Hide All Valve Materials

Everything using the correct shader will disappear and only things that still need to be updated will be visible.

Swap those over to the “VR Shader” and check your performance again.

Start Now

My final recommendation is to start using the lab renderer today.  Don’t wait until your project is complete as the task will become much more difficult.

If you start early, you can tune your game for the performance available, and avoid the pitfalls of using something that’s incompatible.

If you want to learn more about the Lab Renderer, you can check out the steam forum here: http://steamcommunity.com/app/358720/discussions/0/

And if you have any feedback or tips you’d like to share, please comment or email me.


VR discussion on .Net Rocks

If you haven’t heard it already, check out this discussion on Unity3D & VR/Vive development.

1310 Building Virtual Reality Apps for Vive VR in Unity3D with Jason Weimann


The hosts of the show, Richard and Carl have a wide variety of experience and brought some interesting points to the chat.

If you haven’t heard them before, the show is about programming in general with a slight leaning toward c# (aka the best Unity language).



Recommended Unity3D Assets you should be using!

By Jason Weimann / June 9, 2016

Preface: None of these asset creators are funding this, and I get nothing for sharing these outside the joy of spreading some of my favorite assets.

In this post, I wanted to cover some of my favorite assets that I’ve been using recently. There are a ton of posts on the best free assets out there.  I've noticed though that people tend to shy away from recommending some of the better paid assets, so I wanted to go over the premium ones that I constantly find myself recommending to other developers.

While this is not a complete list of every asset I'd give 5 stars to, it is the list of ones I've found myself recommending more than a few times in the last month.  Everything on this list is an asset I've purchased myself and have led my friends to buy as well.

I've also been taking feedback on other readers' recommendations and was pleased to see some overlap with my own.  If you have your own recommendations for assets you couldn't work without, please let me know!

Font Styling

Update: TextMeshPro is FREE now and will eventually be integrated into the engine (that's how good it is)

If you’re doing any UI work at all in your game, TextMeshPro is something you should be using. When I first saw the videos and pictures, I was pretty skeptical.

But I gave it a try and was blown away. Within 5 minutes, I’d turned my crappy looking UI into something professional looking just by adjusting a few sliders.

Previously, I’d have to jump into Photoshop, write out some text, add some effects, import it, and see how it worked out. Now I just use the TextMeshPro component instead of the Text component and I’m done.

Insert Icons into your text

In addition to making the text just look great, it also supports in-line icons, bolding, italics, color codes, and more. My description can’t do it justice, so definitely jump over to the videos and look for yourself.

Again, if you’re using text in your game, get this one.

Cartoon Environments: BitGem Dungeon

If you’ve tried some of my demos, you’ll see that I use this one quite a bit.  I really like the way this one looks and how well it performs.  It was easy to get over the required 90fps for a VR game and runs great on mobile too. If you need a dungeon (and some amazing characters to go along with it), I definitely recommend the BitGem ones.

Dungeon Starter Set

Guns & Weapons: Weapon Pack

This pack is amazing. It’s advertised as a weapons pack, but once you open it up, you have the start of a full on FPS. The demo scene lets you run around and shoot things, swap weapons, kill stuff, etc. It has the sounds, particles, and animations to really tie everything together. This is my new go-to weapon pack. While the price is near the highest for the asset store, I think it’s still an amazing deal given the crazy amount of really high quality work you get.

FPS Weapons

Just to reiterate why I think this pack is so great, here’s a list of the stuff it includes!

  • Over 20 weapons
  • Fire & Reload Animations
  • Bullet models (for some of them, but I reused across them all)
  • Impact particles (bullet holes)
  • Muzzle flashes
  • Fire & Reload sound effects
  • A fully playable demo implementation
  • A flame thrower!

CartoonFX Easy Editor
The CartoonFX packs have been around for a while, and for the longest time, they were the only cartoon particles worth mentioning. Now there are a bunch of good ones, and CartoonFX still stands out as one of my favorites. The thing I like the most about it though is the EasyEditor script that’s included.

If you’ve ever needed to scale a particle effect, you probably know how tedious it can be. With the CartoonFX Easy Editor, it’s ultra simple. Pick a new scale, click a button, and it’s done. It covers the scale of all the children (including scale over time).

Even if you don’t want, like, or need cool cartoony particle effects, get one of these packs just for the editor.  It’s a huge time saver and something I recommend using.


Armed Against the Undead Alpha


Today I’m happy to announce an early alpha build of Armed Against the Undead.  The game is available now to anyone who owns the HTC Vive, but only for the next 48 hours.  I’d like to get 100 people in and some feedback so the project can be moved to the next phase and become what the players want.  If you have the HTC Vive and want to play, sign up here:

The alpha signup has expired.  If you really want to play though, sign up to be notified when the next alpha test starts so you don’t miss out.

Note: When you close the game, it will pop up a survey to get feedback.  I know it may be annoying, but please fill it out so I can work to make the game as good as possible.

Game Description

The game takes place in the depths of a dead city.  Your goal is to get out alive.  There are lots of weapons and even more zombies.

Game Modes

The game has 2 distinct modes and more are planned for later.



Armed - Survival Score

Player Health: 1 (it’s survival)

This mode is exactly what you’d expect.  Stand in one place, shoot zombies, and try not to die.  The zombies (and other monsters) come from all directions.  Occasionally, you’ll get new weapons to fend off the beasts.  One hit and you die!  Survive as long as you can and beat the top score.



Armed - Save the Infected

Player Health: 20 (no, you can’t heal)

Story mode takes you through the city, fighting your way out.  You’ll progress through different areas collecting new weapons along the way.  You can choose your path, save the humans, search for an easy way, or go find some better weapons down a back alley.

In this alpha build, only one path is available for testing.

Protect any humans you come across, they’ll bring you nice new weapons.  Make it to the subway and escape alive!


Weapons

Armed - Grip To Reload

Armed Against the Undead has Armed in the name!  There are over 20 weapons available to destroy the undead horde with.  For the alpha, the following weapons are unlocked and available.


Damage: 1

Rate: Slow


Damage: 10

Rate: Very Slow

Armed - Pistol and Shotgun


Damage: 5 per second

Armed - Chainsaw


Damage: 1

Rate: Medium

Armed - SMG


Damage: 1

Rate: Fast


Damage: 1

Rate: Very Fast



Tips

  • Aim for the head.  Zombies die from 1 shot to the head every time.
  • Shooting zombies in the legs can slow them down (they’ll start crawling if you hit them enough).
  • Use the shotgun for up-close encounters.  Its range is small, but it kills just about everything in a single shot.
  • Listen for the growls.  If a zombie comes within 9 feet of you, it will growl, and you’ll have a second or two to shoot it.
  • The chainsaw needs to be IN the zombies to cut them, but kills stuff quickly and doesn’t use ammo.
  • Don’t forget to use the Grip to reload.

Armed - Save the Infected

Get more ammo for your weapons by shooting at the ammo box (with the weapon you want ammo in).

Continue reading >

Unity Coding Standards

Today, we’ll talk about Unity Coding Standards.  We’ll cover things to do, things to avoid, and general tips to keep your projects clean, maintainable, and standardized.

Things to avoid

I want to preface this by saying that the thoughts in this post are guidelines, not criticism of anyone.  These are personal preferences and things I’ve picked up from experience across a variety of different projects.

If you find that you commonly do and use some of these things, don’t be offended, just try to be conscious of the issues that can arise.  With that little disclaimer, here are some of the key things I always fight to avoid and recommend you do your best to limit.

Public Fields

I won’t go deep into this as I think I’ve already covered it here.  Just know that public fields are generally a bad idea.  They often tend to be a precursor to code that’s difficult to read and maintain.

If you need to access something publicly, make it a property with a public getter.  If you really need to set it from another class, make the setter public too, otherwise use a property that looks like this:

public string MyStringThatNeedsPublicReading { get; private set; }
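If you still want the value tweakable in the Inspector, a serialized private field pairs nicely with a read-only property.  A minimal sketch (the names here are illustrative):

```csharp
using UnityEngine;

public class Health : MonoBehaviour
{
    // Editable in the Inspector, but not writable from other classes.
    [SerializeField] private int _maxHealth = 100;

    // Other classes can read this, but only Health can set it.
    public int CurrentHealth { get; private set; }

    private void Awake()
    {
        CurrentHealth = _maxHealth;
    }
}
```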

Large Classes

I’ve seen far too many Unity projects with class sizes that are out of control.  I want to clarify that this is not specific to Unity; I’ve seen classes over 40k lines long in some AAA game projects, and .cs & .js files in web apps over 20k lines long.

That of course does not make them right or acceptable.

Large classes are hard to maintain, hard to read, and a nightmare to improve or extend.  They also always violate one of the most important principles in Object Oriented Programming: the principle of Single Responsibility.

As a general rule I try to keep an average class under 100 lines long.  Some need to be a bit longer, there are always exceptions to the rules.  Once they start approaching 300 lines though, it’s generally time to refactor.  That may at first seem a bit crazy, but it’s a whole lot easier to clean up your classes when they’re 300 lines long than when they reach 1000 or more.  So if you hit this point, start thinking about what your class is doing.

Is it handling character movement?  Is it also handling audio?  Is it dealing with collisions or physics?

Can you split these things into smaller components?  If so, you should do it right away, while it’s easy.
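The split usually falls out naturally as sibling components on the same GameObject.  A sketch with illustrative names (in a real project each class would live in its own file):

```csharp
using UnityEngine;

// One responsibility per component, all attached to the same Player GameObject.
public class PlayerMovement : MonoBehaviour
{
    [SerializeField] private float _speed = 5f;

    private void Update()
    {
        var input = new Vector3(Input.GetAxis("Horizontal"), 0f, Input.GetAxis("Vertical"));
        transform.Translate(input * _speed * Time.deltaTime);
    }
}

public class PlayerAudio : MonoBehaviour
{
    [SerializeField] private AudioSource _footsteps;

    // Called by whatever needs footsteps, e.g. via GetComponent<PlayerAudio>().
    public void PlayFootsteps()
    {
        if (!_footsteps.isPlaying)
            _footsteps.Play();
    }
}
```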

Large Methods

Coding Standards - too long list

Large classes are bad.  Large methods are the kiss of death.

A simple rule of thumb: if your method can’t fit on your screen, it’s too long.  An ideal method length for me is 6-10 lines.  In that size it’s generally doing one thing.  If the method grows far beyond that, it’s probably doing too much.

Sometimes, as in the example below, that one thing is executing other methods that together complete the one bigger thing.  Make use of the Extract Method refactoring: if your method grows too long, extract the parts that are doing different things into separate methods.


Take this Fire() method for example.  Without following any standards, it could easily have grown to this:


protected virtual void Fire()
{
	if (_animation != null && _animation.GetClip("Fire") != null)
		_animation.Play("Fire");

	var muzzlePoint = NextMuzzlePoint();
	if (_muzzleFlashes.Length > 0)
	{
		var muzzleFlash = _muzzleFlashes[UnityEngine.Random.Range(0, _muzzleFlashes.Length)];

		if (_muzzleFlashOverridePoint != null)
			muzzlePoint = _muzzleFlashOverridePoint;

		GameObject spawnedFlash = Instantiate(muzzleFlash, muzzlePoint.position, muzzlePoint.rotation) as GameObject;
	}

	if (_fireAudioSource != null)
		_fireAudioSource.Play();

	if (OnFired != null) OnFired();

	if (OnReady != null)
		OnReady();

	var clip = _animation.GetClip("Ready");
	if (clip != null)
		_isReady = false;

	if (OnAmmoChanged != null)
		OnAmmoChanged(_currentAmmoInClip, _currentAmmoNotInClip);

	RaycastHit hitInfo;
	Ray ray = new Ray(muzzlePoint.position, muzzlePoint.forward);
	Debug.DrawRay(muzzlePoint.position, muzzlePoint.forward);

	if (TryHitCharacterHeads(ray))
		return;

	if (TryHitCharacterBodies(ray))
		return;

	if (OnMiss != null) OnMiss();

	if (_bulletPrefab != null)
	{
		if (_muzzleFlashOverridePoint != null)
			muzzlePoint = _muzzleFlashOverridePoint;
		Instantiate(_bulletPrefab, muzzlePoint.position, muzzlePoint.rotation);
	}
}

This method handles the firing of weapons for an actual game.  If you read over it, you’ll see it’s doing a large number of things to make weapon firing work.  You’ll also notice that it’s not the easiest thing to follow.  As far as long methods go, this one is far from the worst, but I didn’t want to go overboard with the example.

Even so, it can be vastly improved with a few simple refactorings.  By pulling out the key components into separate methods, and naming those methods well, we can make the Fire() functionality a whole lot easier to read and maintain.


    protected virtual void Fire()
	{
		// Extracted helpers (names are representative)
		PlayFireAnimation();

		var muzzlePoint = NextMuzzlePoint();
		SpawnMuzzleFlash(muzzlePoint);
		PlayFireAudio();

		if (OnFired != null) OnFired();

		UpdateAmmoCounts();

		if (TryHitCharacters(muzzlePoint))
			return;

		if (OnMiss != null) OnMiss();

		SpawnBullet(muzzlePoint);
	}


With the refactored example, a new programmer just looking at the code should be able to quickly determine what’s going on.  Each part calls a method named for what it does, and each of those methods is under 5 lines long, so it’s easy to tell how they work.  Given the choice between the 2 examples, I’d recommend #2 every time, and I hope you’d agree.


Casing

The last thing I want to cover in this post is casing.  I’ve noticed that in many of the projects I come across, casing is a mess.  Occasionally, a project I see has some kind of standard they’ve picked and stuck to.  Much of the time though, it’s all over the place with no consistency.

The most important part here is to be consistent.  If you go with some non-standard casing selection, at least be consistent with your non-standard choice.

What I’m going to recommend here though is a typical set of C# standards that you’ll see across most professional projects in gaming, business, and web development.


Classes

Casing: Pascal Case

public class MyClass : MonoBehaviour { }


Methods

Casing: Pascal Case (No Underscores unless it’s a Unit Test)

private void HandleWeaponReady()

Private Fields

Coding Standards - Private Field

Casing: camelCase – with optional underscore prefix

// Either
private int maxAmmo;
// OR my preferred
private int _maxAmmo;

This is one of the few areas where I feel some flexibility.  There are differing camps on the exact naming convention to be used here.

Personally, I prefer the underscore since it provides an obvious distinction between class level fields and variables defined in the scope of a method.

Either is completely acceptable though.  But when you pick one for a project, stick with it.

Public Fields

It’s a trick, there shouldn’t be any! 😉

Public Properties

Casing: Pascal Case

public int RemainingAmmoInClip { get; private set; }

These should also be Automatic Properties whenever possible.  There’s no need for a backing field like some other languages use.

Again, you should also mark the setter as private unless there’s a really good reason to set it from outside the class.


Wrap Up

Again, this is just a short list of a few things that I think are really important and beneficial for your projects.  If you find this info useful, drop in a comment and I’ll work to expand out the list.  If you have your own recommendations and guidelines, add those as well so everyone can learn and grow.
Thanks, and happy coding!

Continue reading >

Getting Started with SteamVR and Unity 5.6 [Updated]

Do you have your Vive?  Are you looking at SteamVR?

Are you ready to start building fun games and experiences in Unity?

Read along for some basic setup steps and a couple useful tips!

The SteamVR Plugin

If you’ve created a new project, the first thing you’ll need is the SteamVR Plugin.

This plugin has everything you need to get up and running, including some sample scenes.

If you’re looking for info on SteamVR with previous versions of Unity, it’s been archived here: http://unity3d.college/steam-vr-unity-5-4-beta/

The [CameraRig] Prefab

The SteamVR team has done a great job at making it easy to start out with the Vive.

Once you’ve imported the SteamVR plugin, you can find the [CameraRig] prefab located in the SteamVR\Prefabs folder.

Create a new Scene

Delete the Default Camera

In a new scene, drag the [CameraRig] prefab into your hierarchy.

Now hit play and enjoy the boring blue skybox.

If you don’t see anything, check your error log.

You may have a message saying:

VR: OpenVR Error! OpenVR failed initialization with error code VRInitError_IPC_ConnectFailed: "Connect to VR Server Failed (301)"!

If so, you need to launch SteamVR.

To do that, open steam and click the SteamVR icon in the top right corner.

Once it’s started, go back to Unity and click play again.

If it still fails to start, post any error message you see in the console into the comments below so I can address your problem.

An Empty World

Looking around and replacing the controllers


If all is working well now, you’ll see the skybox and nothing else until you turn on your controllers.

Turn them on and you should immediately notice they appear in-game.  The triggers should adjust as you press them, and the trackpad should light up as you touch it (just like in the SteamVR Tutorial).

The controllers are available because of the [CameraRig] prefab.  If you expand it out in the Hierarchy, you’ll see the “Controller (left)” and “Controller (right)” children.

In the image shown here, I’ve turned on only the right controller, so the left is still deactivated (dark grey).

When you’re in play mode, the “Model” child of the controller creates children for the different components.

No Controllers? – Important fix for Unity 5.6 and SteamVR

Update: this is fixed as of SteamVR 1.2.2, so this workaround is no longer needed.  Upgrade to 1.2.2 and skip this section! 🙂

If you’re using Unity 5.6 with an older version of the SteamVR plugin, you’ll notice that the controllers don’t actually turn on.

Until the SteamVR plugin is updated, you’ll need to implement this quick fix to get the controllers updating properly.

Select the Camera (eye)

Add the “SteamVR Update Poses” Component to it.

And done.  Now the controllers will track again.

Replacing the Controllers

One of the questions I get quite often is “how do I replace the controllers with a [sword/gun/hand/random other thing]?”

As you may already expect, you can simply add the thing you’d like to replace the controller with as a child of the “Controller (right)” or “Controller (left)” GameObjects.

For this example, we’ll replace the controller with a shotgun (like I did in the Zombie shooter game)

First, I drag the shotgun model under the “Controller (right)” GameObject

This ‘works’, but there’s a bit of a problem.  While the shotgun will move around with the controller, it won’t be aligned correctly.

Now there are a variety of ways you can fix this, but the simplest one and the one I recommend you use is to make a new GameObject for the Shotgun and have the model be a child of it.

With the “Controller (right)” GameObject selected, click GameObject->Create Empty Child

You should see this

Rename the new “GameObject” to “Shotgun”.

Move the “Shotgun” model (in my case named DBS) to be a child of the “Shotgun” GameObject.

Fixing alignment and position

Press play and go to your Scene view.

Because we can’t see the controller without hitting play, these changes must be done in Play mode.  Follow along to see how to keep those changes once you’ve left Play mode.

In the Scene view, adjust your weapon model to be aligned with the controller how you want it to be (some guns for example hold at a different angle than the shotgun pictured below)

While still playing, look to the Inspector and copy the Transform values (right-click the Transform component header and choose Copy Component).

Stop playing, the gun will reset.

Now go back to the model for the gun (child of “Shotgun” in this example) and use the Paste Component Values menu option.

When you play again, your gun (or other object) should be properly aligned and move with your controller.

Once it looks right, disable the “Model” child of the “Controller (right)” GameObject.

You can delete it, but disabling it gives the same effect and allows you to easily re-enable it if you decide to make adjustments to your handheld items.
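If you’d rather not disable it by hand every time, a tiny script on the controller can do it at startup.  A sketch of the idea; it assumes the default child is still named “Model”:

```csharp
using UnityEngine;

// Attach to "Controller (right)" (or left): hides the default controller
// model so only your own child object (the Shotgun here) is rendered.
public class HideControllerModel : MonoBehaviour
{
    private void Start()
    {
        var model = transform.Find("Model");
        if (model != null)
            model.gameObject.SetActive(false);
    }
}
```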

That’s all you need to get started!  You should be able to look around, place objects where your hands are, and get to building stuff!

Serious about VR?

If you really want to get started with SteamVR and build your own game today, my Professional VR Game Development course can jumpstart your project.

Start Today!


Using the Valves – Lab Renderer for VR

Getting started with SteamVR Controller Input

Continue reading >

7 Vive Games – The Retrospective

My experience building 7 Vive games in 7 days

When I first received my Vive and tried out the sample projects, I was hooked.  My mind was immediately racing with all the possibilities for this new format.  I’ve done VR work since the DK1 shipped, but this was revolutionary.  On day 1, I started trying things out in Unity and was blown away with how well it worked.

I had a couple ideas and quickly got to work on the first.  This was going to be a remake of Rampart in VR.  You would pick up the walls and drop them in place, then use the controllers to aim and fire cannonballs.  Within a day, I realized my design was far from optimal and I needed to rethink my plans.


Next, I decided to write a getting started tutorial.  I’d done previous VR tutorials and classes with baseball themes that were a hit so I started the conversion to room-scale.  Within a few hours, I had a working prototype where you could swing a bat and really feel like you were in the game.  This was great, I knew I had to push forward with the Vive.

While I was working on the baseball game, my mind kept wandering to all the other things that could be done.  On the way to a family trip to Disneyland, I was talking to a good friend about how great the Vive is (it’s close, and these trips happen monthly; in fact, I’m writing this in the car on another trip back).  While talking to him, I decided it’d be great to prototype a bunch of games.  A few minutes into the conversation, I’d decided to challenge myself and build a game a day.

I told my wife the plan, and it was locked in.


The first thing I needed to do was build a plan.  I had to come up with a few ideas for games I thought might be fun.  Luckily, there were plenty of friends to bounce ideas off of, and I had a flurry of great ideas coming in.

I needed to have a release schedule.  Since it was going to be 7 days, I knew there was some risk of slipping, so I immediately decided to pad my releases by a day.  A game built and recorded Monday would go out on Tuesday.  I also tried to have a simple game idea that could safely be built in 2 hours or less, in case things went drastically wrong.  To my surprise, the 2-hour game idea was a total failure, but everything else worked out great.

The games

Before going into the lessons learned, I want to give a quick rundown of the different games that were built.

Dungeon Defense

The first game I built was a dungeon defense game where you rule over a little dungeon that’s being invaded by goblins.  In this game, your goal is to prevent the invading goblins from stealing gold from a few chests scattered throughout the scene.  To do that, you shoot at them from above with some magic wands.  Initially, I’d considered building this as more of a tower defense game, but I quickly came to my senses and remembered that building a basic tower defense game would require quite a bit more work than could be done in 4 hours, let alone doing it as a VR experience.

The driving force behind the theme of being giant and ruling over a tiny dungeon or world came from a demo I saw at the first Oculus Connect conference.  They showed a small cartoon town with a lot going on that you could view from wherever your head would fit.  I thought that was a great experience and wanted to follow along while adding some interaction.


Archery

VIVE - Archery

Archery was an idea championed by my wife.  The gameplay here is simple, you have 10 arrows per round, you shoot targets and get points.  Targets further away or moving are worth more points. Your left hand holds the bow (sorry I’m right handed), and the right hand pulls the string and arrow back.  Once you’ve aimed, you pull the trigger and the arrow launches away.

Zombie Shooter

Zombie - FullScreen

VIVE - Zombie Shooter

This was probably my favorite of the set.  It’s a straightforward game where zombies come from all directions and your only task is to stay alive.  You’re equipped with 2 shotguns that automatically reload every second.  The real thrill in this one comes from having a variety of zombie speeds.  I’ve seen plenty of players focus on a single area, forget to turn around, then be startled when zombies are at their back.

Candy Catcher


Candy catcher is a fun goofy game where candy falls from the ceiling.  You have 2 plates to catch the candy with and a tray to drop it into.  Each piece of candy you drop into the tray is worth a point.  The initial concept for this was going to be catching ice cream scoops, but good ice cream scoop art was hard to come across.  In the end though, the ice cream shop with the fun music and colorful candy turned out to be pretty entertaining.

Baseball Pitching

Baseball Pitching Image

Pitching was something I wanted to do so I could add it as a feature in VR Baseball … Unfortunately, it didn’t work out and wasn’t fun at all.  Nobody really seemed to enjoy it, including myself.

That said, VR Baseball itself is quite a bit of fun.  Hitting a ball is a blast.  Attempting to throw one.. is not.

Stack 'em Quick

Stack 'em Quick Image

In this game there are 4 colored blocks.  In front of the player, a copy of these 4 blocks will appear in a random order.  The player’s goal is to pick up the blocks from the ground and stack them in a matching pattern.  When you complete a match, you score a point and a new combination is generated.  This was actually a lot of fun and one of the most popular games I built.  The gameplay is simple, but the challenge is really exciting.

Dungeon Crawler - Hack & Slash

Hack & Slash Image

One of the most requested things during this week was a sword fighting experience.  Having already built a basic dungeon defense game on day 1, I thought I’d re-use some of that work and build a fun hack, slash, & casting type game.  To move the player along, they float on a ‘magic carpet’.  As you move, goblins come around corners and start attacking you.  Your controllers are swords that you swing, tearing through the goblins with ease.  There are also a few weapon replacements you can pick up throughout the experience.  In total, this game lasts a full 10 minutes, though it starts feeling repetitive and slow after the first 2.

What I learned

During this week, I learned a lot about VR development, and more specifically the Vive.  I’ve done a large variety of VR projects in the past on the Oculus DK1, DK2, and GearVR, but this was something totally new.

HTC Vive

The HTC Vive is an amazing device.  If you’ve never tried room scale before, I implore you to go find an opportunity to do it.  Room scale experiences are a huge change from any VR games I’ve played in the past.  It’s hard to explain in words how big a leap it is.  For me, the jump to the Vive is equivalent to the jump from Google Cardboard to an Oculus device.  Controller tracking was available previously with devices like the Razer Hydra, but they pale in comparison.  The Vive controllers are a near perfect input device, working well as a bat, gun, hand, sword, and more.

Unity integration with the Vive is even more noteworthy.  The job the Valve team has done integrating the Vive with Unity is nothing short of incredible. To convert a game to be Vive compatible, you literally only need to import the SteamVR asset pack and drop in the [CameraRig] prefab. This simple action gives you full room scale tracking for both the player and the controllers. Customizing your game from here can be as simple as adding a child gameobject to the controllers and disabling the ‘model’ that renders the controllers by default, which is exactly what I did for most of the games produced.

Room Scale

Room scale as I mentioned is amazing, but it requires a big shift in mindset when developing games.  There are many opportunities for really interactive experiences and a few gotchas to watch out for.

Great Things

  • Moving around! – Experiences where the player had to actually move physically were a big hit.  The one that stands out the most in this set was Stack ’em Quick.  In this game, people were running all around the play area picking up boxes, carrying them to another part, and stacking them up. They’d duck down low to pick up the boxes, and reach up high to put them on top.  I think the full use of room scale and movement is what made this game stand out as the second best.
  • Swinging the arms – You may be surprised to learn that many people actually get tired or exhausted playing some room scale games.  Even my wife, who’s extremely fit, gets worn out after 10-15 minutes in some of the more intense games.  Of course, like anything, how tired you get depends on how intensely you play, but many of these games are so engaging that people get really active while playing them.  Archery & StackEm were both games where people tended to get very immersed and exert themselves.  VR Baseball is another example where swinging a controller as a bat wears people out much like a real batting cage.


  • In-game Player Movement – Unfortunately, movement is a bit more complicated when you’re dealing with a room scale setup and a standing player.  Game 7 is a good representation of the challenges presented.  When moving a standing player, movement at any real speed is very disorienting.  Rotation is far worse!  Turning the player at even the slowest speeds can make them fall over or get VR sickness.  If you try the Hack & Slash demo, you’ll see that even rotation at 1-2 degrees/second is not fun.

General Virtual Reality Performance – FPS!

I can’t stress enough how important it is to remain above 90 fps when building a VR game. That number may seem really high, and it is, but it can’t be avoided. 60 fps used to be the standard for high end PC gamers, but in a VR game, 60 will make you sick.
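A quick way to keep an eye on this while testing is a smoothed frame-time readout on screen.  A minimal sketch, not a profiler replacement:

```csharp
using UnityEngine;

// Rough on-screen FPS readout, smoothed so the number doesn't flicker.
public class FpsDisplay : MonoBehaviour
{
    private float _smoothedDelta;

    private void Update()
    {
        // Exponentially smooth the unscaled frame time.
        _smoothedDelta = Mathf.Lerp(_smoothedDelta, Time.unscaledDeltaTime, 0.05f);
    }

    private void OnGUI()
    {
        if (_smoothedDelta > 0f)
            GUILayout.Label((1f / _smoothedDelta).ToString("F0") + " fps");
    }
}
```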

You have a limited budget for vertex count in your VR games.
Use that vertex budget on things that are close and very visible (hand held items like weapons).

Visual Effects, Trees, and Anti Aliasing

One of the issues I ran into a few times before realizing what was going on was a terrible effect when I put trees in a game.  If you look at the games above, none of them actually use trees at all.  When I put them in, I would inevitably end up blinded and sick from light flickering through the branches.  It turned out that the biggest cause of this was the default quality setting Unity was running in.  Without anti-aliasing turned up to at least 4x, trees were unbearable.  Even with it enabled, many trees still look terrible.  If you plan to have a forest or just a bunch of trees in your game, make sure you test them out right away, and look for trees that don't flicker and blind your players.
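You can also force the setting from code so a build doesn’t ship with a quality level that has MSAA off.  This uses Unity’s built-in QualitySettings API:

```csharp
using UnityEngine;

// Force 4x MSAA at startup, regardless of the active quality level.
public class ForceAntiAliasing : MonoBehaviour
{
    private void Awake()
    {
        QualitySettings.antiAliasing = 4; // valid values: 0, 2, 4, 8
    }
}
```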

Asset Store & Mixamo

Without the Unity asset store, this project would not have been possible. All the games other than #6 relied solely on cheap and free assets from the store. There are some important things to take into consideration when finding assets for VR projects though.

Many of the assets on the store look great but perform terribly.  There were a few assets I purchased for these projects that ended up not working out due to excessive poly/vertex counts.  Currently, there’s no easy way to find out how things will perform without either emailing the asset providers or downloading and trying them out.

One thing that worked well for me though was searching for mobile/optimized keywords when finding assets.  Assets designed for mobile tend to be highly optimized and have little to no impact on framerate.  You do however have to find assets that also look good in VR.

Favorite Assets Used

BitGem Dungeon Builder Starter - I used this for the first and last games.  While it didn't say in the description that it was mobile ready or optimized, it ended up performing better than anything else I'd tried.  The pack and included demo scene looked great in both 3rd and 1st person views.  If you want to build a VR dungeon game, I'd highly recommend you give this asset a try!

Goblin Piker – I also re-used the goblin piker in both the first and last games. This was an asset I was very impressed with.  I wanted something cartoony looking that would fit well in the Dungeon I was using and this was a perfect match.


If you haven’t checked out mixamo.com yet, do it ASAP.  It’s a site with great character models and a giant variety of animations.  And right now, it’s all FREE!  These models and animations used to be pretty expensive, but Mixamo was recently purchased by Adobe, and until they get a pricing system set up, they’ve made all the assets free to everyone.  The Zombies in game 3 were sourced from here, and there are countless other amazing models you can get there.


This may be the most interesting part of the retrospective.  Building 7 games allowed me to try a large variety of mechanics and to not worry about total failure.  Because of that, I got a good amount of insight into what was fun and what was ... "less than fun".

The Fun Stuff

Picking things up
Of all the mechanics I explored, this one was probably the biggest surprise for me.  The feeling of actually picking up objects in Stack 'em Quick made that game a real joy to play.  Even though you're not actually closing your hands, just pulling the trigger to grip something, then moving and releasing it somewhere else, is very enjoyable.  I should note that I've played other Vive games with similar mechanics, and every time, I've felt far more immersed just due to this simple mechanic.  If there's one mechanic you should definitely include, it's this!

Shooting Stuff

This may seem obvious, but shooting stuff never gets boring.  The way the Vive controllers are designed, they work equally well as pistols or shotguns.  Aiming and pulling that trigger is something everyone who played seems to really love.  It's also a very easy mechanic to pick up that doesn't need to be explained.  I should note though that aiming in VR can be difficult at first.  In the Zombie game, I actually added a simple laser sight to make it easier for people to get started.  (I actually just added the laser pointer script that Valve provided with the SDK on a child object and aimed it the right way.)
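A laser sight like that can be as simple as a LineRenderer raycast forward from the weapon.  A sketch of the idea (not Valve's script; names and distances are my own):

```csharp
using UnityEngine;

// Draws a thin line from the muzzle to whatever the weapon is pointing at.
[RequireComponent(typeof(LineRenderer))]
public class LaserSight : MonoBehaviour
{
    [SerializeField] private float _maxDistance = 100f;
    private LineRenderer _line;

    private void Awake()
    {
        _line = GetComponent<LineRenderer>();
    }

    private void Update()
    {
        var start = transform.position;
        RaycastHit hit;
        var end = Physics.Raycast(start, transform.forward, out hit, _maxDistance)
            ? hit.point
            : start + transform.forward * _maxDistance;
        _line.SetPosition(0, start);
        _line.SetPosition(1, end);
    }
}
```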

God Mode / Little Worlds

The first game I did had a big god's-eye view with little minions below.  As I mentioned before, hovering over a miniature world is oddly satisfying.  Everyone who played this in front of me ended up bending down to their knees to look closer at the little goblins walking around.  Now I'm not sure what the best experience will be for this type of game, but I have no doubt that someone will crack this style and make a game-changing product.  And in typical Valve fashion, the only thing you need to do for this to work is increase the scale of the [CameraRig] prefab.
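In practice that one change can be a single line of code.  A sketch (names are illustrative; drag the [CameraRig] instance into the field in the Inspector):

```csharp
using UnityEngine;

// "God mode": scaling the rig up makes the player a giant over a tiny world.
public class GiantMode : MonoBehaviour
{
    [SerializeField] private Transform _cameraRig; // the [CameraRig] instance
    [SerializeField] private float _scale = 10f;

    private void Start()
    {
        _cameraRig.localScale = Vector3.one * _scale;
    }
}
```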

360 Degree Gaming
The only games that really featured things to do from all sides were the Dungeon Shooter and Zombies (Hack & Slash to a lesser degree).  For Zombies, I'd say this was a key feature.  Being surprised by things from behind made this game so much more intense.  Eventually, players learned to keep watching their back, but inevitably they'd get distracted and surprised again.  It's also important when doing these kinds of experiences to make good use of audio cues.  While they may not notice the Zombie running up behind them visually, they'll definitely realize it when they hear the footsteps and a loud growl.  This is such a great mechanic that I've decided to make it a key part of my next Vive game.
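The audio cue itself can be dirt simple, something like this proximity growl.  A sketch; the field names and distance are my own (the 9-foot growl from the tips above is roughly 2.7 meters):

```csharp
using UnityEngine;

// Growl once when the zombie gets close, so the player hears what they can't see.
public class ProximityGrowl : MonoBehaviour
{
    [SerializeField] private Transform _player;
    [SerializeField] private AudioClip _growl;
    [SerializeField] private float _triggerDistance = 2.7f; // roughly 9 feet
    private bool _hasGrowled;

    private void Update()
    {
        if (_hasGrowled || _player == null)
            return;

        if (Vector3.Distance(transform.position, _player.position) < _triggerDistance)
        {
            AudioSource.PlayClipAtPoint(_growl, transform.position);
            _hasGrowled = true;
        }
    }
}
```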

Room Scale
I mentioned this before, but it's worth repeating.  Making the player move around the room really adds a lot.  While I don't expect this to be really common just due to the limited room space many players have, I think it should not be overlooked.  Actually walking across a room to pick something up adds more immersion than anything else.

The Bad

Turning Players

I was shocked at just how bad a feeling it is to have your camera rotate while you're standing still.  In the hack & slash game, my original design was to move the player around with the touchpad.  Within a few seconds of testing, I realized this was a terrible idea.  Moving forward with the touchpad was a bit disorienting and I almost fell over.  Turning was SOOOO much worse.  When I started, I had the turning bound to the headset with forward movement bound to the touchpad.  At the first turn, I had immediate VR sickness that didn't fade for at least an hour.  From here, I tried a few tweaks, and eventually ended up with VERY SLOW controlled rotation and movement on a flying carpet.  Even though the movement is abysmally slow, it's still somewhat nauseating.  Perhaps someone else will find a way to conquer this problem, but to date, I've found no fun way to do it.
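The carpet movement that finally felt tolerable was essentially a very slow constant drift.  A sketch of that approach; the exact speeds I shipped aren't recorded here, so these values are illustrative:

```csharp
using UnityEngine;

// Attach to the [CameraRig]: drifts the whole rig forward very slowly.
// Keep rotation near zero; even a few degrees/second caused VR sickness.
public class MagicCarpet : MonoBehaviour
{
    [SerializeField] private float _moveSpeed = 0.5f; // meters per second
    [SerializeField] private float _turnSpeed = 1f;   // degrees per second

    private void Update()
    {
        transform.Translate(Vector3.forward * _moveSpeed * Time.deltaTime);
        transform.Rotate(0f, _turnSpeed * Time.deltaTime, 0f);
    }
}
```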

Throwing Objects / Releasing the hand
The baseball pitching game seemed like a great idea before I actually tested it out.  The problem I ran into was that when you throw a ball, you instinctively want to open your hand and release it.  Throwing a ball by releasing your finger from a trigger just doesn't feel right.  I think throwing something from the end of a long stick may be a bit less unnatural with the Vive controllers, but a basic ball just isn't fun, and almost nobody could do it successfully.

However this doesn't mean you can't throw anything, there are plenty of great ways to throw objects, it just feels bad throwing balls specifically..

Hitting Controllers together
With archery, this was a real problem.  Everyone who played would put the right controller up to the left and smack them together.  After that collision, they'd smack themselves in the face with the controller while pulling back the string.  A big part of this was caused by changing the in-game representation of the controllers into something that didn't physically match the device held in the player's hands.  Of course, even if we show the controllers, I've still seen plenty of people smack themselves in the face while trying to bring things up close to their eyes for inspection.

Sword Fighting
Unfortunately, sword fighting on the Vive is not an easy problem to solve.  The biggest issue here is that there's no physical requirement to swing a sword and nothing that prevents the player from simply wiggling the controller back and forth as fast as they like.  A real sword requires force, strength, and control... but a light weight controller can't really reproduce this.

Update: A couple of games have figured this out, and made it a lot of fun.  But it took time and work to get it right.


I hate to conclude this post at all, but I fear that if I write for as long as I really want, nobody would make it to the end.  I have to say that building these 7 games was one of the most exciting experiences I've ever had, and while it was a bit exhausting, it was worth every bit.  I also have to reiterate just how impressed I've been with Valve and HTC.  The device works amazingly well, and the software & support are the best I've ever had the pleasure to work with.

If you take away only one thing from this article and experience, let it be experimentation.  When you decide to build your own VR game, don't build out an elaborate design and plan from the start.  Try everything!  Take every little idea you have and put a day or two into it.
See how things feel in VR.  You'll be surprised at what fails and even more surprised at what's a giant hit.

Downloads - Get the Games now!

If you'd like to try out these games yourself, you're in luck!  They're all freely available for download right now!

Thinking about building your own VR game?

After building these 7 prototypes, I went full in and built 6 full games for the HTC Vive & Rift that are selling on Steam now.  If you're considering building a VR game of your own and want to get a head start on the process, give my VR Game Development course a test run.






Continue reading >