
How to make games – Making the transition from business apps and web development into gaming

By Jason Weimann / July 26, 2017

Switching to Game Development

Video games are the driving force behind many programmers' aspirations.  I grew up with a Commodore 64, a tape drive, and three games of my own.  After beating all three, I wanted more… but as a kid, buying more games wasn't an option.  Knowing I couldn't buy games, I decided to try to make my own.  I dove into the BASIC language of the C64, wrote text to the screen, discovered the amazing GOTO statement, and made some pretty terrible stuff.

While the games I built weren’t really worth playing, the experience of writing my own code sparked something inside me.  I realized that even a kid could control computers, make them do what I wanted… I learned that I could code.

Even though my first attempts at coding were all games, like many people, my first actual programming job had nothing to do with games at all.  In fact, I've spent much of my career working on business apps, high-traffic web sites, hardware test tools, and a variety of other non-game software.  In that time, I've met dozens of 'business programmers' who were inspired to learn the craft by video games.  Many of them also harbored a long-held desire to one day program their own games.  But as professional developers with bills, families, and responsibilities, they often struggled to take the dive.  A couple of them attempted the transition into 'game development', but most struggled to get started, and for good reason.

Game development is different

There's a tendency many people have to think that 'programming is programming': the idea that it's all the same, and you're just using slightly different libraries or syntax.  I've always found this argument flawed.  You wouldn't hire a firmware programmer to build a high-traffic AWS-hosted web site, and I wouldn't hire a database tool programmer to write firmware for my phone.

While all programming is loosely related, and knowing one form does help you learn another faster, there are huge differences in how you code depending on your platform, project, and language.  The same is true for game development.

Don't let that discourage you though.  If you learn the differences and embrace them, you can quickly take your existing skills and start building games of your own.  But the key there is embracing the differences.  If you try to code your game like you code your WPF app, you'll struggle, make life harder than it should be, and increase the risk that you never complete a project.

In this series, I’ll cover some of the key differences.  I’ll try to explain how to translate your existing skills into game development, and give you a heads up on pitfalls to avoid.

Embracing the Engine

I want to start with the biggest mistake I see people make as they try to transition.  Almost everyone I’ve worked with who took the dive failed here first.

When I say embrace the engine, I mean really embrace it.  Don’t jump in with the plan to start building your game as the first project you do.  Instead, when you start out, build a bunch of very different games from samples, guides, video tutorials, or anywhere else.  Build small things that can be completed in a day or two.

Imagine you were teaching a new junior programmer your craft.  If you're a web developer, what would you recommend they do first?  Build their 'Facebook but better' clone?  Or set up a simple sample page using your framework of choice?

Remember that when you're switching to a new platform and technology, you're starting out like a junior developer.  Perhaps a brilliant junior developer, but still junior nonetheless.

Building these small games will teach you a lot.  You’ll discover that some things you thought would be hard are actually really easy, if you know what you’re doing.

Moving things

Another area I've seen almost everyone struggle with is basic movement.  Many of the developers I've known were so accustomed to moving objects on screen via code that they never thought to try the animation system.  Instead of spending 5 minutes creating a beautiful animation, they'd spend hours hard-coding movements, then break down when they wanted to change things even a bit and everything broke.  Game engines do plenty of things really well, and animation is one of the things they do best.

The same applies to physics.

For example, you may think that building a rope game like Cut the Rope requires complex physics code, advanced math knowledge, and a ton of work.  But if you know the engine, you can build the entire thing without writing a line of physics code and with little to no math.

I've known people who tried to build similar games from scratch because they never took the time to learn about the HingeJoint2D component.

The Loop – You can see it! (kinda)

In most programming environments, there's a hidden loop running.  That loop is pumping messages and events to your components, making your object interactions clean and seamless, and calling your web controller methods for you.  This abstraction is extremely helpful for business and web apps; you wouldn't want to write that code yourself, and you don't need to.

When you jump into game development though, you'll notice that the loop is visible (or pretty close to it).  In Unity, the engine calls methods on every GameObject that's active in your scene, without you telling it to.

These methods are called at different times, and you need to know how to use them.  For example, the Awake() method is called early in the object's lifecycle and is often used to cache references to other objects and initialize state.

The Update() method is called every frame and is a common place for movement code & input handling.

But don't think you'd do collision checks in Update.  In fact, going back to embracing the engine, collisions are sent to you in a more event-driven way via methods like OnCollisionEnter, OnCollisionStay, OnTriggerEnter, etc.

Wait, what's the difference between a collision and a trigger?  This is exactly the kind of thing you want to learn early.  (Triggers don't cause physics interactions like bouncing or stopping; they just call your code.  Collisions behave the way you'd expect physics to.)
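Here's a minimal sketch of what those engine-called methods look like in practice (the class name and logged messages are purely illustrative):

```csharp
using UnityEngine;

// Illustrative component showing the lifecycle methods the engine calls for you.
public class LifecycleExample : MonoBehaviour
{
    private Rigidbody cachedBody;

    // Called once, early in the object's lifecycle -- cache references here.
    private void Awake()
    {
        cachedBody = GetComponent<Rigidbody>();
    }

    // Called every frame -- input handling and movement usually live here.
    private void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space))
            Debug.Log("Jump pressed");
    }

    // Called when another collider physically hits this one.
    private void OnCollisionEnter(Collision collision)
    {
        Debug.Log("Bumped into " + collision.gameObject.name +
                  " while moving at " + cachedBody.velocity.magnitude);
    }

    // Called when something passes through a collider marked 'Is Trigger' --
    // no physical response, just this callback.
    private void OnTriggerEnter(Collider other)
    {
        Debug.Log(other.name + " entered the trigger");
    }
}
```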

To learn a bit more about these methods in Unity, take a look at the Execution Order page here.  It has a nice chart that shows you the methods and when they’re called: https://docs.unity3d.com/Manual/ExecutionOrder.html

Frame Rate Matters

Of course we all know that the frame rate of our games is important.  If it drops too low, the game feels bad.  But it also impacts how our code runs..

That's because methods like Update() are called every frame.  So if your game runs at 10 FPS, Update is called 10 times a second.  If it runs at 300 FPS, Update is called 300 times a second.

Why does this matter?  Because your code needs to take it into account.  If you move an object along its x axis by 1 unit every Update, the speed will vary with framerate.  At 10 FPS it'd move 10 units per second; at 300 FPS it'd move a whole lot further.

Again, there's an easy solution: the static Time.deltaTime value you can access from anywhere.  Multiplying your movement amounts by it will smooth everything out regardless of framerate… but you have to know it exists to use it.
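For example, a small sketch of framerate-independent movement (the component name and speed field are illustrative):

```csharp
using UnityEngine;

// Moves an object along its x axis at a framerate-independent speed.
public class Mover : MonoBehaviour
{
    [SerializeField] private float unitsPerSecond = 1f;

    private void Update()
    {
        // Without Time.deltaTime this would move 1 unit per FRAME;
        // multiplying by the time since the last frame makes it
        // 'unitsPerSecond' units per SECOND at any framerate.
        transform.Translate(unitsPerSecond * Time.deltaTime, 0f, 0f);
    }
}
```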

Project Setup

Another thing that's VERY different from traditional business programming is the project setup.  In my experience, I usually had a solution file that referenced some project files, which all referenced code files, plus a couple of images in a resources folder.

In Unity, the setup is completely based on folder structure, and code is only a tiny portion of your project.  Remember that, because it's important: code is only a tiny portion of your project!

And the startup method of your project?  The void Main()?  You won't see it.  For your purposes, it doesn't exist.

Instead, you start with a scene.  In that scene you can do your setup, or in many cases the scene will already be set up and ready to go, with little to no code initialization required.

That may sound disturbing: there's no single source of truth in code that defines how your app starts, where dependencies go, or what order to initialize everything in.

It’s not in code… but it is in your scene.  Your scene is where you setup these things.

Experienced developers often find this out and immediately create a 'setup' scene with a single code file and a setup/main method that builds up their game.  Don't do this.

This is exactly what fighting against the engine looks like.  Scenes exist for a reason: to make game development clean and quick.  Building everything up in code makes it harder to build, harder to debug, harder to expand, and harder to complete, with no benefit beyond that warm cozy feeling of having a main() or App_Start() replacement.  Scenes are there to help you, and learning to use them correctly is vital to your success.

What about dependencies and Injection?

I’ve written posts in the past about DI in Unity.  After a quick stint of web development, my love for DI was renewed and I experimented again with building a Unity project with DI.

Dependency injection is great in most environments.  If you're building a web site or a business app and not using a good DI container, you're probably missing out.  In game development, though, DI is a bit different.

There are a few frameworks out there that do DI for Unity, but they don't get much traction, and there's a good reason: they also fight against the engine.  They do exactly what I just said not to do; they build up the game in code instead of in the scene.

There may be edge cases where DI in a game makes sense, but in general I’d avoid it.

But how do I setup my dependencies?

Instead of using a DI framework, you really need to embrace the concepts of prefabs and component-based development.

Prefabs are somewhat analogous to dependency injection in Unity.  Instead of configuring your object dependencies in code, you do it in the editor and create a prefab to be re-used later.

Here, my dependencies for which types of coins to spawn are defined on the GameObject and reference prefabs for different coin values.
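As an illustration (the CoinSpawner class and its fields are hypothetical, not from the original project), prefab dependencies wired up in the Inspector might look like this:

```csharp
using UnityEngine;

// Hypothetical spawner -- the prefab references are assigned in the Inspector,
// not resolved in code, which is the prefab analogue of injecting dependencies.
public class CoinSpawner : MonoBehaviour
{
    [SerializeField] private GameObject bronzeCoinPrefab;
    [SerializeField] private GameObject silverCoinPrefab;
    [SerializeField] private GameObject goldCoinPrefab;

    public void SpawnCoin(int value, Vector3 position)
    {
        // Pick the prefab that matches the coin's value.
        GameObject prefab = value >= 100 ? goldCoinPrefab
                          : value >= 10  ? silverCoinPrefab
                          : bronzeCoinPrefab;
        Instantiate(prefab, position, Quaternion.identity);
    }
}
```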

What about components?

Another big difference with game development is that engines are generally driven by a component-based system.  In Unity, this is done by adding 'components' to 'GameObjects'.  These components are just classes that you write (or that come with the engine).  The classes all inherit from a common base class called MonoBehaviour, and as I covered earlier, these components have built-in methods that are called by the engine loop.

C# developers – it's worth noting that the methods of a MonoBehaviour act much like virtual methods that are overridden, but at the time of this writing they feel more like 'magically' named methods because they lack the override keyword.  For your purposes though, just imagine they have the override keyword in front of them.

Every GameObject in Unity contains at least one component, called a Transform.  The Transform handles position, rotation, and scale.

To create a working thing in your game, you’ll combine a variety of components.

For example, if you were creating a coin that the player could grab, you'd have a Transform, a sphere collider, an animator (to make it spin), and a script of your own to handle the coin being hit and consumed.

An example of a few components put together to make a collectible coin

In this example, your code could be as simple as implementing the OnTriggerEnter() method and calling a single line to Destroy the coin… and maybe another line to increment the player's coin count.
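A sketch of what that coin script might look like (the Coin and CoinCounter names are illustrative, and it assumes the player object is tagged "Player"):

```csharp
using UnityEngine;

// Illustrative counter -- not part of Unity, just a place to keep the total.
public static class CoinCounter
{
    public static int Total;
}

// Hypothetical collectible coin: the sphere collider on this GameObject
// is marked 'Is Trigger', so OnTriggerEnter fires when the player touches it.
public class Coin : MonoBehaviour
{
    [SerializeField] private int value = 1;

    private void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Player"))
            return;

        CoinCounter.Total += value;
        Destroy(gameObject);
    }
}
```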

The biggest benefit of component-based development is reusability.  If you develop your components to be small and generalized, you can often mix and match them on a GameObject to create the behavior you want.  Don't expect to do this right away or in every case, but keep it at the forefront of your mind when writing your own components.

Some skills that translate well (and will give you an advantage)

So far, I’ve talked a lot about things I’ve seen developers struggle with when making the transition to game development.  But there are some skills I’ve seen that are more common on the non-game side of programming and can give you a good boost getting started.

SOLID – If you're an advocate of SOLID principles, they can be very helpful.  I'd especially focus on the Single Responsibility Principle.  I've seen countless experienced game developers who do amazing things but struggle to understand and use its benefits.  Remember it when you're building your components: keep them small, reusable, and replaceable.

Another skill I've found to be more prevalent in the web world is good use of design patterns.  It's important not to use patterns that fight against the engine, but when you do have a good use for a pattern, having the experience to recognize it, and code it, can be a great help.  A simple example would be knowing when to build a simple state machine instead of a giant switch statement (I've seen way too many of those giant switches).  But again, it's worth knowing that there's also a visual state machine built into the engine with Mecanim; it's primarily used for animation, but it can fill other roles occasionally.
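As a sketch, a tiny state machine like this can replace a growing switch statement (the enemy states here are purely illustrative):

```csharp
using UnityEngine;

// Each state handles one behavior and decides its own successor.
public abstract class EnemyState
{
    public abstract EnemyState Tick();
}

public class PatrolState : EnemyState
{
    public override EnemyState Tick()
    {
        // Patrol logic would go here; return a new ChaseState (or similar)
        // when the player is spotted, otherwise stay in this state.
        return this;
    }
}

public class Enemy : MonoBehaviour
{
    private EnemyState currentState = new PatrolState();

    private void Update()
    {
        // No giant switch: the current state runs and picks the next one.
        currentState = currentState.Tick();
    }
}
```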

And of course your general programming experience is helpful, and knowledge of the .NET framework is a huge help with Unity.

Next Up

There's still a lot I want to cover: differences in testing, art, rendering, assets, audio, scene management, project management, and more.

In the follow-up article, I'll cover some of those things, along with questions people may have in the comments.

So if you're wondering how to translate some skill into game development, please ask away.


VR Dash movement / locomotion with SteamVR and Unity3D

Ready to add dash locomotion to your game?  It's pretty simple to set up, but there are some important things to consider before you start.  In this guide, I'll show you how to set it up, what to watch out for, and a trick to minimize motion sickness.  All you need is SteamVR and Unity3D to get started.

Before you get started, you'll want to have some sort of environment set up.  Ideally you want something where it's obvious that you're moving.  The locomotion is FAST, and if you just have a flat white area or a single repeated texture, it's gonna be a whole lot harder to tell what's going on.

Here’s what my test scene looks like.

SteamVR / Oculus SDK

Once you have an environment set up, you'll need the SteamVR plugin.  The code here is written for SteamVR but could definitely be ported to Oculus with minimal effort; all we really care about in code is the button click.

After importing SteamVR, drop the [CameraRig] prefab into the middle of your scene.

Select the right controller and add the SteamVR_LaserPointer component to it.

We're only using the laser pointer to visualize where we'll dash; there's no requirement to use it for things to work.  Use it if you like, or choose any other visualization.

Set the color of the pointer to Green.

DashController

It’s time to add a little code.  This is the only code you’ll need for dashing to work, so take a good look at it.

The Code
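The original listing isn't reproduced in this text, so below is a reconstruction pieced together from the walkthrough that follows.  The SteamVR 1.x types and events (SteamVR_TrackedController, PadClicked) are as described, while details like how cameraRigRoot is found are inferred and may differ from the original:

```csharp
using System.Collections;
using UnityEngine;

// Reconstructed sketch of the DashController described below.
[RequireComponent(typeof(SteamVR_TrackedController))]
public class DashController : MonoBehaviour
{
    [SerializeField] private float minDashRange = 1f;
    [SerializeField] private float maxDashRange = 20f;
    [SerializeField] private float dashTime = 0.25f;
    [SerializeField] private Animator maskAnimator;

    private SteamVR_TrackedController trackedController;
    private Transform cameraRigRoot;

    private void Start()
    {
        // Cache references once so we don't call GetComponent repeatedly.
        trackedController = GetComponent<SteamVR_TrackedController>();
        cameraRigRoot = GetComponentInParent<SteamVR_ControllerManager>().transform;
        trackedController.PadClicked += (sender, e) => TryDash();
    }

    private void TryDash()
    {
        // Raycast from the controller, just like the laser pointer does.
        RaycastHit hit;
        if (Physics.Raycast(trackedController.transform.position,
                            trackedController.transform.forward, out hit) &&
            hit.distance >= minDashRange && hit.distance <= maxDashRange)
        {
            StartCoroutine(DoDash(hit.point));
        }
    }

    private IEnumerator DoDash(Vector3 endPoint)
    {
        maskAnimator.SetBool("Mask", true);
        yield return new WaitForSeconds(0.1f); // let the fade-in play

        Vector3 startPoint = cameraRigRoot.position;
        float elapsed = 0f;
        while (elapsed < dashTime)
        {
            elapsed += Time.deltaTime;
            float elapsedPct = elapsed / dashTime;
            cameraRigRoot.position = Vector3.Lerp(startPoint, endPoint, elapsedPct);
            yield return null;
        }

        maskAnimator.SetBool("Mask", false);
    }
}
```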

RequireComponent – The first thing to notice is that we're requiring a SteamVR_TrackedController component to be on the GameObject.  We'll be attaching this component to the "Controller (right)", and the tracked controller will get added automatically.

Serialized Fields – We have 4 fields that are editable in the inspector.

  • MinDashRange & MaxDashRange – We have a minimum and maximum dash range so we can prevent people from dashing across the entire map or 'dashing in place'.
  • DashTime – This is the amount of time it will take the player to move from their position to the end point.  In this setup it’s always the same time, no matter how far you go.
  • MaskAnimator – Here we're storing a reference to an animator; we'll be creating this animator soon.  It's used to add an effect that minimizes motion sickness.

Private Fields

  • trackedController – We’ll be using the tracked controller reference in a few places, so we’re caching it here to avoid calling GetComponent more than once.
  • cameraRigRoot – The same situation applies here, we’ll use the cameraRigRoot to move the player around, and we don’t want to call GetComponent multiple times.

Start() – Here we do the caching of our private fields, and we register for the PadClicked event on the controllers.  If you wanted to use a different button, you’d register for some other event, like GripClicked or TriggerClicked.

TryDash() – This is what we call when the player clicks the trackpad.  The method performs a raycast from the controller's position, aimed in its forward direction.  This is exactly what the laser pointer we added earlier does, so if the laser pointer hits something, we should hit the same thing, and our RaycastHit named "hit" will have a reference to it.  If we do hit something, and the distance of that hit is within our allowed range, we start a coroutine called DoDash().

DoDash() – The DoDash method does the actual work.  First, we set a bool named "Mask" on our maskAnimator to true.  Then we give it 1/10th of a second for the animation to play.  Next, we note the cameraRig's current position and save it off in startPoint.  We then enter a while loop that executes for the amount of time specified in our 'dashTime' field (editable in the inspector).  In the loop, we use Vector3.Lerp to move the cameraRig from its startPoint to the endPoint (where the user aimed the laser).

Vector3.Lerp returns a point between two other Vector3s.  The third value (elapsedPct) determines how far between them it should be: a value of 0 is right at the start point, 1 is at the end, and 0.5 is right in the middle.

Once we leave the loop, we set the “Mask” parameter on our animator back to false and we’re done.

Back to the Inspector

Attach the DashController to the Controller (right)

Play & Test

You should be able to play now and dash around.  Put on your headset, aim the controller and click the trackpad!

The Mask – Reducing motion sickness

Dashing around usually feels okay for most people, but motion sickness is still possible.

To help remove some of the sickness, we can add a cool masking effect.  And if you have some artistic skill, you can make it look much better.

Download this DashMask.png image

Select the Camera (eye).  Right click and add a Quad.

Drag the DashMask.png image onto the quad, a new material will be created automatically.

Select the quad and change the material's render mode to "Fade".

Move and resize the quad to match this.  We need it close to our face, as it's going to be used to mask out the world.

Animating the Mask

If you play now, you'll have a mask sitting in front of your face (or not, depending on where you've dragged the alpha of the albedo color on that material).

What we want though is for the code above to be able to toggle the mask, with a smooth animated fade that we can customize later.

To do this, let’s create a simple animation for the alpha channel of our quad.

The Animation

Select the Quad

Open the Animation window.

Click create animation and name the animation FadeMask.

Click record.

Open the Color picker for the Albedo of the material.

Slide the alpha around and settle it at 0.

Drag the Animator red line to the 1:00 mark.

Open the color picker again.

Slide the alpha to the far right.

The Animator

The animator will control our animations.  It handles the state of our animations and does the transitions between them.

Create a new Animator Controller for the quad.

The Mask Parameter

In the Parameters section, add a new Boolean parameter and name it "Mask".

Adding the Animations

Drag the FadeMask animation you just made onto the Animator Controller.

Do it again to create a second FadeMask state, and rename that one to "FadeMask Reverse".

Add transitions between the two and use the "Mask" parameter to control them.

Set the FadeMask speed to 5.

Select the FadeMask Reverse state and change its speed to -1.

We're playing the animation in reverse here to fade the mask back out.  We could have a separate animation for this, but we don't need one when we can just set the speed to -1.

Attaching the Animator

Add an animator to the Quad.

Assign the new Animator Controller to your animator.

The last thing we need to do is assign the MaskAnimator field of the Dash Controller.

Select the controller, drag the quad onto it, and you're done.


The Result


Build a Unity multiplayer drawing game using UNET

In this article you’ll learn how to build the foundation for a multiplayer drawing game.  We’ll start with drawing on a texture and add some Unity UNET network components to make it multiplayer.  By the end, you’ll see other players drawing in real time and have a good foundation to build your own drawing game in Unity3D.

Requirements

To start, we'll need a couple of pieces of art.

The Canvas

First is our default/blank canvas.

For this example, I’m using a solid white texture that I put a little border around.

Download this canvas or create your own

Color Wheel

The other thing we need is a color wheel.

Download this wheel or find your own

Add both of those textures (or alternate ones you choose) to your project in a Textures folder.

Setup the Canvas

Create a new Quad.

Resize it so it looks similar to this.

Drag the canvas texture you’ve chosen onto the Quad.

Change the material shader on the quad to Unlit/Texture.

Make sure you select a resolution and don't use Free Aspect while you're setting this up.  I'm using 1920×1080.

Name it “Paint Canvas”.

Create a script named PaintCanvas.cs and replace the contents with this.

PaintCanvas Code

This script is used for two things.  First, it makes a copy of the texture in memory so we can easily read and write to it.  This texture is created in PrepareTemporaryTexture and assigned to the static property “Texture”.

The second thing it does is set the texture data from a byte array.  This is so we can ‘load’ a texture from the host when we join the game as a client.  You’ll see this later when we get into the networking code.
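The referenced listing isn't included in this text; below is a sketch matching the description above.  The GetAllTextureData helper is an inferred counterpart used later when the host sends the canvas, and details may differ from the original:

```csharp
using UnityEngine;

// Reconstructed sketch of the PaintCanvas described above.
public class PaintCanvas : MonoBehaviour
{
    // The in-memory, writable copy of the canvas texture.
    public static Texture2D Texture { get; private set; }

    private void Awake()
    {
        PrepareTemporaryTexture();
    }

    // Copy the material's texture into a Texture2D we can read and write.
    private void PrepareTemporaryTexture()
    {
        var renderer = GetComponent<Renderer>();
        var source = (Texture2D)renderer.material.mainTexture;
        Texture = Instantiate(source);
        renderer.material.mainTexture = Texture;
    }

    // Inferred helper: grab the raw pixels so the host can send them.
    public static byte[] GetAllTextureData()
    {
        return Texture.GetRawTextureData();
    }

    // Used to 'load' the host's canvas when a client joins the game.
    public static void SetAllTextureData(byte[] textureData)
    {
        Texture.LoadRawTextureData(textureData);
        Texture.Apply();
    }
}
```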

Add the new PaintCanvas component to the "Paint Canvas" Quad.

Back to the Inspector

Your PaintCanvas should look similar to this.

Color Picker

It's time to set up our color picker so the player can choose different colors easily.

Create a Quad

Name it “Color Picker”

Drag the ColorWheel texture onto the Quad.

Change the Shader to “Unlit/Transparent Cutout”

Create a new script named “ColorPicker.cs” and replace the contents with this.

Add the ColorPicker component to your “Color Picker” object in the inspector.

ColorPicker Code

This script starts off with a static "SelectedColor" property.  This is where we'll store the user's color selection when they click on the color picker.

But the work here is done in Update().  We check to see if the user has clicked this frame, and if so, we do a raycast from the click point into the scene.

If the ray intersects with something, and that something happens to be the color picker, then we pull the texture coordinate from the RaycastHit, multiply it by the height and width of the texture, and get the actual pixel on the texture that we clicked.

With that pixel, we use the Texture2D method GetPixel to get its color and store it in the "SelectedColor" property.

Finally, we set the material color of our "preview" object to the selected color.  This is where the user will see the color they've picked and have "selected".
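The listing itself isn't shown in this text; a sketch matching the description might look like this (field names are inferred):

```csharp
using UnityEngine;

// Reconstructed sketch of the ColorPicker described above.
public class ColorPicker : MonoBehaviour
{
    public static Color SelectedColor = Color.black;

    [SerializeField] private Renderer selectedColorPreview;

    private void Update()
    {
        if (!Input.GetMouseButtonDown(0))
            return;

        // Raycast from the click point into the scene.
        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        RaycastHit hit;
        if (!Physics.Raycast(ray, out hit) ||
            hit.collider.gameObject != gameObject)
            return;

        // Convert the UV hit coordinate into an actual pixel on the texture.
        var texture = (Texture2D)GetComponent<Renderer>().material.mainTexture;
        int x = (int)(hit.textureCoord.x * texture.width);
        int y = (int)(hit.textureCoord.y * texture.height);

        SelectedColor = texture.GetPixel(x, y);

        // Show the picked color on the preview quad.
        selectedColorPreview.material.color = SelectedColor;
    }
}
```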

Back to the Inspector

Your color picker should look like this.

Move the wheel so it looks like this.

Selected Color Preview

If you look at the Color Picker code or inspector, you may notice that we have a “Selected Color Preview” field.

Create another Quad and position it just below the wheel like this.

Create a new Material and name it "SelectedColor".

Assign the material to the "Selected Color Preview" Renderer.

Change the Shader to "Unlit/Color".

We don't want light affecting how our drawing appears; Unlit/Color makes that a non-issue.

It should look like this in the inspector.

Brush Size Slider

Let’s also add a brush size slider.

Create a new Slider from the GameObject->UI menu.

Move it so it looks like this.  (I also added a text object at the top, feel free to do that too if you want)

Create a new script named “BrushSizeSlider.cs” and replace the contents with this.

BrushSizeSlider Code

This class really just wraps the slider value so we don't have to reference it anywhere in the editor.  Because our player is network-instantiated, and we don't have any objects in the scene that would make sense referencing this slider, we're just putting the BrushSize into a static global int.  This isn't the best architecture, but we're not building a full game or editor, so it's a quick enough way to get things working.
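The listing isn't shown here; a sketch matching that description (the exact wiring to the Slider is inferred):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Wraps the UI slider's value in a static int so network-spawned players
// can read the brush size without any scene reference.
public class BrushSizeSlider : MonoBehaviour
{
    public static int BrushSize = 1;

    private void Awake()
    {
        // Keep the static value in sync whenever the slider moves.
        GetComponent<Slider>().onValueChanged.AddListener(
            value => BrushSize = (int)value);
    }
}
```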

Add the BrushSizeSlider component to your slider GameObject.

Your slider should look like this.

Save your work to a scene!

Player & Brush Time

Now that we have the board setup, it’s time to create the player & brush.

Create a new script and name it “PlayerBrush.cs“.

Replace the contents with this.

PlayerBrush Code

This is the most important part of the project.  Here, we're doing the drawing and the networking.  (Ideally we'd soon split this into two classes.)

The Start method is only called on the server/host because of the [Server] tag.  That's because the Start method gets the texture data and sends it to the clients.  This may be a little confusing, but remember that the "Player" will be spawned on the server, and this method is called on the player GameObject, but only on the host/server.

RpcSendFullTexture is the opposite.  It uses the [ClientRpc] tag because we only want that method called on the clients (the tag is also required to send the call to the client).  This method calls the SetAllTextureData method we covered earlier, setting the client's data to match the server's.

.Compress() & .Decompress() – These methods are extensions that you'll see in a bit.  They're used to compress the texture data into something much smaller for network transmission.  The compression here is extremely important; without it, this wouldn't work.

When you look at the Update method, it should feel familiar.  Like with the color picker, we’re checking for left mouse down, doing a raycast, and checking to see if the click was on what we wanted.  In this case, we’re looking to see if they have the mouse down on the PaintCanvas.

If they do, we get the texture coordinate (like before), get the pixel x & y, then send a command to the server using CmdBrushAreaWithColorOnServer.  We also call BrushAreaWithColor on the client.  (If we didn't, the pixel wouldn't change until the server got the message, handled it, and sent one back.  That would feel laggy and bad, so we need to call the method on the client immediately.)

CmdBrushAreaWithColorOnServer doesn’t really do any work itself.  It’s a [Command] so clients can call it on the server, but all it really does is tell the clients to brush the area using an Rpc.  It also tells the server/host to draw using BrushAreaWithColor.

BrushAreaWithColor is responsible for changing the pixels.  It loops over the brush size to get all the pixels surrounding the clicked one, in a simple square pattern.  If you wanted to change the brush to be circular or some other shape, this is the method you'd modify.
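The PlayerBrush listing isn't included in this text; below is a reconstruction based on the description above.  It assumes the PaintCanvas helpers and the Compress()/Decompress() extensions described elsewhere in this article, and the exact UNET details may differ from the original:

```csharp
using UnityEngine;
using UnityEngine.Networking;

// Reconstructed sketch of the PlayerBrush described above (UNET).
public class PlayerBrush : NetworkBehaviour
{
    [Server] // runs only on the host: push the current canvas to clients
    private void Start()
    {
        RpcSendFullTexture(PaintCanvas.GetAllTextureData().Compress());
    }

    [ClientRpc] // runs only on clients: load the host's canvas
    private void RpcSendFullTexture(byte[] compressedTexture)
    {
        PaintCanvas.SetAllTextureData(compressedTexture.Decompress());
    }

    private void Update()
    {
        if (!isLocalPlayer || !Input.GetMouseButton(0))
            return;

        // Raycast from the mouse and check we're pointing at the canvas.
        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        RaycastHit hit;
        if (!Physics.Raycast(ray, out hit) ||
            hit.collider.GetComponent<PaintCanvas>() == null)
            return;

        int x = (int)(hit.textureCoord.x * PaintCanvas.Texture.width);
        int y = (int)(hit.textureCoord.y * PaintCanvas.Texture.height);
        Color color = ColorPicker.SelectedColor;
        int size = BrushSizeSlider.BrushSize;

        // Paint locally right away so the stroke doesn't feel laggy,
        // then tell the server so everyone else sees it too.
        BrushAreaWithColor(x, y, color, size);
        CmdBrushAreaWithColorOnServer(x, y, color, size);
    }

    [Command] // clients call this on the server
    private void CmdBrushAreaWithColorOnServer(int x, int y, Color color, int size)
    {
        BrushAreaWithColor(x, y, color, size);
        RpcBrushArea(x, y, color, size);
    }

    [ClientRpc] // server tells every client to paint the same area
    private void RpcBrushArea(int x, int y, Color color, int size)
    {
        BrushAreaWithColor(x, y, color, size);
    }

    // Paint a simple square of pixels around the clicked point.
    private void BrushAreaWithColor(int x, int y, Color color, int size)
    {
        for (int i = -size; i <= size; i++)
            for (int j = -size; j <= size; j++)
                PaintCanvas.Texture.SetPixel(x + i, y + j, color);
        PaintCanvas.Texture.Apply();
    }
}
```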

Back to the Inspector

Create a new “Empty GameObject“.

Name it "Player".

Add a NetworkIdentity component to it & check the “Local Player Authority” box.

Add the "PlayerBrush" component.

Your Player object should look like this.

Drag the Player object into a folder named Prefabs.  This will make your player a prefab so we can instantiate it for network play.

Error! – An Extension Method!

The PlayerBrush script references some compression methods that we'll use to sync data to clients when they first connect.

To add these compression methods, let’s create a new script for our extension methods.  If you don’t know what extension methods are, you can read more about them here: Unity Extension Methods

Create a new script named "ByteArrayExtensions.cs" and replace the contents with this.

ByteArrayExtensions Code

I won't go into the details of how this compression works, but this is standard C# code for quick in-memory compression of data.  These are extension methods that 'extend' byte arrays; they're essentially just a cleaner way to write somearray.Compress() instead of ByteArrayExtensions.Compress(somearray).
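The listing isn't reproduced here; one standard approach matching the description uses GZipStream (the original may differ in details):

```csharp
using System.IO;
using System.IO.Compression;

// In-memory GZip compression, exposed as extension methods on byte[].
public static class ByteArrayExtensions
{
    public static byte[] Compress(this byte[] data)
    {
        using (var output = new MemoryStream())
        {
            using (var gzip = new GZipStream(output, CompressionMode.Compress))
                gzip.Write(data, 0, data.Length);
            return output.ToArray();
        }
    }

    public static byte[] Decompress(this byte[] data)
    {
        using (var input = new MemoryStream(data))
        using (var gzip = new GZipStream(input, CompressionMode.Decompress))
        using (var output = new MemoryStream())
        {
            // Manual copy loop keeps this compatible with older .NET profiles.
            var buffer = new byte[4096];
            int read;
            while ((read = gzip.Read(buffer, 0, buffer.Length)) > 0)
                output.Write(buffer, 0, read);
            return output.ToArray();
        }
    }
}
```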

The Networkmanager

The last thing we need is a NetworkManager.

Create a new gameobject and name it “[NetworkManager]“.

Add the NetworkManager component to it.

Add the NetworkManagerHUD component to it.

Expand the NetworkManager component.

Drag the "Player" prefab from your Prefabs folder (NOT the one in the scene view) onto the "Player Prefab" field of the component.

The work is done.  Save your scene!

Testing the game

Now that everything’s setup, it’s time to test.

To do this, create an executable build (make sure to add your scene to the build settings).

In one instance, start a LAN host; then the other can join.

Download

If you want to download the full working project sample, it’s available here: https://unity3dcollege.blob.core.windows.net/site/Downloads/MPPainting.zip

Online Multiplayer

If you want to go beyond LAN mode, you'll need to enable Multiplayer in your Services window.  It will take you to the Unity web page, where you'll need to set a maximum room size.  For my test I went with 8 players.

Conclusion & Notes

This project is meant only to be an example of UNET & drawing.  To keep things simple, I used quite a few static fields, which I'd highly recommend against in a real project of any size.



Unity Cinemachine – Unity 2017 – Auto Dolly Camera

By Jason Weimann / July 20, 2017

Unity Cinemachine is a new tool that makes camera management easy and awesome.  It's loaded with functionality to make good camera shots simple to set up in your Unity game or application.  I've only just started experimenting with it, and I'm extremely impressed by the quality and ease of use it provides.  If you want to make a cinematic, it's the perfect tool, but it's also perfect for most gameplay scenarios.  Today, I'll cover the Auto Dolly feature from Cinemachine that I'm using for a project I'm working on.  Auto Dolly mimics a real-life camera dolly with an operator keeping the best shot for you 'automatically'.

Unity Cinemachine Setup

The Cinemachine functionality is added through an asset pack available on the asset store here: https://www.assetstore.unity3d.com/en/#!/content/79898


Before Cinemachine

Here I have a simple scene with a character fake-skating around.  The camera's locked in place to cover the entire scene.  It 'works', but it's kinda lame.  Let's fix it up with Cinemachine & Auto Dolly!

Creating the Camera & Dolly

Cinemachine adds a menu to your toolbar.  Under that menu are a variety of different camera types you can setup.

Today, we’ll add the Dolly Camera with Track.

Clicking this button adds a few things to our scene AND modifies our camera.

The icon on the Main Camera indicates that it now has a Cinemachine Brain.  We won’t dig into the details of the brain in this post, but take a quick look at the inspector to get an idea of what it can do.

Positioning the Dolly Track

Before we can really see anything, we need to position the Dolly Track.  Here, I’ve moved into a spot where I feel I have a good view and repositioned the track.

Ctrl-Shift-F is the hotkey to reposition the selected object to your current scene view camera location and orientation.

Adding Waypoints

Next, we need to add a waypoint to the dolly.  This will be the far-left point for my example, but you can re-arrange waypoints at will, so there’s no need to create them in any specific order.

To add a waypoint, click the big button titled “Add a waypoint to the path”

Each waypoint has a position offset from the dolly track’s transform position.

For the first waypoint, I want to zero out the Z value so it lines up exactly with where the dolly track starts.

Look At Target

Let’s take a quick look at the CM vcam1 gameobject.

This is our virtual camera.  Cinemachine uses this object to determine where the camera should be and how it should act.  You can have multiple virtual cameras and do smooth transitions between them (that’s one of the great things about Cinemachine).

Today though, we’re going to stick with a single VCam, but allow it to move along the dolly path.

Before we go further with the dolly though, let’s make the VCam work as a stationary camera.

For this game, I want the camera focused on the player at all times.  To do that, we’ll set the “Look At” field on the Virtual Camera to the player model “Ace_01”.

What’s this look like?

As you can see, we already have a slightly more interesting camera.  It’s tracking the player and keeping them center frame.  The guides you see are provided by Cinemachine to give you an idea of what it’s doing (trying to keep the player in that middle box).  As you adjust properties on the VCam, you’ll see the guide visuals adjust as well.

Back to The Dolly Track

With the camera sitting in one place, we can keep an eye on the player, but he does get far away.  The Dolly system will fix this.

Adding a waypoint

For the Dolly to work, we need another waypoint.

To do this, I click the Add waypoint button and change the X position.

I’ve also repositioned my DollyTrack1 object so that the ends are pretty equidistant from the center of the ice lake.

IMPORTANT – FOLLOW TARGET

If we hit play now, we won’t notice a change.  The camera won’t move along the dolly because it doesn’t know it’s allowed to follow anything.

To fix this, I’ll select the Virtual Camera and assign “Ace_01” to the “Follow” field.

IMPORTANT – One more Checkbox

There’s one more thing we need to do: check the Auto Dolly Enabled box.

Check it out!

Here you can see what it looks like with the camera sliding along the dolly, keeping the player close and in-frame the entire time.

If you look at the Scene view, you can see the camera sliding along.

Let’s make it better with a TargetGroup!

Right now we have a great view of our player, but what about when you want to go after that puck?  Having it off-screen makes that kinda difficult.

No problem though, we can address this with the TargetGroup component.

Create a new gameobject and add the “Cinemachine TargetGroup” component

The TargetGroup allows us to specify a # of targets.  For this game, I want to add our player model and the puck.

We also want to give the player a bit more weight to keep them more in-frame the entire game.

Assigning the Target Group to the Virtual Camera

To switch to a group, we just replace the Aim and Follow field values with our new TargetGroup.
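The same wiring can also be done in code.  Here’s a minimal sketch assuming the Cinemachine asset-store package (namespace `Cinemachine`); the component name `TargetGroupSetup` and the serialized references are placeholders of my own, not from the original project:

```csharp
using UnityEngine;
using Cinemachine;

// Builds the target group and points the virtual camera at it on startup.
public class TargetGroupSetup : MonoBehaviour
{
    [SerializeField] private CinemachineVirtualCamera virtualCamera;
    [SerializeField] private CinemachineTargetGroup targetGroup;
    [SerializeField] private Transform player; // e.g. Ace_01
    [SerializeField] private Transform puck;

    private void Awake()
    {
        // A higher weight keeps the player more centered than the puck.
        targetGroup.AddMember(player, 2f, 1f);
        targetGroup.AddMember(puck, 1f, 0.5f);

        // Replace both Look At and Follow with the group, mirroring the inspector steps.
        virtualCamera.LookAt = targetGroup.transform;
        virtualCamera.Follow = targetGroup.transform;
    }
}
```

The weight and radius values are just starting points; tune them while watching the Game view guides.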

The Result

Conclusion

This example barely scratches the surface of Cinemachine but already shows how easy it is to get great camera effects with no code and very little work.  I plan to continue digging into Cinemachine and share as I do.  If you’re building a game in Unity 2017, I definitely recommend you give Cinemachine a try.


Create a VR Teleport system in your Unity game with the SteamVR Interaction System

By Jason Weimann / July 18, 2017

VR Teleportation is one of many great locomotion systems.  While it’s not ideal for every game, there are plenty where it’s the perfect movement method.  With SteamVR, teleportation is also the easiest locomotion method to get started with.  This article shows the basics to get you moving around with the VR Teleport method in just a couple of minutes.

Prerequisites

To use this locomotion method, you’ll need SteamVR and a compatible headset such as the HTC Vive or Oculus Rift.

You can download the SteamVR plugin from the asset store.

VR Teleportation Setup

For this teleportation example, we’ll be using the SteamVR Player prefab.

Find the player prefab in the SteamVR->InteractionSystem->Core->Prefabs folder.

Drag it to your scene.

Reset the transform to 0, 0, 0.

The Ground

Create a Plane (GameObject->3D Object->Plane)

Duplicate your plane and rename it “Teleport Area”.

In the Inspector tab, add the “Teleport Area” component to the Teleport Area plane.

Set the Y position of the “Teleport Area” gameobject to 0.01.  We want it just above the ground collider.
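If you’d rather script those steps, here’s a hedged sketch assuming the Interaction System’s `TeleportArea` component lives in the `Valve.VR.InteractionSystem` namespace (as it does in the SteamVR plugin versions I’ve used):

```csharp
using UnityEngine;
using Valve.VR.InteractionSystem;

// Creates the ground plane and a matching teleport surface just above it.
public class TeleportFloorSetup : MonoBehaviour
{
    private void Start()
    {
        GameObject ground = GameObject.CreatePrimitive(PrimitiveType.Plane);
        ground.name = "Ground";

        GameObject teleportPlane = GameObject.CreatePrimitive(PrimitiveType.Plane);
        teleportPlane.name = "Teleport Area";
        // Sit just above the ground collider, as described above.
        teleportPlane.transform.position = new Vector3(0f, 0.01f, 0f);
        teleportPlane.AddComponent<TeleportArea>();
    }
}
```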

The Teleporting Prefab

We’re almost done.. the final step is to add the Teleporting prefab to the scene.

Find the “Teleporting” prefab in SteamVR->InteractionSystem->Teleport->Prefabs.

Drag it into your scene.

Time to Teleport

That’s everything you need to get started.  Press play and start teleporting around.

Teleport Options

The Teleporting prefab provides a large selection of customization.  You can change colors, arc, materials, sfx, and more.

I’d recommend experimenting with the different settings to get more familiar with the SteamVR Teleportation system.

Teleport Point

Before we end, I want to mention the TeleportPoint prefab.  It’s in the same folder as the “Teleporting” prefab.

The TeleportPoint prefab is set up to create a “point”, as the name implies.  Instead of letting the player freely choose any area on a mesh, it can be used to teleport the player to very specific locations.

If your game or application has specific points you want the player at, take a look at this prefab.  (It also supports loading a new level/scene when the player teleports to it.)

Conclusions & Extension

The SteamVR Teleportation system is one of the easiest locomotion methods to get started with.  If you want quick locomotion, definitely try it out.

VR Course

Teleportation is just one of the locomotion systems covered in the Professional VR Developer course.  If you’re ready to start building a VR game of your own the Professional VR Developer course is my recommended way to get moving fast.

Use the code “TELEPORT” for a special discount.


Importing Real World Unity Terrain Heightmap for free with terrain.party

By Jason Weimann / July 17, 2017

Earlier today I learned about a great way to import real world terrain to Unity for free.  I’ve used Worldbuilder, Gaia, and other methods before, and they’re all great.  But none of them are as easy as the method I’m about to present.  That’s not to say this solution is ‘better’; the alternatives have some amazing functionality.  But if you just want to get some real world maps into your game, do it in 5 minutes, and do it for free, terrain.party may be right for you.

Terrain.Party

If you take a look at the site https://terrain.party you’ll quickly see that it’s a minimalist website, but with some awesome functionality.

Selecting an Area

While you’re there, you can scroll around the world to find the location you like, or click the search button and enter the place you’re interested in.

Choosing a Size

On the right hand side, you’ll see a few controls for zooming and exporting.

Use the zoom control to adjust how big of an area you want.

Take note of the size, you’ll want to use this as your reference when importing into Unity (you can re-scale, but it’s a good starting point to keep in mind).

Exporting the Heightmaps

Once you have a terrain area you like, click the Export button and save off your heightmaps.

Clicking the export button will download a zip file with multiple heightmaps.

Converting the PNG to DATA/RAW

Take the (Merged) one and open it in GIMP (or Photoshop, if you have it and prefer it).

Select File->Export As

Choose the type “.data” and click Export.
Keep note of where you export it; you’ll be importing it into Unity in a minute.

Creating the Unity Terrain and Importing the Heightmap

Back in Unity, create a new Terrain object (GameObject->3D Object->Terrain)

Select the Terrain Settings

Set the Terrain Width & Height values to match what you had in Terrain.Party.

Set the Height to 2000.

Importing

Click Import Raw…

Browse to your heightmap file and click Open.

The Result

That’s it, your heightmap should be imported!

Your terrain should look something like this (of course it will vary based on the location you picked)
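If you ever want to load the RAW file at runtime instead of through the Import Raw… dialog, a sketch like this can work.  It assumes a square, 16-bit little-endian file whose resolution matches the terrain’s heightmap resolution; the field names and path are placeholders of mine:

```csharp
using System.IO;
using UnityEngine;

// Loads a 16-bit RAW heightmap into a Terrain at runtime.
public class RawHeightmapLoader : MonoBehaviour
{
    [SerializeField] private Terrain terrain;
    [SerializeField] private string rawPath = "Heightmap.data"; // example path

    private void Start()
    {
        int resolution = terrain.terrainData.heightmapResolution;
        float[,] heights = new float[resolution, resolution];

        byte[] bytes = File.ReadAllBytes(rawPath);
        for (int y = 0; y < resolution; y++)
        {
            for (int x = 0; x < resolution; x++)
            {
                // Two bytes per sample, little-endian.
                int index = (y * resolution + x) * 2;
                ushort sample = (ushort)(bytes[index] | (bytes[index + 1] << 8));
                heights[y, x] = sample / 65535f; // normalize to the 0..1 range SetHeights expects
            }
        }

        terrain.terrainData.SetHeights(0, 0, heights);
    }
}
```

If your export is big-endian (GIMP lets you choose), swap the byte order in the `sample` line.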

Conclusions & Thx

I really like how easy it is to get a terrain pulled in with this method.  But I want to be sure to mention that the other alternatives, while not free, are great as well and provide a lot more in the way of options and functionality.  If you don’t need that functionality, or don’t want to spend money, this is definitely worth trying out.  And it only takes about 5 minutes…

I also wanted to give special thanks to Adam Smith for bringing this to my attention in our FB Group.


Using Unity3D Xml Files for game Data – Quiz Game Example

By Jason Weimann / July 14, 2017

Recently I shared a method for creating a QuizGame using the ScriptableObject functionality built into Unity3D.  Today, we’re going to modify that example to be a little more scalable for cases where you might want 1000’s of questions that aren’t all managed in the editor.  To accomplish this, we’ll use some Unity3D xml files and XmlSerializer.  By using xml files, we can easily swap or modify the data outside the engine, and if we pull the XML data from a web service, we could change the questions for our quiz game without any modification or update to the game itself.

Setup – Read Part 1

If you haven’t seen the previous post, take a look at it now.  It covers the basics for laying out a sample UI and how the game will work.

You can see that here: HowTo: Build a Unity3D Quiz Game – Like Trivia Crack / Heads Up

The differences

In the previous examples, our QuizQuestion class was actually pretty heavy.  Almost all of that was due to editor customizations that made it more streamlined to work with.

Since we’ll be serializing our QuizQuestions in xml now, we can go with a very simple class that only contains 4 auto properties.
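The original post shows this class as a screenshot.  Here’s a hedged reconstruction of what it might look like; the exact property names are my assumptions based on the part-1 description:

```csharp
// The slimmed-down question class for XML serialization.
// Four auto-properties; no ScriptableObject or editor code needed.
public class QuizQuestion
{
    public string Question { get; set; }
    public string[] Answers { get; set; }
    public int CorrectAnswer { get; set; } // 0-based index into Answers
    public bool Asked { get; set; }        // marks questions we've already used
}
```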


The QuizCollection

Because of how the previous sample is architected, we don’t need to change much to allow for XML serialization of our questions.  Other than the QuizQuestion class, we only need to adjust our collection.

Instead of using Resources.LoadAll, we’ll be using the XmlSerializer and StreamReader classes to do the work.

The XmlSerializer requires a type in its constructor.  Our data is an array of QuizQuestions, so we initialize it by passing “typeof(QuizQuestion[])” into the constructor.


In Awake, we check to see if a questions.xml file exists.  If not, we’ll write out a sample xml file that can be used as a starting point.

Then we load the questions..  everything else is the same as the previous example.
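Since the collection class is also shown as a screenshot, here’s a hedged sketch of what it might contain, assuming a QuizQuestion class with Question, Answers, and CorrectAnswer properties; the class name, file name, and location are my assumptions:

```csharp
using System.IO;
using System.Xml.Serialization;
using UnityEngine;

// Loads quiz questions from questions.xml, writing a sample file on first run.
public class QuestionCollection : MonoBehaviour
{
    private QuizQuestion[] questions;
    private readonly XmlSerializer serializer = new XmlSerializer(typeof(QuizQuestion[]));

    private string FilePath => Path.Combine(Application.persistentDataPath, "questions.xml");

    private void Awake()
    {
        // Write out a starting-point file if none exists yet.
        if (!File.Exists(FilePath))
            WriteSampleQuestions();

        using (var reader = new StreamReader(FilePath))
            questions = (QuizQuestion[])serializer.Deserialize(reader);
    }

    private void WriteSampleQuestions()
    {
        var sample = new[]
        {
            new QuizQuestion
            {
                Question = "Sample question?",
                Answers = new[] { "Yes", "No" },
                CorrectAnswer = 0
            }
        };

        using (var writer = new StreamWriter(FilePath))
            serializer.Serialize(writer, sample);
    }
}
```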

Separation of Concerns

The reason this change is so small is that in part one, we focused on keeping a clear separation of concerns.  Keeping things split up like this makes change easy and painless.

What about JSON?

There’s no doubt a good number of people will think “why use xml, what about json?”, “json is sooooo much better”, “etc”.

As a whole, I agree JSON is a better format.  It’s cleaner and removes a lot of verbose ceremony.

Unfortunately, I still really dislike the built-in JSON serializer for Unity.  If you do want to use JSON though, I’d recommend something like JSON.Net for Unity.  I’ve used it in the past and it was great.

For this example though, that seemed like overkill.  XML works fine here, and realistically you’ll probably want a web service providing the data anyway… for most services, switching between XML and JSON output is as simple as swapping a request header.

Conclusions & Download

This post is meant to answer a specific question from the Unity3D.College facebook group.  If you have a question of your own, join us and share 🙂
If you’d like to download the full project sample, it’s available here: https://unity3dcollege.blob.core.windows.net/site/Downloads/QuizGameXml.zip


HowTo: Build a Unity3D Quiz Game – Like Trivia Crack / Heads Up

By Jason Weimann / July 12, 2017

How To: Building a Unity3D Quiz Game like Trivia Crack or Heads Up

Quiz games like Trivia Crack are extremely popular because they’re fun, easy to play, and suitable for a wide range of people.  They’re also pretty easy to build in Unity3D.  Today, we’ll go over some of the basics for building a Trivia or Quiz game, with a full example you can try out yourself.  You’ll learn how to create questions, hook them up to a UI, and build the basics of your own Quiz game.  Get ready to build your own Unity3D quiz game!

The UI

The first thing we need is a UI.  Starting with a basic UI gives us a good idea of what we’re going to need code wise and how our game components will interact with each other.  We could do everything else first and build the UI last, but I really prefer to have some sort of outline for what I’m building before digging into the details too much.

Take a look at the Canvas.

Question Panel

The Question Panel is where we’ll hold the current question.  It’s just a panel with a text object under it.  The panel is there to provide a little visual separation, and the text is set to “New Text”.

Answers

Next, we have an answers panel area.  The answers are all buttons, and this panel is setup with a “Vertical Layout Group” to keep them stacked.

The answer buttons were created by adding a new UI->Button and changing the color.

Correct and Wrong Popups

These popups are panels with a text object under them.  Plain and simple, but ready for you to pretty up.

Time for Code

Now that we have an idea how the UI is going to look, it’s time to get some code written.

The first thing I wrote is a class for the questions.  Here, I decided to go with a ScriptableObject so we can edit the questions and answers right in the engine and visually manage them all.  If we got to a point where we have 100’s of questions though, I’d recommend switching to another format like json or xml and managing the questions with really good custom editors or an external tool.

QuizQuestion.cs

Code

The QuizQuestion class is primarily for storing data.  It has a “question” field, an array of “answers“, and an integer for the correct answer.

It’s important to note that the correctAnswer field uses a 0-based index.  That means if the first answer is correct, the value should be 0.  If the 3rd answer is correct, the value should be 2.  This matches up with our array indexes.

The Asked boolean is there to mark off questions as we ask them, so we don’t ask the same question twice (unless we run out of questions).

OnValidate()

Everything else in the class is there to rename the ScriptableObject.  Why?  Because it makes our questions easier to manage by giving a good view of the question and answer.
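Since the code appears as a screenshot in the original post, here’s a hedged reconstruction of what the class might look like.  The field names, menu path, and asset-naming scheme are my assumptions, and asset renaming from OnValidate can behave differently across Unity versions:

```csharp
using UnityEngine;
#if UNITY_EDITOR
using UnityEditor;
#endif

// Question data stored as a ScriptableObject so it can be managed in the editor.
[CreateAssetMenu(menuName = "Quiz/Question")]
public class QuizQuestion : ScriptableObject
{
    public string question;
    public string[] answers;
    public int correctAnswer;       // 0-based index into answers
    public bool Asked { get; set; } // runtime only, not serialized

    private void OnValidate()
    {
#if UNITY_EDITOR
        // Rename the asset so the question and correct answer show at a glance.
        if (answers == null || correctAnswer < 0 || correctAnswer >= answers.Length)
            return;

        string desiredName = question + " : " + answers[correctAnswer];
        string path = AssetDatabase.GetAssetPath(this);
        if (!string.IsNullOrEmpty(path) && name != desiredName)
            AssetDatabase.RenameAsset(path, desiredName);
#endif
    }
}
```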

Here’s what that looks like:

UI Controller.cs

Code

The UI Controller has ONE purpose: control the UI.

Here, we reference the different UI components like the QuestionText & Answer Buttons.

Then we provide a public method to setup the UI for a question.  This way, our QuizController (coming later), doesn’t need to worry about the presentation details and can stay small.

One thing we do in the SetupUIForQuestion method that’s of interest is disable any answer buttons beyond the # of answers we have available.  The current UI supports 6 answers, but we could add more buttons and it would ‘just work’.

We also handle submitted answers here.  Not the logic for if they’re correct, but the UI actions based on their correctness.

Architecture Note: The primary goal of this class is to keep UI logic OUT of the QuizController and in a very specific place for the UI.
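The UI Controller code is shown as a screenshot in the original.  Here’s a sketch of what it might contain based on the description above; the field and method names are my assumptions:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Owns all UI references; no game logic lives here.
public class UIController : MonoBehaviour
{
    [SerializeField] private Text questionText;
    [SerializeField] private Button[] answerButtons;
    [SerializeField] private GameObject correctPopup;
    [SerializeField] private GameObject wrongPopup;

    public void SetupUIForQuestion(QuizQuestion question)
    {
        correctPopup.SetActive(false);
        wrongPopup.SetActive(false);
        questionText.text = question.question;

        // Disable any buttons beyond the number of available answers,
        // so adding more buttons to the canvas 'just works'.
        for (int i = 0; i < answerButtons.Length; i++)
        {
            bool used = i < question.answers.Length;
            answerButtons[i].gameObject.SetActive(used);
            if (used)
                answerButtons[i].GetComponentInChildren<Text>().text = question.answers[i];
        }
    }

    // Presentation only -- correctness is decided by the QuizController.
    public void ShowAnswerResult(bool wasCorrect)
    {
        correctPopup.SetActive(wasCorrect);
        wrongPopup.SetActive(!wasCorrect);
    }
}
```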

Question Collection.cs

Code

The QuestionCollection class has the job of loading and providing questions.  In the LoadAllQuestions method, we grab every QuizQuestion in the “Questions” folder.

It provides one public method GetUnaskedQuestion, which as the name implies, will return an unasked question.
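A hedged sketch of what that class might look like, assuming the QuizQuestion assets live under a Resources/Questions folder as the post describes:

```csharp
using System.Linq;
using UnityEngine;

// Loads every QuizQuestion asset from Resources/Questions and hands out unasked ones.
public class QuestionCollection : MonoBehaviour
{
    private QuizQuestion[] allQuestions;

    private void Awake()
    {
        allQuestions = Resources.LoadAll<QuizQuestion>("Questions");
    }

    public QuizQuestion GetUnaskedQuestion()
    {
        // If everything has been asked, reset so questions can repeat.
        if (allQuestions.All(q => q.Asked))
            foreach (var q in allQuestions)
                q.Asked = false;

        var question = allQuestions
            .Where(q => !q.Asked)
            .OrderBy(_ => Random.value) // pick an unasked question at random
            .First();

        question.Asked = true;
        return question;
    }
}
```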

Quiz Controller.cs

Code

Because we’ve split up responsibilities, our QuizController class is nice and succinct.

Its primary purpose is to handle the state of our game.

The PresentQuestion method asks the questionCollection for a question.

Then it tells the UI Controller to update for the question.

When a button is clicked, SubmitAnswer is called and we check to see if the answer is correct.

We tell the UI Controller if the player was correct or not, then wait a few seconds and present the next question.
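Mirroring that description, a sketch of the controller might look like this; the member names and the delay length are my assumptions:

```csharp
using System.Collections;
using UnityEngine;

// Coordinates the game: pulls questions, checks answers, drives the UI.
public class QuizController : MonoBehaviour
{
    [SerializeField] private QuestionCollection questionCollection;
    [SerializeField] private UIController uiController;

    private QuizQuestion currentQuestion;

    private void Start() => PresentQuestion();

    private void PresentQuestion()
    {
        currentQuestion = questionCollection.GetUnaskedQuestion();
        uiController.SetupUIForQuestion(currentQuestion);
    }

    // Wired to the answer buttons, each passing its 0-based index.
    public void SubmitAnswer(int answerIndex)
    {
        bool isCorrect = answerIndex == currentQuestion.correctAnswer;
        uiController.ShowAnswerResult(isCorrect);
        StartCoroutine(PresentNextQuestionAfterDelay());
    }

    private IEnumerator PresentNextQuestionAfterDelay()
    {
        yield return new WaitForSeconds(2f); // give the player time to see the result
        PresentQuestion();
    }
}
```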

Button Events

The Quiz Controller’s SubmitAnswer() method still needs to be called by something.  I mentioned previously that it’s called by buttons, so let’s take a look at how that’s setup.

Each button should pass in the value for its index.  We could add some code to auto-configure all this, but since we’re only gonna have 6 buttons, it’s easier to just put the numbers in manually.

Remember to start with 0 for the first button, and keep them in order.

Conclusions, thoughts, and extensions

This project shows how to build a basic quiz game.. but more importantly, it should serve as a reminder to split up your classes and work towards SRP.  By keeping things separated, we’re able to easily understand what everything is doing and extending the project is much easier.

One thing that’s important to point out though is that keeping questions in scriptableobjects may not scale.  If you want to have 1000’s of questions, you’re gonna want to build a new QuestionCollection that gets data from XML, JSON, or some other data source (you’ll need to remove the ScriptableObject stuff from QuizQuestion too).  Again, since everything is split out, that’s a pretty easy task.

This post came from a question in the Unity3d.College facebook group, where we talk about problems people present and work together to give good solutions.

Project Download

Want the source?  You can get it here: https://unity3dcollegedownloads.blob.core.windows.net/vrgames/QuizGame.zip

Part 2 – XML Loading

This article is continued in part 2 here: https://unity3d.college/2017/07/14/using-unity3d-xml-files-game-data-quiz-game-example/

Part 2 shows how to convert the data source from ScriptableObjects to something a little more scalable with XML or JSON files.


Oculus Rift + Touch or HTC Vive – Which should I get? – Rift vs Vive – UPDATED FOR NEW PRICING

Oculus Rift with Touch vs HTC Vive – The Differences

With the flurry of price drops from Oculus, it seems like this is a good topic to revisit.  If you’ve read my blog before, you may know that in general, I prefer the Vive for my personal use and development.  I do own an Oculus Rift and love it though.  If you don’t own either, deciding which to get can be confusing, with arguments from people on both sides about why one is better than the other.  So here, instead of personal preferences, I’ll try to share the actual differences I’ve experienced owning both for over a year.

The Elephant – Room Scale

From the start, Oculus said their system was for seated experiences.  Then it switched to standing, and eventually ‘full roomscale support’.  The Vive on the other hand was designed from the ground up to work great with room scale setups.

And make no mistake, there is a difference between the two.  I’ve used the Rift with touch controllers for 100’s of hours, with 2 sensors and with 3.  In general, it ‘works’, but it does have some drawbacks.

Here are the biggest differences with roomscale:

Rift

  • Cables are SHORT, so if your PC isn’t in an ideal spot, you’ll need active USB extension cables (around $20 on monoprice.com).
  • The HMD cables are short too.. I don’t know why, but there is definitely less slack with the Oculus cables.
  • Mounting is harder.  They’re not built for wall mounting first, so you need to get your own mounting solution OR put them on a flat surface where they won’t get bumped or moved.  For my home setup, this is nearly impossible, but in the right office environment it’s not too bad.
  • Tracking isn’t quite as good.  It drops out occasionally (even with 3), but not often enough to be a huge deal.

Vive

  • Lighthouse setup is simple if you have a screwdriver and a wall with power somewhere near.
  • HMD cable is long, so long it sometimes hangs and gets tangled.  It’s still much better to have a long cable though.

Winner: Vive

Visual Quality

Personally, I can’t tell any real difference between the two visually.  My eyesight is good (20/10), and they both look about the same.  I do notice the lenses a bit on them both, but neither bothers me.

Winner: Draw


Comfort

Here the Rift still wins out.  Even with the deluxe audio strap, the Rift is lighter and a bit more comfortable to wear.  That’s not to say the Vive is uncomfortable, but even a small weight difference can become noticeable if you wear it long enough (unless you happen to have giant neck muscles).

If you do go with the Vive though, the deluxe audio strap does help with weight distribution and comfort, especially for people like me with giant over-sized heads.

Winner: Rift


Controllers

With controllers, it’s kind of a toss up depending on the type of games you like to play.

A lot of people love the Oculus controllers, for good reason.  They’re pretty natural feeling and the touch detection is awesome.  For games where your controller represents your hand, the touch controllers are hands down the best choice.

When it comes to games where you hold a gun, paddle, bat, or something else though, I prefer the Vive wands.  They just feel a bit more like I’m holding the actual object.  And the new knuckles controllers look really promising, they could end up better than the touch controllers, but I can’t say for sure until I try them.

Winner: Draw


Content

In general, the content for the two devices is very similar.  Almost every game can be played across both devices, and there’s no killer game that would lead me to buy one over the other.  While Oculus does have a few exclusives, there are ways around that, and they’re not enough to sway me.

Winner: Draw


Development

This is for the programmers and game devs.  If you’re wondering how hard it is to build a game for one system or the other… it’s about the same.  Both devices work great with SteamVR, though for the Rift, you do need to strip out SteamVR and use the native SDK.  That’s the only real difference though.  They are both a pleasure to develop for and I’d call this one a draw.

Winner: Draw


Compatibility

This is something people often overlook.  Both of these devices have the same GPU requirements, but they do differ on other requirements.  The Rift requires 3-4 compatible USB 3.0 ports.  I say compatible because many of the USB 3.0 ports I’ve tried on different systems do not work.  In-fact there’s a list of compatible USB 3.0 addon cards that you can buy if your onboard ones won’t work.

I’ve also had a few laptops that wouldn’t work with the Rift (USB compatibility or no video), while I’ve never had any problems with the Vive.

The Vive does require a single USB 2.0 port though.  It usually works fine in 3.0 ports too, but I have at least one system where the 3.0 ports are not compatible with the Vive.  Luckily there are plenty of spare 2.0’s left.

Winner: Vive


Price

The massive shift in pricing is what drove me to re-consider my thoughts on this debate.  If you haven’t seen it already, the Rift has had a price drop to $399 with touch controllers INCLUDED!

That’s an amazing deal, and on prime day they even had a $100 amazon gift card included (looks like they killed that though).

If you want to do roomscale with the Rift though, you’ll wanna include another $99 for a 3rd sensor (and spare controllers).

The Vive also recently got a price drop.

On top of that, the deluxe audio strap is another $99, and knuckles controllers aren’t gonna be free either.

But when you add in the cost of the 3rd sensor, they’re really not far apart anymore.  It’s about a $100 difference, which in my book makes it a draw.

Winner: Draw – (UPDATED WITH NEW VIVE PRICING)


Recommendation

If you go through the differences above, you’ll see it’s pretty even.  Rift wins in some places, the Vive wins in others.  This may leave you wondering, which is right for me?

First, make sure to check compatibility.  If you only have a laptop, make sure that laptop has a GPU that can handle it, AND the USB 3.0 ports for the Rift.  If you don’t have those ports, the Vive is definitely your choice.

Next, think hard about room scale.  Is it really important to you?  Are you going to play most of your games sitting down on a controller or standing in place?  How much room do you really have to play in?
If room scale isn’t your thing, the Rift is an obvious winner.

If you do have the room though, and really want to have experiences that span a bigger area, the Vive is a better choice.

Of course the price difference can also be a factor.  If you’re just looking to get into VR, your system will work with the Rift, and money is a limited resource, take advantage of the Rift price drop and get started today.

As an Amazon Associate I earn from qualifying purchases.


VR Movement – Ironman / Jetpack Flying with SteamVR

There are a lot of interesting movement systems for VR.  Arm swinging, dashing, teleportation, trackpad, and more.  In this guide, I’ll show you another one I experimented with previously and really enjoyed.  We’ll go over how to make a player fly like Ironman, or at least like a person holding some jetpacks.

Prerequisites

This article assumes you have SteamVR in your project, though converting it to OVR native should be a simple task.

Rigidbody on [CameraRig]

Start by adding the [CameraRig] prefab from the SteamVR package to your scene.

Add a Rigidbody component to the camerarig.

Freeze the rotation values.

Locking rotation is very important.  If you don’t, your player will end up spinning around, flipping over, and getting sick.

Now, add a box collider to the rig so you can land (this could be a capsule or any other collider that works well for your game).

Controller Setup

Expand the CameraRig and select both controllers.

Add the SteamVR_TrackedController component to them both.

JetpackBasic Script

Create a new script, name it “JetpackBasic”

Replace the contents with this.
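The script itself appears as a screenshot in the original post.  Here’s a hedged sketch of what a JetpackBasic script might look like, using the SteamVR_TrackedController component from the SteamVR plugin; the thrust value and exact member names are my assumptions:

```csharp
using UnityEngine;

// Applies thrust along the controller's aim while the trigger is held.
// Attach to each controller; assumes a Rigidbody on the [CameraRig] parent.
[RequireComponent(typeof(SteamVR_TrackedController))]
public class JetpackBasic : MonoBehaviour
{
    [SerializeField] private float thrust = 8f;

    private SteamVR_TrackedController controller;
    private Rigidbody rigidBody;

    private void Start()
    {
        controller = GetComponent<SteamVR_TrackedController>();
        rigidBody = GetComponentInParent<Rigidbody>();
    }

    private void FixedUpdate()
    {
        // Aim the controller where you want to go and hold the trigger to fly.
        if (controller.triggerPressed)
            rigidBody.AddForce(transform.forward * thrust);
    }
}
```

Because both controllers add force to the same Rigidbody, pulling both triggers flies you faster; tune `thrust` to taste.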

Your controller should look similar to this: (the sphere collider and audio source are optional and beyond the scope of this article)

Press Play

Aim the controllers where you want to go.

Pull the trigger.

Fly!
