Author's archive: Jason Weimann

Taking high resolution and 360 screenshots in Unity with Ansel

By Jason Weimann / May 3, 2017

Taking HD Screenshots in Unity

One of my favorite sessions from Vision Summit 2017 was presented by NVIDIA about VRWorks and Ansel.  I was really impressed with the ability to create high definition screenshots in Unity using Ansel.  It seemed amazingly simple to set up, and the example images I saw were breathtaking.  The ability to also take 360 degree screenshots AND 360 degree 3D screenshots for VR games set it over the top though.  So when I got home, I started playing with it and wanted to share the results.

Requirements & Setup

Ansel is an NVIDIA tool, so it only works for people with an NVIDIA card.

If you have one though, setup is simple.

Drivers

First, make sure you have drivers that support it.

If your driver version is 368.81 or higher, you should be good.

You can get the most recent NVIDIA drivers here: http://www.nvidia.com/Download/index.aspx

Asset Pack

Once you have them installed, you’ll need to add the plugin from the asset store.

The Script

Once you have the asset pack imported, you simply need to add the Ansel script to your camera.

If you’re doing this for VR using the SteamVR CameraRig prefab, make sure to attach it to the child camera (eyes), not the parent.

Not Allowed – The Command Line to enable it

Before you can start taking screenshots though, you’ll need to run the following command to ‘whitelist’ everything (unless you have your app whitelisted).

NvCameraEnable.exe whitelisting-everything

You can find the executable in your NVIDIA install folder.

Mine is in C:\Program Files\NVIDIA Corporation\ansel\Tools

Activating & Taking Screenshots

To activate Ansel, create a build, run your build, then press Alt-F2.

You’ll have a nice dialog appear with a variety of options for taking and adjusting your screenshots.

Movement

While you’re taking screenshots, you can move the camera around!

Use W,A,S,D & X,Z to move, and hold left mouse to rotate.

Results

I wanted to share an image of the super sized screenshots.

Click on the image to view the full size version.

VR Issues

My main use for Ansel is to take great screenshots of VR games.

Unfortunately, it’s a bit more complex than I’d expected, with a few issues that popped up for me.

Luckily, those can be overcome pretty easily.

  1. Single Pass Rendering – This is not supported at the time of this writing.  It does appear to be coming soon, but from what I’ve seen, screenshots with SPS enabled do not appear to work properly.
  2. VR Support & Camera Movement – If you have “VR Supported” checked in player settings, you won’t be able to move the camera around.  In my case, I was able to un-check the option and still take my screenshots, though I’d really prefer being able to move it without breaking VR mode.

Overall Conclusion

I really like Ansel and what they’re doing to make 360 & VR media easier to share.  Right now, it’s a bit rough, but I fully anticipate that to be resolved with the 2017.1 version.  NVIDIA has some great engineers and Ansel is just a small part of what they’re providing on top of all the awesome hardware (VRWorks looks amazing).


Unity WebGL Deployment

By Jason Weimann / April 27, 2017

So you’ve built a WebGL game and you want to share it…  It’s time to find some Unity WebGL hosting.  Deploying a WebGL game to Dropbox or Google Drive used to be an easy option, but if you’ve done some research, you’ll see that those options are dead.  Luckily, there’s still a great, easy, and FREE way to host your WebGL deployment.

Introducing Azure for Unity WebGL Hosting

Setting up this deployment can be done in under 10 minutes (even faster if you already have an Azure account set up).  Once it’s set up, deployment is as simple as an FTP copy (drag / drop the folder over).

Creating the account

If you don’t already have an Azure account, create one here: https://azure.microsoft.com/en-us/free/

Once your account is ready, sign into the dashboard.

Creating the website

Click on New

Select Web App from the list

Give your site a name

Choose a resource group (or create one by typing in a new name)

Keeping it free

Unless you’re ready to pay for the hosting (and have a need for it), you’ll want to set your service plan to free.

Click on the App Service plan link from the overview window

Select “Scale Up”

Choose the “Free” tier and click select.

Setting up FTP Credentials

Once you’ve created the website (it’ll take a few minutes to be ‘ready’), you’ll need to set up the FTP deployment credentials.

Select “Deployment Credentials”

Choose a username and a password.

Click Save

FTP Setup

If you have an FTP client you already use and like, go ahead and stick with that.

If not, I’d recommend FileZilla – https://filezilla-project.org/

With whatever FTP program you use, you’ll need the FTP hostname and your deployment username.

You can see these on the overview page.

Notice that the username is prefixed with the website name and a backslash.  Mine is “webgldemo2\webgldemodeployer”

Enter your FTP credentials and connect.

Uploading via FTP

Upload all the files into the wwwroot directory.

Just drag them over (if you’re using something like FileZilla).

IMPORTANT – Adding the config

You may have noticed there’s a web.config file in that folder.

Without that file, your game won’t load correctly.

To make it simple, you can download that web.config file here: https://unity3dcollege.blob.core.windows.net/site/Downloads/web.config

Conclusion

That’s all there is to it.  Now that you’ve deployed your game, you can play it by loading the webpage.

To find your web address just take another look at the overview page.


Unity GPU Instancing

By Jason Weimann / April 25, 2017

Unity 5.6 is available now and has some great performance improvements you can take advantage of.  Today I’m starting a new series covering some of these enhancements and the performance benefits you can gain.  These simple tips will keep your FPS high and your players happy.  This post will cover the new Unity GPU Instancing system.

Introducing Unity GPU Instancing

GPU instancing allows you to render a large number of copies of the same mesh/material in just a few draw calls.

While it was possible to get instancing working partially in 5.5, full support is now available in 5.6.

Now, the standard shaders all support GPU instancing with the check of a box.

How do I use it?

Enabling Instancing is actually very simple in most cases.  If you’re using the Standard shader, look to the bottom and check the “Enable Instancing” box.

That’s all you need to do!

If you want instancing for other shaders, you can check out more details on how to set that up here: https://docs.unity3d.com/Manual/GPUInstancing.html
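If you’re creating materials at runtime, you can also flip the same checkbox from code via Material.enableInstancing. A minimal sketch; the component and field names here are just for illustration:

```csharp
using UnityEngine;

public class EnableInstancingAtRuntime : MonoBehaviour
{
    [SerializeField] Material asteroidMaterial; // any material using an instancing-capable shader

    void Awake()
    {
        // Equivalent to ticking the "Enable Instancing" box on the material in the Inspector.
        asteroidMaterial.enableInstancing = true;
    }
}
```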

When should I use it?

It’s important to note that GPU instancing doesn’t work on skinned mesh renderers, so you can’t just drop it on any character and see the gains.

Where you’ll see the most improvement is with a large number of objects which share a mesh but have some variety in scale.

For my testing, I set up an asteroid field with 1000 asteroids from one of the free space packs on the Asset Store.

I generated them with random locations and scales, then compared the draw data.

Here’s the script I used to generate the asteroids:
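The original script isn’t reproduced here, so below is a minimal reconstruction of what’s described above: 1000 asteroids spawned at random positions and scales. The field radius and scale range are my assumptions:

```csharp
using UnityEngine;

public class AsteroidFieldGenerator : MonoBehaviour
{
    [SerializeField] GameObject asteroidPrefab;
    [SerializeField] int count = 1000;
    [SerializeField] float fieldRadius = 100f; // assumed spread of the field

    void Start()
    {
        for (int i = 0; i < count; i++)
        {
            // Random location and rotation inside the field
            var asteroid = Instantiate(asteroidPrefab,
                Random.insideUnitSphere * fieldRadius,
                Random.rotation);

            // Random uniform scale - the variety that defeats dynamic batching
            asteroid.transform.localScale = Vector3.one * Random.Range(0.5f, 3f);
        }
    }
}
```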

The Results

Instancing Off

Instancing On

That’s right, the draw calls drop from 4165 to 11!  It’s a huge difference.

What about dynamic batching?

You may be wondering how this is different from dynamic batching.

There are 2 things that really stand out to me.

#1. Dynamic batching requires the objects to be the same scale.  In many cases that won’t matter, but for things like asteroids, trees, or other objects where you want scale variety, instancing comes to the rescue.

#2. GPU instancing is done on the GPU while dynamic batching is on the CPU.  Moving this load over to the GPU frees up CPU time and makes for more efficient use of the system overall.

Conclusion

Instancing is here, it’s great, and you should definitely start checking that box whenever you’re gonna have many instances of your mesh in the scene.  It’s an easy to use optimization and may be one of my favorite Unity 5.6 changes.


360 Video in Unity 5.6

Unity 5.6 introduced a bunch of great new features, including a new video player.  If you tried playing video in Unity 5.5, you know it’s a bit of a pain and generally requires an expensive plugin. I tried doing some Unity 360 video projects in the past and ran into nothing but pain.  I’m happy to say that the latest version makes it much easier to render 360 video in Unity.  This guide should be applicable to just about any setup.

Note on 3D 360

This post covers 360 video but not stereo 3D video.

If you’re looking to play 3D 360 Video check out this post: https://unity3d.college/2017/07/31/how-to-play-stereoscopic-3d-360-video-in-vr-with-unity3d/

How to get started

Before you can render 360 video, you’ll need a surface to render it on.  Typically, we use a sphere with inverted normals (the sphere renders on the inside only).

You can create your own, search online, or download this one here: https://unity3dcollegedownloads.blob.core.windows.net/vrgames/invertedsphere.fbx

To create an inverted sphere, follow the steps outlined here – https://unity3d.college/2017/05/15/building-an-interactive-mobile-360-video-player-gearvr/

Setup

Create a new scene

Drag your inverted sphere to the Hierarchy.

Reset its position to (0, 0, 0), if it’s not already.

Set the sphere’s scale to (-3000, 3000, 3000).

If you’re using a different sphere than the one provided above, you may need a different scale value.

If the scale’s X value weren’t set negative here, the video would appear backward.

Reset your camera’s position to (0, 0, 0)

Select a 360 video to play.

You can shoot your own or download one..

I used this video to test – https://www.youtube.com/watch?v=a5goYOaPzAo, but most 360 videos should work.

If you aren’t sure how to download videos from youtube, this is the site I use to do it: http://keepvid.com/

Drag the video onto the sphere in the scene view.

Select the Inverted Sphere

You should see the Video Player component added to the sphere.

You’ll also see the Video Clip assigned with your 360 video.

Because Play on Awake is selected, you can just hit play now and watch your video start playing.

Play Controls

Ready to add some basic controls to the player?

Let’s add keyboard controls to pause and play the video.

Create a script named VideoPlayerInput.cs

Replace the contents with this:
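The original listing isn’t included here; a minimal sketch matching the behavior described below (the Space key toggles pause / play on the sphere’s VideoPlayer) could look like this:

```csharp
using UnityEngine;
using UnityEngine.Video;

[RequireComponent(typeof(VideoPlayer))]
public class VideoPlayerInput : MonoBehaviour
{
    VideoPlayer videoPlayer;

    void Awake()
    {
        // The Video Player component added when you dragged the clip onto the sphere
        videoPlayer = GetComponent<VideoPlayer>();
    }

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space) == false)
            return;

        if (videoPlayer.isPlaying)
            videoPlayer.Pause();
        else
            videoPlayer.Play();
    }
}
```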

Add the VideoPlayerInput component to the Sphere.

Press Play

Use the “Space” key to pause and resume playing of your video.

Conclusions

With Unity 5.6, playing video has become simple.  You can play 2D video, 360 video, and even render video on any material you want (we used an inverted sphere, but you could attach it to a fire truck or whatever other model you pick really).

Try out all the new functionality, and if you build something interesting, share it below!


Add Unity Analytics to your Unity3D game today!

By Jason Weimann / April 9, 2017

Should you use Unity Analytics?

Definitely!!!  If you’re building a game and expect to have anyone playing it, there’s really no good excuse not to use the Unity Analytics system.

The Unity Analytics system is free, easy to integrate, and if used properly can provide great insight into what your players are doing and how to make your game better.

With even the most basic Unity Analytics setup, you can track player growth and retention, with nothing more than the flip of a switch.

If you decide to go further though, the possibilities are amazing.

How do I enable Unity Analytics?

To enable the analytics system, you need to open the Services Tab

If you’re not logged in, you’ll see the Sign in button.

Sign into your Unity account.

Next you’ll see a screen to create your project, just click the Create button.

Click the button on Analytics to toggle it from Off to ON

Click the “Enable Analytics” button

If your game is targeted at kids, check the box.

Then click the Continue button

That’s it, basic analytics are enabled.  See how simple that was..

How Can I test it?

The first thing you can do is hit Play.

When you do, you should see data appear in the Validator (still on that analytics/services tab)

Checking out your data on the webpage

You can visit the analytics site here: https://analytics.cloud.unity3d.com

When you do, you’ll see your project but no data.

This is because there’s a delay in the data processing.  In my experience it can range anywhere from a few hours to a day.

So don’t worry, your data is probably getting written, just check back later to view it.

Custom Unity Analytics Data

What we have so far will give you some really useful info like Daily Active Users (DAU), Monthly Active Users (MAU), retention, session count, and more.

But to get the most out of the system, you really want to track actionable items.

Custom data is easy to track too; it can be as simple as a single call to Analytics.CustomEvent, or you can build a dictionary of data and send it all at once.

Personally, I like to create a new class to handle wrapping all my analytics calls so that they’re not scattered throughout the project.

Let’s take a look at a sample of how I prefer to set up mine.
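The original sample isn’t shown here, so the sketch below reconstructs it from the description that follows. LevelController and PlayerController (and their events) are stand-ins for your own game code, not Unity APIs:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Analytics;

// Stand-ins for your own game code, assumed to raise these events:
public static class LevelController
{
    public static event System.Action<string> OnLevelBeaten;
}

public static class PlayerController
{
    public static event System.Action<Vector3, string> OnPlayerDied;
}

public class GameAnalytics : MonoBehaviour
{
    void Awake()
    {
        // Keep this object alive across scene loads (it lives in the opening scene).
        DontDestroyOnLoad(gameObject);

        LevelController.OnLevelBeaten += HandleLevelBeaten;
        PlayerController.OnPlayerDied += HandlePlayerDied;
    }

    void HandleLevelBeaten(string levelName)
    {
        // Simple string event - easy to build funnels on in the Analytics web UI.
        Analytics.CustomEvent("LevelBeaten_" + levelName);
    }

    void HandlePlayerDied(Vector3 position, string killedBy)
    {
        // Extra detail goes into a dictionary for analysis later.
        Analytics.CustomEvent("PlayerDied", new Dictionary<string, object>
        {
            { "position", position },
            { "killedBy", killedBy }
        });
    }

    void OnDestroy()
    {
        // Force pending events up to the servers before the game closes.
        Analytics.FlushEvents();
    }
}
```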

I start by preventing the object from getting destroyed when we switch scenes (I always include this object in my opening/loading/menu scene).

Then I’ll register for events on things I care about.  In this example, I’m registering with our level controller and player controller events.

Level Events

The LevelController event will happen whenever a player beats a level, and we simply send an event with a string that appends the levelName.

Since this data is pretty simple, I prefer to have it in an easy to use format that I can build funnels on in the Analytics webpage UI.

Player Events

The OnPlayerDied event has a bit of extra info I want though.  In there, I track the position the player died at and what killed them.

I then put that data into a dictionary and pass that dictionary into the CustomEvent for analysis later.

Analytics.FlushEvents

The last thing to notice is in the OnDestroy method.  Here, I force the analytics system to push the data up to the servers.

To minimize networking data, the system determines when to send updates on its own.

If we don’t call FlushEvents on close, we may not get the player’s analytics data until the next time they launch the game.

If they never launch it again, we’ll never get the data, and never have any insight into why they didn’t come back.

Conclusion

If you’re not using analytics, just turn it on…  Even if you’re not sure what data you want to capture, or what you’ll do with it…. just turn it on.

And once you have some idea of even the basic things you may care about, where’d my players die, how many levels are they beating, etc… you can start tracking that data easily.

Maybe you’ll find out “NOBODY is beating level 3!”, or “Nobody’s activating this special ability”…  then you can adjust level 3, change the instructions or controls for the special ability, and compare.

Or alternatively you can make wild guesses about what your players are doing and what they like…. but I think you’ll prefer getting the data 🙂


HTC Vive Tracker Unity3D / SteamVR Setup

Friday, my Vive trackers finally arrived.  When I opened them, I wasn’t sure what to expect.. would they be cool, could I do something fun with them, or will they just sit on a shelf?  After just one day, I have the answer.. they’re Awesome!

So far, I’ve barely played with them, but I’ve already put them on my feet, stomped some civilians and kicked some buildings..

And now I’ve strapped one on to a tennis racket and dropped my wife onto a baseball field to smack tennis balls.

How do you use them?

Since they don’t really require any special code to work, I wanted to give a quick demo of how to make them work in Unity with the SteamVR plugin.

Camera Rig Setup

The default camera rig gives you a left and right controller, and works fine as a starting point for many VR projects.

To add trackers, create new children under the [CameraRig] with the context menu.

I used cubes to start and scaled them to (0.1, 0.1, 0.1).

Once you create the objects, add a SteamVR_TrackedObject script to them.

Now select the [CameraRig] prefab and add the tracker(s) to the Objects array.

Turn on all the controllers & tracker(s).

I’ve noticed that the trackers don’t show up if both controllers aren’t on.  I haven’t yet dug into why, so if you already know, drop a comment at the bottom and share please 🙂

Hit Play.

That’s it… they’ll just work and track from there.
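If you want to sanity-check the tracking from code, here’s a tiny sketch (it assumes the SteamVR plugin’s SteamVR_TrackedObject component you added above):

```csharp
using UnityEngine;

public class TrackerDebug : MonoBehaviour
{
    SteamVR_TrackedObject trackedObject;

    void Awake()
    {
        trackedObject = GetComponent<SteamVR_TrackedObject>();
    }

    void Update()
    {
        // isValid stays false until SteamVR assigns the device and it starts tracking.
        if (trackedObject.isValid)
            Debug.LogFormat("Device {0} at {1}", trackedObject.index, transform.position);
    }
}
```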

The Tennis Racket

One thing to note when you’re setting these trackers up is the position of the tracker on the object.

Unless you’re holding the tracker, it’s going to be offset a bit, and you need to take that into account when positioning things.

For example, with my tennis racket, the tracker is attached to the strings like this.

In-game, my tracker object has the racket as a child, and the object is lined up so that the racket’s pivot under the tracker matches closely with where the tracker is placed on the actual racket.

Conclusions & Other Games

I have to say I really love these things.  They’re pretty light, they track great just like the controllers, and for some reason I feel more comfortable attaching them to things.. (even though they’re almost the same price).

If you’re unsure about getting one, I’d say do it… they’re well worth the cost IMO even if you’re just playing around with them in the editor.

When it comes to in-game support, I think it’ll be a bit before they’re common, but I do expect to see games start adding capabilities over the next few months, I know I really want to put support in a few games myself.

In conclusion though, I’m excited to see what people come up with, and to experiment and make fun things..  If you happen to have some ideas, please share below or drop me an email.


Contest Winners!

By Jason Weimann / March 7, 2017

Many weeks ago, I sent out an open invitation for members of the site to enter a special contest.

The rules were simple: create a quick game with Unity, submit a video or playable version, and try to use an asset from the contest’s sponsor, BitGem.

The response for this contest was great, we had a ton of awesome submissions come flying in from all around the world.

Today, I’m going to reveal the winners to the public and give you a quick glimpse of some of the projects that were created.

First Place “ImpBall” – $100 Cash

DOWNLOAD (51MB)

The first place winner came from Adrian Higareda.

ImpBall blew me away.  It’s a really neat little game with a lot of character to it.  Even the UI was really well done, and the game is just fun.  It’s even got good audio..  all around it’s a very awesome submission and worth trying out.

Second Place “Ad Noctis” – $100 BitGem Voucher

DOWNLOAD (42MB)

Ad Noctis surprised me.  Submitted by Javier Alvear, it starts out with a cool cut scene (done in-game), then drops into a boss fight.

You play as an archer battling a boss monster built from a chest full of weapons.

The boss has special moves, charging and swinging, and is difficult to beat…

Overall, it’s a great experience and a really awesome submission.

Javier also has a facebook page for his team here – https://www.facebook.com/KiltroGameStudio/?fref=ts

Third Place “Skelinvasion”

DOWNLOAD (21MB)

And rounding off an extremely close 3rd place is “Skelinvasion”.  It felt really polished, and one of my fellow reviewers described it as “Like Gandalf meets Legolas”.

Skelinvasion is, of course, a game where you fight off an invasion of skeletons.  Created by Miguel Martorell, it’s a fun 3rd person experience well worth checking out.

He’s even created a nice youtube video to show off some of the gameplay.

Honorable Mentions

The top 3 winners were great, but there were so many great submissions that at the very least deserve some mention and applause.

Qutopia

A very cool Augmented Reality experience

BowmanVSZombies

Arrows and waves of zombies, you’re the bowman trying to stop the horde!

Skeletons and Treasure

This was a great Android submission and is even available on the google play store.

ARTest

Another augmented reality submission that used multiple devices.  Definitely some potential here.

BenBones

I loved the name of this one and the game.  An endless runner with tons of levels and a load of fun.  If there was a 4th place, this probably would have taken it.

StopThem

The second Android game in our mentions, a cool endless wave game with nice tap inputs.

More Contests?

I really enjoyed this experience, the submissions were amazing.

Since the contest ended, I’ve had a few people ask me if there’d be another contest anytime soon.

If you’re interested in being a part of one of these, or just want to see others submissions, maybe even with some public voting, drop a comment at the end to let me know.

The more comments that appear, the sooner I’ll kick off contest #2 🙂


Pooled Decals for Bullet Holes

The Bullet Decal Pool System

Today’s article came out of necessity.  As you probably know, I’m wrapping up my long awaited VR Course, and one of the last things I needed to create is a decal setup for the game built in it.  To do decals properly, you’d want a full-fledged decal system, but for this course and post, we have a system that does exactly what we need and no more.

What is that?  Well, it’s a system to create a bullet hole decal where you shoot.  And to do it without creating and destroying a bunch of things at runtime.

It’s worth noting that I wrote this system for a VR game, but it is completely applicable to a normal game. This would work in a 3D game, mobile game, or anything else that needs basic decals.

The end result will look something like this

How do we build it?

Let’s take a look at the code
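The listing itself isn’t reproduced here, so this is a reconstruction assembled from the breakdown that follows (the class name is my choice, and the editor-only resize code is omitted):

```csharp
using System.Collections.Generic;
using UnityEngine;

public class BulletDecalPool : MonoBehaviour
{
    [SerializeField] GameObject bulletHoleDecalPrefab;
    [SerializeField] int maxConcurrentDecals = 10;

    Queue<GameObject> decalsInPool;           // decals ready to be placed
    Queue<GameObject> decalsActiveInWorld;    // decals already placed, oldest first

    void Awake()
    {
        InitializeDecals();
    }

    void InitializeDecals()
    {
        decalsInPool = new Queue<GameObject>();
        decalsActiveInWorld = new Queue<GameObject>();

        for (int i = 0; i < maxConcurrentDecals; i++)
            InstantiateDecal();
    }

    void InstantiateDecal()
    {
        var decal = Instantiate(bulletHoleDecalPrefab);
        decal.transform.SetParent(transform);
        decal.SetActive(false);
        decalsInPool.Enqueue(decal);
    }

    GameObject GetNextAvailableDecal()
    {
        if (decalsInPool.Count > 0)
            return decalsInPool.Dequeue();

        // Pool is empty - recycle the oldest decal that's active in the world.
        return decalsActiveInWorld.Dequeue();
    }

    public void SpawnDecal(RaycastHit hit)
    {
        var decal = GetNextAvailableDecal();
        if (decal == null)
            return;

        decal.transform.position = hit.point;
        decal.transform.rotation = Quaternion.LookRotation(hit.normal);
        decal.SetActive(true);

        decalsActiveInWorld.Enqueue(decal);
    }
}
```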

Code Breakdown

Serialized Fields

We open with 2 serialized fields.

bulletHoleDecalPrefab – The first determines which decal prefab we’ll use.  If you’re building a more generic decal system, you may want to re-name this.  Because it’s part of a VR course, I left the name as is, but if I were putting this in another game, it’d likely be more generic or maybe even an array that’s randomly chosen from.

maxConcurrentDecals – This sets the maximum number of decals the system will show.  We do this primarily for performance, but also to avoid visual cluttering.  Having too many decals could cause a hit on rendering, remember each one is a transparent quad.  This number is variable in the editor though, so you can adjust it as you see fit for your game.

Private Fields

We have two private fields in this class.  They’re both using the Queue type to keep a first in first out collection of decals.

decalsInPool – This is where we’ll store the decals that are available and ready to be placed.

decalsActiveInWorld – These are the decals that we’ve placed in the world.  As our pool runs empty, we’ll start grabbing decals from here instead.

Awake

Calls our InitializeDecals() method.

Private Methods

InitializeDecals() – This is our setup.  Here, we create our queues, then we use a loop to create our initial pooled decals.

InstantiateDecal() – Here we do the actual creation of a single decal.  This is only called by InitializeDecals & a special editor only Update you’ll see soon.

GetNextAvailableDecal() – This method gets the next available decal… useful description, eh?  It actually just looks at the pool; if there’s at least one decal in it, the method returns the first one in the queue.  If there’s no decal in the pool, it returns the oldest decal that’s active in the world.

Public Methods

SpawnDecal(RaycastHit hit) – This is our only public method, it’s the one thing this class is responsible for doing.  In the code that calls it, we’re doing a raycast to determine where our bullet hits.  The raycast returns a raycasthit and we pass it into this method as the only parameter.

The method uses GetNextAvailableDecal() and assuming a decal is available, it places that decal at the raycasthit.point, adjusts the rotation to the raycasthit.normal, and sets the decal to active.  The method ends by adding the decal to the decalsActiveInWorld queue.

#if UNITY_EDITOR

Everything else in this class is actually wrapped to only run in the editor.

This code has a single purpose, to update our queue size at runtime.

It’s absolutely not necessary for your decal system, but it’s a nice little thing I enjoy having 🙂

I won’t cover each method, but you should play with the queue size at run-time and watch as it keeps everything in sync.


Oculus Haptic Feedback

Why you need it and how to get started quickly

Vive & Oculus haptic feedback systems are extremely important and often overlooked…  On a pretty regular basis, they get skipped or added in at the last second (I’m guilty of that myself).

Adding just a little haptic feedback in the right places gives a huge boost to immersion, and leaving it out gives the feeling that something just isn’t quite right.

Why’s it hard?

SteamVR haptics on touch controllers don’t work at all…. (at least at the time of this writing)

Recently, Oculus changed their haptic system, breaking my old code when I upgraded the SDK….

So I’ve written a wrapper.  It handles haptics for either system, and makes it extremely easy.

For this post, I won’t dig into the entire cross platform wrapper, but will instead give you the basics to get started with Oculus haptics in a quick and easy way.

Prerequisites (Oculus)

For this to work, you’ll need the Oculus Utilities for Unity in your project.

https://developer3.oculus.com/downloads/game-engines/1.11.0/Oculus_Utilities_for_Unity_5/

Shared Code

In the more complete version of my projects, I have quite a bit of shared code and swap between implementations with #defines.

For this simplified sample though, we still have one piece of shared code.  That code is an enum which specifies how strong the feedback vibration effect should be.

The Code (Oculus)

The Oculus code is designed to bypass the need for custom audioclips.  While the clip system is pretty cool and could do quite a bit, it’s not cross platform, and much harder to get started with in my opinion.

In the code, we generate 3 OVRHapticsClips named clipLight, clipMedium, & clipHard.  As you may have guessed, these correspond to our enum values.
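The listing isn’t shown here; a sketch matching that description follows. The enum is the shared code mentioned earlier, while the class name, sample counts, and byte strengths are my assumptions:

```csharp
using UnityEngine;

// Shared code: how strong the feedback vibration should be.
public enum VibrationForce { Light, Medium, Hard }

public class OculusHaptics : MonoBehaviour
{
    [SerializeField] OVRInput.Controller controllerMask; // L Touch or R Touch

    OVRHapticsClip clipLight;
    OVRHapticsClip clipMedium;
    OVRHapticsClip clipHard;

    void Start()
    {
        // Build the three clips from raw samples, bypassing custom audio clips.
        clipLight = CreateClip(45);
        clipMedium = CreateClip(100);
        clipHard = CreateClip(255);
    }

    OVRHapticsClip CreateClip(byte strength)
    {
        var clip = new OVRHapticsClip();
        for (int i = 0; i < 10; i++)   // sample count is an assumption
            clip.WriteSample(strength);
        return clip;
    }

    public void Vibrate(VibrationForce force)
    {
        var channel = controllerMask == OVRInput.Controller.LTouch
            ? OVRHaptics.LeftChannel
            : OVRHaptics.RightChannel;

        switch (force)
        {
            case VibrationForce.Light: channel.Preempt(clipLight); break;
            case VibrationForce.Medium: channel.Preempt(clipMedium); break;
            case VibrationForce.Hard: channel.Preempt(clipHard); break;
        }
    }
}
```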

Use (Oculus)

To use the haptics code, add the script to your controller object.

Assign the controller mask as L Touch or R Touch (whichever matches the controller it’s attached to).

Then call the Vibrate method when you want it to shake.

Demo (Oculus)

If you’re not quite sure how you’d use the code, here’s a real quick sample.  Ideally, you’d have something per controller, managing events for the controller, like a generic VR Controller script that ties these all together and works cross platform.

But to get you started quickly, here’s a simple sample that will vibrate the controllers when you press space.  (just remember to assign the 2 controllers)
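The demo script isn’t included here either; a sketch, assuming the haptics script above is a class named OculusHaptics with a public Vibrate(VibrationForce) method:

```csharp
using UnityEngine;

public class HapticsDemo : MonoBehaviour
{
    // Assign the two controller instances in the Inspector.
    [SerializeField] OculusHaptics leftController;
    [SerializeField] OculusHaptics rightController;

    void Update()
    {
        // Vibrate both controllers when Space is pressed.
        if (Input.GetKeyDown(KeyCode.Space))
        {
            leftController.Vibrate(VibrationForce.Medium);
            rightController.Vibrate(VibrationForce.Medium);
        }
    }
}
```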

What about the Vive?

I’ll cover Vive haptics soon, likely with a more generic implementation that you can use across both platforms.

If you’re really interested in Vive development though, I’m working on a quick weekend long guide that covers everything you need to know to get started, including haptics.  Just sign up (at the top of this post) and I’ll get you the info as soon as it’s ready.


GIT for Unity Developers – Remotes

By Jason Weimann / February 7, 2017

What’s a Remote?

So far, everything we’ve done is on your local computer.

We haven’t added any remote servers, and if your computer gets stolen, your hard drive dies, or some other disaster strikes, you’ll wish your project was backed up somewhere else.

Adding a remote server also allows you to collaborate with others, but even if you don’t plan to do that, the benefits for yourself are more than worthwhile.

Adding a Remote

First, you’ll need to create an account on one of the big remote git hosting sites.

You’re welcome to use whichever you like.  I use BitBucket, so I’ll guide you through setting one up there.

Create an account on https://bitbucket.org or log into your existing one.  (Sourcetree and BitBucket use the same credentials)

Once you’re logged in, you’ll be at your “Dashboard”

From there, click the repositories menu, then “Create Repository”.

Give the repository a proper name and click “Create repository”.

Once the repository is created, you’ll be redirected to the project “overview” page.

On there, you’ll see a URL that looks something like this, but with your username.

Select the text and copy it.  This has the full URI for your repository remote.

For example, mine is: https://jasonweimann@bitbucket.org/jasonweimann/git-branching.git

Back to SourceTree

Now that you have your remote URI, switch back to SourceTree, then select Repository->Repository Settings…

Click the “Add” button to add a remote.

Check the “Default Remote” box.

Paste your URI from the clipboard into the “URL / Path” field.

Click OK.

When you’re prompted to log in, use the same credentials you used for the bitbucket account.

Pushing to the Remote

Now it’s time to push.

The Push command tells GIT to send your current branch to the remote repository.

Click the “Push” button.

Click the checkbox for “master”, then click OK.

You’ll see the progress bar for a second or two.

If all goes well, you’ll be back in your “master” branch view.

Take note of 2 new things though.

On the left, under “Remotes” we now have “origin”.

And in the commit history, we can see that the “origin/master” branch is set to the same commit as our local “master” branch.

This means that our branches match, which we’d expect since we just pushed our master branch to the remote “origin”.
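For reference, here’s the same add-remote-and-push flow on the git command line. This is a sketch using throwaway local paths in place of a Bitbucket URL:

```shell
set -e
rm -rf /tmp/demo-remote.git /tmp/demo-local

# A bare repository standing in for the Bitbucket remote
git init --bare /tmp/demo-remote.git

# A working repository with a single commit on "master"
git init /tmp/demo-local
git -C /tmp/demo-local symbolic-ref HEAD refs/heads/master
git -C /tmp/demo-local -c user.name=demo -c user.email=demo@example.com \
    commit --allow-empty -m "initial commit"

# "Add" the remote, then "Push" the master branch, as in the SourceTree dialogs
git -C /tmp/demo-local remote add origin /tmp/demo-remote.git
git -C /tmp/demo-local push -u origin master
```

After the push, origin/master points at the same commit as your local master, just like in SourceTree’s history view.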

Pulling from Remotes

Pushing your work to a remote server is great, but if you don’t know how to pull from the server there’s no value.

Of course like most things in GIT, it’s not too hard of a process, but there are some intricacies that are important to understand.

To try this out, let’s clone the repository into a new folder, from the remote.

Click the “Clone / New” button.

You’ll be presented with a dialog to enter your Source URL & Destination path.

Paste the URL that you used for the remote origin into the Source URL field.

For the destination path, make sure the directory is DIFFERENT from the directory you’re working in.  In my screenshot, you can see I’ve placed it into a folder named “gitpull”.

Click the Clone button.

Wait for the clone progress window to finish.

When it finishes, you’ll be greeted with something that looks very familiar.

Why does this look exactly the same?

What we’ve done here is take our remote repository and clone it to our local system in a brand new folder.

This is the same process that another person would follow to clone your repository and work with you.

I want to be clear though that we’re doing this for learning purposes only, there’s no good reason I can think of for you to clone your own repositories in more than one place on a system.

If you want to work on a desktop and a laptop though, or some other multi computer setup, this is exactly the process you’d follow (though you could have the same local directory / Destination Path since they’d be on different computers).
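The clone-then-pull flow looks like this on the git command line (again with throwaway local paths standing in for the Bitbucket URL):

```shell
set -e
rm -rf /tmp/pull-remote.git /tmp/pull-src /tmp/gitpull

# Stand-in remote with one commit (with Bitbucket this is your repository URL)
git init --bare /tmp/pull-remote.git
git --git-dir=/tmp/pull-remote.git symbolic-ref HEAD refs/heads/master
git clone /tmp/pull-remote.git /tmp/pull-src
git -C /tmp/pull-src symbolic-ref HEAD refs/heads/master
git -C /tmp/pull-src -c user.name=demo -c user.email=demo@example.com \
    commit --allow-empty -m "initial commit"
git -C /tmp/pull-src push origin master

# "Clone / New": clone the remote into a second, DIFFERENT folder
git clone /tmp/pull-remote.git /tmp/gitpull

# Later, a pull fetches and merges any new commits from origin
git -C /tmp/gitpull pull
```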

Let’s Push and Pull again!

Switch to the Git Branching repository by clicking on the tab.

All of your recently opened repositories show up as tabs in SourceTree.  If you don’t see the one you want, they’re also bookmarked on the left; double click a bookmark to open it as a tab.

Select the master branch.  It should look like this.

Back to Unity

Let’s jump back over to Unity now.

Right click on the TallBox and create a new Cube as a child.

Adjust the transform values so it looks something like this.

Apply the prefab.

Save the project.

Before we go back to GIT

Let’s open another instance of Unity.

Open the project located in the folder you did the pull to.

Mine was c:\git\gitpull.

It should look just like your project did before we edited the prefab.

Back to SourceTree!

Okay it’s time to commit, push, and pull.

If your working copy looks like this, go back to Unity and make sure you saved.

Hit the Stage All button.

Notice that in my unstaged files, I had the TestScene, but when I hit Stage All, I only have the prefab.

This is because the Scene isn’t actually changed.  The timestamp on it updated when I saved it earlier so it showed up in the Unstaged area.

But the contents didn’t change at all since we only modified the prefab, so when we go to stage, it does a comparison and realizes it doesn’t need to be committed.
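You can see the same behavior with plain git. Touching a tracked file updates its timestamp, but git compares content, so nothing shows up as modified — a small sketch with a stand-in scene file:

```shell
set -e
TMP=$(mktemp -d)
git init -q "$TMP/repo"
cd "$TMP/repo"
git config user.email you@example.com
git config user.name "You"
echo "scene data" > TestScene.unity
git add TestScene.unity
git commit -q -m "Add scene"

# Saving in Unity rewrites the file and updates its timestamp,
# but here the bytes are identical to what's already committed.
touch TestScene.unity

# Git compares contents, not timestamps: prints nothing to stage.
git status --porcelain
```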

Enter a commit message “Added top to the TallBox” and click Commit.

Now push the commit to your remote.

You’ll see a dialog like this again, click Push.

Now select the tab of your gitpull repository.

Nothing changed??

It probably looks the same; that’s just because the view hasn’t refreshed.

Click the Fetch button to get a refreshed view of the repository and its remotes.

You’ll get this dialog, leave the default and just click ‘OK’.

It’s changed! The remote is ahead

Now when you look at your master branch, you’ll see something that could be confusing.  I know it confused me at first…

What this image is conveying is that the “origin/master” branch on the remote is ahead of my local “master” branch.

The local master branch is even labeled “1 behind”, meaning there’s 1 commit on the remote that we don’t have locally.
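The same "1 behind" state is easy to reproduce with plain git. This sketch uses two temporary clones standing in for your two project folders; the fetch at the end updates origin/master without touching your local master:

```shell
set -e
TMP=$(mktemp -d)
git init --bare -q "$TMP/origin.git"
git -C "$TMP/origin.git" symbolic-ref HEAD refs/heads/master

# Original working copy with one pushed commit.
git init -q "$TMP/main"
cd "$TMP/main"
git config user.email you@example.com
git config user.name "You"
git checkout -q -b master
echo "box" > TallBox.prefab
git add . && git commit -q -m "Initial commit"
git remote add origin "$TMP/origin.git"
git push -q -u origin master

# Second copy, like the gitpull folder in the article.
git clone -q "$TMP/origin.git" "$TMP/gitpull"

# Push one more commit from the original copy...
echo "box with top" > TallBox.prefab
git commit -qam "Added top to the TallBox"
git push -q

# ...then fetch (not pull) in the second copy.
git -C "$TMP/gitpull" fetch -q

# Reports: behind 'origin/master' by 1 commit.
git -C "$TMP/gitpull" status
```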

 

Don’t Pull yet

We could hit pull and get our local master up to the same commit as the remote, but before we do that, let’s reproduce a VERY common issue that people get caught up on every day.

This next part needs to be done in the second instance of Unity (the one opened from the ‘gitpull’ folder), not the primary one.

Go back to Unity

Make sure you’re in the ‘gitpull’ project.

To make things easier, I’d recommend opening both projects in separate Unity instances if you can.  If you’re resource limited though, you can just open the projects as needed.

Open the TestScene.

In the Project View, create a new folder and name it “Materials”.

In the Materials folder, create a new material, then name it “Blue”.

Select the “Blue” material and change the color in the inspector.

Assign the blue material to the tallbox prefab.

Click Apply to update the prefab.

Save your project.

Save the scene.

Back to SourceTree

In the ‘gitpull’ repository, click pull.

Click ‘OK’.

An ERROR!

You should have received this error message.

It’s telling you that you can’t pull because you have local changes that would be overwritten if you did.
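Here's a self-contained reproduction of that failure with plain git: a pushed remote commit plus an uncommitted local edit to the same file (all the repositories below are temporary stand-ins):

```shell
set -e
TMP=$(mktemp -d)
git init --bare -q "$TMP/origin.git"
git -C "$TMP/origin.git" symbolic-ref HEAD refs/heads/master

# Original working copy with one pushed commit.
git init -q "$TMP/main"
cd "$TMP/main"
git config user.email you@example.com
git config user.name "You"
git checkout -q -b master
echo "color: white" > TallBox.prefab
git add . && git commit -q -m "Initial commit"
git remote add origin "$TMP/origin.git"
git push -q -u origin master

# Second copy, like the gitpull folder.
git clone -q "$TMP/origin.git" "$TMP/gitpull"

# Remote side: commit and push a change to the prefab.
echo "color: white, top: yes" > TallBox.prefab
git commit -qam "Added top to the TallBox"
git push -q

# Local side: an UNCOMMITTED edit to the same file, then a pull.
# Git refuses: "Your local changes ... would be overwritten by merge."
echo "color: blue" > "$TMP/gitpull/TallBox.prefab"
git -C "$TMP/gitpull" pull || echo "pull refused, local edit kept"
```

Note that the pull still fetched origin/master; only the merge step was refused, so your local edit is untouched.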

Let’s look at those local changes.

Go to your “Working Copy”

Here, you’ll see we do have changes, and since TallBox.prefab was modified in the most recent commit, it definitely would conflict.

Let’s see how to resolve that!

Stage your changes.

Commit your changes in the working copy.

Switch back to the “master” branch view.

Pull again

Success!!!

But what’s this??  “Uncommitted changes”??

What’s happened here is GIT has automatically created a merge for you.  It knows that there were changes to the TallBox.prefab file from a remote, and that you had changes to it locally.

Now it’s up to you to decide how to resolve it.

Go back to the “Working Copy”

Great, in this instance there’s no conflict!  We’ve done a successful merge, and because the properties we changed were different, everything worked automatically!

It’s worth noting that you’ll often run into a conflict.  In that case, it’s up to you to decide which version to take.  In some instances you can combine the changes by hand and get something that works, but with prefabs it’s often easier to pick one side and re-create the other change than to merge the file manually.
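The full commit / push / pull / auto-merge cycle can be sketched with plain git too. A plain text file stands in for the prefab here; the filler "x" lines keep the remote edit (top of the file) and the local edit (bottom) far enough apart for git to merge them without a conflict:

```shell
set -e
TMP=$(mktemp -d)
git init --bare -q "$TMP/origin.git"
git -C "$TMP/origin.git" symbolic-ref HEAD refs/heads/master

git init -q "$TMP/main"
cd "$TMP/main"
git config user.email you@example.com
git config user.name "You"
git checkout -q -b master
printf 'size: 1\nx\nx\nx\nx\nx\ncolor: white\n' > TallBox.prefab
git add . && git commit -q -m "Initial commit"
git remote add origin "$TMP/origin.git"
git push -q -u origin master

git clone -q "$TMP/origin.git" "$TMP/gitpull"
git -C "$TMP/gitpull" config user.email you@example.com
git -C "$TMP/gitpull" config user.name "You"

# Remote commit changes the TOP of the file (the new cube)...
printf 'size: 2\nx\nx\nx\nx\nx\ncolor: white\n' > TallBox.prefab
git commit -qam "Added top to the TallBox"
git push -q

# ...local commit changes the BOTTOM (the blue material).
cd "$TMP/gitpull"
printf 'size: 1\nx\nx\nx\nx\nx\ncolor: blue\n' > TallBox.prefab
git commit -qam "Made the TallBox blue"

# Pull with an explicit merge (newer git requires you to pick
# merge vs rebase when branches diverge); the edits don't
# overlap, so git merges them automatically.
git pull --no-rebase --no-edit
cat TallBox.prefab
```

After the pull, the file contains both changes — "size: 2" from the remote and "color: blue" from the local commit — which is exactly the automatic merge SourceTree showed.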

Click Commit

Go back to Unity

Check out the merged prefab!

Notes

This guide has taken you through a lot, and if you aren’t really comfortable with this process yet, try repeating it 2 or 3 times.  By then, if I haven’t totally messed this up, it should be a bit more apparent how things are merged, how remotes work, and how you can get some real value out of git.

Again, I’d really recommend you practice with this simple project before trying it on your real projects.  Here, if something goes wrong, you can just start over; in a real project, a mistake could cost you hours of cleanup 🙂

If there’s demand for more on GIT, I’ll continue on with this series.  There are still plenty of topics to cover… branching, more in-depth merging, stashing, etc…

If you’d like more GIT tutorials, just drop a comment and let me know.

Thanks!
