Thursday, February 28, 2013

Kinect + Unity 3D Game Development - The Great Task List

As I mentioned before, I am a VERY novice game developer. And as such, there are so many things I need to do and learn in order to make progress that it's getting really difficult to keep track of them all. And so, I believe it's high time I start a good ol' fashioned Task List.

Below you'll find an ever-growing list of tasks that I've either completed or need to complete in order to finish my first Virtual Reality / Kinect game. While this is really my blogging version of thinking out loud, I do hope it ends up helping people who are in a similar position to mine as I start writing this.

TO DO:

1. Hook up the Kinect and settle on an SDK which binds it to Unity 3D - DONE. Here I explain why I chose what I chose.

2. Understand avatar tracking, and successfully create a trackable avatar of my own

3. Create a split-screen view: on the top of the screen I want to see the avatar from a distance, and on the bottom of the screen I want to see a first-person view

4. Make it so that when the first-person camera moves using the mouse, the avatar's head actually moves accordingly.

5. Ensure that the head movements are realistic.

6. Can the head be moved using the Kinect? Try and see.

7. Make another avatar which faces the first avatar, but about 10 feet away (maybe more)

8. Animate that avatar doing a throwing motion on key press.
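For items 3 and 4 above, here's a minimal Unity C# sketch of how I imagine they could work: two cameras sharing the screen via their viewport rects, with mouse input rotating both the first-person camera and the avatar's head bone. This is just my own rough plan, not code from any Kinect SDK; the field names (overview camera, head bone, etc.) are placeholders I'd wire up in the Inspector.

```csharp
using UnityEngine;

// Attach to any GameObject in the scene; assign the two cameras and the
// avatar's head joint in the Inspector. All names here are my own
// placeholders, not from Zigfu, Omek, or the official Kinect SDK.
public class SplitScreenHeadLook : MonoBehaviour
{
    public Camera overviewCamera;    // distant, third-person view of the avatar
    public Camera firstPersonCamera; // mounted at the avatar's eyes
    public Transform headBone;       // the avatar's head joint
    public float sensitivity = 2f;

    float yaw, pitch;

    void Start()
    {
        // Viewport rects are in normalized screen coordinates:
        // top half shows the avatar from a distance...
        overviewCamera.rect = new Rect(0f, 0.5f, 1f, 0.5f);
        // ...bottom half shows the first-person view.
        firstPersonCamera.rect = new Rect(0f, 0f, 1f, 0.5f);
    }

    void Update()
    {
        // Accumulate mouse movement into yaw/pitch angles (item 4),
        // clamping pitch so the head can't spin unrealistically (item 5).
        yaw += Input.GetAxis("Mouse X") * sensitivity;
        pitch = Mathf.Clamp(pitch - Input.GetAxis("Mouse Y") * sensitivity, -60f, 60f);

        // Rotate both the first-person camera and the avatar's head bone,
        // so the avatar seen in the top view follows the player's gaze.
        Quaternion look = Quaternion.Euler(pitch, yaw, 0f);
        firstPersonCamera.transform.localRotation = look;
        headBone.localRotation = look;
    }
}
```

The clamp on pitch is my first stab at item 5 (realistic head movement); I expect the real constraints will need tuning once a rigged avatar is in place.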

Kinect + Unity 3D - Zigfu's ZDK vs. Omek Beckon SDK + Motion Toolkit

I must admit, I'm a bit uneasy about writing this post. I've been in a week-long correspondence with the Omek Beckon team about getting their SDK working on my computer, and they have been more than willing to try and help.

Which is why I feel bad saying that I've chosen to go with their competitor, Zigfu.

For those who have no idea what I'm talking about, let me provide a little context with a quick recap: I'm trying to create a game which will incorporate Kinect movement tracking and (eventually) the Oculus Rift VR glasses, but I'm very much a beginner at game development. And while I've been programming for quite some time, I find my baby steps into this new programming genre are fraught with numerous challenges.

One such challenge has been deciding how I'm going to utilize my Kinect sensor within Unity 3D, my chosen game engine.  At the time of this writing, there are really only three main options:

1. Write a Unity-Kinect wrapper myself which would bind the device's SDKs to Unity

2. Zigfu's ZDK

3. Omek's Beckon SDK and Motion Toolkit

Let's just nix the first option right here.  I don't feel like reinventing the wheel. So that just leaves Zigfu versus Omek. And the best way to compare the two is to list both of their pros and cons.

Zigfu Pros
1. Has a free development version which is just as up-to-date as the professional version
2. Built off of the most updated versions of OpenNI, NiTE, and official Microsoft Kinect SDK
3. Clearly has a growing online community that's happy and willing to provide support
4. Very easy to install and get going
5. A good handful of example scenes that come with the installation
6. Apparently written by some REALLY smart guys

Zigfu Cons
1. In terms of official online support, I didn't see an easy way to contact the Zigfu guys.  But they seem to be active on the unitykinect (or maybe it's kinectunity) Google Group, so if you post there, you'll probably get a response from them directly
2. From what I can tell, no official documentation about their SDK or API

Omek Beckon SDK Pros
1. A development team which is approachable and very willing to help
2. Excellent documentation regarding installation procedures, API reference, and more
3. What appears to be a very good, intuitive API and Motion Toolkit package for Unity
4. They also have a free development version

Omek Beckon SDK Cons
1. It doesn't work on my development computer, and nobody seems to know why
2. It appears to be built off of less up-to-date code, and it doesn't work at all with the official Microsoft Kinect SDK
3. There is clearly less of an online community, which makes finding answers to technical or programming problems a lot harder
4. After getting it installed and working on a different computer, I couldn't get my avatar to jump, and that pissed me off

And so, I've decided to go with Zigfu. Now you might disagree with me, and I'd welcome that. The Omek Beckon guys were so helpful, they deserve another shot. Just not by me; it still doesn't work on my development computer.

Monday, February 25, 2013

Kinect + Unity 3D Glossary: Sorting out the noise

The goal seems simple enough.  I just want to build a 3D game in Unity which utilizes the motion-tracking capabilities of the Kinect and the amazing VR abilities of the Oculus Rift.  Since the Oculus won't actually be delivered to me for another few months, that leaves integrating only the Kinect. Seems simple enough.

The problem is, I have no idea where to start.  I've only dabbled in game development - and by dabbled, I mean I watched all the "Getting Started With Unity" tutorials at Unity Cookie - and I know nothing of motion-tracking.  But I do know programming, which is why I thought things would be super simple. So simple, in fact, that by the time I opened my nice and shiny, brand new Kinect 360 sensor, I had practically deluded myself into thinking that I was just a few tiny drivers away from seeing an avatar of myself jumping up and down on my computer screen.

Yeah, no.

Right after opening that box and seeing the beautiful Kinect staring back at me, I started looking for the necessary drivers/wrappers/etc.  It only took about ten minutes for me to understand how much I don't understand.  The problem was an overload of new terminology.

So to make things easier for me, and hopefully for whoever else runs into the same problem, I'm breaking things down here. In many cases, I'm not going to paraphrase what was written about these glossary terms, because frankly some of it doesn't yet make perfect sense to me. But I will provide the links I found useful in deciphering exactly what was needed, what wasn't, and why.


Glossary Relating to Motion-Tracking and Unity

OpenNI - An open-source framework for 3D natural interaction sensors. I found a good link here which explained why it would be very good to download and use OpenNI 2.0 as a means of connecting to and utilizing the Kinect. According to the author of that link, the SDK "gives you access to the raw data provided by the sensor" (in my case, the Kinect).

NiTE - Termed "middleware," it is described by the author of this article as the glue that gives the respective sensor SDK access to higher-level "processed" data - gestures, plus skeleton detection and tracking. According to that same author, PrimeSense's middleware, NiTE 2.0, has been revamped to work with the new OpenNI 2.0 SDK.

Microsoft Kinect SDK - This is the official SDK created by Microsoft for their Kinect sensor. I imagine there is a lot of overlap between this and OpenNI/NiTE, but even if you are using OpenNI/NiTE, you still need to download this because it comes with some drivers that you apparently can't get anywhere else (mentioned in the same article as before). It should be noted, however, that according to that same author, "The latest Microsoft SDK also has more resources and features [than OpenNI/NiTE], and comes with the 'middleware' built in. It also allows you to code in .NET languages as well as C++."
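To give a feel for what coding directly against the official SDK looks like (outside of Unity entirely), here's a tiny C# console sketch of skeleton tracking with the Kinect SDK v1, as I understand its API. I'm assuming exactly one sensor is plugged in; don't take this as gospel.

```csharp
using System;
using System.Linq;
using Microsoft.Kinect; // ships with the official Kinect SDK v1

class SkeletonDemo
{
    static void Main()
    {
        // Grab the first connected sensor (assumes one is plugged in).
        KinectSensor sensor = KinectSensor.KinectSensors
            .FirstOrDefault(s => s.Status == KinectStatus.Connected);
        if (sensor == null) return;

        sensor.SkeletonStream.Enable();
        sensor.SkeletonFrameReady += (s, e) =>
        {
            using (SkeletonFrame frame = e.OpenSkeletonFrame())
            {
                if (frame == null) return;
                Skeleton[] skeletons = new Skeleton[frame.SkeletonArrayLength];
                frame.CopySkeletonDataTo(skeletons);

                // Print the head position of any tracked skeleton.
                foreach (Skeleton skel in skeletons)
                {
                    if (skel.TrackingState == SkeletonTrackingState.Tracked)
                    {
                        SkeletonPoint head = skel.Joints[JointType.Head].Position;
                        Console.WriteLine("Head: {0:F2}, {1:F2}, {2:F2}",
                            head.X, head.Y, head.Z);
                    }
                }
            }
        };
        sensor.Start();
        Console.ReadLine(); // keep the app alive while frame events fire
        sensor.Stop();
    }
}
```

This is the raw, no-middleware path; wrappers like the ZDK exist precisely so you don't have to plumb this into Unity yourself.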

PrimeSense - A company which specializes in Natural Interaction device development (basically, devices that pick up your movement). These guys are in no small part responsible for the magic inside the Microsoft Kinect, and are currently producing devices that will be in direct competition with the Kinect. Furthermore, as mentioned before, PrimeSense created the NiTE middleware (just scroll down until you get to the NiTE section).

SensorKinect - Written by PrimeSense, it is described by this article as follows: "In simple terms, OpenNI and SensorKinect are the drivers that help to access RGB-D data [Depth & Color] from the Kinect." Interestingly, the latest version of OpenNI (2.0) makes installing SensorKinect redundant.

KinectWrapperPackage - The KinectWrapperPackage is the Unity wrapper used by this project to bind the official Kinect SDK (version 1) to Unity. I've included it in this list because it came up a lot in my searches.  Personally, I wouldn't use it because: a) it doesn't seem to be updated or continually supported, and b) it is based on an old Kinect SDK and an old version of Unity, and therefore not all of the functionality will work anymore (unless you want to continue developing using the old stuff).

KinectSDK / Unity3D Interface v5 - Another wrapper which binds the Kinect SDK to Unity.  And while this seemed to be a good solution at the time it was written, it hasn't been updated since early 2012, the author has stated explicitly that he has no intentions of updating it, and it was only written to be compatible with the official Kinect SDK v1. But the source code is there, if one is so inclined to try and adjust it to be compatible with more current SDKs.

UnityWrapper - The open-source UnityWrapper appears to be the first official attempt to bind OpenNI to Unity. However, it was written using previous versions of the various SDKs, and - from what I can tell - hasn't been updated in over a year.

Zigfu/ZDK - Most of the founders and co-founders of this company are former PrimeSense and Microsoft men, so it almost goes without saying that they know what they're doing.  Furthermore, Amir Hirsch, the main founder (from what I can tell), helped author the popular UnityWrapper which integrated the Kinect into Unity.  Using what he built for that, he then moved on and created the ZDK - a more advanced wrapper that binds the Kinect to Unity, and exposes a lot of the power of OpenNI.  There's a free version of the ZDK for non-commercial use, and a $200 version for commercial use.

Omek Beckon SDK - According to the installation guide which you get after downloading the SDK, Omek Beckon is mutually exclusive with the Kinect SDK. Meaning, if you have the Kinect SDK on your machine, you have to remove it. UPDATE! I had previously said that one had to download OpenNI and NiTE to get this working, and that's entirely wrong. One need only read the Beckon Installation Guide to get things moving. I should note, however, that as of Feb. 24, 2013, the Beckon SDK did not use the most recent version of OpenNI, and it did use other drivers/wrappers that I've mentioned are already considered older (such as SensorKinect). On the other hand, they do have some really cool features, such as the Gesture Authoring Tool, as well as some other out-of-the-box functionality from their Motion Toolkit for Unity.

Unity 3D free / pro - A fantastic program which enables even non-programmers to quickly create 3D and 2D games. 

...And that's it! For now. I'll add more terminology as I come across it. And if you feel like I missed something important that should be here, tell me and I'll put that up, too.

Sunday, February 24, 2013

Motion-Control Baby Steps

Click here if all you care about is getting the Kinect working with Unity 3D so you can dance like me in the video below.


In my Glossary post, I tried to make sense of all the terminology necessary in order to create a 3D Virtual Reality game using the Kinect, the Oculus Rift, and Unity 3D.  However, just knowing the terminology isn't enough.  You also have to know what to do with it.

When I first got my Kinect 360 sensor, I just wanted to use it right away. So I downloaded the latest official Microsoft Kinect SDK and... nothing. I hadn't done any research up until that point, so that's when I had to start. And one of the first phrases to frequently pop up in my searches was the Omek Beckon SDK. A quick glance at its features made me think this was a magic bullet for all my challenges:
  • Motion Toolkit for Unity3D allows you to drag and drop ready to use components to quickly add gesture from within the Unity framework
  • Simple-to-use extension for developing in C# and .NET frameworks
  • Flash wrapper for quick development
  • Gesture Authoring Tool for the creation of custom gestures in minutes
  • Ability to track up to 5 skeletons simultaneously
  • Support for multiple camera positions
  • No calibration required
I admit, my first reaction was that this was going to be too easy. I mean, what's the fun in learning how to program the Kinect into my game if I don't actually do any programming?

But using the magic bullet wasn't as easy as I thought. After removing the previous SDK that I had downloaded (the official one), I was able to successfully install the Beckon SDK (the Microsoft SDK and the Beckon SDK are incompatible). To use the SDK with Unity 3D, I next had to import the Motion Toolkit from the Asset Store. It was then time to start up Unity and run through the sample scenes.

This is where I hit some problems.  When the sample scenes used my Kinect, many of my movements weren't being picked up.  In fact, the only movements which were picked up were moving right and left and forward and back. 

And I was suddenly at a loss.  After all, I knew nothing about ...well, pretty much anything, so it was impossible for me to debug this.  So I hit the forums and tried to find other people with my problem.

Unfortunately, this version of Beckon hasn't been out for too long, and there wasn't exactly a wealth of helpful material online. UPDATE! Getting Beckon installed and running has been a bit problematic for me. I'm currently in contact with the Beckon team on the matter. Once it's installed and I've played around with it a bit, I'm going to write a post comparing it to Zigfu's ZDK, which I talk about below.

Following another lengthy visit to Google - during which I compiled the Glossary - I discovered this very helpful little video.  Not only did it have good background music, but it provided very nice instructions on just how to get the Kinect motion tracking working using nothing but OpenNI 2, NiTE 2, and the most updated official Kinect SDK (as of the time of this writing), and not Beckon.

So now I had motion tracking working!  Yay!!  ...but I had no idea how to translate that to something usable within Unity 3D.  Back to Google.

At this point, I kept running into search results regarding something called the UnityWrapper, written by Amir Hirsch (apparently an extremely smart fellow). This enabled people to incorporate Kinect functionality within Unity 3D games.  The problem was, the last time that wrapper was updated was early 2012.  However, the Read Me there pointed me to a Zigfu repository on github, so there I went. This quickly brought me to Zigfu and their ZDK.

I must admit, I had come across this page far earlier, in some of my initial searches, and quickly dismissed it.  At first, it seemed like a quick fix which costs money.  But as it turned out, I hadn't read the fine print, as it only cost money for commercial use.  And since by this time I was getting impatient, I decided to give it a try. But before doing so, I wiped my computer of any remnants of previous Kinect SDKs/drivers, as per a piece of advice I read in this article, which stated the following:

These libraries usually do not play well with each other, with each of them requiring their own driver’s and dependencies for using them. There are options to bridge these gaps but in general terms, it will be necessary to completely remove any legacy or conflicting installations before switching from one to the other. That includes drivers, .dll’s, and registry/environment path settings.

While the ZDK download page doesn't explicitly state what needs to be installed prior to using the software, I found this elsewhere on the site: "Like all the ZDK offerings, the ZDK for Unity3D works with both Mac and PC, with OpenNI/NITE and the Microsoft Kinect SDK, and with all consumer-level 3D sensors." So I figured it went without saying that I needed to install the latest OpenNI/NiTE, and the latest Microsoft Kinect SDK.

Which meant I didn't actually have to uninstall the other SDKs, but whatever.

After finishing the reinstalls, I opened Unity, imported the ZDK package, opened up a sample scene - the avatar one looked nice - and ran it. And... it worked!! It worked fabulously!!

So now I feel obliged to do two things: First, thank the ZDK and OpenNI/NiTE guys. Awesome job.


Second, I should probably summarize, in just a few words, what I actually did to get things working.  Here it is:

Quick Guide To Getting Kinect To Work in Unity: 
1. Download the latest versions of OpenNI, NiTE, and the Kinect SDK, and install them (in that order) 
2. Import the latest ZDK package into Unity 
3. Go have fun.


I should note, the method I chose was in large part influenced by the fact that I didn't want to shell out money for anything. If you don't care about money, the Beckon SDK may be for you; or the pro version of the ZDK.

K, now it's time to end this post and go reverse engineer these ZDK sample scenes.

In the beginning...

I'm starting this blog at a moment that I view as a crossroads.  I've spent many years honing my programming skills primarily as a web applications developer, which has been more than fine for me.  The languages and technology I've used up until now have been plenty diverse and challenging, so I felt no reason to drastically switch gears.

Until this came along.

The Oculus Rift - the greatest thing to be invented since God


I have been obsessed with Virtual Reality since that weird Pterodactyl game came out in the mid-'90s. I still remember my disappointment when Sega announced that they were not going to release their Virtual Reality goggles due to some ridiculous health concerns (I think it had something to do with blindness).

And yet that obsession slowly waned as the technology seemed to inexplicably drift farther and farther away. I say inexplicably because to me, the technology was so amazing, it was perplexing why more people weren't trying harder to make it practical.


But they are now.  The Oculus is here, and my inner obsession is back with a vengeance. Except now I have years of programming experience under my belt.  Which means not only is now the time to get into game development, it's time to get into 3-D, virtual reality game development.

Oh, and since the motion-tracking technology of the Kinect is cool, too, it's time to get into that, as well.

Which is why I consider this a crossroads.  I'm switching programming gears, and jumping into an aspect of the field that I'm pretty sure I know very little about.  And since I'm certain there will be significant bumps along the way, now's the perfect time to start documenting.  After all, if it weren't for all my peers who documented their own struggles and solutions, I wouldn't be the programmer I am today.

Now I should be very clear here, I am NOT a prolific blogger. I've tried blogging before, twice, and I've failed miserably, twice. However, things are different now. First, I'm no longer trying to blog about cartooning (in a previous life I was a professional amateur cartoonist); and second, I'm DEFINITELY not trying to blog consistently. I had read somewhere a while back that in order to be successful at blogging, one had to be consistent; that only by blogging every day or so would one create and maintain a growing readership.

And so, when I started my previous two blogs, that advice was taken to heart. ...in the beginning. And then, relatively quickly, I started to slack. Eventually, the only things I posted consistently were apologies for not posting consistently.

So let it be known right now that this blog is not going to be consistent.  I will post something when I want to, and that's it.  I will not promise massive ten-part tutorials, and I will not care about maintaining a readership base. 

But, as I explained clearly in my Mission Statement, I will post.  After all, it's time to be part of the dialogue, and not just an active listener.