In the 5th semester I had two project classes: the first was called Active Sports Games and was about learning to use and program Microsoft’s Kinect; the second was called Modern Shader Techniques and was about creating shaders in Unity.
Since the goal in both classes was to create a game, our team decided to combine the two classes to improve the quality of the finished game. The title of the game is Hover Hero, a Temple Run-esque game that includes well-known and nostalgic objects from movies and video games.
The game was created over a span of several months as a group project; our team consisted of four members across both courses:
| Name | Role |
| --- | --- |
| Felix Kosian | Programming, Shaders |
| Alexander Epple | Programming Lead, Shaders, UI |
| Musfira Naqvi | 3D Art |
The start of the project proved to be difficult. The Kinect is not the most reliable interaction method: if its view is obstructed, the tracking becomes unreliable and unsteady, and gestures need to be implemented very precisely in order to work reliably.
There is also a noticeable delay between performing a gesture and its recognition. All of these problems make it very hard to build a fair and fun gameplay loop.
Another problem was the gameplay itself: The world shouldn’t feel too repetitive but also had to be balanced, something that (semi) procedural generation makes difficult.
Finally, all shaders had to be implemented by our team; no built-in Unity shaders were allowed.
The Kinect issues were the toughest ones to deal with; we were, however, allowed to use the excellent Unity Kinect plugin by Rumen Filkov. With the help of this plugin and many hours of play testing and improving gesture recognition, the game eventually became fun to play, and tracking ended up working better than we originally expected.
Most of the gameplay uses gestures, but we also allow the player to move around to avoid obstacles. Every item has its own distinct gesture bound to it, which makes playing the game feel very intuitive and direct.
I implemented several different versions of almost all gestures until we were happy with the recognition rate. The game supports tracking of jump, crouch, shield and two fire gestures. Additionally, there are sword and step-left/right gestures that are used to improve game flow and balance.
The player is measured before the game starts, and their height, width and the initial positions of certain body parts are saved. This calibration phase makes it possible to tailor each gesture and the gameplay to the specific dimensions of the player, and it increased accuracy considerably.
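The core idea of the calibration can be sketched in a few lines: distance-based gesture thresholds tuned for a reference body are rescaled to the measured player. This is an illustrative Python sketch, not the actual implementation; the baseline height and threshold names are made up.

```python
# Illustrative sketch: scale gesture thresholds to the measured player.
# The baseline height and all threshold names/values are hypothetical.

BASELINE_HEIGHT_M = 1.75  # reference height the default thresholds were tuned for

def calibrate(player_height_m, base_thresholds):
    """Scale distance-based gesture thresholds to the player's height."""
    scale = player_height_m / BASELINE_HEIGHT_M
    return {name: value * scale for name, value in base_thresholds.items()}

# A 1.90 m player gets proportionally larger thresholds than the baseline:
thresholds = calibrate(1.90, {"jump_height_m": 0.08, "crouch_drop_m": 0.25})
```

A taller player thus has to jump slightly higher in absolute terms, but the gesture feels the same relative to their body.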
The fire gestures and the jump were notoriously hard to implement: the fire gestures consisted of several steps that had to be followed precisely, and the jump gesture caused tracking issues and was limited by the laziness of the average player.
The game supports a throw and a Hadouken fire attack. The throw gesture works like shot put: After pushing an arm away from a shoulder in some direction, the shot trajectory can be calculated. This works great and is surprisingly accurate. The Hadouken has to be charged first by forming a ball with the hands in front of the body and then pushing it away, similar to the throw.
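The throw calculation boils down to estimating the hand's velocity from tracked positions and using it as the launch velocity. Here is a minimal Python sketch of that idea; a real Kinect pipeline would smooth over several frames, and the positions and frame time below are invented for illustration.

```python
# Illustrative sketch: estimate throw velocity from tracked hand positions.
# Real tracking would smooth over many frames; this uses two samples.

def throw_velocity(hand_prev, hand_curr, dt):
    """Finite-difference velocity of the hand between two tracked frames."""
    return tuple((c - p) / dt for p, c in zip(hand_prev, hand_curr))

# Hand moved 0.15 m up and 0.3 m forward over one 1/30 s Kinect frame:
v = throw_velocity((0.0, 1.20, 0.4), (0.0, 1.35, 0.7), 1 / 30)
```

From this launch velocity, the projectile's trajectory follows the usual ballistic arc under gravity.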
The jump has several parameters that need to be fulfilled to start the tracking. Ultimately, the most difficult part was recognizing the gesture early enough, but not too early: early enough so the players don’t have to jump too high, and not too early so that tracking glitches don’t make the character jump on its own. Our final solution required the player to jump a reasonable 8 cm into the air.
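The trade-off between early recognition and glitch resistance can be sketched as a threshold check with a short confirmation window. This Python sketch is an illustration of the idea, not the actual code; the confirmation-frame count and the joint used are assumptions.

```python
# Illustrative sketch: threshold-based jump detection with a short confirmation
# window to filter out single-frame tracking glitches (values are hypothetical).

JUMP_THRESHOLD_M = 0.08   # the ~8 cm rise mentioned above
CONFIRM_FRAMES = 3        # consecutive frames required before triggering

def detect_jump(hip_heights, rest_height):
    """Return True once the hip stays above rest + threshold for enough frames."""
    streak = 0
    for h in hip_heights:
        streak = streak + 1 if h - rest_height > JUMP_THRESHOLD_M else 0
        if streak >= CONFIRM_FRAMES:
            return True
    return False
```

A sustained rise triggers the jump, while a single-frame spike (a typical tracking glitch) does not, at the cost of a few frames of latency.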
The world generation was challenging as well, but we came up with a good solution that allows for varied and balanced levels.
The generation is split into several steps, the first of which selects a biome such as cave or temple. Then, either a straight or a curved element is randomly selected for spawning. These elements are pre-built short tracks that have well-defined spawn points for obstacles, items and enemies.
Each obstacle spawn point has several (usually 4-6) different obstacle options, all of which are pre-built as well. This enables fair and fun gameplay that isn’t too repetitive.
Finally, the randomized part is spawned at the end of the current one. Because this is an endless runner, this continues until the end of the game.
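The generation steps above can be sketched as follows. The biome names, piece names and obstacle pool here are invented placeholders; this is an illustration of the pipeline, not the game's actual data or code.

```python
# Illustrative sketch of the segment-based generation: pick a straight or
# curved pre-built piece for the current biome, then fill its obstacle spawn
# points from a small pool of pre-built options. All data below is made up.

import random

BIOMES = {
    "cave":   {"straight": ["cave_s1", "cave_s2"], "curved": ["cave_c1"]},
    "temple": {"straight": ["temple_s1"],          "curved": ["temple_c1", "temple_c2"]},
}
OBSTACLE_OPTIONS = ["spikes", "boulder", "gap", "swinging_blade"]

def next_segment(rng, biome, spawn_point_count=3):
    """Assemble the next track segment from pre-built parts."""
    shape = rng.choice(["straight", "curved"])
    piece = rng.choice(BIOMES[biome][shape])
    obstacles = [rng.choice(OBSTACLE_OPTIONS) for _ in range(spawn_point_count)]
    return {"piece": piece, "obstacles": obstacles}

segment = next_segment(random.Random(42), "cave")
```

Because every piece and obstacle is hand-built and only the combination is randomized, the result stays varied without ever producing an unfair layout.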
The shaders were overall the most fun part of the project. Everything graphics-related was done using custom shaders, although we of course did not implement a custom render pipeline.
The base shader supports forward rendering of up to four lights (additional lights can be switched off for performance optimization), ambient occlusion, normal and specular maps as well as emission.
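The per-light accumulation of such a forward-rendered Blinn-Phong shader can be illustrated outside of shader code. This Python sketch computes the summed diffuse and specular intensity over up to four lights; all vectors are assumed to be unit length, and the shininess value is an arbitrary example.

```python
# Illustrative sketch of Blinn-Phong lighting summed over up to four lights,
# as a per-fragment shader would compute it (all vectors are unit 3-tuples).

import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def blinn_phong(normal, view_dir, lights, shininess=32.0):
    """Sum diffuse + specular intensity over the active lights."""
    total = 0.0
    for light_dir, intensity in lights[:4]:  # at most four lights, as in the game
        half = normalize(tuple(l + v for l, v in zip(light_dir, view_dir)))
        diffuse = max(dot(normal, light_dir), 0.0)
        specular = max(dot(normal, half), 0.0) ** shininess
        total += intensity * (diffuse + specular)
    return total
```

With a single light shining straight along the normal and the camera on the same axis, both the diffuse and specular terms reach their maximum.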
The standard shader uses the Blinn-Phong model; PBR could not be implemented due to time constraints. A special version for obstacles was also implemented: the obstacles dissolve and become transparent when getting closer to the camera, which was necessary to be able to see obstacles behind other obstacles early enough.
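The distance-based fade behind that obstacle dissolve is simple to illustrate: alpha ramps linearly between two camera distances. The thresholds in this Python sketch are invented; the actual shader's values and dissolve pattern are not shown here.

```python
# Illustrative sketch of the obstacle fade: alpha drops linearly as the
# obstacle approaches the camera (both distance thresholds are made up).

FADE_NEAR = 2.0   # fully transparent at or below this camera distance (m)
FADE_FAR = 6.0    # fully opaque at or beyond this distance (m)

def obstacle_alpha(distance_to_camera):
    """Linear ramp from 0 (near) to 1 (far), clamped to [0, 1]."""
    t = (distance_to_camera - FADE_NEAR) / (FADE_FAR - FADE_NEAR)
    return min(max(t, 0.0), 1.0)
```

This is the same smoothstep/lerp-style clamp a fragment shader would apply per pixel before outputting the final alpha.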
I also implemented several shaders used for special obstacles and items.
There is a lava shader that uses flow maps and vertex displacement at runtime to make it seem like the lava is slowly flowing downwards.
A fireball shader is used for the Hadouken and the fireball projectiles. This shader uses periodic Perlin noise in combination with turbulence to displace a sphere at runtime. It also manipulates the UV coordinates of the texture to make it look more dynamic.
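The displacement idea itself is independent of the noise function used. In this Python sketch, a cheap periodic sum of sines stands in for the Perlin noise plus turbulence of the real shader; the frequencies and amplitude are arbitrary illustration values.

```python
# Illustrative sketch of the vertex displacement idea: push each sphere vertex
# out along its normal by a time-varying periodic function. A simple sum of
# sines stands in for the Perlin noise + turbulence used in the real shader.

import math

def pseudo_noise(x, y, z, t):
    """Periodic stand-in for Perlin noise, roughly in [-1, 1]."""
    return (math.sin(3.0 * x + t) + math.sin(5.0 * y + 1.3 * t)
            + math.sin(4.0 * z + 0.7 * t)) / 3.0

def displace(vertex, normal, t, amplitude=0.1):
    """Offset a vertex along its normal by the noise value at (vertex, t)."""
    d = amplitude * pseudo_noise(*vertex, t)
    return tuple(v + d * n for v, n in zip(vertex, normal))
```

Evaluating this per vertex every frame makes the sphere's surface ripple continuously, which is what gives the fireball its roiling look.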
Multiple combinations of custom billboard particle shaders and compute shaders are used for particle systems such as blood, explosions and a portal. The portal was inspired by Dr. Strange and was especially difficult to implement: it is a particle system that winds up in the beginning, rotates on a circle and has the largest number of particles.
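The per-particle update behind that wind-up can be illustrated as a spiral: each particle's radius grows from zero to the final circle while its angle keeps rotating. This Python sketch mirrors the kind of position function a compute shader would evaluate per particle; all constants are invented.

```python
# Illustrative sketch of the portal wind-up: particles spiral outward onto a
# circle over time. All constants are made up for illustration.

import math

def portal_particle(i, count, t, radius=1.5, windup_s=2.0):
    """2D position of particle i at time t: radius grows, angle keeps rotating."""
    progress = min(t / windup_s, 1.0)            # 0 -> 1 during wind-up
    angle = 2 * math.pi * i / count + 3.0 * t    # base angle plus rotation
    r = radius * progress
    return (r * math.cos(angle), r * math.sin(angle))
```

Running this for thousands of particle indices in parallel is exactly the kind of embarrassingly parallel work a compute shader handles well.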
Lastly, there is a Halo-inspired laser sword. It uses emissive glow, a colored toon-shaded outline, as well as basic Lambert lighting.
Overall, the game looks very good and is one of the prettiest games I have made to date. The biggest future improvement would be to re-implement the standard shaders using PBR and to use deferred rendering.
At the end of the project, I created a gameplay trailer that features many of the things we implemented; the trailer is linked below. Since the game contains a couple of copyright-protected assets, the version on GitHub does not work right out of the box, but our latest build (requiring a Kinect V2) is available for download.
If you have any questions regarding implementation details, you can of course contact me and I will answer to the best of my ability.