IVIS is a fire simulation meant to train incident commanders. This is an early demo of the simulation, showing off the fire and smoke behavior. The system needs to be accurate, so combustion, radiation, conduction, and convection are all calculated in real time.
The fire system is dynamic, interactive, and networked, so a team can interact with and respond to the fire as they would in a real-life situation.
My role is programming the character controller, simulation, and networking.
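Conceptually, the real-time heat transfer above combines those four modes into each cell's temperature update every timestep. A purely illustrative sketch of that idea (the coefficients, names, and structure here are made up for illustration, not IVIS's actual model):

```csharp
using UnityEngine;

public static class HeatSketch
{
    // Illustrative only: one cell's temperature update combining combustion,
    // conduction, convection, and radiation. Coefficients are invented.
    public static float Step(float T, float neighborT, float airT, float combustion, float dt)
    {
        const float kCond = 0.5f;     // conduction coefficient (made up)
        const float hConv = 0.2f;     // convection coefficient (made up)
        const float sigma = 5.67e-8f; // Stefan-Boltzmann constant
        const float emissivity = 0.9f;

        float conduction = kCond * (neighborT - T);
        float convection = hConv * (airT - T);
        float radiation  = -emissivity * sigma * (Mathf.Pow(T, 4f) - Mathf.Pow(airT, 4f));
        return T + dt * (combustion + conduction + convection + radiation);
    }
}
```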
This billboard tool is written in C#. It simplifies billboard creation by generating everything in-engine for the user: an albedo map, a normal map, a height map, and a packed smoothness (G) / metallic (A) map. It also creates a billboard asset with a billboard-ready material.
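The channel-packing step can be done by sampling the two source textures and writing smoothness into green and metallic into alpha. A minimal sketch of that step (the names are illustrative, not the tool's actual code, and it assumes both source textures are import-readable):

```csharp
using UnityEngine;

public static class ChannelPacker
{
    // Packs a smoothness map into G and a metallic map into A of one texture.
    // Assumes both inputs are the same size and marked readable on import.
    public static Texture2D Pack(Texture2D smoothness, Texture2D metallic)
    {
        var packed = new Texture2D(smoothness.width, smoothness.height, TextureFormat.RGBA32, false);
        for (int y = 0; y < packed.height; y++)
        {
            for (int x = 0; x < packed.width; x++)
            {
                float s = smoothness.GetPixel(x, y).grayscale;
                float m = metallic.GetPixel(x, y).grayscale;
                packed.SetPixel(x, y, new Color(0f, s, 0f, m));
            }
        }
        packed.Apply(); // upload the packed pixels to the GPU copy
        return packed;
    }
}
```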
I originally started creating this tool because there seemed to be no in-engine tools that did this. I figured it would be much easier for people to generate billboard assets in-engine, since that cuts down on back-and-forth with modeling tools like Maya and 3ds Max.
Drone Training Module Designer
About The Tool
The module tool is used to generate training modules for Zephyr, a drone flight simulator. The tool is for level designers, so they can create modules for levels themselves instead of relying on the programmers. Each module is saved to a JSON file that the game manager reads in when levels are loaded.
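Saving a module out to JSON can be sketched with Unity's built-in JsonUtility. The class and field names below are hypothetical stand-ins, not the real module schema:

```csharp
using System.IO;
using UnityEngine;

[System.Serializable]
public class ModuleData
{
    // Hypothetical fields, for illustration only.
    public string moduleName;
    public string[] facetTypes;
}

public static class ModuleSaver
{
    public static void Save(ModuleData module, string path)
    {
        // JsonUtility serializes any [Serializable] class to a JSON string.
        File.WriteAllText(path, JsonUtility.ToJson(module, true));
    }

    public static ModuleData Load(string path)
    {
        return JsonUtility.FromJson<ModuleData>(File.ReadAllText(path));
    }
}
```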
The node editor framework was provided by the Unity community. I extended it to automatically generate nodes based on attributes on classes and fields. When the tool is opened, it searches the assembly for every class that inherits from the module facet class and generates a custom node from the fields marked with attributes. The tool checks parent classes for attributes as well.
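The discovery step above can be sketched with plain .NET reflection. The attribute and base-class names here are invented placeholders for the real ones:

```csharp
using System;
using System.Linq;
using System.Reflection;

// Hypothetical attribute marking a field that should appear on a node.
[AttributeUsage(AttributeTargets.Field)]
public class NodeFieldAttribute : Attribute { }

// Stand-in for the real module facet base class.
public abstract class ModuleFacet { }

public static class NodeScanner
{
    // Finds every concrete facet type in the assembly and the fields to expose.
    public static void BuildNodes()
    {
        var facetTypes = Assembly.GetExecutingAssembly().GetTypes()
            .Where(t => t.IsSubclassOf(typeof(ModuleFacet)) && !t.IsAbstract);

        foreach (var type in facetTypes)
        {
            // Inherited public and protected fields are returned here too,
            // which is how attributes declared on parent classes get picked up.
            var fields = type
                .GetFields(BindingFlags.Public | BindingFlags.NonPublic | BindingFlags.Instance)
                .Where(f => f.GetCustomAttribute<NodeFieldAttribute>() != null);

            foreach (var field in fields)
            {
                // The real tool would generate a custom node here.
                Console.WriteLine($"{type.Name}: {field.Name}");
            }
        }
    }
}
```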
The module designer tool is still incomplete, however. There is a grading system that goes hand in hand with the module system; that is what the "POE" button is for. The points of evaluation are configurable per facet and per module, so they will require special pop-up UI for editing the POEs at each level.
AI and Character Controllers
About The Project
This is a testing grounds project. I use it to quickly prototype mechanics for third person or first person gameplay. The AI system is implemented in a way that allows for quick and easy editing.
The third person controller uses a collection of different mechanics. The character is based on Unreal's base character class. The controller can shoot, jump, wall climb, crouch for stealth, and pickpocket. In addition, there is a health and ammo system.
This is a demo of a third person character controller made for one of my senior project classes at George Mason University. The controller uses touch controls as input and is based on the Zelda: Ocarina of Time character controller. This felt like a good matchup due to the limited number of inputs in both games.
This controller is still rough around the edges, but it proved to me that a solid third person game is possible on mobile. I learned a lot about developing input systems under these constraints. Integrating the input solution into the player's state machine allowed more control over how touches were treated in each state, which made the inputs feel more robust.
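The per-state touch handling can be sketched roughly like this: each state interprets the same raw touch differently, and the controller just routes touches to whichever state is active. The state names and reactions below are illustrative, not the project's actual code:

```csharp
using UnityEngine;

public abstract class PlayerState
{
    public abstract void HandleTouch(Touch touch);
}

public class ClimbState : PlayerState
{
    public override void HandleTouch(Touch touch)
    {
        // While climbing, a vertical drag moves the character along the wall.
        if (touch.phase == TouchPhase.Moved)
            Debug.Log($"Climb by {touch.deltaPosition.y}");
    }
}

public class GroundState : PlayerState
{
    public override void HandleTouch(Touch touch)
    {
        // On the ground, a short tap with little movement reads as a jump.
        if (touch.phase == TouchPhase.Ended && touch.deltaPosition.sqrMagnitude < 1f)
            Debug.Log("Jump");
    }
}

public class PlayerController : MonoBehaviour
{
    private PlayerState current = new GroundState();

    private void Update()
    {
        // Route every touch through whichever state is currently active.
        foreach (var touch in Input.touches)
            current.HandleTouch(touch);
    }
}
```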
About The Game
Originally made as a senior project in college, Gaslight represents my starting point in shader programming. I handled time of day, wind, nature, and camera effects for the scene.
I worked with the art team to configure the art pipeline to reduce time spent reworking assets, and streamlined the workflow to help with in-engine performance. In addition, I wrote shaders for the shrubbery, leaves, terrain, and skybox.
About The Game
Cadence Killer is a final project from Advanced Sound Design, Game 367 at George Mason University. The purpose of the final is to incorporate sound into a game as a core mechanic.
Cadence Killer is a rhythm-based game where the player has to play the proper notes for each song. My contribution is the rhythm mechanic, which is used as the means of fighting band members to solve the mystery of who killed the protagonist.
Injection is a 24-hour project for the Online and Mobile Gaming class at George Mason University. It incorporates cross-platform input management, which allows it to port directly to PC and mobile. Finishing the game in 24 hours was only doable because of the strict schedule we made. Schedule:
4 hours of asset development (art, music, UI, etc.), a 20-minute nap, 4 more hours of asset creation, a 20-minute nap, 2 hours of art implementation, 2 hours of level creation, a 20-minute nap, 4 hours of programming, a 3-hour nap, and 4 hours of playtesting and debugging.
Injection features block puzzle mechanics, where the player often has to pick up blocks of fat and place them in intricate ways to get through some tricky puzzles. There are also a lot of platforming elements, including chase sequences where the player must platform through a series of difficult jumps and solve quick-action block puzzles to outrun dangerous acids.