My name is Stevie Vu and I'm focused on Virtual Reality. Follow this blog if you want updates on how my projects are going as well as what I'm learning.
I decided not to create a VR-focused game this time and instead prototyped an endless runner mobile game for Android. I also took the opportunity to further test out Unity's new Input System.
I was able to put everything together over three days, which I'm pretty happy with. Most of that time was spent watching video tutorials on the Input System and how to configure it. It isn't too difficult to use when setting up the controls for a single player. However, I haven't been able to get local multiplayer working on the same keyboard. For now I think I'll take a break from it and try a different project before revisiting.
This is an overview of the VR Food Truck Prototype game, which was built for Oculus Quest using Unity 2021. Recently we had to deal with a killer heat wave in Vancouver. During that time it was so hot I couldn't stand cooking in the kitchen and dealing with even more heat, so food was all about take-out and delivery. Thus the concept: a VR food delivery game where a cannon sends you the food.
VR Food Truck Gameplay Trailer
Game Overview
This is a time-based game, so you are simply trying to get as high a score as you can before the timer runs out. You gain points by sending the customer a sandwich with the ingredients they want inside.
I like to think of this as a VR version of Overcooked, a cooking game that I really liked.
Loading Scenes Asynchronously and Object Pooling
One of my learning goals for this project was to better manage loading within the game. To create a more seamless experience for the player, I have a screen fade that occurs during game transitions. The screen fades to black, and then levels are loaded asynchronously. While the screen is black, I load all the needed object pools and other game objects. The goal is for any screen tearing, skipping, or loading to go unnoticed by the player.
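The transition flow can be sketched roughly like this. The `ScreenFader` component and its `FadeOut`/`FadeIn` coroutines are assumed helper code of my own, not a Unity API; only `SceneManager.LoadSceneAsync` and `AsyncOperation` come from the engine.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch of the fade-then-load transition. ScreenFader is an assumed
// helper with FadeOut/FadeIn coroutines that animate a black overlay.
public class SceneTransitioner : MonoBehaviour
{
    [SerializeField] private ScreenFader fader;

    public void LoadLevel(string sceneName)
    {
        StartCoroutine(Transition(sceneName));
    }

    private IEnumerator Transition(string sceneName)
    {
        // Fade to black first so the player never sees the load.
        yield return fader.FadeOut();

        AsyncOperation op = SceneManager.LoadSceneAsync(sceneName);
        op.allowSceneActivation = false;

        // Unity reports progress 0.9 when the scene is ready to activate.
        while (op.progress < 0.9f)
            yield return null;

        op.allowSceneActivation = true;
        yield return null; // let the new scene's Awake/Start run

        // Object pools and other game objects are warmed up here,
        // still behind the black screen, before revealing the level.
        yield return fader.FadeIn();
    }
}
```

Holding `allowSceneActivation` back is what lets the fade stay in control of exactly when the new scene appears.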
Unity XR Toolkit and Action-based Input
This is my first time implementing Unity's XR Interaction Toolkit with the new action-based input system. I found it pretty intuitive and not too difficult to use. I added additional functionality to the scripts to make this VR game work.
Sandwich and Topping System
The sandwich system relies on one GameObject with a sandwich handler and another with a topping handler. The topping GameObjects are created in an object pool at the start of the scene and retrieved from the pool when the player goes to grab a topping. A topping interacts with the sandwich simply by checking for the sandwich handler when it collides with other objects. If the topping collides with an object that has a sandwich handler, the topping is set to inactive, calls the add-topping method on the sandwich, and passes a string for which topping needs to be added.
The sandwich keeps track of what toppings have been added to it in a List&lt;string&gt;. A customer can then check this list in the sandwich handler to see if the toppings it currently holds match the customer's own list of requested toppings.
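A minimal sketch of that hand-off, with illustrative names (`SandwichHandler`, `ToppingHandler`, `AddTopping`) that stand in for my actual scripts:

```csharp
using System.Collections.Generic;
using System.Linq;
using UnityEngine;

// Sketch of the topping-to-sandwich hand-off described above.
public class SandwichHandler : MonoBehaviour
{
    private readonly List<string> toppings = new List<string>();

    public void AddTopping(string toppingName) => toppings.Add(toppingName);

    // A customer compares its request list against the sandwich contents.
    // Sorting both lists makes the check order-independent and duplicate-safe.
    public bool Matches(List<string> requested) =>
        requested.OrderBy(t => t).SequenceEqual(toppings.OrderBy(t => t));
}

public class ToppingHandler : MonoBehaviour
{
    [SerializeField] private string toppingName;

    private void OnCollisionEnter(Collision collision)
    {
        // Only react when we hit something carrying a sandwich handler.
        var sandwich = collision.gameObject.GetComponent<SandwichHandler>();
        if (sandwich == null) return;

        sandwich.AddTopping(toppingName);
        gameObject.SetActive(false); // effectively returns it to the pool
    }
}
```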
Customer System
Customers are initialized when the game level loads, placed in a Queue&lt;Customer&gt;, and set to inactive. The current customer is set to active and then sends its list of topping requests to the player's topping-request UI canvas. One unique challenge here was that I didn't want customers to overlap at their spawn positions, so when a customer is assigned a spawn location it is checked against a list of used spawn locations. If the randomly assigned spawn location is taken, the script generates a new spawn location number until it gets one that is unused.
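The re-roll logic looks roughly like this. The class and field names are illustrative, and the sketch assumes there are always fewer active customers than spawn points (otherwise the re-roll loop would never terminate):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of picking an unused spawn point for the next customer.
public class CustomerSpawner : MonoBehaviour
{
    [SerializeField] private Transform[] spawnPoints;
    private readonly HashSet<int> usedSpawnIndices = new HashSet<int>();

    // Assumes active customers < spawnPoints.Length, or this loops forever.
    public Transform NextSpawnPoint()
    {
        int index = Random.Range(0, spawnPoints.Length);
        // Re-roll until we land on an index that hasn't been handed out.
        while (usedSpawnIndices.Contains(index))
            index = Random.Range(0, spawnPoints.Length);

        usedSpawnIndices.Add(index);
        return spawnPoints[index];
    }

    // Called when a customer leaves, freeing its spot for reuse.
    public void ReleaseSpawnPoint(int index) => usedSpawnIndices.Remove(index);
}
```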
VR Cannon
The VR cannon is controlled with a simple lever that uses Unity's configurable joint, with its position and rotation locked to mimic a lever. The cannon's aim is adjusted by how you move the lever: the lever's current position is normalized against its minimum and maximum range and mapped onto the aim angle.
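That position-to-angle mapping is essentially an inverse lerp followed by a lerp. A sketch with made-up travel and pitch ranges:

```csharp
using UnityEngine;

// Sketch of mapping the lever's travel onto the cannon's aim.
// minLocalZ/maxLocalZ and the pitch range are illustrative values,
// not the tuning from the actual project.
public class CannonAim : MonoBehaviour
{
    [SerializeField] private Transform lever;
    [SerializeField] private Transform cannonBarrel;
    [SerializeField] private float minLocalZ = -0.2f;
    [SerializeField] private float maxLocalZ = 0.2f;
    [SerializeField] private float minPitch = 10f;
    [SerializeField] private float maxPitch = 60f;

    private void Update()
    {
        // Normalize the lever position to 0..1 over its travel range...
        float t = Mathf.InverseLerp(minLocalZ, maxLocalZ, lever.localPosition.z);
        // ...then remap that fraction onto the barrel's pitch.
        float pitch = Mathf.Lerp(minPitch, maxPitch, t);
        cannonBarrel.localRotation = Quaternion.Euler(-pitch, 0f, 0f);
    }
}
```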
Conclusion
Overall I feel this project went very smoothly, with good separation between systems and well-organized scripts and classes. I was able to apply a lot of the lessons I learned from previous VR projects, and I can see myself reusing the asynchronous scene loading with screen fading in future projects as well.
This is an overview of the VR Card Battle Game System prototype, which was built for Oculus Quest using Unity 2019. This project was inspired by a friend who is a fan of card games. I thought it'd be pretty cool to make a real-time VR card battle game with deck building.
YouTube Video - Game System Overview
Overview
The gameplay loop is very simple: a cycle of defeating enemies to get rewards, which let you build a stronger card deck and beat tougher enemies. Victory is achieved when the enemy's health is reduced to zero. You accomplish this using magical weapons that spawn from a selection of magical cards. Defeated enemies provide several magical cards which you can incorporate into your deck for the next enemy.
The Gameplay Loop
In terms of similar games I think of it as a combination of the projectile slashing from Beat Saber mixed with card games like Slay the Spire.
Card System
The cards form the core system of the game, and I tried to keep the scope limited, so there are only four types of cards.
The primary damage card in the game is the basic sword card, which fires a projectile when charged up. The second card is a shield that lets you block projectiles and recharge mana. Third is a fireball spell card, which deals larger damage at the cost of mana. Finally there is the draw card, which provides a small heal and lets you draw a new card.
Decks are based on a list of Card ScriptableObjects, which contain the basic information for each card. The deck is loaded for the player at the beginning of a match. When you draw your hand, a generic Card is created and then passed the Card ScriptableObject, which determines the look and effects of the card.
Example Card Scriptable Object
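A sketch of that split between data asset and generic card. The field names (`manaCost`, `weaponPrefab`, and so on) are my own guesses at what a card needs, not the project's actual schema:

```csharp
using UnityEngine;

// Sketch of the card data asset that drives a generic Card instance.
[CreateAssetMenu(menuName = "Cards/Card Data")]
public class CardData : ScriptableObject
{
    public string cardName;
    public Sprite artwork;
    public int manaCost;
    public int damage;
    public GameObject weaponPrefab; // what spawns when the card is played
}

// The generic card in the player's hand is configured from the asset.
public class Card : MonoBehaviour
{
    private CardData data;

    public void Initialize(CardData cardData)
    {
        data = cardData;
        // Apply the look and effects defined by the ScriptableObject here,
        // e.g. swap the card face sprite and hook up the weapon spawn.
    }
}
```

Because all the per-card variation lives in the asset, adding a new card is just creating a new `CardData` instance in the editor, with no new code.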
VR Interactions
VR interactions and setup are done using Unity's XR Interaction Toolkit, which is easily extendable; that's how I add my own interactive elements to game objects, such as making the sword fire a ranged lightning attack when you press the trigger button on the controller.
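Extending the toolkit typically means subclassing one of its interactables and overriding a hook. A sketch of the sword idea, assuming XR Interaction Toolkit 1.x where `XRBaseInteractable` exposes `OnActivated` (the class name `LightningSword` and its fields are illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: a grabbable sword that fires when the Activate action
// (bound to the controller trigger) is performed while held.
public class LightningSword : XRGrabInteractable
{
    [SerializeField] private GameObject lightningProjectilePrefab;
    [SerializeField] private Transform firePoint;

    protected override void OnActivated(ActivateEventArgs args)
    {
        base.OnActivated(args);
        // Spawn the ranged lightning attack from the blade's tip.
        Instantiate(lightningProjectilePrefab, firePoint.position, firePoint.rotation);
    }
}
```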
One of my focuses for this project was to implement SOLID design principles and to avoid creating a massive singleton. This led me to learn more about using ScriptableObjects as a bridge between systems, and about the power of combining events with ScriptableObjects, which is incredibly useful.
If you're curious about using events and ScriptableObjects together, I recommend checking out Unity's Open Project, where they explain how they implemented it.
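The core of that pattern is an event-channel asset that decoupled systems raise or listen to without ever referencing each other. A minimal sketch (asset and listener names are illustrative):

```csharp
using System;
using UnityEngine;

// Sketch of the event-channel pattern: a ScriptableObject acts as a
// shared event asset, so the raiser and the listener only know about
// the asset, never about each other.
[CreateAssetMenu(menuName = "Events/Void Event Channel")]
public class VoidEventChannel : ScriptableObject
{
    public event Action OnEventRaised;

    public void RaiseEvent() => OnEventRaised?.Invoke();
}

// Any MonoBehaviour can subscribe to the same asset instance.
public class EnemyDefeatedListener : MonoBehaviour
{
    [SerializeField] private VoidEventChannel enemyDefeatedChannel;

    private void OnEnable() => enemyDefeatedChannel.OnEventRaised += GrantCardReward;
    private void OnDisable() => enemyDefeatedChannel.OnEventRaised -= GrantCardReward;

    private void GrantCardReward()
    {
        // Reward logic lives here, fully decoupled from the enemy code.
    }
}
```

The enemy code just calls `RaiseEvent()` on the same asset when it dies, which is what lets you avoid a central singleton routing everything.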
Enemy AI
The main challenge was creating an enemy AI that behaves differently based on what cards the player chooses. To challenge myself, I decided to implement two enemies: one using a finite state machine and the other a simple behavior tree.
My finite state machine enemy was based on a Unity tutorial that used ScriptableObjects. I just changed the scripts and behaviors in the ScriptableObjects to suit my specific purposes. If you're interested in the tutorial, you can learn more about it here.
The enemy basically checks a series of decisions to see whether it needs to change state, and those decisions are based on which card the player currently has selected. Each card-selection state leads to an attack state where you can plug in what you want to occur: you just add the corresponding attack ScriptableObject, which defines how many projectiles to fire and which spawn location they come from.
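The shape of that decision/state setup looks roughly like this. The types and fields are illustrative, patterned after Unity's pluggable-AI tutorial rather than copied from my project, and `EnemyController` is an assumed class holding the context:

```csharp
using UnityEngine;

// Sketch of a ScriptableObject state machine: states hold decisions,
// and each passing decision points to the next state.
public abstract class Decision : ScriptableObject
{
    public abstract bool Decide(EnemyController enemy);
}

[CreateAssetMenu(menuName = "AI/Decisions/Card Selected")]
public class CardSelectedDecision : Decision
{
    public string cardType; // e.g. "Sword", "Shield", "Fireball"

    public override bool Decide(EnemyController enemy) =>
        enemy.PlayerSelectedCard == cardType;
}

[CreateAssetMenu(menuName = "AI/State")]
public class State : ScriptableObject
{
    public Decision[] decisions;
    public State[] nextStates; // one per decision, taken when it passes

    public State Tick(EnemyController enemy)
    {
        for (int i = 0; i < decisions.Length; i++)
            if (decisions[i].Decide(enemy))
                return nextStates[i];
        return this; // no decision fired; stay in this state
    }
}
```

Because everything is an asset, swapping the enemy's behavior is a matter of wiring different decision and state assets together in the inspector.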
The second enemy uses a behavior tree that relies on ScriptableObjects, which I learned from this tutorial. There are probably very good existing solutions, but I wanted to try programming a basic behavior tree for my own personal learning.
The end result isn't visually easy to use but is still fairly straightforward to customize. There are three types of nodes: Selector, Sequence, and Task nodes. The tree has a default idle behavior, which is to look at the player and wait for card selection. When the player selects a card, the behavior tree runs the corresponding attack sequence.
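A minimal sketch of those node types in plain C#; the real version wraps them in ScriptableObjects so trees can be assembled in the editor, and a Task is just a leaf node whose `Evaluate` does the actual work (look at the player, fire a projectile, and so on):

```csharp
using System.Collections.Generic;

// Sketch of the three behavior tree node types.
public enum NodeState { Success, Failure, Running }

public abstract class Node
{
    public abstract NodeState Evaluate();
}

// Selector: returns on the first child that doesn't fail,
// falling through failed children until one succeeds or runs.
public class Selector : Node
{
    private readonly List<Node> children;
    public Selector(List<Node> children) => this.children = children;

    public override NodeState Evaluate()
    {
        foreach (var child in children)
        {
            var state = child.Evaluate();
            if (state != NodeState.Failure) return state;
        }
        return NodeState.Failure;
    }
}

// Sequence: runs children in order and fails on the first failure,
// which is what chains an attack's steps together.
public class Sequence : Node
{
    private readonly List<Node> children;
    public Sequence(List<Node> children) => this.children = children;

    public override NodeState Evaluate()
    {
        foreach (var child in children)
        {
            var state = child.Evaluate();
            if (state != NodeState.Success) return state;
        }
        return NodeState.Success;
    }
}
```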
Both enemies share the same code for projectile management; the behavior tree or state machine just needs to provide the attack data.
Conclusion
All in all, I learned a lot from this project and I hope you enjoyed this rough overview. The code is available on my GitHub if you want to take a closer look.