Over the past couple of weeks I have been taking a deep dive back into game development, specifically VR game development. With the Rift, Vive, HoloLens, and HDK, to name a few, either on the market or arriving within the next few months, I wanted to be at the forefront like I was with IoT. As with any endeavor, I took a look at the two main approaches presented to developers: an existing engine such as Unity or Unreal, or a homebrew engine.

Existing Engine Approach

The first approach clearly offers the most bang for the buck as far as investment goes: both Unity and Unreal Engine offer a ton of functionality out of the box, have large communities, and are well documented. I personally tried both Unity and Unreal over a weekend to get acquainted, with some success (more so on the Unity side). The main problem with both was that I felt like I was on rails, constrained by someone else's code; and having used frameworks from Xamarin and Telerik in my day job, I knew updates always proved problematic.

Homebrew Engine Approach

For those that have not followed this blog for long, I have ventured off and on into OpenGL, Glide, and XNA since 1998. Knowing that both OpenGL and DirectX had shifted away from the fixed-function pipeline in favor of a programmable pipeline, I knew I would be in for a learning curve no matter which route I went down. Looking into Vulkan one evening, I found a huge lack of documentation on the API and how you are supposed to use it, so I started down the DirectX path for the first time since DirectX 8. For those unaware, SharpDX provides a really lightweight wrapper for DirectX 12 in C#. Deep diving into the API, everything clicked: most concepts mapped to things I was already familiar with, either from my day job (command lists are really similar to a Service Bus/messaging architecture) or from earlier graphics work (the SwapChain is just double buffering).
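To make that mapping concrete, here is a minimal sketch (not the actual HVR.DX12 code) of standing up a DirectX 12 device, command list, and double-buffered swap chain with SharpDX; the window handle and the 1280x720 mode are placeholder assumptions:

```csharp
using SharpDX.Direct3D;
using SharpDX.Direct3D12;
using SharpDX.DXGI;

public static class Dx12Bootstrap
{
    // windowHandle is the HWND of the render window (e.g. a Win32/WPF-hosted window)
    public static void Initialize(System.IntPtr windowHandle)
    {
        // Create the D3D12 device on the default adapter
        var device = new SharpDX.Direct3D12.Device(null, FeatureLevel.Level_11_0);

        // Command queue/allocator/list -- the "messaging" half of the API
        var queue = device.CreateCommandQueue(new CommandQueueDescription(CommandListType.Direct));
        var allocator = device.CreateCommandAllocator(CommandListType.Direct);
        var commandList = device.CreateCommandList(CommandListType.Direct, allocator, null);

        // Double-buffered swap chain; in DX12 it is created against the queue, not the device
        using (var factory = new Factory4())
        {
            var swapChain = new SwapChain(factory, queue, new SwapChainDescription
            {
                BufferCount = 2,
                ModeDescription = new ModeDescription(1280, 720, new Rational(60, 1), Format.R8G8B8A8_UNorm),
                Usage = Usage.RenderTargetOutput,
                SwapEffect = SwapEffect.FlipDiscard,
                OutputHandle = windowHandle,
                SampleDescription = new SampleDescription(1, 0),
                IsWindowed = true
            });
        }
    }
}
```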

As one could infer, I went with a homebrew DirectX 12 approach. To follow the Unity pattern of a Win32 launcher app that then launches the game directly, I utilized my WPF/XAML skills to throw together a quick frontend to manage graphics adapter, resolution, FSAA, and full-screen options. This ended up being a great introduction to the DXGI namespace in SharpDX, which lets you query all of the supported resolutions and graphics adapters (see the sketch below), and it led to abstracting out the input handling, sound handling, graphics renderer, and level loading. As of right now, basic sound playback and keyboard input are working, all utilizing DirectX 12.
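As a rough illustration of that DXGI query (a sketch, not the launcher's actual code), enumerating every adapter and the display modes its outputs support looks something like this in SharpDX:

```csharp
using System;
using SharpDX.DXGI;

public static class AdapterQuery
{
    public static void ListAdaptersAndModes()
    {
        using (var factory = new Factory4())
        {
            foreach (var adapter in factory.Adapters)
            {
                // Human-readable adapter name, e.g. for the launcher's dropdown
                Console.WriteLine(adapter.Description.Description);
                foreach (var output in adapter.Outputs)
                {
                    // Every resolution/refresh-rate combination the output supports
                    foreach (var mode in output.GetDisplayModeList(
                        Format.R8G8B8A8_UNorm, DisplayModeEnumerationFlags.Interlaced))
                    {
                        double hz = mode.RefreshRate.Numerator / (double)mode.RefreshRate.Denominator;
                        Console.WriteLine($"  {mode.Width}x{mode.Height} @ {hz:F0}Hz");
                    }
                }
            }
        }
    }
}
```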

The big element remaining to get a basic "game" up and running is writing an export from Blender to a series of triangle strips that can then be read by my renderer code, and after that, getting the virtual camera working with keyboard and mouse input. Once those elements are in there, a lot remains, such as collision detection, physics, texture mapping, 2D UI, and most importantly the VR component. I recently purchased two AMD Radeon RX 480 cards (another blog article is coming on those in particular) and will be getting the Razer HDK v2 as soon as it is available.
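I have not locked down the file format yet, but as a hedged sketch of the reader side, assuming a trivially simple hypothetical layout (an int32 vertex count followed by raw x/y/z floats for one strip), the C# loader could look like this:

```csharp
using System.IO;

public static class StripLoader
{
    // Hypothetical layout: [int32 vertexCount][float x, y, z per vertex]
    public static float[] LoadTriangleStrip(string path)
    {
        using (var reader = new BinaryReader(File.OpenRead(path)))
        {
            int vertexCount = reader.ReadInt32();
            var vertices = new float[vertexCount * 3]; // x, y, z interleaved
            for (int i = 0; i < vertices.Length; i++)
                vertices[i] = reader.ReadSingle();
            return vertices;
        }
    }
}
```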

For those curious, all of this work is being open sourced like everything I do and is available on my GitHub in the HVR.DX12 project.