Since 2015, I’ve worked on a variety of research projects that, in retrospect, were all related to the future of augmented reality, and particularly to head-mounted displays for augmented reality. I’ve recently started shifting my research focus to a new area, so this seems like a good time to post a retrospective of what I’ve done and what I feel I’ve learned over the last three years.
I’ve recently been spending a lot of my time working on projects based on the G3D Innovation Engine by Morgan McGuire. I personally learned to program using Unix development tools, so I’m used to doing everything from the command line, and Makefiles are the build tool I’m most comfortable with. G3D, however, being a research engine aimed primarily at graphics research, has focused on Windows-based development for a number of years now.
I recently received my Jetson TK1 development board, which I’m evaluating for potential use in an embedded systems course next spring. The board is attractive to me because it pairs a high-end quad-core ARM Cortex-A15 processor with a high-end mobile GPU that supports the latest OpenGL and CUDA development environments. This post will serve as a record of the initial setup steps for the device, and I hope to follow it with posts as I develop examples on the board that can turn into lab assignments for the class.
Today, NVIDIA announced GameWorks, a new program to help game developers make effective use of NVIDIA hardware. What excites me most about GameWorks is the inclusion of OptiX in a program that directly targets game developers. OptiX is a ray tracing engine, developed by people I know from my studies at Utah, that has previously been used to build rendering engines for design and production environments. This announcement means that the important global illumination features that arise more naturally in ray tracing will become more readily available to game developers.