I recently received my Jetson TK1 Development Board, which I’m evaluating for potential use in an embedded systems course next spring. This board is attractive to me because it pairs a high-end quad-core ARM Cortex-A15 processor with a capable mobile GPU that supports the latest OpenGL and CUDA development environments. This post will serve as a record of the steps needed to set up the device in the first place, and I hope more posts will follow as I develop examples on the board that can turn into lab assignments for the class.
This semester has been extremely exciting. In particular, E85 has been updated from MIPS to use the ARMv7 instruction set. This has been a good experience for me despite some difficulties and a few intense moments when I wasn’t sure things would come together in time. In doing this, I’ve come to learn a lot about the ARM ISA, and I’ve decided I like it a lot. In particular, I think it strikes a reasonably good balance between RISC and CISC: there are a couple of nice addressing modes for stack manipulation that aren’t quite RISC, but are close enough.
This semester I am teaching a project-based class that has almost no formal instruction by the professor. Instead of providing close direction, I am attempting to give the students more autonomy and control over their learning and the things they will get out of the class. This presents a number of difficulties, and I plan to discuss them as the semester proceeds. Today, I’d like to say something about student-run class sessions, which will happen for the first time in my class next week.
I’ve often wanted to be able to make animated GIFs of things. I have also been interested in simple programmatic video editing from time to time, particularly for animating a series of still frames from images rendered by my graphics architecture simulations. Today on reddit, I came across this blog post that describes a sequence of ways to use the MoviePy library to generate animated GIFs. It looks like exactly the kind of solution I want, since I like to use Python whenever possible for simple things.
I’ve been thinking recently about the kinds of things I’d like to work on most during the break from teaching, and I’ve decided that I really want to take a deep look at game design from a theoretical and practical standpoint. It seems to me that there could be some need for a more formal and methodological approach to the design of games than currently exists. In my experience, game designers (the ones that put the fun into the game) are hard to find and extremely hard to train.
I recently had the pleasure of updating a journal article submission with my list of affiliations. The article has five authors, and I am the only one with two affiliations, though I share one of them with the rest of the authors. After some quick googling on how to handle this, I came up with the following snippet of code that should have worked.
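The snippet itself isn’t reproduced here, but for context, one common way to express this situation in LaTeX is the `authblk` package, which lets each author reference numbered affiliations. The names and affiliations below are placeholders, not the actual author list:

```latex
\documentclass{article}
\usepackage{authblk}

% One author holds affiliations 1 and 2; the rest hold only affiliation 1.
\author[1,2]{First Author}
\author[1]{Second Author}
\affil[1]{Shared Department, Shared University}
\affil[2]{Second Affiliation}

\begin{document}
\maketitle
\end{document}
```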
I came across a really interesting piece of C++ code today, hidden in this list of obscure C++ … It includes an implementation of some of the …
Today, NVIDIA announced GameWorks, a new program to help game developers use NVIDIA hardware effectively. The thing that excites me most about GameWorks is the inclusion of OptiX in a program that directly targets game developers. OptiX is a ray tracing environment developed by people I know from my studies at Utah; it has previously been used to help develop rendering engines for design and production environments. This announcement means that the important global illumination features that work more naturally in ray tracing will be more readily available to game developers.
Today I had the fortunate presence of mind to record videos of a couple of my students’ lab projects that I found particularly interesting. In this post I’ll embed these videos for anyone interested in watching them. To explain briefly, the assignment was to program the PIC32 microcontroller in assembly to play a song through a speaker. Optionally, students were allowed to play a song of their choosing after storing that song in program memory.
For my other blog post today, I decided I wanted to embed the YouTube videos I recorded earlier. Since the rest of my website is fully responsive now, I thought it would be best to find a way to make embedded videos responsive to the browser window as well. This is one of the main reasons I started using Bootstrap in the first place, and jekyllbootstrap for my website. Luckily, I came across the excellent post Todd Motto wrote introducing his fluidvids script.