What time is it

Rotation: what is it?

The window’s rotation mechanism made it too easy for players to get their hands crossed up. So, I altered the form of the mechanism. Now it is a circular disc with only one handle.


But I was running into a bug where, when rotating the window, the window camera itself would sometimes become jittery. I concluded that this was caused by the sharp variance in rotation values that comes with letting the user manually rotate the window – even though it snaps to a right angle once they let go, they have full control while grabbing the handle.


So, I decided to restrict the player’s direct control to only rotating the disc, then smoothly Slerp the window’s rotation to match the disc’s rotation once the player releases the handle.
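The release behavior boils down to snapping the grab angle to the nearest right angle, then easing toward it over several frames. Here's a rough sketch of that math in Python (the function names are mine, and this is the 1D analogue of what Unity's Quaternion.Slerp does in 3D, not the project's actual code):

```python
def snap_to_right_angle(degrees):
    """Snap an arbitrary rotation to the nearest multiple of 90 degrees."""
    return (round(degrees / 90.0) * 90) % 360

def slerp_angle(current, target, t):
    """Shortest-path interpolation between two angles.
    t is the interpolation fraction per step (0..1)."""
    diff = (target - current + 180) % 360 - 180  # wrap into [-180, 180)
    return (current + diff * t) % 360

# Player releases the handle at 78 degrees; the window eases toward 90.
angle = 78.0
target = snap_to_right_angle(angle)  # 90
for _ in range(10):
    angle = slerp_angle(angle, target, 0.3)
```

Doing the interpolation only after release keeps the camera's rotation values smooth, since the sharp frame-to-frame variance from the player's hand never reaches the window itself.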


Next, I decided to prototype some visual effects for the tunnel interaction. We knew we wanted to use live-action video, so I first took a look at playing video files in Unity. I'd heard that Unity has a revamped VideoPlayer system in the 5.6 beta, but for now I'm just using the old MovieTexture system (since we're on 5.5).

If you try to import a video file into Unity (with 5.5 and below, at least – not sure about 5.6), it will try to convert it into an Ogg Theora video file (.ogv) using Apple QuickTime. Not a big deal if you're on a Mac and have QuickTime, but Apple has stopped supporting the Windows version of QuickTime. I'm working on Windows currently, and even if I weren't, I'd prefer not to use QuickTime if I don't have to.

Fortunately, if you convert the video file to an .ogv yourself, you don't have to deal with any QuickTime shenanigans. So, I found some stock footage of a bird, converted it to an .ogv, and brought it into Unity. The movie texture was very easy to set up.

I wanted to apply some effects to the video. Drew suggested that I use a convolution filter and showed me an example of such a filter in glsl. A convolution filter works by taking a pixel in an image, sampling the surrounding pixels, and combining their colors into one output color as a weighted sum, with the weights given by a kernel. By adjusting the weights you can get common image-editor effects like edge-detect and emboss, and beyond.

The same technique works in a fragment shader: instead of pixels in an image, you sample the neighboring texels around each fragment.
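To make the weighted-sum idea concrete, here's a minimal convolution in plain Python rather than shader code (the edge-detect kernel is a standard one, not necessarily the exact filter from Drew's example):

```python
# 3x3 convolution on a grayscale "image" – the same operation a
# fragment shader performs by sampling neighboring texels.
EDGE_DETECT = [
    [-1, -1, -1],
    [-1,  8, -1],
    [-1, -1, -1],
]

def convolve_pixel(image, x, y, kernel):
    """Weighted sum of the 3x3 neighborhood around (x, y).
    Samples outside the image are clamped to the border."""
    h, w = len(image), len(image[0])
    total = 0
    for ky in range(3):
        for kx in range(3):
            px = min(max(x + kx - 1, 0), w - 1)
            py = min(max(y + ky - 1, 0), h - 1)
            total += image[py][px] * kernel[ky][kx]
    return total

# A flat region gives 0 (no edge); a bright pixel on a dark
# background gives a strong response.
flat = [[5] * 3 for _ in range(3)]
spot = [[0] * 3 for _ in range(3)]
spot[1][1] = 10
```

In a shader the loop runs implicitly per fragment, and the neighbor offsets are scaled by the texture's texel size.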

After converting the example to Unity Cg with the help of Drew and John, I was able to apply some cool effects to the video.


I found this playlist of simulated mechanisms created in Autodesk Inventor. Feeling inspired, I worked on a little mechanism as part of the window interaction that ended up becoming a clock interaction. (in the below gif you can also see a convolution filter effect applied to the window pane)


Before, the building in the desert was a train station. We decided to change it so that the optional train-wreck interaction in the desert would feel more out of place. Now it will be a clock tower.

I also made a little robot with Unity primitives and animated a dance for it.





This week I primarily focused on getting the magic window ready for testing.

The first thing I had to change was the rotation mechanism. The placeholder was just a flat disc on a pedestal with a handle. I ended up changing it to a two-handed valve.

Then, I set up a test interaction.


As can be seen above, I set up and tested the logistics of a basic “pressure plate” interaction, where the player would have to pick up a ball (in the window realm) and pull it out of the window to drop it on a pressure plate.

However, through further talks with Levi, I was reminded of one of the primary design pillars of the interactions – that they have discrete states. So, we decided to have the window snap to right angles, rather than what is seen in the above gif (where you can freely rotate the window).

The next step was to package up the window so it would be ready for QA. The plan was to create a desert “playground” scene with some of Alexis’ desert art, and then to set up the magic window and a few toy interactions within the scene.

the desert ruins, and the window within

It took a bit of work to get everything working again, but now we've got a nice lil desert scene with the magic window, ready for testing. Currently, the only things you can interact with in the window realm are a few spheres that can be picked up, and will fall to the ground (and bounce) when dropped.


Another thing I did this week: I made a ghost shader. This will be used for ghosts in the desert. These ghosts will be visible from far away, but will disappear if the player approaches them.

This is accomplished by calculating the world position of each fragment in the fragment shader, finding the distance from that fragment to the player, and setting the alpha based on this distance.
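The fade itself is just a remapped, clamped distance. A sketch of the math in Python (the fade radii here are made-up values; the real shader does the equivalent in Cg, where the clamp is saturate()):

```python
import math

def ghost_alpha(fragment_pos, player_pos, fade_near=5.0, fade_far=15.0):
    """Fully transparent within fade_near units of the player,
    fully opaque beyond fade_far, with a linear ramp in between."""
    dist = math.dist(fragment_pos, player_pos)
    t = (dist - fade_near) / (fade_far - fade_near)
    return min(max(t, 0.0), 1.0)  # clamp to [0, 1], like Cg's saturate()

# A ghost 3 units away is invisible; one 20 units away is fully visible.
near = ghost_alpha((0, 0, 0), (0, 0, 3))   # 0.0
far = ghost_alpha((0, 0, 0), (0, 0, 20))   # 1.0
```

In the shader, fragment_pos comes from the interpolated world position and player_pos is passed in as a uniform each frame.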



A magic window

Drums are an instrument, and about a week and a half ago I made a drum interaction to familiarize myself with building an actual interaction. I had heard we wanted to do some sort of interaction where the player could play a drum. Fortunately, the framework created by Levi and Eric made it very simple to add new interactions.



At one of the meetings we divvied up some of the necessary interactions that we knew were going to be in the game, while the design for the others could be fleshed out. I started working on the “Magic Window” interaction, an important part of the desert area. The initial idea was to have some objects that would be hidden when viewed in the open, but visible when viewed through a magic window.

After getting some advice from a fellow game programmer, I attempted the depth buffer route. Essentially, you can have cameras in Unity render a depth texture, which a shader can sample to determine how far the objects the camera renders are from it. So, going off of this, I had two cameras: one that could see only the magic window, and one that could see only the hidden objects. Then, I compared each object's depth to the window's to see if it should be rendered. That produced this effect:


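The per-fragment test in that depth-buffer approach boils down to comparing two depth samples. In plain Python rather than shader code (the names are mine, not from the project):

```python
def hidden_object_visible(object_depth, window_depth):
    """Decide whether a fragment of a hidden-layer object should be drawn.
    Both depths are distances from the camera at the same screen pixel;
    smaller means closer. The object shows only where the magic window
    sits between it and the camera."""
    return window_depth < object_depth

# Window 4 units away, hidden object 10 units behind it: visible.
through_window = hidden_object_visible(10.0, 4.0)   # True
# Object in front of the window: stays hidden.
in_the_open = hidden_object_visible(3.0, 4.0)       # False
```

In the shader this comparison gates the fragment's alpha, using the window camera's depth texture sampled at the fragment's screen position.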
However, the problem with using the depth buffer is that several of our materials are either transparent, or not in the correct render queue to be rendered to the depth buffer. Supposedly, there’s a way to force such objects to render to the depth buffer, but I couldn’t figure it out. Plus, it got a bit messy since we’re using several cameras in the game.

It seemed like there should be an easy way to do this. After getting some advice to look for a more Portal-style approach, I did some research and found a free script/shader combo that was essentially what we were looking for. Basically, with this approach, we have a duplicate of the room containing the magic window. Then, there is a second camera that tracks the player camera and renders to a RenderTexture, which is applied to the magic window’s material. This method is built off of a reflection script/shader on the Unity community wiki.

After incorporating that into my testing scene for the Magic Window, it took some modification to get everything to work right. But, I ended up with this:


The magic window idea was developed further at later meetings. Primarily, the magic window will look into the past (or some other realm layered on top of the current one). Additionally, the window will be able to rotate.

Implementing rotation took some more adjustments, but finally, it’s looking pretty good. The base is there, so now we can start building more complex interactions around it. Right now there’s just a very basic interaction where if you drag an object out from behind the window, it becomes visible.