AURA is a music exploration game developed by Lun Cao and me as the final project for UCSC's Games and Playable Media Master's program. The behavior of most of the objects in the game is determined by the music playing in the background, which I wrote and designed to coexist with the game in harmony.
Homeward was a year-long project for UCSC's Games and Playable Media Master's program. It's a game about a wayward astronaut marooned on an unknown planet trying to get home.
JRPG Town Mayor Simulator 2014 is a prototype made as a final project for the Professional Development I class taught by Brenda Romero in UCSC's Games and Playable Media program. Players take the role of the mayor of a town in a JRPG, attempting to spread the word of their great and amazing town to their neighbors (who also happen to be mayors of their own towns).
RYTHMIC is a prototype made in Unity in 3 weeks (with a couple of weeks of prep work beforehand). It's a progressive music game featuring music I made under my artist name, Space Town.
Four//Four is a videogame made in the Unity game engine, utilizing the third-party tools 2D Toolkit and PoolManager. It's an abstract rhythm game featuring music from various independent and chiptune artists such as Henry Homesweet, Trash80, Decktonic, and A_Rival.
Ad Infinitum^3 is a videogame/theater piece created by Andy Muehlhausen that I did soundtrack and sound design work for. The piece involved an HTML5 page broadcast on a local network that users would use to control their avatar on a very big screen. We "performed" the piece by taking what would normally be scripted game events, such as cutscene transitions, and triggering them ourselves in sync with the music. The players interacted on screen with each other, the performers, and a dancer being tracked by a Kinect. The first performances were in May of 2013.
Eternities is a music-based video game written in Actionscript 3 using the Flixel framework. The game is scripted to correspond with a song named Eternities that I composed under my artist name Space Town Savior.
Personal [ space ] was a project by Jackson Callaway and me designed to explore the idea of interactive spaces when the users of such a space are neither motivated by nor attentive to them. The project involved an Arduino board reading ultrasonic range finders, whose readings controlled various parameters of a dubstep "wobble" generated in Max/MSP. We placed this installation in various areas of the UCSD campus and recorded the interactions that users had with it. We found that without a visual cue, students on a college campus have little incentive to interact with an object, as there is no apparent connection between the object and the sound emitted. Once we added a visual cue, people were much more inclined to interact with the space. This project was presented as the final project for a class about gesture-based interaction with computers.
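The original Max/MSP patch isn't preserved here, but the sensor-to-sound mapping can be sketched in Python. This is a hypothetical illustration of the general idea (clamp a noisy range reading, normalize it, and map it inversely to a wobble LFO rate so that closer objects produce a faster wobble); the actual parameter ranges and curves in our patch may have differed.

```python
def distance_to_wobble_rate(distance_cm, min_cm=5.0, max_cm=200.0,
                            min_hz=0.5, max_hz=8.0):
    """Map an ultrasonic range-finder reading to an LFO rate in Hz.

    Closer objects -> faster wobble. All numeric ranges here are
    illustrative, not the values from the original installation.
    """
    d = max(min(distance_cm, max_cm), min_cm)   # clamp sensor noise/outliers
    t = (d - min_cm) / (max_cm - min_cm)        # normalize to 0..1
    return max_hz - t * (max_hz - min_hz)       # invert: near -> fast

print(distance_to_wobble_rate(5))    # closest reading -> 8.0 Hz
print(distance_to_wobble_rate(200))  # farthest reading -> 0.5 Hz
```

In the installation itself the Arduino streamed readings over serial, and a patch on the receiving end performed a mapping of this kind before driving the wobble oscillator.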
This was a paper I worked on with Joshua Lewis and David Kirsh. The paper is an exploration of raw player input as a unit of study. Because replays in StarCraft are actually time-stamped lists of actions that are "performed" by the game when the replay is loaded, we believed that decoding these replays and running the data through analysis techniques would yield quantifiable results. We took a large corpus of player replays from various tournaments, including WCG, and used software developed by Andras Belicza to read the proprietary file format that replays are recorded in. We found a correlation between aspects of a player's play, such as APM (actions per minute) and SVA (spatial variance of actions), and whether the player won or lost. This paper was presented as a poster at the CogSci 2011 conference.
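The two metrics mentioned above are simple to compute once a replay has been decoded into a list of time-stamped, positioned actions. The sketch below is an illustration, not the paper's actual analysis code: it assumes each action decodes to a `(timestamp_seconds, x, y)` tuple, computes APM as total actions over the game length, and computes SVA as the combined variance of the action coordinates.

```python
import statistics

def apm(actions, game_length_seconds):
    """Actions per minute over the full game."""
    return len(actions) / (game_length_seconds / 60.0)

def sva(actions):
    """Spatial variance of actions: variance of x plus variance of y."""
    xs = [x for _, x, _ in actions]
    ys = [y for _, _, y in actions]
    return statistics.pvariance(xs) + statistics.pvariance(ys)

# Toy "decoded replay": (timestamp_seconds, x, y) per action.
replay = [(0.5, 10, 20), (1.0, 12, 22), (2.0, 300, 40), (3.0, 11, 21)]
print(apm(replay, 60))  # 4 actions in one minute -> 4.0
print(sva(replay))      # large: one far-off action dominates the variance
```

In the study, metrics like these were computed per replay and then correlated against match outcome across the corpus.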
Singularity is an audiovisual performance created with Patrick Trinh and Kris Calabio using the Source Engine. It was created as part of a class at UCSD taught by Brett Stalbaum, and performed as the final project for the class. The performance involved a level created in the Hammer Editor, featuring textures made entirely of text that composed a spoken-word piece. The piece was "performed" by a randomly selected audience participant (in this case, the professor himself), who navigated the space and read the words aloud. The performer's voice had reverb and delay added to create a slow, surreal effect. At the same time, Kris Calabio and I performed a music piece arranged for the original Game Boy (via LSDJ) and guitar. We had composed the piece so that its structure corresponded with the rooms of the level.