Figuring out what doesn't work is often the best way to design a solution that does.
As described in my previous post, we had established a basic concept for the Project Ace experience but had yet to determine how it would actually work. On the visual side, we needed to project video onto the surface of the ping pong table, with a computer powering the visuals, animation, and sound. On the data side, our challenge was finding the best means of tracking and collecting real-time data from gameplay. Capturing this data was the hard part.
The most common way to track an object is to use a camera and distinguish it from the background by pattern or color. Since we planned to project onto the table, this was not an option: the ball would reflect the projection and blend in with the table.
Kinect is able to overcome light and color blending by bouncing infrared light off a scene and returning a “depth map.” We thought we could combine the depth image from the Kinect with blob tracking in OpenCV (Open Source Computer Vision) to track the ball. While Kinect was able to distinguish the ball from the table, there was not enough depth resolution to use it for a 1.5” ball traveling at high speed (the sensor is tuned to sense a person standing a few feet away). Regretfully, Kinect was off the table.
[Depth map through Kinect’s eye.]
Next, we tried infrared-reflective paint in conjunction with infrared spotlights and an infrared camera. We’ve had success in the past using this technique to track “invisible” markers for an AR project. Although promising, this technique proved overly complicated, so we abandoned an infrared-based solution and returned to the drawing board.
We finally stumbled on MIT Media Lab’s PingPong++ project. Initiated way back in 1998, the folks at MIT produced an acoustic sensor solution that was simple, reliable and open source, collecting data through vibrations instead of visuals. They provided documentation on how to build the sensor circuit and software for calibration. While this solution would not allow us to track the ball in space, we decided that having accurate, reliable hit detection was more important.
[Standing on the shoulders of giants.]
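The core of acoustic hit detection is simple: watch the vibration signal from a piezo sensor and register a hit when it spikes past a threshold, ignoring the ringing that follows. The sketch below shows that principle; the threshold and debounce values are illustrative, not PingPong++'s calibrated settings.

```python
def detect_hits(samples, threshold=200, debounce=50):
    """Return sample indices where the rectified signal first crosses
    the threshold, ignoring ringing for `debounce` samples afterward."""
    hits = []
    last = -debounce
    for i, s in enumerate(samples):
        if abs(s) >= threshold and i - last >= debounce:
            hits.append(i)
            last = i
    return hits

# Synthetic sensor trace: quiet, a sharp spike (the hit), ringing, quiet.
trace = [0] * 100 + [600, -400, 300, -250, 150] + [0] * 100
print(detect_hits(trace))  # [100]
```

The debounce window matters: without it, the oscillations after each bounce would register as a burst of false hits.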
Despite having a detailed parts list and a circuit diagram, building the sensor circuit was quite challenging. We quickly learned a lot about electronics and became proficient at desoldering our mistakes. Once the sensors were completed, we installed them on the underside of the table and calibrated them for accuracy. Now we could start building our game and focusing on the experience as a whole.
[Building, testing, refining…and testing again.]
With a clearer sense of the trackable data, the concept of the experience began to take shape as a game. We altered the mechanics of traditional ping pong by overlaying a grid of tiles, which players had to “break” to win. While the affordances of ping pong remained, the players’ accuracy now outweighed power and speed.
[Our final design concept, aptly named Break Pong.]
[This final motion study demonstrates the finalized look and feel, along with the basics of gameplay. Incidentally, Alex and Lofton were playing a normal game of ping pong during the filming.]
Staying true to the open source and non-proprietary nature of PingPong++, we decided to use Processing to create the game software. We didn’t have much experience developing a full-fledged game in Processing, but it proved to be an intuitive platform. Within a day we had a working prototype of one level, the projection displaying proportionally on the surface of our table, and sensors that performed with “relative” accuracy. Everything worked, but more importantly, players were having fun.
[Testing an early build. Lofton and Corey knew what game they were playing.]
My final post in the series will discuss the completed gameplay, process and what comes next for Project Ace.