Kinect is cool and all, but it has one major weakness: it can't track things outdoors in daylight. It senses depth by projecting infrared laser light, and the sun floods the scene with its own infrared, essentially blinding the Kinect.
For one of our HAX projects we need to track people walking past our windows. At night a Kinect would be perfect for this: its depth data would let us tell a person apart from their shadow, and ignore reflections from passing cars, something a single camera can't do.
We got around this by using two webcams, aligned and spaced a couple of feet apart.
The cameras are suspended about 20 feet up, looking through the window at the sidewalk. You can see our hacked-together prototype below:
Using computer vision software (OpenCV) we compare the left and right images and measure the pixel offset between matching features. The result is called a disparity map: depth is encoded as shades of gray, and the nearer an object is, the larger its disparity and the brighter it appears. From there we pick out the white blobs, which are people walking along the sidewalk (or, as you might notice below, a tree branch at head level, though better calibration can filter that out).
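If you're curious what that left/right comparison boils down to: our real pipeline uses OpenCV's stereo matchers, but here's a toy sum-of-absolute-differences block matcher in plain NumPy that shows the idea. For each pixel in the left image you slide a small patch along the same row of the right image and record the horizontal shift (the disparity) with the best match; near objects shift more than far ones. Depth then falls out as roughly focal length times baseline (our couple-of-feet camera spacing) divided by disparity. All the names and numbers below are made up for illustration:

```python
import numpy as np

def disparity_sad(left, right, block=5, max_disp=16):
    """Tiny sum-of-absolute-differences (SAD) block matcher.

    For each left-image pixel, slide a block x block patch along the
    same row of the right image and keep the shift with the lowest
    SAD score. Bigger disparity = closer = brighter in the map.
    """
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1].astype(int)
            best_sad, best_d = np.inf, 0
            for d in range(max_disp):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1].astype(int)
                sad = np.abs(patch - cand).sum()
                if sad < best_sad:
                    best_sad, best_d = sad, d
            disp[y, x] = best_d
    return disp

# Synthetic stereo pair: a textured scene whose features all sit
# 4 pixels to the left in the right image (i.e. disparity 4).
rng = np.random.default_rng(0)
left = rng.integers(0, 256, size=(40, 60), dtype=np.uint8)
right = np.roll(left, -4, axis=1)

d = disparity_sad(left, right)
print(d[20, 30])  # recovered shift at a central pixel
```

In practice you'd reach for `cv2.StereoBM_create` instead, which does the same thing orders of magnitude faster and handles rectification-related details this sketch ignores.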
The results aren't quite as sexy as the Kinect's, but at least we're not limited to a dark room!
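For completeness, the blob-picking step mentioned above can be sketched too: threshold the disparity map to keep "near" pixels, group them into connected components, and drop anything too small to be a person. Again, this is just an illustrative sketch (OpenCV's `cv2.findContours` or `cv2.connectedComponents` would do this for real), and the threshold and area numbers are invented:

```python
import numpy as np
from collections import deque

def near_blobs(disp, thresh, min_area=4):
    """Extract connected high-disparity regions from a disparity map:
    the white blobs that are closer to the cameras than the background.
    Components smaller than min_area (specks, leaves) are discarded."""
    mask = disp >= thresh
    seen = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    blobs = []
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not seen[y, x]:
                # Flood-fill one 4-connected component with BFS.
                queue, comp = deque([(y, x)]), []
                seen[y, x] = True
                while queue:
                    cy, cx = queue.popleft()
                    comp.append((cy, cx))
                    for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                                   (cy, cx + 1), (cy, cx - 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                if len(comp) >= min_area:
                    blobs.append(comp)
    return blobs

# Toy disparity map: flat background, one person-sized blob,
# plus a 2-pixel speck (think head-level tree branch) that gets dropped.
disp = np.zeros((20, 30), dtype=np.float32)
disp[5:15, 8:13] = 12.0                 # "person"
disp[3, 25] = disp[4, 25] = 12.0        # "branch"

blobs = near_blobs(disp, thresh=8)
print(len(blobs))  # only the person survives
```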