Students on the STARMAC project at the Hybrid Systems Lab at UC Berkeley have used a hacked Microsoft Kinect as the guidance sensor for an autonomously navigating flying robot.
The attached Microsoft Kinect delivers a point cloud to the onboard computer via the ROS Kinect driver, which uses the OpenKinect/Freenect project's driver for hardware access. A sample consensus algorithm fits a planar model to the points on the floor, and this planar model is fed into the controller as the sensed altitude. All processing is done on the onboard 1.6 GHz Intel Atom-based computer, running Linux (Ubuntu 10.04).
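The article doesn't publish the team's code, but the floor-fitting step can be sketched with a standard RANSAC-style sample consensus plane fit. The function names, thresholds, and iteration count below are illustrative assumptions, not the project's actual parameters:

```python
import numpy as np

def fit_floor_plane_ransac(points, n_iters=200, inlier_thresh=0.02, seed=None):
    """Fit a plane n.p + d = 0 to a 3-D point cloud with RANSAC.

    points: (N, 3) array of depth points in meters (sensor frame).
    Returns (unit_normal, d) of the plane with the most inliers.
    NOTE: parameter values here are illustrative, not STARMAC's.
    """
    rng = np.random.default_rng(seed)
    best_inliers, best_model = -1, None
    for _ in range(n_iters):
        # Sample three points and form a candidate plane.
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                 # degenerate (near-collinear) sample
            continue
        normal /= norm
        d = -normal.dot(p0)
        # Count points within the inlier distance of the candidate plane.
        dist = np.abs(points @ normal + d)
        inliers = int((dist < inlier_thresh).sum())
        if inliers > best_inliers:
            best_inliers, best_model = inliers, (normal, d)
    return best_model

def altitude_from_plane(normal, d):
    # For a unit normal, |d| is the perpendicular distance from the
    # sensor origin to the plane -- i.e. the sensed altitude.
    return abs(d)
```

The sensed altitude is then just the perpendicular distance from the Kinect to the fitted floor plane, which RANSAC recovers robustly even when obstacles or noise contaminate part of the cloud.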
A VICON motion capture system provides the other necessary degrees of freedom (lateral and yaw) and acts as a safety backup to the Kinect altitude: if the altitude reading from the Kinect data drops out, the VICON-based reading is used instead. In this video, however, the safety backup was not needed.
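The dropout fallback described above amounts to a small sensor-selection rule. This is a minimal sketch under assumed names and an assumed timeout value (the project's actual dropout logic is not published):

```python
def fused_altitude(kinect_alt, kinect_age_s, vicon_alt, timeout_s=0.1):
    """Prefer the Kinect altitude; fall back to VICON on dropout.

    kinect_alt:   latest Kinect-derived altitude, or None if none received
    kinect_age_s: seconds since that reading arrived
    vicon_alt:    VICON-derived altitude (assumed always available)
    timeout_s:    illustrative staleness threshold, not STARMAC's value
    """
    if kinect_alt is not None and kinect_age_s < timeout_s:
        return kinect_alt
    return vicon_alt
```

A staleness check like this lets the controller keep running through brief Kinect dropouts without ever acting on an outdated altitude estimate.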