Adafruit's recent success in bringing the Kinect, Microsoft's motion sensor for the Xbox 360, to the open source community has opened up a host of possibilities for researchers with limited funding. In the latest of a string of Kinect-enabled research showcases, MIT researcher Philipp Robbel has mated the Kinect sensor with the iRobot Create robot kit.
Thanks to the Kinect, Robbel's creation gains spatial awareness: it builds a real-time 3D reconstruction of its surroundings, giving it something close to true sight. It then uses its onboard wireless module to send that data back to a host computer. It is easy to spot a host of engineering, military and research possibilities for a relatively inexpensive robot with these features.
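To give a rough sense of what the Kinect hands a robot like this, here is a minimal sketch of how a raw depth frame might be back-projected into a 3D point cloud using a pinhole camera model. The intrinsic values, the `depth_to_point_cloud` helper and the stand-in frame are illustrative assumptions, not Robbel's code; on the real robot the frame would come from an open source Kinect driver such as libfreenect.

```python
import numpy as np

# Approximate Kinect depth-camera intrinsics (assumed, uncalibrated values).
FX, FY = 594.2, 591.0      # focal lengths in pixels
CX, CY = 339.5, 242.7      # principal point in pixels
WIDTH, HEIGHT = 640, 480   # Kinect depth resolution

def depth_to_point_cloud(depth_m):
    """Back-project a depth frame (metres per pixel) into an Nx3 point cloud.

    Zero-valued pixels mark points the sensor could not measure and are dropped.
    """
    v, u = np.indices(depth_m.shape)      # pixel row/column grids
    z = depth_m
    valid = z > 0
    x = (u - CX) * z / FX                 # pinhole back-projection
    y = (v - CY) * z / FY
    return np.dstack((x, y, z))[valid]    # shape (N, 3)

if __name__ == "__main__":
    # Stand-in frame; in practice this would come from the Kinect driver
    # running on the robot's onboard computer.
    fake_depth = np.random.uniform(0.5, 4.0, (HEIGHT, WIDTH))
    cloud = depth_to_point_cloud(fake_depth)
    print(cloud.shape)  # points ready to stream to the host for mapping
```

A point cloud like this, streamed over the wireless link, is the kind of data a host computer could stitch into a map of the robot's surroundings.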
The robot can also recognize human gestures, so it can be controlled in the field by multiple people without separate controllers. Robbel's research aims to build teams of robots that work cooperatively on search-and-rescue missions. The idea is for a robot to locate survivors and then follow their leads (using the Kinect's human-detection capabilities) to reach others.
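Gesture control of this sort typically reduces to simple rules over tracked skeleton joints. The toy classifier below is a hedged sketch of the idea only: the `Joint` type, the command names and the thresholds are made-up assumptions, and real joint positions would come from skeleton-tracking middleware rather than this code.

```python
from dataclasses import dataclass

@dataclass
class Joint:
    """A tracked skeleton joint in sensor coordinates (metres), y pointing up."""
    x: float
    y: float
    z: float

def classify_gesture(head: Joint, left_hand: Joint, right_hand: Joint) -> str:
    """Toy rule-based classifier mapping hand positions relative to the head
    to simple robot commands. Command names and rules are illustrative only."""
    if left_hand.y > head.y and right_hand.y > head.y:
        return "stop"           # both hands raised above the head
    if right_hand.y > head.y:
        return "follow_me"      # right hand raised
    if left_hand.y > head.y:
        return "go_back"        # left hand raised
    return "idle"

if __name__ == "__main__":
    # Joint positions are invented for the example.
    head = Joint(0.0, 1.6, 2.0)
    print(classify_gesture(head, Joint(-0.3, 1.1, 2.0), Joint(0.3, 1.8, 2.0)))
    # -> "follow_me"
```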
Robbel's KinectBot is far from a finished product; it is more of a proof of concept, pending further tests of its 3D imaging capabilities. If and when it does make the cut, however, it just might be the first proper implementation of the Kinect in a serious engineering application.
Robots have thus far mostly relied on radar, laser and ultrasonic sensors paired with simple algorithms for navigation. Anything fancier involving camera-based imaging has demanded a level of complexity out of reach for most robotics research projects. That limitation has been a major hindrance to developing prototypes with true spatial awareness, which has long been the preserve of high-budget military and space research programs.