"Controlling a Baxter Robot using Gaze and Gesture in a 3D Setup" is a group project developed for the Cobots Seminar 2015 at Saarland University.
The goal was, as the title suggests, to control a Baxter robot using gaze and gesture in a 3D setup while sitting at a table in front of the robot. For gesture recognition we use Leap Motion, for gaze recognition "EyeVius". These devices are connected to a laptop on which a Unity3D instance processes all the data and sends it to the Baxter robot via a UDP connection. Furthermore, we included speech recognition to make it easier for the user to give commands while concentrating on the point towards which they want Baxter's arm to move. With this setup it is possible to pick up objects from the table or the reachable space and drop them again on the table or in the reachable space.
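To illustrate the laptop-to-Baxter link, here is a minimal sketch of sending a command datagram over UDP. The message format shown (`ACTION;x;y;z`) is purely hypothetical, since the project's actual custom protocol is not documented here, and the sketch uses Python rather than the project's C# for a self-contained loopback demo.

```python
import socket

def encode_command(action, x, y, z):
    # Hypothetical plain-text message format standing in for the
    # project's custom protocol: an action word plus a 3D target point.
    return f"{action};{x:.3f};{y:.3f};{z:.3f}".encode("utf-8")

def send_command(sock, addr, action, x, y, z):
    # UDP is connectionless: each command is a single datagram.
    sock.sendto(encode_command(action, x, y, z), addr)

if __name__ == "__main__":
    # Loopback demo standing in for the laptop -> Baxter connection.
    receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    receiver.bind(("127.0.0.1", 0))  # any free port

    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_command(sender, receiver.getsockname(), "MOVE", 0.42, -0.10, 0.25)

    data, _ = receiver.recvfrom(1024)
    print(data.decode("utf-8"))  # MOVE;0.420;-0.100;0.250
```

On the robot side, a matching receiver would parse each datagram back into an action and a target point; UDP keeps the latency low, which matters for interactive control.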
As part of a team of four, I was responsible for organisation, prototyping, gesture recognition, voice in- and output, and the communication between Unity3D and Baxter (UDP and a custom protocol). One of the main challenges was to find appropriate gestures which are easy to learn, intuitive, and do not interfere with each other. The other group members were mainly responsible for gaze input and the programming of Baxter.
My part of the project was developed with Unity3D, using C# as the scripting language. For the gesture recognition I worked with the Leap Motion SDK. The voice input was implemented with the help of the Windows speech API.
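As a flavour of what gesture recognition on fingertip data looks like, here is an illustrative sketch of pinch detection from thumb and index fingertip positions. The Leap Motion SDK actually exposes hand and finger positions (and even a built-in pinch strength); this Python reimplementation of the underlying idea is an assumption-laden toy, not the project's code, and the 30 mm threshold is an invented example value.

```python
import math

# Assumed threshold: fingertips closer than this count as a pinch.
PINCH_THRESHOLD_MM = 30.0

def is_pinch(thumb_tip, index_tip):
    # Positions are (x, y, z) tuples in millimetres, the unit in which
    # Leap Motion reports fingertip coordinates.
    return math.dist(thumb_tip, index_tip) < PINCH_THRESHOLD_MM

# Fingertips far apart: open hand, no pinch.
print(is_pinch((0, 0, 0), (80, 0, 0)))   # False
# Fingertips nearly touching: pinch detected.
print(is_pinch((0, 0, 0), (10, 5, 0)))   # True
```

In practice such a predicate would be debounced over several frames so that sensor jitter near the threshold does not toggle the gesture on and off.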
If you want to learn more about this project (for example if you would like to see some code samples) feel free to contact me.