Neuro Gesture for Kinect
The following package is a Neural Network based Gesture Recognition package. It uses a Feed-Forward Neural Network, which is trained by a Back-Propagation learning algorithm. The data being collected is the position of the right hand in 3-D space, captured with the help of the Kinect sensor.
The package consists of three parts:
- Generating dataset from Kinect for Gesture Recognition.
- Training the dataset, with a Feed-Forward Neural Network.
- Predicting the gestures being given as inputs.
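To make the training stage concrete, here is a minimal NumPy sketch of a feed-forward network trained with back-propagation on toy data. This is not the package's actual code (which uses PyBrain); the layer sizes, learning rate, and toy features are all illustrative.

```python
import numpy as np

# Minimal sketch of a feed-forward network trained with back-propagation.
# Illustrative only: the real package uses PyBrain, and its inputs are
# sequences of right-hand (x, y, z) positions rather than this toy data.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy "gesture" feature vectors and one-hot class targets.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
T = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])

W1 = rng.normal(scale=1.0, size=(2, 8))   # input -> hidden weights
W2 = rng.normal(scale=1.0, size=(8, 2))   # hidden -> output weights
lr = 0.5                                  # learning rate

for _ in range(5000):
    H = sigmoid(X @ W1)                   # forward pass
    Y = sigmoid(H @ W2)
    dY = (Y - T) * Y * (1 - Y)            # backward pass (squared error)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ dY                   # gradient-descent updates
    W1 -= lr * X.T @ dH

# Predicted class = output unit with the highest activation.
pred = np.argmax(sigmoid(sigmoid(X @ W1) @ W2), axis=1)
print(pred)
```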
Topics
Subscribed
/skeleton_markers
Published
/neuro_gest_kinect/gesture
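The node reads hand positions from /skeleton_markers and publishes the recognized gesture name on /neuro_gest_kinect/gesture. A rough sketch of the kind of processing in between is buffering recent right-hand (x, y, z) samples into a fixed-length feature vector for the network; the class name, window size, and simulated positions below are illustrative, not the package's actual code.

```python
from collections import deque

# Sketch: buffer the last N right-hand (x, y, z) samples and flatten them
# into one feature vector. Illustrative only; the real node obtains these
# points from the /skeleton_markers topic.

N = 5  # samples per gesture window (the package's actual window may differ)

class GestureBuffer:
    def __init__(self, n=N):
        self.points = deque(maxlen=n)  # oldest samples drop off automatically

    def add(self, x, y, z):
        self.points.append((x, y, z))

    def ready(self):
        return len(self.points) == self.points.maxlen

    def feature_vector(self):
        # Flatten [(x, y, z), ...] -> [x0, y0, z0, x1, y1, z1, ...]
        return [c for p in self.points for c in p]

buf = GestureBuffer()
for i in range(7):                 # simulated stream of hand positions
    buf.add(0.1 * i, 0.2 * i, 0.0)
print(buf.ready(), len(buf.feature_vector()))
```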
Getting Kinect Ready
My friend Malla Srikanth has given a brief overview of installing the Kinect drivers here, along with installing PCL for ROS. But below is how I did it.
First of all, download the suitable drivers from here. Start by installing the suitable OpenNI driver, preferably the latest one. The OpenNI drivers are needed for the functioning of packages like openni_launch, software like RTABMap, and libraries like PCL.
Once the tar file is downloaded, go to the directory where it was downloaded and uncompress it:
$ tar -zxvf filename.tar.gz
or simply double-click it to open it with Archive Manager. Then install the driver:
$ sudo ./install.sh
Ok! Next, download the NITE middleware from the same place.
Again, uncompress it and install it:
$ tar -zxvf filename.tar.gz
$ sudo ./install.sh
Now, it's time to install openni_launch for ROS. Simply type
$ sudo apt-get install ros-<rosdistro>-openni-launch
At this point, to check that everything is installed correctly, just connect the Kinect and type
$ roslaunch openni_launch openni.launch
If the launch file runs, then all the drivers are working fine.
Next we install the skeleton_markers package. Just clone it into your catkin_ws/src folder using the command
$ git clone https://github.com/pirobot/skeleton_markers.git
Then
$ cd ..
$ catkin_make
Then to see it in action, just type the command
$ roslaunch skeleton_markers markers.launch
Pybrain
Once the Kinect is setup, PyBrain must be installed. The instructions for that can be seen here.
neuro_gesture_kinect
So, with everything in place, it is time to clone the neuro_gesture_kinect package into your catkin_ws/src folder.
$ git clone https://github.com/parhartanvir/neuro_gesture_kinect.git
Then build the workspace:
$ cd ..
$ catkin_make
Then go to the scripts folder of the package and make the nodes executable
$ chmod +x gesture_kinect.py
$ chmod +x gesture_rec_kinect.py
$ chmod +x trainer.py
OK! Once that is done, it's time to test the package. The data_sets folder already contains datasets and trained Neural Networks for two gestures:
- Circle
- Square
So, to test the package, just run
$ roslaunch skeleton_markers markers.launch
Then open a terminal and change the directory to the data_sets folder in the package. Then run
$ rosrun neuro_gesture_kinect gesture_rec_kinect.py
Then, just follow the steps given in the terminal. To see the recognized gesture, just echo /neuro_gest_kinect/gesture:
$ rostopic echo /neuro_gest_kinect/gesture
Making Own Dataset
You can make your own dataset with any number of gestures and training examples. To do that, simply run the node
$ rosrun neuro_gesture_kinect gesture_kinect.py
It will ask you to input a directory for storing datasets. Enter the path of the folder where you would like to store the datasets, then simply follow the given instructions.
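One detail worth knowing when recording your own gestures: different recordings of the same gesture take different amounts of time, so a fixed-size network input generally requires resampling each recording to a fixed number of points. The sketch below shows one simple way to do that with linear interpolation; it is illustrative only, and the package's own preprocessing may differ.

```python
# Sketch: resample a variable-length gesture (a list of (x, y, z) points)
# to exactly n points using linear interpolation. Illustrative only.

def resample(points, n):
    if len(points) == 1:
        return points * n
    out = []
    for i in range(n):
        t = i * (len(points) - 1) / (n - 1)  # fractional source index
        j = int(t)
        f = t - j
        if j + 1 < len(points):
            p, q = points[j], points[j + 1]
            # Interpolate each of the three coordinates.
            out.append(tuple(p[k] + f * (q[k] - p[k]) for k in range(3)))
        else:
            out.append(points[j])            # last point, nothing to blend
    return out

gesture = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0), (2.0, 0.0, 0.0)]  # 3 samples
fixed = resample(gesture, 5)                                   # -> 5 samples
print(len(fixed), fixed[0], fixed[-1])
```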
Then, to train on the dataset, change the directory in the terminal to the folder that you just created. Then run
$ rosrun neuro_gesture_kinect trainer.py
Then in the same directory run
$ rosrun neuro_gesture_kinect gesture_rec_kinect.py
Then follow the instructions given in the terminal to make gestures with your right hand. Have fun!
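For reference, the recognition step ultimately reduces to picking the output unit with the highest activation and mapping its index to a gesture name. A tiny illustrative sketch, using the two bundled gestures (the activation values here are made up):

```python
# Sketch: map network output activations to a gesture name.
# The gesture list matches the bundled datasets (Circle, Square);
# the activation values are invented for illustration.
GESTURES = ["Circle", "Square"]

def decode(activations):
    best = max(range(len(activations)), key=lambda i: activations[i])
    return GESTURES[best]

print(decode([0.12, 0.93]))  # highest activation is unit 1 -> "Square"
```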
Expected Updates
- Multiple Learning Algorithms to choose from.
- Use of Recurrent Neural Networks, for continuous input.