In the present-day framework of interactive, intelligent computing, efficient human–computer interaction is assuming utmost importance, and gesture recognition is one approach in this direction. It is the process by which the gestures made by the user are recognized by the receiver. Gestures are expressive, meaningful body motions involving physical movement of the fingers, hands, arms, head, face, or body with the intent of
- Conveying meaningful information, or
- Interacting with the environment.
They constitute an interesting small subspace of possible human motion. A gesture may also be regarded as a compression technique: information is condensed into a movement, transmitted, and subsequently reconstructed by the receiver. Gesture recognition can be seen as a way for computers to begin to understand human body language, building a richer bridge between machines and humans than primitive text user interfaces or even graphical user interfaces (GUIs), which still limit the majority of input to the keyboard and mouse. It enables humans to interface with the machine (HMI) and interact naturally without any mechanical devices, and it can be conducted with techniques from computer vision and image processing.
Gesture recognition is a topic in computer science and language technology with the goal of interpreting human gestures via mathematical algorithms. Gestures can originate from any bodily motion or state but commonly originate from the face or hands; current focuses in the field include emotion recognition from the face and hand-gesture recognition. Many approaches use cameras and computer-vision algorithms to interpret sign language. Using gesture recognition, it is possible to point a finger at the computer screen and have the cursor move accordingly, which could potentially make conventional input devices such as mice, keyboards, and even touch screens redundant. In computer interfaces, two types of gestures are distinguished: offline gestures, which are processed after the interaction is finished (for example, drawing a symbol to activate a menu), and online gestures, which act as direct manipulations such as scaling and rotating.
The objective of the project is to control the direction of movement of a simple robot with hand gestures. This is accomplished with two major components: an Arduino microcontroller and an accelerometer. The basic working principle is that the accelerometer transmits hand-gesture measurements to the Arduino microcontroller, which processes them and tells the robot to move in the desired direction.
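As a sketch of this principle, the raw ADC reading from one accelerometer axis can be converted into an acceleration value before any tilt decision is made. The 5 V ADC reference and the datasheet-typical 1.65 V zero-g offset and 330 mV/g sensitivity used below are assumptions; the exact figures depend on the board's supply voltage and calibration.

```cpp
#include <cassert>
#include <cmath>

// Convert a raw 10-bit ADC reading from one ADXL335 axis into acceleration
// in g. The ADXL335 idles near 1.65 V at 0 g with a typical sensitivity of
// 330 mV/g (datasheet values); a 5 V ADC reference is assumed here.
double rawToG(int raw) {
    double volts = raw * 5.0 / 1023.0; // ADC counts -> volts
    return (volts - 1.65) / 0.33;      // remove zero-g offset, scale by sensitivity
}
```

On an Arduino, `raw` would come from `analogRead()` on the pin wired to the axis output; holding the hand level should then read near 0 g on the horizontal axes.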
The primary goal in designing a simple wireless gesture-controlled robot was to keep it as simple as possible, using minimal resources, while still achieving the objective of controlling the robot's movements with hand gestures. Based on these criteria, the following main components were used:
For the gesture controller,
1. ADXL335 3-axis accelerometer
2. ATMEGA328 microcontroller
3. HT12E 2¹² series encoder IC
4. 433MHz RF transmitter module
For the robot itself,
1. 433MHz RF receiver module
2. HT12D 2¹² series decoder IC
3. L293D motor driver IC
THE GESTURE CONTROLLER:
The gesture controller is designed to be worn on the wrist/hand for convenience of operation. Block-diagram representations and flow charts of the transmitter and receiver sections of the gesture-control system are shown in the accompanying figures.
To briefly summarize, the accelerometer measures the tilt angle of the hand and passes these values to the microcontroller as analogue signals, which are then converted to digital form, processed, and encoded by the encoder IC so that the 433 MHz transmission is robust against interference. Figure 3 is the flow chart of the gesture controller.
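The encoder/decoder pairing can be illustrated with a toy model: an HT12E frame carries 8 address bits set on its pins plus 4 data bits, and the matching HT12D only forwards data whose address agrees with its own pin setting. The struct and field widths below are a simplification for illustration, not the chips' actual serial format.

```cpp
#include <cassert>

// Toy model of the HT12E/HT12D address check: each 12-bit frame carries
// an 8-bit address plus 4 data bits (layout simplified for illustration).
struct Frame {
    unsigned char address; // set by the HT12E's address pins
    unsigned char data;    // 4-bit command in the low nibble
};

// Returns true and copies the data out only when the address matches,
// mirroring how the decoder discards foreign or noisy transmissions.
bool decode(const Frame& f, unsigned char myAddress, unsigned char& out) {
    if (f.address != myAddress) return false; // wrong address: discard
    out = f.data & 0x0F;
    return true;
}
```

In the real chips the address is set in hardware by tying the address pins high or low identically on both the encoder and decoder boards.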
Analysis: At the start, the gesture controller is switched on. The accelerometer measures the degree of tilt of the hand gestures; these measurements are analogue values that carry no meaning on their own. They are passed to the Arduino microcontroller, which is programmed to recognize specific ranges of analogue values in the stream it receives from the accelerometer and to assign specific functions to them. In the decision-and-control step, the microcontroller's comparison logic checks whether the incoming analogue value matches one of its preprogrammed ranges, and assigns to each match a function in the form of LOW or HIGH outputs. If the microcontroller cannot find a match, the value is discarded and it waits for newer measurements from the accelerometer; this loop continues until a match is found. When a matching value is received, the microcontroller assigns the appropriate function and passes the information in digital form to the encoder IC, which encodes it for transmission at a particular frequency (in this case 433 MHz). This breaks the loop.
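The decision step above might be sketched as follows, assuming the four HT12E data lines carry the command as a 4-bit pattern. The threshold bands and bit assignments are hypothetical; a reading that falls in no band yields all-LOW outputs, i.e. the controller simply waits for the next sample.

```cpp
#include <cassert>

// Map raw X/Y axis readings (10-bit ADC counts) to a 4-bit output pattern
// for the encoder's data pins: bit0 = forward, bit1 = backward,
// bit2 = left, bit3 = right. Thresholds and bit roles are assumptions.
unsigned decide(int x, int y) {
    if (x < 400) return 0b0001; // hand tilted forward
    if (x > 600) return 0b0010; // hand tilted backward
    if (y < 400) return 0b0100; // hand tilted left
    if (y > 600) return 0b1000; // hand tilted right
    return 0b0000;              // no match: all outputs LOW, keep sampling
}
```

On the real controller each bit would drive one digital output pin wired to an HT12E data input.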
The robot moves on a set of wheels attached to DC motors, which are controlled by a motor driver IC as depicted in the accompanying figure.
Information transmitted at 433 MHz is captured and sent to the decoder IC for decoding. The decoded information is passed on to the motor driver IC, which serves as an interface for operating the DC motors. The accompanying figure is the flow chart of the robot.
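The motor-driver interface can be modelled as mapping each decoded command to the four L293D input pins. The pin roles (IN1/IN2 for the left motor, IN3/IN4 for the right) and the turn-by-stopping-one-wheel scheme below are assumptions for illustration; the real mapping depends on the wiring.

```cpp
#include <cassert>
#include <string>

// States of the four L293D inputs; each motor spins forward when its
// first input is HIGH and its second LOW (assumed wiring).
struct Pins { bool in1, in2, in3, in4; };

// Map a decoded command to the L293D input states.
Pins driveFor(const std::string& cmd) {
    if (cmd == "FORWARD")  return {true,  false, true,  false};
    if (cmd == "BACKWARD") return {false, true,  false, true };
    if (cmd == "LEFT")     return {false, false, true,  false}; // right wheel only
    if (cmd == "RIGHT")    return {true,  false, false, false}; // left wheel only
    return {false, false, false, false};                        // stop
}
```

Stopping one wheel while driving the other pivots the robot toward the stopped side, which is the simplest differential-drive turning scheme.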
Analysis: At the start, the robot is switched on. The RF receiver module on the robot remains on standby, ready to accept any information transmitted over the air on its frequency channel; any information not on that particular frequency is ignored. In the decision-and-control step, the information received at 433 MHz is sent to the decoder IC, which commences decoding. The decoder expects information encoded by an encoder IC with the same address/data pattern as its own; if the received data cannot be decoded, or is not encoded at all, it is discarded and the decoder awaits the next transmission. This loop continues until the decoder receives information that matches its decoding process, whereupon it decodes it. The decoded information, which is the original digital data sent from the microcontroller to the encoder IC, is then passed to the motor driver, which moves the motors accordingly. At this point, the loop is broken.
We finalized the decision to make a gesture-controlled robot maneuvered by a hand glove mounted with the transmission circuit assembly. The circuit assembly consists of the accelerometer and Arduino board along with an RF transmitter, which together function as an input device. We decided on this project because we wanted a basic application of controlling a vehicle with the hand; since the controls are based on hand gestures, the robot is simple for any person to handle. The basic working principle of our robot is the passage of the accelerometer readings to the Arduino board. We use two axes of the accelerometer: one axis controls the speed in the forward or backward direction, and the other controls the turning mechanism.
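The two-axis scheme described above can be sketched as a simple mixing function, with one axis providing a signed speed and the other a signed turn value. The [-100, 100] ranges and the add/subtract mixing rule are illustrative assumptions, not the project's exact implementation.

```cpp
#include <algorithm>
#include <cassert>

// Mix a speed command (forward/backward axis) with a turn command
// (left/right axis) into per-wheel speeds. Inputs and outputs are in
// [-100, 100]; the sign gives the direction of rotation.
void mix(int speed, int turn, int& left, int& right) {
    left  = std::clamp(speed + turn, -100, 100); // turning right speeds up the left wheel
    right = std::clamp(speed - turn, -100, 100); // ...and slows the right wheel
}
```

With zero turn both wheels match the speed axis; tilting on the turn axis slows one wheel relative to the other, steering the robot while it keeps moving.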