You might have seen Hand Gesture Controlled Robots, where the motion of a robot is controlled by the gestures of the hand. Another interesting project based on a similar principle is an Arduino based Hand Gesture Control of your computer or laptop.
Human Machine Interface or HMI is a system comprising hardware and software that enables communication and exchange of information between the user (human operator) and the machine.
We normally use LED Indicators, Switches, Touch Screens and LCD Displays as a part of HMI devices. Another way to communicate with machines like Robots or Computers is with the help of Hand Gestures.
Instead of using a keyboard, mouse or joystick, we can use our hand gestures to control certain functions of a computer like play/pause a video, move left/right in a photo slide show, scroll up/down in a web page and many more.
In this project, we have implemented a simple Arduino based hand gesture control where you can control a few functions of your web browser, like switching between tabs, scrolling up and down in web pages, switching between tasks (applications), playing or pausing a video and increasing or decreasing the volume (in VLC Player), with the help of hand gestures.
The project Arduino based Hand Gesture Control of Computer is implemented using Python. So, before proceeding with this project, I suggest you first work through the simple project CONTROLLING ARDUINO’S ON-BOARD LED WITH PYTHON.
In this project, you can find the basics of how to use the Arduino with Python, installing Python on your computer, setting up the Serial Library (important for communicating with Arduino) and the project codes.
So, I assume that you have already installed Python and pySerial (library for communicating with serial ports) and also worked on the basic project of blinking Arduino’s LED with Python.
Principle behind the Project
The principle behind the Arduino based Hand Gesture Control of Computer is actually very simple. All you have to do is use two Ultrasonic Sensors with Arduino, place your hand in front of the Ultrasonic Sensor and calculate the distance between the hand and the sensor. Using this information, relevant actions in the computer can be performed.
The position of the Ultrasonic Sensors is very important. Place the two Ultrasonic Sensors on the top of a laptop screen at either end. The distance information from Arduino is collected by a Python Program and a special library called PyAutoGUI will convert the data into keyboard click actions.
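Before wiring anything up, the echo-time-to-distance conversion that the Arduino will perform can be sanity-checked in a few lines of Python. The timing values here are assumptions, typical of HC-SR04 style modules, which report the round-trip time of an ultrasonic pulse:

```python
# Sketch of the distance math used on the Arduino side (assumed values):
# an HC-SR04 style sensor reports the round-trip time of a 40 kHz pulse,
# so distance = (echo time x speed of sound) / 2.

SPEED_OF_SOUND_CM_PER_US = 0.034  # ~340 m/s at room temperature

def echo_to_cm(duration_us):
    """Convert an echo pulse width in microseconds to distance in cm."""
    return duration_us * SPEED_OF_SOUND_CM_PER_US / 2

# A hand about 20 cm away returns an echo of roughly 1176 us:
print(round(echo_to_cm(1176), 1))
```

A pulseIn() timeout on the Arduino simply shows up as a duration of 0, which this formula maps to a distance of 0.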
The circuit diagram of the Arduino part of the project is shown in the following image. It consists of an Arduino UNO board and two Ultrasonic Sensors, and you can power all these components from the laptop’s USB port.
- Arduino UNO x 1
- Ultrasonic Sensors x 2
- USB Cable (for Arduino)
- Few Connecting Wires
- A Laptop with internet connection
Design of the Project
The design of the circuit is very simple, but the setup of the components is very important. The Trigger and Echo Pins of the first Ultrasonic Sensor (that is placed on the left of the screen) are connected to Pins 11 and 10 of the Arduino. For the second Ultrasonic Sensor, the Trigger and Echo Pins are connected to Pins 6 and 5 of the Arduino.
Now, coming to the placement of the sensors, place both the Ultrasonic Sensors on top of the laptop screen, one at the left end and the other at the right. You can use double sided tape to hold the sensors onto the screen.
Coming to the Arduino, place it at the back of the laptop screen and connect the wires from the Arduino to the Trigger and Echo Pins of the individual sensors. Now, we are ready to program the Arduino.
Programming Your Arduino to Detect Gestures
The important part of this project is to write a program for Arduino such that it converts the distances measured by both the sensors into the appropriate commands for controlling certain actions.
We have already seen a project called PORTABLE ULTRASONIC RANGE METER, where you can measure the distance of an object placed in front of an Ultrasonic Sensor with the help of Arduino.
A similar concept is used here to measure the distance of your hand in front of both the Ultrasonic Sensors in this project. The fun part starts after calculating the distance.
The hand gestures in front of the Ultrasonic Sensors can be calibrated so that they trigger different tasks on your computer. Before taking a look at the gestures, let us first see the tasks that we can accomplish.
- Switch to Next Tab in a Web Browser
- Switch to Previous Tab in a Web Browser
- Scroll Down in a Web Page
- Scroll Up in a Web Page
- Switch between two Tasks (Chrome and VLC Player)
- Play/Pause Video in VLC Player
- Increase Volume
- Decrease Volume
The following are the 5 different hand gestures or actions that I have programmed for demonstration purposes.
Gesture 1: Place your hand in front of the Right Ultrasonic Sensor at a distance between 15 cm and 35 cm, hold it there for a short duration and then move it away from the sensor. This gesture will Scroll Down the Web Page or Decrease the Volume.
Gesture 2: Place your hand in front of the Right Ultrasonic Sensor at a distance between 15 cm and 35 cm, hold it there for a short duration and then move it towards the sensor. This gesture will Scroll Up the Web Page or Increase the Volume.
Gesture 3: Swipe your hand in front of the Right Ultrasonic Sensor. This gesture will move to the Next Tab.
Gesture 4: Swipe your hand in front of the Left Ultrasonic Sensor. This gesture will move to the Previous Tab or Play/Pause the Video.
Gesture 5: Swipe your hand across both the sensors (Left Sensor first). This gesture will Switch between Tasks.
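The stay-versus-swipe distinction behind these gestures comes down to sampling a sensor over a short hold window and checking whether the hand is still in range afterwards. Here is a minimal Python sketch of that decision logic; the function name, the sample lists and the hold window length are hypothetical, for illustration only:

```python
IN_RANGE = (15, 35)  # cm band in which a hand counts as "present"

def classify(samples_cm, hold_samples=3):
    """Classify a burst of distance readings from one sensor.
    Returns 'stay' if the hand is still in range after the hold window,
    'swipe' if it entered the band but left before the window ended,
    and 'none' if nothing was in front of the sensor to begin with."""
    if not (IN_RANGE[0] <= samples_cm[0] <= IN_RANGE[1]):
        return 'none'
    last = samples_cm[min(hold_samples, len(samples_cm) - 1)]
    return 'stay' if IN_RANGE[0] <= last <= IN_RANGE[1] else 'swipe'

print(classify([25, 24, 26, 25]))  # hand held steady in the band
print(classify([25, 48, 0, 0]))    # quick swipe through the beam
```

The Arduino program below implements the same idea with a 300 millisecond timing window instead of a fixed sample count.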
Based on the above mentioned gestures, the following Arduino program has been written.
/* Gesture control program for controlling certain functions in a Windows PC
 * Code by BalaAppu
 * Website: www.electronicshub.org */

const int trigPin1 = 11; // trigger output pin (sensor 1, left)
const int echoPin1 = 10; // echo input pin (sensor 1, left)
const int trigPin2 = 6;  // trigger output pin (sensor 2, right)
const int echoPin2 = 5;  // echo input pin (sensor 2, right)

// variables used for distance calculation
long duration;
float r;
int distance1, distance2;
unsigned long temp = 0;
int l = 0;

/* We should not trigger both ultrasonic sensors at the same time, as the two
   sound waves can interact and give erroneous results. This function stores
   the distances (in cm) measured by the left and right sensors in the global
   variables distance1 and distance2. */
void find_distance(void)
{
  // left sensor: send a 10 us trigger pulse, then time the echo
  digitalWrite(trigPin1, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin1, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin1, LOW);
  // pulseIn() will not wait more than 5000 us for the echo, so it returns 0
  // beyond roughly 60 cm; this confines the gestures to a defined space and
  // gives a clean 0 when we remove our hands from in front of the sensors
  duration = pulseIn(echoPin1, HIGH, 5000);
  r = 3.4 * duration / 2; // echo time to distance (sound travels 0.34 mm/us)
  distance1 = r / 100.00; // distance from the left sensor in cm
  // right sensor: same procedure
  digitalWrite(trigPin2, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin2, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin2, LOW);
  duration = pulseIn(echoPin2, HIGH, 5000);
  r = 3.4 * duration / 2;
  distance2 = r / 100.00; // distance from the right sensor in cm
}

void setup()
{
  Serial.begin(9600);
  // initialize the trigger pins as outputs and the echo pins as inputs
  pinMode(trigPin1, OUTPUT);
  pinMode(echoPin1, INPUT);
  pinMode(trigPin2, OUTPUT);
  pinMode(echoPin2, INPUT);
}

void loop()
{
  find_distance(); // call continuously to keep the distance values current
  if (distance2 <= 35 && distance2 >= 15) // hand is 15-35 cm from the right sensor
  {
    temp = millis(); // millis() returns the time since the board started running
    while (millis() <= (temp + 300)) // keep measuring for another 300 ms:
      find_distance();               // this separates a swipe from a stay
    if (distance2 <= 35 && distance2 >= 15) // hand stayed for more than 300 ms
    {
      temp = distance2; // remember the starting position of the hand
      while (distance2 <= 50 && distance2 > 0) // loop until the hand is removed
      {
        find_distance();
        if ((temp + 6) < distance2)      // hand moved away ("+6" is for calibration)
          Serial.println("down");
        else if ((temp - 6) > distance2) // hand moved closer to the sensor
          Serial.println("up");
      }
    }
    else // hand only swiped in front of the right sensor
      Serial.println("next");
  }
  else if (distance1 <= 35 && distance1 >= 15) // hand is 15-35 cm from the left sensor
  {
    temp = millis();
    while (millis() <= (temp + 300))
    {
      find_distance();
      if (distance2 <= 35 && distance2 >= 15) // hand reached the right sensor
      {                                       // within 300 ms: a left-to-right swipe
        Serial.println("change");
        l = 1; // skip the "previous" branch below
        break;
      }
    }
    if (l == 0) // hand swiped in front of the left sensor only
    {
      Serial.println("previous");
      while (distance1 <= 35 && distance1 >= 15)
        find_distance(); // wait until the hand is removed, to avoid re-triggering
    }
    l = 0; // reset for the next round
  }
}
If you observe the Arduino code, the gestures mentioned above have been converted into 5 commands that are sent to the Serial Port. Using these 5 commands, you can write a Python program that controls certain keyboard functions in order to achieve the required tasks.
Python Programming for the Project
Writing Python Program for Arduino based Hand Gesture Control is very simple. You just need to read the Serial data from Arduino and invoke certain keyboard key presses. In order to achieve this, you have to install a special Python Module called PyAutoGUI.
The following steps will guide you through the installation of PyAutoGUI on Windows Computers. The module PyAutoGUI will help you to programmatically control the mouse and keyboard.
With the help of PyAutoGUI, we can write a Python Program to mimic the actions of mouse like left click, right click, scroll, etc. and keyboard like keypress, enter text, multiple key press, etc. without physically doing them. Let us install PyAutoGUI.
If you remember in the previous project, where we controlled an LED on Arduino using Python, we have installed Python in the directory “C:\Python27”.
Open Command Prompt with Administrator privileges and change to the directory where you have installed Python (in my case, it is C:\Python27).
If you have installed the latest version of Python, then pip (a tool for installing packages in Python) will already be installed. To check whether pip is installed, type the command python -m pip --version.
You should upgrade to the latest package of pip using the command python -m pip install --upgrade pip. If pip is already in its latest version, then ignore this step.
After upgrading pip, you can proceed to install PyAutoGUI by typing the command python -m pip install pyautogui.
If everything has gone well so far, you can proceed to write the Python code. If you observe the Arduino code given above, the Arduino sends out five different texts or commands through the Serial Port upon detecting the appropriate hand gestures. These commands are “down”, “up”, “next”, “previous” and “change”.
Using these commands along with a few functions in PyAutoGUI (like hotkey, scroll, keyDown, press and keyUp), you can write a simple Python program that will execute the following keyboard and mouse tasks.
- Data = “next” – – > Action = Ctrl+PgDn
- Data = “previous” – – > Action = Ctrl+PgUp
- Data = “down” – – > Action = Down Arrow
- Data = “up” – – > Action = Up Arrow
- Data = “change” – – > Action = Alt+Tab
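This command-to-keystroke mapping can be viewed as a small dispatch table. The following Python sketch records the actions as tuples instead of making real pyautogui calls, so the mapping itself can be exercised without a keyboard; the names ACTIONS and dispatch are hypothetical:

```python
# Hypothetical dispatch table mirroring the command-to-action mapping;
# the tuples stand in for the corresponding pyautogui calls.
ACTIONS = {
    'next':     ('hotkey', 'ctrl', 'pgdn'),
    'previous': ('hotkey', 'ctrl', 'pgup'),
    'down':     ('press', 'down'),
    'up':       ('press', 'up'),
    'change':   ('hotkey', 'alt', 'tab'),
}

def dispatch(line):
    """Return the keyboard action for one line of serial data, if any."""
    for command, action in ACTIONS.items():
        if command in line:   # serial lines carry trailing \r\n, so use 'in'
            return action
    return None

print(dispatch('next\r\n'))  # -> ('hotkey', 'ctrl', 'pgdn')
```

A table like this keeps the gesture commands and their key actions in one place, which makes it easy to remap a command without touching the control flow.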
The Python Code for Arduino based Hand Gesture Control of Computer is given below.
# gesture control python program for controlling certain functions in windows pc
# Code by BalaAppu
# Website: www.electronicshub.org

import serial     # pySerial, for serial communication with the Arduino
import pyautogui  # for programmatically controlling the mouse and keyboard

# Initialize serial and create a Serial port object called Arduino_Serial.
# Replace 'com12' with the COM port your Arduino is connected to.
Arduino_Serial = serial.Serial('com12', 9600)

while True:
    incoming_data = str(Arduino_Serial.readline())  # read one line of serial data
    print(incoming_data)                            # print the incoming serial data

    if 'next' in incoming_data:        # "ctrl+pgdn" moves to the next tab
        pyautogui.hotkey('ctrl', 'pgdn')
    elif 'previous' in incoming_data:  # "ctrl+pgup" moves to the previous tab
        pyautogui.hotkey('ctrl', 'pgup')
    elif 'down' in incoming_data:      # "down arrow" scrolls down the page
        pyautogui.press('down')
    elif 'up' in incoming_data:        # "up arrow" scrolls up the page
        pyautogui.press('up')
    elif 'change' in incoming_data:    # "alt+tab" switches between tasks
        pyautogui.hotkey('alt', 'tab')

    incoming_data = ""  # clear the data
NOTE: We have used Google Chrome as Web Browser and VLC Player as Media Player. Also, we modified the hotkeys of VLC Player to suit our Python Program. The modifications are as follows.
- Keypress = Up Arrow – – > Action = Increase Volume
- Keypress = Down Arrow – – > Action = Decrease Volume
- Keypress = Ctrl+PgUp – – > Action = Play/Pause
Applications of Arduino based Hand Gesture Control of Computer
- In this project, we have implemented Arduino based Hand Gesture Control of Your Computer, where a few hand gestures made in front of the computer perform certain tasks without using a mouse or keyboard.
- Such gesture based control of computers already exists: a company called Leap Motion has been implementing similar technology in computers.
- This type of hand gesture control of computers can be used for VR (Virtual Reality), AR (Augmented Reality), 3D Design, Reading Sign Language, etc.