IvLabs

HAND GESTURE CONTROLLED BOT

OVERVIEW:
                              
What if a physically disabled person sitting in a chair could move anywhere just by gesturing with a hand? Inspired by this idea, we built a small robot model that is driven by the hand gestures of its operator.

INSIGHT:

IMU (MPU-6050):
The MPU-6050, made by InvenSense, combines a MEMS (micro-electro-mechanical system) accelerometer and gyroscope with 16-bit analog-to-digital converters. Its notable feature is the Digital Motion Processor (DMP) integrated into the chip, whose programming is proprietary to InvenSense. The DMP performs 6-axis sensor-fusion calculations at a fixed rate of 200 Hz and delivers the results to the host microcontroller as a quaternion, yaw/pitch/roll angles, tap interrupts, portrait/landscape detection, and so on. This information is sent to other devices over the I2C bus built into the IMU.
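As a minimal, self-contained illustration of talking to the MPU-6050 over its I2C bus, the sketch below wakes the sensor and reads the raw accelerometer and gyroscope registers with the Arduino Wire library. The register addresses are from the MPU-6050 register map; the DMP/quaternion path mentioned above goes through InvenSense's own library and is not shown here. This is a sketch for illustration, not the project's exact code.

    #include <Wire.h>

    const int MPU_ADDR = 0x68;            // default I2C address of the MPU-6050 (AD0 pin low)

    void setup() {
      Serial.begin(9600);
      Wire.begin();
      Wire.beginTransmission(MPU_ADDR);   // wake the sensor: clear the sleep bit
      Wire.write(0x6B);                   // PWR_MGMT_1 register
      Wire.write(0);
      Wire.endTransmission(true);
    }

    void loop() {
      // Burst-read 14 bytes starting at ACCEL_XOUT_H (0x3B):
      // accel X/Y/Z, temperature, gyro X/Y/Z, each a signed 16-bit value
      Wire.beginTransmission(MPU_ADDR);
      Wire.write(0x3B);
      Wire.endTransmission(false);
      Wire.requestFrom(MPU_ADDR, 14, true);
      int16_t ax = Wire.read() << 8 | Wire.read();
      int16_t ay = Wire.read() << 8 | Wire.read();
      int16_t az = Wire.read() << 8 | Wire.read();
      Wire.read(); Wire.read();           // skip the temperature bytes
      int16_t gx = Wire.read() << 8 | Wire.read();   // gyro values read but not printed here
      int16_t gy = Wire.read() << 8 | Wire.read();
      int16_t gz = Wire.read() << 8 | Wire.read();
      Serial.print(ax); Serial.print(' ');
      Serial.print(ay); Serial.print(' ');
      Serial.println(az);
      delay(100);
    }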

ATmega8:
We used an ATmega8 to read the data coming from the receiver-side Bluetooth module and run the motors accordingly.
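A minimal sketch of that receiver logic, written in Arduino style for readability: it assumes the HC-05 is wired to the microcontroller's UART and that the two motors are driven through an H-bridge on four digital pins. The pin numbers, command letters, and baud rate are assumptions for illustration, not the project's exact values (the actual firmware ran on the ATmega8).

    // Assumed H-bridge input pins for the left and right motors
    const int L_FWD = 2, L_REV = 3, R_FWD = 4, R_REV = 5;

    void drive(int lf, int lr, int rf, int rr) {
      digitalWrite(L_FWD, lf); digitalWrite(L_REV, lr);
      digitalWrite(R_FWD, rf); digitalWrite(R_REV, rr);
    }

    void setup() {
      pinMode(L_FWD, OUTPUT); pinMode(L_REV, OUTPUT);
      pinMode(R_FWD, OUTPUT); pinMode(R_REV, OUTPUT);
      Serial.begin(9600);               // UART wired to the slave HC-05
    }

    void loop() {
      if (Serial.available()) {
        char cmd = Serial.read();       // one-character command from the transmitter
        switch (cmd) {
          case 'F': drive(HIGH, LOW,  HIGH, LOW);  break;  // forward
          case 'B': drive(LOW,  HIGH, LOW,  HIGH); break;  // backward
          case 'L': drive(LOW,  HIGH, HIGH, LOW);  break;  // spin left
          case 'R': drive(HIGH, LOW,  LOW,  HIGH); break;  // spin right
          default:  drive(LOW,  LOW,  LOW,  LOW);  break;  // stop
        }
      }
    }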

Arduino:
We used an Arduino to read the orientation data from the IMU and send it to the Bluetooth module.
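When working from the raw accelerometer values instead of the DMP output, the hand's tilt can be estimated with two small helper functions like the ones below. This is a common accelerometer-only approximation; the axis convention is an assumption for illustration.

    #include <math.h>
    #include <stdint.h>

    // Tilt angles in degrees, estimated from raw MPU-6050 accelerometer counts
    float pitchFromAccel(int16_t ax, int16_t ay, int16_t az) {
      return atan2(-(float)ax, sqrt((float)ay * ay + (float)az * az)) * 180.0 / M_PI;
    }

    float rollFromAccel(int16_t ay, int16_t az) {
      return atan2((float)ay, (float)az) * 180.0 / M_PI;
    }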

BLUETOOTH MODULE (HC-05):
The HC-05 can act as either a master or a slave; using AT commands we set each module into one of the two modes. The module on the transmitter side should be in master mode and the one on the receiver side in slave mode.
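The usual HC-05 AT commands for this step look like the following; they are entered in command mode (with the module's key/EN pin held high, typically at 38400 baud), and the slave's address used for binding is read from the slave module itself. This listing is a general reference, not a record of the exact commands used in the project.

    Receiver module (slave):
      AT+ROLE=0                 set slave mode
      AT+ADDR?                  read this module's address (needed for binding)

    Transmitter module (master):
      AT+ROLE=1                 set master mode
      AT+CMODE=0                connect only to the bound address
      AT+BIND=<slave address>   bind to the receiver module's address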

WORKING PRINCIPLE:
The IMU sensor detects the yaw, pitch, and roll of the hand and sends these values to the Arduino (microcontroller). Based on how far these values deviate from a reference hand position, particular instructions are sent wirelessly to the receiver side through the Bluetooth module. On the receiver side, the ATmega8 processes the data and controls the motors accordingly.
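As a concrete sketch of that mapping step, the transmitter can threshold the measured tilt angles and send a one-character command that the receiver turns into a motor action. The threshold, the axis-to-direction mapping, and the command letters below are assumptions chosen to match the receiver sketch above, not the project's tuned values.

    // Map tilt angles (degrees) to a single-character drive command
    char commandFromTilt(float pitch, float roll) {
      const float T = 20.0;            // assumed dead-band threshold
      if (pitch >  T) return 'F';      // hand tilted forward  -> move forward
      if (pitch < -T) return 'B';      // hand tilted backward -> move backward
      if (roll  >  T) return 'R';      // hand tilted right    -> turn right
      if (roll  < -T) return 'L';      // hand tilted left     -> turn left
      return 'S';                      // near level           -> stop
    }

    // On the transmitter, the command is pushed over the serial link to the HC-05:
    //   Serial.write(commandFromTilt(pitch, roll));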

BLOCK DIAGRAM:

MEDIA:
TEAM MEMBERS:
  1. DIVYA MUSAPETA
  2. RUPALI GAREWAL
  3. DISHA KAMALE
  4. SNEHAL AHIRE