handgesture
Here are 28 public repositories matching this topic...
This project controls the system volume and screen brightness using hand gestures: the left hand controls brightness and the right hand controls volume. Hand landmarks are detected with the MediaPipe library.
Updated Jun 25, 2021 - Python
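As a rough illustration of the approach that description outlines (not the repository's actual code), the sketch below uses MediaPipe's Hands solution to read each hand's landmarks, maps the thumb-index pinch distance to a 0-1 level, and routes it to brightness or volume based on handedness. The `set_brightness` and `set_volume` helpers are placeholders; a real build might use screen_brightness_control and pycaw.

```python
# Minimal sketch (not the repository's code): pinch distance -> brightness/volume level.
import cv2
import mediapipe as mp

def set_brightness(level):   # placeholder; e.g. screen_brightness_control in a real build
    print(f"brightness -> {level:.2f}")

def set_volume(level):       # placeholder; e.g. pycaw on Windows in a real build
    print(f"volume -> {level:.2f}")

cap = cv2.VideoCapture(0)
with mp.solutions.hands.Hands(max_num_hands=2, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.flip(frame, 1)                       # mirror view so labels match the user
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for lm, handed in zip(results.multi_hand_landmarks, results.multi_handedness):
                thumb, index = lm.landmark[4], lm.landmark[8]   # thumb tip, index fingertip
                pinch = ((thumb.x - index.x) ** 2 + (thumb.y - index.y) ** 2) ** 0.5
                level = max(0.0, min(1.0, pinch / 0.4))         # normalise to 0..1
                if handed.classification[0].label == "Left":
                    set_brightness(level)
                else:
                    set_volume(level)
        cv2.imshow("hand control", frame)
        if cv2.waitKey(1) & 0xFF == 27:                  # Esc quits
            break
cap.release()
cv2.destroyAllWindows()
```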
Detects Left, Right, and Stop hand gestures that correspond to steering-wheel movements.
Updated May 17, 2020 - Jupyter Notebook
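The notebook presumably trains a proper classifier; purely as a hedged illustration of the idea, a crude heuristic can derive Left/Right/Stop from the wrist landmark's horizontal position (the `steering_command` helper below is illustrative, not taken from the repository):

```python
# Hedged sketch, not the notebook's method: classify a steering command from wrist position.
def steering_command(hand_landmarks):
    """Return 'LEFT', 'RIGHT' or 'STOP' from a MediaPipe hand landmark list."""
    wrist_x = hand_landmarks.landmark[0].x   # normalised 0..1 across the frame
    if wrist_x < 0.35:
        return "LEFT"
    if wrist_x > 0.65:
        return "RIGHT"
    return "STOP"
```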
This is a project to make a robotic hand imitate hand gestures performed by a human.
Updated May 9, 2020 - Python
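One common way to drive such a robotic hand is to send per-finger open/closed states to a servo controller over serial; the sketch below is a hypothetical example (the port name and one-byte protocol are assumptions, not taken from the project):

```python
# Hedged sketch (port name and byte protocol are hypothetical, not from the repository).
import serial  # pyserial

def send_finger_states(ser, states):
    """Pack five 0/1 finger states (thumb..pinky) into one byte for the servo controller."""
    value = sum(bit << i for i, bit in enumerate(states))
    ser.write(bytes([value]))

# ser = serial.Serial("/dev/ttyUSB0", 115200)   # hypothetical servo-controller port
# send_finger_states(ser, [1, 1, 0, 0, 0])      # thumb and index open, others closed
```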
AI Rock Paper Scissors with hand gestures is an AI-based Python project that detects the hand and fingers; from the fingertip coordinates it works out whether the sign is rock, paper, or scissors, so you can play against an AI opponent, with scores kept according to who wins each round.
Updated Jul 6, 2022 - Python
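The game logic itself is simple once the hand tracker reports how many fingers are extended; the sketch below (illustrative, not the repository's code) maps a finger count to a sign and scores a round against a random AI move:

```python
# Hedged sketch of the game logic only (not the repository's implementation).
import random

def classify_sign(extended_count):
    """Map the number of extended fingers to a rock/paper/scissors sign."""
    return {0: "rock", 2: "scissors", 5: "paper"}.get(extended_count, "unknown")

def play_round(player_sign):
    ai_sign = random.choice(["rock", "paper", "scissors"])
    beats = {"rock": "scissors", "scissors": "paper", "paper": "rock"}
    if player_sign == ai_sign:
        result = "draw"
    elif beats.get(player_sign) == ai_sign:
        result = "player"
    else:
        result = "ai"
    return ai_sign, result

# Example: two extended fingers detected by the hand tracker -> scissors
ai, winner = play_round(classify_sign(2))
print(f"AI played {ai}; winner: {winner}")
```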
A virtual mouse controlled by hand gestures using fingertip positions. Implemented with MediaPipe and OpenCV.
Updated Jun 14, 2024 - Python
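A minimal sketch of that idea, assuming MediaPipe landmarks are already available and using pyautogui for cursor control (the helper names are illustrative, not the repository's API):

```python
# Hedged sketch: move the OS cursor to the index fingertip position with pyautogui.
import pyautogui

SCREEN_W, SCREEN_H = pyautogui.size()

def move_cursor(hand_landmarks):
    tip = hand_landmarks.landmark[8]                  # index fingertip, normalised 0..1
    pyautogui.moveTo(int(tip.x * SCREEN_W), int(tip.y * SCREEN_H))

def maybe_click(hand_landmarks, threshold=0.04):
    """Click when the thumb tip and index tip pinch together."""
    thumb, index = hand_landmarks.landmark[4], hand_landmarks.landmark[8]
    if ((thumb.x - index.x) ** 2 + (thumb.y - index.y) ** 2) ** 0.5 < threshold:
        pyautogui.click()
```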
A computer-vision project that counts the number being signed by analysing the user's hand gesture.
Updated Aug 1, 2023 - Python
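A common way to count fingers from MediaPipe landmarks is to compare each fingertip with the joint below it; the helper below is a hedged sketch of that heuristic (right hand, palm facing the camera), not necessarily the project's method:

```python
# Hedged sketch: count raised fingers from MediaPipe hand landmarks.
def count_fingers(lm):
    count = 0
    for tip, pip in [(8, 6), (12, 10), (16, 14), (20, 18)]:   # index..pinky
        if lm.landmark[tip].y < lm.landmark[pip].y:            # tip above joint => raised
            count += 1
    if lm.landmark[4].x > lm.landmark[3].x:                    # crude thumb check
        count += 1
    return count
```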
[Side Project] Hand gesture recognition.
Updated Feb 14, 2021 - JavaScript
PPT Control Using Hand Gesture is a project that allows you to control PowerPoint presentations using hand gestures. Instead of relying on traditional input devices like a keyboard or mouse, this project leverages computer vision techniques to detect hand gestures and map them to specific presentation commands.
Updated Jun 24, 2023 - Python
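A minimal sketch of the mapping step, assuming gestures have already been recognised and using pyautogui to send the keys (the gesture names are hypothetical; the project's labels and bindings may differ):

```python
# Hedged sketch: translate recognised gestures into PowerPoint navigation keys.
import pyautogui

GESTURE_TO_KEY = {            # hypothetical gesture names, not the repository's labels
    "swipe_right": "right",   # next slide
    "swipe_left": "left",     # previous slide
    "open_palm": "f5",        # start slideshow
    "fist": "esc",            # exit slideshow
}

def handle_gesture(gesture):
    key = GESTURE_TO_KEY.get(gesture)
    if key:
        pyautogui.press(key)
```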
👋 Hand Gesture Recognition model implemented using mediapipe
Updated Nov 24, 2024 - Python
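One plausible pipeline for such a model flattens the 21 MediaPipe landmarks into a 63-value feature vector and feeds it to an off-the-shelf classifier; the sketch below uses scikit-learn's KNN purely for illustration and may not match the repository's actual model:

```python
# Hedged sketch: landmarks -> feature vector -> classifier (the repo's model may differ).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def landmark_vector(hand_landmarks):
    """Flatten 21 (x, y, z) MediaPipe landmarks into a 63-value feature vector."""
    return np.array([[p.x, p.y, p.z] for p in hand_landmarks.landmark]).flatten()

# X: (n_samples, 63) array of landmark vectors, y: gesture labels collected beforehand
# clf = KNeighborsClassifier(n_neighbors=5).fit(X, y)
# gesture = clf.predict(landmark_vector(detected_hand).reshape(1, -1))[0]
```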
This project uses OpenCV and MediaPipe to control the system volume.
Updated May 18, 2023 - Python
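Assuming the MediaPipe side already yields a 0-1 level (for example from the thumb-index pinch distance), applying it to the Windows master volume is typically done with pycaw; a minimal sketch, not necessarily the project's exact code:

```python
# Hedged sketch: apply a 0-1 level to the Windows master volume with pycaw.
from ctypes import POINTER, cast
from comtypes import CLSCTX_ALL
from pycaw.pycaw import AudioUtilities, IAudioEndpointVolume

speakers = AudioUtilities.GetSpeakers()
interface = speakers.Activate(IAudioEndpointVolume._iid_, CLSCTX_ALL, None)
volume = cast(interface, POINTER(IAudioEndpointVolume))

def apply_volume(level):
    """level in [0, 1], e.g. derived from the thumb-index pinch distance."""
    volume.SetMasterVolumeLevelScalar(max(0.0, min(1.0, level)), None)
```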
This project adapts Google's Dinosaur Game with gesture recognition, letting players control the dinosaur's actions via hand movements. It uses TensorFlow and OpenCV for gesture recognition and Python for scripting, combining gaming and computer vision to make the game more engaging.
Updated Jun 10, 2024 - PureBasic
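The control side usually reduces to pressing the same keys the game already listens for; a hedged sketch in Python for illustration only (the repository's listed language is PureBasic, and the gesture labels below are hypothetical):

```python
# Hedged sketch: map recognised gestures to the Chrome dino controls.
import pyautogui

def on_gesture(gesture):
    if gesture == "open_palm":      # hypothetical label for 'jump'
        pyautogui.press("space")
    elif gesture == "fist":         # hypothetical label for 'duck'
        pyautogui.keyDown("down")
    else:
        pyautogui.keyUp("down")
```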
An interactive 3D solar system simulation using Babylon.js, where WebSocket-driven hand gestures let users control planet rotation and zoom for an immersive experience.
Updated Oct 30, 2024 - JavaScript
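On the sending side, a tracker process could push gesture events to the Babylon.js client over a WebSocket; the sketch below uses Python's websockets package with a hypothetical URL and message format (the real project's protocol may differ):

```python
# Hedged sketch of the gesture-sending side (URL and message format are hypothetical).
import asyncio
import json
import websockets

async def send_gestures(uri="ws://localhost:8765"):
    async with websockets.connect(uri) as ws:
        # In the real project these values would come from the hand tracker.
        await ws.send(json.dumps({"action": "rotate", "delta": 0.05}))
        await ws.send(json.dumps({"action": "zoom", "delta": -0.1}))

asyncio.run(send_gestures())
```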