Hand Tracking for Mouse Input (2023)

wonger_ | 221 points

It unsettled me just how much work went into making the JavaScript version of this work instead of a purely Python version, due to how OpenCV behaves. I wonder how universal the laggy OpenCV issue is, because a friend of mine hit it too while working on an OpenCV application. Is it so unavoidable that the only option is to not use Python? I really hope there is another way of going about this.

Anyways, I am very glad you put in all that effort to make the JavaScript version work well. Working under limitations is sometimes cool. I remember having to figure out how PyTorch evaluates neural networks, then converting a PyTorch network into Java code that could evaluate the model without any external libraries (very inefficiently) for a Java coding competition. There may have been a better way, but what I did was good enough.
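For anyone curious, the idea was roughly this (sketched in Python rather than Java; the layer shapes and weights are placeholders): export the trained weights, then evaluate the network with plain loops and no libraries.

    def linear(weights, bias, x):
        # weights: one row of input-sized coefficients per output unit,
        # exported from the trained PyTorch model.
        return [sum(w * v for w, v in zip(row, x)) + b
                for row, b in zip(weights, bias)]

    def relu(x):
        return [max(0.0, v) for v in x]

    def forward(x, w1, b1, w2, b2):
        # Two-layer MLP: linear -> ReLU -> linear.
        return linear(w2, b2, relu(linear(w1, b1, x)))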

ancientstraits | a month ago

Mediapipe is a lot of fun to play with and I'm surprised how little it seems to be used.
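For anyone who hasn't tried it, a minimal sketch of a hand-landmark loop with MediaPipe's Python solutions API (the JavaScript API is similar):

    import cv2
    import mediapipe as mp

    hands = mp.solutions.hands.Hands(max_num_hands=1,
                                     min_detection_confidence=0.7)
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            # Landmark 8 is the index fingertip, normalized to [0, 1].
            tip = results.multi_hand_landmarks[0].landmark[8]
            print(f"index tip: ({tip.x:.2f}, {tip.y:.2f})")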

You might also be interested in Project Gameface, open source Windows and Android software for face input: https://github.com/google/project-gameface

Also https://github.com/takeyamayuki/NonMouse

xnx | a month ago

This is cool, but a moving average filter is pretty bad at removing noise: it has to be longer than you'd like because its frequency response is so poor. Try an IIR filter instead. You don't need to worry about calculating the coefficients precisely; they can just be determined empirically.

out = last_out * x + input * (1-x)

where x is between zero and one; the closer to one, the more filtering you get. You can also cascade these to make a higher-order filter, which works even better.
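A minimal sketch of that one-pole filter, cascaded for a second-order response (the names and the 0.8 coefficient are my own choices):

    class OnePole:
        # out = x * last_out + (1 - x) * input
        def __init__(self, x):
            self.x = x          # 0..1; closer to 1 means heavier smoothing
            self.out = None

        def step(self, value):
            if self.out is None:
                self.out = value  # seed with the first sample
            self.out = self.x * self.out + (1 - self.x) * value
            return self.out

    # Cascading two stages gives a steeper, second-order rolloff.
    stages = [OnePole(0.8), OnePole(0.8)]

    def smooth(value):
        for stage in stages:
            value = stage.step(value)
        return value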

plasticeagle | a month ago

Amazing work! I have been working on robotizing an operations task for my company: a robot hand and a vision system that can complete a task on a monitor just like a human does. I have been toying with OpenAI's vision models to get the mouse coordinates, but it's slow and doesn't always return the correct coordinates (probably because LLMs don't really understand geometry).

Anyhow, looking forward to trying your approach with mediapipe. Thanks for the write-up and demo; inspirational.
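One nice property of the MediaPipe route: landmarks arrive as normalized [0, 1] positions, so getting screen coordinates is a multiply rather than a model call. A minimal sketch, assuming pyautogui drives the mouse:

    import pyautogui

    screen_w, screen_h = pyautogui.size()

    def move_cursor(landmark):
        # MediaPipe landmarks are normalized to [0, 1] on both axes,
        # so scaling to pixels is one multiply per axis.
        pyautogui.moveTo(landmark.x * screen_w, landmark.y * screen_h)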

pacifi30 | a month ago

I did a very similar project a few months back. My goal was to help alleviate some of the RSI issues I have, and give myself a different input device.

The precision was always tricky, and while it was fun, I eventually abandoned the project and switched to face tracking and blinking so I didn't have to hold up my hand.

For some reason, the idea of pointing my webcam down never dawned on me. I then discovered Project Gameface and just started using that.

Happy programming, and thank you for the excellent write-up and read!

AlfredBarnes | a month ago

Very nice! The sort of thing I expect to see on HN. Do you currently use it? Maybe it's not perfect as a mouse replacement, but as a remote movie control, as shown in one of the last videos, it's definitely a legit use case. Congrats!

liendolucas | a month ago

> Python version is super laggy, something to do with OpenCV

I'm most probably wrong, but I wonder if it has anything to do with all the text being written to stdout. On the off chance that it happens on the same thread, it might be blocking.
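That hypothesis is cheap to test: time the loop with and without the per-frame prints. A rough sketch (the log line is made up):

    import sys
    import time

    def bench(n=1000, log=False):
        t0 = time.perf_counter()
        for i in range(n):
            if log:
                print(f"frame {i}: landmark at (0.50, 0.50)")
        per_iter_ms = (time.perf_counter() - t0) / n * 1000
        print(f"log={log}: {per_iter_ms:.3f} ms per frame", file=sys.stderr)

    bench(log=False)  # baseline loop overhead
    bench(log=True)   # same loop, plus a stdout write per frame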

aranelsurion | a month ago

Some problems in life can be easily fixed with crimson red nail polish.

Aspos | a month ago

Mediapipe makes hand tracking so easy and it looks SO cool. I did a demo at PyData NYC a couple of years ago that let you rotate a Plotly 3D plot using your hand:

https://youtu.be/ijRBbtT2tgc?si=2jhYLONw0nCNfs65&t=1453

Source: https://github.com/jcheng5/brownian

jcheng | a month ago

A great demo, but how I wish there were a keyboard-less method of word input based on swipe-typing, meaning I don't press virtual keys: I just wave my index finger in the air, and the vision system picks up the traced path and converts it into words. And if there's something that asks for even less effort, maybe even something already implemented, I am all open to suggestions!
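The capture half of that is simple with a hand tracker: record the index-fingertip path while some gesture is held, then feed the trace to a swipe decoder. A sketch of the recording side; decode_swipe is purely hypothetical, and the decoder is the hard part:

    def decode_swipe(points):
        # Placeholder: a real decoder would match the path against
        # key positions on a virtual keyboard layout.
        return "<word>"

    trace = []

    def on_frame(landmarks, gesture_held):
        tip = landmarks[8]  # index fingertip, normalized coords
        if gesture_held:
            trace.append((tip.x, tip.y))
        elif trace:
            print("decoded:", decode_swipe(list(trace)))
            trace.clear()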

ewuhic | a month ago

Such a cool and inspirational project! Regarding the drift on pinch, have you tried storing the pointer position from a second ago and using that as the click position? You could maybe show this position as a second cursor? I've always wondered why Apple doesn't do this for their "eye moves faster than hands" issue as well.
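A minimal sketch of that idea: keep a short history of cursor positions and click at the position from before the pinch started moving things (the one-second lookback is a guess to tune):

    import time
    from collections import deque

    LOOKBACK = 1.0     # seconds; tune empirically
    history = deque()  # (timestamp, x, y)

    def on_move(x, y):
        now = time.monotonic()
        history.append((now, x, y))
        # Drop entries older than the lookback window.
        while history and history[0][0] < now - LOOKBACK:
            history.popleft()

    def click_position():
        # Where the cursor was ~LOOKBACK seconds ago, if known.
        return history[0][1:] if history else None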

omikun | a month ago

Related online demo using mediapipe for flying spaceships, plus camera/hand interaction to grab VR cubes ([1] is the demo). There was a discussion on Hackaday recently [2].

[0] https://tympanus.net/codrops/2024/10/24/creating-a-3d-hand-c...

[1] https://tympanus.net/Tutorials/webcam-3D-handcontrols/

[2] https://hackaday.com/2024/10/25/diy-3d-hand-controller-using... (DIY 3D hand controller)

zh3 | a month ago

It's projects like this that really make me want to start on a virtual theremin. Wish I had the time :(

kelseyfrog | a month ago

This is a very cool demo! Well done!

One suggestion for fixing the cursor drift during finger taps: instead of using the hand position, use the index fingertip for the cursor, then tap the middle finger to the thumb for selection. That doesn't change the cursor position, yet is still a comfortable and easy-to-parse action.
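A sketch of that scheme using MediaPipe's landmark indices (4 = thumb tip, 8 = index tip, 12 = middle tip); the pinch threshold is a guess:

    import math

    PINCH_THRESHOLD = 0.05   # normalized units; tune empirically

    def dist(a, b):
        return math.hypot(a.x - b.x, a.y - b.y)

    def read_hand(landmarks):
        # Cursor follows the index tip; a middle-to-thumb tap selects,
        # so selecting never perturbs the cursor position.
        cursor = (landmarks[8].x, landmarks[8].y)
        selecting = dist(landmarks[12], landmarks[4]) < PINCH_THRESHOLD
        return cursor, selecting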

HermanMartinus | a month ago

An inspiring project. I am looking forward to seeing gloves connected to a VR device. I think some cheap sensors, a bit of Bayesian modelling, and a calibration step could offer proper realtime hand gesture tracking.* I am already picturing being able to type on an AR keyboard. If the gloves are more expensive, there might be some haptic feedback. VR devices might have more open OSes in the future, or could use a "streaming" platform to access remote desktop environments. I am eager to see all the incoming use cases!

*: a lot of it. Plus, the tracking might be task-centered. I would not bet on general hand gesture tracking with cheap sensors and Bayesian modelling alone.

mufasachan | a month ago

Cool path and write-up. Thank you!

Because of the use case (and because I've wanted to use it in an AR app but haven't yet), I'd like to point to doublepoint.com's totally different but well-working approach: they trained a NN to interpret a Samsung watch's IMU data to detect taps. They also added a mouse mode.

IIRC, Google's OS also allows client BT mode for the device, so I think it can be paired directly as a HID.

Not affiliated, but impressed by the funding they received :)

hoc | a month ago

Reminds me of the Leap Motion controller; there's a version 2 now: https://leap2.ultraleap.com/downloads/leap-motion-controller...

ps8 | a month ago

So cool! I was just wondering the other day if it would be possible to build this! For front-facing mode, I wonder if you could add a brief "calibration" step to help it learn the correct scale and adjust angles, e.g. give users a few targets to hit on the screen.
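One way that calibration could work: collect (hand, target) pairs and fit an affine map with least squares. A sketch with numpy; the four corner targets are made up:

    import numpy as np

    def fit_affine(hand_pts, screen_pts):
        # Solve H @ A = S in the least-squares sense, where each row
        # of H is [hand_x, hand_y, 1]; A is a (3, 2) affine matrix.
        H = np.hstack([np.asarray(hand_pts), np.ones((len(hand_pts), 1))])
        A, *_ = np.linalg.lstsq(H, np.asarray(screen_pts), rcond=None)
        return A

    def to_screen(A, hx, hy):
        return np.array([hx, hy, 1.0]) @ A

    # Hypothetical calibration: the user hits four on-screen targets.
    hand = [(0.2, 0.2), (0.8, 0.2), (0.8, 0.8), (0.2, 0.8)]
    screen = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]
    A = fit_affine(hand, screen)
    print(to_screen(A, 0.5, 0.5))   # roughly the screen center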

jacobsimon | a month ago

If it's compelling enough, I don't mind setting up a downward-facing camera. I would like to see some more examples, though, where it shows an advantage over just using a mouse. I'm sure there are scenarios where it does.

KaoruAoiShiho | a month ago

This is very cool - can you do window focus based on the window I am looking at next? :)

0x20cowboy | a month ago

This has tons of potential in the creative technology space. Thanks for sharing!

alana314 | a month ago

Man, I feel like making diagrams / writing handwritten notes will be great with this!

vkweb | a month ago

Very impressive! This opens up a whole new set of uses for this headset.

SomeoneOnTheWeb | a month ago

erm i snuck in hacker news i kid erm what the sigma

yireawu | a month ago

Could this be the next evolution of gaming mice?

bogardon | a month ago