New Hand-Tracking Algorithm By Google Brings Hope For The Deaf

New Hand-Tracking Algorithm By Google AI Labs Is Fast Enough For A Smartphone

A number of companies, such as SignAll and Kintrans, have developed hand-tracking software that attempts to let the millions of people who rely on sign language communicate easily with anyone, but so far they have had very little success. This is where the latest hand-tracking algorithm from Google’s AI labs comes in, with its promise of finally making such software everything it is meant to be.

The hand-tracking algorithm from Google’s AI labs uses a smartphone and its camera to create a detailed map of a person’s hand, which can then be tracked for the sake of communication. Google researchers Valentin Bazarevsky and Fan Zhang said in a blog post, ‘Whereas current state-of-the-art approaches rely primarily on powerful desktop environments for inference, our method achieves real-time performance on a mobile phone, and even scales to multiple hands.’
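
For readers who want to try it, MediaPipe’s Python bindings (released after this post was written) expose the same hand model. A minimal sketch of real-time tracking from a webcam, assuming OpenCV is installed and a camera is available at index 0, might look like this:

import cv2
import mediapipe as mp

# Open the default camera; the model was designed to run in real time
# even on mobile hardware, so a laptop webcam is more than enough.
cap = cv2.VideoCapture(0)
with mp.solutions.hands.Hands(max_num_hands=2,
                              min_detection_confidence=0.5,
                              min_tracking_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                # Each detected hand carries 21 normalized (x, y, z) landmarks.
                wrist = hand.landmark[0]
                print(f"wrist at ({wrist.x:.2f}, {wrist.y:.2f})")
cap.release()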

They further added, ‘Robust real-time hand perception is a decidedly challenging computer vision task, as hands often occlude themselves or each other (e.g., finger/palm occlusions and handshakes) and lack high contrast patterns.’ The researchers made the new hand-tracking algorithm faster at calculating hand signals by simplifying the process as much as they possibly could, so that it handles less data and therefore needs less processing time.

The new hand-tracking algorithm starts by detecting the palm rather than taking in the dimensions of the complete hand, since the palm is a smaller and more uniform shape to find. A separate algorithm then kicks in to look at the fingers as well as the palm and assigns a total of twenty-one coordinates to the knuckles, fingertips, and other key points. To enable the AI to learn these coordinates, the researchers had to manually annotate those twenty-one points on 30,000 images of hands in different poses and lighting conditions.
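
Those twenty-one points ship with the released model, and MediaPipe’s Python bindings name each one in a HandLandmark enum, so you can list them directly:

import mediapipe as mp

# One wrist point plus four joints per finger add up to the 21
# landmarks described above (0 = WRIST, ..., 20 = PINKY_TIP).
for lm in mp.solutions.hands.HandLandmark:
    print(lm.value, lm.name)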

The team of developers has open-sourced its code in the hope that others will come up with innovative ways of using and improving it. The new system also builds on MediaPipe, Google’s existing augmented-reality framework. The researchers said in their blog post, ‘We hope that providing this hand perception functionality to the wider research and development community will result in an emergence of creative use cases, stimulating new applications and new research avenues.’
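
Since the code is public, experimenting takes only a few lines. In the sketch below, hand.jpg is a hypothetical input image, and MediaPipe’s bundled drawing helper overlays the detected hand skeleton on it:

import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_drawing = mp.solutions.drawing_utils

image = cv2.imread("hand.jpg")  # hypothetical example image
with mp_hands.Hands(static_image_mode=True, max_num_hands=2) as hands:
    results = hands.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        for hand in results.multi_hand_landmarks:
            # Draw the 21 keypoints plus the connections between them.
            mp_drawing.draw_landmarks(image, hand, mp_hands.HAND_CONNECTIONS)
cv2.imwrite("hand_annotated.jpg", image)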
