Google Move Mirror matches your poses to photos as you dance for your webcam

Google has launched a new AI Experiment, Move Mirror, which tracks your movements and finds still photos that match your poses. It uses a catalog of over 80,000 images, including shots of people cooking, doing karate and skiing, and displays them in real time, creating a flipbook-style effect.

Grant the Move Mirror website permission to access your webcam, then start throwing some shapes, save the animation as a GIF, and share it online.
Move Mirror is an interesting demonstration of what Google calls pose estimation: tracking a person's body as they move in 3D space. It's a tricky task: people come in all shapes and sizes, a camera may capture only part of the body, and some people use assistive devices like wheelchairs and crutches, all of which makes it difficult to gauge the position of their limbs.
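The article doesn't say which model powers Move Mirror, but browser-based pose estimation of this kind is available off the shelf; TensorFlow.js's PoseNet model is one example. Here's a minimal sketch of what such a model returns, assuming the @tensorflow-models/posenet package (an illustration, not Move Mirror's confirmed implementation):

```ts
// Sketch: in-browser pose estimation with TensorFlow.js PoseNet.
// Assumes the @tensorflow/tfjs and @tensorflow-models/posenet packages.
import '@tensorflow/tfjs';
import * as posenet from '@tensorflow-models/posenet';

async function estimatePose(source: HTMLImageElement | HTMLVideoElement) {
  // Model weights are downloaded once and run entirely in the browser.
  const net = await posenet.load();

  // PoseNet returns 17 2D keypoints (nose, eyes, shoulders, elbows, wrists,
  // hips, knees, ankles), each with a position and a confidence score.
  const pose = await net.estimateSinglePose(source, { flipHorizontal: false });

  for (const kp of pose.keypoints) {
    console.log(`${kp.part}: (${kp.position.x.toFixed(0)}, ${kp.position.y.toFixed(0)}), score ${kp.score.toFixed(2)}`);
  }
  return pose;
}
```

Keypoints the model can't see well (a wrist hidden behind the body, say) simply come back with low confidence scores, which is one way a model copes with partially visible people.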
Strike a pose, there's nothing to it

Motion capture suits and infrared technology work well, but the need for dedicated hardware means they aren't always practical,
and they're tough for the average developer to implement in their own apps.

Move Mirror demonstrates how machine learning models can be run right in a web browser, understanding and inferring data from an ordinary webcam or mobile phone camera. It has lots of potential uses, including teaching dance moves and learning home yoga workouts, and no data is sent to third-party servers.
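Running the model in the browser is only half of Move Mirror; the other half is matching the live pose against the photo catalog. The article doesn't describe how that search works, so the sketch below is only a guess at the general shape of the idea: normalise the keypoints, then pick the catalog image whose keypoints are closest. The CatalogEntry format, the normalisation and the linear scan are all assumptions for illustration.

```ts
// Hedged sketch: match a live webcam pose against a pre-computed keypoint
// catalog by Euclidean distance. The catalog format, normalisation and linear
// scan are illustrative assumptions, not Move Mirror's actual pipeline.
import '@tensorflow/tfjs';
import * as posenet from '@tensorflow-models/posenet';

type Point = { x: number; y: number };
type CatalogEntry = { imageUrl: string; keypoints: Point[] }; // 17 normalised keypoints

// Scale keypoints into a unit box so body size and position in frame don't matter.
function normalise(points: Point[]): Point[] {
  const xs = points.map(p => p.x);
  const ys = points.map(p => p.y);
  const minX = Math.min(...xs);
  const minY = Math.min(...ys);
  const scale = Math.max(Math.max(...xs) - minX, Math.max(...ys) - minY) || 1;
  return points.map(p => ({ x: (p.x - minX) / scale, y: (p.y - minY) / scale }));
}

// Sum of per-keypoint distances between two normalised poses.
function poseDistance(a: Point[], b: Point[]): number {
  return a.reduce((sum, p, i) => sum + Math.hypot(p.x - b[i].x, p.y - b[i].y), 0);
}

// Linear scan; a real 80,000-image catalog would want a proper search index.
function closestMatch(live: Point[], catalog: CatalogEntry[]): CatalogEntry {
  const query = normalise(live);
  return catalog.reduce((best, entry) =>
    poseDistance(query, entry.keypoints) < poseDistance(query, best.keypoints) ? entry : best);
}

// Estimate the pose in the current webcam frame, swap in the closest photo,
// then schedule the next frame: the flipbook effect described in the article.
async function mirrorFrame(video: HTMLVideoElement, photo: HTMLImageElement,
                           net: posenet.PoseNet, catalog: CatalogEntry[]) {
  const pose = await net.estimateSinglePose(video, { flipHorizontal: true });
  photo.src = closestMatch(pose.keypoints.map(kp => kp.position), catalog).imageUrl;
  requestAnimationFrame(() => mirrorFrame(video, photo, net, catalog));
}
```

Since both the model and the matching in this sketch run client-side, no webcam frames would ever need to leave the browser, which lines up with the article's point about data not being sent to third-party servers.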
Like the rest of Google's AI Experiments, it's designed to demonstrate how machine learning works, and what it's capable of. Other Experiments include AutoDraw, which invites you to sketch an object,
then matches it to a piece of clipart, and AI Duet, which takes melodies played on a virtual piano, and interprets and embellishes them to
create a response.

Via Engadget