EgoEV-HandPose: Egocentric 3D Hand Pose Estimation and Gesture Recognition with Stereo Event Cameras
Researchers have developed two new frameworks for improving 3D hand pose estimation from egocentric camera views. EgoForce uses a differentiable forearm representation and a unified transformer to achieve state-of-the-art accuracy across various camera types, reducing MPJPE by up to 28%. EgoEV-HandPose, by contrast, employs stereo event cameras and a novel KeypointBEV fusion module to jointly estimate bimanual hand poses and recognize gestures, achieving an MPJPE of 30.54mm and 86.87% gesture recognition accuracy. Both methods aim to enhance AR/VR and human-computer interaction applications by providing more robust and accurate hand tracking.
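MPJPE (mean per-joint position error) is the metric both frameworks report: the Euclidean distance between each predicted and ground-truth 3D joint, averaged over all joints. A minimal sketch of the computation; the joint count and values below are illustrative, not taken from either paper:

```python
import numpy as np

def mpjpe(pred: np.ndarray, gt: np.ndarray) -> float:
    """Mean Per-Joint Position Error: average Euclidean distance
    between predicted and ground-truth 3D joint positions.

    pred, gt: arrays of shape (num_joints, 3), same units (e.g. mm).
    """
    return float(np.linalg.norm(pred - gt, axis=-1).mean())

# Toy example with 21 hand joints (hypothetical values, in mm):
gt = np.zeros((21, 3))
pred = gt + np.array([30.0, 0.0, 0.0])  # every joint offset 30 mm along x
print(mpjpe(pred, gt))  # → 30.0
```

A lower MPJPE means tighter tracking; the 30.54mm figure above is this quantity averaged over the evaluation set.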
IMPACT: These advancements in egocentric hand tracking could significantly improve the realism and interactivity of AR/VR experiences and human-computer interfaces.