Bringing the Avatar to Life

Written by High Fidelity | Apr 19, 2013 7:00:00 AM

A key part of our research direction at High Fidelity is to breathe life into the avatar: capturing movement, facial expression, and gaze data from as many channels and devices as we can, and streaming that data, together with very high quality audio, at low latency and a high frame rate.

[Embedded video: early avatar head-tracking demo]

This video clip is an early example of our work in that direction. We built a simple test device consisting of gyros and accelerometers (the same parts found in cell phones, the Nike FuelBand, Google Glass, and many other devices) that sends its data with very low latency to a computer rendering an avatar. As you can see, when the sensor data is captured at a high frame rate and synced tightly to the audio, the effect is impressive.
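
For a sense of how gyro and accelerometer readings might be fused to drive a head pose, here is a minimal sketch of a complementary filter, a standard technique for this kind of device. All names, rates, and the filter constant are illustrative assumptions, not our actual firmware:

```python
import math

class ComplementaryFilter:
    """Fuse gyro rates (fast but drifty) with accelerometer tilt (noisy but drift-free).

    Each tick we integrate the gyro for responsiveness, then nudge the
    estimate toward the gravity-derived angle to cancel accumulated drift.
    """

    def __init__(self, alpha=0.98):
        self.alpha = alpha  # weight on the integrated gyro estimate (assumed value)
        self.pitch = 0.0    # radians
        self.roll = 0.0

    def update(self, gyro, accel, dt):
        # gyro: (x, y, z) angular rates in rad/s; accel: (x, y, z) in g.
        gx, gy, gz = gyro
        ax, ay, az = accel

        # Fast path: integrate angular rate (drifts over time).
        pitch_gyro = self.pitch + gy * dt
        roll_gyro = self.roll + gx * dt

        # Slow path: absolute tilt recovered from gravity.
        pitch_accel = math.atan2(-ax, math.sqrt(ay * ay + az * az))
        roll_accel = math.atan2(ay, az)

        # Blend: mostly gyro for low latency, a little accel for stability.
        self.pitch = self.alpha * pitch_gyro + (1 - self.alpha) * pitch_accel
        self.roll = self.alpha * roll_gyro + (1 - self.alpha) * roll_accel
        return self.pitch, self.roll
```

A filter like this is what lets cheap, noisy parts produce the smooth, responsive motion you see in the clip.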

The difference between the sort of face tracking you can do with a camera (low frame rate) and what you can get from raw gyros (60+ FPS) is pretty clear. Our goal is to build a platform in which data from many different devices can be captured and streamed simultaneously, creating a compelling avatar that communicates with greater bandwidth and emotional impact than anything we have seen before.
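
To illustrate the streaming side, here is a rough sketch of sending timestamped pose frames over UDP at 60 Hz so a receiver can align them against the audio clock. The packet layout, port, and read_pose() helper are all made up for illustration; this is not our actual wire protocol:

```python
import socket
import struct
import time

# Hypothetical endpoint and packet layout, for illustration only.
AVATAR_HOST, AVATAR_PORT = "127.0.0.1", 9393
FRAME_HZ = 60
PACKET = struct.Struct("<dfff")  # timestamp (s), pitch, roll, yaw (radians)

def stream_orientation(read_pose):
    """Send one compact datagram per sensor frame.

    read_pose() is assumed to return (pitch, roll, yaw) from the fused
    sensor estimate; the timestamp lets the receiver sync each frame
    against the audio stream.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    interval = 1.0 / FRAME_HZ
    while True:
        pitch, roll, yaw = read_pose()
        sock.sendto(PACKET.pack(time.time(), pitch, roll, yaw),
                    (AVATAR_HOST, AVATAR_PORT))
        time.sleep(interval)  # a production sender would use a jitter-free timer
```

Keeping each frame to a few bytes and sending it the moment it is read is what keeps the end-to-end latency low enough for the motion to feel alive.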