
Avatar Interaction: Ryan the Stylist, Emily the Client



The Inquiry
How do we create virtual touch? Without haptic feedback rigs or direct stimulation to the brain, how can we get closer to that special, sometimes intimate, sometimes intricate, sometimes magical feeling that is touch? We're trying a lot of different approaches, but this video illustrates one combination: a front-facing PrimeSense depth camera, the Faceshift facial tracking SDK, the Leap Motion controller, and the High Fidelity virtual world software. There's no physical feeling for either party, but as you'll see, Ryan is virtually touching Emily's hair, and that's one step in the right direction.

The Setup
Emily and Ryan both sat at MacBook Pros with PrimeSense depth cameras clipped to the tops of their screens (we 3D printed the clip), the Faceshift SDK extracting head position and facial features, and our Interface software processing and streaming the data to control each avatar's face and body. Ryan's hand is detected by the Leap Motion controller. The end-to-end latency is about 100 milliseconds. For headphones and a microphone, we usually use this Sennheiser headset.
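To make the streaming step concrete, here is a minimal sketch of the kind of per-frame payload a client might send: head rotation plus a handful of facial blendshape weights from the tracker. The function names and the byte layout are illustrative assumptions, not Interface's actual wire format.

```python
import struct

def pack_face_frame(yaw, pitch, roll, blendshapes):
    """Pack head rotation (degrees) and blendshape weights (0..1) into bytes.

    Layout (little-endian, no padding): three float32s for the head pose,
    one byte for the blendshape count, then one float32 per blendshape.
    """
    return struct.pack(f"<3fB{len(blendshapes)}f",
                       yaw, pitch, roll, len(blendshapes), *blendshapes)

def unpack_face_frame(data):
    """Inverse of pack_face_frame: recover pose and blendshape list."""
    header_size = struct.calcsize("<3fB")
    yaw, pitch, roll, n = struct.unpack_from("<3fB", data)
    blendshapes = list(struct.unpack_from(f"<{n}f", data, header_size))
    return yaw, pitch, roll, blendshapes
```

At ~60 tracking frames per second, a payload this small (tens of bytes) is cheap to send continuously, which is what keeps the end-to-end latency budget dominated by capture and network time rather than serialization.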

You might notice that the audio is noisy. This is because we applied some gain to bring the levels up. "But you claim high fidelity audio," you might be thinking. Well, one of the brilliant things about our audio architecture is that it works similarly to the real world: the further away you are, the harder it is to hear. But this doesn't work well for recording and capture, something we've yet to optimize for.
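The distance falloff described above can be sketched with a simple inverse-distance rolloff; the actual attenuation curve High Fidelity's audio mixer uses may differ, and the `reference_distance` parameter here is an assumption for illustration.

```python
def attenuate(gain, distance, reference_distance=1.0):
    """Scale a source's gain down as the listener moves away.

    Inside reference_distance the source plays at full gain; beyond it,
    gain falls off inversely with distance (halved at twice the distance).
    """
    if distance <= reference_distance:
        return gain
    return gain * (reference_distance / distance)
```

This is also why the capture was quiet: the recording avatar stood some distance from Ryan and Emily, so their voices arrived pre-attenuated and had to be boosted (noise and all) afterward.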

We captured the video with the Screen Recording functionality in QuickTime Player, piping the sound out from Interface and into QuickTime using Soundflower. To capture, I logged in as my avatar and stood next to Ryan and Emily, recording what I observed.

When you see Emily's right hand rise, it's because she's moving her mouse. In our current version, moving your mouse cursor will also move your hand and arm.
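One simple way to drive a hand from the cursor is to map the 2D cursor position onto a 3D target in front of the avatar and let inverse kinematics pose the arm. This is a hypothetical sketch of that mapping; Interface's actual cursor-to-arm behavior is more involved, and all names and the `reach` parameter here are assumptions.

```python
def cursor_to_hand_target(cursor_x, cursor_y, screen_w, screen_h, reach=0.6):
    """Map a cursor position (pixels) to a hand target (meters) relative to
    the avatar's chest: +x right, +y up, +z forward.

    The cursor is normalized to [-1, 1] on both axes, then scaled by the
    avatar's reach; the hand is held 'reach' meters out in front.
    """
    nx = (cursor_x / screen_w) * 2.0 - 1.0   # -1 at left edge, +1 at right
    ny = 1.0 - (cursor_y / screen_h) * 2.0   # -1 at bottom edge, +1 at top
    return (nx * reach, ny * reach, reach)
```

With a mapping like this, moving the mouse up and to the right naturally raises the avatar's hand, which is the effect visible in the video.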


Published by High Fidelity November 27, 2013
