
Using VR to Decode Nonverbal Communication

A group led by researcher Andrea Stevenson Won at Cornell University is currently using High Fidelity to record the body movements people make while communicating, taking advantage of how High Fidelity accurately transmits head and hand movements for multiple participants with low latency. You can read more at their website.

We are learning more about human communication through VR, in part because building it forces us to establish what is minimally needed for effective and enjoyable conversation. For example, do we need to see finger movements, or just hands? How about arms or upper-body posture? What about mouth movements versus eyebrows?

But Cornell’s work is an example of how we can now go beyond this feature-prioritization process and begin to decode parts of human communication that have previously been difficult or impossible to study. When people talk, they use their bodies in rich and subtle ways to communicate. We don’t know what fraction of communication body language accounts for, but we know it is a lot. The challenge is that, to study it, scientists have historically had to rely on high-speed cameras watching subjects, later laboriously estimating body motion frame by frame from multiple camera views.

Using High Fidelity, the Cornell team was able to quickly set up their own VR server, upload useful content and avatars, and then, most importantly, use JavaScript to record the real-time motions of the participants, display them live on a web surface in-world, and record the results directly to a server. The transmission latency of both audio and body motion in High Fidelity is low enough to be undetectable to participants, who can therefore converse normally while standing together as avatars. The real-time nature of the data collection and analysis means that, if desired, the researchers can watch live indicators suggesting the meaning of different kinds of body motion.
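To give a sense of what such a recording script might look like, here is a minimal sketch using High Fidelity's JavaScript scripting interface. It samples the local avatar's head orientation, hand joint positions, and root position at a fixed rate and posts batches to a logging server. This is not the Cornell team's actual script; the endpoint URL, sampling rate, and joint names are illustrative assumptions, and the exact API calls should be checked against the scripting documentation for your build.

```javascript
// Illustrative sketch: periodically sample the local avatar's motion and
// upload batches of samples to a (placeholder) logging endpoint.
var SAMPLE_INTERVAL_MS = 100;                     // assumed 10 Hz sampling rate
var BATCH_SIZE = 50;                              // samples per upload
var ENDPOINT = "https://example.org/motion-log";  // placeholder server URL

var samples = [];

function captureSample() {
    samples.push({
        t: Date.now(),
        head: MyAvatar.headOrientation,                   // head rotation driven by the HMD
        leftHand: MyAvatar.getJointPosition("LeftHand"),  // hand joints driven by controllers
        rightHand: MyAvatar.getJointPosition("RightHand"),
        position: MyAvatar.position                       // avatar root position in world space
    });
    if (samples.length >= BATCH_SIZE) {
        uploadBatch(samples.splice(0, samples.length));
    }
}

function uploadBatch(batch) {
    // XMLHttpRequest is assumed to be available in the script environment.
    var request = new XMLHttpRequest();
    request.open("POST", ENDPOINT, true);
    request.setRequestHeader("Content-Type", "application/json");
    request.send(JSON.stringify(batch));
}

var timer = Script.setInterval(captureSample, SAMPLE_INTERVAL_MS);

// Flush any remaining samples when the script is stopped.
Script.scriptEnding.connect(function () {
    Script.clearInterval(timer);
    if (samples.length > 0) {
        uploadBatch(samples);
    }
});
```

Because the same data stream is available in real time, a script like this could just as easily feed a live in-world display (for example, a web surface) instead of, or in addition to, a server.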

Published by High Fidelity January 18, 2017