Reverb is a critical feature of 3D audio: in addition to the ‘dry’ primary sound made by people or objects, you hear the reflections of that sound from the walls of a space or nearby surfaces.
To try out the new settings, run reverbtest.js and adjust the sliders to change the sound of the space.
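At its simplest, a reverb control like the sliders above blends the dry signal with the reverberant (‘wet’) signal. Below is a minimal illustrative sketch of that wet/dry mix; the function and parameter names are hypothetical and not High Fidelity’s actual API.

```javascript
// Hypothetical sketch of a reverb wet/dry mix: blends the primary
// (dry) sample with the reflected (wet) sample. Names are
// illustrative, not High Fidelity's scripting API.
function mixReverb(drySample, wetSample, wetLevel) {
    // wetLevel of 0 is fully dry; 1 is fully reverberant.
    return drySample * (1 - wetLevel) + wetSample * wetLevel;
}
```

A slider mapped to `wetLevel` then moves the output smoothly between the untouched source sound and the full room response.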
Check out the example video:
A discovery from creating VR content is that particle systems are a lot more interesting when attached to people via hand controllers. Having created some great examples, like the musical ‘rave gloves’ below, we’ve been working to get the performance of particles as high as possible, with the expectation that they will be heavily used. If you’d like to try these out, use the script: Flowarthut.
Avatars from 3D scans
Our initial research at High Fidelity, which pre-dated the Oculus Rift and Vive, used depth cameras to capture facial expressions and map them live onto cartoon-style avatars. With first-generation HMDs likely to make millions of people want to become avatars, we’ve begun experimenting with using 3D scans of actual people in conjunction with the simpler audio-driven mouth animations that HMD wearers will be limited to. As a first test, we scanned our whole team using two different 3D scanners (one for hair and one for the face), and then used some simple techniques to make the results not (too) uncanny. Our holiday card this year is all of us posing together. Although there is still a lot of work to do, we achieved these results with less than 2 hours of work beyond the initial scans, and the effect of holding our team meetings with our ‘real’ avatars is quite compelling. A great discovery is that, even at the pixel resolution of the Rift, one can instantly recognize one’s co-workers at a glance in a 20-person meeting.
Supporting Large Groups
High Fidelity will achieve high scalability both by maximizing the number of people who can be together on a single server, and by connecting servers together to create larger single spaces. In pursuit of the first goal, we’ve begun optimizations to maximize avatar concurrency for the single-server case. So far, as shown in this image, we’ve been able to support a couple hundred avatars connected to a single server, with the nearest 40 or so rendering at 75Hz on the Oculus Rift using fairly detailed avatars, including joint and facial animation.
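The “nearest 40 or so” behavior can be sketched as a simple distance-based prioritization: sort the connected avatars by distance from the viewer and give only the nearest N the full-rate, full-detail treatment. This is an illustrative sketch only, with hypothetical names, not High Fidelity’s actual implementation.

```javascript
// Illustrative sketch (not High Fidelity's actual code) of
// distance-based avatar prioritization: return the n avatars
// closest to the viewer, which would get full-detail rendering.
function selectNearestAvatars(avatars, viewer, n) {
    return avatars
        .map(function (a) {
            var dx = a.x - viewer.x;
            var dy = a.y - viewer.y;
            var dz = a.z - viewer.z;
            return { avatar: a, dist: Math.sqrt(dx * dx + dy * dy + dz * dz) };
        })
        .sort(function (p, q) { return p.dist - q.dist; })
        .slice(0, n)
        .map(function (p) { return p.avatar; });
}
```

Everything outside the selected set can then be updated at a lower rate or with simpler geometry, which is what makes a couple hundred concurrent avatars tractable on one server.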
With the help of our early alpha users, we took on a challenge to build a mini game that used some of the functionality already available within High Fidelity. The result is Winter Smashup, a fun, time-based game built collaboratively with input from 7 alpha users. The game works best with an HMD and hand controller.
Here is the intro: