We at High Fidelity don’t care for exclusive parties, so I’ll be the benevolent bouncer who gives you a look inside our Alpha testing environment. There are no celebrities in here, but there are a bunch of Alphas who are rock-stars to us.
The videos we’ve posted in the past give a good idea of some of the functionality we’re working on, including facial expressions, realtime voxel rendering, and reduced latency. But the Alphas have all taken on individual projects that are speeding up the work and adding new features. Here’s a look behind the scenes at what they’re doing.
We can’t create a truly open system without making it compatible with other open-source tools, which is why Judas has been creating a workflow that will allow artists to make 3D models in the open source program Blender using HiFi’s native FBX format. Progress has been steady, and Judas is working with both HiFi and Blender developers to make this happen.
“Only last week something was added in that allowed me to import the HiFi avatars into Blender without destroying the rigs we need to animate them,” Judas said.
In another part of realspace, Ai_Austin is building a virtual space for users. Ai_Austin is a proponent of I-Rooms — or “virtual spaces for intelligent interaction” — which will make it possible for users to meet, interact, and ultimately collaborate.
You might have figured out by now that 3D worlds are no good if they can’t handle 3D models accurately, which is why Ai_Austin also tests mesh handling for complex 3D objects. The image above shows the “supercar” mesh, which has 575,000 vertices and 200,000 faces, being tested in HiFi. There are several other meshes he uses, too, including one of the International Space Station that was provided by NASA.
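When you're stress-testing mesh handling with models this large, the first sanity check is usually just counting what's in the file. As a minimal sketch (the parser and file layout here assume a Wavefront OBJ file, which is an illustrative choice, not necessarily the format Ai_Austin's meshes use):

```python
# Hypothetical sketch: counting vertex ("v") and face ("f") records in a
# Wavefront OBJ file -- a quick sanity check before handing a heavy mesh
# like the 575,000-vertex "supercar" to a renderer.

def mesh_stats(obj_lines):
    """Return (vertex_count, face_count) for a sequence of OBJ text lines."""
    vertices = faces = 0
    for line in obj_lines:
        if line.startswith("v "):
            vertices += 1
        elif line.startswith("f "):
            faces += 1
    return vertices, faces

# A single triangle as a stand-in for a multi-hundred-thousand-polygon model:
sample = [
    "v 0 0 0",
    "v 1 0 0",
    "v 0 1 0",
    "f 1 2 3",
]
print(mesh_stats(sample))  # (3, 1)
```

In practice you'd stream the file line by line rather than loading it into memory, which matters once the model is the size of the ISS.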
Care to take a stroll on the ISS? The solar panels look particularly beautiful tonight.
Immersion remains one of our biggest goals for HiFi, which means both visuals and controls that don’t remove users from the experience. The Nintendo Power Glove looks cool, but there’s a reason no one uses it. Fortunately, Ctrlaltdavid is working on a script that will make the Leap Motion device capable of controlling avatar hands and fingers without bulky rigs and sensors.
The Leap Motion sits on a desk and registers movement within a given range. It detects individual hand and finger motions and translates them directly into identical movements for avatars.
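The core idea behind a mapping like Ctrlaltdavid's can be sketched simply: the device reports fingertip positions in space, and the script converts each finger's curl into a joint rotation on the avatar. The function names, the assumed 50 mm finger length, and the 90° joint range below are all illustrative assumptions, not HiFi's or Leap Motion's actual API:

```python
# Hypothetical sketch of device-to-avatar finger mapping. Positions are
# (x, y, z) tuples in millimetres above the sensor, y pointing up.

def finger_curl(tip, knuckle):
    """Approximate curl (0.0 = straight, 1.0 = fully bent) from how far
    the fingertip has dropped below the knuckle."""
    drop = knuckle[1] - tip[1]      # vertical dip of the tip, in mm
    finger_length = 50.0            # assumed average finger length, in mm
    return max(0.0, min(1.0, drop / finger_length))

def curl_to_joint_angle(curl, max_angle_deg=90.0):
    """Map curl linearly onto a finger-joint rotation for the avatar."""
    return curl * max_angle_deg

# A half-bent finger: the tip sits 25 mm below the knuckle.
angle = curl_to_joint_angle(finger_curl(tip=(0, 175, 0), knuckle=(0, 200, 0)))
print(angle)  # 45.0
```

A real script would run this every frame for all ten fingers and smooth the output, but the shape of the problem is exactly this: raw positions in, joint angles out.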
Currently, HiFi supports the Razer Hydra controller, a super-responsive motion capture tool for hands, but it doesn’t track individual finger movements as precisely as Leap Motion does. Ctrlaltdavid said that the Leap Motion will make avatar expressions more accurate, realistic, and meaningful, which is exactly the kind of interaction we’re aiming for. Even better, it doesn’t require users to wear a complex motion detection rig. Move your hands, and your avatar copies them. High five!
This kind of simple but meaningful interaction is the core of what we and the Alphas are trying to accomplish. As Judas put it:
“High Fidelity is about people. A grin, a smile, a hand gesture, a wave — not some pre-recorded gesture — breathe personality into lifeless avatars. [We’re creating] an environment that normal people want to gather in, not because of polycount, latency, or server technology, but because their friends are there. Every game has amazing graphics; HiFi should have amazing people.”
We hope you’ll be one of the amazing people in our world soon.