Over the last month we have been working on some fantastic VR demos, which we will be sharing with you throughout the coming month. Some of the features that make up these experiences are shown in the attached video and detailed below:
Particles and Procedural Textures
We have been expanding what can be done with particles and have introduced procedurally rendered textures. Both capabilities increase the interactivity and immersion of the environment.
A fire effect using particles
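The fire effect above is built from a particle system. The sketch below is not Interface's implementation; it is a minimal, generic illustration of the idea, assuming the usual ingredients: an emitter spawns particles with a position, a velocity, and a lifetime, and each frame the particles drift upward and expire.

```python
import random

# Minimal particle-system sketch (not Interface's implementation):
# each particle carries a position, a velocity, and a remaining lifetime.
# A fire-like effect emits particles near a base point, drifts them
# upward with some lateral flicker, and removes them when they expire.

class Particle:
    def __init__(self, x, y, vx, vy, lifetime):
        self.x, self.y = x, y
        self.vx, self.vy = vx, vy
        self.lifetime = lifetime

def emit(n, base_x=0.0, base_y=0.0):
    """Spawn n particles near the emitter with a slight random spread."""
    return [Particle(base_x + random.uniform(-0.1, 0.1), base_y,
                     random.uniform(-0.2, 0.2),   # lateral flicker
                     random.uniform(1.0, 2.0),    # upward draft
                     random.uniform(0.5, 1.5))    # seconds to live
            for _ in range(n)]

def step(particles, dt):
    """Advance all particles one time step; drop expired ones."""
    alive = []
    for p in particles:
        p.x += p.vx * dt
        p.y += p.vy * dt
        p.lifetime -= dt
        if p.lifetime > 0:
            alive.append(p)
    return alive

particles = emit(100)
for _ in range(10):              # simulate 10 steps of 0.1 s
    particles = step(particles, 0.1)
print(len(particles), "particles still alive after 1.0 s")
```

A real renderer would also fade each particle's color and size with its remaining lifetime, which is what gives fire its glow-and-dissipate look.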
Over a few days in September one of our engineers introduced the ability to create procedurally generated entities. These entities allow you to make custom shader effects (like the ones you can find over at ShaderToy) and bring them into Interface. These effects can be applied to entities and skyboxes, with some amazing results. Just one of the many examples is shown below.
Several examples of entity and skybox procedurally generated textures
If you are interested in learning how to apply your own shaders you can watch this video. Happy shading!
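In Interface these effects are written as GPU shaders, which the video above walks through. Conceptually, a fragment shader is just a pure function from pixel coordinates (and time) to a color. The sketch below, a simplified assumption rather than Interface's actual shader pipeline, evaluates a ShaderToy-style sine-wave pattern per pixel in plain Python to show that idea; in practice the same logic would live in GLSL and run on the GPU.

```python
import math

# Illustration only: a fragment shader maps (u, v, time) -> color.
# This evaluates a ShaderToy-style plasma pattern on a small grid in
# Python; a real procedural texture runs this per pixel on the GPU.

def shade(u, v, t):
    """Return an (r, g, b) color for normalized coords u, v in [0, 1]."""
    r = 0.5 + 0.5 * math.sin(10.0 * u + t)
    g = 0.5 + 0.5 * math.sin(10.0 * v + t + 2.0)
    b = 0.5 + 0.5 * math.sin(10.0 * (u + v) + t + 4.0)
    return (r, g, b)

def render(width, height, t=0.0):
    """Evaluate the shader at every pixel, as a rasterizer would."""
    return [[shade(x / width, y / height, t) for x in range(width)]
            for y in range(height)]

image = render(8, 8)
print(image[0][0])  # color of the top-left pixel
```

Because the pattern is a function of time `t`, re-rendering each frame with an advancing `t` animates the texture, which is what makes these entities and skyboxes feel alive.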
In 2016, when the consumer versions of the HMDs are released, you are also going to be using a hand controller. It is therefore important that your avatar's body moves correctly, driven by the hand data we receive from the controllers. Over the last couple of months a team has been working on avatar kinematics, and the attached animated pictures show a couple of first takes on what is possible.
Using a DK2 and Hydra controller for simple movements
Manipulating blocks using hand controllers
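Driving an avatar from hand data is an inverse-kinematics problem: given where the controller says the hand is, solve for the arm's joint angles. The sketch below is a minimal 2D two-bone solver using the law of cosines, offered as an illustration of the general technique, not as Interface's actual avatar code; the arm lengths `L1` and `L2` are made-up values.

```python
import math

# Minimal 2D two-bone inverse kinematics (illustrative, not Interface's
# avatar code). Given a tracked hand position relative to the shoulder,
# solve the shoulder and elbow angles for an arm with upper-arm length
# L1 and forearm length L2, then verify with forward kinematics.

def solve_arm(target_x, target_y, L1=0.3, L2=0.25):
    d = math.hypot(target_x, target_y)          # shoulder-to-hand distance
    d = max(min(d, L1 + L2 - 1e-9), abs(L1 - L2) + 1e-9)  # clamp to reachable range
    # Law of cosines at the elbow: d^2 = L1^2 + L2^2 - 2*L1*L2*cos(pi - elbow)
    cos_phi = (L1**2 + L2**2 - d**2) / (2 * L1 * L2)
    elbow = math.pi - math.acos(max(-1.0, min(1.0, cos_phi)))
    # Shoulder angle: direction to target minus the interior triangle angle
    cos_a = (L1**2 + d**2 - L2**2) / (2 * L1 * d)
    shoulder = math.atan2(target_y, target_x) - math.acos(max(-1.0, min(1.0, cos_a)))
    return shoulder, elbow

def forward(shoulder, elbow, L1=0.3, L2=0.25):
    """Forward kinematics: hand position from the two joint angles."""
    ex = L1 * math.cos(shoulder)                # elbow position
    ey = L1 * math.sin(shoulder)
    hx = ex + L2 * math.cos(shoulder + elbow)   # hand position
    hy = ey + L2 * math.sin(shoulder + elbow)
    return hx, hy

shoulder, elbow = solve_arm(0.4, 0.2)
hx, hy = forward(shoulder, elbow)
print(round(hx, 3), round(hy, 3))  # lands back on the target (0.4, 0.2)
```

A full avatar solver works in 3D, blends IK with animation, and handles unreachable targets more gracefully, but the core per-arm computation is this same triangle solve.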
Our CEO, Philip Rosedale, had a great interview on NPR's TED Radio Hour. You can hear the talk here.
Look out for some more exciting updates in October.