More than 20 years before the Oculus Rift, award-winning recording artist and producer Thomas Dolby invited visitors to New York’s Guggenheim Museum to don unfamiliar headsets and experience a mind-bending musical performance in virtual reality. For Thomas, it was the beginning of a decades-long interest in virtual music production and performance.
More recently, we’ve been lucky to work with Thomas on several VR experiences of our own. He scored our Halloween “Zombie Island” events in real-time and headlined High Fidelity’s first virtual reality festival, FUTVRE LANDS, in what, to our knowledge, is the largest live concert ever staged entirely in VR.
Today, we’re excited to make the collaboration official. Thomas Dolby is formally joining High Fidelity’s board of advisers!
While his 80s synth-pop hit “She Blinded Me with Science” made him a household name, and albums like The Golden Age of Wireless, Aliens Ate My Buick and the category-defying A Map of the Floating City cemented his place in music history, Thomas’s long and distinguished career as a technology pioneer deserves recognition of its own. In the ’90s he founded software synthesizer company Beatnik, Inc., where he helped develop the first polyphonic ringtones. His audio software was licensed to every major cell phone company, including Nokia — which used it to create the famous ringtone one journalist described as “the international anthem of telecommunication.”
Having pushed beyond the artistic frontier in his own work, he’s now helping the next generation do the same. He currently leads the Music for New Media program at Johns Hopkins’ Peabody Conservatory, where he teaches students how to compose and produce music for non-linear forms of entertainment, including virtual reality.
We’re proud to welcome this artistic trailblazer to the High Fidelity team, where he joins other noted visionaries on our advisory board. Read more about Thomas’s early VR experiments, his experience at FUTVRE LANDS, and his hopes for virtual reality in the Q&A below.
High Fidelity: Throughout your career, you’ve combined music and technology, analog and digital, in creative and unexpected ways. What drew you to VR, and High Fidelity in particular?
Thomas Dolby: I co-developed my first VR app in 1993 — it was ‘The Virtual String Quartet’, an installation at the Guggenheim Museum in New York. At that time the graphics were awful — it took 3 months to program and ran on an IBM 386 with an 8-bit HMD at about 5 frames per second, but the sound was GREAT! There was a line right around the block for a week for the 5-minute experience, but nobody came back a second time. So it was clear to me that 1993 was way too early for VR. Now 25 years later, technology has finally caught up — and it’s open source, extensible, and free. I like High Fidelity’s platform better than the others because it’s easy to create VR *in* VR — even the music. When I worked with High Fidelity to perform the live score for “Escape from Zombie Island” while looking down upon the mortals from a virtual platform 500 feet in the air, I knew the new era had truly arrived!
HF: At High Fidelity’s FUTVRE LANDS festival, you played for an audience of hundreds of avatars. What are your thoughts about that experience, and how did it compare to performing in person?
TD: It was fun as hell, but very stressful! All the usual issues at a live gig — the instruments, the lights, the PA, stage directions, crowd control, etc. — plus the added challenges of seven trackers on my hands and feet, running on a breakthrough VR technology with the servers pushed to the limits. And although it was my 3D avatar up there on the stage, it was still ME that would fall flat on my face and look and sound like an idiot! But that said, I was delighted with how well the audience played along and got into the spirit of it. Everybody seemed to have a great time, as they knew they were part of something historic. I am sure the experience will only get better from here.
HF: From an artist’s perspective, what do you see as the single most important advance still to come for VR?
TD: What puts me off the idea of doing VR on any given day is that I’ve got to fire up a (horrible) Windows PC, check all the wires and the sensors, get into the HMD and headphones, then find my way to the right app; and even then I’m tethered in one place and I’ll get the cord wrapped around my legs or my chair. So as you can imagine, I’m very excited about wireless standalone headsets like the Oculus Quest that are coming in 2019. Even if it’s a slight step backwards in graphics quality, I’m sure Moore’s Law will soon take care of that. And the price point is fantastic, on a par with a game console. So I’ve no doubt this year will be a turning point for the adoption of VR, and the volume of content available for it.
HF: It’s often observed that early TV imitated theater before evolving a style of its own. Do you think something similar will happen with music, or other kinds of performance, in virtual reality? What will change for artists once you remove the limitations of the physical world?
TD: When video games came along in the early 90s, many artists and musicians were very inspired by the idea of non-violent, experiential content, and the opportunity to create for it. There was a brief moment when games like Myst seemed to offer an alternative to the proliferation of Doom-type games; but that era sort of fizzled. Now with VR going mainstream, there’s new hope for imaginative, non-violent content, and incredible new forms of creative expression for artists like me. This will include music applications as well as role playing, social games, and transmedia experiences. VR is still clunky, but once it smooths out you’ll see much more intuitive human-computer interfaces — utilizing the physical skills we all grew up with, rather than a QWERTY and mouse or game controller. In the music domain this is thrilling, because the UI metaphor for most music software is based on you having knowledge of analog recording studios, which are waaay obsolete! I’d much rather paint in air or flap my arms around to control music; and the audience can hover around me or even join in!
HF: You’ve started incorporating VR into your courses at Johns Hopkins. What have you found most interesting about the way your students approach VR? Do they see possibilities that hadn’t occurred to you?
TD: At the Peabody Institute, we’ve launched the first music composition degree program of its kind at a major US university. These students will be the first to receive training in the fundamentals of audio for VR and games, building on the legacy of 100 years of Hollywood music scoring. In 4–5 years they will enter the workforce, and by that time we may have AI characters, full body trackers and haptic feedback, custom in-ear monitors, smart contact lenses, even neural implants. My students have their own attitude to VR, and a level of enthusiasm that’s nothing I can teach them. There is no rule book as yet for this stuff, so it will be made up as they go along. Should VR sound be spatialized? Should it be mixed with real world ambiance? How do you juggle the multiple sources of other players’ voices, NPCs, sound FX and music? Do we really need an orchestra over there telling us how to feel? Hey, I don’t know the answers either, but as a class we will work it out together!