Oculus Rift inventor Palmer Luckey was in Cambridge Saturday, at the Microsoft NERD Center, for a recruitment event and mini virtual reality conference.
If you couldn’t make it, you missed three different tracks of presentations, running from 3 p.m. to 9 p.m., plus demos of several Oculus Rift games.
Most of the focus of the event was on gaming, and probably the majority of those in attendance were in the gaming industry, or studying a related field. But there was also some discussion of the use of virtual reality for training, education, business, and other serious applications. And, of course, for virtual social worlds like Second Life and OpenSim.
“We’re building the metaverse,” said Luckey. “Virtual reality allows for a massive leap forward in innovation.”
He specifically addressed the fact that the Oculus Rift makes many types of games more accessible to the general public. Today's game controllers come with multiple joysticks and buttons, and players not only have to be able to use the controllers without looking at them, but also have to quickly enter complex button combinations.
The Oculus Rift allows for a more natural interface. For example, to look around in a world, players simply move their heads. Compare this to the Alt-plus-left-click camera combination needed in Second Life and OpenSim viewers.
“This will make games more accessible to non-players,” he said.
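To make the contrast concrete, here's a minimal sketch of what head-tracked camera control boils down to. The Camera class and the read_hmd_orientation() poller are illustrative stand-ins, not the API of any real headset SDK.

```python
import math

class Camera:
    """World camera whose orientation simply mirrors the player's head."""
    def __init__(self):
        self.yaw = 0.0    # radians, left/right
        self.pitch = 0.0  # radians, up/down

def read_hmd_orientation():
    # Placeholder: a real SDK would return sensor-fused angles from the
    # headset's gyroscope, accelerometer, and magnetometer.
    return 0.1, -0.05

def update_camera(camera):
    # No buttons, no modifier keys: the head itself is the input device.
    yaw, pitch = read_hmd_orientation()
    camera.yaw = yaw
    # Clamp pitch so the view can't flip upside down.
    camera.pitch = max(-math.pi / 2, min(math.pi / 2, pitch))

cam = Camera()
update_camera(cam)  # called once per rendered frame
```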
In addition, designing interfaces for video games is extremely demanding. Virtual world platforms used for more serious applications, such as training and education, tend to be more realistic and straightforward. As a result, serious virtual world applications will benefit significantly from the work already being done in video games.
“If you can make it work in games, you can make it work anywhere,” he said.
The Rift is great
I went into the demos expecting to see some of the issues that people had been writing about, such as the “screen door effect” resulting from the low-resolution display.
Even though I was trying out the first generation of Rifts, not the latest high-def displays or the even-higher-def displays planned for the eventual consumer version, I had no problems with the graphics at all, even without adjusting the optics in any way.
I put on the goggles — which felt as light as skiing goggles or snorkeling masks — and was inside the world. There were no visual cues to indicate that I wasn’t there — no light from the room around me leaking in around the edges, no frame around the view. I had my field of vision, and could look around and feel that I was in a different place.
I tried four different demos. In one, I was a disembodied point of view trying to walk along a narrow bridge through a psychedelic fog. Then there was the skydiving simulation, where you have to avoid hitting obstacles as you fall. I wasn’t a fan of either of these — they didn’t feel particularly real or engaging.
Then I tried out a helicopter simulation, and though you couldn’t actually manipulate helicopter controls, it felt as though you were a helicopter pilot, flying around a giant toy city. Let’s just say that I’m not about to go and get my helicopter flying license. And also — I get motion sickness in a virtual helicopter that’s making a lot of crazy turns. I stopped playing after a few minutes because of queasiness. But I have a feeling that I would have reacted the same way in a real helicopter, as well.
Finally, my favorite simulation was a space flight game called Gimbal Cop, from Defective Studios.
You’re in the cockpit of a spaceship. You can look down and see your chair, and look all around the cockpit, which is essentially a set of glass panes arranged into a hemisphere around you. Your goal is to fly through a series of hoops floating in space, without ever crossing the track your ship leaves behind as it flies.
The presence of the cockpit gives a nice anchoring effect, and makes the game seem very real — even though you’re in space and the objects you’re flying around are anything but realistic. And, according to one of the presentations Saturday night, cockpits and similar anchoring elements help reduce motion sickness.
I crashed quickly, but went back for a second round. The game was definitely a lot of fun to play in immersive 3D.
My recommendation? If you have $300 to spend, the current developer kit version of the Oculus Rift is already pretty good: good enough for building simulations, designing interfaces, and creating virtual worlds. If this is important to you, go order a kit.
There’s no official release date yet for the consumer product, but from what I heard at the conference, it may hit the market in the second half of 2014. That’s a long time to wait. You might as well buy the dev kit now, especially since OpenSim now has viewer support for the Oculus Rift, and a rudimentary user interface.
Still a lot of questions
In fact, the user interface is a big challenge, and not just for Second Life and OpenSim. Palmer Luckey talked about some of the problems in his presentation. For example, menus and icons arranged around the edges of the screen are pretty much unviewable in the Rift, since they’re at the very edge of your field of view and if you turn your head to look at them directly, they turn with you.
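One common workaround is to anchor menus in world space, a short distance in front of the player, instead of locking them to the display. Here's a rough sketch of the idea, with hypothetical names (spawn_menu_panel, player_yaw) rather than any particular engine's API:

```python
import math

def spawn_menu_panel(player_position, player_yaw, distance=2.0):
    """Place the menu as a panel two meters ahead of wherever the player
    is facing right now. Because the panel then stays fixed in the world,
    turning your head brings it into the center of view instead of
    dragging it along at the edge of the lens."""
    x, y, z = player_position
    panel_x = x + distance * math.sin(player_yaw)
    panel_z = z + distance * math.cos(player_yaw)
    # The panel faces back toward the player.
    return {"position": (panel_x, y, panel_z), "facing_yaw": player_yaw + math.pi}

menu = spawn_menu_panel(player_position=(0.0, 1.6, 0.0), player_yaw=0.0)
```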
In addition, while the Rift does make it easier to look around inside a virtual world, it makes moving around and interacting with the environment a bit harder, since you can’t see the keyboard. A Siri-style voice-based interface would be good, but isn’t there yet.
For motion, you can either fork over another $500 for the Virtuix Omni treadmill, which requires you to stand the whole time, or play sitting down using a mouse, joystick, or keyboard commands. Neither is an optimal solution.
Finally, to really interact with a virtual environment, you need to be able to see and control your own hands and feet in that environment.
One promising approach is mind-reading. I’m not kidding — there are already folks out there combining the Rift with devices that read brain waves.
The video above demonstrates the Oculus Rift used in conjunction with the Emotiv EPOC mind-reading headset for motion, with hand position tracked by the Razer Hydra.
As the mind-reading devices get better, I can see using them for selecting objects that you’re looking at and for basic in-world commands, in addition to the simpler tasks of moving left, right, or forward.
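As a rough illustration of what such a rig implies, the sketch below maps discrete "mental command" events onto avatar movement. The command labels and the next_mental_command() poller are hypothetical placeholders, not the actual Emotiv API.

```python
# Map trained "mental commands" to movement deltas (x, z).
MOVES = {
    "push":  (0.0, 1.0),   # think "push"  -> step forward
    "left":  (-1.0, 0.0),  # think "left"  -> strafe left
    "right": (1.0, 0.0),   # think "right" -> strafe right
}

def next_mental_command():
    # Placeholder: a real headset SDK would classify EEG patterns the
    # user has previously trained against these labels.
    return "push"

def step(avatar_pos):
    dx, dz = MOVES.get(next_mental_command(), (0.0, 0.0))
    return (avatar_pos[0] + dx, avatar_pos[1] + dz)

print(step((0.0, 0.0)))  # -> (0.0, 1.0)
```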
This is all still a ways away, however.
Meanwhile, the big question is whether folks are willing to pony up $300 — or whatever the final consumer version of the Oculus Rift will cost — for another gaming peripheral. Of course, the price will drop over time. Luckey predicted that the cost of VR goggles will be under $50 within ten years, based on the rapidly falling prices of screens, processors, and sensors.
But that might not be enough for mainstream adoption.
In addition to the current interface issues, the price tag, and the motion sickness, another problem with the Oculus Rift is the totality of its immersion. When you’re in there, you’re in there. You can’t see the giant spider crawling over your knee, the stranger breaking into your house, your dog stealing the last cookie from your plate, or your teenage daughter leaving the house in that miniskirt that covers absolutely nothing and which she’s been specifically told not to wear.
One researcher presenting this weekend said that he actually felt uncomfortable when he was alone in a room with young female test subjects. Wearing the Rift makes you uniquely vulnerable. You can’t see the real world around you, your hearing is occupied with in-game sounds, and your other senses are dulled because you can’t — yet — use them in the virtual reality environment. In fact, virtual reality simulations are now being used to help burn victims deal with excruciating pain. Not only do patients notice the pain less because they’re distracted by the game, but MRI measurements show that the brain’s pain signals are actually reduced.
It’s one thing, however, to be lost in a virtual reality SnowWorld while safely bundled up in a hospital bed, where the only people causing you pain are doctors and nurses with your best interests at heart. It’s entirely different if you’re in a home, office, or airplane, where you need to have some minimal awareness of your surroundings.
Finally, for virtual reality to gain any significant traction, we need a dramatic change in our computing environments. The graphical user interface and the mouse were around long before Windows, but it took Microsoft putting them on every desktop to convince people to switch over. I remember people, mostly crotchety geezers, complaining that the mouse was a toy, that no self-respecting person would ever use it, and that they were planning to stick with the much more efficient DOS operating system, thank you very much.
As far as I know, other than a few patents filed here and there, nobody is seriously working on a 3D operating system: one that allows us to manage not only the information on our computers, but also the experiences and environments we have built, and the ones shared with us by others.
Of course, there might be a team at Xerox PARC, Apple, or Google working on one right now that isn’t ready to talk about it just yet.
Or it might be someone from left field, like Jesse Dostie, a Hartford-based engineer interested in combining Linux with OpenSim or a similar platform. (If anyone is interested in working with him on this, drop me a line and I’ll put you in touch.)