At some distant point in the future, virtual reality will be good enough and immersive enough that we can telecommute to our jobs by simply stepping into some version of a Holodeck or plugging into some version of the Matrix.
Today’s virtual reality has only very limited use for the average corporate employee.
There are some uses. Rapid prototyping, virtual simulations, the occasional virtual meeting or conference. But these are, right now, very much niche applications.
One problem is that a typical job involves dealing with two different types of tasks — ones that involve sharing and processing experiences, and ones that involve processing and sharing information.
Virtual reality is great at the experience part. You and your colleagues can, say, ride a rollercoaster together, visit a mockup of your new office building together, or attend a quarterly earnings presentation together.
And your current computer setup is optimized for dealing with information. You’ve got your word processing, your email, your Internet, your spreadsheets, your Skype — everything you need to move data around.
The problem is when these two things overlap. You don’t want to edit a word processing document by waving your hands in the air — you would get very tired, very quickly. The mouse-and-keyboard combination allows you to work for hours at a time. But to use it, you’d have to leave the virtual world.
This is why so many people are frustrated at attending meetings in Second Life or OpenSim — it takes a long time to get into the world, to get the audio to work, to make sure everyone is wearing clothes — and then, if all you do is watch a PowerPoint presentation, what was the point? You could have watched the same presentation online and had a quick conference call, instead.
But with the online presentation, you don’t get the sense of presence — you’re missing the experience part of the equation. You don’t get to interact with your co-workers, to gossip with them afterwards around the water cooler, to catch lunch together after the meeting. And that’s exactly when a big chunk of work actually happens.
But it just occurred to me that we do actually have a technology that combines the two. And I hate to say it, it really feels bad to have to admit it, but it’s Google Glass.
Imagine that you’re sitting in your home office, working at your computer. But when you look up, instead of seeing the walls around you, you see the rest of your company’s office, your co-workers, your boss. They can walk over and see what you’re working on, catch you up on the latest company gossip.
Google Glass — or some future variant of it — would allow you to see your computer, your keyboard, and your office furniture, while also interacting with a virtual environment.
I got the idea when looking at the Atlas system for the Oculus Rift (now running a Kickstarter campaign).
The Atlas uses an iPhone camera to map your real environment into a virtual environment, so you can walk around your actual living room while you think you’re walking around inside, say, a zombie castle. I wondered whether the same idea could be used to integrate a computer, keyboard and chair into a virtual environment. But then you’d have a camera projecting your actual computer screen into a virtual world, which is just weird. Going the other way — looking at your actual screen, but having virtual people projected into your real office — makes more sense, and also means you’re not wearing a heavy Oculus Rift on your head all day.
Instead, you’re wearing a pair of glasses, which many of us wear anyway.
How long will it take for us to see this hardware?
When I started writing this column a couple of days ago, I would have guessed it would take a year or two for someone to take the Google Glass idea and combine it with full stereoscopic 3D and a video camera.
Turns out, they’ve already done it.
They’re called Space Glasses and they’re already available to order, for $667 each, from Meta, for November delivery. The company was founded just last year by students at Columbia University.
The folks at Meta think their glasses will make computers obsolete because any blank wall can become a computer screen, and any flat surface a keyboard.
I’d probably still want a keyboard, though — I like the tactile feeling of the keys. But I don’t think I’ll mind getting rid of all the monitors I have on my desk.