The History of Virtual Reality & The Future: Rift, Omni, STEM, castAR

Published: Sunday, 20 October 2013 15:38

The concept of a "virtual" reality has existed for decades and has nebulous origins, but the first technological steps can be pinpointed to Ivan Sutherland's head-mounted display (HMD); the device, lovingly-dubbed the Sword of Damocles for its massive size and imposing demeanor, was built in 1968 and placed the user into wireframe rooms. The term itself, "Virtual Reality," didn't even popularly exist until 1985.

Since Sutherland's pioneering innovations, the industry has gone through disorienting cycles of ups-and-downs for Augmented & Virtual Reality tech. There were holes in the yet-unfolding plot: missing technology (we'd only just moved from tubes to transistors), a smaller pool of talent, and interest and funding concentrated primarily in medical and military-industrial fields.


The equipment that was purpose-built for those fields made tremendous technological leaps, but would by-and-large never be faced with a consumer. And, as with many technologies that started in the military, much of the early VR/AR equipment was classified.

Jumping to the modern age, we see a resurgent force of VR companies garnering significant online support among the gaming and enthusiast communities. Arguably championed by the team at Oculus VR, the movement for new, consumer-accessible virtual reality hardware is finally building momentum.

But how did we get here?

After Oculus took the world by storm and built a sturdy backbone of supporting developers, other VR tech finally saw the opportunity to step into the light. Over the course of several conventions and a couple of follow-up calls, we've spoken to developers involved with the key emergent VR groups right now: Sixense (wireless motion tracking, an interface), Oculus VR (the Rift, a head-mounted display), and Virtuix (omnidirectional treadmill for physical movement, an interface). All of these devices linked together look something like what you'll find in the below video, shot at PAX Prime:

In this article, we'll give a brief recap of the history of virtual reality, talk about emergent VR tech (Oculus Rift, Sixense STEM, Virtuix Omni), and discuss the fusion of these technologies to create a fully-immersive gaming experience. Other topics -- like user price-accessibility and top-level hardware discussion -- will see supplemental coverage throughout the piece.

For a history lesson and industry analysis, we recruited experts Amir Rubin (Co-Founder, Sixense) and ~30-year VR veteran Paul Mlyniec (Head of Development, MakeVR); you'll find quotes and insight from these individuals below.

It should be noted that VR, AR, and other technologies, like biometric feedback, are each massive subjects in their own right. Books are written on each topic, so we're focusing on the need-to-know information for gamers and hardware enthusiasts.

First: Why Talk About VR Now? 

Virtual reality technology has been prohibitively expensive and bulky up until recent innovations. The Sword of Damocles, Sutherland's first HMD, took an entire room and could never have been dreamt affordable outside of research, military, and industrial environments.

According to Mlyniec (in the industry since ~1984, started studying holography in the 70s) and Rubin (in the industry since ~1989), the biggest barriers to VR market penetration have been consumer accessibility, input devices, and game development frameworks. Those barriers have been broken by newer, more compact, more easily-fabricated hardware and expanded consumer interest -- gaming is a large driver of this resurgence.

A Quick Recap: The Turbulent VR Industry & A History of Virtual Reality 

It's been a long road to get to recent hype and innovations.

1960s - Sutherland's "The Ultimate Display" essay ('65) and the HMD it inspired ('68) are really where VR/AR became viable "things." Up until that point, we'd mostly been looking through (still impressive) viewfinders at stills and photography. That HMD, the Sword of Damocles, took the better part of a room and exposed users to now-primitive wireframe interiors. Here's one of the earlier top-level block diagrams of Sutherland's HMD (and next to it, an unintentionally-creepy photo of the device in use):

[Image: block diagram and photo of Sutherland's HMD]

Alongside Sutherland's innovations, the military continued investing in flight simulation software and hardware. The early machines included mechanical pivot/yaw motion (you've all seen something like this in an arcade or mall), joystick and interface input, and a cockpit, but no on-screen graphics. As things progressed, some flight simulators became capable of pre-rendered video output (either real camera footage or graphics), but nothing dynamic or particularly interactive.

This would soon change.

In 1968, David Evans and Ivan Sutherland formed Evans & Sutherland. This firm would help revolutionize computer graphics and, indirectly, eventually produce the founders of the likes of Silicon Graphics, Pixar, and Adobe.

1970s - The University of Utah had already recruited Evans (c. 1965) and Sutherland (1968) as professors, rapidly making UU the hub for everything CS- or CG-related. Students of Sutherland -- like Edwin Catmull, a Boeing veteran and aspiring physicist -- would eventually bridge traditional animation with computers, creating 3D computer-animated graphics. Military, medical, and industrial simulators rapidly adopted this technological advancement.

The 70s at UU are really where all the great minds merged: with Sutherland and Evans as established researchers, the University attracted folks like John Warnock (co-founder of Adobe, co-creator of PostScript) and Jim Clark (founder of Silicon Graphics, Inc. -- eventually outpaced by the yet-unborn nVidia and ATi).

And in 1977, of course, Star Wars (later subtitled Episode IV: A New Hope) made the relevance of computer graphics a bit more mainstream.

1978 saw the birth of x86 architecture, pioneered by Intel's 8086 processor. More on this in a moment.


1980s - Innovations were supported by greater accessibility to 16-bit CPUs. The world of VR was at one of its peaks in the mid-1980s, when Scott Fisher led NASA Ames' VR group in the development of its own VR hardware and software. Under NASA Ames, Fisher's team had developers building HMDs (McGreevy, Humphries, Fisher), their Virtual Interface Environment Workstation, and gloves (Mark Bolas, Jaron Lanier). Lanier went on to popularize the term "virtual reality" in 1985, work on the DataGlove, and continue the advancement of high-end VR through the 90s.

"They were doing something, in a sense, like a quasi-augmented reality, and they were way ahead of their time." - Paul Mlyniec.

Rising star VPL packaged HMDs with gloves and Polhemus / Ascension tracking, which Mlyniec says made VPL the "first company to really commoditize head-mounted displays."

1984 also welcomed Gary Bishop's design for 3-dimensional computer input. The design used optical sensors that compiled multiple one-dimensional images to track and relay the distance traveled by the device, immensely improving passive tracking.

These years mark the maturity of many aspects of computer technology and, to many degrees, the structural foundation for modern VR/AR technologies.

[Image] Some of VPL's early VR tech, including the DataGlove.

1990s - In the early 90s, location-based entertainment was booming and powerful game consoles and PCs were spreading like wildfire. In '91-93, Virtuality Group created a line of then-advanced VR-equipped arcade game machines with stereoscopic 3D visuals; the HMD consisted of a visor fitted with two displays (capable of a mind-blowing 276x372 resolution).

Then, as with most budding industries, there was a hard crash in the mid-to-late 90s. Mlyniec, recalling these events, said:

"[Lanier] did a lot to single-handedly advance the industry, and at some sense even drove the height of the first wave of Virtual Reality in the early 90s. [This] could be characterized as a bubble that burst in the late 90s. It looked like there was no stopping Virtual Reality -- but what did stop it was the cost of every piece of equipment.

The tracking was five-, seven-, ten-thousand for a pair of hands. The head-mounts were expensive, the best you could do was about $15,000 by then. You could get a reasonably-powered graphics system for around $50,000 [from SGI]. There was no possibility of using it at a consumer level - games were out of the question - so it just existed in labs and government. It couldn't break through the barrier."

No matter how good the technology got or how small the transistors were made, VR just didn't have the option to saturate any lower-level markets due to cost of deployment. Innovation stalled: the still-new industry had boomed to a point where it collapsed under its own weight, creating a lull in the market.

"I think some of us hoped that it was so attractive that the prices would come down, but they just plain didn't."

As the decade continued and rolled into the early 2000s, SGI (Silicon Graphics, Inc.) found itself threatened. The company -- responsible for producing some of the most powerful graphics boxes of the time -- had been thoroughly thrashed by rising forces nVidia and ATi. nVidia recruited many of SGI's best engineers and began working on low-end, consumer-class hardware in the form of video cards; SGI's philosophy, up until then, was to produce the entire computing unit -- including graphics processing -- and sell it as a biz-client/enterprise solution. nVidia swooped in at the ground level, focusing strictly on cards and building an audience of consumer and SMB clients.

With the rise of the giants, we arrive at the modern era. Continue on to Page 2 for the future of VR.


Last modified on Sunday, 20 October 2013 17:36
