But this year, the changes in this technology were more evolutionary than revolutionary. Some players in the industry (most notably Qualcomm) have started using the term XR to describe the convergence of AR and VR. If you couldn't make it this year, here are the 9 things you need to know about XR at CES 2019.
1. HTC Vive has added eye tracking
HTC Vive has added eye tracking to their desktop-based VR headset. The feature works well at tracking where the wearer is looking in the virtual world. They had a couple of working demos of environments where you could use your eyes to select menu options on the screen. It quickly became evident to me that the process of staring at a spot on the screen was tedious. It was surprisingly hard for me to stare at one location on a screen while graphics were moving in the background; maybe I'm just extra twitchy. I think the feature will be best used to tune game dynamics, either to present things in the user's view or to sneak things into the virtual world where the user is not looking. For now, I'll stick to handheld controllers for rapid interaction in the virtual world.
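The gaze-based menu selection in those demos is usually built on a dwell timer: the system accumulates how long your gaze lingers on a target and activates it once a threshold passes. Here is a minimal sketch of that pattern; the class name, thresholds, and coordinate convention are my own illustrative assumptions, not the actual Vive eye-tracking API.

```python
# Sketch of dwell-based gaze selection: stare at a target long enough
# and it activates. Any glance away resets the timer, which is exactly
# why sustained staring feels tedious in practice.
import math

class DwellSelector:
    def __init__(self, dwell_seconds=1.0, target_radius=0.05):
        self.dwell_seconds = dwell_seconds  # how long the gaze must linger
        self.target_radius = target_radius  # hit radius, normalized screen units
        self._elapsed = 0.0

    def update(self, gaze_xy, target_xy, dt):
        """Feed one gaze sample; return True once the target is selected."""
        if math.dist(gaze_xy, target_xy) <= self.target_radius:
            self._elapsed += dt
        else:
            self._elapsed = 0.0  # looking away resets the dwell timer
        return self._elapsed >= self.dwell_seconds

# Simulate ~1.3 s of 90 Hz gaze samples holding steady near a menu item.
selector = DwellSelector()
selected = False
for _ in range(120):
    selected = selector.update((0.51, 0.49), (0.5, 0.5), dt=1 / 90)
    if selected:
        break
```

The reset-on-glance-away behavior is the design choice that makes dwell selection reliable but slow, which is why handheld controllers still win for rapid interaction.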
2. The Vive Focus Is a Great Low-Cost Solution
I found the HTC Vive Focus, their stand-alone VR headset, to be a compelling solution for lower-cost VR deployments or corporate demonstrations. The headset offers 6 degrees of freedom, meaning it tracks not just the yaw, pitch, and roll of your head but also its position. This feature is important if you want to move around a space, duck under obstacles, or see things from a different perspective. The width of the field of view on the Focus was noticeably constrained when compared to the desktop Vive, which is a drawback, but preferable to having a wide field of view and poor tracking or image quality. I felt that earlier standalone headsets would just inoculate users against getting excited about VR; I think the Focus provides a sufficiently compelling experience to excite users about VR.
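The 3DOF-versus-6DOF distinction can be made concrete with a small sketch. The field names below are illustrative, not any vendor's SDK; the point is simply that a 6DOF pose adds a tracked position on top of orientation, which is what makes ducking and leaning work.

```python
# A 3DOF headset reports only orientation (yaw, pitch, roll); a 6DOF
# headset like the Vive Focus also reports the head's position, so
# leaning or ducking actually moves the virtual camera.
from dataclasses import dataclass

@dataclass
class Pose3DOF:
    yaw: float    # rotation around the vertical axis, degrees
    pitch: float  # looking up or down
    roll: float   # head tilt

@dataclass
class Pose6DOF(Pose3DOF):
    x: float = 0.0  # meters, side to side
    y: float = 0.0  # meters, up/down -- ducking changes this
    z: float = 0.0  # meters, forward/back -- leaning changes this

# With 6DOF, ducking under a virtual obstacle is just a change in y:
standing = Pose6DOF(yaw=0.0, pitch=0.0, roll=0.0, y=1.7)
ducking = Pose6DOF(yaw=0.0, pitch=0.0, roll=0.0, y=1.2)
head_drop = standing.y - ducking.y  # real head movement the headset can track
```

On a 3DOF headset that same duck would change nothing in the virtual world, which is why position tracking matters for moving around a space.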
3. More Haptics and Improved Proprioception
More vendors were displaying enhanced proprioception and haptic devices, like the TeslaSuit shown here, to enhance the VR experience. While none were ready for real-world deployment, progress is being made. We're probably 2-3 years away from having a usable haptic glove for VR experiences. VR bodysuits are even further out. Sorry, Ready Player One fans, we're a long way from that experience.
4. Tracking is Improving
VR tracking is seeing improvements. More vendors were exhibiting open-space tracking, both with 'lighthouses' and without them. The implementations of 'inside-out' tracking (where the headset tracks its own location visually), like the announced HTC Vive Cosmos, are starting to become practical. This will eliminate the need to set up external sensors to track the headset. Other companies are enabling VR in large open spaces using many external sensors.
5. 360 Degree Video Streaming via Cellular
Qualcomm was demonstrating streaming of 360° video over 5G. The video quality and frame rate were pretty good for cellular connectivity. It will be interesting to see if the promise of this much bandwidth actually comes to fruition as 5G rolls out over the next 24 months.
6. Isolating via Visors
Qualcomm also demonstrated a simple visor system for use on airplanes to provide a cocoon of isolation from the indignities of modern air travel. The experience was definitely a step up from the seat-back video screen. One thing that wasn't clear was how the current regime of interrupting seat-back video for pointless pilot announcements would translate to this immersive headset. I can imagine that it will be very jarring to have your surround movie experience interrupted by the announcement that they are now passing out the bread and water rations.
7. Visors Still Have a Way to Go
None of the AR visors on display impressed me. They either suffered from poor tracking, occluded too much of my vision, or felt like I had a brick strapped to my face. It seems that AR visors are stuck in a catch-22 right now. They can either artificially constrain your field of view, like the Magic Leap, and provide an experience that fills the constricted field of view, or they can let you see what's around you but have noticeable, and distracting, edges to the AR experience. I don't think we'll have a compelling AR experience until you can have both natural peripheral vision and augmentations that fill that space. Vendors are working to solve the weight problem by moving the compute power off the headset into pucks that go in your pocket.
8. Vuzix Glasses Aren't the New Improved Google Glass
I tried on a pair of smart glasses from Vuzix with high hopes. They were promoted as a Google Glass-like product done right. But my hopes were dashed when I tried them on. While the image quality was worlds better than the Glass, the display was much too prominent in the one eye it appears in. The glasses are set up for viewing content in the right eye, and the content sits right in the center of your field of view, so there's no looking around it. Worse yet, I'm left-eye dominant, so I had to concentrate to see the content through my right eye, and while doing so I was blind to the outside world.
9. Avoid Laser Images If You Want To See Anything Else
I tried a headset that used a laser to project images onto my tender retinas. The image quality was good, but again, it became the only thing you could see. Next time, I'll skip the retina laser blasting device.
Overall, progress is being made in practical, cost-effective XR equipment, but the industry appears to be a couple of years away from having compelling solutions for long-term enterprise deployments of the technology. My advice: keep an eye on the space, dream about what could be done, and judiciously prototype, but don't plan on major deployments this year.