A first try of Apple’s $3,500 Vision Pro headset

I got a sneak peek into Apple’s vision for the future of computing Monday. For about a half-hour, I wore the $3,500 Vision Pro, the company’s first high-tech goggles, which will be released next year.

I walked away with mixed feelings, including a nagging sense of skepticism.

On one hand, I was impressed with the quality of the headset, which Apple bills as the beginning of an era of “spatial computing,” where digital data blends with the physical world to unlock new capabilities. Imagine wearing a headset to assemble furniture while the instructions are digitally projected onto the parts, for instance, or cooking a meal while a recipe is displayed in the corner of your eye.

Apple’s device had high-resolution video, intuitive controls and a comfortable fit, which felt superior to my experiences with headsets made in the past decade by Meta, Magic Leap, Sony and others.

But after wearing the new headset to view photos and interact with a virtual dinosaur, I also felt there wasn’t much new to see here. And the experience elicited an “ick” factor I’ve never had before with an Apple product. More on this later.

Fit and control

Let me start from the beginning. After Apple unveiled the headset Monday, its first major new release since the Apple Watch in 2015, I was permitted to try a preproduction model of the Vision Pro. Apple staff led me to a private room at the company’s Silicon Valley headquarters and sat me on a couch for a demo.

The Vision Pro, which resembles a pair of ski goggles, has a white USB cable that plugs into a silver battery pack that I slipped into the pocket of my jeans. To put it on my face, I turned a knob on the side of the headset to adjust the snugness and secured a Velcro strap above my head.

I pressed down on a metal button toward the front of the device to turn it on. Then I ran through a setup process, which involved looking at a moving dot so the headset could lock in on my eye movements. The Vision Pro has an array of sensors to track eye movements, hand gestures and voice commands, which are the primary ways to control it. Looking at an icon is equivalent to hovering over it with a mouse cursor; to press a button, you tap your thumb and index finger together, making a quick pinch that is equivalent to clicking a mouse.

The pinch gesture was also used for grabbing apps and moving them around the screen. It was intuitive and felt less clunky than waving around the motion controllers that typically come with competing headsets.

But it raised questions. What other hand gestures would the headset recognize for playing games? How good will voice controls be, given that Siri's voice transcription on phones doesn't work well today? Apple isn't sure yet what other gestures will be supported, and it didn't let me try voice controls.
