The Immersed Visor aims for spatial computing’s sweet spot
An Austin-based startup best known for its VR and mixed reality workspace software for other companies’ headsets now has hardware of its own. The Immersed Visor appears to sit somewhere between a Vision Pro Lite and Xreal Plus: a lightweight head-worn device that creates a high-resolution spatial computing environment on the cheap (well, relatively speaking).
After months of teasers, Immersed founder Renji Bijoy finally unveiled the Visor at an Austin event on Thursday. The device, a bit more than glasses but much less than a full headset, gives each eye the equivalent of a 4K OLED screen and has a solid 100-degree field of view. It supports 6DoF tracking (meaning it follows your position in space as well as your head's rotation, not just simple head turns), along with hand and eye tracking and support for more than five screens in a virtual or mixed reality environment.
In the presentation, Bijoy revealed that the Immersed Visor weighs just 186g, slightly less than an iPhone 16 Pro. That makes it about 64 percent lighter than the Meta Quest 3 (515g) and around 70 percent lighter than the Apple Vision Pro (600 to 650g). Weight and ergonomics have been drawbacks for many early adopters of VR and mixed reality tech, including some customers of the $3,500 Vision Pro. So trimming the Visor to about the weight of a high-end smartphone could, in theory, help it succeed where competitors have struggled. Part of that savings comes from borrowing a trick from Apple: a wired battery pack you stash in your pocket.
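If you want to sanity-check those percentages, here's a quick back-of-the-envelope calculation in Python using only the weights quoted above; the exact Vision Pro figure depends on configuration, so both ends of its range are shown.

```python
# Rough check of the weight comparisons above, using the figures quoted in the
# article. "Percent lighter" is simply 1 - (Visor weight / competitor weight).
visor_g = 186

for name, weight_g in [("Meta Quest 3", 515),
                       ("Apple Vision Pro (low end)", 600),
                       ("Apple Vision Pro (high end)", 650)]:
    lighter = (1 - visor_g / weight_g) * 100
    print(f"{name}: about {lighter:.0f}% lighter")  # ~64%, ~69%, ~71%
```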
But unlike those devices, the Immersed Visor doesn’t include an app store or onboard experiences like games. Instead, it’s tailored for work: link it to your Windows, macOS or Linux computer (wirelessly or wired), and get stuff done on its immersive array of virtual screens. Its 6DoF tracking means you can stand up, lean or twist, and the virtual screens will remain planted where you put them, rather than awkwardly following you through space.
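To illustrate why 6DoF matters here, the sketch below (a conceptual example, not Immersed's actual code) shows the difference between a world-anchored screen, which a 6DoF headset can render because it tracks your position as well as your rotation, and the head-locked behavior you'd get from a rotation-only (3DoF) device once you start moving around.

```python
# Conceptual sketch: why position tracking keeps virtual screens "planted."
# A world-anchored screen stores a fixed pose in the room; each frame it is
# transformed by the inverse of the current head pose. A 3DoF device only
# knows rotation, so walking or leaning doesn't change the view -- the screen
# effectively follows you.
import numpy as np

def pose(position, yaw_deg=0.0):
    """4x4 rigid transform: rotation about the vertical axis, then translation."""
    yaw = np.radians(yaw_deg)
    m = np.eye(4)
    m[:3, :3] = [[np.cos(yaw), 0.0, np.sin(yaw)],
                 [0.0, 1.0, 0.0],
                 [-np.sin(yaw), 0.0, np.cos(yaw)]]
    m[:3, 3] = position
    return m

# A virtual screen placed 1.5 m in front of the user's starting spot.
screen_world = pose(position=[0.0, 1.2, -1.5])

# The user stands up, steps half a meter to the left and turns slightly.
head_6dof = pose(position=[-0.5, 1.7, 0.0], yaw_deg=15)
head_3dof = pose(position=[0.0, 0.0, 0.0], yaw_deg=15)  # rotation only

# Where the screen appears relative to the head in each case.
screen_seen_6dof = np.linalg.inv(head_6dof) @ screen_world
screen_seen_3dof = np.linalg.inv(head_3dof) @ screen_world

print(screen_seen_6dof[:3, 3])  # offset shifts, so the screen stays put in the room
print(screen_seen_3dof[:3, 3])  # offset ignores the movement, so the screen tags along
```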
As with the company's workspace app for Meta Quest and Vision Pro, you can work in either a passthrough view of your surroundings or an entirely virtual one. (It includes pleasant virtual environments like a mountaintop ski resort by a cozy fire.) You can also collaborate with others in a shared space.
The device runs on Qualcomm's Snapdragon XR2+ Gen 2 chip, which debuted at CES 2024. The chip supports up to 4.3K resolution per eye and can handle content at up to 90fps.
Immersed has chosen an unconventional pricing scheme. The device starts at $1,050 to buy outright. But you can get it for $400 upfront if you agree to a subscription model: $40 monthly for 24 months or $60 monthly for a 12-month term. Oh, and that model doesn’t ship until “six months after” October, meaning April 2025. If you want a device that starts shipping next month — i.e., the “Founder’s Edition” — that price increases to $1,350 outright or $700 plus the monthly subscription fee (same prices as the later-shipping version).
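For anyone tallying the options, here's a quick Python sketch totaling each route using the list prices above (taxes and whatever financing terms Immersed attaches aren't factored in). The upshot: both subscription routes cost more over the full term than buying outright; what you're paying for is the smaller upfront hit.

```python
# Totals for each purchase option, using the prices listed above. These are
# straight sums of the upfront price plus monthly payments over the stated term.
def total_cost(upfront, monthly=0, months=0):
    return upfront + monthly * months

options = {
    "Standard (ships ~April 2025)": {
        "Buy outright": total_cost(1050),
        "$400 down + $40/mo x 24": total_cost(400, 40, 24),   # $1,360
        "$400 down + $60/mo x 12": total_cost(400, 60, 12),   # $1,120
    },
    "Founder's Edition (ships next month)": {
        "Buy outright": total_cost(1350),
        "$700 down + $40/mo x 24": total_cost(700, 40, 24),   # $1,660
        "$700 down + $60/mo x 12": total_cost(700, 60, 12),   # $1,420
    },
}

for edition, plans in options.items():
    print(edition)
    for plan, cost in plans.items():
        print(f"  {plan}: ${cost:,} total")
```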
In theory, the Immersed Visor could hit a sweet spot for spatial computing-curious folks who want something cheaper than the Vision Pro, sharper than the Meta Quest 3 and (perhaps) less of a beta product than Xreal's AR glasses. Whether it delivers on those points, well, we won't have a clue until we get some hands-on time. As far as I can see, no major media outlets (including Engadget) have shared hands-on demos of the device. And as this year's wave of absurdly hyped AI gadgets reminded us, big promises mean nothing if you end up with a $1,000 paperweight.
You can watch the presentation below and, if it tickles your fancy, pre-order from Immersed’s website.