Rumors of Apple AR Glasses
Two years ago I posted a wishful request for an Apple AR display ("Wanted from Apple: The iDream, Daydream Meets Tango"), imagining combining a companion device for the iPhone with the conceptual setup of Google's Daydream phone VR holders and their Tango phones (this was back before ARKit was released).
I'll call what I want to see the iDream (in a nod to Google). A Daydream/GearVR-style holder for an iPhone that has stereo cameras and SLAM-capable hardware. On-board hardware would do Tango/Hololens style SLAM, perhaps along with hand tracking, and the resulting mesh and tracking data plus stereo video could be sent into the iPhone. Movement prediction could be handled properly because the system is closed, reducing or eliminating swim. The iPhone would only need to take care of rendering and running the applications: after all, the CPU and GPU in an iPhone blow the Hololens CPU and GPU out of the water. (The iPhone 7 blows away the current Tango phone, the Lenovo Phab2 Pro, and this setup would impose even less load on the phone.)
At the time, a practical consumer-friendly see-through AR display seemed too far off, so I opted for the idea of a display holder (like Daydream and GearVR) that leveraged SLAM and 3D reconstruction to support usable video-mixed AR (something I wasn't really convinced was completely possible if we wanted to achieve the kind of slick, clean experience Apple would clearly demand from their products).
I am still dubious that something practical that consumers could and would wear for long periods of time is doable without a cord. A wide-enough-field-of-view immersive AR display (at least as good as the ML1 and Hololens2), with a big enough battery to run for more than a few hours, seems too much for today's technology. I would love to be wrong here. And a cord is the kiss of death for any consumer-targeted device intended to be worn for any length of time (yes, I have an ML1).
But the rumors that Apple will release something in Q4 2019 or Q1 2020 are revving up (see Tom's Guide and The Verge, for example), and they point to a display that would act as a companion to iOS devices.
As these articles point out, Apple has the tech to put cameras and other sensors on such a display (akin to the ML1 and Hololens), and ARKit could easily be expanded to support off-device sensing and display like this.
Of course, these articles shed no light on the hard problems, such as battery life, display specs, size and weight, and connectivity to support the massive data transmission requirements: anyone who's tried to AirPlay video from an iPhone to an Apple TV knows that those kinds of connections aren't going to cut it for the kind of low-latency, two-way video that would be needed.
Obviously, I would love these rumors to be true. I would also love to have Apple add WebXR support to Safari in a way that supports this, or at least have the system be open enough that I could make our WebXR Viewer support it!
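To give a sense of what WebXR support in Safari would buy web developers, here is a minimal sketch using the WebXR Device API as it is shaping up (the AR module is still being specified). Nothing here is Apple-specific, and whether an Apple head-worn display would ever expose this code path is pure speculation on my part; the sketch just assumes a browser that supports the "immersive-ar" session mode (and, for TypeScript, that the WebXR type definitions are available).

```typescript
// Minimal WebXR "immersive-ar" session sketch using the standard WebXR Device API.
// This is what a web page would do today on any browser that exposes an
// AR-capable device; it is not an Apple API.

async function startAR(canvas: HTMLCanvasElement) {
  // Feature-detect WebXR and immersive AR support.
  if (!navigator.xr || !(await navigator.xr.isSessionSupported('immersive-ar'))) {
    console.log('No immersive-ar support in this browser');
    return;
  }

  // A WebGL context that can be handed to the XR compositor.
  const gl = canvas.getContext('webgl', { xrCompatible: true });
  if (!gl) return;

  const session = await navigator.xr.requestSession('immersive-ar');
  session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });

  const refSpace = await session.requestReferenceSpace('local');

  session.requestAnimationFrame(function onFrame(time, frame) {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      // One view per eye on a stereo head-worn display, one view on a phone.
      for (const view of pose.views) {
        // ... draw the scene using view.transform and view.projectionMatrix ...
      }
    }
    session.requestAnimationFrame(onFrame);
  });
}
```

The point is not the specifics, but that WebXR is designed so the same page could target a handheld phone screen today and a head-worn display later without changing this code path.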
(header image credit: idropnews/Martin Hajek, via the Tom's Guide link in the article)