Apple Glasses leaks and rumors: When could smart glasses arrive?

Taking a look at Apple’s other wearable devices could point to where Apple’s rumored glasses are heading.

Scott Stein/CNET

First came VR. Then came a wave of AR headsets that were high-priced and full of promises of wild mixed reality worlds. Apple could be blending AR and VR with two different headsets in the next couple of years, according to a recent report by Bloomberg’s Mark Gurman, which also points to internal disagreements inside Apple over its approach. Other reports say Apple now seems to be readying its own pair of smart glasses at long last, seven years after Google Glass and four years after the debut of the Oculus Rift.

These reports have been circulating for several years, including a story broken by CNET’s Shara Tibken in 2018. But the question is: When will this happen, exactly? 2021, 2022 or even later? At Apple’s most recent WWDC developer conference, AR news was extremely quiet. Still, new hints began to emerge at how Apple’s existing hardware could map the world and blend virtual objects with real locations.

Apple has been waiting in the wings all this time without any headset at all, although the company’s aspirations in AR have been clear and well-telegraphed on iPhones and iPads for years. Each year, Apple’s made significant strides on iOS with its AR tools. It’s been debated how soon this hardware will emerge (next year, the year after, or even further down the road) and whether Apple will proceed with just glasses or with a mixed-reality VR/AR headset, too.



I’ve worn more AR and VR headsets than I can even recall, and have been tracking the whole landscape for years. In a lot of ways, a future Apple AR headset’s logical flight path should be clear from just studying the pieces already laid out. Apple just acquired VR media-streaming company NextVR, and previously acquired AR headset lens maker Akonia Holographics.

I’ve had my own thoughts on what the long-rumored headset might be, and so far the reports align with those expectations. Much like the Apple Watch, which emerged among many other smartwatches and had a lot of features I’d seen in other forms before, Apple’s glasses will probably not be a massive surprise if you’ve been following the beats of the AR/VR landscape lately.

Remember Google Glass? How about Snapchat’s Spectacles? Or the HoloLens or Magic Leap? Facebook is working on AR glasses too, as are Snap and Niantic. The landscape could get crowded fast.

Here’s where Apple is likely to go based on what’s been reported, and how the company could avoid the pitfalls of those earlier platforms. 

Apple declined to comment on this story.


North Focals’ smart glasses design from earlier this year. North was just acquired by Google.


Normal glasses, maybe with a normal name

Getting people to put on an AR headset is hard. I’ve found it a struggle to remember to pack smart glasses, and find room to carry them. Most of them don’t support my prescription, either.

Apple always touted the Apple Watch, first and foremost, as a “great watch.” I expect the same from its glasses. If Apple makes prescription glasses and sells them, Warby Parker-style, in seasonal frames at its Apple Stores, that might be enough for people, provided the frames are good-looking.

Google just acquired smart glasses maker North, which made a nearly normal-looking pair of prescription eyewear. North’s concept might be too similar to Google Glass for Apple’s tastes, but the idea of AR glasses doubling as functional everyday glasses sounds extremely Apple-like.

From there, Apple could add AR features and let newcomers settle into the experience. Augmented reality is weird, potentially off-putting, and people will need to feel out how much of it is right for them. The original Apple Watch was designed to be glanced at for five seconds at a time. Maybe the same idea is in the works for Apple AR features.

“Apple Glass” is one purported name for the glasses. That’s not surprising: the watch is the Apple Watch, and the TV box is the Apple TV. Apple could go the “Air” route with a name like “AirFrames,” but I wonder if these glasses will end up being tethered some of the time.

A recent patent filing also shows Apple looking to solve vision conditions with adaptive lenses. If true, this could be the killer app of Apple’s intelligent eyewear.

Lower cost than you’d think?

A report from Apple leaker Jon Prosser in May said a product named the Apple Glass will start at $499, plus add-ons like prescription lenses. That could push the price beyond what I pay for my glasses, but it stays in a realm that isn’t insane. While the HoloLens and Magic Leap cost thousands of dollars, they’re not targeted at regular consumers at all. VR headsets cost anywhere from $200 to $1,000, and the Oculus Quest’s $400-to-$500 price seems like a good settling point. The original iPad started at $500. The Apple Watch was around the same. If the glasses are accessories, meant to go with a watch, AirPods and an iPhone, you can’t make them cost too much.


Qualcomm’s AR and VR plans have been telegraphing the next wave of headsets: many of them will be driven by phones. Phone-powered headsets can be lower-weight and just have key onboard cameras and sensors to measure movement and capture information, while the phone does the heavy lifting and doesn’t drain headset battery life. 

Apple’s star device is the iPhone, and it’s already loaded with advanced chipsets that can do tons of AR and computer vision computation. It could already power an AR headset today; imagine what could happen in another year or two.


Apple’s iPhones are likely to be the engine.

Angela Lang/CNET

A world of QR codes, and location-aware objects

Apple’s upcoming iOS 14 introduces QR code and NFC-enabled App Clips that can launch experiences from real-world locations with a tap or scan. These micro-apps are made to work with AR, too: On iPhones and iPads, they’ll require a device to be held up. But with glasses or an AR headset, they could eventually launch interactions at a glance.

Maybe QR codes can help accelerate AR working in the “dumb” world. Apple’s latest iPhones also have a mysterious U1 chip that can improve accuracy in AR object placement, and can more quickly locate other Apple devices that have a U1 chip, too. Reported tracker tiles, arriving as soon as this year and viewable via an AR iPhone app, could eventually extend to Apple’s glasses as well. If all Apple’s objects recognize one another, they could act as beacons in a home, with U1 chips serving as indoor navigation tools for added precision.
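That kind of U1 precision ranging is already exposed to developers through the NearbyInteraction framework in iOS 14. Here’s a minimal sketch; the `U1Ranger` class name is hypothetical, and the peer’s discovery token is assumed to arrive over some side channel such as MultipeerConnectivity:

```swift
import NearbyInteraction

// Hypothetical sketch: ranging to another U1-equipped device with
// NearbyInteraction (iOS 14). The peer's NIDiscoveryToken must be
// exchanged out of band, e.g. via MultipeerConnectivity.
class U1Ranger: NSObject, NISessionDelegate {
    let session = NISession()

    func start(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    // Called repeatedly as the U1 chips measure each other.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        if let distance = nearbyObjects.first?.distance {
            print("Peer is \(distance) meters away")
        }
    }
}
```

The same distance-and-direction data that would let an iPhone point you toward a lost tracker tile is what glasses could use for indoor wayfinding.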

Apple’s newest iPad has the sensor tech it needs

Apple is already deeply invested in camera arrays that can sense the world from short and long distances. The front-facing TrueDepth camera on every Face ID iPhone since the X is like a shrunken-down Microsoft Kinect, and can scan a few feet out, sensing 3D information with high enough accuracy to be used for a secure face scan. The newer lidar rear sensor on the 2020 iPad Pro can scan out much further, several meters away. That’s the range that glasses would need. 

Apple’s iPad Pro lidar scanner is more for depth sensing than photo-real object scanning, according to developers: The array of dots sent out to ping the world is less fine-grained, but good enough to mesh its surroundings and scan a landscape, noting furniture, people and more. Recent iPad Pro apps use lidar to enhance room scans and improve the camera’s understanding of room details. That lidar array is reportedly the basis for the sensors in Apple’s AR glasses, and it makes complete sense.

iPadOS 14 uses the lidar scanner to build out even more advanced depth features and room meshing, and it looks like the missing link for a new wave of more realistic AR graphics from Apple. At WWDC, some of these promised features started to look like things I’d seen years ago on Magic Leap.
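For developers, that lidar meshing shows up in ARKit as scene reconstruction. A minimal sketch of turning it on, assuming a RealityKit `ARView` named `arView` already exists in the app:

```swift
import ARKit
import RealityKit

// Sketch: enabling lidar-driven room meshing in ARKit.
// Assumes an ARView named `arView` is already in the view hierarchy.
let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    // On lidar-equipped devices, ARKit builds a triangle mesh of walls,
    // floors and furniture that virtual objects can collide with or
    // be occluded by.
    config.sceneReconstruction = .mesh
}
config.environmentTexturing = .automatic
arView.session.run(config)
```

The `supportsSceneReconstruction` check matters because only lidar-equipped hardware, like the 2020 iPad Pro, can run the mesh option; everything else falls back to plane detection.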

Add to this the wider-scale lidar scanning Apple is doing in Maps to enable overlays of real-world locations with virtual objects via a technology called Location Anchors, and suddenly it seems like the depth-scanning Apple is introducing could expand to worldwide ambitions.

How bleeding-edge will the visuals be?

Will Apple push the bleeding edge of realistic holographic AR, or aim for style and a few key functions and build up from there? Undoubtedly, the latter. The first Apple Watch was feature-packed but still lacked some key things other watches had, like GPS and cellular connectivity. So did the first iPhone, which lacked an app store, 3G and GPS. Apple tends to market its new products as doing a few key things exceedingly well.

High-end mixed reality headsets like HoloLens 2 and Magic Leap, which show advanced 3D effects, are heavy. Smaller, more normal smart glasses like North Focals or Vuzix Blade are more like Google Glass used to be; they present bits of heads-up info on a flat 2D screen.

There aren’t many lightweight AR headsets yet, but that’s going to change. Plug-in glasses like the nReal Light show some Magic Leap-like 3D graphics, and they run off a phone. That comes closer to what Apple could be making.

Apple’s dual displays could leapfrog the competition and offer better image quality for their size. We’ve already seen regular-looking glasses lenses that embed waveguides to make images float invisibly. And over time, Apple could have more advanced hardware in store. Apple’s expanded chip push into Macs also means a new range of even more graphics-enhanced processors. Maybe some of these will be applied to AR hardware efforts, too.


AirPods went from absurd to essential. Can Apple do the same for glasses?

Sarah Tew/CNET

Look to AirPods for ease of use — and audio augmented reality

I’ve thought about how AirPods, with their instant-on comfort and weird design, were an early experiment in getting people to accept Apple hardware worn directly on our faces as normal. AirPods are expensive compared to in-box wired buds, but also utilitarian. They’re relaxed. The Apple Glass needs to feel the same way.

The AirPods Pro will work with spatial audio in iOS 14, which could tie audio cues to locations and alert someone to turn on their glasses. Maybe the two would work together. Immersive audio is casual, and we do it all the time. Immersive video is hard and not always needed. I could see AR working as an audio-first approach, like a ping. Apple Glass could potentially do the world-scanning spatial awareness that would allow the spatial audio to work.


Somehow, the watch and the glasses need to work together for navigation and other apps.

Scott Stein/CNET

Apple Watch and AirPods could be great Glass companions

Apple’s already got a collection of wearable devices that connect with the iPhone, and both AirPods and the Apple Watch make sense alongside glasses. AirPods can pair for audio (although maybe the glasses have their own Bose Frames-like audio, too), while the watch could be a helpful remote control. The Apple Watch already acts as a remote at times, for the Apple TV, or linking up with the iPhone camera. Apple’s glasses could also look to the watch and expand its display virtually, offering enhanced extras that show up discreetly, like a halo.

The Apple Watch could also provide something that’s hard to get from hand gestures or touch-sensitive frames on a pair of glasses: haptics. The rumbling feedback on the watch could lend some tactile response to virtual things.

Could Qualcomm and Apple’s reconciliation also be about XR?

Qualcomm and Apple are working together again on future iPhones, and I don’t think it’s just about modems. 5G is a key feature for phones, no doubt. But it’s also a killer element for next-gen AR and VR. Qualcomm has already been exploring how remote rendering could allow 5G-enabled phones and connected glasses to link up to streaming content and cloud-connected location data. Glasses could eventually stand on their own and use 5G for advanced computing, much as the Apple Watch eventually gained standalone cellular.

Qualcomm’s chipsets are in almost every self-contained AR and VR headset I can think of (Oculus Quest, HoloLens 2, a wave of new smart glasses, the latest version of Google Glass, Vive Focus). Apple’s hardware will likely interface with some of Qualcomm’s emerging XR tools, too.


There are other AR devices out there, like the nReal Light. Apple needs to open up iOS to work with them, like fitness trackers and smartwatches.

Sarah Tew/CNET

Expect the iPhone to support other VR and AR, too

While the Apple Glass may be Apple’s biggest focus, it doesn’t mean there can’t be, or shouldn’t be, competitors. There are tons of smartwatches and fitness trackers that work with the iPhone, for instance. Where it gets annoying for other trackers and watches is that they’re walled off, with more limited iOS interaction than the Apple Watch gets. The same could happen down the road if connected VR and AR headsets are allowed to work with a future iOS update. It’s where Qualcomm’s heading with phone chips, and Google’s Android is likely to follow.

Launch date: 2021, 2022, 2023… or later?

New Apple products tend to be announced months before they arrive, maybe even more. The iPhone, Apple Watch, HomePod and iPad all followed this path. Prosser’s report says a first announcement could come alongside the next iPhone in the fall, if it’s a standard Apple event as was originally planned pre-coronavirus (which it probably won’t be). Even then, you might not be able to buy it until 2021. This lines up with Shara Tibken’s report from way back in 2018.

Bloomberg’s Gurman has since contested Prosser’s report, and other noted analysts like Ming-Chi Kuo say the glasses could come in 2022. A 2019 report from The Information, based on reportedly leaked Apple presentation materials, suggested 2022 for an Oculus Quest-like AR/VR headset and 2023 for glasses. Maybe Apple takes a staggered strategy with AR and releases several devices: one for creators first, at a higher price, and one for everyday wearers later.

Either way, developers would need a long head start to get used to developing for Apple’s glasses, and to make apps work and flow with whatever Apple’s design guidance will be. That’s going to require Apple giving a heads-up on its hardware well in advance of its actual arrival.

The Apple Glass, or whatever the hardware ends up being called, sounds like the culmination of years of acquisitions, hires and behind-the-scenes drama. But it may not arrive quite as soon as you’d think.
