Update: Apple’s AR smart glasses might have some stiff competition from Intel. The company has just revealed its Intel Vaunt smart glasses, which look to offer their wearer instant access to notifications and information from the cloud or a connected device by beaming lasers directly onto the eyeball.
We’ve also seen a patent from Apple explaining some aspects of how the forthcoming glasses are going to work. It seems that the emphasis is going to be on keeping these smart specs as compact and light as possible.
Original story continues below…
Apple ARKit, an AR initiative from Apple, has the potential to bring augmented reality to the forefront of consumer technology. It will allow developers to create augmented reality apps in minutes and hours rather than weeks and months. But, however cool it may be, ARKit is just a platform. So, if you want to see what the real future of Apple’s augmented reality road map looks like, you’ll need to look at the long-rumored but not-yet-announced Apple AR Glasses.
Recently, the Financial Times ran an exploratory piece on the status of Apple’s Augmented Reality roadmap that included some key details on Apple’s Google Glass clone, including a crucial detail we had yet to hear.
The gist of what’s happening is that while Apple sees multiple potential opportunities for augmented reality in the home, it hasn’t yet decided which one to ultimately pursue. Some engineers want to use the iPhone as the main screen for the AR Glasses, while others want to build a display into the glasses themselves. The bad news? Apple AR Glasses won’t be ready for a while.
Those internal discussions, plus the historical data showing that Apple tends to come in a bit later on most new types of devices, are leading some analysts to rule out a 2018 announcement and release for the glasses. That said, there are rumors that Apple met with parts suppliers at CES 2018 at the start of the year.
“I don’t think we can rely upon a ‘next big thing’ in the next 12 months,” Geoff Blaber, an analyst at CCS Insight, told FT. “For now, Apple’s next big thing is still the iPhone.”
So what do we know about the rumored Apple augmented reality glasses so far? When will the Apple AR spectacles be released, and what could a pair of Apple AR glasses offer that the world’s current smartphone screens and VR headsets like the Oculus Rift and HTC Vive can’t?
Read on to find out!
Cut to the chase
What is it? A new Apple wearable, a pair of glasses making use of augmented reality tech.
When is it out? No fixed date, but rumors point to a 2019 unveiling, with devices hitting stores in 2020.
What will it cost? Based on Snap Spectacles pricing, anything from $130/£105/AU$170 and upwards – though something ten times as costly is possible, depending on Apple’s final configuration.
What is augmented reality?
You’re familiar with the concept of virtual reality, right? Popping on a headset and having software transport you to an interactive, 360-degree, left, right, up, down, all-encompassing virtual world?
Augmented reality works a bit like that but with one big difference. Rather than giving a window into an invented world, it uses either screens or transparent lenses to place digital items on top of the real world around you.
The most popular examples of this in action today would be Snapchat’s stickers (the ones that intelligently put slobbering dog tongues and cat ears on your moving videos), or Pokémon Go, which puts Pikachu and co into your world through a combination of your phone’s camera and screen.
Both see your real world “augmented” by software on your smart device. Essentially, AR lets you get context-sensitive digital information overlaid onto your real-world surroundings – look at a subway station and get train times automatically displayed, for instance, or walk down the aisles of a food store and have the specs recommend a recipe.
Apple’s iPhone 8 is thought to lean heavily on AR technology, but dedicated AR wearables already exist from rivals, too. Of the big-name players, Snapchat’s nascent efforts see it cheat a little, with the Snap Spectacles amounting to little more than a head-mounted camera in a glasses frame, feeding into the core Snapchat app.
Microsoft’s HoloLens is more ambitious, putting Windows PC capabilities into a headset that lets you access everything from a web browser to Minecraft within your real world.
And then of course there’s Google Glass – which saw its buzz burn out pretty quickly, thanks to a screen that sat uncomfortably in front of your eye offering hard-to-read information overlays.
What is Apple ARKit?
ARKit is Apple’s way of sticking its flag down into the augmented reality landscape, an attempt to claim the space as its own.
Revealed at WWDC 2017, ARKit is a new set of APIs that lets developers build augmented reality applications for Apple devices. It’s currently being pitched specifically at iPad and iPhone devices (making it the “largest AR platform in the world”, according to Craig Federighi, Apple’s Senior Vice President of Software Engineering), but it certainly paves the way for an AR glasses device in the future.
Apple showed off a number of impressive demos, ranging from simply placing objects like a digital coffee cup, light stand and a plant onto a tabletop (as viewed through an iPad camera lens and screen) to a mind-blowing sci-fi battle scene, complete with tiny, minutely detailed people and swooping starships, courtesy of director Peter Jackson’s company Wingnut. It’s an experience coming to existing devices before the end of 2017.
So how’s this possible? ARKit enables “fast, stable” motion tracking, and accurate plane, ambient light and scale estimation.
As if there was any doubt, all this will require camera, CPU, GPU and motion sensor hardware working in tandem. So, whether tapping into a nearby mobile device, or viewed through lenses, Apple’s ARKit has the same basic hardware requirements as all other AR gear we’ve seen so far. Specific spec requirements, however, will have to wait for now.
But developers will be happy – with support for Unity, Unreal, and SceneKit engines, Apple is looking to make its AR platform accessible for devs already working in the space.
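To make the above concrete, here’s a minimal sketch of what starting an ARKit session with plane detection looks like for a developer. This is illustrative only: it assumes an `ARSCNView` already wired up in a view controller, and ARKit code like this runs only on a compatible iOS device, not in a simulator or on a desktop.

```swift
import ARKit

// Minimal sketch: run a world-tracking AR session that detects
// horizontal surfaces (tabletops, floors) – the same capability
// Apple's tabletop demos rely on.
class ARDemoViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!   // assumed to be set up in a storyboard

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal  // enable plane estimation
        sceneView.session.run(configuration)        // starts motion tracking
    }

    // Called by ARKit whenever a new plane anchor is detected.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        print("Detected horizontal plane at \(planeAnchor.center)")
        // A real app would attach virtual content (e.g. that coffee cup) here.
    }
}
```

The motion tracking, plane detection and light estimation the article mentions are all handled by the framework; the developer mostly responds to anchors as they appear, which is what makes the “minutes and hours” claim plausible.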
Why would Apple make AR glasses?
CAPITALISM. Those shareholders’ appetites for mansions and swimming pools won’t be sated!
But on a serious note, Apple’s in need of a new product category. The last time Apple launched an inarguably successful new product line was the iPad – and even that has proved difficult to maintain momentum in. AR is an exciting new area, and one in which Apple (at least in hardware terms) wouldn’t face huge competition, at least at present.
Yes, there’s the Microsoft HoloLens – but that’s primarily being billed currently as a business-orientated device. Google’s Glass failure has seen it put more time into its VR based Daydream View and Cardboard projects, while Samsung likewise continues with its Gear VR efforts.
It’s an opportunity for Apple to set itself apart from the pack and, for Tim Cook, to launch a product that doesn’t have the shadow of the late Steve Jobs looming over it.
“A significant portion of the population of developed countries, and eventually all countries, will have AR experiences every day,” he said during the 2016 Utah Tech tour, before casting shade on VR.
“I can’t imagine everyone in here getting in an enclosed VR experience while you’re sitting in here with me,” said Cook to those assembled for the Utah talk.
“AR is going to take a while, because there are some really hard technology challenges there,” he added.
“But it will happen, it will happen in a big way, and we will wonder when it does, how we ever lived without it. Like we wonder how we lived without our phone today.”
Apple AR glasses hardware: the evidence, the patents and the specs
So, we’ve established Apple’s definitely working on AR software. Sources claim that the iPhone 8 will be the big start for Apple’s AR ambitions, with the iPhone leading the charge ahead of dedicated AR hardware to follow.
But it’s moving fast, and with big teams. Apple is said to have 1,000 engineers working on an AR project in Israel, and has purchased multiple AR firms including Tel Aviv’s PrimeSense (focused on 3D sensing tech) and RealFace (facial recognition cyber security experts).
According to reports, Apple has poached a leading employee of Nasa for the project, hiring Jeff Norris, founder of the Mission Operations Innovation Office of Nasa’s Jet Propulsion Lab. He is said to be working as part of an augmented reality team being headed up by another poached talent, Dolby Labs executive Mike Rockwell.
Apple has also been linked to patents related to AR and VR technologies, including a headset with headphones built in and a remote control. Perhaps most telling of all is a leaked injury report, which suggests Apple is working on a “prototype unit” that has resulted in eye injuries for two users. It’s unlikely an iPhone or MacBook prototype would result in eye injury at this mature stage in their ongoing development – but a potential new product, the details of which are still being hammered out, which will likely sit right in front of your eyes? We have our culprit, it seems.
A patent for an Apple AR 3D depth sensing camera also appeared in June. It detailed a system that would use a light beam for optical 3D mapping, and suggested it could be used for tracking hand gestures. Interestingly, the patent specifically called out the benefits of using such a system while playing augmented reality games, suggesting that may be a big focus for Apple’s future AR plans.
Software patents have trickled through too – a submission from February 2010 saw Apple trying to protect an idea it had regarding “augmented reality maps”, showing how digital mapping data could be overlaid onto real-time video from an iPhone’s camera. Any success with the iPhone would likely be easily translated to the dedicated glasses devices.
There’s also a suggestion that, having severed ties with GPU chipset designer Imagination Technologies, Apple is looking to develop its own chipsets with AR technology as a key development target.
Apple also recently announced that it would be pumping $200 million of investment into Gorilla Glass manufacturers Corning. Though it’s as likely to be fuelling a move to wireless charging for iPhones as anything else, Corning’s work on lightweight, durable glass would make them a perfect match for a pair of AR specs.
Corning have already dabbled in augmented reality projects – check out this concept of the company’s AR car windscreen.
What will Apple AR glasses cost?
That’s a tough question, as there’s no real precedent for this sort of thing yet.
On one hand, you’ve got the incredibly basic Snap Spectacles, which are priced around $130/£105/AU$170. But we’re expecting Apple’s AR glasses to be far more feature-rich than this.
On the other, you have HoloLens. It’s not really a consumer device, and is only available on a limited basis to developers at a cost of $3,000 (£2,719, AU$4,369). But Apple’s glasses will likely be built to mass-market scale, and with consumers (and associated price tags) in mind.
So it’s a guessing game really. Keeping in mind that Apple tends to slap a premium on its devices, a broad estimate of somewhere between $500/£400/AU$670 and $1,000/£800/AU$1,300 could be the ballpark. But don’t hold us to that.