What iPhone X tells us about Apple’s plans for AR glasses
By Darshan Shankar
Nov 22, 2017
https://blog.bigscreenvr.com/what-iphone-x-tells-us-about-apples-plans-for-ar-glasses-c23c3264eb88

The iPhone X started shipping to the public last week, and while much of the discussion about the phone has focused on Face ID, Animoji, and the notch, it's far more interesting to read into what Apple is signaling about the iPhone's future with this device.

It's a glimpse of what will be possible in a few years, and what we're seeing right now is clearly laying the foundation for Apple's next big product: augmented reality glasses.

The TrueDepth sensor and Face ID together form the most advanced facial detection and recognition hardware and software in the world, but what they are capable of today is not as exciting as what they will be capable of tomorrow.
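What it can do for third-party developers today is already useful: any app can authenticate with Face ID through the same LocalAuthentication framework that previously fronted Touch ID. A minimal sketch (the "Unlock your notes" reason string is just an example):

import Foundation
import LocalAuthentication

// Face ID is exposed to third-party apps through the same biometric
// policy that Touch ID used; the system picks whichever sensor exists.
func unlockWithBiometrics(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Check that biometrics are available and enrolled on this device.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        completion(false)
        return
    }

    // On iPhone X this triggers the TrueDepth camera and Face ID.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your notes") { success, _ in
        DispatchQueue.main.async { completion(success) }
    }
}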

The play

It’s no coincidence that ARKit launched alongside the iPhone X. Apple’s next big platform is AR, so by putting ARKit on every iPhone from the 6s onward, Apple has placed it in millions of people’s hands, making it the biggest augmented reality platform in the world.

By doing this, Apple is building up the content library it needs for a successful platform launch years before it shows the world what the device that powers it might look like, and it is getting developers comfortable with the building blocks needed to succeed in a 3D-first world.

Building out ARKit now gives developers time to create killer apps, and some have already shipped compelling real-world demonstrations of its power, like IKEA’s Place app.
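Under the hood, an app like Place needs surprisingly little ceremony to get started. Here’s a minimal sketch (not IKEA’s actual code) of the ARKit building blocks it relies on: a world-tracking session that detects horizontal planes, and an anchor to pin virtual furniture to a real surface.

import UIKit
import ARKit

// A world-tracking session that looks for horizontal planes, then pins
// a placeholder object (standing in for a piece of furniture) to one.
class FurniturePreviewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal // floors and tabletops
        sceneView.session.run(configuration)
    }

    // ARKit calls this once it has anchored a new plane in the real world.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        let box = SCNBox(width: 0.5, height: 0.5, length: 0.5, chamferRadius: 0)
        let boxNode = SCNNode(geometry: box)
        boxNode.position = SCNVector3(0, 0.25, 0) // rest on the plane, not in it
        node.addChildNode(boxNode)
    }
}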

This is a notable departure for Apple. With the iPhone and Apple Watch, SDKs and toolkits were released to developers only after the product was announced. This time, Apple is releasing ARKit a couple of years before the anticipated AR glasses.

Take a second to appreciate the magnitude of the accomplishment here: in a fraction of a second, a tiny device in the palm of your hand can recognize who you are, map your face onto other objects, and track your emotions.

The real-time motion capture (“mocap”) technology the iPhone X uses to map your face and create Animoji was, until recently, reserved for big-budget film studios (side note: Apple owns Faceshift, the company whose technology created those exact motion-capture effects for the Star Wars franchise).

Apple can now accurately map the details of your unique face in milliseconds and drop that mapping onto a 3D model, or use it to identify you uniquely: perfect for building out a database of people, or for loading a face filter that makes you look like Darth Vader.
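Developers get at this through ARKit’s face-tracking API: each frame, the TrueDepth camera produces an ARFaceAnchor carrying dozens of “blend shape” coefficients that describe your expression. A minimal sketch of Animoji-style mocap:

import ARKit

// The TrueDepth camera streams an ARFaceAnchor every frame; its blend
// shape coefficients (0.0 to 1.0) can drive a 3D character's rig.
class FaceMocapSession: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires the TrueDepth hardware (iPhone X).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
        let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
        print("jawOpen: \(jawOpen), smileLeft: \(smile)")
    }
}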

Fast-forward a few years and put this sensor on a pair of augmented reality glasses: suddenly you can identify the people you’re talking to (or are about to) at a meetup or conference.

That data, used as an ambient data point with machine learning, could remind you: “this is Steve, you met at CES 2021, and he now works at Google” and “this is Sara, whose birthday is today.”
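To be clear, Apple ships nothing like this today: Vision can detect faces on-device, but identifying them would take a custom recognition model and a people database that exist in no public API. A speculative sketch, with the who-is-this step left as a hypothetical:

import CoreVideo
import Vision

// Vision's face detector is real; "PeopleClassifier" below is not.
// It stands in for the on-device recognition model AR glasses would need.
func identifyFaces(in pixelBuffer: CVPixelBuffer) {
    let request = VNDetectFaceRectanglesRequest { request, _ in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            // A real pipeline would crop this region, feed it to a
            // hypothetical PeopleClassifier model, and look the match up
            // in a contacts database ("Steve, met at CES").
            print("Found a face at \(face.boundingBox)")
        }
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}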

[snip]
