

When we first heard word of “Apple Glasses,” rumours suggested the lenses would launch this year. One reliable analyst said Apple Glass could come as soon as next year, while another longtime source for Apple product releases believes the release won’t happen until 2022. Either way, the project is definitely in the works.

Apple Glass is expected to run on Starboard, a proprietary operating system uncovered in the final version of iOS 13. The augmented reality framework shows up multiple times in code and text documents, meaning Apple is likely testing activation and application.

At WWDC 2020 in June, the company is expected to update the world on its AR efforts, and perhaps we’ll catch a glimpse of Apple Glass as soon as then. But with Apple’s 2020 product line fully fleshed out with the likes of the iPhone 12, Apple Watch 6 and AirPods Studio, the near-term future of Apple’s augmented reality glasses looks blurry.

At the same time, Apple is very thoughtful, and even skeptical, when it comes to shiny new technologies like AR. It’s inclined to hang back and wait for evidence that consumers really care. Apple’s introduction of ARKit should be read as a definite commitment to AR, but it can also be seen as a way to collect more information: to see how truly serious and excited the developer community is about the technology. Developers, after all, must make real-world decisions about whether to dedicate precious resources to developing on this platform or another one. The extent to which they embrace ARKit will say a lot about real demand for consumer AR. If developers can define or create markets for their AR apps, we’ll almost certainly see Apple give consumers something better than a phone with which to consume AR content.

Apple did some important things to sweeten the proposition of developing on ARKit. It offered a huge, immediately addressable market: anybody with an iPhone with an A9 chip or higher can run the ARKit apps developers create. And developers don’t even have to start from scratch; they can use ARKit to imbue existing apps with AR magic.
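That low barrier is easy to illustrate. Here’s a minimal sketch, not Apple’s own sample code, of what opting an existing UIKit screen into AR can look like: check ARWorldTrackingConfiguration.isSupported (which is false on pre-A9 hardware) and run a world-tracking session in an ARSCNView. The view controller name is hypothetical.

```swift
import UIKit
import ARKit

// A minimal sketch of adding AR to an existing UIKit screen.
// ARWorldTrackingConfiguration.isSupported is false on hardware older
// than the A9 generation (the floor mentioned above), so the app can
// quietly fall back to its existing non-AR experience.
final class ARDemoViewController: UIViewController { // hypothetical name
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        guard ARWorldTrackingConfiguration.isSupported else {
            return // keep showing the app's existing non-AR content
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```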

While I watched Apple giving developers a taste of some of the AR experiences they might create with ARKit, it was clear to me that most of them would work way better on a head-worn device, preferably a device in the vein of the sunglasses or reading glasses we’re already used to. There are various, fairly obvious reasons that glasses will be preferable to the current mode of experiencing ARKit creations: through the screen of the iPhone.

Excluding owners of the iPhone 7 Plus (which has two camera lenses), most of the iPhones that will play ARKit apps are single-camera phones. The field of view of a single iPhone camera is limited, as is the display width of a phone held in front of one’s face. The AR scene in an ARKit experience moves with the view of the camera and the angle of the phone, but that movement depends on movement of the user’s arm and hand (or entire body, through feet-shuffling).
This feels a little awkward, because we are used to shifting our field of view by moving our eyes and our head. The first AR glasses probably won’t respond to our eye movements, but they will allow us to shift our field of view via head movement.
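To see why the viewing device matters so much, here’s a minimal sketch of how ARKit content behaves (the function and cube below are purely illustrative): the scene is pinned in world space, so the only way to reframe it is to physically move whatever is holding the camera.

```swift
import ARKit
import SceneKit

// Illustrative sketch: ARKit content is anchored in world space, so it
// stays put while the camera moves. On a phone, the "camera" is the
// device itself, so reframing the scene means moving your arm; on
// glasses, it would mean simply turning your head.
func placeCube(in sceneView: ARSCNView) {
    // A 10 cm cube, half a meter in front of the session's world origin.
    let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
    let cube = SCNNode(geometry: box)
    cube.position = SCNVector3(0, 0, -0.5)

    // The node's transform is fixed in the world, not on the screen:
    // from here on, only physical movement of the device reframes it.
    sceneView.scene.rootNode.addChildNode(cube)
}
```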
