Apple’s Augmented Reality ARKit

Apple announced ARKit, an augmented reality platform for iPhone and iPad, at the Apple Worldwide Developers Conference 2017 keynote on 5th June 2017. The kit can detect planes, track motion, and estimate scale and ambient lighting. ARKit will ship as part of iOS 11, which is expected to launch this fall.


Apple’s Senior Vice President Craig Federighi announced in the keynote presentation: “Apple is introducing a new platform for developers to help them bring high-quality AR experiences to iPhone and iPad using the built-in camera, powerful processors and motion sensors in iOS devices.”

ARKit leverages the graphics and processing chips inside existing iPads and iPhones, in addition to motion sensors, to allow developers to create apps like Pokémon Go. ARKit uses Visual Inertial Odometry (VIO) to accurately track the world around it. VIO fuses camera sensor data with CoreMotion data. These two inputs allow the device to sense how it moves within a room with a high degree of accuracy, and without any additional calibration.
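In code, this world-tracking pipeline is wrapped behind a simple session API. The following is a minimal sketch, assuming an ARKit-capable iOS device, of starting a world-tracking session; the class names follow Apple's ARKit framework.

```swift
import ARKit

// Create a session and run it with world tracking, which is backed
// by the VIO pipeline described above (camera + CoreMotion fusion).
let session = ARSession()
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal   // ask ARKit to find horizontal planes
session.run(configuration)
```

Once the session is running, the device's position and orientation are tracked continuously with no calibration step required from the user.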

ARKit is supposed to have the following distinct features:

  • Fast, stable motion tracking
  • Plane estimation with basic boundaries
  • Ambient lighting estimation
  • Scale estimation
  • Xcode app templates
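The plane and lighting estimates in the list above are surfaced to apps through the session delegate. A hedged sketch, again assuming an ARKit-capable device: `ARPlaneAnchor` carries a detected plane's estimated center and extent (the "basic boundaries"), and each `ARFrame` exposes an ambient light estimate.

```swift
import ARKit

// Sketch: observing plane detection and light estimation via
// the ARSessionDelegate callbacks provided by the framework.
final class PlaneObserver: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            // center/extent describe the plane's estimated boundary
            print("Plane at \(plane.center), extent \(plane.extent)")
        }
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        if let light = frame.lightEstimate {
            // ambient intensity estimate, in lumens
            print("Ambient intensity: \(light.ambientIntensity)")
        }
    }
}
```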


Apple's AR will immediately reach millions of people who already have the requisite hardware. And while it looks to be functionally as flexible and capable as Google's Tango, its broader audience makes it a much more enticing platform for serious developers to invest their time and money in.

Opportunities for developers:

ARKit will enable app designers and developers to build immersive experiences easily. Developers no longer need to spend time building their own AR renderer or world-tracking engine, or implementing object mapping and spatial tracking with computer vision and physics algorithms.

It promises to let developers build these experiences in less time, without worrying about additional hardware and cost. It is designed to work seamlessly with existing Apple technologies (Metal, SceneKit, SpriteKit). Combined with the new iOS 11 frameworks, including CoreML, object and face detection through the Vision framework, built-in QR code recognition, natural language processing and more, the realm of possibilities becomes vast.
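The SceneKit integration mentioned above is what makes placing virtual content so direct: `ARSCNView` renders a SceneKit scene on top of the camera feed, so adding an object to the AR world is just adding an `SCNNode`. A brief sketch under the same device assumption:

```swift
import ARKit
import SceneKit

// ARSCNView ties an ARSession to a SceneKit scene, so virtual
// objects are ordinary SceneKit nodes positioned in world space.
let sceneView = ARSCNView(frame: .zero)
sceneView.session.run(ARWorldTrackingConfiguration())

// Place a 10 cm cube half a metre in front of the initial camera position
// (SceneKit units are metres; -z points away from the camera).
let cube = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                    length: 0.1, chamferRadius: 0))
cube.position = SCNVector3(0, 0, -0.5)
sceneView.scene.rootNode.addChildNode(cube)
```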




