
1. What is AR?
Augmented reality is what it sounds like: reality, enhanced with interactive digital components. The most commonly used AR applications these days rely on smartphones to showcase the digitally augmented world: users can activate a smartphone’s camera, view the real world around them on the screen, and rely on an AR application to enhance that world in any number of ways via digital overlays:
- Superimposing images, digital information or 3D models
- Adding real-time directions
- Changing colors
- Altering the user or their environment's appearance via "filters" on Instagram, Snapchat, and other apps
Common Augmented Reality Use Cases
So what is augmented reality used for these days? As of 2020, nearly every industry has found ways to apply AR technology to improve processes and outcomes. Common uses include:
- Training and education
- Gaming
- Selling
AR Technology: What Do Brands Need?
The exact technology needed will vary among brands, but the short version is that adding AR to a website today is much easier, faster, and more accessible than it was even a couple of years ago, thanks to new software on the market.
If you’re intrigued by the potential AR has to improve your bottom line, here are a few guiding principles to keep in mind:
- AR functionality should live on your website, not in a dedicated AR app. Customers don’t want to download an app just to use AR features. To enjoy the sales boost that AR features can offer, eliminate unnecessary hurdles to using them.
- AR-generating software can save you time, energy, and money. A few years ago, any brand that wanted AR had to build the functionality itself, from the ground up. Today, software like Threekit makes it possible to generate AR content with a SaaS model. This makes the path to AR smoother, faster, and more accessible.
- Build AR for smartphones. Again, if you want the benefits AR offers, you have to maximize the odds that your customers will use it. That means creating AR experiences that work with the tech most people have today: smartphones, both iPhones and Android devices.
Examples of Augmented Reality in Action

AR for IKEA furniture shopping
When shopping for furniture online, many customers may want the same experience they can get when visiting the store and seeing furniture options in person. IKEA understood this need and was able to deliver a more immersive online furniture shopping experience through AR.
2. AR With iOS
Augmented reality (AR) describes user experiences that add 2D or 3D elements to the live view from a device’s camera in a way that makes those elements appear to inhabit the real world. ARKit combines device motion tracking, camera scene capture, advanced scene processing, and display conveniences to simplify the task of building an AR experience. You can create many kinds of AR experiences with these technologies using the front or rear camera of an iOS device.
ARKit

ARKit is the underlying framework that handles the "heavy lifting" of Augmented Reality experiences. ARKit configures the camera, gathers the relevant sensor data, and is responsible for detecting and locating the "anchors" that will tether your 3D content to the real world, as seen through the camera. In a sense, Augmented Reality is all about displaying 3D content in the real world, tethering your 3D content to anchors that are tracked and followed, making the 3D content appear as though it truly is in front of your user.
In short, ARKit does the work of finding those anchors and tracking them, and it handles the computations and augmentations needed to keep your 3D content tethered to them, so that the experience feels realistic.
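To make that concrete, here is a minimal sketch of how an ARKit session is typically configured and how anchors arrive through the session delegate. The view controller name is illustrative, and the sketch happens to use an ARSCNView, but the same session APIs apply whichever Content Technology you choose below.

```swift
import UIKit
import ARKit

// Minimal sketch: configure an ARKit session and observe the anchors it finds.
// The class name and the use of ARSCNView are illustrative, not prescriptive.
class ARSessionViewController: UIViewController, ARSessionDelegate {
    let arView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)
        arView.session.delegate = self
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Ask ARKit to track the device's position and detect horizontal surfaces.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        arView.session.run(configuration)
    }

    // ARKit calls this as it discovers new anchors (for example, detected planes).
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors where anchor is ARPlaneAnchor {
            print("Found a plane anchor at \(anchor.transform)")
        }
    }
}
```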
Content Technology
Whereas ARKit handles the heavy lifting of configuring the camera, finding anchors, and tracking those anchors, you have a choice of which Content Technology you use to actually render your 3D content. The Content Technology is the framework doing the heavy lifting of either loading your 3D model (which you probably created elsewhere, such as in a 3D modeling program or in Reality Composer) or creating 3D content programmatically. There are four main choices for Content Technology:
- RealityKit :-

RealityKit was announced at WWDC 2019 and is the newest of the 3D graphics technologies available in iOS. Much like other 3D technologies available in iOS, RealityKit offers you the ability to load 3D models you may have created in other 3D modeling programs, create 3D content (such as boxes, spheres, text, etc.), as well as create 3D lights, cameras, and more.
As described in the RealityKit documentation, RealityKit lets you "simulate and render 3D content for use in your augmented reality apps." In other words, RealityKit complements ARKit: ARKit gathers the information from the camera and sensors, and RealityKit renders the 3D content.
RealityKit reads the .usdz, .rcproject, and .reality file formats. It supports transform and asset animation, rigid body dynamics, PBR materials, HDR image-based lighting, raycasting, and spatial audio. All scene models must be tethered to anchors (the AnchorEntity class). The framework automatically generates and uses mipmaps, a series of progressively lower-resolution variants of an object's texture that improve render times when applied to distant objects. RealityKit can also work with the polygonal mesh generated by the Scene Reconstruction feature. Also worth mentioning is AR Quick Look, a zero-configuration framework built on the RealityKit engine and designed for fast AR visualization.
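As a rough illustration of those ideas, the sketch below loads a .usdz model, generates a simple box procedurally, and tethers both to a horizontal-plane AnchorEntity. The class name and the "toy_robot" asset name are placeholders.

```swift
import UIKit
import RealityKit

// Minimal RealityKit sketch: load a .usdz model and tether it to a horizontal plane.
// "toy_robot" is a placeholder asset name; use any model bundled with your app.
class RealityKitViewController: UIViewController {
    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // An AnchorEntity ties the content to a real-world feature that ARKit tracks.
        let anchor = AnchorEntity(plane: .horizontal)

        // Load a model created elsewhere (e.g. a 3D modeling program or Reality Composer).
        if let model = try? Entity.loadModel(named: "toy_robot") {
            anchor.addChild(model)
        }

        // Procedural content works too: a small box with a physically based material.
        let box = ModelEntity(mesh: .generateBox(size: 0.1),
                              materials: [SimpleMaterial(color: .blue, isMetallic: true)])
        box.position = [0.2, 0, 0]
        anchor.addChild(box)

        arView.scene.addAnchor(anchor)
    }
}
```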
Sample Project : Creating Screen Annotations for Objects in an AR Experience
- SceneKit :-

SceneKit is another popular choice for working with ARKit. SceneKit is widely used in iOS development for generating 3D content. Similar to RealityKit, SceneKit offers the ability to load and create 3D models, handle lighting, reflections, shadows, and so on, and it works hand-in-hand with ARKit. SceneKit is also popular in game development, and given that many developers already know SceneKit from building 3D games, it is a great way to bring that understanding to the world of Augmented Reality, as many of the same principles from 3D game development apply to AR.
SceneKit reads in several file formats, including .usdz, .dae, and .scn. It supports nested asset animation, dynamics, particles, PBR materials, HDR IBL, and spatial audio. For implicit and explicit transform animation of any node you can use the SCNAction, SCNTransaction, and CAAnimation classes. Collision setup in SceneKit, however, is a little complicated. To create a modular, scalable game architecture with SceneKit, you need to implement GameplayKit's entity-component pattern. SceneKit still has several advantages over RealityKit 2.0; one thing Swift developers often forget is that SceneKit apps written in Objective-C compile quickly.
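Here is a small sketch of how SceneKit content might be tethered to an ARKit anchor and animated with SCNAction. The delegate class name is made up, and the sketch assumes it has been set as the delegate of an ARSCNView running a session with plane detection enabled.

```swift
import ARKit
import SceneKit

// Minimal SceneKit + ARKit sketch: when ARKit detects a plane anchor, attach a
// spinning box to the node that ARKit created for it.
class SceneKitARDelegate: NSObject, ARSCNViewDelegate {

    // ARKit added a node for a newly detected anchor; hang our content off of it.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }

        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0.01)
        box.firstMaterial?.lightingModel = .physicallyBased
        let boxNode = SCNNode(geometry: box)
        boxNode.position = SCNVector3(0, 0.05, 0)

        // Explicit animation with SCNAction: rotate forever around the y-axis.
        let spin = SCNAction.rotateBy(x: 0, y: .pi * 2, z: 0, duration: 4)
        boxNode.runAction(.repeatForever(spin))

        node.addChildNode(boxNode)
    }
}
```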
Sample Project : Tracking and Visualizing Faces
- SpriteKit :-

SpriteKit is another popular choice for game development, and its principles can still be applied when brought into the world of AR. SpriteKit is a highly performant framework that traditionally deals in 2D content. Again, this is a hugely popular framework for iOS game development, and its ability to work hand-in-hand with ARKit allows developers with existing knowledge to implement AR experiences.
SpriteKit is a general-purpose 2D framework that leverages Metal to achieve high-performance rendering, while offering a simple programming interface to make it easy to create games and other graphics-intensive apps. Using a rich set of animations and physics behaviors, you can quickly add life to your visual elements and gracefully transition between screens.
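A minimal sketch of the SpriteKit path is below: ARSKView asks its delegate for a 2D node to show at each anchor and keeps that node positioned over the anchor's real-world location. The delegate class name is illustrative, and it assumes an ARSKView running a world-tracking session.

```swift
import ARKit
import SpriteKit

// Minimal SpriteKit + ARKit sketch: for every anchor ARKit reports, return a 2D
// node that ARSKView keeps positioned over the anchor's real-world location.
class SpriteKitARDelegate: NSObject, ARSKViewDelegate {

    func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
        // SpriteKit content is 2D; ARSKView billboards it at the anchor's position.
        let label = SKLabelNode(text: "📍")
        label.horizontalAlignmentMode = .center
        label.verticalAlignmentMode = .center
        return label
    }
}
```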
Documentation : Providing 2D Virtual Content with SpriteKit
- Metal | MetalKit :-

Metal is a low-level graphics framework that is hugely powerful. In its simplest form, Metal lets you take control of the entire graphics pipeline, offering you the ability to develop experiences from the ground up while maintaining exceptional performance. Metal talks directly to your device's GPU and gives you more nuanced control over how everything from the camera feed to your 3D content appears.
All of the aforementioned frameworks are built on top of Metal, and all are built to offer the same incredible performance and security that Metal provides. If you find yourself needing to work more directly with the GPU, Metal is your best choice.
Metal really shines when you need to render considerably more polygons or 3D particles than SceneKit or RealityKit can handle. Developers usually turn to Metal for high-quality GPU rendering in games with sophisticated 3D environments, for video-processing apps like Final Cut Pro and Nuke, for 3D apps like Maya, or for big-data scientific apps that demand maximum performance. Note also that ray tracing in Metal is much faster than in RealityKit.
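For a sense of what working at this level looks like, here is a bare-bones MetalKit sketch that simply clears an MTKView each frame. A real AR renderer layers the camera texture, depth, and matting on top of this, but the device / command queue / command buffer skeleton is the same; the class name is hypothetical.

```swift
import MetalKit

// Minimal MetalKit sketch: drive an MTKView directly, clearing it to a color each frame.
class ClearColorRenderer: NSObject, MTKViewDelegate {
    let device: MTLDevice
    let commandQueue: MTLCommandQueue

    init?(view: MTKView) {
        guard let device = MTLCreateSystemDefaultDevice(),
              let queue = device.makeCommandQueue() else { return nil }
        self.device = device
        self.commandQueue = queue
        super.init()
        view.device = device
        view.clearColor = MTLClearColor(red: 0, green: 0, blue: 0.2, alpha: 1)
        view.delegate = self
    }

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

    func draw(in view: MTKView) {
        guard let descriptor = view.currentRenderPassDescriptor,
              let drawable = view.currentDrawable,
              let commandBuffer = commandQueue.makeCommandBuffer(),
              let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: descriptor)
        else { return }
        // A full renderer would encode draw calls here; this pass only clears the view.
        encoder.endEncoding()
        commandBuffer.present(drawable)
        commandBuffer.commit()
    }
}
```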
Documentation : Effecting People Occlusion in Custom Renderers
For this series of tutorials you will need:
- Xcode
The first thing you need to do is download Xcode from the Mac App Store. Once it is installed, open it up and familiarize yourself with the interface. You also need an Apple Developer account and to register your devices with the program. This will allow you to install and run your apps on physical devices for testing purposes.
- iPhone or iPad
AR Supported Apple Devices :-
