Augmented Reality in iOS

1. What is AR?

Augmented reality is what it sounds like: reality, enhanced with interactive digital components. The most commonly used AR applications these days rely on smartphones to showcase the digitally augmented world: users can activate a smartphone’s camera, view the real world around them on the screen, and rely on an AR application to enhance that world in any number of ways via digital overlays:
  • Superimposing images, digital information or 3D models
  • Adding real-time directions
  • Changing colors
  • Altering the user or their environment's appearance via "filters" on Instagram, Snapchat, and other apps
Common Augmented Reality Use Cases
So what is augmented reality used for these days? Nearly every industry has found ways to apply AR technology to improve processes and outcomes. Common uses include:
  • Training and education
  • Gaming
  • Selling
Today, most of these augmented reality experiences are made possible by smartphones. However, the development of more advanced AR devices (like Apple’s AR glasses or Microsoft's HoloLens) could open the door to even more applications. The benefits of AR are only continuing to expand to new sectors, such as healthcare, manufacturing, utilities, telecommunications, education, and public safety. Imagine, for example, viewing IKEA furniture in the comfort of your own physical environment via AR, ordering it online, and then receiving it with assembly instructions that project themselves right onto the pieces in the box.
AR Technology: What Do Brands Need?
The answer will vary among brands, but the short version is that adding AR to a website today is much easier, faster, and more accessible than it was even a couple of years ago, thanks to new software on the market. If you’re intrigued by the potential AR has to improve your bottom line, here are a few guiding principles to keep in mind:
  • AR functionality should live on your website, not in a dedicated AR app. Customers don’t want to download an app just to use AR features. To enjoy the sales boost that AR features can offer, eliminate unnecessary hurdles to using them.
  • AR-generating software can save you time, energy, and money. A few years ago, any brand that wanted AR had to build the functionality itself, from the ground up. Today, software like Threekit makes it possible to generate AR content with a SaaS model. This makes the path to AR smoother, faster, and more accessible.
  • Build AR for smartphones. Again, if you want the benefits AR offers, you have to maximize the odds that your customers will use it. That means creating AR experiences that work with the tech most people have today, which means smartphones (both iPhone and Android).
Examples of augmented reality in action
AR for IKEA furniture shopping
When shopping for furniture online, many customers may want the same experience they can get when visiting the store and seeing furniture options in person. IKEA understood this need and was able to deliver a more immersive online furniture shopping experience through AR.
Showing Dulux paint colors on walls
Another company putting AR to the test is Dulux, which has used an immersive AR application and interface to enable users to see how different paint colors would look on their walls at home. Customers can use Dulux's Visualizer app on Android and iOS devices to view over 1,200 paint colors on their walls. This helps take paint color selection to the next level.

2. AR With iOS

Augmented reality (AR) describes user experiences that add 2D or 3D elements to the live view from a device’s camera in a way that makes those elements appear to inhabit the real world. ARKit combines device motion tracking, camera scene capture, advanced scene processing, and display conveniences to simplify the task of building an AR experience. You can create many kinds of AR experiences with these technologies using the front or rear camera of an iOS device.
ARKit
ARKit is the underlying framework that handles the "heavy lifting" of Augmented Reality experiences. ARKit configures the camera, gathers the relevant sensor data, and is responsible for detecting and locating the "anchors" that will tether your 3D content to the real world, as seen through the camera. In a sense, Augmented Reality is all about displaying 3D content in the real world, tethering your 3D content to anchors that are tracked and followed, making the 3D content appear as though it truly is in front of your user.

As a whole, ARKit does the work of finding anchors and tracking them, and it handles the computations and augmentations that keep your 3D content tethered to those anchors, making the experience seem realistic.

Anchors can come in a variety of forms:
  • A plane: the most common anchor, either a horizontal plane (like a floor, table top, or the ground) or a vertical plane (like a wall, window, or door).
  • A face: a human face.
  • An image: you provide your app an image, and when the camera detects that image, it becomes the "anchor" for your 3D content.
  • An object: you provide your app a 3D object, and when the camera detects that object in the real world, it becomes the "anchor" for your 3D content.
  • A body: for tracking the movement of a person's joints and applying that movement to a 3D character.
  • A location: using ARGeoAnchors, which anchor your 3D content to a specific set of longitude/latitude/altitude coordinates (a CLLocation from the CoreLocation framework), if you are in a supported location.
  • A mesh: if your device has a LiDAR scanner, ARKit becomes capable of detecting more nuanced planes, such as recognizing a floor plane vs. a table-top plane, or a door plane vs. a wall plane.
In all, your 3D content has to be anchored to something in the real world, and ARKit handles finding these anchors and providing them to you for your use.
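For a concrete picture of how your app receives these anchors, here is a minimal sketch of ARKit's anchor-finding workflow: it runs an ARWorldTrackingConfiguration with plane detection enabled and observes new anchors through the session delegate. The class name AnchorLogger and the start(session:) helper are illustrative, not part of ARKit.

```swift
import ARKit

// Illustrative sketch: configure ARKit to detect plane anchors and
// observe them as they are found. "AnchorLogger" is a hypothetical name.
class AnchorLogger: NSObject, ARSessionDelegate {

    func start(session: ARSession) {
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical]  // floors, tables, walls
        session.delegate = self
        session.run(configuration)
    }

    // ARKit calls this whenever it finds new anchors to tether content to.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors {
            if let plane = anchor as? ARPlaneAnchor {
                print("Detected a \(plane.alignment) plane, extent: \(plane.extent)")
            }
        }
    }
}
```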
Content Technology
Whereas ARKit handles the heavy lifting of configuring the camera, finding anchors, and tracking those anchors, you have a choice of which Content Technology you want to use to actually render and show your 3D content. The Content Technology is the framework that does the heavy lifting of either loading your 3D model (which you probably created elsewhere, such as in a 3D modeling program or in Reality Composer) or creating 3D content programmatically. There are four main choices for Content Technology:
  • RealityKit :-
RealityKit was announced at WWDC 2019 and is the newest of the 3D graphics technologies available in iOS. Much like other 3D technologies available in iOS, RealityKit offers you the ability to load 3D models you may have created in other 3D modeling programs, create 3D content (such as boxes, spheres, text, etc.), as well as create 3D lights, cameras, and more.

As described in the RealityKit documentation, RealityKit allows you to "Simulate and render 3D content for use in your augmented reality apps." In short, RealityKit complements ARKit: ARKit gathers the information from the camera and sensors, and RealityKit renders the 3D content.

RealityKit has no Objective-C legacy: it supports only Swift, with a rather declarative syntax (like SwiftUI). The main advantage of RealityKit is that it can complement, change, and customize scenes coming from the Reality Composer app, and it can be a powerful extension for ARKit, although it's quite possible that in the near future RealityKit will work completely standalone (without ARKit).

RealityKit reads in the .usdz, .rcproject, and .reality file formats. It supports transform and asset animation, rigid-body dynamics, PBR materials, HDR image-based lighting, raycasting, and spatial audio. All scene models must be tethered to anchors (the AnchorEntity class). The framework automatically generates and uses mipmaps, a series of progressively lower-resolution variants of an object's texture that improve render times when applied to distant objects. RealityKit can also work with the polygonal mesh generated by the Scene Reconstruction feature. Worth mentioning here is AR Quick Look, a zero-configuration framework built on the RealityKit engine and conceived for fast AR visualization.
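As a rough sketch of how this looks in practice (the asset name "toy_robot" is a placeholder, not from this article), you load an entity, attach it to an AnchorEntity, and add that anchor to an ARView's scene; RealityKit then renders the model and keeps it tethered to the detected plane:

```swift
import RealityKit

// Sketch: anchor a bundled .usdz model to the first horizontal plane found.
// "toy_robot" is a placeholder asset name.
func placeModel(in arView: ARView) {
    let anchor = AnchorEntity(plane: .horizontal)      // ARKit finds the plane

    if let model = try? Entity.loadModel(named: "toy_robot") {
        anchor.addChild(model)                          // tether the model to the anchor
    }

    arView.scene.addAnchor(anchor)                      // RealityKit renders and tracks it
}
```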

Sample Project : Creating Screen Annotations for Objects in an AR Experience
  • SceneKit :-
SceneKit is another popular choice for working with ARKit. SceneKit is wildly popular in iOS development for generating 3D content. Similar to RealityKit, SceneKit offers the ability to load and create 3D models, handle lighting, reflections, shadows, etc., and it works hand-in-hand with ARKit. SceneKit is also popular in game development, and given that many developers have experience with SceneKit from developing 3D games, it is a great way to bring that understanding to the world of Augmented Reality, as many of the same principles from 3D game development can be applied to AR.

SceneKit is a high-level 3D engine that runs on iOS and macOS; for AR projects you can use it only in conjunction with ARKit. SceneKit supports both Objective-C and Swift. In SceneKit the main unit is a node (the SCNNode class), which has its own hierarchy and can hold a light (SCNLight), a camera (SCNCamera), a geometry (SCNGeometry), a particle system (SCNParticleSystem), or audio players (SCNAudioPlayer). The main advantage of SceneKit is that it is highly customizable: it can change geometry and materials at runtime, it has morphers, skinners, and constraints, it renders a scene at up to 120 fps, and it has an advanced setup for particle systems. The available shaders are Blinn, Constant, Lambert, Phong, ShadowOnly, and PBR.

SceneKit reads in several file formats, including .usdz, .dae, and .scn. It supports nested asset animation, dynamics, particles, PBR materials, HDR IBL, and spatial audio. For implicit and explicit transform animation of any node you can use the SCNAction, SCNTransaction, and CAAnimation classes. Setting up collisions in SceneKit is a little bit complicated, though, and to create a modular and scalable game architecture with SceneKit you need to implement GameplayKit’s entity-component pattern. SceneKit still has several advantages over RealityKit 2.0; one thing Swift developers tend to forget is that Objective-C SceneKit apps have fast compile times.
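For illustration, here is a minimal sketch of SceneKit working hand-in-hand with ARKit through ARSCNView: when ARKit detects a plane anchor, the delegate attaches a small SCNBox to the node SceneKit creates for that anchor. The class name PlaneBoxRenderer is only an example:

```swift
import ARKit
import SceneKit
import UIKit

// Sketch: attach a SceneKit box to every plane ARKit detects.
// Assumes this object is the delegate of an ARSCNView running an
// ARWorldTrackingConfiguration with planeDetection enabled.
class PlaneBoxRenderer: NSObject, ARSCNViewDelegate {

    func renderer(_ renderer: SCNSceneRenderer,
                  didAdd node: SCNNode,
                  for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }

        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
        box.firstMaterial?.lightingModel = .physicallyBased   // PBR shading
        box.firstMaterial?.diffuse.contents = UIColor.systemBlue

        let boxNode = SCNNode(geometry: box)
        boxNode.position = SCNVector3(0, 0.05, 0)   // rest on top of the plane
        node.addChildNode(boxNode)                  // SceneKit keeps it tethered
    }
}
```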

Sample Project : Tracking and Visualizing Faces
  • SpriteKit :-
SpriteKit is another popular choice for game development, and its principles can still be applied when brought into the world of AR. SpriteKit is a highly performant framework that traditionally deals in 2D content. Again, this is a hugely popular framework for iOS game development, and its ability to work hand-in-hand with ARKit allows developers with existing knowledge to implement AR experiences.
SpriteKit is a general-purpose 2D framework that leverages Metal to achieve high-performance rendering, while offering a simple programming interface to make it easy to create games and other graphics-intensive apps. Using a rich set of animations and physics behaviors, you can quickly add life to your visual elements and gracefully transition between screens.
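As a small sketch of that pairing (the class name LabelNodeProvider is illustrative), ARSKView asks its delegate for a 2D node to display at each anchor's position, and SpriteKit keeps that node pinned to the anchor in the camera view:

```swift
import ARKit
import SpriteKit

// Sketch: provide a 2D SpriteKit node for each ARKit anchor.
// Assumes this object is the delegate of an ARSKView running a session.
class LabelNodeProvider: NSObject, ARSKViewDelegate {

    func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
        // The label is rendered in 2D but appears pinned to the anchor's
        // real-world position in the camera feed.
        let label = SKLabelNode(text: "📍")
        label.horizontalAlignmentMode = .center
        label.verticalAlignmentMode = .center
        return label
    }
}
```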

Documentation : Providing 2D Virtual Content with SpriteKit
  • Metal | MetalKit :-
Metal is a low-level graphics framework that is hugely powerful. In its simplest form, Metal allows you to take control of the entire graphics pipeline, offering you the ability to develop experiences from the ground up while maintaining exceptional performance. Metal talks directly to your device's GPU, and can allow you to have more nuanced control of the functionality of how everything from the camera to your 3D content appears.

All of the aforementioned frameworks are built on top of Metal, and all are built to offer the same incredible performance and security that Metal provides. If you find yourself needing to work more directly with the GPU, Metal is your best choice.

According to Apple's documentation, the Metal Shading Language is a C++-based programming language that developers can use to write code that is executed on the GPU for graphics and general-purpose data-parallel computations. Since it is based on C++, developers will find it familiar and easy to use. With Metal, both graphics and compute programs can be written in a single, unified language, which allows tighter integration between the two.

Metal begins to shine when you render a considerably greater number of polygons or 3D particles than SceneKit or RealityKit is capable of rendering. Developers usually use the Metal framework to build high-quality GPU rendering for games with sophisticated 3D environments, for video-processing apps like Final Cut Pro and Nuke, for 3D apps like Maya, or for big-data scientific apps that must perform heavy computation for research. Also consider that ray tracing in Metal is much quicker than in RealityKit.
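As a bare-bones sketch rather than a full renderer, everything in Metal starts from a device and a command queue; a custom ARKit renderer would then encode its drawing work into command buffers from that queue and present frames through an MTKView:

```swift
import Metal
import MetalKit

// Sketch: the two objects every Metal pipeline starts from.
guard let device = MTLCreateSystemDefaultDevice(),
      let commandQueue = device.makeCommandQueue() else {
    fatalError("Metal is not supported on this device")
}

// An MTKView draws each frame using this device; a custom AR renderer
// would feed it the camera image plus your 3D content via command
// buffers created from the queue.
let metalView = MTKView(frame: .zero, device: device)
print("Rendering with GPU:", device.name)
_ = (metalView, commandQueue)   // silence unused-variable warnings in this sketch
```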

Documentation : Effecting People Occlusion in Custom Renderers
For this series of tutorials you will need:
    • Xcode
Xcode is Apple’s development environment for building iOS apps, including AR apps. With it, you can create apps that bring your products and services to life like never before! In this series, we will show you how to get started with Xcode and AR and create some amazing apps that users will love.

The first thing you need to do is download Xcode from the Mac App Store. Once it is installed, open it up and familiarize yourself with the interface. You also need to have an Apple Developer account and register your devices with the program. This will allow you to install and run your apps on physical devices for testing purposes.
    • iPhone or iPad
In order to test your app on a real device, you need to have access to a physical iPhone or iPad. This will help you catch errors and glitches that may not be apparent in the simulator. For this part, you need to have an Apple Developer account registered beforehand. Connect your iPhone or iPad to your Mac and tap Trust on the device in order to allow your Mac to install and run the application you are about to build.
AR Supported Apple Devices :-
