Developer Guide

Tutorial: Use Tracking on iOS with CraftAR Augmented Reality


Dec 22, 2020

Catchoom Team


This section applies only to the Augmented Reality SDK v4+. Are you still using an older version? Previous versions of the SDK no longer receive updates. If you need help transitioning to the newer version, you can get assistance through our support channel.

The CraftAR iOS Augmented Reality SDK allows you to create AR apps that render the experiences created with the CraftAR service. If you’re not yet familiar with the general steps, read How to add augmented reality into your app.

An Augmented Reality app using the iOS native SDK can be implemented in two steps: first, set up the UIViewController, and then trigger the Augmented Reality experiences.

If you want to see an example that combines Cloud Image Recognition (see Tutorial: Use Cloud Image Recognition on iOS) with Tracking, take a look at the open-source samples available in our GitHub repository.

Setting up the SDK in your UIViewController

Once you have set up the CraftARSDK in your Xcode project, it’s time to implement the UIViewController that will show the experience.

1. Adopt CraftARSDKProtocol in your UIViewController

Adopt the <code>CraftARSDKProtocol</code> and add the <code>CraftARTracking</code> interface to your UIViewController.
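A minimal sketch in Swift of what this declaration could look like. The protocol and class names (<code>CraftARSDKProtocol</code>, <code>CraftARSDK</code>, <code>CraftARTracking</code>) are taken from this tutorial; the module import name and property layout are assumptions for illustration.

```swift
import UIKit
// Assumed module name for the CraftAR AR SDK:
// import CraftARAugmentedRealitySDK

class MyViewController: UIViewController, CraftARSDKProtocol {

    // Keep references as properties so they live as long as
    // the view controller (names are illustrative).
    var sdk: CraftARSDK!
    var tracking: CraftARTracking!

    // Outlet for the UIView that will host the camera preview
    // and the AR rendering.
    @IBOutlet var videoPreviewView: UIView!
}
```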

2. Get the instance of the CraftARSDK

Once the view is loaded, you can get an instance of the CraftARSDK.
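Continuing the sketch above, the instance can be obtained in <code>viewDidLoad</code>. The shared-instance accessor name below is an assumption; check the SDK headers for the exact signature.

```swift
override func viewDidLoad() {
    super.viewDidLoad()
    // Get the shared CraftARSDK instance and register this
    // view controller for the SDK callbacks (accessor and
    // delegate property names are assumptions).
    sdk = CraftARSDK.sharedCraftARSDK()
    sdk.delegate = self
}
```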

3. Start the VideoCapture module

Once the view has loaded and is about to appear, you can initialize the VideoCapture module of the CraftARSDK with a specific UIView.

Note: the ‘videoPreviewView’ you provide will be loaded with a rendering view, and no other subviews will be displayed inside it. If you need to display other UIViews as part of MyViewController, add them to MyViewController’s self.view (i.e. at the same level as ‘videoPreviewView’).
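In the sketch started above, this step would go in <code>viewWillAppear</code>. The capture-start method name is an assumption based on the tutorial text.

```swift
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    // Start the VideoCapture module on the preview view; the SDK
    // will fill 'videoPreviewView' with its own rendering view
    // (method name is an assumption).
    sdk.startCapture(with: videoPreviewView)
}
```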

4. Get the instance of Tracking

Once the VideoCapture module is ready, it calls back to didStartCapture. There you can set up the Tracking interface.
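The callback could look like the following; the <code>didStartCapture</code> name is taken from this tutorial, while the shared-instance accessor for Tracking is an assumption.

```swift
// Called by the SDK once the VideoCapture module is ready
// (part of CraftARSDKProtocol in this tutorial's flow).
func didStartCapture() {
    // Obtain the Tracking interface so AR items can be added
    // later (accessor name is an assumption).
    tracking = CraftARTracking.sharedTracking()
}
```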

Rendering Augmented Reality

Once your ViewController adopts the necessary protocols and holds an instance of Tracking, it’s time to add code to start using the iOS Augmented Reality SDK in your mobile app.

In most cases, you’ll use the Cloud Image Recognition service from CraftAR to recognize the object and obtain the necessary AR scene in return. Take a look at the tutorial about Cloud Image Recognition to see the flow.
Next is an example of the calls that are required to start rendering the scene attached to a CraftARItem.
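As a hedged sketch of that flow: after Cloud Image Recognition returns a result carrying an AR item, the item is added to Tracking and tracking is started. The result-handling callback, the <code>CraftARItemAR</code> subtype, and the method names are assumptions based on this tutorial; only <code>CraftARItem</code> and the Tracking interface are named in the text.

```swift
// Hypothetical search-results callback from the Cloud Image
// Recognition flow (see the Cloud Image Recognition tutorial).
func didGetSearchResults(_ results: [CraftARSearchResult]) {
    // Take the first matched item that carries an AR scene
    // (the CraftARItemAR cast is an assumption).
    guard let item = results.first?.item as? CraftARItemAR else {
        return
    }
    // Add the item to Tracking and start tracking so its AR
    // scene is rendered on top of the camera preview
    // (method names are assumptions).
    tracking.add(item)
    tracking.startTracking()
}
```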
