Dec 22, 2020
Catchoom Team
Are you still using an older version? Previous versions of the SDKs no longer receive updates. If you need help transitioning to the newest version, you can reach us through our support channel.
An Image Recognition app using the iOS native SDK can be implemented in two steps. First, set up the UIViewController; then run image recognition and parse the results for each recognized item.
If you want to see an example that implements Cloud Image Recognition using the Augmented Reality SDK, take a look at the open-source samples available in our GitHub repository: https://github.com/Catchoom/craftar-example-ios
Once you have added the CraftAR SDK to your Xcode project, it’s time to implement the UIViewController that will show the experience.
The following is an example of an app that uses CraftAR’s Cloud Image Recognition.
Adopt the CraftARSDKProtocol and the SearchProtocol in your UIViewController:
```objc
#import "MyViewController.h"
#import <CraftARSDK/CraftARSDK.h>

@interface MyViewController () <CraftARSDKProtocol, SearchProtocol> {
    // CraftAR SDK reference
    CraftARSDK *mSDK;
    // Cloud Image Recognition module reference
    CraftARCloudRecognition *mCloudRecognition;
}
@end
```
Once the view is loaded, get the shared instances of the CraftARSDK and the CloudRecognition module, and set yourself as their delegate.
```objc
- (void)viewDidLoad {
    [super viewDidLoad];

    // Get the instance of the SDK and become its delegate
    mSDK = [CraftARSDK sharedCraftARSDK];
    mSDK.delegate = self;

    // Get the Cloud Recognition module and set 'self' as delegate to receive
    // the SearchProtocol callbacks
    mCloudRecognition = [CraftARCloudRecognition sharedCloudImageRecognition];
    mCloudRecognition.delegate = self;
}
```
When the view is about to appear, initialize the VideoCapture module of the CraftARSDK with a specific UIView from your storyboard.
```objc
- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    // Start the video preview for search
    [mSDK startCaptureWithView:self._preview];
}
```
Note: the view you pass to startCaptureWithView: will be loaded with a rendering view, and no other subviews will be displayed in it. If you need to display other UIViews as part of MyViewController, add them to self.view of MyViewController (i.e. at the same level as the video preview view).
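As an illustrative sketch of that note (the label and its frame are hypothetical overlay UI, not part of the SDK), adding a control on top of the video preview could look like this:

```objc
// In MyViewController, e.g. at the end of viewDidLoad.
// Add overlay UI to self.view, not to the video preview view:
// the SDK loads the preview view with its own rendering view.
UILabel *scanHintLabel = [[UILabel alloc] initWithFrame:
    CGRectMake(0, 40, self.view.bounds.size.width, 30)];
scanHintLabel.text = @"Point the camera at an item";
scanHintLabel.textAlignment = NSTextAlignmentCenter;
scanHintLabel.textColor = [UIColor whiteColor];
[self.view addSubview:scanHintLabel]; // sibling of the preview view
```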
Once the VideoCapture module is ready, it calls back didStartCapture. Here you can set the CloudRecognition’s search controller as the SDK’s searchControllerDelegate; the SDK manages the Single Shot and Finder Mode searches with the frames and pictures from the camera. This is also a good moment to set the token of the collection that you want to search when performing Cloud Image Recognition.
```objc
- (void)didStartCapture {
    // The SDK manages the Single Shot search and the Finder Mode search;
    // the Cloud Recognition module is the delegate for doing the searches.
    // This needs to be done after the camera initialization.
    mSDK.searchControllerDelegate = mCloudRecognition.mSearchController;

    // Set the collection we will search in, using its token.
    [mCloudRecognition setCollectionWithToken:@"catchoomcooldemo" onSuccess:^{
        NSLog(@"Ready to search!");
    } andOnError:^(NSError *error) {
        NSLog(@"Error setting token: %@", error.localizedDescription);
    }];
}
```
Once your ViewController adopts the necessary protocols and holds an instance of CloudRecognition, it’s time to add the code that starts scanning the real world.
The SDK class manages the search modes (Finder Mode and Single Shot Mode) and forwards the camera events to the assigned search controller delegate, in this case the Cloud Recognition instance. When a response arrives, the Cloud Recognition module notifies its delegate, your ViewController.
You can scan in either of the two modes described below.
You can call singleShotSearch to perform a search with a single image. This method sends a single query to CraftAR’s Cloud Image Recognition. The response to that query triggers didGetSearchResults. You could for instance call the following function once the Collection token has been set:
```objc
- (void)mySearchFunction {
    // The searchControllerDelegate (the mCloudRecognition instance)
    // will receive the camera events and search with the picture
    // or image frames coming from the camera.
    [mSDK singleShotSearch];
}
```
Then, your implementation of didGetSearchResults should parse the results:
```objc
- (void)didGetSearchResults:(NSArray *)results {
    // Parse the results
    for (CraftARSearchResult *result in results) {
        // Each result has one item
        CraftARItem *item = result.item;
        // Obtain the information/experience for each item
    }
    // singleShotSearch freezes the video preview; restart the capture
    // to unfreeze it.
    [[mSDK getCamera] restartCapture];
}
```
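Inside that loop you would typically read the item’s metadata. As a sketch, assuming your items define name and url fields in the CraftAR service (verify the exact property names against the CraftARItem header of your SDK version):

```objc
- (void)didGetSearchResults:(NSArray *)results {
    for (CraftARSearchResult *result in results) {
        CraftARItem *item = result.item;
        // 'name' and 'url' are assumed item fields set in the CraftAR
        // service; check the CraftARItem header for your SDK version.
        NSLog(@"Found item: %@", item.name);
        if (item.url.length > 0) {
            // e.g. show the item's content URL in your own UI
            NSLog(@"Item content URL: %@", item.url);
        }
    }
    [[mSDK getCamera] restartCapture];
}
```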
You can call startFinder to start searching continuously without user intervention. This method sends queries at a controllable rate to CraftAR’s Cloud Image Recognition. For every query, the response triggers didGetSearchResults.
You could for instance call the following function once the Collection token has been set:
```objc
- (void)mySearchFunction {
    // The searchControllerDelegate (the mCloudRecognition instance)
    // will receive the camera events and search with the picture
    // or image frames coming from the camera.
    [mSDK startFinder];
}
```
Then, your implementation of didGetSearchResults should parse the results. In Finder Mode, stop the finder first so no further callbacks arrive while you process the current results:
```objc
- (void)didGetSearchResults:(NSArray *)results {
    // Stop the Finder Mode while the results are processed
    [mSDK stopFinder];
    // Parse the results
    for (CraftARSearchResult *result in results) {
        // Each result has one item
        CraftARItem *item = result.item;
        // Obtain the information/experience for each item
    }
}
```