Create a Watson image recognition app with Swift
Overview
One of the most impressive Watson cognitive analytic services is Visual Recognition. In this lab, you’ll use that service to analyze an image from a URL, classify its contents, and return confidence scores for relevant classifiers representing things such as objects, events, and settings.
For example, the Visual Recognition service recognizes the blue sky and mountain in this image and suggests that this landscape is from Yosemite National Park in the United States.
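For illustration, a classification response from the service takes roughly the following shape. The class names and scores shown here are invented examples for this Yosemite scenario, not real service output; consult the Visual Recognition API reference for the exact response schema.

```json
{
  "images": [
    {
      "classifiers": [
        {
          "classes": [
            { "class": "mountain", "score": 0.92 },
            { "class": "blue sky", "score": 0.88 }
          ]
        }
      ]
    }
  ]
}
```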
To create the application in this lab, follow these main steps:
- Create a simple iOS application in Swift.
- Instantiate the Watson SDK for iOS.
- Create a Watson Visual Recognition service instance in Bluemix and get its API key.
- Add some lines of code in the Swift iOS application to call the Watson service.
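The steps above boil down to a short piece of Swift. The sketch below assumes the SDK's `VisualRecognition` class and a `classify(_:failure:success:)` call shape from the Watson Developer Cloud SDK of this era; the API key, version date, and image URL are placeholders, and you should confirm the exact signatures against the SDK's README.

```swift
// A minimal sketch of classifying an image URL with the Watson SDK.
// Placeholders: apiKey, version, and the image URL are assumptions.
import VisualRecognitionV3

let apiKey = "your-service-api-key"          // from your Bluemix service credentials
let version = "2016-05-20"                   // service version date
let visualRecognition = VisualRecognition(apiKey: apiKey, version: version)

let imageURL = "https://example.com/yosemite.jpg" // hypothetical image URL
visualRecognition.classify(imageURL, failure: { error in
    print("Classification failed: \(error)")
}) { classifiedImages in
    // Walk the response and print each class with its confidence score.
    for image in classifiedImages.images {
        for classifier in image.classifiers {
            for classification in classifier.classes {
                print("\(classification.classification): \(classification.score)")
            }
        }
    }
}
```

Because the call is asynchronous and requires live service credentials, you will only see output when the app runs against a provisioned Bluemix service.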
You can see the code for the GUI for this lab in GitHub: https://github.com/blumareks/Swift-Watson-Guis.
Prerequisites
You need the following software:
- Mac OS X El Capitan
- Xcode 7.3 or later
- Swift 2.2.x
- Carthage (a dependency manager for Cocoa projects, similar in role to CocoaPods): https://github.com/Carthage/Carthage#installing-carthage
Complete the previous lab “Create a Watson sentiment analysis app with Swift.”
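If you have not already pulled in the Watson SDK via Carthage in the previous lab, a Cartfile along these lines declares the dependency. The repository path shown is the one commonly used for the Watson iOS SDK at this time; confirm it against the SDK's installation instructions.

```
github "watson-developer-cloud/ios-sdk"
```

Then run `carthage update --platform iOS` from your project directory to fetch and build the frameworks.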
Currently, the Watson Developer Cloud SDK for Swift requires a legacy version of Swift. Be sure to select the option to support Legacy Swift (Swift 2.3):