Robots are coming! Build IoT apps with IBM Watson, Swift, and Node-RED

Course Details

In this hands-on Internet of Things (IoT) course, you'll connect Node-RED flows (applications) running on IBM Bluemix (IBM Cloud) to additional flows and simple Swift applications. You'll use those applications to track the temperature of a Raspberry Pi CPU, store that data in an IBM Cloudant NoSQL database, take pictures with the RaspCam, send Twitter notifications, and send simple commands from your smartphone to the iRobot. Important: You don't need to purchase the iRobot to be successful in this course, but playing with a programmable robot can be a lot more fun! For a quick overview of the IBM Watson IoT Platform, see the video IoT made simple with Watson IoT Platform (3:17 minutes).

Lab 1: read temperature data from an IoT sensor

Create the Internet of Things Platform Starter in IBM Bluemix and connect to a virtual sensor
In this section, you'll deploy the Internet of Things Platform Starter, which is a boilerplate application, and connect it to a virtual sensor on a web page. You don't need your Raspberry Pi device for this section.

1. Log in to Bluemix and click Catalog > Boilerplates > Internet of Things Platform Starter.

2. Enter a name and host for your application. Both names must be unique. Then, click Create.

3. After the application launches, click View App and then go to the Node-RED flow editor. You should see Flow 1 running on Bluemix with a virtual thermometer. For now, you'll work with the flow in the red square as shown below.

4. After you start your Internet of Things Platform Starter instance, open the following link to connect a temperature simulator to the IBM IoT App In node, shown in blue in Flow 1: http://ibm.biz/iotsensor

5. Copy the value in the upper-right corner of the virtual thermometer. In the image above, the value is 03075c7af9d9.

6. In the flow editor, double-click the IBM IoT App In node. Paste the ID of the sensor into the Device Id field. Click Done and then deploy the flow.

7. Go to the virtual sensor and increase the temperature to 41°C or more.

8. In the flow editor, note that the flow has three green debug output nodes that show flow data in the debug pane. Disconnect the green Debug output payload node in the top flow by clicking the connecting line and pressing Delete on your keyboard. Disconnect the device data node in the bottom flow as shown in the following image. Then, click Deploy. You might need to scroll down in the debug pane to see the simplified view of temperatures.
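For orientation, each reading from the simulator arrives in the flow as a JSON event over MQTT. The exact fields come from the simulator, but the payload looks something like this illustrative sample:

    {
      "d": {
        "name": "03075c7af9d9",
        "temperature": 41,
        "humidity": 76,
        "objectTemp": 24
      }
    }

The flow's temperature threshold check is what makes readings of 41°C or more show up differently in the debug pane.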
Prepare the Raspberry Pi to use as an input source
Be sure you already set up your Raspberry Pi device. For help with setting up the device, see the Help videos and the Raspberry Pi Software Guide.

1. Turn on the Raspberry Pi device running Raspbian Jessie. Connect all the peripherals: monitor, keyboard, mouse, and a micro USB power supply (a smartphone or digital camera charger works).

2. Configure the device network to use the same LAN that you are using. Suggestion: Use an access point that you can reuse in different locations.

3. Launch a terminal on the Raspberry Pi and get the IP address that is assigned to it by running this command:

    ifconfig

   Read and save the assigned IP address. On a typical LAN, it is something like 192.168.1.213.

4. When the IP address is known and your computer is on the same LAN, launch a terminal on your computer. Run the following Secure Shell (SSH) command, entering your assigned IP address in place of the example address:

    ssh pi@192.168.1.213

   If the SSH command does not work, follow these steps:
   a. Install SSH: sudo apt-get install ssh. Wait for the installation to complete.
   b. Start the daemon (the UNIX name for a service) with this command from the terminal: sudo /etc/init.d/ssh start
   c. Optional: To start the daemon every time you start the Raspberry Pi, add the command: sudo update-rc.d ssh defaults

5. When you see the prompt for the password, enter your own password, or enter raspberry, which is the standard default password, and press Enter.

6. Start Node-RED by entering one of the following commands. You can start it automatically whenever the Raspberry Pi boots, or start it manually:
   Start Node-RED automatically: sudo systemctl enable nodered.service
   Start Node-RED manually: node-red-start

7. Enter ifconfig to see the IP address of the Raspberry Pi device, and enter that address, with the default Node-RED port 1880 (for example, http://192.168.1.213:1880), in a browser. You can now see the empty Node-RED environment...

Connect the Raspberry Pi Node-RED flows to IBM Bluemix
1. Import a new Node-RED flow into the flow editor on the Raspberry Pi device:
   a. Go to GitHub: https://raw.githubusercontent.com/ibm-messaging/iot-device-samples/master/node-red/device-sample/quickstart.json
   b. Copy the raw JSON from GitHub to your clipboard.
   c. In the flow editor, create a new flow tab.
   d. Import the new flow by clicking the Menu icon > Import > Clipboard.

2. Double-click the red exec node (getCPUtemp), which retrieves the CPU temperature from the Raspberry Pi device. Add the following information and then click OK:
   Command: vcgencmd
   Append: measure_temp
   Name: GetCPUtemp

3. Connect the Node-RED flow on the Raspberry Pi with the Node-RED flow that you already have on Bluemix. The communication between the Node-RED instance running on the Raspberry Pi and the one running on Bluemix is provided by the IBM Watson IoT Platform Quickstart service, which is based on MQTT, a lightweight messaging protocol.

4. Double-click the blue Watson IoT output node (wiotp out) in the flow on the Raspberry Pi device. Select Quickstart and enter a unique value for the Quickstart Id. This value can be any string, but it must be unique in the entire system. Then, click OK.

5. In the Bluemix flow, double-click the blue IBM IoT App In node and enter, as the Device Id, the same value that you used for the Quickstart Id on the Raspberry Pi. This is how the two Node-RED environments communicate. Then, click Done.

6. When the flow is set up as described in the previous steps, click Deploy. Then, click the timestamp node on the Raspberry Pi to start the flow, which sends CPU data to the Node-RED instance on Bluemix every 5 seconds.

7. Review the output in the debug panes. The two outputs are dissimilar because the Node-RED flow on the Raspberry Pi does not have the trigger warning that you used for the Node-RED flow in Bluemix. If you disconnect the connections to the green debug node on the Raspberry Pi, you won't see any output in the first debug pane, which is highlighted in red, but you will see output in the debug pane highlighted in blue...
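As a point of reference, the vcgencmd measure_temp command that the exec node runs prints the CPU temperature as a short string (the value varies, of course):

    temp=47.2'C

The imported flow parses the numeric value out of this string before publishing it over MQTT, which is why the debug pane on Bluemix shows plain temperature numbers rather than the raw string.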

Add a Cloudant database to store temperature data
In addition to adding Twitter notifications, you can add a Cloudant database node to store the temperature data. Then, you can read that data from your smart device when you want to validate what is happening with your physical temperature sensor and follow up with your social network followers.

1. In the Bluemix dashboard, select your Watson IoT boilerplate application. Then, click Overview in the left navigation.

2. Under Connections, select the existing Cloudant NoSQL database service.

3. In the Cloudant service dashboard, click Launch. Then, add a new database by clicking Create Database, name it temperatures, and click Create.

4. In the Node-RED flow editor, from the storage category of the palette, drag a cloudant out node onto the canvas. Be sure to use the cloudant out node with the Cloudant logo on the right, as shown in the following screen capture. Then, double-click it and configure it to add data to the internal Cloudant database by using the following values:
   Database: temperatures
   Operation: insert

5. Connect the cloudant out node and a function node to the previously created Node-RED flow at the output of the temp node.

6. Configure the timestamp function node to add the time to the temperature in msg.payload before it is inserted into the Cloudant database:

    msg.timestamp = new Date().toISOString();
    return msg;

7. Optional: Reposition the nodes on the canvas to make the flow easier to follow.

8. Redeploy the Node-RED flow.

9. To see how the temperatures are being recorded, open the Cloudant dashboard and review the temperature records.

If you can see the records as shown above, you are ready to build an iOS app that has a UI to display the temperatures that are read from the Cloudant database.
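Each reading is stored as a separate JSON document. Given the function node above, a stored record might look roughly like the following sample; the _id and _rev fields are generated by Cloudant, and the exact temperature fields depend on how your flow builds msg.payload:

    {
      "_id": "4cbbaa32b0e4d4b2bd59ba12e97e04c2",
      "_rev": "1-f6d48e98c2ae5d29a30c34b77e543064",
      "temp": 41.2,
      "timestamp": "2017-02-25T12:34:56.789Z"
    }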
Create a UI in Swift and Xcode
1. Create a UI using Swift in Xcode that has a button and a label named Refresh IoT Temp. If needed, see the solution video for this lab to see how to build this UI in Main.storyboard from the Object library.

2. Connect the UI with the code. If you need help, see the solution video for Lab 1. The following code shows the basic functionality without the wiring to the Cloudant database:

    import UIKit
    //import SwiftCloudant from CocoaPods

    class ViewController: UIViewController {

        @IBOutlet weak var tempLabel: UILabel!
        private var jsonTemp = 0.0

        override func viewDidLoad() {
            super.viewDidLoad()
            // Do any additional setup after loading the view, typically from a nib.
        }

        override func didReceiveMemoryWarning() {
            super.didReceiveMemoryWarning()
            // Dispose of any resources that can be re-created.
        }

        @IBAction func refreshButtonPressed(_ sender: Any) {
            NSLog("refresh IoT Temp button pressed")
            //get the temp
            //update the label
            tempLabel.text = "T= ?"
        }
    }

3. When the code shown above is working, close the project or the Xcode editor.

4. Add the SwiftCloudant CocoaPods resource. Your CocoaPods installation should match the latest Xcode. To get the latest update, enter the following in a terminal:

    sudo pod update

5. Install the CocoaPods SwiftCloudant pod:
   a. Open a terminal and go to the project root directory that you selected when you created the iOS application. You should see the name of the application with the extension .xcodeproj.
   b. In the root directory of the application, initialize the Podfile by issuing the command pod init.
   c. Modify the Podfile with the vim Podfile command: uncomment the platform line and add pod 'SwiftCloudant'. Then, press the Esc key and enter the vi editor command :wq to save the file and close the vim editor.
   d. Start the CocoaPods package manager by issuing the command pod install.

6. Open Xcode and, instead of launching the .xcodeproj, use the .xcworkspace, as described in the output of the CocoaPods project. In this example, it is the CheckIoTTemp.xcworkspace file as...
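Before you wire the UI to the database, it can help to see the shape of the Cloudant call from Swift. The following is a minimal sketch of what refreshButtonPressed might grow into after pod install: the URL and credentials are placeholders for the values from your own Cloudant service, and the FindDocumentsOperation call mirrors the one used later in Lab 2:

    import UIKit
    import SwiftCloudant

    class ViewController: UIViewController {

        @IBOutlet weak var tempLabel: UILabel!

        @IBAction func refreshButtonPressed(_ sender: Any) {
            // Placeholder credentials: copy the real values from your Cloudant service
            let cloudantUrl = NSURL(string: "https://your-account.cloudant.com")
            let client = CouchDBClient(url: cloudantUrl! as URL,
                                       username: "your-username",
                                       password: "your-password")

            // Read one document from the temperatures database
            let find = FindDocumentsOperation(selector: [:],
                                              databaseName: "temperatures",
                                              fields: ["temp", "timestamp"],
                                              limit: 1, skip: 0, sort: [],
                                              bookmark: nil, useIndex: nil, r: 1) { response, httpInfo, error in
                if let error = error {
                    print("Error reading temperature: \(error)")
                } else if let docs = response?["docs"] as? [[String: Any]],
                          let temp = docs.first?["temp"] {
                    // UI updates must happen on the main thread
                    DispatchQueue.main.async {
                        self.tempLabel.text = "T= \(temp)"
                    }
                }
            }
            client.add(operation: find)
        }
    }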


Lab 2: add a camera and analyze images

Lab 2 overview
The previous lab demonstrated how you can get an internal (CPU) temperature sensor on a Raspberry Pi to communicate with IBM® Bluemix® through the Watson IoT™ Platform. After you have the data on Bluemix, you can easily access, store, and analyze it by using a Cloudant database and Watson cognitive services.

In this lab, you will expand the functionality from Lab 1 into two-way communication between IBM Bluemix and a Raspberry Pi by using Node-RED and the Watson IoT Platform. You'll add a RaspCam camera to the Raspberry Pi to take pictures that are requested remotely and then send them to the IBM® Cloudant® NoSQL database on Bluemix for storage and analysis. You'll use the Watson Visual Recognition service to analyze these images. If a person is found in a picture, you can drill down further to find the person's gender and approximate age. This information can then be sent to a smartphone through the Twilio API service. For the communication between the Raspberry Pi and Bluemix, you'll use the Watson IoT Platform, which is already part of the Internet of Things Platform Starter boilerplate that you used in Lab 1.

Prerequisites

You need an IBM Bluemix account and a RaspCam, which is a camera for the Raspberry Pi. However, you can complete this lab without purchasing a camera: instead of using the camera, you can use images from the Internet, but using a camera is a lot more fun! Review the demo and "How it works" video on the IBM Watson Visual Recognition page. You also need some experience working with IBM Cloudant databases.

Flow architecture for this lab

The following image shows the three Node-RED flows that you'll use for this lab and how they relate to each other. The first flow runs in Node-RED on IBM Bluemix and sends data that is received by Node-RED on the...

Create a device schema on the Internet of Things Platform boilerplate on IBM Bluemix
To send messages to a device from IBM Bluemix, you need an IoT platform. In this lab, you'll use the Internet of Things Platform Starter boilerplate from Lab 1, starting with one of the services embedded in it: the Watson IoT Platform service. You can find that service in the overview of the services of your Bluemix application.

1. Log in to Bluemix, open the service, and then click Launch dashboard.

2. Select Devices and then click Add Device.

3. In a new Watson IoT Platform instance, there are no device types yet, so you need to create one. Click Create device type and enter something like RPi3 to identify your Raspberry Pi. For this lab, use the device type name RPi3.

4. Click Next in the next four dialogs until you see the page with the device type that you created in the list.

5. Select the RPi3 device type that you just created and click Next.

6. Enter the device ID. For example, enter RaspberryPi3. Then, click Next.

7. Specify an authentication token or use a randomly generated one, like the one shown here.

8. Copy this device information, especially the authentication token. Later, you need to enter this information in Node-RED on the Raspberry Pi. Important: You will not be able to see the authentication token after you leave this page, so be sure to copy and save it.
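When the device is registered, the platform displays the device credentials. A typical summary, with a made-up token, looks like this:

    Organization ID        abc123
    Device Type            RPi3
    Device ID              RaspberryPi3
    Authentication Method  token
    Authentication Token   aBcD3fGh1jKlMnOp

Save all five values; you'll need the organization ID and token again when you configure the Watson IoT nodes on the Raspberry Pi.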
Analyze an image from Raspberry Pi on Bluemix
In this Node-RED flow on the Raspberry Pi, the application performs these tasks:
- Receives the command sent from Node-RED on Bluemix
- Gets a picture from the Raspberry Pi file system, or optionally takes a picture with the RaspCam and saves it
- Sends the encoded picture and its name to Bluemix for cognitive analysis by the Visual Recognition service

Before you use the camera or Internet pictures, import a flow from GitHub:
1. Go to this GitHub page and click Raw.
2. Copy all the raw code to your clipboard.
3. In the flow editor on the Raspberry Pi, create a new tab.
4. Click the Menu icon > Import > Clipboard.

In Step 4, Add a Cloudant database node, you will install the Cloudant nodes that are part of the import, if you did not already complete that task as described in Lab 1.

Decide how you want to get images to analyze:
- Download pictures from the Internet
- Use the RaspCam camera

Download pictures from the Internet

Follow these instructions if you aren't using the RaspCam.

1. Find a link to an image (PNG or JPEG) on the Internet that is less than 2 MB. For example, copy this link: http://g1.computerworld.pl/news/thumbnails/2/6/265397_resize_620x460.jpg

2. Download the image and save it with the name example.jpg on the Raspberry Pi in the following directory: /home/pi/pictures/example.jpg
   Important: Do not change the file name. The image must be named example.jpg.

3. Because you're not using a camera, change the code in the take a pic function node in the flow. Double-click the node and paste in this code under Function:

    node.log("entering");
    var pictureFilename = "/home/pi/pictures/example.jpg";
    var currTime = new Date().getTime();
    // Provide the generated file name to the next nodes
    return {payload: pictureFilename, filename: pictureFilename, filedate: currTime};

4. Double-click the json function node on the Raspberry Pi and change the file type after "value" to "data:image/jpg;base64,":

    var picId = msg.filename;
    var picDate = msg.filedate;
    var...
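To visualize what eventually leaves the Raspberry Pi: the message assembled by these function nodes carries the file metadata plus the Base64-encoded image. It looks roughly like the following sample (the Base64 string is truncated here for readability, and the field names follow the take a pic node above):

    {
      "payload": "data:image/jpg;base64,/9j/4AAQSkZJRgABA...",
      "filename": "/home/pi/pictures/example.jpg",
      "filedate": 1488037200000
    }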

Retrieve and analyze a picture on Bluemix
Now, you are ready to import the bottom flow from GitHub.

1. In Bluemix, create a new database in the boilerplate's Cloudant service and name it pictures. You should now see the pictures database in the Your Databases view.

2. To get the pictures from the database, double-click the Cloudant node and configure it to read all documents from the pictures database. See the Cloudant documentation on IBM Bluemix for more information about database search indexes.

3. To use the Watson Visual Recognition node in Node-RED on Bluemix, create the Visual Recognition service from the Bluemix catalog and connect it to your IoT Starter boilerplate: from the Bluemix catalog, select the Visual Recognition service; under Connect to, change Leave unbound to your application in the list and then click Create.

4. When the Visual Recognition service opens, you should be ready to use it in the Node-RED flow on Bluemix. You might need to restage the IoT Starter boilerplate so that you can use this service in your Node-RED flow. If you have problems, see Problems connecting from Bluemix to the Raspberry Pi device.

5. After the Visual Recognition service is running, click Service Credentials and then View Credentials to get the API key. You will need to enter this key in the visual recognition node in Node-RED on Bluemix.

6. Double-click the visual recognition node in the Node-RED flow, enter the API key, and select any other functionality that you want. Finally, you can see the results in the flow editor debug pane.

You might want to test whether the Watson Visual Recognition service can read the image and process it through the web interface on the demo page of the IBM Watson service. To see what types of images are recognized with the best accuracy, use this link: http://visual-recognition-demo.mybluemix.net/.
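For reference, the visual recognition node attaches the analysis results to the message. The classification portion of a result typically looks something like the following illustrative sample; the class names and scores vary with the image:

    {
      "images": [{
        "classifiers": [{
          "classifier_id": "default",
          "classes": [
            { "class": "person", "score": 0.945 },
            { "class": "people", "score": 0.887 }
          ]
        }]
      }]
    }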
Retrieve the picture that was taken by the Raspberry Pi in a Swift application
Be sure you've completed Lab 1 and all previous steps in this lab. Now you can retrieve the picture from the database and display it in your mobile application.

1. In Xcode, add a button and an Image View to the existing view controller in Main.storyboard by dragging the relevant components from the list in the bottom-right pane.

2. Connect the view to the code. If you need help, watch the solution video for Lab 2.
   Create an action: @IBAction func showPictureButtonPressed(_ sender: Any)
   Create an outlet: @IBOutlet weak var imageFromDb: UIImageView!
   Optional: Add a helper property: private var fetchedImage = UIImage()

3. As in Lab 1, implement the showPictureButtonPressed function. Connect to the database and read the JSON document with the stored picture as you did in Lab 1. You can use the Cloudant library again to do CRUD operations on the Cloudant database. Use this code to connect to the database and read the JSON with the embedded picture:

    //connect to pictures DB
    let cloudantUrl = NSURL(string: "cloudant db connection url")
    let cloudantClient = CouchDBClient(url: cloudantUrl! as URL,
                                       username: "cloudant db connection user",
                                       password: "cloudant db connection password")
    let database = "pictures"

    //get picture
    let find = FindDocumentsOperation(selector: [:],
                                      databaseName: database,
                                      fields: ["value", "pic_date"],
                                      limit: 1, skip: 0, sort: [],
                                      bookmark: nil, useIndex: nil, r: 1) { (response, httpInfo, error) in
        if let error = error {
            print("Encountered an error while reading a document. Error:\(error)")
        } else {
            //get the value from the JSON response
            do {
                let data = try JSONSerialization.data(withJSONObject: response!, options: [])
                let parsedJson = try JSONSerialization.jsonObject(with: data, options: []) as! [String:Any]
                if let nestedArray = parsedJson["docs"] as? NSArray {
                    //getting the nested document from the payload
                    let newDoc = nestedArray[0] as? [String:Any]
                    // access nested dictionary values by key
                    let encodedImage = newDoc?["value"] as! String
                    let index =...
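The listing above is truncated, but the remaining work is mechanical: strip the data:image/jpg;base64, prefix that the Node-RED flow added and decode the rest into a UIImage. A minimal sketch, assuming encodedImage holds the fetched string and imageFromDb is the outlet that you created earlier:

    // Strip the MIME prefix that the Node-RED json function node added
    let base64String = encodedImage.replacingOccurrences(of: "data:image/jpg;base64,", with: "")

    // Decode the Base64 payload into an image and show it
    if let imageData = Data(base64Encoded: base64String, options: .ignoreUnknownCharacters),
       let image = UIImage(data: imageData) {
        // UI updates must happen on the main thread
        DispatchQueue.main.async {
            self.imageFromDb.image = image
        }
    }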


Lab 3: connect your app to an iRobot and smartphone

Invoke the MQTT command on Watson IoT Platform from the iOS application
In this section, you'll write a simple Swift application for iOS by using Xcode on a Mac. You need two buttons for your user interface: beep and dock. If you don't want to build the interface, you can use the application code in GitHub.

1. When you build the application, add an MQTT client and reference it in the package manager of your choice. You might choose the MQTTClient pod, which is available through CocoaPods.

2. Look up the MQTT environment of your IoT boilerplate application: click View Credentials, and enter your credentials in the following code:

    /*
     "iotCredentialsIdentifier": "???????????",
     "mqtt_host": ".messaging.internetofthings.ibmcloud.com",
     "mqtt_u_port": 1883,
     "mqtt_s_port": 8883,
     "http_host": ".internetofthings.ibmcloud.com",
     "org": "",
     "apiKey": "",
     "apiToken": ""
    */
    let ORG_ID = ""
    let ioTHostBase = "messaging.internetofthings.ibmcloud.com"
    let C_ID = "a: :RaspberryPiCreate2App"
    let DEV_TYPE = "RPi3"
    let DEV_ID = "RaspberryPi3iRobotCreate2"
    let BEEP_MSG = "1" //play beep
    let DOCK_MSG = "2" //dock
    let IOT_API_KEY = ""
    let IOT_AUTH_TOKEN = ""

3. With these parameters, when a button is pressed, create a connection to the MQTT queues:

    let host = ORG_ID + "." + ioTHostBase
    let clientId = "a:" + ORG_ID + ":" + IOT_API_KEY
    let CMD_TOPIC = "iot-2/type/" + DEV_TYPE + "/id/" + DEV_ID + "/cmd/cmdapp/fmt/json"

    iotfSession.connect(
        to: host,
        port: 1883,
        tls: false,
        keepalive: 30,
        clean: true,
        auth: true,
        user: IOT_API_KEY,
        pass: IOT_AUTH_TOKEN,
        will: false,
        willTopic: nil,
        willMsg: nil,
        willQos: MQTTQosLevel.atMostOnce,
        willRetainFlag: false,
        withClientId: clientId)

4. This code sends a command over the MQTT connection:

    iotfSession.send(BEEP_MSG.data(using: String.Encoding.utf8, allowLossyConversion: false),
                     topic: CMD_TOPIC,
                     qos: MQTTQosLevel.exactlyOnce,
                     retain: false)

5. Test the application.

6. When your app is working, you can build the receiving flow on the Raspberry Pi. Import the flow from GitHub to the Node-RED environment on the Raspberry Pi. Just like in Lab 1, you'll use the existing flow and configure it to connect to the Raspberry Pi. Update the credentials that you...
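As a usage sketch for the iOS side: once the session above is connected, each button reduces to a single send call. This assumes that iotfSession and the constants defined earlier are properties of your view controller:

    @IBAction func beepButtonPressed(_ sender: Any) {
        // "1" tells the flow on the Raspberry Pi to make the iRobot beep
        iotfSession.send(BEEP_MSG.data(using: String.Encoding.utf8, allowLossyConversion: false),
                         topic: CMD_TOPIC, qos: MQTTQosLevel.exactlyOnce, retain: false)
    }

    @IBAction func dockButtonPressed(_ sender: Any) {
        // "2" tells the flow to send the iRobot back to its dock
        iotfSession.send(DOCK_MSG.data(using: String.Encoding.utf8, allowLossyConversion: false),
                         topic: CMD_TOPIC, qos: MQTTQosLevel.exactlyOnce, retain: false)
    }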

Add a voice user interface
You can extend your application with a do-it-yourself voice user interface. Imagine that your home robot assistant can take voice commands and give you advice like other voice assistants currently on the market. By using your iOS application, your robot assistant could take requests to provide the current temperature at, say, the golf course.

To configure the voice interface, you'll add the following services from IBM Bluemix:
- Watson Speech to Text
- Watson Text to Speech

You'll also add the Weather Company Data service to your application to get a weather forecast.

1. In the existing application, set up the Carthage repository with the required SDKs by creating a Cartfile that includes the following SDKs:
   - Watson Developer Cloud Swift SDK: Simplifies using Watson services in Swift.
   - Bluemix Mobile Services SDK for iOS: Helps with connectivity to Bluemix services.
   - SwiftyJSON: Simplifies JSON operations in Swift.

2. In a terminal, issue the following command from the root directory of the application:

    cat > Cartfile

3. Copy and paste the following Carthage GitHub entries:

    github "ibm-bluemix-mobile-services/bms-clientsdk-swift-core"
    github "SwiftyJSON/SwiftyJSON"
    github "watson-developer-cloud/swift-sdk"

4. Fetch the libraries from Git by running the command carthage update --platform iOS.

5. In Xcode, add references to the following libraries from the Carthage/Build/iOS directory:
   - BMSAnalytics.framework
   - BMSCore.framework
   - RestKit.framework
   - SpeechToTextV1.framework
   - TextToSpeechV1.framework
   - SwiftyJSON.framework

6. Create two new .swift files:
   - String.swift: Adds the Base64 encoding for the network.
   - WeatherData.swift: Manages the connectivity to the Weather Company Data service.

7. Add the following code to these files:

   String.swift:

    import UIKit

    // Extending String to support Base64 encoding for network requests
    extension String {
        func fromBase64() -> String? {
            guard let data = Data(base64Encoded: self) else { return nil }
            return String(data: data, encoding: .utf8)
        }

        func toBase64() -> String {
            return Data(self.utf8).base64EncodedString()
        }
    }

   WeatherData.swift:

    //
    // WeatherData.swift
    // WatsonIoTCreate2
    //
    // Created by Marek Sadowski on 2/25/17.
    // Copyright © 2017 Marek Sadowski. All rights reserved....
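The listing is cut off above, but the role of WeatherData.swift is to call the Weather Company Data REST API with the service credentials and hand the result back to the app. The following is a rough sketch under stated assumptions: the host, path, and JSON field names follow the conventions of the Bluemix Weather Company Data service and may differ in your instance, and the credentials are placeholders. It reuses the toBase64() helper from String.swift:

    import Foundation

    class WeatherData {

        // Placeholder credentials: copy the real values from your Weather Company Data service
        private let username = "your-service-username"
        private let password = "your-service-password"
        private let host = "twcservice.mybluemix.net" // assumed service host

        // Fetch the current observation for a coordinate and return a weather phrase
        func currentConditions(latitude: Double, longitude: Double,
                               completion: @escaping (String) -> Void) {
            let urlString = "https://\(host)/api/weather/v1/geocode/\(latitude)/\(longitude)/observations.json"
            guard let url = URL(string: urlString) else { return }

            var request = URLRequest(url: url)
            // Basic authentication, encoded with the String extension from String.swift
            request.setValue("Basic " + "\(username):\(password)".toBase64(),
                             forHTTPHeaderField: "Authorization")

            URLSession.shared.dataTask(with: request) { data, _, error in
                guard let data = data, error == nil,
                      let json = (try? JSONSerialization.jsonObject(with: data)) as? [String: Any],
                      let observation = json["observation"] as? [String: Any],
                      let phrase = observation["wx_phrase"] as? String else { return }
                completion(phrase) // for example, "Partly Cloudy"
            }.resume()
        }
    }

You can then pass the returned phrase to the Watson Text to Speech service so that the assistant answers out loud.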


Course summary
