Lab 2 overview

The previous lab demonstrated how to send readings from the internal (CPU) temperature sensor on a Raspberry Pi to IBM® Bluemix® through the Watson IoT™ Platform. After the data is on Bluemix, you can easily access, store, and analyze it by using a Cloudant database and Watson cognitive services.

In this lab, you will expand the functionality from Lab 1 into two-way communication between IBM Bluemix and a Raspberry Pi by using Node-RED and the Watson IoT Platform.

You’ll add a RaspCam camera to the Raspberry Pi to take pictures on remote request and then send them to the IBM® Cloudant® NoSQL database on Bluemix for storage and analysis.
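One common way to store a picture in Cloudant is as an inline attachment on a document, using the standard CouchDB `_attachments` format. The sketch below shows how a captured JPEG buffer could be wrapped into such a document; the picture name and field names are illustrative, not prescribed by the lab.

```javascript
// Sketch: wrap a captured JPEG as a Cloudant/CouchDB document with an
// inline attachment (standard CouchDB _attachments format). The picture
// name doubles as the document ID; all names here are illustrative.
function toCloudantDoc(pictureName, imageBuffer) {
  return {
    _id: pictureName,
    capturedAt: new Date().toISOString(),
    _attachments: {
      [pictureName]: {
        content_type: "image/jpeg",
        data: imageBuffer.toString("base64") // Cloudant expects base64 data
      }
    }
  };
}

// Example with a stand-in buffer instead of a real camera capture:
const doc = toCloudantDoc("image_001.jpg", Buffer.from("fake image bytes"));
console.log(doc._attachments["image_001.jpg"].content_type); // image/jpeg
```

In a real flow, the buffer would come from the camera node on the Raspberry Pi, and the document would be written to Cloudant over HTTP or through a Node-RED Cloudant node.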

You’ll use the Watson Visual Recognition service to analyze these images. If a person is found in a picture, you can drill down further to find out the person’s gender and approximate age. This information can then be sent to a smartphone through the Twilio API service.
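As a rough sketch of that last step, a Node-RED function node could pull the gender and age estimate out of a face-detection result and format the text to send via Twilio. The response shape below is an assumption modeled on Visual Recognition-style face results; adjust the field names to the actual API output you receive.

```javascript
// Sketch: extract gender and age from one detected face and format an
// SMS body. The face object's shape (gender.gender, age.min/max) is an
// assumption for illustration, not the guaranteed API response format.
function formatFaceAlert(face) {
  const gender = face.gender.gender; // e.g. "FEMALE"
  const age = Math.round((face.age.min + face.age.max) / 2);
  return "Person detected: " + gender.toLowerCase() +
         ", about " + age + " years old";
}

// Example with a hand-built face result:
const sampleFace = {
  gender: { gender: "FEMALE" },
  age: { min: 25, max: 34 }
};
console.log(formatFaceAlert(sampleFace));
// Person detected: female, about 30 years old
```

The resulting string would become the message body passed to the Twilio node.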

For the communications between the Raspberry Pi and Bluemix, you’ll use the Watson IoT Platform, which is already a part of the Internet of Things Platform Starter boilerplate that you used in Lab 1.
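Under the covers, the Watson IoT Platform communicates over MQTT, and its documented topic conventions make the two directions explicit: a device publishes events on `iot-2/evt/<eventId>/fmt/<format>` and receives commands on `iot-2/cmd/<commandId>/fmt/<format>`. The event and command IDs below are placeholders, not names required by the lab:

```javascript
// Watson IoT Platform MQTT topic conventions (the IDs are placeholders).
// Device -> cloud (events):   iot-2/evt/<eventId>/fmt/<format>
// Cloud  -> device (commands): iot-2/cmd/<commandId>/fmt/<format>

function eventTopic(eventId, format) {
  return "iot-2/evt/" + eventId + "/fmt/" + format;
}

function commandTopic(commandId, format) {
  return "iot-2/cmd/" + commandId + "/fmt/" + format;
}

// Example: the Pi could publish the name of a new picture as a JSON event.
const topic = eventTopic("picture", "json");
const payload = JSON.stringify({ d: { pictureName: "image_001.jpg" } });
console.log(topic); // iot-2/evt/picture/fmt/json
```

The IBM IoT nodes in Node-RED hide these topics behind configuration fields, but knowing the convention helps when debugging message traffic.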

Prerequisites

You need an IBM Bluemix account and a RaspCam, which is a camera for Raspberry Pi. However, you can complete this lab without purchasing a camera. Instead of using the camera, you can use images from the Internet, but using a camera is a lot more fun!

Review the demo and “How it works” video on the IBM Watson Visual Recognition page.

You also need some experience working with IBM Cloudant databases.

Flow architecture for this lab

The following image shows the three Node-RED flows that you’ll use for this lab and how they relate to each other.

    1. The first flow runs in Node-RED on IBM Bluemix and sends data that is received by Node-RED on the Raspberry Pi.
    2. The second flow, in Node-RED on the Raspberry Pi, also sends data: a picture to cloud storage and the picture’s name to the Node-RED flow on Bluemix.
    3. When the third flow, in Node-RED on Bluemix, receives MQTT messages with the names of pictures, it retrieves the pictures from cloud storage and sends them to the Watson Visual Recognition service for analysis.
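The hand-off in the third flow can be sketched as a Node-RED function node that takes the picture name out of the incoming event and turns it into the key for the Cloudant lookup. The payload shape (`{ d: { pictureName: ... } }`) is an assumption about how the second flow names its event fields:

```javascript
// Sketch of a Function node in the third flow: take the picture name
// from the incoming IoT event and make it the Cloudant lookup key.
// The incoming payload shape ({ d: { pictureName: ... } }) is assumed.
function buildLookup(msg) {
  const name = msg.payload.d.pictureName;
  msg.pictureName = name; // keep the name for later nodes in the flow
  msg.payload = name;     // downstream Cloudant lookup reads msg.payload
  return msg;
}

// Example message as it might arrive from the IBM IoT input node:
const out = buildLookup({ payload: { d: { pictureName: "image_001.jpg" } } });
console.log(out.payload); // image_001.jpg
```

The node after this one would fetch the stored document (and its image attachment) from Cloudant and pass it on to the Visual Recognition service.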