Data Science Programming

L3D Cube visualizations Part 1: real-time scatter plot with Thingspeak

Overview

This project is a demonstration of the plotting capabilities of the cube. We will retrieve data from a public Thingspeak channel (the data points are posted by a connected barometer installed in my living room).

The JSON returned by the Thingspeak API is parsed in Processing and displayed on the cube. Each data series is represented by a two-voxel-thick scatter plot.

The client code on the Photon is a variation of the main client: the accelerometer data lets you change which plot is displayed on the front frames of the cube by tilting it one way or the other.

Walkthrough

Initial set-up

Thingspeak

You are free to set up a new account and post data of your own, or to use my channel since it is public.

I also made a test channel for those of you who don’t want to create an account at all.

And of course you can use any other data source with minor adjustments to the code.

Processing

To quote their main page:

Processing is a flexible software sketchbook and a language for learning how to code within the context of the visual arts. Since 2001, Processing has promoted software literacy within the visual arts and visual literacy within technology. There are tens of thousands of students, artists, designers, researchers, and hobbyists who use Processing for learning and prototyping.

It runs on Linux, Mac, Windows and even Raspbian, so you can choose your flavor and download it here.

Processing is its own Java-esque language: it is to Java what Arduino is to C++, which is no surprise since the Arduino IDE is derived from Processing.

Libraries

How do you install a library with Processing? A good place to start is their guide on the subject.

We need to add two libraries to Processing if we want our application to work:

  • L3D Library: used to draw and stream graphics to the Cube. A simulation of the state of the cube can be rendered and the corresponding state can be streamed to the cube. It needs to be installed manually.
  • HTTP Requests: this library will handle the GET request made to the API and retrieve the response. This one can be installed from the IDE using the Contribution Manager.

The code

Link to the repository.

The code is divided into two parts. The first one corresponds to the sketch that we will run in Processing. It is the back-end of the application, the server that retrieves and processes the data.

The front-end client runs on the Photon that lies under the cube. It is the receiving end of the stream of voxel colors sent by Processing.

Server: Processing sketch

Calling the API

Import the necessary library and set up the variables that will compose your request, providing the core of the request’s address, the ID of your channel, the return type and the options that you wish to use.

Here we provide the ID of my channel and request the 4-hour averages over the past 2 days, returned as JSON.

import http.requests.*;

// Declare request parameters
String channelId = "53833"; // id of the thingspeak channel to connect to
String requestParams = "?average=240&days=2"; // retrieve the average values over 4h periods
String requestStart = "https://api.thingspeak.com/channels/";
String requestEnd = "/feeds.json"; // endpoint returning the channel feed as JSON

// Set-up the request url 
GetRequest get;
String request = requestStart + channelId + requestEnd + requestParams; // compose full request
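
With the values above, request resolves to https://api.thingspeak.com/channels/53833/feeds.json?average=240&days=2.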

Next we need a function that will make the request and return the response as a JSONArray containing the values from the 4 fields queried for each timestamp.

JSONArray getData() {
  JSONObject jsonObject;
  JSONArray results;
  GetRequest get;
  String response;
  
  get = new GetRequest(request); // instantiate new GET request object
  get.send(); // send GET request to thingspeak
  response = get.getContent(); // retrieve response 
  jsonObject = parseJSONObject(response); // parse response
  results = jsonObject.getJSONArray("feeds"); // get data from response as Array
  
  return results;
}       
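
Before wiring the data to the cube, you can sanity-check the request from the console. The helper below is not part of the original sketch; it assumes the channel exposes field1 and the standard created_at timestamp, and can be called once from setup():

void printLatestEntry() {
  JSONArray feeds = getData();                              // query the channel
  JSONObject latest = feeds.getJSONObject(feeds.size()-1);  // most recent entry
  println("created_at: " + latest.getString("created_at")); // timestamp of the entry
  println("field1: " + latest.getString("field1"));         // raw value of the first field
}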

Parsing the response

This is where we get to display the values on the cube. Thus we need to import the corresponding library, as well as instantiate and initialize the cube in the setup() function. The setup() function is equivalent to its Arduino counterpart: it is called once at the beginning of the application.

We set up the rendering window, initialize a new cube object and start streaming the voxel colors on port 2000.

import L3D.*;
...
// Declare cube object
L3D cube;

void setup() {
  size(512, 512, P3D);  // start simulation with 3d renderer
  
  cube = new L3D(this); // init cube 
  cube.enableMulticastStreaming(2000); // enable streaming of voxel colors
}

Now that we can query the channel and drive the cube, we need to parse the JSONArray returned by getData() so that its elements can be displayed on the cube.

We begin by defining metadata about our set. For each field we provide:

  • A range used to map the values onto a [0; 7] scale.
  • The name of the field.
  • The color to be applied to this field.

// Set-up parameters for the plots
int[][] metadata = {{15, 25}, {30, 40}, {1000, 1050}, {0, 80}}; // lower and upper range of the data series. Used to map original values on 8 voxels
String[] fields = {"field1", "field2", "field3", "field4"}; // name of the fields to retrieve from json
color[] colors = {color(227, 131, 5), color(75, 97, 222), color(23, 181, 14), color(236, 242, 44)}; // colors to be used to display the series
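
As a quick check of the mapping (field1 looks like a temperature in °C given its {15, 25} range, but the unit does not matter), a reading of 20 maps to round(map(20, 15, 25, 0, 7)) = 4 on the 8-voxel scale.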

Next we write a new function that will parse and display our data on the cube using the values we just defined.

void updateData() {
  JSONArray results = getData(); // retrieve data
  int k = results.size();
  int x = 7;
  
  // Display data as scatter plot mapped on 8 voxels
  // only the 8 last data points are displayed
  for (int i=0; i<8; i++) {
    for (int j=0; j<4; j++) {
      float value = results.getJSONObject(k-i-1).getFloat(fields[j]);
      int roundedValue = round(map(value, metadata[j][0], metadata[j][1], 0, 7));
      cube.setVoxel(x,roundedValue,j*2,colors[j]);
      cube.setVoxel(x,roundedValue,(j*2)+1,colors[j]); // We give a 2 voxel thickness to each data point
    }
    x--;
  }  
}       

Putting it all together

All that remains is to populate the draw() function. To come back to the analogy with Arduino, it is the equivalent of loop() in that it runs perpetually unless told otherwise with noLoop().

Since each data point requested is a 4-hour average, we don't need to query Thingspeak more often than once every 4 hours. Let's define two variables that will help us achieve this.

int nextUpdate = 0;
int updateRate = 4; // update rate in hours. Since each data point is an average over 4h, update every 4h

Next set-up the draw() loop:

void draw() {
  background(0); // set background to black
  lights(); // turn on light
  
  // update data if delay elapsed
  if ((millis()-nextUpdate)>0) {
    updateData();
    nextUpdate = millis() + updateRate*60*60*1000; // schedule next update (convert hours to milliseconds)
  }
}

You're all set! Let's turn our attention to the Photon now.

Client: Photon firmware

If you don't feel comfortable programming the Photon, you can have a look at this Getting started guide.

Upload the client code from the repository to your device.

So what happens here? We open a UDP socket on port 2000 over the local network and listen for incoming bytes.

A buffer is filled until 512 bytes are received; each byte corresponds to a color. We loop over the buffer and display the corresponding colors on the cube.

There is a little twist here, which is why this sketch has its own client version.

Part of the code monitors the inclination of the cube using the accelerometer data. If the cube is tilted forward or backward, the frames are shifted in the corresponding direction so that another plot appears at the front.

Sources

  • JSON GET example from HTTP Request library repository.
  • JSONObject documentation from Processing.org.
  • JSONArray documentation from Processing.org.
