Category: Data Science

Volumetric display and data visualization: 4 animations for the L3D Cube

The L3D Cube

The cube falls into the category of volumetric displays, meaning that it can be used to represent three-dimensional shapes. It is composed of 8×8×8 = 512 RGB LEDs, namely the very popular WS2812 that you can find in Adafruit’s NeoPixel product line.

It is sold by a company called Looking Glass Factory and is still at the Kickstarter stage. You can read about the story of its development in this Instructable.

They make an 8×8×8 and a 16×16×16 version. The small version will set you back $399, not something I can afford, though I was able to play for a while with the one from WearHacks. A solution that can be even more rewarding: build your own! You can start by having a look at these Instructables.

L3D Cube visualizations Part 1: real-time scatter plot with Thingspeak

Overview

This post is basically a demonstration of the plotting capabilities of the cube. We will retrieve some data from a public ThingSpeak channel (the data points are posted by a connected barometer installed in my living room).

The JSON returned by the ThingSpeak API is parsed in Processing and displayed on the cube. Each series of data is represented by a scatter plot two voxels thick.
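The Processing code lives in the full post, but the API call itself is easy to illustrate. Here is a minimal Python sketch of the same request; the channel ID and field name are placeholders, not the ones from the actual post:

```python
import requests

# Placeholder public channel ID; the post's actual channel is not shown here.
CHANNEL_ID = 123456
URL = "https://api.thingspeak.com/channels/{}/feeds.json".format(CHANNEL_ID)

# Fetch the last 64 data points from the channel.
data = requests.get(URL, params={"results": 64}).json()

# Each feed entry carries a timestamp and the channel's field values.
for entry in data["feeds"]:
    print(entry["created_at"], entry.get("field1"))
```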

The client code on the Photon is a variation of the main client: we use the accelerometer data to let you change which plot is displayed on the front faces of the cube by tilting it one way or the other.


L3D Cube visualizations Part 2: real-time worldwide weather

Overview

We will make use of the OpenWeatherMap API to retrieve the temperature of cities around the world and display the result on the cube. The result is a “real-time” (actually the free API key only gives access to hourly updates) visualization of the Earth’s weather.

A Python script is used to select which cities are displayed: we start with a JSON file provided by OpenWeatherMap that contains every city accessible from the API, along with its ID and coordinates. The JSON is parsed and cast as a pandas DataFrame. The latitude and longitude of each city are converted to voxel coordinates on a sphere with a radius of 4 voxels. Cities that fall on the same voxel are grouped, and a random one is picked from each group to represent that voxel.
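The full script is in the post, but the conversion step can be sketched along these lines: a minimal Python version assuming an 8×8×8 cube centered at (3.5, 3.5, 3.5) and the standard spherical-to-Cartesian formulas (the column names and sample cities are illustrative, not the post's actual data):

```python
import numpy as np
import pandas as pd

RADIUS = 4    # sphere radius, in voxels
CENTER = 3.5  # center of an 8x8x8 cube along each axis

def to_voxel(lat, lon):
    """Map latitude/longitude (degrees) to integer voxel coordinates
    on a sphere of radius 4 voxels centered in the cube."""
    lat, lon = np.radians(lat), np.radians(lon)
    x = CENTER + RADIUS * np.cos(lat) * np.cos(lon)
    y = CENTER + RADIUS * np.cos(lat) * np.sin(lon)
    z = CENTER + RADIUS * np.sin(lat)
    # Clamp in case rounding pushes a pole just outside the 0..7 range.
    return tuple(int(np.clip(round(v), 0, 7)) for v in (x, y, z))

# Illustrative rows; the real DataFrame is parsed from OpenWeatherMap's city list.
cities = pd.DataFrame({
    "id":  [2643743, 6077243, 1850147],
    "lat": [51.51, 45.51, 35.69],
    "lon": [-0.13, -73.59, 139.69],
})
cities["voxel"] = [to_voxel(la, lo) for la, lo in zip(cities["lat"], cities["lon"])]

# Shuffle, then keep one random city per voxel.
picked = cities.sample(frac=1, random_state=0).groupby("voxel").head(1)
picked.to_csv("cities_on_sphere.csv", index=False)
```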

The result is saved in a CSV file that is loaded into Processing and used to query the API. The temperature of each city is then shown on the cube using a color gradient.
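To illustrate that last step, the gradient can be as simple as a linear interpolation between a cold and a hot color. A sketch in Python (the post's actual code is in Processing, and the temperature bounds here are made up):

```python
def temp_to_rgb(temp_c, t_min=-30.0, t_max=45.0):
    """Linearly interpolate from blue (cold) to red (hot).
    The bounds are illustrative, not the post's actual values."""
    t = max(0.0, min(1.0, (temp_c - t_min) / (t_max - t_min)))
    return int(255 * t), 0, int(255 * (1 - t))

print(temp_to_rgb(-10))  # mostly blue: (68, 0, 187)
print(temp_to_rgb(30))   # mostly red:  (204, 0, 51)
```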


Make your own data platform for the Internet of Things using Node.js and Express.js

Prologue

I recently wrote a tutorial explaining how to make a connected barometer, in which I used ThingSpeak as an endpoint for the data. With the current buzz around the Internet of Things, a lot of similar services have popped up: Plotly, the Wolfram Data Drop, Xively, and even IBM Cloud, to name a few.

What I find problematic with these services is that you lose control over your data. What is it used for? What happens if the company shuts down? You don’t want to lose your preciously collected data.

One solution is to create your own data platform. This way you keep full control over your database: you can set up backups and be sure that your digital property won’t be used for commercial purposes behind your back.

We will thus be making IoT.me, a web application using Node.js as the server, Express.js as the framework, and MongoDB as the database (the MEAN stack, without the A).

Python, neural networks and the value of failure

Remember that in my last post about neural networks, I tried (and failed) to replicate in Python the results I had obtained in R.

I have been thinking about how I would solve the problem, and frankly I wasn’t eager to spend too much time on such a silly example, especially since I’m no specialist in the PyBrain module.

The problem is, besides my occasional laziness when it comes to solving problems, I’m also quite stubborn, and I don’t like to let things go that easily.

And I realized that it was a great opportunity to write about failure, and how to react when confronted with it.

Playing around with neural networks – Python version

In my last post I said that I would try to replicate the code in Python. Well, here it is.

It is a first attempt, and unfortunately the predictive power of the resulting network is awful (it’s even worse than a random guess…). I need to explore the module’s options more deeply to understand where the difference lies between the network created in R and this one.
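For context, a minimal PyBrain training loop looks something like the sketch below; this is generic boilerplate, not the post's actual code, and the layer sizes and XOR data are placeholders:

```python
from pybrain.tools.shortcuts import buildNetwork
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer

# A tiny network: 2 inputs, 3 hidden units, 1 output.
net = buildNetwork(2, 3, 1)

# XOR as placeholder training data.
ds = SupervisedDataSet(2, 1)
for inputs, target in [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]:
    ds.addSample(inputs, (target,))

# Each call to train() performs a single pass over the dataset.
trainer = BackpropTrainer(net, ds)
for _ in range(1000):
    trainer.train()

print(net.activate((0, 1)))  # should approach 1 as training converges
```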

Playing around with neural networks and R

I was recently confronted through my work with a classification problem: given a set of explanatory variables, predict which category a player will most likely end up in (I work in the video game industry).

To be frank, my statistical knowledge was a little rusty, since I had been doing web dev for a year (unfortunately, stats are not like riding a bike: you do forget after a while). So I ended up doing a quick literature review in order to list the tools that could help me with this task.

I began with a logistic regression but wasn’t that happy with the accuracy of the results, and the implementation was not that easy due to the high volume of data I was dealing with.

Through my readings, I came into contact with various techniques of machine learning and was eager to try them out. I had heard about machine learning in the past, but it seemed like a mystical and out-of-reach corner of computer and information science.

And I was wrong. It’s accessible, it works, and it’s a lot of fun (well, data-scientist-kinda-fun). What follows is my naive attempt at solving a problem that had puzzled me for a while: automatic shape recognition. I mean, how long did it take us as kids to learn to put those damn educational toys into the right-shaped slot? Quite a long time, after all…

© 2016 DigitalJunky
