Programming an interactive space of distributed sensors

This is part of my graduation project at TU Delft. The aim of the project was to test whether designers could learn to code an interactive room fitted with a network of sensors. The project was a collaboration between me and a team of engineers from the Computer Science faculty.

My role

Design
Development

Type of project

Design and prototyping

A little background

Teaching designers how to code is no small feat.

This project was the combination of two different efforts. On one side were the engineers, who wanted to build a distributed network of microcontrollers inside an interactive room.
On the other side was me, a designer whose job was to simplify the complexity of coding for distributed networks, so that designers could create interactive pieces inside the room. The main challenges for this project were:

1

Distributed networks don't rely on the programming paradigm we are most familiar with. Object-Oriented Programming was not an option, because the distributed network built by the engineers used a different paradigm: state machines.

2

State machines are a completely unfamiliar paradigm for designers who code. For me, the main challenge was to figure out how easy (or hard) it was for a user to get something useful out of them.

3

In parallel with testing the assumption that designers can learn and use the state machine paradigm, I needed to figure out what a programming environment for state machines would look like.

Guerrilla testing with students

Asking people to create state machines for common devices

One of the first steps for me was to create a series of experiments aimed at understanding whether people could grasp how state machines work. I ran these experiments with volunteers picked at random from around the faculty.

The test setup

Each participant was asked to replicate the behaviour of an everyday machine, like a vending machine or a microwave. To do that, they were given 3 predefined states, plus a list of inputs, outputs, and actions. By combining these, they had to fill in the gaps and create as many states as they needed to perform a task (in the case of the vending machine, take some money and give out food).
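The exact worksheet isn't reproduced here, but the kind of structure participants were piecing together can be sketched as a simple transition table. The state names, inputs, and actions below are illustrative, not the ones from the test material.

```python
# Minimal sketch of a vending-machine state machine, similar in spirit to the
# exercise. States, inputs, and actions are illustrative placeholders.

# Transition table: (current state, input) -> (next state, action)
TRANSITIONS = {
    ("idle", "insert_coin"): ("has_money", "show_balance"),
    ("has_money", "select_item"): ("dispensing", "release_item"),
    ("has_money", "cancel"): ("idle", "return_money"),
    ("dispensing", "item_taken"): ("idle", "reset_display"),
}

def step(state, event):
    """Advance the machine by one input event."""
    next_state, action = TRANSITIONS.get((state, event), (state, None))
    if action:
        print(f"{state} --{event}--> {next_state} (do: {action})")
    return next_state

# A happy path: pay, pick an item, take it.
state = "idle"
for event in ["insert_coin", "select_item", "item_taken"]:
    state = step(state, event)
```

Even in this tiny example you can see where the difficulty comes from: every new input the machine should react to means revisiting the whole table.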

Key learning

The main learning from these experiments was that state machines don't allow for much flexibility. As long as the intended behaviour is fairly linear (if this, then that), people can manage. But as soon as more responsiveness is introduced, keeping track of all the state changes becomes rather difficult.

Creating a visual language

Time to understand what a programming language for State Machines looks like

Next, I started designing and testing what a programming language for state machines could look like. The design was informed by my tests with fellow design students on how the new paradigm could be adapted to a designer's idea of programming.

A visual programming language

I knew I could not create a text-based programming language for state machines. I simply did not have the knowledge and understanding required for such an endeavour. I also knew that designers have a strong affinity for graphical interfaces, and visual programming languages are often used to familiarise kids with programming (see MIT's Scratch and the more recent SAM Labs).
So for all these reasons and more, I decided to create a visual programming interface.

Designing, testing and iterating a functional prototype

The first iterations of the interface were done through paper prototypes, where I asked users to code simple interactions, like turning an array of LEDs on and off or changing their colours.
This first round of testing helped me understand the right level of abstraction for the tool to be usable.

It quickly became clear that users needed some kind of visual feedback: coding this type of interaction without seeing the results was difficult for most users. So I created a virtual representation of the grid of tiles inside the interactive room, under which the network of embedded sensors was placed. The idea was to mirror the behaviours coded on paper with those shown on the iPad running the prototype.
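The prototype itself was a visual tool, so there is no code from it to show here; the sketch below only illustrates the underlying idea, a virtual grid of tiles whose LED state is driven by the same simple on/off state machine users coded on paper. The grid size, names, and events are all hypothetical.

```python
# Hypothetical sketch of the virtual grid idea: each tile runs the same tiny
# on/off state machine users coded on paper, so the simulated room can mirror
# their logic before it ever touches real hardware.

GRID_SIZE = 4  # a 4x4 grid of tiles, chosen arbitrarily for this example

# Every tile starts in the "off" state.
grid = [["off" for _ in range(GRID_SIZE)] for _ in range(GRID_SIZE)]

# The on/off machine: a trigger event toggles the tile's state.
TOGGLE = {"off": "on", "on": "off"}

def on_tile_triggered(x, y):
    """Simulate a sensor trigger under tile (x, y) and update its LED state."""
    grid[y][x] = TOGGLE[grid[y][x]]

def render(grid):
    """Print the grid roughly the way the on-screen representation showed it."""
    for row in grid:
        print(" ".join("#" if tile == "on" else "." for tile in row))

# Someone "steps" on two tiles; the virtual room lights up accordingly.
on_tile_triggered(0, 0)
on_tile_triggered(2, 1)
render(grid)
```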

A functional prototype

This video shows the final prototype in context. It summarises the main features of the application and how it could be used inside the room it was built for.