Emoji Split-Flap Display

What will be the visual language of the Internet of Things? If we move away from screens and monitors as the primary method of digital communication, and instead embed communication technology into the objects and environments around us, then what language will they speak?

This prototype is one of several research outputs I’m building to engage with these questions. It explores post-digital semantics: what kinds of signs and symbols make sense in the growing crossover between our digital and physical lives. Future outcomes will test other forms of communication, but to begin with I wanted to use emojis – a new universal language of symbols built into all new smartphones. They’re intriguing because they incorporate symbols from a range of global cultures, selected by standards committees for their significance to contemporary messaging, and cover a diverse range of subjects, from love hearts to smiling poos, and from detailed miniature landscapes to cryptic glyphs and icons. These small images hold deep cultural or emergent meanings within digital communication, and therefore seem well suited to this research.

The scope of this particular prototype is to test the split-flap mechanism, and the ability to control the device remotely over the network. To do this I have repurposed the flaps from a cheap eBay split-flap clock. This gave me 60 character positions to play with, so I selected a subsection of popular and practical emojis to display. I placed the flap mechanism in a length of aluminium box section and connected it to a SparkFun small stepper motor, which provides the precise movement control needed to flip the display one flap at a time. This is controlled by an Arduino Yun, via an H-bridge chip. The Yun makes control over the network incredibly easy to set up, as it has WiFi built in and lets you control the Arduino code via simple REST commands – meaning a link on a website can send a message to the Arduino (providing they are connected to the same network). The H-bridge chip is a standard way of controlling motors with an Arduino: motors draw much more current than an Arduino’s pins can safely supply, so an external component is needed to switch the power sent to the motor.
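The sketch isn’t included in this post, but the control loop amounts to: accept a REST request, work out how many flaps forward the target is, and step the motor that far. Here’s an illustrative version – the 2048-step motor, pin numbers and flap/<n> URL scheme are assumptions for the sake of the example, not the actual code:

```cpp
#include <Bridge.h>
#include <YunServer.h>
#include <YunClient.h>
#include <Stepper.h>

// Assumed values - adjust for the actual motor and flap count.
const long STEPS_PER_REV = 2048;   // small geared stepper, one drum revolution
const int  NUM_FLAPS     = 60;     // one flap per emoji position

Stepper stepper(STEPS_PER_REV, 8, 9, 10, 11);  // H-bridge inputs (hypothetical pins)
YunServer server;      // receives REST calls forwarded by the Yun's Linux side
long currentStep = 0;  // assumes the drum starts at flap 0

void setup() {
  Bridge.begin();                  // start the Arduino <-> Linux bridge
  server.listenOnLocalhost();      // only accept requests via the Yun's web server
  server.begin();
  stepper.setSpeed(10);            // RPM - slow enough for the flaps to settle
}

void loop() {
  YunClient client = server.accept();
  if (client) {
    // Expects a request like http://<yun-address>/arduino/flap/42
    String command = client.readStringUntil('/');
    if (command == "flap") {
      int target = client.parseInt() % NUM_FLAPS;
      // Compute the absolute step for the target flap so rounding never
      // accumulates, and always move forwards - the drum only turns one way.
      long targetStep = (long)target * STEPS_PER_REV / NUM_FLAPS;
      long delta = (targetStep - currentStep + STEPS_PER_REV) % STEPS_PER_REV;
      stepper.step(delta);
      currentStep = targetStep;
      client.print(F("moved to flap "));
      client.print(target);
    }
    client.stop();
  }
  delay(50);
}
```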

The next stage for this prototype is to connect it to online information sources, so it can act as a physical representation of live online information. The finish of the current prototype will also be improved. More updates to follow. In the meantime, here’s a video of it in action:

Angus Main – Emoji Split-Flap Display from Angus Main on Vimeo.

Pair Painting


As part of my research into methods of assisting fine art processes through creative coding, I have built a small web app which provides suggestions and prompts for abstraction.

This builds on the coding principle of Pair Programming, where two people assist each other in a coding task. With this system the idea is to use the abstract strengths of the computer (which is essentially a machine for translating and abstracting forms of data) to suggest words, styles and imagery to the artist, who is then free to interpret or utilise these prompts in whatever way they choose. The prompts are generated by an algorithm that starts with a random word, then generates linked or associated imagery using a series of automatic internet searches. The code is written in JavaScript, and the results are shown on an HTML5 canvas within the tablet browser, so the artist can easily swipe between prompts on the screen.
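The app itself is written in JavaScript, but the chaining idea is language-agnostic, so here’s a rough, self-contained illustration in C++. The internet-search step is stubbed out with a tiny hand-made association table – the word lists and the searchAssociations helper are hypothetical stand-ins, not the app’s real code:

```cpp
#include <cstdlib>
#include <ctime>
#include <iostream>
#include <map>
#include <string>
#include <vector>

// Stub for the "automatic internet search" step: the real app queries the
// web; here a hand-made association table stands in for the results.
std::vector<std::string> searchAssociations(const std::string& word) {
    static const std::map<std::string, std::vector<std::string>> table = {
        {"sea",   {"horizon", "salt", "turquoise"}},
        {"salt",  {"crystal", "white", "grain"}},
        {"storm", {"grey", "noise", "pressure"}},
    };
    auto it = table.find(word);
    if (it != table.end()) return it->second;
    return {};  // unknown word: end of the chain
}

int main() {
    std::srand(static_cast<unsigned>(std::time(nullptr)));

    // Start from a random seed word...
    std::vector<std::string> seeds = {"sea", "storm"};
    std::string word = seeds[std::rand() % seeds.size()];

    // ...then follow associations for a few hops, emitting each as a prompt
    // for the artist to interpret.
    for (int hop = 0; hop < 3; ++hop) {
        std::cout << "prompt: " << word << "\n";
        auto next = searchAssociations(word);
        if (next.empty()) break;
        word = next[std::rand() % next.size()];
    }
}
```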

This is part of an ongoing research project, and hopefully I’ll post the results of more experiments here soon.

Colour Capture App


As part of the ongoing Hues project with Jenni Grove, I’m creating an iOS app which makes it easier to record environmental colours. The app takes the camera input and outputs an RGB colour value, based either on an average sample of the whole frame or on the colour value of an individual pixel in the view. This makes it easy to record changing colour values, and in particular to trace sea colours over time. Placing the iPhone in the window above Porthmeor beach allowed me to record the changing shades of blue, green and grey over the day. The plan is to use these colours in generative art projects.
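The whole-frame mode boils down to averaging the pixel buffer. Here’s a minimal, illustrative sketch of that calculation in C++ – the RGB struct and the tiny stand-in frame are purely for demonstration, not the app’s actual code:

```cpp
#include <cstdint>
#include <iostream>
#include <vector>

struct RGB { uint8_t r, g, b; };

// Average every pixel in the frame: the "whole screen" colour value.
RGB averageColour(const std::vector<RGB>& frame) {
    uint64_t r = 0, g = 0, b = 0;
    for (const RGB& px : frame) { r += px.r; g += px.g; b += px.b; }
    uint64_t n = frame.empty() ? 1 : frame.size();
    return { static_cast<uint8_t>(r / n),
             static_cast<uint8_t>(g / n),
             static_cast<uint8_t>(b / n) };
}

int main() {
    // A stand-in 2x2 "frame"; the real app reads this from the camera.
    std::vector<RGB> frame = { {30,80,120}, {40,90,130}, {20,70,110}, {30,80,120} };
    RGB avg = averageColour(frame);
    std::cout << "average RGB: " << +avg.r << "," << +avg.g << "," << +avg.b << "\n";
    // Single-pixel mode is simply frame[index] for the chosen pixel.
}
```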

Chart Recorder Variations


Continuing on from the Chart Recorder tests, I have been exploring different – perhaps less precisely quantitative – methods of displaying live data. These tests show the printer outputting a range of patterns that vary in density to signify different intensities within the data.
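One simple way to get this effect is to darken each character position with a probability that tracks the data value, so higher readings print as visibly denser rows. A sketch of that idea, assuming the common Adafruit_Thermal Arduino library and placeholder wiring (pins 5/6 for the printer, a stand-in data source on A0):

```cpp
#include <SoftwareSerial.h>
#include <Adafruit_Thermal.h>

SoftwareSerial printerSerial(5, 6);        // RX, TX - placeholder pins
Adafruit_Thermal printer(&printerSerial);

const int LINE_WIDTH = 32;                 // characters per printed row

void setup() {
  printerSerial.begin(19200);              // common baud rate for these printers
  printer.begin();
}

void loop() {
  int value = analogRead(A0);              // stand-in data source, 0-1023
  // Each position goes dark with a probability proportional to the data
  // value, so the row's texture encodes intensity rather than a number.
  for (int i = 0; i < LINE_WIDTH; i++) {
    printer.print(random(1024) < value ? '#' : ' ');
  }
  printer.println();
  delay(60000);                            // one row per minute
}
```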

IoT Thermal Chart Recorder

This is the latest outcome in a long-running set of projects looking at how to visualise or manifest digital information within real-world, domestic environments. My current work looks at how to signal ongoing streams of data in a cheap and effective way. To do this I have built a Chart Recorder using a small thermal printer.

Chart Recorders are the traditional way of recording constantly fluctuating data streams. Normally using a moving pen and a roll of chart paper, they have a familiar aesthetic, and are often used in seismographs, lie detectors, ECG machines and the like.

The mini thermal printer is a staple output tool for Arduino, Raspberry Pi and physical computing projects. They’re simple to get working, and there’s plenty of documentation and libraries available online. They’re normally used when a ‘take away’ outcome is required – a printed receipt, or a personalised piece of information that the user can take with them. However, I’m interested in seeing how they can be used effectively as signage, or as persistent, slowly changing displays. These could be linked to the Internet of Things to provide ambient feedback on a range of data sources – perhaps showing household energy use, or local environmental information such as weather or tides.

Using thermal paper means it’s easy to represent slowly changing data without having to constantly power a visual display. The drawback is that it introduces a consumable, with the paper having to be restocked periodically. This isn’t particularly costly though (receipt paper is mass produced to be cheap, and can be sourced recycled and BPA-free), and the fact that the paper has to be purchased, even for a few pence, could be used as a feature of the device – perhaps incentivising certain behaviours.

The printer is controlled by an Arduino Yun, which draws in data over WiFi. The Yun also controls the stepper motor that spools up the paper on the other side of the display. Further development could use the gearing of the printer to drive the opposite spool as well.
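In outline, each cycle fetches a reading over the Yun’s Bridge, prints a chart line for it, then advances the take-up spool. The sketch below is an illustrative reconstruction rather than the actual code – the Adafruit_Thermal library, the data URL, the pin assignments and the spool step count are all assumptions:

```cpp
#include <Bridge.h>
#include <HttpClient.h>
#include <SoftwareSerial.h>
#include <Adafruit_Thermal.h>
#include <Stepper.h>

SoftwareSerial printerSerial(5, 6);         // placeholder pins
Adafruit_Thermal printer(&printerSerial);
Stepper spool(2048, 8, 9, 10, 11);          // take-up spool motor, hypothetical wiring

const int LINE_WIDTH = 32;

void setup() {
  Bridge.begin();                           // the Yun's WiFi lives on its Linux side
  printerSerial.begin(19200);
  printer.begin();
  spool.setSpeed(5);
}

void loop() {
  // Fetch the latest reading; the URL is a placeholder for a real data feed.
  HttpClient client;
  client.get("http://example.com/reading.txt");
  String body = "";
  while (client.available()) body += (char)client.read();
  int value = constrain(body.toInt(), 0, 100);

  // Print one chart line for this reading...
  int filled = map(value, 0, 100, 0, LINE_WIDTH);
  for (int i = 0; i < LINE_WIDTH; i++) printer.print(i < filled ? '#' : ' ');
  printer.println();

  // ...then wind the freshly printed paper onto the take-up spool.
  spool.step(100);                          // tune to match one line of paper feed

  delay(300000UL);                          // one reading every five minutes
}
```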


INPUT/OUTPUT

Inputs and outputs – the bread and butter of interaction design. You can have them individually, but you get a much tastier project when you put them together. For physical computing that means coupling sensors with outputs such as lights, motors, speakers and displays.
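The simplest possible pairing makes the point: a sensor reading mapped straight onto an actuator. For instance (wiring assumed: a light sensor on A0, an LED on PWM pin 9):

```cpp
const int SENSOR_PIN = A0;   // e.g. a light-dependent resistor in a divider
const int LED_PIN    = 9;    // PWM-capable pin

void setup() {
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  int reading = analogRead(SENSOR_PIN);            // input: 0-1023
  int brightness = map(reading, 0, 1023, 0, 255);  // rescale to PWM range
  analogWrite(LED_PIN, brightness);                // output: LED brightness
  delay(20);
}
```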

I recently ran my INPUT/OUTPUT brief with students on the MA Industrial Design course at CSM, who came up with some great outcomes involving interactive clouds, drunkenness sensors, face-to-sound generators, telepresence machines, and a blushing Obama.



Space Replay

This brilliantly spooky sound project was created by one of my Information Experience Design tutees, Francesco Tacchini, along with fellow RCA students.

Great mixture of Arduino and helium!

Space Replay from RCA IED on Vimeo.

More information here:


Flip dots



Flip dots are one of the most iconic analogue displays. They contain a small metal disc and two magnetic coils; by charging the coils you can flip the disc back and forth, revealing different colours. Put lots of these together and you can create large-scale kinetic displays like those seen in public spaces such as train stations. As well as looking (and sounding) great, one of the benefits of these displays is that once set, they can show information indefinitely without using power.
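In principle, driving one looks like driving any coil from an H-bridge: pulse briefly in one polarity or the other, then cut the power and let the magnet latch the disc. A speculative single-dot sketch – the pin numbers and pulse length are guesses to be tuned, and a real panel needs proper matrix driver circuitry:

```cpp
// One dot's coil sits across an H-bridge driven from pins 3 and 4 (assumed).
const int COIL_A   = 3;
const int COIL_B   = 4;
const int PULSE_MS = 2;     // flip dots latch from a very short pulse - tune this

void setup() {
  pinMode(COIL_A, OUTPUT);
  pinMode(COIL_B, OUTPUT);
  digitalWrite(COIL_A, LOW);
  digitalWrite(COIL_B, LOW);
}

// Pulse the coil in one polarity or the other; the magnet then holds the
// disc in place with no further power.
void setDot(bool show) {
  int pin = show ? COIL_A : COIL_B;
  digitalWrite(pin, HIGH);
  delay(PULSE_MS);
  digitalWrite(pin, LOW);
}

void loop() {
  setDot(true);
  delay(1000);
  setDot(false);
  delay(1000);
}
```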

I got mine from flipdots.com. When I find out ways of driving them efficiently I’ll post more.

Small Global

Recently I worked with art collective D-Fuse on an interactive version of their Small Global video installation for the Bloomsbury Festival in London. A Kinect was used to track the movement of audience members, and this movement was used to reveal text within the video.

The four-screen projection created a very powerful visual effect.


More information and video of the installation here:

The summer issue of Varoom! – the magazine of the Association of Illustrators – features an article about the robot drawing brief we run at CSM (more details here). The article, written by Andrew Hall, showcases the work created by students, as well as a spectacular photo featuring me.

If an electronic version becomes available, I’ll link to it here.