Crowd-operated Christmas Lights

Christmas makes us think of friends. And at Visuality, friends make us think of our garden, where in warmer months we like to invite them for merry gatherings. Alas, it was winter, and our garden stood cold and desolate. So the IoT club decided to light up the atmosphere and invite people to the garden anyway - virtually.

We came up with an idea for crowd-controlled garden lights, which anyone could blink remotely through a simple web app.

Lights controller

First we installed 7 different lights in the garden. Some were fancy garden illuminations, some were regular Christmas tree lamps - the more the merrier.

photo of the garden with the lights pointed out

To control them we plugged them into a relay module connected to a Particle Photon.

fritzing diagram of the pin connections

Relays are simple electrically operated switches. In each relay, the logic-level input voltage (0-5 V) powers a small electromagnet that mechanically opens or closes a separate high-voltage circuit. This is what makes that awesome clicking sound :D

As for the Particle Photon, it's a small but powerful microcontroller. It comes Wi-Fi enabled and with a set of cloud management tools. It can be programmed remotely, which was very convenient during development, especially since the device itself was installed in a basement cell and was difficult to physically access each time we wanted to change the program.

Encoding the animations

To handle each animated sequence of the lights we used sets of 32-bit commands. The first four bits of each command encode its type; the interpretation of the remaining bits depends on the type. We handle three command types.

Animation start

diagram of bit interpretations

Animation end

diagram of bit interpretations

Single frame of the animation

diagram of bit interpretations
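Putting the three layouts together, the encoding can be sketched in code. This is an illustrative reconstruction based on the decoding logic shown further below; the function names are ours, not part of the original firmware.

```javascript
// Illustrative helpers reconstructing the command layout; names are ours.
// Start/end commands put the type in the top four bits (1 = start,
// 2 = end) and the wish id in the lower 28 bits.
function encodeStart(wishId) {
  return (1 << 28) | (wishId & ((1 << 28) - 1));
}

function encodeEnd(wishId) {
  return (2 << 28) | (wishId & ((1 << 28) - 1));
}

// A frame command packs the on/off state of the 8 lights into bits 0-7
// and a duration multiplier (in 500 ms units) into bits 8-15.
function encodeFrame(lightBits, durationUnits) {
  return ((durationUnits & 0xFF) << 8) | (lightBits & 0xFF);
}

// The receiver reads the type back from the top four bits.
function commandType(command) {
  return command >>> 28;
}
```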

To display each animation frame we decoded it on the Photon, set the pin states accordingly and held that state for the specified duration.

int singleFrame(int command) {
    // Bits 0-7 hold the state of each light; the relay inputs are active-low.
    for (int pin = 0; pin < 8; pin++) {
        digitalWrite(pin, ((command >> pin) & 1) ? LOW : HIGH);
    }
    // Bits 8-15 hold the frame duration in 500 ms units (minus one).
    delay(500 * (((command >> 8) & 0xFF) + 1));
    return 0;
}


Streaming the video

The next thing we wanted was a video stream that would let people outside the office see the lights. We used a D-Link DCS-5222L IP camera installed in our garage to record a view of the house.

photo of the camera

The camera can be accessed from the local network through its public .sdp file, which defines the parameters of the video stream.

To record a video from this stream to a file we used ffmpeg:

ffmpeg -an -f lavfi -i anullsrc -rtsp_transport tcp -i rtsp://<address_of_the_stream.sdp> -tune zerolatency -vcodec libx264 -pix_fmt + -c:v copy -c:a aac -strict experimental -f flv <filename>.flv

Instead of channeling the stream to an .flv file, it can also be sent directly to a YouTube channel:

ffmpeg -an -f lavfi -i anullsrc -rtsp_transport tcp -i rtsp://<address_of_the_stream.sdp> -tune zerolatency -vcodec libx264 -pix_fmt + -c:v copy -c:a aac -strict experimental -f flv rtmp://<youtube_channel_id>

Sadly we couldn't get rid of the ~15 s delay on the stream. In normal live streams such a delay is negligible, but in our case it meant that a guest who left their wishes would have to wait for them to appear. The delay comes from YouTube processing the stream and could be avoided with lower-quality transmissions; the design of our page, however, required HD for the full-screen view.

The webapp

Finally, we needed a way to give people control over the lights. We created a React.js application which displayed the livestream and let visitors click on the positions of the lights to design their own animations, send them to our garden and watch them being executed.

screens from the app

For this we created a frame editor component, which converted the user-selected light states into a command set that we stored in a database along with the author's signature and their Christmas wishes.
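As a rough sketch of that conversion (the function name and the shape of the editor state are hypothetical, only the bit packing follows the command layout described earlier):

```javascript
// Hypothetical editor state: each frame holds an array of 8 booleans
// (one per light) and a duration in 500 ms units.
function framesToCommandSet(frames) {
  return frames.map(function (frame) {
    // Pack the light states into bits 0-7.
    var lightBits = frame.lights.reduce(function (acc, on, i) {
      return on ? acc | (1 << i) : acc;
    }, 0);
    // The duration multiplier goes into bits 8-15.
    return ((frame.durationUnits & 0xFF) << 8) | lightBits;
  }).join(",");
}
```

The resulting comma-separated string matches the commandset column that the worker later splits and publishes over MQTT.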


By now we had:

  • the lights controller, able to execute any received command set
  • a camera and a recording server, able to start and stop video recordings
  • the web app, allowing users to create command sets

With the three sections covered, the missing link was something to enable communication between all the parts. And the answer was MQTT.

MQTT (MQ Telemetry Transport) is a lightweight messaging protocol. It provides the publish/subscribe messaging pattern through the use of an MQTT broker. It is designed to ensure reliable asynchronous communication even over unstable and low-bandwidth networks, which makes it perfectly suited for M2M (machine-to-machine) communication.

The broker can be provided as a cloud service by various third parties, so we didn't need to implement it ourselves. We chose CloudMQTT because it's well documented and comes with ready-made JavaScript and C libraries.

diagram of the whole system

Getting the commands from the Web

All the wishes created by users landed in a Postgres database on Heroku. Each wish record had created_at, sent_at and played_at fields.

We created a web worker running separately from the Express server and connected it to the CloudMQTT broker:

var mqtt = require('mqtt');
var client = mqtt.connect({
    host: '',
    username: <username>,
    password: <password>,
    port: '17989'
});
Each second the worker polled for unsent wishes, sent them to the CloudMQTT broker, and then updated their sent_at field:

db.one("select * from wishes where sent_at IS NULL ORDER BY id ASC LIMIT 1")
  .then(function (data) {
    var intArray = data.commandset.split(",");
    var id = parseInt(data.id);
    var startCode = parseInt('0001' + rjust(id.toString(2), 28, '0'), 2);
    var endCode = parseInt('0010' + rjust(id.toString(2), 28, '0'), 2);
    [].concat([startCode], intArray, [endCode]).map(function (str) {
      client.publish('/inTopic/message', str.toString());
    });
    var sqlQuery = "UPDATE wishes SET sent_at = CURRENT_TIMESTAMP WHERE id = $1";
    db.none(sqlQuery, [id]).then(function (data) {});
  });

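The rjust helper used for zero-padding the binary wish id is not a JavaScript built-in; we assume something equivalent to this minimal left-padding function (modern code would use String.prototype.padStart):

```javascript
// Left-pad str with padChar until it is width characters long.
function rjust(str, width, padChar) {
  while (str.length < width) {
    str = padChar + str;
  }
  return str;
}
```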
At the same time the web worker was subscribed to status update messages from CloudMQTT; upon receiving a started status update, it updated the appropriate played_at field:

client.on('message', function (topic, message) {
  var str = message.toString();
  if (str.startsWith("started")) {
    var id = parseInt(str.replace("started ", ""));
    var sqlQuery = "UPDATE wishes SET played_at = CURRENT_TIMESTAMP WHERE id = $1";
    db.none(sqlQuery, [id]).then(function (data) {});
  }
});

Receiving the command set on the Photon and confirming the execution

Likewise, we had the Photon connect to the CloudMQTT broker and listen for messages:

client.connect("particlePhoton", <cloud_id>, <cloud_key>);
if (client.isConnected()) {
  client.publish("/outTopic/message", "reconnected");
}

Upon receiving a message, the Photon sent back a process status update and then proceeded to execute each command from the set. On the first command of the set it sent a started status update; on the last, a stopped one.

if (QueueGet(&lastInstruction) == 0) {
    client.publish("/outTopic/message", "process " + String(lastInstruction));
    // The top four bits of the instruction hold the command type.
    switch (lastInstruction >> 28) {
      case 1:
        command_number = lastInstruction & ((1 << 28) - 1);
        client.publish("/outTopic/message", "started " + String(command_number));
        break;
      case 2:
        command_number = lastInstruction & ((1 << 28) - 1);
        client.publish("/outTopic/message", "stopped " + String(command_number));
        break;
    }
}

Recording the animations on the server

The last piece of the puzzle was recording a video from each animation to have an archive of all the wishes.

We configured another Node web worker connected to CloudMQTT and ran it on a separate recording server. It was subscribed to the started and stopped status updates: on animation start it spawned an ffmpeg process, and on animation stop it killed that process.

client.on('message', function (topic, message) {
  var str = message.toString();

  if (str.startsWith("started")) {
      var id = parseInt(str.replace("started ", ""));
      var argsStr = "-an -f lavfi -i anullsrc -rtsp_transport tcp -i \
        rtsp://<address_of_the_stream.sdp> -tune zerolatency -vcodec libx264 \
        -pix_fmt + -c:v copy -c:a aac -strict experimental -f flv "
        + id.toString() + ".flv";
      ffmpeg = spawn("ffmpeg", argsStr.split(' '));
  } else if (str.startsWith("stopped")) {
      var id = parseInt(str.replace("stopped ", ""));
      ffmpeg.kill('SIGINT');
  }
});

The result

We managed to get the project online by December 20th and sent it out to our family and friends. During cold winter evenings, every blink of the lights visible from our windows warmed our hearts.