During the summer of 2018, I worked as a research intern at Aalto University. I participated in a project titled “Digitalizing Performance with Wearables and Software”, alongside two other Master’s students:
- Tjaša Frumen, Costume Designer @ Aalto Arts
- Mia Jalerva, Lighting Designer @ University of the Arts Helsinki
“Digitalizing Performance with Wearables and Software” was a project funded by the Aalto Internal Seed funding scheme for “new multidisciplinary openings”, led by Professors Mario Di Francesco (Aalto SCI), Sofia Pantouvaki (Aalto ARTS), and Tomi Humalisto (UniARTS, VÄS). They were joined by guest supervisor Taina Relander, a costume designer.
The project aimed to examine the possibilities opened up by using 3D positioning system data to control theatre lighting fixtures, and to gauge its visual impact on the performer’s body through costume. The end goal was to collaboratively develop a short demo performance in a studio space, free from the constraints of manual operation. To achieve this, we integrated wearable electronics with a software setup tailored to the needs of the performance, which allowed us to create meaning and narrative through the interaction of movement, space, and visual design.
Some other incredibly awesome students and artists joined us during the three months we worked on this project, and they helped us shape the final demo performance:
- Kalle Rasinkangas, Sound Designer @ University of the Arts Helsinki
- Oscar Dempsey, Set Designer @ Aalto Arts
- Aliisa Rinne, Performer
- Tiia Marie Karstu, Make-up Designer
Final demo
For our demo, we were able to book Kallio Stage, one of the Aalto University venues. It is a 12 × 8 meter performance space located in Helsinki’s Kallio district. The stage staff assisted us with installing the lighting equipment for our performance.
The final demo was held twice on September 6th, 2018. There is a video of the first show on YouTube if you would like to watch it:
At the beginning of the demo, the audience on each side of the stage gets to play with a ball. They throw it across the stage, and each time the ball crosses the centre of the stage, the lights in the costume perform an effect. Then an alarm clock goes off; the alarm sound grows louder the closer the performer gets to the front of the stage. Later on, the performer pushes the mattress to the front of the stage, and the alarm sound fades away.
After that, the performer moves across the stage, and a spotlight mirrors their movements. Attempting to hide at the back of the stage is in vain; the light still follows them. They proceed to have a discussion, and afterwards the spotlight disappears.
The performer then goes to the back of the stage. A big structure starts to inflate, and three lightbars appear on stage. The performer moves around in the space, and the lightbars react to the movement. Eventually the lights start moving of their own accord and begin pulsating. The performer takes one of them and climbs inside the big structure at the back of the stage.
After a show of lights, the performer comes back with a large white inflatable, and a smaller one on their head. They tend to the big inflatable, and after playing with it for a bit, they say goodbye and disappear. The lights then come back up, and the demo concludes.
The highlight here is that everything was done with no operators on the consoles: the complete demo was automated end to end, and control was handed over to the performer.
3D tracking with the Quuppa Real-Time Locating System

Quuppa is a Finnish company that specializes in indoor localization and tracking with sub-meter accuracy, using Bluetooth LE tags. For this project, we used four of their LD-7L locator units, which we set up on the Kallio Stage ceiling. These locators were used to track five of their LD-6T tags around the stage.
Installing and connecting the units was relatively painless, as they are powered and communicate over a single Ethernet cable, thanks to PoE (Power over Ethernet). Because they needed to be mounted on the ceiling, the staff at Kallio Stage kindly installed them for us. Once they were wired, we calibrated the system by following the guidance provided in the Quuppa software and documentation. Our final results and system performance roughly match those described by the ART+COM development group.
The Quuppa software includes several useful features, such as the ability to split the physical space into areas and know which one a tag is in, or the possibility to record and later replay a set of moving tags. It is also possible to configure each tag’s refresh rate independently, getting more frequent position updates for fast-moving tags while conserving battery on slower-moving ones.
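To give a feel for the area feature: conceptually, it boils down to a point-in-region test on each position update. The snippet below is a toy illustration of that idea, not Quuppa’s implementation (their software does this check for you), and the zone coordinates are made up:

```cpp
// Toy illustration of the "areas" idea: an axis-aligned rectangular zone
// on the stage floor and a test for whether a tag is currently inside it.
// Quuppa's software performs this kind of check internally.
struct Area {
    const char* name;
    float min_x, min_y; // one corner, in stage coordinates (meters)
    float max_x, max_y; // the opposite corner
};

bool contains(const Area& area, float x, float y)
{
    return x >= area.min_x && x <= area.max_x &&
           y >= area.min_y && y <= area.max_y;
}

int main()
{
    // Made-up zone around a pillar, like the one used later in the demo
    // to trigger questions when the performer stands near it.
    Area pillar{"pillar", 3.0f, 2.0f, 4.0f, 3.0f};
    bool near_pillar = contains(pillar, 3.5f, 2.5f); // true
    (void)near_pillar;
}
```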
Controlling lighting: PosiStageNet
To design the lighting for our demo, Mia worked with GrandMA2 onPC running on a small computer with a command wing attached to it. This hardware-software combination supports the PosiStageNet protocol, receiving position information over UDP and using it to feed StageMarkers. If your fixtures are XYZ-enabled, you can configure them to automatically point at and follow the StageMarkers.
Unfortunately, the Quuppa software does not support this protocol natively, so a translator had to be written for this project. Quuppa2PSN, a C++ program, listens for the JSON-over-UDP datagrams sent by the Quuppa Positioning Engine and converts them into PosiStageNet packets, which are then consumed by the GrandMA2 onPC software.
Once that is sorted out, StageMarkers can be patched in GrandMA2 onPC, and after the fixtures are configured properly, they will move around, pointing at the XYZ coordinates received through PosiStageNet.
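For the curious, here is a rough sketch of what the core loop of such a translator can look like. This is not the actual Quuppa2PSN source: it leans on the open psn-cpp library from VYV for the PosiStageNet encoding, and the `receive_quuppa_datagram`, `parse_tag_positions`, and `send_udp` helpers are hypothetical stand-ins, since the exact Quuppa datagram schema and socket plumbing are out of scope here.

```cpp
// Rough sketch of a Quuppa-to-PosiStageNet translation loop (not the
// actual Quuppa2PSN source). PSN encoding uses the open psn-cpp library.
#include <psn_lib.hpp> // psn::tracker, psn::tracker_map, psn::psn_encoder
#include <cstdint>
#include <list>
#include <string>

// Hypothetical helpers standing in for the UDP and JSON plumbing; the
// real Quuppa datagram schema is not reproduced here.
struct tag_position { uint16_t id; float x, y, z; };
std::string receive_quuppa_datagram();                        // blocking UDP read
std::list<tag_position> parse_tag_positions(const std::string& json);
void send_udp(const std::string& payload, const char* addr, int port);

int main()
{
    psn::psn_encoder encoder("Quuppa2PSN");
    psn::tracker_map trackers;
    uint64_t timestamp = 0; // microseconds since server start, per the PSN spec

    while (true)
    {
        // 1. Read one JSON datagram from the Quuppa Positioning Engine.
        const std::string json = receive_quuppa_datagram();

        // 2. Mirror every reported tag into a PSN tracker.
        for (const tag_position& tag : parse_tag_positions(json))
        {
            trackers[tag.id] = psn::tracker(tag.id, "tag " + std::to_string(tag.id));
            trackers[tag.id].set_pos(psn::float3(tag.x, tag.y, tag.z));
        }

        // 3. Encode and send the PSN data packets to the defaults from the
        //    PSN spec (multicast 236.10.10.10, UDP port 56565), where
        //    GrandMA2 onPC picks them up and feeds its StageMarkers.
        for (const std::string& packet : encoder.encode_data(trackers, timestamp))
            send_udp(packet, "236.10.10.10", 56565);

        timestamp += 16666; // pretend the engine updates at ~60 Hz
    }
}
```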
Controlling sound: OpenSound Control
For our performance, Kalle used Max for his sound design duties. This software suite can receive external information via OpenSound Control (OSC), but sadly the Quuppa software doesn’t support this protocol either. Because we wanted to take advantage of position information for some sound effects, Quuppa2PSN gained support for OSC output in its next iteration.
Kalle was then able to take advantage of the position information from inside Max. It was used in different spatial sound effects, and also enabled the question-answer part of our final demo: the part where the performer can trigger specific questions on the speakers by standing near a certain pillar on stage.
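OSC messages are simple enough to assemble by hand, which gives a feel for what the OSC output involves. Below is a minimal, self-contained sketch that builds one OSC message carrying a tag position and sends it over UDP; the `/tag/1/xyz` address and port 9000 are invented for the example, and on the Max side a `[udpreceive 9000]` object would pick the message up.

```cpp
// Minimal sketch: hand-build one OSC 1.0 message with three float
// arguments (a tag's x, y, z) and send it over UDP.
#include <arpa/inet.h>  // htonl, htons, inet_pton
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// Append a string, NUL-terminated and padded to a multiple of 4 bytes,
// as the OSC spec requires.
static void push_padded(std::vector<char>& buf, const std::string& s)
{
    buf.insert(buf.end(), s.begin(), s.end());
    buf.push_back('\0');
    while (buf.size() % 4 != 0) buf.push_back('\0');
}

// Append a 32-bit float in big-endian byte order.
static void push_float(std::vector<char>& buf, float f)
{
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof bits);
    bits = htonl(bits);
    const char* p = reinterpret_cast<const char*>(&bits);
    buf.insert(buf.end(), p, p + 4);
}

int main()
{
    // OSC message: address pattern, type tag string, then the arguments.
    std::vector<char> msg;
    push_padded(msg, "/tag/1/xyz"); // invented address for this example
    push_padded(msg, ",fff");       // three float32 arguments
    push_float(msg, 1.5f);          // x (meters)
    push_float(msg, 2.0f);          // y
    push_float(msg, 0.8f);          // z

    // Send it to Max over UDP (port choice is arbitrary here).
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    sockaddr_in dst{};
    dst.sin_family = AF_INET;
    dst.sin_port = htons(9000);
    inet_pton(AF_INET, "127.0.0.1", &dst.sin_addr);
    sendto(sock, msg.data(), msg.size(), 0,
           reinterpret_cast<sockaddr*>(&dst), sizeof dst);
    close(sock);
}
```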
Wireless lighting
As part of the performance, we wanted to include portable, wireless, remotely managed lights in the costume and on stage. Given that we were using a lighting console with network connectivity, we opted to implement wireless communication using the Art-Net protocol over Wi-Fi. Art-Net transports DMX512 data in UDP packets, which we then decode on a microcontroller to control the actual lights.
To get a Wi-Fi network going, we connected the console to an Apple AirPort router and used Adafruit HUZZAH32 ESP32 Feather boards on the receiving end. The microcontrollers ran custom code to receive the Art-Net data and individually control the LEDs on strips of RGBW NeoPixels. We powered both the LEDs and the microcontrollers with USB power banks of various sizes.
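To give a rough idea of what that custom code involves, here is a minimal Arduino-style sketch for the ESP32 that listens for Art-Net (ArtDmx) packets and maps four DMX channels per pixel onto an RGBW NeoPixel strip. It is not our exact show code; the network credentials, data pin, and pixel count are placeholders.

```cpp
// Minimal Art-Net receiver for an ESP32 driving an RGBW NeoPixel strip.
// Placeholders: Wi-Fi credentials, data pin, pixel count.
#include <WiFi.h>
#include <WiFiUdp.h>
#include <Adafruit_NeoPixel.h>

const uint16_t ARTNET_PORT = 6454;   // standard Art-Net UDP port
const uint16_t NUM_PIXELS  = 30;     // placeholder strip length
const uint8_t  DATA_PIN    = 13;     // placeholder data pin

WiFiUDP udp;
Adafruit_NeoPixel strip(NUM_PIXELS, DATA_PIN, NEO_GRBW + NEO_KHZ800);
uint8_t packet[18 + 512];            // ArtDmx header + max DMX payload

void setup() {
  WiFi.begin("stage-network", "password"); // placeholders
  while (WiFi.status() != WL_CONNECTED) delay(100);
  udp.begin(ARTNET_PORT);
  strip.begin();
  strip.show();                      // start with all pixels off
}

void loop() {
  int size = udp.parsePacket();
  if (size < 18) return;             // too short to be an ArtDmx packet
  udp.read(packet, sizeof(packet));

  // Validate the header: "Art-Net\0" magic plus OpDmx (0x5000, little-endian).
  if (memcmp(packet, "Art-Net\0", 8) != 0) return;
  uint16_t opcode = packet[8] | (packet[9] << 8);
  if (opcode != 0x5000) return;

  // DMX channel data starts at byte 18; its length is big-endian at 16..17.
  uint16_t length = (packet[16] << 8) | packet[17];
  const uint8_t* dmx = packet + 18;

  // Four consecutive channels per pixel: R, G, B, W.
  for (uint16_t i = 0; i < NUM_PIXELS && (i * 4 + 3) < length; i++) {
    strip.setPixelColor(i, dmx[i * 4], dmx[i * 4 + 1],
                           dmx[i * 4 + 2], dmx[i * 4 + 3]);
  }
  strip.show();
}
```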
Three of these kits were then mounted on wooden boards to make “lightsabers”. A fourth set was blended into the costume. Tjaša had to make several accommodations to route the wires throughout the garment and hide the power banks from sight; you can find the full details in her thesis.
The ultimate goal: controlling all the things

PosiStageNet support on the GrandMA2 platform was nice to have and allowed us to get follow-spots going quickly, but unfortunately it did not give us the flexibility we needed: the GrandMA2 onPC software only offers basic tracking of StageMarkers with XYZ fixtures. To control colour, intensity, and non-light DMX devices based on position information, we needed something more versatile. Luckily, the software also happens to allow remote Telnet connections, and any command that would usually be issued on the physical wing or the onPC lighting console can also be run through Telnet.
This is why, shortly after we found out about this feature, Quuppa2PSN was extended with Telnet support and Lua scripting abilities. This permitted us to process position information in whatever way we wanted, and was the crucial tool that let us put on an autonomous performance requiring no manual intervention from the lighting or sound designers.
Lua was chosen for its simplicity, good performance, and popularity as a game scripting language. Using it, we were able to easily program small scripts that received position updates and issued Telnet commands in response. This let us control colours, speeds, and light intensities, launch cue lists, and more.
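To sketch how this can fit together: the host program loads a user script, calls a callback for every position update, and forwards whatever console command the script returns. This is not the actual Quuppa2PSN source; the `on_position` callback, the `show_logic.lua` file name, the GrandMA2 command it emits, and the `send_telnet_line` helper are all invented for the example.

```cpp
// Sketch of hosting a Lua callback from C++ with the standard Lua C API
// (not the actual Quuppa2PSN source). A user script defines
// on_position(id, x, y, z) and returns a console command string, which
// the host forwards to GrandMA2 over Telnet.
extern "C" {
#include <lua.h>
#include <lualib.h>
#include <lauxlib.h>
}
#include <cstdio>
#include <string>

// Hypothetical stand-in for the real Telnet session; just prints here.
void send_telnet_line(const std::string& cmd)
{
    std::printf("telnet> %s\r\n", cmd.c_str());
}

int main()
{
    lua_State* L = luaL_newstate();
    luaL_openlibs(L);

    // Load the user script, e.g. one that dims a fixture group based on
    // how far tag 1 is from the front of the stage:
    //
    //   function on_position(id, x, y, z)
    //     if id == 1 then
    //       return string.format("Group 5 At %d", math.floor(y * 10))
    //     end
    //   end
    //
    if (luaL_dofile(L, "show_logic.lua") != LUA_OK) {
        std::fprintf(stderr, "lua: %s\n", lua_tostring(L, -1));
        return 1;
    }

    // For every position update (hardcoded here), call on_position and
    // forward whatever command string it returns.
    lua_getglobal(L, "on_position");
    lua_pushinteger(L, 1);      // tag id
    lua_pushnumber(L, 4.2);     // x
    lua_pushnumber(L, 6.0);     // y
    lua_pushnumber(L, 1.1);     // z
    if (lua_pcall(L, 4, 1, 0) == LUA_OK && lua_isstring(L, -1))
        send_telnet_line(lua_tostring(L, -1)); // e.g. "Group 5 At 60"
    lua_pop(L, 1);

    lua_close(L);
}
```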
One of the more iconic uses of this feature was raising and lowering the “lightsabers”, which were tied to DMX winches on the ceiling. We also used it to control the speed of dimmer-connected fans based on the performer’s position.