Immersion is an interactive installation that aims to induce a state of flow in its visitors. Using motion tracking and a heart rate monitor, people use their bodies as input for the installation. Inside a big dome, visitors can move objects around in the 360-degree projection. By arranging small spheres they make an audiovisual composition, whose speed and intensity are driven by their heartbeat and the amount of movement.
I was responsible for the conceptual and visual design, software development and hardware integration. We used an array of technology and software: the Bitalino heartbeat monitor, Polar H7, Kinect V2, projectors, mirrors, projection mapping, OSC communication, Java, Objective-C, Max MSP and much more. You can read more about the technology used below the photos.
The project is founded by Charel Elberse and is a collaboration between various young creatives: Kim Buisman (Dome construction), Ehsan Fardjadniya (Dome fabric), Merijn den Haan (Audio) and Tineke Titsing (Furniture).
For Immersion I used many different kinds of technology. Rather than reinventing the wheel, I integrated many existing technologies into a new cocktail. Below you can read about the software and communication protocols that were used.
Processing is a very well documented and widely used creative coding environment for artists and programmers. It provides an extensive pool of libraries to get an idea working quickly. Our Processing software is responsible for generating the visuals from all the input data (heart rate and body movement). We wrote a custom audio sequencer that triggers sounds and keeps track of all the user interactions. It is the control room of our installation.
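The core of such a sequencer can be sketched roughly as below. This is a minimal illustration, not our actual installation code: the class, the 16-step grid and the sixteenth-note subdivision are assumptions, but it shows the idea of a step grid whose tempo follows the measured heart rate.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of a step sequencer whose tempo is driven by the heart rate.
// Grid size and note subdivision are illustrative choices.
class StepSequencer {
    private final boolean[][] grid;   // [step][track]: does this step trigger this sound?
    private int currentStep = 0;
    private double bpm = 60;          // driven directly by the measured heart rate

    StepSequencer(int steps, int tracks) {
        grid = new boolean[steps][tracks];
    }

    void setHeartRate(int bpmFromSensor) { bpm = bpmFromSensor; }

    void toggle(int step, int track) { grid[step][track] = !grid[step][track]; }

    // Duration of one step in milliseconds, playing sixteenth notes.
    double stepDurationMs() { return 60000.0 / bpm / 4; }

    // Advance one step and return the track indices to trigger
    // (in the installation these would be sent on to the sound engine).
    List<Integer> tick() {
        List<Integer> triggers = new ArrayList<>();
        for (int t = 0; t < grid[currentStep].length; t++)
            if (grid[currentStep][t]) triggers.add(t);
        currentStep = (currentStep + 1) % grid.length;
        return triggers;
    }
}
```

A faster heartbeat shortens the step duration, so the composition literally speeds up with the visitor's pulse.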
The Kinect is a motion tracking device by Microsoft. It can track a person's movements accurately, as well as detect gestures like opened or closed hands. People interact with the project using their body, which forms a natural way of interacting as opposed to using a keyboard and mouse. The goal is to provoke playful behaviour and invite people to experiment with movement.
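One way to turn skeleton tracking into an "amount of movement" value is to sum how far the tracked joints travelled between two frames. The sketch below is an assumption about how this could be computed, not our exact implementation:

```java
// Rough sketch: quantify the amount of movement as the summed displacement
// of tracked joints between two consecutive Kinect skeleton frames.
// Each joint is an {x, y, z} position in meters.
class MovementMeter {
    static double movement(float[][] prevJoints, float[][] currJoints) {
        double total = 0;
        for (int i = 0; i < currJoints.length; i++) {
            double dx = currJoints[i][0] - prevJoints[i][0];
            double dy = currJoints[i][1] - prevJoints[i][1];
            double dz = currJoints[i][2] - prevJoints[i][2];
            total += Math.sqrt(dx * dx + dy * dy + dz * dz);
        }
        return total;
    }
}
```

The resulting number can then drive the intensity of the visuals, rewarding big, playful movements.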
In order to increase the interactivity we chose to use biofeedback as another form of input. We use a sports heart rate monitor that communicates over Bluetooth. Apple's Xcode and the Core Bluetooth framework offer great support and documentation for creating your own heart rate app. We compared various heart rate sensors, but a chest strap turned out to be the most stable, even during intense movement inside the dome.
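Bluetooth heart rate straps all expose the standard Heart Rate Measurement characteristic, whose first byte is a flags field: bit 0 says whether the heart rate value that follows is an 8-bit or a 16-bit little-endian integer. Our actual app was written in Objective-C with Core Bluetooth, but the decoding logic looks like this:

```java
// Decode the standard Bluetooth Heart Rate Measurement characteristic (0x2A37).
// Bit 0 of the flags byte selects uint8 vs uint16 (little endian) heart rate.
class HeartRateParser {
    static int parse(byte[] value) {
        boolean is16bit = (value[0] & 0x01) != 0;
        if (is16bit)
            return (value[1] & 0xFF) | ((value[2] & 0xFF) << 8);
        return value[1] & 0xFF;
    }
}
```

The decoded beats-per-minute value is then forwarded to the Processing sketch.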
Read more about Core Bluetooth
The visuals generated in Processing are created for a ‘normal’ screen, that is, a rectangular image. Our beautiful dome, however, is spherical. That means the generated image needs to be warped in order to stretch across the full dome, resulting in the desired 360-degree projection. We used MadMapper for this, with many, many bezier curves and many, many control points to get the mapping right. The output of MadMapper is projected onto a spherical mirror, which reflects the light onto the dome walls.
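The actual warp was built by hand in MadMapper, but the underlying idea can be illustrated with a simple rectangular-to-polar mapping: one axis of the flat image becomes the angle around the dome, the other becomes the distance from the dome's zenith to its rim. This is a conceptual sketch under those assumptions, not MadMapper's internal math:

```java
// Conceptual sketch of dome warping: map a source pixel (u, v), both in 0..1,
// onto a circular "fisheye" output image, where v becomes the radius
// (0 = dome top, 1 = dome rim) and u the angle around the dome.
class DomeWarp {
    // Returns {x, y} in the 0..1 unit square of the warped output image.
    static double[] warp(double u, double v) {
        double angle = u * 2 * Math.PI;
        double radius = v;
        double x = 0.5 + 0.5 * radius * Math.cos(angle);
        double y = 0.5 + 0.5 * radius * Math.sin(angle);
        return new double[]{x, y};
    }
}
```

On top of such an ideal mapping, the bezier control points in MadMapper compensate for the distortion introduced by the spherical mirror and the dome's actual shape.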
The Processing software communicates with a Max MSP patch. Max MSP is great software for generative sound and algorithmic music. For Immersion we worked together with Merijn den Haan, who did the sound design. Various chords and sounds are triggered based on the position of objects in the audio sequencer.
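A position-to-sound mapping of this kind could, for example, quantize a sphere's horizontal position to a note in a musical scale before the trigger is sent to the patch. The scale and the mapping below are purely illustrative assumptions, not Merijn's actual sound design:

```java
// Illustrative sketch: quantize a sphere's horizontal position (0..1)
// to a MIDI note in a major pentatonic scale.
class NoteMapper {
    static final int[] SCALE = {0, 2, 4, 7, 9}; // major pentatonic intervals

    static int midiNote(double x, int baseNote) {
        int step = (int) Math.min(x * SCALE.length, SCALE.length - 1);
        return baseNote + SCALE[step];
    }
}
```

Quantizing to a scale like this keeps the result harmonious no matter where visitors place the spheres.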
Max MSP at Cycling ’74
Because of the various software applications and programming languages used, some kind of glue was needed to make everything talk to each other. For Immersion we used OSC, or Open Sound Control: a protocol originally designed as a successor to MIDI, but which can also be used to send data from one app or system to another. Our data includes the heart rate, tracking data and audio triggers.
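On the wire, an OSC message is simple: a null-terminated address pattern padded to a multiple of four bytes, a type tag string starting with a comma, then the arguments in big-endian form. The address `/hr` below is only an example, not our actual namespace:

```java
import java.io.ByteArrayOutputStream;
import java.nio.ByteBuffer;

// Sketch of the OSC wire format: 4-byte-aligned address pattern,
// type tag string, then big-endian arguments.
class OscEncoder {
    // Null-terminate a string's bytes and pad to a multiple of 4.
    static byte[] pad(byte[] s) {
        int len = (s.length + 4) & ~3; // always at least one null byte
        byte[] out = new byte[len];
        System.arraycopy(s, 0, out, 0, s.length);
        return out;
    }

    // Encode a message with a single 32-bit integer argument (type tag ",i").
    static byte[] intMessage(String address, int value) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        out.writeBytes(pad(address.getBytes()));
        out.writeBytes(pad(",i".getBytes()));
        out.writeBytes(ByteBuffer.allocate(4).putInt(value).array());
        return out.toByteArray();
    }
}
```

In practice we used existing OSC libraries rather than encoding messages by hand, but this is what travels over the network between the apps.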
Read more about OSC
While OSC is very good at communicating basic data from one app to another, it cannot send images. In order to glue the various visual parts of the installation together, we used Syphon, which sends frames in real time from our visualisation app to the mapping software.