EDC Redux

The Experimentation in Digital Creation program took place at PWM between Winter 2022 and Summer 2023. Through this program we were able to provide mentorship as well as technical and dramaturgical support to eight pieces in development. Each of these projects involved a high degree of technological integration, and our work with the artists focused on making this fusion of art and technology as synergistic as possible. In line with PWM's general ethos of knowledge sharing, we offer some of what we learned as a team through this process. Since the majority of these works have not yet been publicly exhibited, we'll focus on technological discoveries, software and hardware tips, and development strategies that we found effective.


Circuits of Sand and Water

This week was primarily focused on proximity-triggered events, with the work falling somewhere between an art installation and a theatrical experience. We spent the residency period exploring how a single person could navigate a designed space and have audio and video content triggered by their position or by their interaction with an object.

While there are many ways that proximity can be tracked in a space, we decided that the fastest route to get a prototype on its feet would be to use the existing VR equipment in the PWM studio.  The space is equipped with two wireless HTC Vive Pro 2 headsets and offers a tracking area of 20' x 40'.  We mounted a Vive Tracker 2 onto a set of headphones using some camera clamps and hung a rear projection screen to serve as a spot for triggering different visuals.  After getting the hardware in place we began to piece the prototype together in Unreal Engine 4.27.  Getting tracking data into Unreal is fairly easy using the Live Link plugin: https://www.vive.com/ca/support/camtrack/category_howto/setting-up-the-virtual-camera-using-the-live-link-controller.html  After some calibration (making sure the tracker's XYZ axes were in the correct orientation), we fleshed out the rest of the programming using Unreal's built-in Blueprint system.  MadMapper (which was also triggered by Unreal) was used to map the projected content onto the wall and rear projection screen.  By the end of the week we were able to test out a prototype (on various PWM staff members) that had proximity-driven binaural audio and video.
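
In our build the trigger logic lived in Unreal Blueprints, but the underlying idea is simple enough to sketch outside the engine. The snippet below is a minimal, hypothetical version of the proximity logic: it assumes tracker positions arrive as metre coordinates and that MadMapper is listening for OSC; the host, port, OSC addresses, and hotspot coordinates are placeholders, not the values from our patch.

```python
# Hypothetical proximity-trigger sketch: fire an OSC cue when the tracked
# person enters a hotspot, and re-arm the cue once they walk away.
import math
from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

madmapper = SimpleUDPClient("127.0.0.1", 8010)   # assumed OSC host/port

HOTSPOTS = {
    "/cues/scene_1": (1.0, 0.0, 2.5),   # e.g. in front of the projection screen
    "/cues/scene_2": (4.0, 0.0, 6.0),   # e.g. far corner of the space
}
TRIGGER_RADIUS = 0.75  # metres
fired = set()

def on_tracker_update(x, y, z):
    """Call this each time a new tracker position comes in (metres)."""
    for address, hotspot in HOTSPOTS.items():
        dist = math.dist((x, y, z), hotspot)
        if dist < TRIGGER_RADIUS and address not in fired:
            madmapper.send_message(address, 1.0)  # fire the cue once
            fired.add(address)
        elif dist >= TRIGGER_RADIUS and address in fired:
            fired.discard(address)                # re-arm for the next pass
```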

As we worked through various iterations of the prototype, more and more questions came up around the agency that the sole audience member would have in the experience.  It became clear that there was a big difference on a dramaturgical level between an event being triggered by the user touching or manipulating an object versus the trigger being purely proximity-based.  Through testing we were also able to assess how long each clip could be sustained; in general, people had a lower tolerance for standing and listening to a piece of text than for reading it.


Invisible Me

The residency week for Invisible Me was mostly focused on visual exploration, collaborative storyboarding tools, and investigating ways the piece could be presented digitally while still keeping the quality of liveness that defined the in-person show.

Miro (https://miro.com/index/) was used extensively in this residency to brainstorm, collect visual materials, mark up the script, and create a rough storyboard.  The interactive projector in the PWM studio allowed us to write directly on the Miro board with the same feeling as writing on a whiteboard, and the collaborative nature of the software let the team add their own sticky notes and contributions from their individual machines.  One of the advantages of using Miro over a traditional whiteboard is that the notes and discoveries made during the residency week remain available to team members afterwards, so they can continue their work seamlessly.

As part of the visual exploration of the week, we spent a good period of time experimenting with different materials to get a sense of how they move as well as how they respond to light and projection. Tyvek quickly became one of the materials we focused on because of how well it takes projection and its physical similarity to paper (while being more robust and able to withstand repeated handling).


QUEERasure: Sex Garage

By the time Ragtag's experimentation week arrived, they had a very strong concept of what story they wanted to tell with their piece, but they were still figuring out exactly how they wanted to tell it.  What was known for sure was that it should be an immersive experience that would take audience members through the emotional journey of the folks who were at the Sex Garage party raid in Montreal in June 1990.

One of the visual concepts that we investigated during this week was focused on point clouds.  Point clouds are representations of 3D space created from many individual data points.  They are great at creating a feeling of depth while still being an abstracted form.  We created several point clouds using a variety of depth cameras (we tried the Kinect 360, the Kinect One, and the Azure Kinect, with the Azure giving us the highest resolution).  The depth cameras were connected to a computer running TouchDesigner, and they built a limited 3D scene using infrared pulses.  Once we got the basic point cloud running, we were able to manipulate the 3D image to obscure or otherwise distort the particles, creating an effect where the space would appear to coalesce together or fly apart.
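
TouchDesigner's Kinect operators handled the conversion for us, but the idea behind a point cloud is worth sketching on its own. The snippet below shows, in plain numpy, how a depth frame turns into XYZ points, plus a simple "fly apart" manipulation like the ones we played with; the camera intrinsics here are placeholders, since a real Kinect or Azure Kinect reports its own calibration.

```python
# Rough sketch: depth image (millimetres) -> point cloud, with assumed intrinsics.
import numpy as np

FX, FY = 600.0, 600.0   # focal lengths in pixels (assumed)
CX, CY = 320.0, 288.0   # optical centre in pixels (assumed)

def depth_to_points(depth_mm: np.ndarray) -> np.ndarray:
    """Convert an HxW depth image to an (N, 3) array of XYZ points in metres."""
    h, w = depth_mm.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_mm.astype(np.float32) / 1000.0
    x = (us - CX) * z / FX
    y = (vs - CY) * z / FY
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]          # drop pixels with no depth reading

def scatter(points: np.ndarray, amount: float) -> np.ndarray:
    """Push every point away from the centroid -- the 'fly apart' look."""
    centre = points.mean(axis=0)
    return centre + (points - centre) * (1.0 + amount)
```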


Given the historical nature of the piece, we spent a while examining how the real-life locations could be incorporated into the event.  The group took a 360 camera out to various spots in Montreal and filmed content both standing in place and walking through the streets.  When we got back to the studio, we took this content into Unreal 4.27 and mapped it onto a sphere.  We could then send camera shots of this surface into MadMapper using NDI (https://ndi.video/tools/ndi-core-suite/) (https://offworld.live/products/unreal-engine-live-streaming-toolkit) and map them onto several screens in the studio.  With this basic setup we could test out how it would feel to be in the center of a room with all the walls playing a walkthrough of a street, and play with the thresholds of speed and light to find what feels good versus what induces nausea.
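
In Unreal the sphere mapping is handled by the material and mesh UVs, but the math underneath is simple: 360 footage is stored as an equirectangular frame, and every viewing direction corresponds to one (u, v) position in that frame. The sketch below shows that mapping, assuming a common axis convention (conventions vary between engines and cameras).

```python
# Map a viewing direction to equirectangular texture coordinates in [0, 1].
import math

def direction_to_uv(x: float, y: float, z: float) -> tuple:
    """x, y, z is a unit direction vector; returns (u, v) into the 360 frame."""
    longitude = math.atan2(x, z)                       # -pi .. pi around the sphere
    latitude = math.asin(max(-1.0, min(1.0, y)))       # -pi/2 .. pi/2 up/down
    u = longitude / (2 * math.pi) + 0.5
    v = 0.5 - latitude / math.pi
    return u, v
```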


Untitled Project (Mischa Shadloo)

Mischa's project entered the residency at a very early stage of development, so the work that we did was extremely exploratory and impulse-based.  When talking about big visual concepts, it can be hard for artists to know whether the wild idea they're imagining works unless they are able to try it out.  It was with this spirit in mind that we dove into Mischa's week in the studio.

In our early meetings we discussed the inspiration for the piece and made a wish list of all the visual ideas that Mischa wanted to experiment with, some digital, some more physical in nature.  After gathering the materials needed, we started to play with the different visual concepts that had been identified earlier.  As we played, new ideas and concepts would pop up for us to follow.  Through this chasing process we were able to arrive at looks and aesthetics that might not have been considered otherwise.

Some of the more digital items that we experimented with included an interactive sandbox and audio-reactive LEDs.  The audio-reactive LEDs were accomplished using a device called a PixLite (https://www.advateklights.com/pixlite-16-mk2-control-board).  This device allows you to control addressable LEDs using Art-Net data.  As such, it was easy to hook up MadMapper with an audio input from a mic (using an audio interface such as a Focusrite) and have MadMapper drive the LED content.  In our experimentation, the volume of the mic drove the brightness of the LEDs.
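
During the residency MadMapper handled both the audio analysis and the Art-Net output, so no code was needed. For readers curious about what is happening under the hood, the sketch below is a standalone version of the same idea: read the mic level and send it to the controller as DMX data wrapped in an Art-Net packet. The controller IP, universe, and channel count are assumptions.

```python
# Standalone mic-to-LED-brightness sketch over Art-Net (UDP port 6454).
import socket
import struct
import numpy as np
import sounddevice as sd  # pip install sounddevice

PIXLITE_IP = "192.168.1.50"   # placeholder -- your controller's address
UNIVERSE = 0
NUM_CHANNELS = 150            # e.g. 50 RGB pixels

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sequence = 0

def send_artnet(levels: bytes):
    """Wrap DMX channel data in an ArtDmx packet and send it to the controller."""
    global sequence
    sequence = (sequence % 255) + 1
    packet = (
        b"Art-Net\x00"
        + struct.pack("<H", 0x5000)                    # ArtDmx opcode
        + struct.pack("BB", 0, 14)                     # protocol version 14
        + struct.pack("BBBB", sequence, 0,             # sequence, physical
                      UNIVERSE & 0xFF, UNIVERSE >> 8)  # sub-uni, net
        + struct.pack(">H", len(levels))               # data length
        + levels
    )
    sock.sendto(packet, (PIXLITE_IP, 6454))

def audio_callback(indata, frames, time, status):
    """Map the mic's RMS level to a uniform brightness across all channels."""
    rms = float(np.sqrt(np.mean(indata ** 2)))
    brightness = min(255, int(rms * 1024))
    send_artnet(bytes([brightness]) * NUM_CHANNELS)

with sd.InputStream(channels=1, callback=audio_callback):
    input("Streaming mic level to the LEDs -- press Enter to stop\n")
```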

The interactive sandbox is similar to a concept often used in science museums.  Essentially, a box of sand is projected onto from above, and the projected content changes based on how the sand is played with.  For our sandbox we used a projector mounted in the grid, a small box of sand, and a depth camera (such as a Kinect or Azure Kinect) to drive the interactive content.  The depth camera was mounted next to the projector and connected to a computer running TouchDesigner.  In TouchDesigner, we were able to map different depths to different colours, create topographic lines, and create images that could be physically revealed or hidden depending on the depth of the sand (such as a scene of running water below a desert landscape, or text hidden in the dunes).
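
The mapping itself is straightforward once you have the depth frame. The sketch below expresses the same idea in plain numpy rather than TouchDesigner operators; the depth threshold, contour spacing, and colours are placeholder values.

```python
# Depth -> projected image: sand colour with contour lines, and a "water" image
# revealed wherever the sand has been dug away (deeper than an assumed threshold).
import numpy as np

def sandbox_frame(depth_mm: np.ndarray, water_img: np.ndarray) -> np.ndarray:
    """depth_mm: HxW depth in mm; water_img: HxW x 3 uint8 image to reveal."""
    sand_colour = np.array([210, 180, 120], dtype=np.uint8)
    frame = np.empty(depth_mm.shape + (3,), dtype=np.uint8)
    frame[:] = sand_colour

    # Topographic contour lines every 10 mm of depth.
    contours = (depth_mm % 10) < 1
    frame[contours] = (60, 40, 20)

    # Where the camera sees further than this, the sand has been dug away:
    # show the running-water image underneath instead of sand.
    dug_out = depth_mm > 950   # assumed distance from camera to the box floor
    frame[dug_out] = water_img[dug_out]
    return frame
```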


BQFKN!

Big Queer Filipino Karaoke Night! was by far the piece furthest along in its development, as the team behind it had already had the chance to prototype it, and they came to us with a very solid wish list of what they wanted to work on over the course of the residency.  What was most important to them was streamlining and designing the audience experience, as well as working out some of the technical bottlenecks that they had discovered in their initial prototype.  As such, we spent a considerable amount of time working out a system to help eliminate lag in their long-distance virtual karaoke.

One of the pieces of software that we found particularly useful for this was VDO.Ninja (https://vdo.ninja/), a low-latency, high-quality solution for creating a virtual recording studio.  A great thing about VDO.Ninja is the amount of control it gives the technician operating the recording session.  Unlike with Zoom or other conferencing platforms, the technician is able to see, record, and change details around each participant's camera, microphone, bandwidth, and encoders.  This helps ensure a quality event because settings can be tested and recorded beforehand.  Both OBS and TouchDesigner (as well as any media server that allows for some sort of web or browser input) allow VDO.Ninja feeds to be imported without any form of watermark or other unwanted overlay.  In addition to VDO.Ninja, we also used Loopback (https://rogueamoeba.com/loopback/) to route audio from one program to another. Through the week we helped the team build a solid system diagram that they were then able to put to use for the full production of their show in the first week of June 2023.
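
Because VDO.Ninja is driven entirely by URL parameters, much of the "system" ends up being a set of carefully built links: each performer opens a push link, and the media server loads the matching view link as a browser source. The sketch below is an illustrative way of generating those links; the room name, stream IDs, and bitrate are made-up values, and parameter names should be confirmed against the VDO.Ninja documentation for your version.

```python
# Generate per-performer push links (for the performers) and view links
# (for the OBS / TouchDesigner browser inputs on the technician's machine).
BASE = "https://vdo.ninja/"
ROOM = "bqfkn_karaoke"       # placeholder room name
BITRATE_KBPS = 6000          # placeholder target bitrate

performers = ["singer1", "singer2", "host"]

for stream_id in performers:
    push = f"{BASE}?push={stream_id}&room={ROOM}"
    view = f"{BASE}?view={stream_id}&room={ROOM}&bitrate={BITRATE_KBPS}"
    print(f"{stream_id}:")
    print(f"  send to performer:    {push}")
    print(f"  load in media server: {view}")
```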


Untitled Opera Project (Amanda Smith)

Unfortunately, immediately before the start of this residency week, there was a COVID-19 exposure within our digital dramaturgy team.  Since the creative team had already travelled to Montreal from Toronto, we decided as a group to pursue a hybrid work week, with the residency leaders staying at home and the visiting creative team working in the PWM studio.  The studio is well equipped for remote or hy-flex work, and we were able to log into the studio computers from home to help troubleshoot and teach some TouchDesigner techniques to the team on the ground.

One of the ideas that the group was curious to experiment with was the Pulfrich Effect.  The Pulfrich Effect (https://en.wikipedia.org/wiki/Pulfrich_effect) is an optical illusion in which lateral motion, viewed through glasses with a single tinted lens, creates a feeling of 3D depth.
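
As a rough back-of-envelope illustration of why this reads as depth: the darker lens delays that eye's image slightly, so a laterally moving object is effectively seen in two positions at once, which the brain interprets as binocular disparity. The figures below are illustrative, not measurements from the residency.

```python
# Pulfrich Effect, order-of-magnitude sketch: disparity = lateral speed x lens delay.
lens_delay_s = 0.015        # a dark filter can delay perception by roughly 10-20 ms
lateral_speed_deg_s = 20.0  # object sweeping across the view, degrees per second

apparent_disparity_deg = lateral_speed_deg_s * lens_delay_s
print(f"Apparent disparity: {apparent_disparity_deg:.2f} degrees of visual angle")
# Faster motion or a darker lens -> larger disparity -> stronger 3D pop.
```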

Given the musical nature of the work, it was important to spend some time playing with various forms of audio reactivity.  Various microphones were set up in the space and run through a Focusrite audio interface into a computer running TouchDesigner and Modul8, another media server used for creating visuals.  The Focusrite allows each microphone to be routed individually into whichever software is running, where its pitch or volume can control different settings.
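
In TouchDesigner, one common way to wire this up is an Audio Device In CHOP feeding an Analyze CHOP (set to RMS power), with a CHOP Execute DAT pushing the resulting value onto a visual parameter. The callback below is a minimal sketch of that last step; the operator name 'level1' and the exact scaling are placeholders for whatever network you build, not the patch from the residency.

```python
# CHOP Execute DAT callback: drive a Level TOP from the mic's analyzed level.
def onValueChange(channel, sampleIndex, val, prev):
    # 'val' is the current RMS of the mic channel, roughly 0..1.
    lvl = op('level1')
    lvl.par.brightness1 = 0.5 + val * 2.0   # louder mic -> brighter image
    lvl.par.opacity = min(1.0, val * 4.0)   # fade the layer in with the sound
    return
```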


Only Footprints

Only Footprints exists in a space between a point-and-click game, a haunted house, and a theatre piece.  In one of the earlier iterations of the project it existed only online, with all of the game design mechanics being worked out in Twine (https://twinery.org/).  The hope, however, has always been for a live experience, so that is where we put most of our energy during the residency period.  After going through some of the dramaturgical beats of the piece, we started creating a mockup of the first few game choices to see how they would feel to an audience member.  To do this we built two sides of a cube out of vapour barrier (a construction material used to protect building walls) and rear-projected onto them.  Above one of the sides was a Microsoft Kinect, which we connected to a computer running TouchDesigner to detect when someone touched a specific area of the screen.  To simulate what would eventually be a custom app, we created a prototype using TouchOSC (https://hexler.net/touchosc) on an iPad.  TouchOSC is great for prototyping basic functionality because it is extremely fast to set up a working patch with no programming knowledge.  It can send and receive MIDI and OSC over a network, so it's easy to connect to QLab or most other media servers.
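
As an illustration of the prototyping glue, the sketch below listens for button presses from a hypothetical TouchOSC layout (the /choice/... addresses are whatever you name them when building the layout) and forwards them to QLab, which listens for OSC on port 53000. In practice TouchOSC can also talk to QLab directly; a relay like this is just one way to sit in the middle and log or reshape the messages.

```python
# Relay TouchOSC button presses from the iPad to QLab cue triggers.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

qlab = SimpleUDPClient("127.0.0.1", 53000)  # QLab's standard OSC port

def on_message(address, *args):
    # e.g. the iPad sends /choice/2 with value 1.0 when that button is pressed.
    if address.startswith("/choice/") and args and args[0] == 1.0:
        choice = address.rsplit("/", 1)[-1]
        qlab.send_message(f"/cue/{choice}/start", [])
        print(f"Fired QLab cue {choice}")

dispatcher = Dispatcher()
dispatcher.set_default_handler(on_message)

# TouchOSC on the iPad is pointed at this machine's IP, port 9000 (assumed).
BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher).serve_forever()
```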

Once we had the basic prototype put together, we invited one of PWM's dramaturgs into the space to see how they would interact with the piece given little to no instruction.  One of the things that is always important with pieces that fall outside the structure of traditional theatre is to think about the onboarding process for the audience.  With a piece like Only Footprints there will need to be a lot of consideration around what this onboarding process is, how in-depth it needs to be, and how much hand-holding an audience member will need to get the most out of the experience.  It was great to see how our first test subject instinctively navigated some parts of the prototype while needing prompting to engage with others.
