Tag Archives: media design

TouchDesigner | Import from a System Folder

One of the handy building blocks to have in your arsenal as you're working with any media system is an understanding of how it works with system folders. Isadora, for example, pulls assets from the folder specified when you load the original file. This means that you can change an asset, save it with the same name, and place it in the proper system folder, and your changes show up without having to re-import any files. What, then, if I want to automatically pull files from a folder on my computer? In TouchDesigner that's a relatively simple thing to accomplish without too much work. Better yet, the underlying principles open up a lot of much more interesting possibilities.

Alright, let’s start with a simple problem that we can solve with TouchDesigner. I want to build a simple patch where I can use two control panel buttons to navigate through a list of images (or movies) in a specified folder – essentially I want to be able to add my photos or videos to a folder on my computer, and then display them in TouchDesigner without having to drag them into the network or having to add a Movie-In TOP.

Let’s start by looking at the full network.

Our system is made of three strings – a series of CHOPs, a series of DATs, and a TOP. We’re going to use two buttons to control a series of channel operators to give us a number. We’re going to use this number to select a specific file from a folder, and then we’re going to use that file name in a Movie-In Texture Operator to be displayed.

Let’s start with our CHOP string.

Button – Trigger – Count – Math – Null

First things first, the important place to start in this particular string is actually in the middle, at the Count CHOP. The Count CHOP, in this case, is going to watch for a change in a signal and use that as a mechanism for determining when to increment or reset. There are lots of different ways to count, but in this particular case the Count CHOP is an excellent method for incrementing from a button press. Alright, so if we know that we're going to start with a count, then we don't actually need the Trigger CHOP – especially as I'm going to watch a Button for a signal change. That said, the Trigger CHOP might be handy if I'm watching something that doesn't have a specific binary, or a signal where I want a certain threshold to act as the trigger. You'll notice that my Trigger is bypassed, but I've left it in my network just in case I want to use a different kind of triggering method. Alright, with some of those things in mind, let's move back to the beginning of this string.

First off I start by adding two Button COMPs to my network (if you want to learn more about Buttons and Control Panels, start here). Zooming into the Button COMP I can set the text for the button itself. First I want to find the Text TOP labeled "bg." Then I want to open the parameter window for this TOP and change the Text to "Forward" (you can name this whatever you want, I just wanted to know that this button was going to move me forward through my set of photos).

We'll do the same thing for the other button, and in my case I changed both its name and its color. You can do both from the Text TOP in your second Button COMP.

Moving back out of the Button COMP I’m going to make a few other changes to my buttons. I want to change their position on the control panel, and I want to make them a little larger to accommodate the longer words inside of them. To do this I’m going to open up the parameter window of the Button COMPs and change the width of both to 66. I’m also going to change the Y location of button labeled “Forward” to 55 to make sure that my buttons aren’t on top of one another.

Before moving on, I also want to change the button Type for both of these COMPs to “Momentary.”

Next I’m going to connect my Button called “forward” to the top inlet on my Count CHOP, and the Button “reset” to the middle inlet on the Count CHOP. This means that every time I click the forward button I’ll increment the count up by 1, and when I click reset it will set the counter back to 0.

In my Math CHOP the only change that I'm going to make is to Pre-Add 1 to the integer coming from my count. I'm going to do this because TouchDesigner counts the first row of a table as 0. When we get to our DATs we'll see that row 0 in our table is actually the header row. While I could tell TouchDesigner not to show the header, that doesn't feel like the best solution – especially as I want to be able to see what the columns are referring to.
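
If you prefer an expression to an extra operator, the same offset can be written in Python anywhere TouchDesigner accepts an expression. This is just a sketch of the equivalent logic – 'count1' is an assumed operator name, so substitute whatever your Count CHOP is actually called:

    # Read the current count (first channel) and add 1 to skip the header row.
    # 'count1' is an assumed operator name for this sketch.
    int(op('count1')[0].eval()) + 1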

Last but not least, I'm going to end this string of CHOPs in a Null. If you're new to TouchDesigner, it's important to know that ending a string in a Null can be a real life saver. Ending in a Null allows you to make changes upstream of the Null without any headache. Let's imagine in a complicated network that you've connected 20 or 50 different operators to a CHOP. You decide that you want to change something in your string, and you delete the final CHOP (the one that's connected to everything), leaving you to reconnect all of those patch cords in order to get your network back up and running again. The better solution is to end the string in a Null, and then connect the Null to those 20 or 50 or 1000 other operators. Changing things upstream of the Null is easy, especially as you don't have to reconnect all of your patch cords again.
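
As a side note for anyone who likes scripting their networks, here's a rough sketch of how this same CHOP chain could be assembled with TouchDesigner's Python API. The parent path, operator names, and parameter name are assumptions for this example – in practice you'd likely still build and label the Button COMPs by hand:

    # A sketch of building the Button - Count - Math - Null chain in Python.
    # '/project1' and all operator names are assumptions.
    container = op('/project1')

    forward = container.create(buttonCOMP, 'button_forward')
    reset   = container.create(buttonCOMP, 'button_reset')
    count   = container.create(countCHOP,  'count1')
    math1   = container.create(mathCHOP,   'math1')
    null1   = container.create(nullCHOP,   'null_count')

    # Wire the chain: forward drives the count, reset clears it.
    forward.outputConnectors[0].connect(count.inputConnectors[0])
    reset.outputConnectors[0].connect(count.inputConnectors[1])
    count.outputConnectors[0].connect(math1)
    math1.outputConnectors[0].connect(null1)

    # Pre-Add 1 so the count lines up with the Folder DAT's file rows
    # (row 0 is the header).
    math1.par.preoff = 1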

Now let’s take a look at our DAT stream.

(Text) – Folder – Select – Null

Here the Text DAT isn't an essential part of my network, at least in terms of operation. As a general rule of thumb, I try to add comments to my networks as I'm working in them in order to make sure that I can remember how I did something. I highly recommend having some kind of commenting system, especially as this can prove to be a helpful tool to communicate with collaborators about what you're doing with a specific part of a network. All of that to say: we can ignore my first Text DAT.

The Folder DAT will pull the contents of a specified system folder. Ultimately, I’ll need the pathway of the files in this folder. I can set the Folder DAT to give me all of that information, and specify if I want it to include the content of sub folders, or the names of sub folders. In my case I only want the names and locations of the files in this single folder.

Next I want to select a single line out of this whole table. To do this, we’re going to connect the Folder DAT to a Select DAT.

First up I want to set this DAT to extract Rows by Index. I want to make sure that both Include First Row and Include First Column are turned off. Next I want to export the Null CHOP to both the Start Row Index and the End Row Index. Finally, I want to set the Start Column Index to 5, and the End Column Index to 5 – I'm telling the Select DAT that I'm only interested in the sixth column in this table (but you set it to 5 – what's the deal? Remember that column counting starts at 0, so the sixth column lives at index 5). I'm also telling it that I only want to see the row that's specified by the CHOP string.
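
As an aside, if you'd rather skip the Select DAT entirely, a parameter expression can reach into the Folder DAT directly. This is only a sketch – 'folder1' and 'null_count' are assumed names for the Folder DAT and the CHOP Null, and the 'path' column assumes the Folder DAT is set to include file paths:

    # Pull the path cell for whichever row the CHOP chain currently points at.
    # Operator and column names are assumptions for this sketch.
    op('folder1')[int(op('null_count')[0].eval()), 'path']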

Last but not least, I’m going to pass this to a Null.

Finally, let's add a Movie-In TOP to our network. Now, in the Movie-In parameter dialog we're going to do a little typing. We need to tell the Movie-In TOP that it should reference a cell in a table in a DAT. Easy. Right? Sure it is. We're going to first click the little plus that shows up as your pointer hovers over the left side of the parameter window.

Next we're going to toggle the field from a constant into an expression by clicking on the little blue box.

Finally, we’re going to write a short expression to tell TouchDesigner where to look for the pathway of a file to load. Our expression looks like this:

op('null1')[0,0]

If we were writing that in English it would read: look at the operator named "null1" and find the contents of the cell at row 0, column 0. It's important to note that the name in single quotes should be the name of the DAT Null that we created before, otherwise this won't act as expected.
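
One small, optional variation: if you ever need that same cell as a plain Python string (in a script, for example), appending .val to the expression returns the cell's contents explicitly – again assuming your DAT Null is actually named "null1":

    op('null1')[0,0].val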

There you have it. We’ve now made a network that will allow us to load the contents of a folder, sequentially based on how we use two buttons in our control panel.

Sending and Receiving OSC Values with TouchDesigner

The other day I posted a look at how to send and receive TouchOSC data with Troikatronix's Isadora. In an effort to diversify my programming background I'm often curious about how to translate a given task from one programming environment to another. To this end, I was curious about repeating this same action in Derivative's TouchDesigner. If you're one of a growing number of artists interested in visual programming, VJing, media design, interactive system design, media installation, or media for live performance, then it's well worth your time to look at both Isadora and TouchDesigner.

That said, I initially started thinking about control panel interfaces for TouchOSC when I saw Graham Thorne's instructional videos about how to transmit data from Isadora to TouchOSC (if you have a chance take a look at his posts here: Part 1 & Part 2). Last year I used TouchOSC accelerometer data to drive an installation at Bragg's Pie Factory in Downtown Phoenix, but I didn't do much with actual control panels. Receiving data from TouchOSC in TouchDesigner is a piece of cake, but sending it back out again didn't work the first time that I tried it. To be fair, I only spent about 3 minutes trying to solve this particular question, as it was a rather fleeting thought while working on another project.

Getting this working in Isadora, however, gave me a renewed sense of focus and determination, so with a little bit of experimenting I finally got this working the way that I wanted. While there are any number of things you might do with TouchDesigner and TouchOSC, I wanted to start with a simple deliverable: I wanted the position of Slider 1 to inversely change the position of Slider 2, and vice versa. In this way, moving Slider 1 up moves Slider 2 down, and moving Slider 2 up moves Slider 1 down. In a performance setting it's unlikely that I'd need something this simple, but for the sake of testing an idea this seemed like it would give me the kind of information that I might need.

For the most part this is pretty easy, but there are a few things to watch for in order to get the results that you want.

Here's the network stream of Channel Operators (CHOPs):

OSC In – Select – Math – Rename – OSC Out

OSC In

Derivative Documentation on the OSC In CHOP 

The OSC In CHOP is pretty excellent out of the gate. We need to do a little configuring to get this up and running, but after that we're truly off to the races. First off, it's important to make sure that your TouchOSC device is connected to your network and broadcasting to your TouchDesigner machine. If this process is new to you it's worth taking a little time to read about how to set this all up – start on this page if this is new to you. Once your TouchOSC device is talking to your TouchDesigner network we're ready to start making magic happen. One of the first things that I did was to activate (just adjust up or down) the two faders that I wanted to work with. This made sure that their IDs were sent to TouchDesigner, making it much easier for me to determine how I was going to rename them later.

Select

Derivative Documentation on the Select CHOP

When you’re working with the OSC In CHOP, all of your TouchOSC data comes through a single pipe. One of the first things to do here is to use the select CHOP to single out the signal that you’re looking for (as a note, you could also export the Channel in question to a Null, or handle this a variety of other ways – in my case a Select CHOP was just the first method I tried). This operator allows you to pull out a single Channel from your OSC In.

Math

Derivative Documentation on the Math CHOP

Next I wanted to remap the values of my incoming signal to be inverted. A Math CHOP lets us quickly remap the range for our data by changing a few values in the Range tab of its parameter window. You'll notice that you first need to specify the from range (what's the range of incoming values), and then specify the to range (what's the range of outgoing values). In my case, the from values are 0 and 1, and the to values are 1 and 0.
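
Under the hood this is just a linear remap. Here's a minimal sketch of the math the Range tab is doing – plain illustrative Python, not something you need to type into TouchDesigner:

    # Linear remap: scale a value from one range into another.
    def remap(value, from_min, from_max, to_min, to_max):
        normalized = (value - from_min) / (from_max - from_min)
        return to_min + normalized * (to_max - to_min)

    # With a from range of 0-1 and a to range of 1-0, the output is the inverse of the input.
    print(remap(0.25, 0, 1, 1, 0))   # 0.75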

Rename

Derivative Documentation on the Rename CHOP

The Rename CHOP is one of the most important steps in your whole network. When communicating with TouchOSC you need to know exactly what object you're trying to drive with your data. To accomplish this you need to know the UDP address of your device, the port number that you're broadcasting to, and the name of the slider or button that you want to change. While we can specify the UDP address and port number with another CHOP, we need to use the Rename CHOP (a quick aside – there are, of course, other ways to do this – this just happens to be the method that I got working after a bit of trial and error) to specify the name of the object that we're driving. In my case I wanted fader 1 to drive fader 2. You'll notice that the formatting of the fader's name is important. In this particular OSC layout there are multiple tabs. We can see this as evidenced by the "1/" preceding the name of our fader. In changing the name of this input we need to enter the precise name of the fader that we want to change – this is why we activated both faders initially, as this gave us the names of both objects in the TouchOSC panel. You'll notice that I changed the name of my channel to "1/fader2".

OSC Out

Derivative Documentation on the OSC Out CHOP 

Now that we’ve processed and named our signal we’re ready to send it back out to TouchOSC. In this CHOP we need to input the UDP address of the TouchOSC device (you can find this in the info tab of the TouchOSC app) as well as the port that we’re sending to. We also want to make sure that we’re sending this as a sample, and finally we want to turn off “send events every cook.” This last step is important because it ensures that we’re only sending values when they change. If we send messages for every cook we won’t be able to see the inverse relationship between our two sliders. Correction, we’ll be able to see the relationship so long as we don’t set up slider 2 to drive slider 1. In order to create an inverse relationship with both sliders we only want to transmit data as it’s changed (otherwise we’ll find that we’re fighting with the constantly streamed position data from our slider that isn’t being activated).
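
To make that "only send on change" idea concrete, here's a tiny sketch of the behavior we're after – illustrative Python only, since the OSC Out CHOP handles this for us when "send events every cook" is off:

    # Only pass a fader value along when it actually changes, so the two
    # faders don't fight over a constant stream of repeated messages.
    last_value = None

    def send_if_changed(value, transmit):
        # transmit is any function that actually sends the OSC message
        global last_value
        if value != last_value:
            transmit(value)
            last_value = value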

The last step in this process is to copy the whole string and set up fader 2 to transmit to fader 1; this allows both faders to drive one another in turn.

There you have it. You’ve now successfully configured your TouchDesigner network to receive and send data to a mobile device running TouchOSC.

Sending and Receiving TouchOSC Values in Isadora

Sometime last year (in 2012) I came across Graham Thorne's instructional videos about how to transmit data from Isadora to TouchOSC. Here's Part 1 and Part 2 – if this is something that interests you, I'd highly recommend that you watch these two videos first.

While I didn't have a TouchOSC project at the time, it got me thinking about how interfaces communicate information about what's happening in a patch, and how that information communicates to the operator or user of a given system. This year I'm working on the thesis project of Daniel Fine (an MFA student here at ASU), and one of the many challenges we're bound to face is how to visualize and interact with a system that's spread across multiple computers and operating systems, and that controls a variety of different systems in the installation / performance space.

To that end, I thought that it would be important to start thinking about how to send control data to a TouchOSC interface, and how to then ensure that we can see relationships between different control values in a given control panel. That all sounds well and good, but it's awfully vague. A concrete exploration of this kind of concept was what I needed to start planning, in order to more fully wrap my head around how this idea could be more fully exploited in a performance setting.

In order to do this I decided that I wanted to accomplish a simple task with a TouchOSC control panel. On a simple panel layout I wanted the position of Slider 1 to inversely change the position of Slider 2, and vice versa. In this way, moving Slider 1 up moves Slider 2 down, and moving Slider 2 up moves Slider 1 down. In a performance setting it's unlikely that I'd need something this simple, but for the sake of testing an idea this seemed like it would give me the kind of information that I might need.

First let’s look at the whole patch:

The Whole Patch

The set up for this starts by configuring Isadora to receive data from TouchOSC. If you’re new to this process start by reading this post (or at least though the Stream Set-Up section) to learn about how to start receiving data in Isadora from TouchOSC. Next we’re going to use a few simple actors to make this happen. We’re going to use the following actors:

  • OSC Listener
  • Limit Scale Value (there are other ways to scale values, I just like this method as a way to clearly see what values you’re changing and in what way)
  • OSC Transmitter

OSC Listener

Once you have your connections between TouchOSC and Isadora set up, you'll want to make sure that you've isolated a single slider. We can do this by using the OSC Listener Actor. The OSC Listener reports the data coming from a given channel that you specify in the input inlet on the actor. The listener then sends out the transmitted values from the value outlet.

Limit Scale Value

We have two sliders that we're working with – Channels 1 and 2 respectively (they also have names, but we'll get to that later). We first want to look at Channel 1. We're going to set the OSC Listener actor to channel 1 and then connect the value output to the value inlet on a Limit-Scale Value Actor. The Limit-Scale Value Actor allows you to change, scale, or otherwise remap floats or integers to a new range of values. This is important because TouchOSC uses normalized values (values from 0-1) for the range of the sliders in its control panels. In order to create an inverse relationship between two sliders we want to remap 0-1 to output as values from 1-0. We can do that by entering the following values in the inlets on the actor:

  • limit min: 0
  • limit max: 1
  • out min: 1
  • out max: 0

OSC Transmit

The remaining step is to connect the output from our Limit-Scale Value Actor to an OSC Transmit Actor. The OSC Transmit Actor, like its name suggests, transmits data wrapped in the OSC protocol. In order to fully understand how this data is being transmitted we need to know a few things about this actor. In looking at its components we can see that it is made up of the following inlets:

  • UDP Addr – UDP address. This is the IP address of the computer or device that you're talking to. You can determine what this address is in TouchOSC by looking at the info panel. Imagine that this is the street name for a house that you're sending a letter to.
  • Port – this is the port on the device that you’re sending data to. It’s important that you know what port you’re trying to talk to on a given device so that your message can be parsed. If the UDP Address is the street name, the port number is akin to the house number that you’re trying to send a letter to.
  • Address – The address in the case of TouchOSC is the individual target / name of an asset that you want to change. Each of the sliders and buttons on a TouchOSC panel has a name (for example – /1/fader1), and the address is how you tell Isadora what slider you want to change. You can determine these names by looking closely at your Stream Set-up when you're connecting your device to Isadora. To follow with our letter-sending metaphor above, the Address is the name of the person you're sending the letter to.
  • Use Type – this allows us to toggle the sending mechanism on and off.
  • Value – this is the value that we’re transmitting to our other device.

To use the OSC Transmit actor we need to fill in all of the appropriate fields with the information from our mobile device. You'll need to specify the UDP Address, the Port number, and the Address, and connect the value output from our Limit-Scale Value actor to the value inlet of the OSC Transmit Actor.
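
For a sense of what the OSC Transmit Actor is doing on the wire, here's a rough sketch of the same message sent with the python-osc library. This is purely illustrative (Isadora handles all of this internally), and the address, port, and fader name are placeholder values:

    from pythonosc.udp_client import SimpleUDPClient

    # Placeholder values – use the IP and port shown in TouchOSC's info panel.
    udp_addr = "192.168.1.20"   # the device's IP (the "street name")
    port = 9000                 # the port TouchOSC listens on (the "house number")
    address = "/1/fader2"       # the control's OSC address (the "recipient")

    client = SimpleUDPClient(udp_addr, port)
    client.send_message(address, 0.75)   # the value: the new fader position (0-1)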

In this test I started by having fader1 drive fader2. Once I got this working, I then repeated all of the steps above for the other fader – if you look closely at the full patch this will make more sense. The resulting interaction can be seen in the gifs below.

 

Cue Building for Non-Linear Productions

The newly devised piece that I've been working on here at ASU finally opened this last weekend. Named "The Fall of the House of Escher," the production explores concepts of quantum physics, choice, fate, and meaning by combining the works of M.C. Escher and Edgar Allan Poe. The production has been challenging in many respects, but perhaps one of the most challenging elements that's largely invisible to the audience is how we technically move through this production.

Early in the process the cohort of actors, designers, and directors settled on adopting a method of story telling that drew its inspiration from the Choose Your Own Adventure books that were originally published in the 1970s. In these books the reader gets to choose what direction the protagonist takes at pivotal moments in the drama. The devising team was inspired by the idea of audience choice and audience engagement in the process of story telling. Looking for an opportunity to more deeply explore the meaning of audience agency, the group pushed forward in looking to create a work where the audience could choose what pathway to take during the performance. While Escher was not as complex as many of the inspiring materials, its structure presented some impressive design challenges.

Our production works around the idea that there are looping segments of the production. Specifically, we repeat several portions of the production in a Groundhog Day-like fashion in order to draw attention to the fact that the cast is trapped in a looped reality. Inside of the looped portion of the production there are three moments when the audience can choose what pathway the protagonist (Lee) takes, with a total of four possible endings before we begin the cycle again. The production is shaped to take the audience through the choice section two times, and on the third time through the house the protagonist chooses a different pathway that takes the viewers to the end of the play. The number of internal choices in the production means that there are a total of twelve possible pathways through the play. Ironically, the production only runs for a total of six shows, meaning that at least half of the pathways through the house will be unseen.

This presents a tremendous challenge to any designers dealing with traditionally linear story telling technologies – lights, sound, media. Conceiving of a method to navigate through twelve possible production permutations in a manner that any board operator could follow was daunting – to say the least. This was compounded by a heavy media presence in the production (70 cued moments), and the fact that the script was continually in development up until a week before the technical rehearsal process began. This meant that while much of the play had a rough shape, there were changes that influenced the technical portion of the show nearly right up until the tech process began. The consequences of this approach were manifest in three nearly sleepless weeks between the crystallization of the script and opening night – while much of the production was largely conceived and programmed, making it all work was its own hurdle.

In wrestling with how to approach this non-linear method, I spent a large amount of time trying to determine how to efficiently build a cohesive system that allowed the story to jump forwards, backwards, and sideways in a system of interactive inputs and pre-built content. The approach that I finally settled on was thinking of the house as a space to navigate. In other words, media cues needed to live in the respective rooms where they took place. Navigating was then a measure of moving from room to room. This ideological approach was made easier with the addition of a convention for the "choice" moments in the play when the audience chooses what direction to go. Having a space that was outside of the normal set of rooms in the house allowed for an easier visual movement from space to space, while also providing visual feedback for the audience to reinforce that they were in fact making a choice.

Establishing a modality for navigation grounded the media design in an approach that made the rest of the programming process easier – in that establishing a set of norms and conditions creates a paradigm that can be examined, played with, even contradicted in a way that gives the presence of the media a more cohesive aesthetic. While thinking of navigation as a room-based activity made some of the process easier, it also introduced an additional set of challenges. Each room needed a base behavior, an at rest behavior that was different from its reactions to various influences during dramatic moments of the play. Each room also had to contain all of the possible variations that existed within that particular place in the house – a room might need to contain three different types of behavior depending on where we were in the story.

I should draw attention again to the fact that this method was adopted, in part, because of the nature of the media in the show. The production team committed early on to looking for interactivity between the actors and the media, meaning that a linear, asset-based playback system like Dataton's Watchout was largely out of the picture. It was for this reason that I settled on using Troikatronix's Isadora for this particular project. Isadora also offered opportunities for tremendous flexibility, Quartz integration, and non-traditional playback methods; methods that would prove to be essential in this process.

In building this navigation method it was first important to establish the locations in the house, and create a map of how each module touched the others in order to establish the required connections between locations. This process involved making a number of maps to help translate these movements into locations. While this may seem like a trivial step in the process, it ultimately helped solidify how the production moved, and where we were at any given moment in the various permutations of the traveling cycle. Once I had a solid sense of the process of traveling through the house I built a custom actor in Isadora to allow me to quickly navigate between locations. This custom actor allowed me to build the location actor once, and then deploy it across all scenes. Encapsulation (creating a sub-patch) played a large part in the process of this production, and this is only a small example of this particular technique.
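
As a purely conceptual sketch (the actual location actor was built graphically in Isadora, not in code), the underlying idea is a small map of which rooms connect to which, so that navigation becomes a lookup from the current location and the audience's choice. The room names below are invented placeholders, not the production's real map:

    # Conceptual sketch of room-to-room navigation as a simple data structure.
    # Room names and choices are placeholders.
    house_map = {
        'foyer':   {'left': 'library', 'right': 'study'},
        'library': {'left': 'cellar',  'right': 'foyer'},
        'study':   {'left': 'foyer',   'right': 'attic'},
    }

    def travel(current_room, choice):
        # Return the next room for a given audience choice; stay put if the
        # choice doesn't exist from this room.
        return house_map[current_room].get(choice, current_room)

    print(travel('foyer', 'left'))   # library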

The real lesson to come out of non-linear story telling was the importance of planning and mapping for the designer. Ultimately, the most important thing for me to know was where we were in the house / play. While this seems like an obvious statement for any designer, this challenge was compounded by the nature of our approach: a single control panel approach would have been too complicated, and likewise a single trigger (space bar, mouse click, or the like) would never have had the flexibility for this kind of a production. In the end each location in the house had its own control panel, and displayed only the cues corresponding to actions in that particular location. For media, conceptualizing the house as a physical space to be navigated through was ultimately the solution to complex questions of how to solve a problem like non-linear story telling.

TouchDesigner Concepts for Projection Mapping

This coming year at ASU I'll be working with several talented artists and media makers on a project that's tentatively being called "WonderDome." As Dan Fine's thesis project, he's exploring what it means to create a playing space that exists inside of a dome of projection. The audience and performer will share a single immersive media environment where a story will be told with puppets, video, and just about all the theatre magic you can imagine. Central to this endeavor are questions about immersive media systems and how to approach the complex issue of mesh-warping media for a curved surface that's being stitched together with somewhere between six and eight projectors.

The traditional approach for this kind of project would be to make the flat media first. After creating the structure and installing the projectors, we'd create a sample mesh-warped After Effects comp and ultimately run all of our flat media through that comp to split it up into several pieces with the appropriate distortion applied. While that's certainly a tried and true method, it doesn't leave much room for error, and it also makes it difficult to use live video.

On our wish-list of media systems is a d3 media server by d3technologies. While their systems may well be out of our price range, their approach is one that's gaining traction in smaller projection circles. We can pull apart some of the same process by working with Derivative's TouchDesigner to get a sense of how we may well be thinking of projection mapping in the not too distant future.

Instead of waiting to get onsite, or crafting media specially fit for a particular structure, we instead start with a 3D model. In our computer-generated model we add projectors and lights, paint our surfaces with textures or video, and effectively assemble our system virtually.
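
For a very rough sketch of what that virtual setup means in TouchDesigner terms, here's the skeleton scripted in Python for illustration – the parent path, operator names, and parameter names are assumptions, and in practice most of this gets built by hand in the network editor:

    # Sketch: a minimal virtual projection rig – geometry, a light, and a
    # camera standing in for a projector, all rendered to a TOP.
    # '/project1' and the operator names are assumptions for this example.
    scene = op('/project1')

    geo    = scene.create(geometryCOMP, 'dome_geo')     # the surface to be mapped
    cam    = scene.create(cameraCOMP,   'projector1')   # a camera standing in for one projector
    light  = scene.create(lightCOMP,    'light1')
    render = scene.create(renderTOP,    'render1')

    # Point the Render TOP at the scene components (parameter names assumed).
    render.par.geometry = geo.name
    render.par.camera   = cam.name
    render.par.lights   = light.name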

The videos below take a quick and rough look at what this kind of workflow looks like in TouchDesigner, and how it differs from the kind of projection mapping you may already be accustomed to.

Edge Blending

Geometry and Cameras

Putting it All Together

House of Escher | Media Design

In December of 2012 I was approached at an ASU School of Theatre and Film party and asked if I would be interested in working on a project that would begin the following semester, and premiere a new work in the Fall of 2013. As this is exactly the kind of opportunity that I came to ASU to pursue, I eagerly agreed to be a part of the project.

Some Background

ASU's School of Theatre and Film (soon to also include Dance) has a very interesting graduate school program for performers and designers. Operating on a cohort-based system, the school admits a group of performers, directors, and designers (Scenic, Costume, and Lighting) every three years. One of the other graduate programs at the school, the one in which I'm enrolled, can enroll students each year. My program, Interdisciplinary Digital Media and Performance (IDM), straddles both the School of Arts, Media, and Engineering as well as the School of Theatre and Film. Students in my program have pursued a variety of paths, and one skill that's often included in those various paths is media and projection design for stage productions. Just today as I was watching the live webcast of the Xbox One announcement, I was thinking to myself, "some designer planned, created, and programmed the media for this event… huh, I could be doing something like that someday."

The latest cohort of actors, designers, and directors started in the Fall of 2011, which means that the group is due to graduate in the Spring of 2014. In both the second and third year of the cohort's program they work to create a newly devised piece that's performed in one of the theatres on campus at ASU. Occasionally, this group also needs a media designer, and it's their new show for 2014 that I was asked to be a part of.

The Fall of the House of Escher

Our devising process started with some source material that we used as the preliminary research to start our discussion about what show we wanted to make. Our source materials were Edgar Allan Poe's The Fall of the House of Usher, M.C. Escher, and Quantum Mechanics. With these three pillars as our starting point we dove into questions of how to tackle these issues, tell an interesting story, and work to meet the creative needs of the group.

One of our first decisions focused on the structure of the show that we wanted to create. After a significant amount of discussion we finally settled on tackling a Choose Your Own Adventure (CYOA) kind of structure. This partially arose as a means of exploring how to more fully integrate the audience experience with the live performance. While it also brought significant design limitations and challenges, it ultimately was the methodology the group decided to tackle.

Shortly after this we also settled on a story as a framework for our production. Much of our exploratory conversation revolved around the original Poe work, and it was soon clear that the arc of the Fall of the House of Usher would be central to the story we set out to tell. The wrinkle in this simple idea came as our conversations time and again came back to how Poe and Quantum Mechanics connect with one another. As we talked about parallel universes, and the problems of uncertainty, we decided to take those very conversations as a cue for what direction to head with the production. While one version of the CYOA model takes patrons on the traditional track of Poe’s gothic story, audience members are also free to send our narrator down different dark paths to explore what else might be lurking in the Usher’s uncanny home. Looking at the photo below you can see where the audience has an opportunity to choose a new direction, and how that impacts the rest of the show. 

While this was a fine starting point, we also realized that only giving the audience an opportunity to explore one avenue of possibility in the house felt a little flat. To address that point we discussed a repeated journey through the house in a Groundhog Day-esque style. Each run of the show will send the audience through the CYOA section three times, allowing them the opportunity to see the other dark corners of the house, and learn more about the strange inhabitants of the home. I did a little bit of map-making and charted all of the possible paths for our production; that is, all of the possible permutations of the three-legged journey through the house. The resulting map means that there are twelve different possible variations for the production. A challenge, to be sure.

Media and the House

So what's media's role in this production? The house is characterized by its Escher-patterned qualities. Impossible architecture and tricks of lighting and perspective create a place that is uncanny, patterned, but also somehow strangely captivating. Just when it seems like the house has shared all of its secrets there are little quantum blips and pulses that help us remember that things are somehow not right, until ultimately the house collapses.

Our host (who spends his/her time slipping between the slices of the various paths the audience tumbles through) is caught as a destabilized field of particles, only sometimes coalesced. The culminating scene is set in a place beyond the normal, a world of quantum weirdness – small as the inside of an atom, and vast as the universe itself. It's a world of particles and waves, a tumbling peek inside of the macro and micro realities of our world that are either too big or too small for us to understand on a daily basis.

Media's role is to help make these worlds, and to help tell a story grounded in Poe's original, but transformed by a madcap group of graduate students fighting their way out of their own quantum entanglement.

Isadora | Button Basics

In a previous post I talked about how to get started in Isadora with some basics about slider operation. I also want to cover a little bit about using buttons with Izzy. 

Buttons are very handy interface controls. Before we get started, it's important to cover a few considerations about how buttons work. When working with a physical button, like an arcade button on a MIDI controller, the action of pressing the button completes a circuit. When you release the button, you also break the circuit. In Isadora, we can control what happens when we press a button. Specifically, we can control what values are being transmitted when the button isn't being pressed, when it is being pressed, and how the button behaves (does it toggle, or is the signal momentary). Thinking about how a button behaves will help as you start to build an interface, simple or complex.
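
As a purely conceptual sketch of those two behaviors – this is illustrative Python, not anything Isadora requires:

    # Conceptual sketch: momentary vs. toggle button behavior.
    ON, OFF = 100, 0

    def momentary(pressed):
        # Transmit the on value only while the button is held down.
        return ON if pressed else OFF

    class Toggle:
        # Flip between the on and off values on each click.
        def __init__(self):
            self.value = OFF
        def click(self):
            self.value = ON if self.value == OFF else OFF
            return self.value

    button = Toggle()
    print(momentary(True), momentary(False))   # 100 0
    print(button.click(), button.click())      # 100 0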

Let’s start by experimenting with a simple implementation of this process. We’ll create a white rectangle that fills our stage, connect our shape to a projector, and finally use a button to control the intensity of the projector. 

Start by creating a new scene, and adding a “Shapes” actor and a “Projector” actor. Connect the shapes’ video outlet to the projector’s video inlet.

Next change the width and height dimensions of the shape to be 100 and 100 respectively. Remember that Isadora doesn't use pixel values, but instead works in terms of percentages. In this case a value of 100 for the height indicates that the shape should be 100% of the stage's height; the same applies for the width value of 100.

We should now have a white box that covers the height and width of the stage so that we only see white.

Now we'll use a button to control a change in the stage from white to black. Remember that in order to start adding control elements we first need to reveal the control panel. You can do this by selecting it from the drop-down menu, using Command-Shift-C to see only the control panel, or using Control-Shift-S to see a split of the control panel and the programming space. If you've turned on the Grid for your programming space you'll be able to see a distinct difference between the control panel space (on the left) and the programming space (on the right). You'll also notice that with your control panel active your actor selection bins have been replaced by control panel operators.

With the control panel visible, add a button. 

Once you’ve added your button to the control panel, you can change the size of the button by clicking and dragging the small white square on the bottom right of the button. 

Next let's look at the options for the button. We can see what parameters we can control by double clicking on the button. When you do this you should see a pop-up window with the following attributes:

  • Control Title – what the control is named
  • Width – how wide is this control (in pixels)
  • Height – how tall is this control (in pixels)
  • Font – the font used for this control
  • Font Size – self-explanatory
  • Show Value of Linked Properties – this allows data from the patch itself to feed back into the control panel
  • Button Text – the text displayed on the button
  • Control ID – the numerical identification number of this control
  • Off Value – the numeric value sent when the button is in the off position
  • On Value – the numeric value sent when the button is in the on position
  • Mode (Momentary or Toggle) – the mode for the button. Momentary indicates that the on value is only transmitted while the button is being pressed. Toggle indicates that the value will toggle between the on and off values with each click.
  • Don’t Send Off – prevents the button from sending the off value
  • Invert – inverts the on and off values

There are a few other options here, but they mostly have to deal with the appearance of the button. When you start thinking about how you want your control panel to look to an operator, these last parameters will be very helpful. 

For right now let’s leave the default parameters for the button’s options. Next connect the button’s control ID to the inlet on the Projector labeled “intensity.”

As we’re working on the control panel our edit mode is currently enabled which will prevent us from being able to actually click the button with the mouse. To check the controls we have two options:

  • We can disable edit mode by right clicking on the control panel work space and selecting “Disable Edit Mode” from the contextual menu.
  • We can use the option key to bypass the above process.

If you’re doing some extensive testing of your control panel I’d recommend that you disable edit mode. On the other hand, if you’re only testing a single slider or button, I’d recommend using the option key as a much more efficient alternative.  

Holding down the option key, you should now be able to click the button in the control panel. You should see the white box flash on, and back off again as you press and release the mouse button. Right now as we click the button we’re sending a value of 100 to the intensity parameter of the Projector actor. This makes the shape opaque only so long as you’re pressing the button. 

Double click the button in the control panel and check the box for "Invert." We've now inverted the message being sent from the button to the Projector. When you click the button you should now see the opposite: a white screen that flashes black, and then returns to white.

Double click on the button in the control panel, uncheck the box for “Invert.” Change the “Mode” of the button from “Momentary” to “Toggle.” Now as you click the button you should notice that it stays depressed until you click it again. This allows you to toggle between the on and off states of the button. 

This, obviously, is only the beginning of how to work with buttons. You might use a button to control the playback of a movie, which media is on the stage, the position of media on a stage, to jump between scenes, or to change any number of parameters in your patch. Knowing the basics of how buttons behave will help ensure that you can build a solid control panel that you can use during a live performance.