
Custom Quartz Compositions in Isadora

The What and Why

The more I work with Isadora, the more I feel like there isn’t anything it can’t do. As a programming environment for live performance it’s a fast way to build, create, and modify visual environments. One of the most interesting avenues for exploration in this regard is working with Quartz Composer. Quartz is a part of Apple’s integrated graphics technologies for developers and is built to render both 2D and 3D content by using the system’s GPU. This, for the most part, means that Quartz is fast. On top of being fast, it gives you access to GPU-accelerated rendering, making for visualizations that would be difficult if you were relying on CPU strength alone.

Quartz has been interesting to me largely because it offers quick access to a GPU-accelerated, high-performance rendering environment capable of 2D, 3D, and transparency. What’s not to love? As it turns out, there’s a lot to be challenged by in Quartz. Like all programming environments it’s rife with its own idiosyncrasies, idioms, and approaches to the rendering process. It’s also a fair dose of fun once you start to get your bearings.

Why does all of this matter? If you purchase the Isadora Core Video upgrade you have access to all of the Core Imaging processing plugins native to OS X. In addition to that you’re now able to use Quartz Composer patches as Actors in Isadora. This makes it possible to build a custom Quartz Composer patch and use it within the Isadora environment. Essentially this opens up a whole new set of possibilities for creating visual environments, effects, and interactivity for the production or installation that you might be working on.

Enough Already, Let’s Build Something

There are lots of things to keep in mind as you start this process, and perhaps the most useful guideline I can offer is to be patient. Invariably there will be things that go wrong or misbehave. It’s the nature of the beast, and paying close attention to the details of the process is going to make or break you when it all comes down to it in the end.

We’re going to build a simple 3D Sphere in Quartz then prep it for control from Isadora. Easy.

Working in Quartz

First things first: if you don’t already have Quartz Composer, you’ll need to download it. Check out I Love QC’s video about how to do this:

The next thing we’re going to do is to fire up QC. When prompted to choose a template, select the basic composition, and then click “Choose.”


One of the first things we need to talk about is what you’re seeing in the Quartz environment. The grid-like window that you start with is your patch editor. Here you connect nodes in order to create or animate your scene.


You should also see a window that’s filled with black. This is your “viewer” window. Here you’ll see what you’re creating in the patch editor.


Additionally, you can open up two more windows by clicking the corresponding icons in the patch editor. First find the button for the Patch Library, and click it to open up a list of nodes available for use within the network.

The Patch Library holds all of the objects that are available for use within the Quartz editor. While you can scroll through all of the possible objects when you’re programming, it’s often more efficient to use the search box at the bottom of the library.


Next, open up the Patch Inspector.
The Patch Inspector lets you see and edit the settings and parameters for a given object.


Let’s start by making a sphere. In the Patch Library search for “sphere” and add it to your patch. Right out of the gate we’ll notice that this sucks. Rather, this doesn’t currently look interesting, or like a sphere for that matter. What we’re currently seeing is a sphere rendered without any lighting effects. This means that we’re only seeing the outline of the sphere on the black background.


This brings us to one of the programming conventions in Quartz. In Quartz we have to place objects inside of other components in order to tell QC that we want a parent’s properties to propagate to the child component.

To see what that means let’s add a “lighting” patch to our network. Congratulations, nothing happened. In order to see the lighting object’s properties change the sphere, we need to place the sphere inside of that object. Select the sphere object in the editor, Command-X to cut, double click on the Lighting object, then Command-V to paste.
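If the nesting rule feels abstract, here’s a toy Python sketch of the idea (this is just an illustration, not QC’s actual implementation): a patch only picks up an environment’s effect when it sits inside that environment.

```python
# Toy model of QC's nesting rule: a child only inherits an
# environment patch's effect when it lives *inside* that patch.
class Patch:
    def __init__(self, name, children=None):
        self.name, self.children = name, children or []

    def render(self, environment=()):
        """Record which enclosing patches affect each object."""
        rendered = {self.name: tuple(environment)}
        for child in self.children:
            rendered.update(child.render(environment + (self.name,)))
        return rendered

# Sphere sitting next to Lighting: no lighting applied to it
loose = Patch("Root", [Patch("Lighting"), Patch("Sphere")])
# Sphere cut and pasted inside Lighting: it inherits the lighting
nested = Patch("Root", [Patch("Lighting", [Patch("Sphere")])])

loose.render()["Sphere"]   # rendered under Root only
nested.render()["Sphere"]  # rendered under Root and Lighting
```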


This is better, but only slightly.


Let’s start by changing the size properties of our sphere. Open the Patch Inspector and click on the Sphere object in the editor. Now we can see a list of properties for our Sphere. Let’s start by adjusting the diameter of our sphere. I’m going to change my diameter to .25.


Next, select “settings” from the drop-down menu in the Patch Inspector. Here I’m going to turn up the number of subdivisions of my sphere to give it a smoother appearance.


With our sphere looking pretty decent, I want to add some subtle animation to give it a little more personality. We can do this by adding an LFO (low-frequency oscillator). We’ll use our LFO to give our sphere a little up-and-down motion. In the Patch Library search for LFO and add it to your editor next to your sphere.


Next click the “Result” outlet on the “Wave Generator (LFO)” and connect it to the “Y Position” inlet on the sphere.

Wonderful… but this is going to make me seasick.

Next we’re going to make some adjustments to the LFO. With your Patch Inspector open, click on the LFO. Let’s make the following changes:

Period to 2
Phase to 0
Amplitude to .01
Offset to 0


Now you should have a sphere that’s very gently bouncing in the space.
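For the curious, an LFO with these settings is essentially a sine function. A minimal Python sketch (assuming the sine waveform; QC’s Wave Generator offers other shapes too) of how Period, Phase, Amplitude, and Offset shape the motion:

```python
import math

def lfo(t, period=2.0, phase=0.0, amplitude=0.01, offset=0.0):
    """Sine-wave LFO: one full cycle every `period` seconds,
    swinging `amplitude` either side of `offset`."""
    return offset + amplitude * math.sin(2 * math.pi * (t / period + phase))

# With the settings above, the sphere's Y position sweeps +/- 0.01:
y = lfo(0.5)  # a quarter of the way through a 2-second cycle
```

The tiny amplitude is what keeps the bounce gentle; bumping it back up toward 1.0 restores the seasick version.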

Next let’s return to the parent lighting patch to make some changes to the lighting in this environment. We can get back to the parent either by clicking on the button “edit parent” or by clicking on the position in the string of objects where we want to go.


In the root patch let’s click on the lighting object and change some parameters:

Material Specularity to 0.1
Material Shininess to 128
Light 1 Attenuation to .2
Light 1 X Position to -0.25
Light 1 Y Position to 0.5
Light 1 Z Position to 1


Excellent. We should now have a sphere gently bobbing in space with a light located just barely to the left, up, and away (as a note these are locations in relation to our perspective looking at the sphere).

At this point we could leave this as it is and open it in Isadora, but it wouldn’t be very exciting. In order for Isadora to have access to make changes to a QC patch we have to “publish” the inputs that we want to change. In other words, we have to choose what parameters we want to have access to in Isadora before we save and close our QC patch.

I’m thinking that I’d like this sphere to have a few different qualities that can be changed from Isadora. I want to be able to:

  • Change the Lighting Color (Hue, Saturation, and Luminosity as separate controls)
  • Change the position of the light
  • Change the Sphere Size

In QC, in order to pass a value to an object, the parameter in question needs to be published from the root patch. This will make more sense in a second, but for now let’s dive into making some parameters available to Isadora. First up we’re going to add an HSL patch to our patch editor. This is going to give us the ability to control color with Hue, Saturation, and Luminosity as individual parameters.

Connect the Color outlet of the HSL to the Light 1 Color inlet on the lighting object.
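If you want a feel for what the HSL patch is computing from those three numbers, Python’s standard colorsys module does an equivalent conversion (colorsys calls luminosity “lightness” and takes its arguments in hue, lightness, saturation order):

```python
import colorsys

# Hue, saturation, and luminosity each range 0-1, as in the QC patch
hue, saturation, luminosity = 0.0, 1.0, 0.5

# Note the argument order: hue, lightness (luminosity), saturation
r, g, b = colorsys.hls_to_rgb(hue, luminosity, saturation)
# hue 0 at full saturation and mid lightness is pure red: (1.0, 0.0, 0.0)
```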


Now let’s do some publishing. Let’s start by right clicking on the HSL object. From the pop-up menu select “Publish Inputs” and one at a time publish Hue, Saturation, and Luminosity. You’ll know a parameter is being published if it’s got a green indicator light.


Next publish the X, Y, and Z position inputs for the lighting object. This time make sure you change the names to Light X Pos, Light Y Pos, and Light Z Pos as you publish the parameters.


At this point we’ve published our color values and our position values, but only for the light. I still want to be able to change the diameter of the sphere from Isadora. To do this we need to publish the diameter parameter from the “sphere” object, then again from the lighting object.

First double click on the lighting object to dive inside of it. Now publish the diameter parameter on the sphere, and make sure to name it “Sphere Diameter.” When you return to the root patch you’ll notice that you can now see the “Sphere Diameter” parameter.


We now need to publish this parameter one more time so that Isadora will be able to make changes to this variable.

Here we need to pause to talk about good housekeeping. Like all things in life, the more organized you can keep your patches, the happier you will be in the long run. To this end we’re going to do a little input splitting, organizing, and commenting. Let’s start by right clicking anywhere in your patch and selecting “Add Note.” When you double click on this sticky note you’ll be able to edit the text inside of it. I’m going to call my first note “Lighting Qualities.”

Next I’m going to go back to my HSL, right click on the patch, select “Input Splitter,” and select Hue. You’ll notice that you now have an input for Hue that’s separate from the HSL. Repeat this process for Saturation and Luminosity. I’m going to do the same thing to my published lighting position variables. Next I’m going to make another note called “Sphere Qualities” and then split my sphere diameter and drag it to be inside of this note. When I’m done my patch looks like this:


Right now this seems like a lot of extra work. For something this simple, it sure is. The practice, however, is important to consider. In splitting out the published inputs, and organizing them in notes we can (at a glance) see what variables are published, and what they’re driving. Commenting and organizing your patches ultimately makes the process of working with them in the future all the easier.

With all of our hard work done, let’s save our Quartz patch.

Working in Isadora

Before we fire up Isadora it’s important to know where it looks to load Quartz patches. Isadora is going to look in the Compositions folder located in the Library, System Library, and User Library directories. You can tell Isadora to look in any combination of these three at startup. Make sure that you copy your new Quartz composition into one of those three directories (I’d recommend giving your custom Quartz comps a unique color or folder to make them easier to find in the future).

With your QC patch in place and Isadora fired up, let’s add our custom patch to a scene. Double click anywhere in the programming space and search for the name of your patch. I called mine Simple Sphere. We can now see that we have our composition with all of the variables we published in QC.


We can see what our composition looks like by adding a CI Projector and connecting the image output from our QC actor to the image inlet on the projector. Let’s also make sure that we set the CI Projector to keep the aspect ratio of our image.


When you do this you should see nothing. What gives?!
If you look back at your custom actor you’ll notice that the diameter of the sphere is currently set to 0. Change that parameter to 0.1, or any other size you choose.


You should now see a dim floating sphere. What gives?!
If you look at the position of your light you’ll notice that it’s currently set to 0,0,0, roughly the same location as the sphere. Let’s move our light so we can see our sphere:

Light 1 X Position to -0.25
Light 1 Y Position to 0.5
Light 1 Z Position to 1


If you’re connected to an external monitor or projector you’ll also want to make sure that you set the render properties to match the resolution of your output device:


There you have it. You’ve now built a custom Quartz patch that you can drive from Isadora.

Bookkeeping

Not all of the work of media design is sexy. In fact, for all of the excitement generated by the flash of stunning video, interactive installations, and large scale projection, there is a tremendous amount of planning and paperwork involved in the process. In the case of a theatrical project, one of the designer’s crucial pieces of work comes in the form of creating a cue sheet. For the uninitiated, a cue sheet is a list of all of the moments in a show that contain media. These documents help the designer, stage manager, and director establish a common understanding about how media / lights / sound are going to be used in a given production.

Drafting a useful cue sheet is often more a matter of preference, but it warrants mentioning some things that can help the media designer get organized as she / he wrestles with a growing list of content to generate. While one could certainly use a word-processing program to generate a cue sheet, I prefer working in a spreadsheet. Excel or Google Spreadsheets are a fine medium for writing cues, and have features that can be extremely helpful.

Cue Sheet Must-Haves

In my opinion there are a few must-have columns in organizing your cue sheet:

The Cue – Whatever you’ve settled on using, numbers or letters or some combination of the two, you need a column that puts the cue name / number in plain sight.

The Page Number – If you’re working from a traditional script, keep track of the page number. At some point you’ll be struggling to find the place in the script where a particular cue is getting called, and knowing the page number can ensure that you stay organized in the planning and technical rehearsal process.

Duration – How long does the cue last? In talking with the director it’s important to have a shared understanding of what’s happening in a given moment in a production. Specifying how long a particular effect or video is going to last can help provide some clarity for the designer as she/he is editing, animating, or programming.

From the Script – What’s in the source material that’s driving you to design a particular look? Did a character say something in particular? Is there any stage direction that’s inspiring your choice? Having a quick reference to what’s inspiring you can be a godsend while you’re crafting the content for a production.

Notes – For all the times you say “I’ll remember,” you will invariably forget something. Write it down. Did the director say something inspiring? Write it down. Did the lighting designer mention something about the amount of ambient light during a particular moment? Write it down. Write down what you’re thinking, or brainstorming. You’re never obligated to keep anything, but having a record of what you’ve been thinking gives you something to start from when you sit down to edit / animate / program.

Shooting Notes – If you’re going to need to record video for a production, make note of what particulars you need to keep in mind at the shoot. Do you need a green screen? A particular lighting effect? A particular costume? Keeping track of what you need for a particular moment is going to make the filming process that much easier.

Checklists. At the end of my cue sheet I keep a checklist for each cue. Three columns that help me keep track of what work I’ve done, and what work is left to be done.

Filmed / Animated – Is this cue filmed or animated?
Edited – Is this footage cut and prepped for the playback system?
Installed – Is this footage installed in the playback system?

Working with a Spreadsheet

Simple Formulas

One of the truly magical parts of working with a spreadsheet is one’s ability to use simple formulas to automate a workflow. A simple example comes out of the show that I’m currently working on, The Fall of the House of Escher. A new work that’s come out of the collaborative process of the 2014 ASU Graduate Acting / Directing / Design cohort, this show is built around a structural focus on giving the audience as much agency as possible. Central to this endeavor is a choose-your-own-adventure model for the production. Given this structure, our production has sections that are distinguished from one another by an alphanumeric prefix – 1A, 2A, 3A, et cetera. Additionally, I want to pair the section code with the number of a particular cue. This, for example, might look like 3A-34 (section 3A, cue 34). This combination of letters and numbers could make renumbering cues, should one change, a difficult process. To simplify this particular problem I can use a simple formula for combining the contents of columns.


First I start by creating separate columns for the section of the show and the cue number. Next I create a third column intended to be a combination of these other two. Here I insert the following formula: =A24&"-"&B24

Google Spreadsheets (or Excel, for that matter) will read this formula as: display the contents of cell A24, insert a “-” symbol, then append the contents of cell B24. This may not seem like a big deal until you consider the time saved when a cue is added or removed, forcing a change in the numbering of all the following cues.
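The same idea is easy to see outside the spreadsheet. Here’s a small Python illustration (the cue data is made up) of why keeping section and number in separate fields makes renumbering painless:

```python
# Hypothetical cue list: section and cue number kept as separate fields
cues = [("3A", 33), ("3A", 34), ("3A", 35)]

# The spreadsheet formula =A24&"-"&B24, expressed in Python
labels = [f"{section}-{number}" for section, number in cues]
# gives "3A-33", "3A-34", "3A-35"

# Inserting a cue mid-show just shifts the numbers; labels rebuild themselves
cues.insert(1, ("3A", 34))
renumbered = [(section, 33 + i) for i, (section, _) in enumerate(cues)]
labels = [f"{section}-{number}" for section, number in renumbered]
# gives "3A-33" through "3A-36" with no hand-editing
```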

Conditional Formatting

Conditional formatting in Google Spreadsheets largely comes in varieties that change the background of a particular cell based on its contents; Excel has a much wider range of possibilities for this kind of implementation. For me, however, the simplicity of automatic color coding is tremendously helpful. For example, let’s consider the final three checklist categories that I talked about earlier. Let’s say that I want every cell that contains the word “No” to be color coded red. To achieve this look, first I’d highlight the cells that I want the rule to apply to.

Next I’d click on the background formatting button in the toolbar and select “Conditional Formatting” from the bottom of the list.


Finally I’d write my simple formula to control the change of color for the cells selected.

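The rule itself is simple enough to sketch in a few lines of Python (the column names and hex color here are hypothetical, not pulled from my actual sheet):

```python
# Sketch of the conditional-formatting rule: any cell whose
# text is "No" gets a red background, everything else is unstyled.
RED, UNSTYLED = "#ff0000", None

def cell_background(value):
    """Mimic the spreadsheet rule: text equal to 'No' turns red."""
    return RED if str(value).strip().lower() == "no" else UNSTYLED

# One hypothetical checklist row from the cue sheet
row = {"Filmed / Animated": "Yes", "Edited": "No", "Installed": "No"}
colors = {column: cell_background(value) for column, value in row.items()}
# "Edited" and "Installed" come back red; "Filmed / Animated" stays plain
```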

Multiple Tabs

Last but not least, maintaining multiple tabs in a workbook saves time and organizational energy. Additionally this allows you to cross reference notes, cells, and thoughts in your workbook. You might, for example, maintain a working cue sheet where you can brainstorm some ideas and be a little less tidy. You can then use simple reference formulas to pull the relevant data into the final cue sheet that you give to the Stage Manager. Should you have the good fortune of having an assistant, you might make a separate page in your workbook to outline her/his responsibilities on a project.

A cleanly organized cue-sheet is far from exciting, but it does ensure that you stay focused and productive as you work.

Escher Image and Animation Research

In thinking about what the media and animation for The Fall of the House of Escher might look like I’ve been sifting through the internet this summer looking for images that abstractly represent the universe and the behavior of particles and waves. Some of the more interesting work that I’ve found uses simple geometric shapes, and particle systems to evoke a sense of scale and distance and perspective. The work of the motion graphics designer Mr. Div is a prime example of someone who makes works that are both simple and also strikingly captivating.

The gif to the right is my attempt at copying his piece “Tri-Heart”. I think the value of copy-art as a practice can’t be overstated. Recreating a work you’ve seen from scratch teaches you more than simply following a tutorial. You are forced to wrestle with questions of how and why, and solve problems that don’t necessarily have clear solutions. While I don’t think this animation, specifically, is going to find its way into Escher, there are qualities of it that I really like and that feel distinctly quantum.

On the other end of the spectrum, in the “just follow along with a tutorial” category, is a fascinating how-to created by minutephysics. Their quick After Effects tutorial covers how to create a simple simulation of the formation of the universe. While it’s not scientifically accurate, the aesthetic conveys the look and feel of much more complex simulations of the universe’s formation. Both of their videos are worth a watch, and the result of the tutorial can be seen here. Again, I don’t know that this exact animation will be something that I use, but it has elements that are inspiring and interesting.

In many respects there is a daunting amount of media in this production – interactive elements, large scale animations, moments of world-shifting perspective, and the expression of the abstract inner space of the atom. There’s a life-time of work in this production, and it goes up in October. There’s lots to do.

House of Escher | Media Design

In December of 2012 I was approached at an ASU School of Theatre and Film party and asked if I would be interested in working on a project that would begin the following semester, and premiere a new work in the Fall of 2013. As this is exactly the kind of opportunity that I came to ASU to pursue, I eagerly agreed to be a part of the project.

Some Background

ASU’s School of Theatre and Film (soon to also include Dance) has a very interesting graduate program for performers and designers. Operating on a cohort-based system, the school admits a group of performers, directors, and designers (Scenic, Costume, and Lighting) every three years. One of the other graduate programs at the school, the one in which I’m enrolled, can enroll students each year. My program, Interdisciplinary Digital Media and Performance (IDM), straddles both the School of Arts, Media, and Engineering and the School of Theatre and Film. Students in my program have pursued a variety of paths, and one skill that’s often included in those various paths is media and projection design for stage productions. Just today, as I was watching the live web-cast of the Xbox One announcement, I was thinking to myself, “some designer planned, created, and programmed the media for this event… huh, I could be doing something like that someday.”

The latest cohort of actors, designers, and directors started in the Fall of 2011, which means that the group is due to graduate in the Spring of 2013. In both the second and third year of the cohort’s program they work to create a newly devised piece that’s performed in one of the theatres on campus at ASU. Occasionally, this group also needs a media designer, and it’s their new show for 2014 that I was asked to be a part of.

The Fall of the House of Escher

Our devising process started with some source material that we used as the preliminary research to start our discussion about what show we wanted to make. Our source materials were Edgar Allan Poe’s The Fall of the House of Usher, M.C. Escher, and quantum mechanics. With these three pillars as our starting point we dove into questions of how to tackle these issues, tell an interesting story, and work to meet the creative needs of the group.

One of our first decisions focused on the structure of the show we wanted to create. After a significant amount of discussion we finally settled on a Choose Your Own Adventure (CYOA) kind of structure. This partially arose as a means of exploring how to more fully integrate the audience experience with the live performance. While it brought significant design limitations and challenges, it was ultimately the methodology the group decided to pursue.

Shortly after this we also settled on a story as a framework for our production. Much of our exploratory conversation revolved around the original Poe work, and it was soon clear that the arc of The Fall of the House of Usher would be central to the story we set out to tell. The wrinkle in this simple idea came as our conversations time and again came back to how Poe and quantum mechanics connect with one another. As we talked about parallel universes and the problems of uncertainty, we decided to take those very conversations as a cue for what direction to head with the production. While one version of the CYOA model takes patrons on the traditional track of Poe’s gothic story, audience members are also free to send our narrator down different dark paths to explore what else might be lurking in the Ushers’ uncanny home. Looking at the photo below you can see where the audience has an opportunity to choose a new direction, and how that impacts the rest of the show.

While this was a fine starting point, we also realized that giving the audience an opportunity to explore only one avenue of possibility in the house felt a little flat. To address that point we discussed a repeated journey through the house in a Groundhog Day-esque style. Each run of the show will send the audience through the CYOA section three times, allowing them the opportunity to see the other dark corners of the house, and learn more about the strange inhabitants of the home. I did a little bit of map-making and mapped out all of the possible paths for our production; that is, all of the possible permutations of the three-legged journey through the house. The resulting map means that there are twelve different possible variations of the production. A challenge, to be sure.

Media and the House

So what’s media’s role in this production? The house is characterized by its Escher-patterned qualities. Impossible architecture and tricks of lighting and perspective create a place that is uncanny, patterned, but also somehow strangely captivating. Just when it seems like the house has shared all of its secrets there are little quantum blips and pulses that help us remember that things are somehow not right, until ultimately the house collapses.

Our host (who spends his/her time slipping between the slices of the various paths the audience tumbles through) is caught as a destabilized field of particles, only sometimes coalesced. The culminating scene is set in a place beyond the normal, a world of quantum weirdness – small as the inside of an atom, and vast as the universe itself. It’s a world of particles and waves, a tumbling peek inside the macro and micro realities of our world that are either too big or too small for us to understand on a daily basis.

Media’s role is to help make these worlds, and to help tell a story grounded in Poe’s original, but transformed by a madcap group of graduate students fighting their way out of their own quantum entanglement.

Phase 2 | Halfway House

Media design is an interesting beast in the theatre. Designers are called upon to create digital scenery, interactive installations, abstract imagery, immersive environments, ghost like apparitions, and a whole litany of other illusions or optical candy. The media designer is part system engineer, part installation specialist, and part content creator. This kind of design straddles a very unique part of the theatrical experience as it sits somewhere between the concrete and the ephemeral. We’re often asked to create site specific work that relates to the geometry and architecture of the play, and at the same time challenged to explore what can be expressed through sound and light. 

One of the compelling components of ASU’s School of Theatre and Film (SoTF) is its commitment to staging new works. In addition to producing works that are tried and true, ASU also encourages its students to create works for the stage. As a part of this commitment the department has developed a three-phase program to serve the process of developing a work for full main-stage production.
  • Phase 1 – Phase one sits between a staged reading and a workshop production of a play. This phase allows the team to focus on sorting out the nuts and bolts of the piece – what is the play / work really addressing, and what are the obstacles that need to be addressed before it moves on to the next stage of production.
  • Phase 2 – Phase two is a workshop production environment. With a small budget and a design team, the production team creates a staged version of the work that operates within strict design constraints. Here the lighting plot is fixed, scenic elements are limited, and media has access to two fixed projectors focused on two fixed screens. This phase is less about the technical aspects of the production, and more focused on getting the work up in front of an audience so that the writer and director have a chance to get some sense of what direction to move next.
  • Phase 3 – Phase three is a full main-stage production of a work. Here the production has a full design team, a larger budget, and far fewer constraints on the implementation of the production.
While productions can skip one of the stages, ideally a work is produced in at least one phase (either one or two) before being put up as a phase three show.
This semester I was selected to be the media designer on call for the two original works slotted in as Phase 2 productions: Los Santos and The Halfway House. These two new works are both written by current ASU playwrights, who are invested in receiving some critical and informative feedback about their work. This process begins with production meetings where directors pitch their visions of the production and start the brainstorming / creating process with the designers. Ultimately, Los Santos decided against using any media for their production. Halfway House, however, did decide that it wanted some media-driven moments in their production.
My role in this process was to work with the director to find the moments where media could be utilized in the production, film and edit the content, and program the playback system for the short run of the production. After reading through the play a few times I met with Laurelann Porter, the director, to talk about how media could be used for this show. Important to the design process was understanding the limitations of the production. In the case of the Phase 2 productions, the projectors and screens are fixed. This limitation is in part a function of reducing the amount of tech-time, as well as limiting the complications imposed by a set and lighting when doing complex projection. Looking at the script I thought the best use of media would be to enhance some of the transition moments in the production. Several of the transitions in the show involve moments where there is action taking place “elsewhere” (this is the language used by the playwright). These moments seemed perfect for media to help illustrate. In meeting with the director we identified the major moments that would benefit from some media presence, and started brainstorming from there.
A large part of the production process is planning and organization. In the case of lighting, sound, and media, designers are tasked with identifying the moments when their mediums will be used, and creating a cue sheet. Cue sheets are essentially a set of discretely identified moments that allow a stage manager to give directions about how the show runs. Media, lights, and sound all have their own board operators (actual humans), and the stage manager gives them directions about when to start or stop a given cue. Creating a cue sheet with this fact in mind helps to ensure that a designer has a working understanding of how to plan the moments that are being created. My process of reading the script looked like this:
  • 1st time through – for the story and arc of the action
  • 2nd time through – identify possible moments for media
  • 3rd time through – refine the moments and start to create a working cue sheet
  • 4th time through – further refinement, label cues, look for problematic moments
After talking with the director and identifying which moments were going to include mediated material, it was time to create a shooting list and plan how to use a single afternoon with the actors to record all of the necessary footage for the show. We had one afternoon with the actors to film the transition moments. I worked with the director to determine a shooting order (to make sure that we used the actors’ time efficiently), and to identify the locations and moments that needed to be captured. From there it was a matter of showing up, setting up, and recording. This transitioned smoothly into the editing process, which was a matter of cutting and touching up the footage for the desired look.

The School of Theatre and Film currently has two show control systems at our disposal: Dataton’s Watchout 4 and TroikaTronix’s Isadora. Given the timing of the Phase 2 productions, I knew that the Isadora machine was going to be available to me for show control. Like MaxMSP, Isadora is a node-based visual programming environment. Importantly, Isadora is truly designed with performance in mind, and has a few features that make it easier to use in a theatrical production environment.

Typically a theatrical production requires additional steps for media that are similar to the lighting process – lensing and plotting, for example. For the Phase 2 productions the shows use a standard lighting and media plot that doesn’t change. This means that there’s little additional work in terms of projector placement, focusing, masking, and the like that I have to do as a designer. For a larger production I would need to create a system diagram that outlines the placement of computers, projectors, cable, and other system requirements. Additionally, I would need to do the geometry to figure out where to place the projectors to ensure that I had a wide enough throw with my image to cover my desired surfaces, and I would need to work with the lighting designer to determine where on the lighting plot there was room for this equipment. This element of drafting, planning, and system design can easily be taken for granted by new designers, but it’s easily one of the most important steps in the process, as it has an effect on how the show looks and runs. With all of the physical components in place and the media assets created, the designer now looks at programming the playback system. In the case of Isadora this also means designing an interface for the operator.
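That projector geometry is mostly simple arithmetic around a lens’s throw ratio – the ratio of throw distance to image width. As a minimal sketch of the calculation (the throw-ratio and distance values here are hypothetical, not the specs of any projector used in these productions):

```python
def throw_distance(throw_ratio: float, image_width: float) -> float:
    """Distance from lens to surface needed to produce an image of a given width."""
    return throw_ratio * image_width

def image_width_at(throw_ratio: float, distance: float) -> float:
    """Image width produced when the projector sits at a given distance."""
    return distance / throw_ratio

# A lens with a 1.5:1 throw ratio needs 15 feet of distance
# to fill a 10-foot-wide surface.
print(throw_distance(1.5, 10.0))   # 15.0

# Hung 24 feet from the surface, a 2.0:1 lens covers a 12-foot-wide image.
print(image_width_at(2.0, 24.0))   # 12.0
```

Working these numbers before hanging anything is what tells you whether a projector position on the lighting plot can actually cover the surface you have in mind.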
One of the pressing realities of designing media for a theatrical installation is the need to create a playback system knowing that someone unfamiliar with the programming environment will be operating the computer driving the media. ASU’s operators are typically undergraduate students who may or may not be technical theatre majors. In some cases an operator may be very familiar with a given programming interface, while others may never have run media for a show. Theatre in an educational institution is a wonderful place for students to have an opportunity to learn lots of new tools and get their feet wet with a number of different technologies. In this respect I think it’s incumbent upon the designer to create a patch with an interface that’s as accessible as possible for a new operator. In my case, each moment in the show where there is media playing (a cue) has a corresponding button that triggers the start, playback, and stop of the given video.
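The logic behind that kind of operator interface boils down to very little: the operator only ever triggers the next cue, and the system keeps track of where it is in the show. This isn’t the Isadora patch itself, just a rough Python sketch of the idea (cue labels and file names are made up for illustration):

```python
from dataclasses import dataclass

@dataclass
class Cue:
    label: str   # what the stage manager calls ("Media Q1 ... go")
    clip: str    # the video file this cue plays

class CueStack:
    """Steps through cues in show order, so the operator needs only one action."""

    def __init__(self, cues):
        self.cues = list(cues)
        self.index = -1  # before the first cue

    def go(self):
        """Trigger the next cue; returns None once the stack is exhausted."""
        if self.index + 1 >= len(self.cues):
            return None
        self.index += 1
        return self.cues[self.index]

# Hypothetical show file: two transition videos.
stack = CueStack([Cue("Q1", "transition_1.mov"), Cue("Q2", "transition_2.mov")])
print(stack.go().label)  # Q1
print(stack.go().clip)   # transition_2.mov
print(stack.go())        # None – no cues left
```

Whatever the environment, the design goal is the same: the operator shouldn’t need to know anything about the patch beyond which button corresponds to which cue on the stage manager’s sheet.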

Media is notoriously finicky in live performance. It can be difficult to program, get washed out by stage lights, perform poorly if it’s not encoded properly, or suffer any of a host of other possible problems. In the case of Half Way House, the process went very smoothly. The largest problem had more to do with an equipment failure that pushed back installation than with the editing or programming process. While this is a simple execution of using media in a production, it was valuable for a number of the individuals involved in the process – the director, lighting designer, sound designer, and stage manager, to name only a few. There are large questions in the theatre world about the role of media in production – is it just fancy set dressing? How is it actively contributing to telling the story of the show? Is it worth the cost? Does it have a place in an idiom largely built around the concept of live bodies? And the list goes on. I don’t think that this implementation serves to address any of those questions, but for the production team it did start the process of demystifying the work of including media in a production, and that’s not nothing.

Tools Used
Programming and Playback – Isadora | TroikaTronix
Projector – InFocus HD projector
Video Editing – Adobe After Effects, Adobe Premiere
Image Editing – Adobe Photoshop
Filming / Documentation – iPhone 4S, Canon 7D, Zoom H4n
Editing Documentation – Adobe Premiere, Adobe After Effects

Sparrow Song | Drawing with Light

One of the effects that I’ve used in two productions now is one where lines appear to draw themselves in over time in a video. This effect is fairly easy to generate in After Effects, and I wanted to take a quick moment to detail how it actually works.

This process can start in many ways. For Sparrow Song it started by connecting a laptop directly to the projectors being used, and using Photoshop to map light directly onto the set. You can see in the photo to the right that each surface that’s intended to be a building has some kind of drawn-on look. In Photoshop each of these buildings exists as an independent layer. This makes it easy to isolate effects or changes to individual buildings in the animation process.

Here’s a quick tutorial about how I animated the layers to create the desired effect:

Now that I’ve done this several times it finally feels like a fairly straightforward process – even if it can be a rather time consuming one.

Here’s an example of what the rendered video looks like to the playback system.

Here’s an album of documentation photos from the closing show.

Tools Used
Digital Drawing Input – Wacom intuos4
Mapping and Artwork – Adobe Photoshop
Animation and Color – Adobe After Effects
Photos – Canon EOS 7D
Photo Processing – Adobe LightRoom 4

What is a Media Designer

It’s difficult to describe what exactly one does as a Media Designer. Prior to this first semester of graduate work I would have likely explained that the work of a media designer is centered around creating artwork that in some way represents and supports the world of the play. I also may well have said that a Media Designer is a person who works to erase the boundaries between the set and projection. A designer who works with light that isn’t light in the strictest theatrical sense, a scenic designer who doesn’t work with sets in the strictest theatrical sense. 

Now, after assisting on a production at Arizona State University, I might be more hesitant to describe a Media Designer the same way. In many ways the work is so much larger than I could have imagined. In hindsight I’d say that my vision was limited by the lack of a full understanding of the challenges, obstacles, and options. These things are difficult to understand abstractly, and only truly become obvious when they are manifest around a particular issue that needs to be solved.


It is interesting that I would have imagined that the primary work of the designer would be in the creation of the artwork itself. This is a beautiful and romantic idea, but in reality the work of creating content is only a small part of this particular role. Instead, central to the successful implementation of a mediated space, the designer is challenged to resolve issues of how to cover surfaces with projection (read as: using angular geometry to calculate the position of projectors), designing an interconnected system of computers and projectors to realize an artistic vision, ensuring consistent playback (read as: determining the most advantageous use of playback systems in regards to questions of stability and ease of use for an operator), programming said media systems, mapping the geometry of surfaces for projection, blending and masking the edges of projection, and of course creating the artwork that fills the surfaces.

This is complicated, of course, by the fact that it’s difficult to see these as independent variables. Instead many of them exist as dependent variables – each problem and solution nested in or connected to countless others. Each solution or problem comes with consequences that are difficult to anticipate your first time around.

All of that is to say that this particular work is more than meets the eye, which stands as a rather ironic statement given how much actually does meet one’s eye. It’s not unusual for the invisible challenges of a particular role to be larger than one expects, but the challenges I continually find myself facing are ever more surprising in their dependency on mathematics and computation for their solution. I suppose it should come as no surprise that mathematics would be a useful tool in regards to operating a computer, but it has surprised me that understanding the computational nature of these challenges provides the best insight into their solutions.

This has truly been a whirlwind of an experience. I think I’ve learned more in working on this production than I could have learned in any class on the subject. Educationally, that makes for a compelling case in favor of project-based learning. That said, in terms of assessment it’s difficult to capture and measure what precisely I’ve learned. Further, it’s difficult to determine outright whether these problem-solving skills will transfer to other domains or even other similar projects. I suppose in that sense I’ll be my own longitudinal study.

Though, in truth that’s really what education is for any individual. That’s what education is trying so desperately to unpack, to standardize, to measure, and to reproduce. Does solving problems in real-world scenarios with high stakes make for the best learning? Sometimes. Does abstract conceptual exploration and investigation work to make creative thinkers? Sometimes. At least in this case I got to make some art out of it.