This summer, as I’m thinking about future production work for the coming years, I’ve started to consider what kind of live data I want to be able to use in the context of live performance. To that end, one of the more interesting sensors worth examining is the accelerometer. While there are lots of ways to work with these sensors, a simple way to get started is to use an iPhone or iPod touch to broadcast its data over a wireless network. In a project that I tackled in the Spring of 2013, I used this technique when working with TouchDesigner on a piece of installation art.
In continuing to think about (and work with) accelerometer data it’s important to look at what information you’re actually getting from the sensor. I’ve started exploring this process by creating a simple Isadora patch to visualize this flow of information, and to document how the data corresponds to the motion of an iPod touch or iPhone.
TouchOSC to Isadora
To start we need to first capture the OSC data with Isadora. I’m using TouchOSC for this process. With TouchOSC installed on your mobile device, you first need to make sure that you’ve identified your computer’s IP address and chosen a port number that corresponds to Isadora’s incoming OSC port number (found in Isadora’s preferences under Midi / Net). Next we need to ensure that TouchOSC is broadcasting the accelerometer data. You can do this by first tapping on the “Settings” icon in the upper corner of the TouchOSC panel.
From here, select “Options.”
Under options we need to make sure that “Accelerometer (/xyz)” is enabled.
Back in Isadora, provided that our IP address and ports are correct, we now only need to set up the communications stream properties. To do this we’ll start by clicking on the “Communications” menu item and selecting “Stream Setup” from the drop-down list.
First we need to make sure that we’ve selected the “Open Sound Control” stream, then clicked the “Auto-Detect Input” box. At this point you should now see three live data points under the column labeled “data.” In order to be able to work with this data in the Isadora patching window we need to make sure that we’ve clicked the box to enable the port, and changed the number from 0 to 1. Once this is all set we can click on the “ok” button and start placing some actors.
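If you ever want to sanity-check what TouchOSC is actually sending before (or instead of) pointing it at Isadora, a few lines of Python will do it. This is just a rough sketch outside of Isadora, assuming the python-osc package is installed and using 1234 as a stand-in for whatever port you’ve chosen; only one application can listen on a port at a time, so quit the script before testing the patch.

```python
# Minimal sketch for verifying that TouchOSC is broadcasting accelerometer data.
# Assumes the python-osc package; 1234 is a placeholder for your chosen port.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def print_accel(address, x, y, z):
    # TouchOSC sends the accelerometer as a single "/accxyz" message
    # carrying three floats: one per axis.
    print(f"{address}: x={x:.3f} y={y:.3f} z={z:.3f}")

dispatcher = Dispatcher()
dispatcher.map("/accxyz", print_accel)

# Listen on all network interfaces; Isadora would normally be the receiver,
# so this is only for a quick check of the data stream.
server = BlockingOSCUDPServer(("0.0.0.0", 1234), dispatcher)
server.serve_forever()
```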
The Big Picture
First, let’s take a look at the full patch and talk through what our next steps are going to be.
In order to have a fuller sense of what the accelerometer data is doing, we’re going to create a simple visualization. We’ll do this by creating a small oval for each data stream, and map their change to the vertical axis, while mapping a constant change to the horizontal axis. We’ll then set up a simple screen refresh so we can see a rough graph of this data over time that will automatically be cleared each time our visualization wraps around the screen.
This approach should allow us to see how the three data streams from the accelerometer correlate to one another, as well as give us a chance to consider how we might work with this kind of data flow inside of an Isadora patch. We’ll also have a chance to look at how to purposefully erase the stage background in Isadora, and how to create a simple system to automate that process.
Changes over Time
With our communications stream set up and TouchOSC broadcasting to Isadora, we can now think about how to visually represent this information over time. We’re going to start this process with the following flow of actors:
OSC Listener – Smoother – Shapes – Projector
Let’s first look at the OSC Listener and the Smoother actors. Once we add the OSC Listener we need to change the “channel” value to “1.” You may remember that in our communications stream we were seeing the data from TouchOSC labeled as “/accxyz” with live values represented as (float, float, float). As we add listener actors we’ll notice that those floats correspond to channels 1, 2, and 3. This allows us to isolate each value without having to do any additional parsing of the data. You’ll also notice that I’ve attached the Listener to a Smoother actor. The accelerometer in your iPod touch or iPhone is extremely sensitive, so you may end up with some unwanted noise in your data stream. We’re going to use the Smoother to even out that noise. Connect the value outlet of the OSC Listener actor to the “value in” input on the Smoother actor. I’ve changed the smoothing value to 0.9 and set the frequency to 100 Hz. Your mileage may vary here, so keep in mind that you may need to adjust these values once you’ve got your whole patch up and running.
In order to keep the values in a range that we can display on our stage, we’re going to apply a little bit of scaling to the value out from the Smoother actor. To do this, click on the text “value out” on the Smoother actor, and change the limit min value to -2.5 and the limit max value to 2.5. This will constrain our values slightly and ensure that any change we see from the accelerometer is visible on the stage.
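If it helps to think about this smoothing-and-limiting chain in more familiar terms, here’s a rough Python sketch of it. I’m assuming the Smoother behaves like a simple exponential smoother; Isadora’s actual implementation may differ, so treat this as a conceptual picture rather than a spec.

```python
def smooth(previous, raw, smoothing=0.9):
    # One step of exponential smoothing: keep most of the previous value and
    # mix in a little of the new reading to suppress sensor noise. In the
    # patch this update runs at roughly the Smoother's 100 Hz frequency.
    return smoothing * previous + (1.0 - smoothing) * raw

def limit(value, low=-2.5, high=2.5):
    # Conceptually what setting limit min / limit max on "value out" does.
    return max(low, min(high, value))

# Example: run a made-up noisy stream of readings for one axis through the chain.
value = 0.0
for raw in [0.1, 0.9, -0.4, 3.2, 0.2]:
    value = smooth(value, raw)
    print(limit(value))
```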
Next we’ll connect the “value out” of the Smoother actor to the “vert pos” (vertical position) inlet on a Shapes actor. On the Shapes actor you’ll also want to change the size and the kind of shape. I’ve set mine to be an oval with a height of 2 and a width of 2. Next, connect the video output of the Shapes actor to the video input of a Projector actor, and we should finally be able to see something for all of our hard work.
Now in the center of the stage we should see an oval that moves up and down in relation to the movement of our iPod touch. I’ve added the video from my camera so you can see the movement of the iPod touch in relation to the movement of the oval.
While we’ve made some good progress, we’re still only seeing one axis of data from the accelerometer, and on top of that our oval is stuck in the center of the screen. Next we’ll tackle these problems so that we can see all three axes change over time.
First, let’s start by taking the actor string we’ve just created and duplicating it twice. Select the whole string:
OSC Listener – Smoother – Shapes – Projector
copy it, and paste it twice. This should give us three of these strings. Next change the channel numbers on the second and third strings to be 2 and 3 respectively.
At this point you may also wish to give each string its own color so you can differentiate between the axes of the accelerometer.
Let’s add some horizontal motion to our ovals. We’ll do this by adding a wave generator actor. Change the wave type to saw-tooth and the frequency to 0.1 Hz. Next, connect the wave generator’s value outlet to the “horz pos” (horizontal position) inlets of the three Shapes actors.
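For the curious, here’s roughly what that saw-tooth drive boils down to, assuming the wave generator ramps from 0 to 1 and then wraps (the actual output range and scaling of Isadora’s wave generator may differ):

```python
import time

FREQUENCY = 0.1  # Hz, matching the wave generator in the patch

def sawtooth(t, frequency=FREQUENCY):
    # A ramp from 0.0 to 1.0 that wraps once per period; at 0.1 Hz each
    # sweep across the stage takes 10 seconds.
    return (t * frequency) % 1.0

# Print a few samples of the ramp to see it creep upward over time.
start = time.time()
for _ in range(5):
    print(f"horz pos: {sawtooth(time.time() - start):.3f}")
    time.sleep(0.5)
```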
You should now see the ovals representing all three axes moving across the screen. At this point we’ve connected our iPod touch’s accelerometer to Isadora, given each axis its own shape, mapped the change in accelerometer data to vertical position, and mapped the horizontal position of our ovals to a single wave generator so we can see the change in accelerometer data over time.
Our next steps here should help us see this information a little more clearly. Before we start, we first need to know a little bit more about one of the actors in Isadora. The Stage Background actor allows us to play with some interesting feedback effects in Isadora. What’s useful for us today is the value that controls erasing the screen. By leaving this value turned off we will be able to give our ovals trails. That said, we don’t want to turn off screen erasing entirely, otherwise we’ll quickly lose track of our ovals as they wrap from one side of the screen to the other. In order to solve this problem we’ll use a couple of comparator actors, an envelope generator, our wave generator, and a few trigger values.
I’m sure there’s a more elegant approach to this challenge; this, however, was my 30-minute solution. First, I wanted to be able to see the trails of the ovals we’ve just created. I also wanted these trails to disappear as the ovals wrapped from the right side of the stage to the left. While you could do this by adding another wave generator, the simplest solution (in my opinion) is to use the same wave generator that’s driving the horizontal movement of the ovals. Essentially, the nuts and bolts of my approach are this: use the wave generator driving the horizontal movement of the ovals to toggle the screen erase function from off to on and back to off.
To do this we’ll complete the following steps (there’s a rough code sketch of this logic right after the list):
- First create a comparator actor and connect the value outlet from our wave generator to the “value 1” inlet on the comparator actor. Next ensure that the comparison method is changed to lt (less than). Now we’ll change “value 2” on the comparator to 0.1. This means that when our wave generator is publishing a value of less than 0.1 to our comparator, we’ll see a “true” trigger.
- Next we’ll create an envelope generator. Connect our comparator’s “true” outlet to the “trigger” inlet of the envelope generator. Change the envelope generator to have two segments, and set both Rate 1 and Rate 2 to 0.25 seconds.
- Now create another comparator actor. Connect the “output” value from the envelope generator to the “value 1” inlet of the new comparator actor. Leave the comparison method set to eq (equal), and leave “value 2” at 0.
- Create two trigger value actors. One trigger value should be set to 0, while the other should be set to 1. Connect the comparator actor’s true and false outlets to the 0 and 1 trigger values respectively.
- Finally, create a stage background actor. Connect both outlets from the trigger values to the “erase” inlet on the stage background actor.
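Before we look at the finished patch, here’s the rough code sketch of that logic I mentioned above. It isn’t anything you’d run inside Isadora, just a plain-Python way of describing the comparator / envelope / trigger chain; the triangle-shaped envelope is my stand-in for whatever curve the envelope generator actually produces.

```python
# Conceptual sketch of the erase-toggle logic: each time the saw-tooth wave
# wraps back below 0.1, fire a short (~0.5 s, two 0.25 s segments) envelope.
# While that envelope is non-zero the stage erases itself (erase = 1); the
# rest of the time erasing stays off (erase = 0) so the ovals leave trails.
import time

FREQUENCY = 0.1          # saw-tooth frequency driving horizontal position
ENVELOPE_LENGTH = 0.5    # two segments at 0.25 seconds each

def sawtooth(t):
    return (t * FREQUENCY) % 1.0

def envelope_value(t, fired_at):
    # Rough stand-in for the envelope generator: rises then falls over 0.5 s,
    # and sits at 0 the rest of the time.
    if fired_at is None or t - fired_at > ENVELOPE_LENGTH:
        return 0.0
    phase = (t - fired_at) / ENVELOPE_LENGTH
    return 1.0 - abs(2.0 * phase - 1.0)   # up for 0.25 s, back down for 0.25 s

fired_at = None
was_below = False
start = time.time()
while time.time() - start < 12:           # watch a little more than one sweep
    t = time.time() - start
    below = sawtooth(t) < 0.1             # first comparator: value < 0.1
    if below and not was_below:           # "true" fires as the wave wraps
        fired_at = t
    was_below = below

    env = envelope_value(t, fired_at)
    erase = 0 if env == 0.0 else 1        # second comparator + trigger values
    print(f"t={t:5.2f}  saw={sawtooth(t):.2f}  env={env:.2f}  erase={erase}")
    time.sleep(0.25)
```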
All of that should look like this:
From here you can start to play with your accelerometer and you’ll be able to see a visual representation of the values coming out of the sensor.