
HOW THE INTERACTIVE SYSTEM WORKS


Inputs and Outputs


The heart of the system is the program built inside TouchDesigner; this controls everything. In a nutshell, the network is fed inputs that are transformed into other forms of outputs.


The system is fed different types of data (such as hand gestures and sound); this data is then processed inside the network and reshaped into other forms of data or media (such as visuals, MIDI, DMX, etc.).


Example 1:

A particular hand gesture (input) can be programmed to send a midi trigger (output) to Ableton Live that plays a chord.


Example 1 is one of the most common input/output modules in the system: it simply takes an input and sends out data to control parameters inside Ableton Live. Data processing can become much more complex, however. Endless possibilities can be programmed inside the system, even feedback loops that run in and out of it.
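
As a rough sketch of what such a module boils down to, here is a plain Python version using the mido library rather than the actual TouchDesigner network; the port name, gesture flag and chord notes are placeholders:

```python
import mido

# Assumptions: a virtual MIDI port routed into Ableton Live with this name,
# and a gesture-detection step elsewhere that sets `gesture_detected`.
PORT_NAME = "TD to Ableton"      # placeholder port name
CHORD = [60, 64, 67]             # example payload: a C major triad

def play_chord(outport, notes, velocity=100):
    """Send note-on messages for every note of the chord."""
    for note in notes:
        outport.send(mido.Message('note_on', note=note, velocity=velocity))

outport = mido.open_output(PORT_NAME)
gesture_detected = True          # stand-in for the real gesture-recognition result
if gesture_detected:
    play_chord(outport, CHORD)
```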


Example 2:

A different hand gesture (input) sends MIDI data (output) to Ableton Live that controls a filter sweep; the filtered sound data (input) can then be sent back to the system to control the brightness of the visuals (output).


In Example 2, the first input creates a variation in the sound, which is then fed back into the system to control the visuals.
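
A minimal sketch of the return half of that loop, assuming the filtered sound comes back as a normalised 0–1 level (e.g. an RMS measurement); the brightness range is an arbitrary choice for illustration:

```python
def level_to_brightness(rms_level, floor=0.1, ceiling=1.0):
    """Map a 0..1 audio level to a visual brightness value.

    A small floor keeps the visuals from going fully dark between notes.
    """
    level = max(0.0, min(1.0, rms_level))
    return floor + (ceiling - floor) * level

# Example: a loud passage (level 0.8) pushes brightness close to maximum.
print(round(level_to_brightness(0.8), 2))  # -> 0.82
```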



Types of Inputs to the Network


Motion: one of the main sources of input is the Inertial Measurement Unit (IMU) embedded inside the glove to read motion. It measures acceleration and angular velocity, which are then processed to manipulate other parts of the system as required. More detail on how the sensor works can be found below.


Sound: this type of data can be an input as well as an output. Sound as an input can control any part of the system.


MIDI Messages: MIDI data can be sent from Ableton Live into TouchDesigner. In our case, MIDI input will mainly be used to set the configurations to their appropriate values during different parts of the performance.
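
As an illustration of that idea (the note numbers and configuration values below are hypothetical, not the ones used in the performance):

```python
# Hypothetical mapping from incoming MIDI note numbers (sent by Ableton Live)
# to configuration presets for different parts of the performance.
CONFIGURATIONS = {
    36: {"visual_scene": "intro",  "trigger_threshold": 15.0},
    37: {"visual_scene": "verse",  "trigger_threshold": 20.0},
    38: {"visual_scene": "finale", "trigger_threshold": 12.0},
}

def on_midi_note(note_number, current_config):
    """Switch to the configuration assigned to this note, if any."""
    return CONFIGURATIONS.get(note_number, current_config)

config = on_midi_note(37, current_config={})
print(config["visual_scene"])  # -> "verse"
```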


Location: through trilateration or computer vision, location data for the performer on stage is produced.
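
For the trilateration case, a small sketch of the underlying maths: three known anchor positions and three measured distances give two linear equations that can be solved for the performer's 2D position. The anchor layout and distances below are made up for the example:

```python
def trilaterate_2d(anchors, distances):
    """Estimate a 2D stage position from three fixed anchors and the
    measured distance to each (e.g. from radio/ultrasonic ranging).

    anchors: [(x1, y1), (x2, y2), (x3, y3)], distances: [r1, r2, r3]
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = distances

    # Subtracting the circle equations pairwise gives a linear 2x2 system.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2

    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-9:
        raise ValueError("Anchors are collinear; position is ambiguous.")
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# Anchors at three stage corners (metres); distances chosen so the answer is (2, 1).
x, y = trilaterate_2d([(0, 0), (6, 0), (0, 4)], [5**0.5, 17**0.5, 13**0.5])
print(round(x, 3), round(y, 3))  # -> 2.0 1.0
```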



Types of Outputs from the Network


Visuals: one of the main components of the project.


MIDI Notes: these are sent into Ableton Live to play notes or chords and to trigger clips or samples.


Modulations: through TDAbleton's MIDI Mappers, modulations can be sent to control any desired parameter inside Ableton Live. The most common one so far is controlling the filter cutoff.
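
Inside the system this mapping is handled by TDAbleton, but in essence it amounts to scaling a normalised value into a MIDI control change. A stand-alone sketch with mido (CC 74 is a conventional cutoff controller, assumed here):

```python
import mido

FILTER_CUTOFF_CC = 74  # common "brightness/cutoff" controller number; an assumption here

def sensor_to_cutoff_cc(sensor_value):
    """Scale a 0..1 sensor value to a 0..127 MIDI control-change message."""
    value = max(0.0, min(1.0, sensor_value))
    return mido.Message('control_change', control=FILTER_CUTOFF_CC,
                        value=int(round(value * 127)))

# Example: a half-raised hand (0.5) becomes CC value 64.
print(sensor_to_cutoff_cc(0.5))
```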


Spatial Sound Location: also through TDAbleton's MIDI Mappers, a desired position in 3D space is output; this is used to move the sound around.

DMX: DMX data is output to control light fixtures.


How the Sensor Works


Triggers


One of the main programmed functions of the sensor is to send a trigger whenever a parameter value surpasses a set threshold. To name a few examples, these triggers can launch clips, notes or chords inside Ableton Live via MIDI, trigger visual cues in TouchDesigner and/or Resolume Arena via OSC messages, set parameters to desired values in TouchDesigner, and do anything else a button can be programmed to do.


There are 12 raw triggers: 6 parameter values on the accelerometer and another 6 on the gyroscope:


Accelerometer Triggers:

  1. Positive Acceleration x

  2. Negative Acceleration x

  3. Positive Acceleration y

  4. Negative Acceleration y

  5. Positive Acceleration z

  6. Negative Acceleration z


Gyroscope Triggers:

  1. Positive Gyroscope x

  2. Negative Gyroscope x

  3. Positive Gyroscope y

  4. Negative Gyroscope y

  5. Positive Gyroscope z

  6. Negative Gyroscope z


Example 1:

A punch gesture whose acceleration (e.g. 20 m/s², metres per second squared) exceeds the threshold will send a trigger on the +x acceleration; this could launch a selected sample or play a set of notes.
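
A sketch of the raw-trigger logic described above, with placeholder thresholds (the real values are tuned per gesture):

```python
THRESHOLDS = {"accel": 20.0, "gyro": 250.0}  # placeholder values (m/s^2, deg/s)

def raw_triggers(accel, gyro):
    """Return the triggers fired by one IMU sample.

    accel and gyro are dicts like {"x": ..., "y": ..., "z": ...}.
    Each axis yields a positive and a negative trigger, 12 in total.
    """
    fired = []
    for name, sample, threshold in (("accel", accel, THRESHOLDS["accel"]),
                                    ("gyro", gyro, THRESHOLDS["gyro"])):
        for axis, value in sample.items():
            if value > threshold:
                fired.append(f"+{axis} {name}")
            elif value < -threshold:
                fired.append(f"-{axis} {name}")
    return fired

# A forward punch: strong +x acceleration fires the "+x accel" trigger.
print(raw_triggers({"x": 22.0, "y": 1.0, "z": 0.5},
                   {"x": 10.0, "y": 5.0, "z": 0.0}))  # -> ['+x accel']
```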






Orientation Data


Another type of data extracted from the IMU sensor is orientation data. It is divided into three variables: roll, pitch and yaw. In simple terms, these are the rotations around the x, y and z axes respectively. This data can be mapped to control any parameter in our system.


Example 1:

When the performer raises her hand, pointing upwards, she could be controlling the filter cutoff of a synthesizer, or any other sound parameter.
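
As a simplified illustration (not the sensor's own fusion algorithm): roll and pitch can be estimated from the gravity vector when the hand is roughly still, and pitch can then be scaled to a 0–1 cutoff amount. The axis orientation depends on how the sensor is mounted in the glove, so the numbers below are only indicative:

```python
import math

def tilt_from_accel(ax, ay, az):
    """Rough roll/pitch estimate (degrees) from gravity when the hand is
    close to still; yaw needs the gyroscope/magnetometer and is omitted here."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    return roll, pitch

def pitch_to_cutoff(pitch_deg):
    """Map pitch from -90..+90 degrees to a 0..1 cutoff amount:
    hand pointing up -> filter fully open."""
    return max(0.0, min(1.0, (pitch_deg + 90.0) / 180.0))

# Hand pointing straight up (gravity along -x in this assumed mounting): cutoff ~1.0
roll, pitch = tilt_from_accel(-9.81, 0.0, 0.0)
print(round(pitch_to_cutoff(pitch), 2))  # -> 1.0
```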




Acceleration and Gyroscopic (Angular Velocity) Data


Acceleration and gyroscopic data (not to be confused with the acceleration and gyroscope triggers mentioned above) are also used to control selected parameters of sound and visuals. In essence, this is how fast and/or hard the performer is moving her hand (acceleration, measured in metres per second squared) or how fast she is rotating her hand (gyroscopic data, measured in degrees per second).


Example 1:

When she moves (accelerates) her hand at a certain speed along the x-axis, we increase the release of a note, making it ring out longer.
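
A sketch of that mapping, with an assumed full-scale acceleration and release range:

```python
def accel_to_release(accel_x, min_release=0.05, max_release=2.0, full_scale=30.0):
    """Map the magnitude of x-axis acceleration (m/s^2) to a note release
    time in seconds: faster/harder movement -> longer release."""
    amount = min(abs(accel_x) / full_scale, 1.0)
    return min_release + amount * (max_release - min_release)

print(round(accel_to_release(15.0), 3))  # half of full scale -> 1.025 s
```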



Mixing and Matching Data through Mathematics


By using mathematical functions, all of the data mentioned above can be mixed together. We don't have to use only the acceleration on a single axis to control a selected parameter; we can, for example, add all the accelerations together, or divide the x acceleration by the x gyroscopic data. By combining different data streams with each other, more dynamic outputs can be created.
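
A few examples of such combinations in plain Python (the specific mixes are illustrative, including a guard against dividing by zero when the hand isn't rotating):

```python
def combined_controls(accel, gyro, eps=1e-6):
    """Derive new control signals by mixing raw IMU channels."""
    total_accel = accel["x"] + accel["y"] + accel["z"]  # sum of all accelerations
    # Ratio of x acceleration to x rotation speed, guarded against division by zero.
    accel_over_spin = accel["x"] / (gyro["x"] if abs(gyro["x"]) > eps else eps)
    # Overall strength of the movement (vector magnitude of the acceleration).
    accel_magnitude = (accel["x"] ** 2 + accel["y"] ** 2 + accel["z"] ** 2) ** 0.5
    return {"total_accel": total_accel,
            "accel_over_spin": accel_over_spin,
            "accel_magnitude": accel_magnitude}

print(combined_controls({"x": 3.0, "y": 4.0, "z": 0.0},
                        {"x": 90.0, "y": 0.0, "z": 0.0}))
```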

