HYENA DAYS
Michael Saup / Steina Vasulka
(BURST MODE VERSION)
MIDI-CONTROLLED AND SOUND-CONTROLLED IMAGE PRODUCTION
Featuring:
Steina Vasulka – violin
Michael Saup – guitar
By converting acoustic signals from analog to digital and interpreting their measured values, the musicians and their instruments can interface with image-generating electronic devices such as 3D graphics workstations, laserdisc players, or hard disk recorders—transforming acoustic events into dynamic visual experiences.
The acoustic output signals are analyzed during A/D conversion and interpreted by a control computer, which makes algorithmic decisions and controls additional equipment:
Via MIDI impulses, various sound-producing machines—such as samplers, effects processors, and mixing consoles—are triggered.
Through a parallel interface, a laserdisc player is supplied with control impulses, while a workstation is addressed via RS-232, enabling both to respond in real time to the musicians’ interactions.
Musicians can thus use a wide range of acoustic parameters—such as pitch or volume—as triggers for visual choreographies. The resulting imagery is displayed on video monitors via a video mixer.
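The chain can be pictured with a small C sketch: one block of A/D samples is reduced to an amplitude value and scaled into a MIDI message. The analysis, the sample block, and the controller mapping are illustrative stand-ins, not the actual control software.

```c
/* Minimal sketch of the control chain described above: an analyzed
 * amplitude value is scaled to the MIDI range and packed into a
 * control-change message.  The signal source and the mapping are
 * placeholders; the original system ran on dedicated hardware. */
#include <stdio.h>
#include <math.h>

/* RMS amplitude of one block of A/D samples (16-bit signed). */
static double block_rms(const short *buf, int n)
{
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += (double)buf[i] * buf[i];
    return sqrt(sum / n);
}

int main(void)
{
    short block[256];
    /* Stand-in for the A/D converter: a decaying sine burst. */
    for (int i = 0; i < 256; i++)
        block[i] = (short)(20000.0 * sin(i * 0.2) * exp(-i / 128.0));

    /* Scale RMS (0..32767) to a 7-bit MIDI value (0..127). */
    int value = (int)(block_rms(block, 256) / 32767.0 * 127.0);
    if (value > 127) value = 127;

    /* MIDI control change on channel 1, controller 7 (volume). */
    unsigned char msg[3] = { 0xB0, 0x07, (unsigned char)value };
    printf("MIDI: %02X %02X %02X\n", msg[0], msg[1], msg[2]);
    return 0;
}
```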
MANIPULATING EDIT LISTS
My first step in controlling video images through music or audio signals was the creation of the program XTRA.TRAX (1989–1990), which transformed STANDARD MIDI FILES (a binary format that can be imported and exported by most software sequencers) into edit lists for a SONY 910 video editor.
Musical frequency contours could be simulated by the dynamic motion control of a Betacam SP video recorder, and volume translated into video saturation levels via the picture mixer. The software allowed music to be transformed rapidly and automatically into video imagery, handling diverse parameters such as image selection, transitions, and wipes.
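As a rough sketch of the idea, assuming a CMX-style event layout, a fixed tempo, and a PPQN resolution (none of which are documented for XTRA.TRAX), a note's tick position can be converted into the in/out timecodes of an edit list entry:

```c
/* Hypothetical sketch of the XTRA.TRAX idea: a MIDI note event
 * (tick position and duration) becomes the in/out timecode pair of
 * an edit list entry.  Tempo, PPQN, and the edit list layout are
 * assumed values, not the program's actual format. */
#include <stdio.h>

#define PPQN   96      /* MIDI ticks per quarter note (assumed) */
#define TEMPO  120.0   /* beats per minute (assumed)            */
#define FPS    25      /* PAL video frame rate                  */

/* Convert a MIDI tick position to a video frame count. */
static long ticks_to_frames(long ticks)
{
    double seconds = (double)ticks / PPQN * 60.0 / TEMPO;
    return (long)(seconds * FPS + 0.5);
}

static void print_timecode(long frames)
{
    printf("%02ld:%02ld:%02ld:%02ld",
           frames / (3600L * FPS), frames / (60L * FPS) % 60,
           frames / FPS % 60, frames % FPS);
}

int main(void)
{
    /* One note: starts at tick 384 (beat 4), lasts one beat. */
    long in  = ticks_to_frames(384);
    long out = ticks_to_frames(384 + PPQN);

    printf("001  SRC  V  C  ");   /* event no., source, video, cut */
    print_timecode(in);  printf(" ");
    print_timecode(out); printf("\n");
    return 0;
}
```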
By separately translating individual tracks—e.g., bass drum, violin, piano—the resulting edit sequences could be reassembled in sync using a device like the Harry.
At the same time, the program generated a 3D Wavefront model from the analyzed MIDI data, allowing for animation synchronized with the music.
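A minimal sketch of such a model, assuming a mapping of time, pitch, and velocity to the three axes (the original geometry is not documented), could write Wavefront OBJ data directly:

```c
/* Sketch of turning analyzed MIDI notes into a Wavefront model:
 * each note becomes a vertex (time, pitch, velocity), and
 * consecutive vertices are joined into polylines.  The mapping is
 * an assumption for illustration only. */
#include <stdio.h>

struct note { long tick; int pitch; int velocity; };

int main(void)
{
    struct note track[] = {           /* stand-in for parsed MIDI data */
        {   0, 60, 100 }, {  96, 64,  90 },
        { 192, 67, 110 }, { 288, 72, 127 },
    };
    int n = sizeof track / sizeof track[0];

    /* Wavefront OBJ: 'v x y z' vertices, 'l a b' polylines. */
    for (int i = 0; i < n; i++)
        printf("v %.3f %.3f %.3f\n",
               track[i].tick / 96.0,          /* x: time in beats    */
               track[i].pitch / 12.0,         /* y: pitch in octaves */
               track[i].velocity / 127.0);    /* z: dynamics         */
    for (int i = 1; i < n; i++)
        printf("l %d %d\n", i, i + 1);        /* OBJ indices are 1-based */
    return 0;
}
```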
The disadvantage was the non-interactive nature of MIDI file preparation; the advantage was the speed and complexity of music-driven image generation, which would be difficult to achieve manually.
The background visuals for TILT IT were produced using this method and overlaid with 3D computer animation based on a recorded guitar solo. The guitar tone—converted via A/D—was transformed into a 3D object using a C program and simulated on a Silicon Graphics Personal IRIS with Wavefront software.
All original video frames were projected—frame by frame—onto the 3D object as a reflection map. The result was a computer animation that simultaneously represented the original video and sound. In post-production, the rendered animation was re-synchronized with the original audio.
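One plausible reading of the tone-to-object step, sketched here under the assumption of a surface of revolution (the original C program's mapping is not preserved): a single digitized cycle of the guitar tone is swept around an axis and written out as Wavefront OBJ geometry.

```c
/* Sketch of the waveform-to-object idea: one cycle of a digitized
 * guitar tone is swept around the Y axis, producing a surface of
 * revolution written as Wavefront OBJ faces.  The tone and the
 * resolution are stand-ins. */
#include <stdio.h>
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define N_SAMP 32   /* samples along the waveform      */
#define N_STEP 16   /* angular steps of the revolution */

int main(void)
{
    /* Stand-in for one A/D-converted cycle of a guitar tone. */
    double wave[N_SAMP];
    for (int i = 0; i < N_SAMP; i++)
        wave[i] = 1.5 + 0.5 * sin(2.0 * M_PI * i / N_SAMP);

    /* Vertices: waveform amplitude becomes the radius at height y. */
    for (int j = 0; j < N_STEP; j++) {
        double a = 2.0 * M_PI * j / N_STEP;
        for (int i = 0; i < N_SAMP; i++)
            printf("v %.4f %.4f %.4f\n",
                   wave[i] * cos(a), (double)i, wave[i] * sin(a));
    }
    /* Quad faces between neighbouring samples and angular steps. */
    for (int j = 0; j < N_STEP; j++)
        for (int i = 0; i < N_SAMP - 1; i++) {
            int a0 = j * N_SAMP + i + 1;                /* 1-based */
            int b0 = ((j + 1) % N_STEP) * N_SAMP + i + 1;
            printf("f %d %d %d %d\n", a0, a0 + 1, b0 + 1, b0);
        }
    return 0;
}
```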
INTERACTIVE INSTRUMENTS
The next step was to develop an instrument capable of simultaneously generating sound and image. This led to the creation of the installation Paula Chimes, presented in 1991 at the 4th European Media Art Festival in Osnabrück.
Sixteen steel tubes were set in motion by touch or wind. Their vibrations were analyzed using extensometers (industrial strain sensors) and interpreted by a computer via a special amplifier circuit and A/D conversion.
The resulting data were scaled into MIDI impulses for an AKAI S1000 sampler—triggering sound events.
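The scaling step might look like the following C sketch, where the converter resolution, the noise threshold, and the note mapping are assumed values:

```c
/* Sketch of the scaling step: a 12-bit A/D reading from one tube's
 * strain sensor becomes a MIDI note-on for the sampler, with the
 * tube index choosing the note and the vibration amplitude the
 * velocity.  Ranges and the note table are assumptions. */
#include <stdio.h>

#define N_TUBES 16
#define ADC_MAX 4095           /* 12-bit converter, assumed      */
#define THRESH  200            /* ignore sensor noise below this */

static void tube_event(int tube, int adc_value)
{
    if (adc_value < THRESH)
        return;                               /* tube is at rest */
    int velocity = adc_value * 127 / ADC_MAX; /* scale to 0..127 */
    int note     = 36 + tube * 2;             /* hypothetical map */
    /* MIDI note-on, channel 1: 0x90, note, velocity. */
    printf("tube %2d -> note-on %02X %02X %02X\n",
           tube, 0x90, note, velocity);
}

int main(void)
{
    /* Stand-in readings for all sixteen tubes. */
    int adc[N_TUBES] = { 0, 3000, 0, 0, 1500, 0, 0, 0,
                         0, 0, 4095, 0, 0, 0, 0, 900 };
    for (int t = 0; t < N_TUBES; t++)
        tube_event(t, adc[t]);
    return 0;
}
```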
In parallel, a Silicon Graphics VGX workstation generated spatially altered video imagery. Each tube was mapped to an XY coordinate on a video still of cell division (meiosis). Moving a tube initiated a physical wave motion at the corresponding coordinate through real-time texture mapping. The resulting audiovisual output represented the tubes’ real-time state.
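The wave behaviour corresponds to a discrete 2-D wave equation excited at the tube's grid cell; the sketch below uses an illustrative grid size and damping, with the height field standing in for the texture-coordinate displacement:

```c
/* Minimal sketch of the wave motion: a finite-difference 2-D wave
 * equation is excited at the grid cell mapped to one tube; each
 * cell's height would offset the texture coordinates of the meiosis
 * still.  Grid size, damping, and excitation are illustrative. */
#include <stdio.h>

#define W 16
#define H 16

static float prev[H][W], curr[H][W], next[H][W];

static void step(void)
{
    for (int y = 1; y < H - 1; y++)
        for (int x = 1; x < W - 1; x++) {
            /* Standard finite-difference update, lightly damped. */
            float lap = curr[y-1][x] + curr[y+1][x]
                      + curr[y][x-1] + curr[y][x+1] - 4.0f * curr[y][x];
            next[y][x] = (2.0f * curr[y][x] - prev[y][x] + 0.5f * lap) * 0.99f;
        }
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++) {
            prev[y][x] = curr[y][x];
            curr[y][x] = next[y][x];
        }
}

int main(void)
{
    curr[8][8] = 1.0f;          /* impulse at one tube's XY coordinate */
    for (int i = 0; i < 5; i++)
        step();
    /* The height field would displace texture coordinates; here we
     * just show that the wave has spread outward from the centre. */
    printf("centre %.3f  neighbour %.3f  edge %.3f\n",
           curr[8][8], curr[8][10], curr[8][13]);
    return 0;
}
```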
A second monitor displayed this data stream as a serial code—also visualized in its original form as a meiosis projection and mathematical score.
The 3D wave program was realized by Bob O’Kane, a colleague from the Institute for New Media in Frankfurt. The amplifier interface was built by Dieter Sellin, also of the institute, and the steel sculpture was created by Stefan Karp and Kai Cuse.
OUTLOOK
All techniques described here will be used in the interactive concert performance HYENA DAYS. Special guitars are being developed to provide a broader signal spectrum. Future plans include integrating medical sensors, such as blood pressure monitors, EEGs, or eye-tracking devices, to give the musicians an even greater expressive range through sound.