Hyena Days | 1992

Hyena Days | Ars Electronica 1992

In 1992, Hyena Days debuted at Ars Electronica. The project featured a remarkable collaboration between Icelandic video art pioneer Steina Vasulka and German media artist Michael Saup, who initiated and directed the work.

Steina Vasulka, co-founder of The Kitchen and a leading figure in experimental video, brought her unique approach to live image manipulation. Connecting her violin to laserdisc players, she transformed video recordings of dancers into fluid, reactive visuals—effectively “playing” the moving image as she would her instrument.

The concept and technological architecture of Hyena Days were developed by Michael Saup at the very moment he met Steina at the Institute for New Media in Frankfurt, Germany. Realizing they were pursuing the same artistic goal—merging sound and image through digital media—they joined forces immediately, setting the foundation for a collaboration that would become both technically and artistically resonant.

At the time, the idea of controlling images with sound—and generating sound from images—through digital means was a striking breakthrough. This bidirectional interaction between audio and image was both technically daring and artistically visionary, opening new paths for multimedia and interactive art.

Hyena Days remains a milestone in digital art history—an early and powerful demonstration of how sound, image, and code can converge to produce entirely new forms of expression.

Hyena Days | TILT | Digitale 1996

“The evening showcased media art pioneer Steina Vasulka on violin, her instrument seamlessly linked to Michael Saup’s image-generating computers. Movement, sound, and complex three-dimensional visuals synchronized in real time. The computer aesthetics of bodies drifting into central perspective demonstrated their potential so convincingly that the work resembled a meditative homage to digitality.”

Christoph Blase, Frankfurter Allgemeine Zeitung, Sept 27, 1996

Stations

  • Ars Electronica, 1992
  • V2, Instable Media IV, Rotterdam, 1992
  • Deutsche Welle TV, 1994
  • Digitale 96, KHM, Cologne, Germany, 1996
  • Multimediale 5, ZKM, Germany, 1997

Ars Electronica Catalog, 1992

HYENA DAYS
Michael Saup / Steina Vasulka

(BURST MODE VERSION)
MIDI-CONTROLLED AND SOUND-CONTROLLED IMAGE PRODUCTION

Featuring:
Steina Vasulka – violin
Michael Saup – guitar

By converting acoustic signals from analog to digital and interpreting their measured values, the musicians and their instruments can interface with image-generating electronic devices such as 3D graphics workstations, laserdisc players, or hard disk recorders—transforming acoustic events into dynamic visual experiences.

The acoustic output signals are analyzed during A/D conversion and interpreted by a control computer, which makes algorithmic decisions and controls additional equipment:

Via MIDI impulses, various sound-producing machines—such as samplers, effects processors, and mixing consoles—are triggered.

Through a parallel interface, a laserdisc player is supplied with control impulses, while a workstation is addressed via RS232, enabling both to respond in real time to the musicians’ interactions.

Musicians can thus use a wide range of acoustic parameters—such as pitch or volume—as triggers for visual choreographies. The resulting imagery is displayed on video monitors via a video mixer.
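The signal path described above can be sketched in a few lines of code. This is an illustrative modern reconstruction, not the original control software: pitch is estimated from zero crossings of the digitized block, volume from its RMS level, and both are scaled into MIDI note and velocity values of the kind the control computer would emit.

```python
import math

def analyze_block(samples, sample_rate):
    # Zero-crossing pitch estimate: a periodic tone crosses zero twice per cycle.
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0))
    duration = len(samples) / sample_rate
    pitch_hz = crossings / (2 * duration) if duration > 0 else 0.0
    # RMS as a simple volume measure.
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return pitch_hz, rms

def to_midi(pitch_hz, rms):
    # Nearest MIDI note number (A4 = 440 Hz = note 69); RMS scaled to velocity.
    note = int(round(69 + 12 * math.log2(pitch_hz / 440.0))) if pitch_hz > 0 else 0
    return max(0, min(127, note)), min(127, int(rms * 127))

# One second of a 440 Hz sine at 8 kHz sampling, half amplitude:
sr = 8000
block = [0.5 * math.sin(2 * math.pi * 440 * n / sr) for n in range(sr)]
pitch, rms = analyze_block(block, sr)
note, vel = to_midi(pitch, rms)
```

In a live setup these note/velocity pairs would be sent on as MIDI messages to the samplers and mixers, and as control impulses to the laserdisc player and workstation.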

MANIPULATING EDIT LISTS

My first step in controlling video images through music or audio signals was the creation of the program XTRA.TRAX (1989–1990), which transformed STANDARD MIDI FILES (a binary format that can be imported and exported by most software sequencers) into edit lists for a SONY 910 video editor.

Musical frequency forms could be simulated by the dynamic motion control of a Betacam SP video recorder, and volume translated into video saturation levels via the picture mixer. The software allowed music to be rapidly and automatically transformed into video imagery—handling diverse parameters such as image selection, transitions, wipes, and more.

By separately translating individual tracks—e.g., bass drum, violin, piano—the resulting edit sequences could be reassembled in sync using a device like the Harry.

At the same time, the program generated a 3D Wavefront model from the analyzed MIDI data, allowing for animation synchronized with the music.

Disadvantages included the non-interactive nature of MIDI file preparation, but advantages lay in the speed and complexity of music-driven image generation, which would be difficult to achieve manually.
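The MIDI-to-edit-list idea can be illustrated with a minimal sketch, assuming a simplified event format (the actual XTRA.TRAX code and the SONY 910's exact edit-list syntax are not documented here): each note event selects a source reel, and its velocity is scaled into a saturation level for the picture mixer.

```python
def to_timecode(seconds, fps=25):
    # PAL timecode (25 fps) as an HH:MM:SS:FF string.
    frames = int(round(seconds * fps))
    ff = frames % fps
    ss = (frames // fps) % 60
    mm = (frames // (fps * 60)) % 60
    hh = frames // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

def midi_to_edit_list(events, clip_length=1.0):
    # events: (start_seconds, note, velocity) triples from one MIDI track.
    # Each note becomes one edit: the note number selects a source reel,
    # velocity (0-127) scales into a 0-100 saturation level for the mixer.
    edl = []
    for i, (start, note, velocity) in enumerate(events, 1):
        edl.append(
            f"{i:03d}  REEL{note:03d}  V  C  "
            f"{to_timecode(start)} {to_timecode(start + clip_length)}  "
            f"SAT {velocity * 100 // 127:03d}"
        )
    return edl

events = [(0.0, 36, 127), (0.5, 60, 64), (1.0, 67, 32)]
for line in midi_to_edit_list(events):
    print(line)
```

Translating each instrument's track separately, as described above, would simply mean running this conversion once per track and conforming the resulting sequences afterwards.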

The background visuals for TILT IT were produced using this method and overlaid with 3D computer animation based on a recorded guitar solo. The guitar tone—converted via A/D—was transformed into a 3D object using a C program and simulated on a Silicon Graphics Personal IRIS with Wavefront software.

All original video frames were projected—frame by frame—onto the 3D object as a reflection map. The result was a computer animation that simultaneously represented the original video and sound. In post-production, the rendered animation was re-synchronized with the original audio.
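The tone-to-object step can be sketched as follows (the original C program and Wavefront pipeline are not preserved here, so this is an assumed reconstruction): the sampled waveform is extruded into a ribbon mesh and written in Wavefront OBJ notation, a surface onto which the video frames could then be projected as a reflection map.

```python
import math

def waveform_to_obj(samples, width=0.2):
    # Extrude a sampled waveform into a 3D ribbon: each sample becomes a
    # pair of vertices (front and back edge), consecutive pairs become quads.
    lines = ["# waveform ribbon, Wavefront OBJ"]
    for i, s in enumerate(samples):
        x = i / max(1, len(samples) - 1)
        lines.append(f"v {x:.4f} {s:.4f} 0.0")
        lines.append(f"v {x:.4f} {s:.4f} {width}")
    for i in range(len(samples) - 1):
        a = 2 * i + 1  # OBJ vertex indices are 1-based
        lines.append(f"f {a} {a + 1} {a + 3} {a + 2}")
    return "\n".join(lines)

# 32 samples of a 440 Hz tone, digitized at 8 kHz:
tone = [math.sin(2 * math.pi * 440 * n / 8000) for n in range(32)]
obj_text = waveform_to_obj(tone)
```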

INTERACTIVE INSTRUMENTS

The next step was to develop an instrument capable of simultaneously generating sound and image. This led to the creation of the installation Paula Chimes, presented in 1991 at the 4th European Media Art Festival in Osnabrück.

Sixteen steel tubes were set in motion by touch or wind. Their vibrations were analyzed using extensometers (industrial strain sensors) and interpreted by a computer via a special amplifier circuit and A/D conversion.

The resulting data were scaled into MIDI impulses for an AKAI S1000 sampler—triggering sound events.

In parallel, a Silicon Graphics VGX workstation generated spatially altered video imagery. Each tube was mapped to an XY coordinate on a video still of cell division (meiosis). Moving a tube initiated a physical wave motion at the corresponding coordinate through real-time texture mapping. The resulting audiovisual output represented the tubes’ real-time state.

A second monitor displayed this data stream as a serial code—also visualized in its original form as a meiosis projection and mathematical score.
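The two mappings described above — tube vibration to MIDI impulse for the sampler, and tube position to a wave motion in the texture — can be sketched like this. The note assignment, the 10-bit A/D range, and the decay constant are illustrative assumptions, not the installation's actual parameters.

```python
import math

def tube_to_midi(tube_index, adc_value, base_note=36):
    # Hypothetical mapping for one tube: the tube index chooses the sampler
    # note, and the 10-bit A/D reading (0-1023) scales into MIDI velocity.
    velocity = min(127, adc_value * 128 // 1024)
    return base_note + tube_index, velocity

def ripple(x, y, cx, cy, t, amplitude=1.0, wavelength=0.1, speed=1.0):
    # Radial ripple displacement around the tube's (cx, cy) texture
    # coordinate: amplitude decays with distance, phase advances with time.
    r = math.hypot(x - cx, y - cy)
    decay = math.exp(-4.0 * r)
    return amplitude * decay * math.sin(2 * math.pi * (r / wavelength - speed * t))

note, vel = tube_to_midi(3, 700)  # a moderately excited fourth tube
```

Per frame, a texture-mapping renderer would evaluate `ripple` at each texture coordinate and sum the contributions of all sixteen tubes to displace the meiosis still.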

The 3D wave program was realized by Bob O’Kane, a colleague from the Institute for New Media in Frankfurt. The amplifier interface was built by Dieter Sellin (IfNW), and the steel sculpture was created by Stefan Karp and Kai Cuse.

OUTLOOK

All techniques described here will be used in the interactive concert performance HYENA DAYS. Special guitars are being developed to provide a broader signal spectrum. Future plans include integrating medical sensors—such as blood pressure monitors, EEGs, or eye-tracking systems—into the system to give musicians even greater expressive range through sound.