Zeta Instrument Processor Interface

From Wikipedia, the free encyclopedia

Zeta Instrument Processor Interface (ZIPI) was a research project initiated by Zeta Instruments and UC Berkeley's CNMAT (Center for New Music and Audio Technologies). Introduced in 1994 in a series of publications in Computer Music Journal from MIT Press, ZIPI was intended as the next-generation transport protocol for digital musical instruments, designed for compliance with the OSI model.

Concept

The draft working version of ZIPI was primarily aimed at addressing many limitations of MIDI (Musical Instrument Digital Interface). Unlike MIDI, which uses point-to-point serial connections, ZIPI was designed to run over a star network with a hub at the center. This allowed faster connection and disconnection of devices, because there was no need to daisy-chain them. 10BASE-T was specified for the physical layer, but the protocol did not depend on any particular physical implementation.

There were proposals for querying device capabilities, patch names, and other system and patch parameters, as well as for uploading and downloading samples to and from device memory.

MPDL

ZIPI used a completely new message system and a complex note-addressing scheme based on the Music Parameter Description Language (MPDL), a protocol intended as a direct replacement for MIDI events.

Instead of MIDI channels, ZIPI used a three-level address hierarchy of 63 Families, each consisting of 127 Instruments with 127 notes each, giving up to 1,016,127 individual note addresses. Instruments in a Family could be assembled from different physical devices. This arrangement allowed fine per-note control of synthesis parameters, which was especially useful for non-keyboard controllers such as wind or guitar controllers.
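
The exact MPDL wire format is not reproduced here; the short C sketch below only illustrates how the published hierarchy (63 Families × 127 Instruments × 127 notes) could be packed into a single note address and why it yields 1,016,127 distinct addresses. The field widths and the packing itself are illustrative assumptions.

    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical packed note address: 6 bits for the Family (63 values),
       7 bits for the Instrument (127) and 7 bits for the note (127).
       This is an illustration, not the actual MPDL encoding. */
    static uint32_t pack_address(unsigned family, unsigned instrument, unsigned note)
    {
        return ((uint32_t)family << 14) | ((uint32_t)instrument << 7) | (uint32_t)note;
    }

    int main(void)
    {
        uint32_t addr = pack_address(5, 12, 60);   /* Family 5, Instrument 12, note 60 */
        printf("packed address: 0x%05X\n", (unsigned)addr);
        printf("distinct note addresses: %d\n", 63 * 127 * 127);   /* 1,016,127 */
        return 0;
    }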

For example, the instant note-on capability could mask deficiencies of note detection (tracking) in guitar MIDI systems, especially on the lower strings. When triggered, the note would begin sounding as noise or as an arbitrary low note until the controller logic had tracked the actual pitch, which would then be sent in a follow-up message without retriggering the note. Messages could also address a whole Instrument or an entire Family, analogous to MIDI channel messages.
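
The following C sketch illustrates that flow with hypothetical helper functions (the Articulation and Frequency message names come from the list below, but their real encoding is not shown); the stubs simply print what a guitar controller might transmit.

    #include <stdio.h>

    /* Hypothetical stand-ins for sending MPDL Articulation and Frequency
       messages; they only print what would be transmitted. */
    static void send_articulation(int family, int instrument, int note, int on)
    {
        printf("articulation  f=%d i=%d n=%d %s\n", family, instrument, note, on ? "on" : "off");
    }

    static void send_frequency(int family, int instrument, int note, double hz)
    {
        printf("frequency     f=%d i=%d n=%d %.1f Hz\n", family, instrument, note, hz);
    }

    int main(void)
    {
        int family = 1, instrument = 6, note = 40;   /* a low guitar string, for example */

        /* Trigger the note immediately with a provisional low pitch. */
        send_articulation(family, instrument, note, 1);
        send_frequency(family, instrument, note, 82.4);

        /* A few milliseconds later the tracker has converged; a follow-up
           Frequency message corrects the pitch without retriggering the note. */
        send_frequency(family, instrument, note, 87.3);
        return 0;
    }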

Some MPDL messages were direct carryovers from MIDI, given more pronounceable names to avoid ambiguity, but most messages were new and based on a very different, innovative control logic. The resolution of message parameters could be any multiple of 8 bits, extending the 7-bit resolution typical of MIDI to 32 or more bits.
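
MPDL's actual parameter encoding is not shown here; the C sketch below merely illustrates the general idea of widening a 7-bit MIDI-style value to 16 bits by bit replication, a common way to map a narrow control range onto a wider one while preserving the end points.

    #include <stdint.h>
    #include <stdio.h>

    /* Widen a 7-bit value (0-127) to 16 bits by replicating its bits,
       so that 0 maps to 0 and 127 maps to 65535. */
    static uint16_t widen7to16(uint8_t v)
    {
        uint16_t x = v & 0x7F;
        return (uint16_t)((x << 9) | (x << 2) | (x >> 5));
    }

    int main(void)
    {
        printf("%3d -> %5u\n",   0, (unsigned)widen7to16(0));
        printf("%3d -> %5u\n",  64, (unsigned)widen7to16(64));
        printf("%3d -> %5u\n", 127, (unsigned)widen7to16(127));   /* 65535 */
        return 0;
    }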

There were also some higher-level messages corresponding to advanced program parameters, such as modulation, envelopes and 3D spatialization of voices, as well as instrument-specific messages for guitar, wind, and drum controllers.

Message types

The basic synthesis control messages were:

  • Articulation - 'note on/off' in MIDI
  • Pitch (note number and offset in 0.2-cent steps)
  • Frequency (in Hz)
  • Loudness - 'velocity' in MIDI
  • Amplitude - 'volume' in MIDI
  • Even/Odd Harmonic balance
  • Pitched/Unpitched balance
  • Roughness
  • Attack character
  • Inharmonicity
  • Pan Left/Right, Up/Down, Front/Back
  • Spatialization distance and azimuth/elevation angles
  • Program Change - effective immediately or for future notes only
  • Timbre space X/Y/Z
  • Multiple output levels
  • Time tag
  • Modulation rate/depth/wavetype

Controller (performance-oriented) messages included:

  • Key Velocity/Number/Pressure
  • Pitch Bend Wheel
  • Mod Wheel 1/2/3
  • Switch pedal 1 (Sustain) / 2 (Soft pedal) / 3 / 4
  • Continuous pedal 1 (Volume) / 2 / 3 / 4
  • Pick/bow Velocity/Position/Pressure
  • Fret/fingerboard Position/Pressure
  • Wind flow or pressure (breath controller)
  • Embouchure (bite)
  • Wind controller keypads
  • Lip pressure/frequency
  • Drum head striking point X/Y position and distance/angle from center
  • X/Y/Z position in space
  • Velocity in X/Y/Z dimension
  • Acceleration in X/Y/Z dimension
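
As an illustration only, the C sketch below shows one way a receiver might represent the messages listed above in memory, combining the three-level address with a message type and a wide-resolution parameter value; the type codes and struct layout are assumptions rather than the MPDL wire format.

    #include <stdint.h>
    #include <stdio.h>

    /* A few of the message types listed above; the numeric codes are invented. */
    enum mpdl_msg_type { ARTICULATION, PITCH, FREQUENCY, LOUDNESS, AMPLITUDE, PAN, TIMBRE_SPACE };

    /* Possible in-memory form after parsing: the three-level address, the
       message type and a parameter of up to 32-bit resolution. Whole-Family
       and whole-Instrument addressing existed but is not modelled here. */
    struct mpdl_message {
        uint8_t  family;       /* one of 63 Families      */
        uint8_t  instrument;   /* one of 127 Instruments  */
        uint8_t  note;         /* one of 127 notes        */
        enum mpdl_msg_type type;
        int32_t  value;
    };

    int main(void)
    {
        struct mpdl_message m = { 5, 12, 60, LOUDNESS, 1 << 20 };
        printf("f=%u i=%u n=%u type=%d value=%d\n",
               (unsigned)m.family, (unsigned)m.instrument, (unsigned)m.note,
               (int)m.type, (int)m.value);
        return 0;
    }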

Outcome of the project

Although ZIPI provided many outstanding new features, they did not line up well with existing MIDI-based implementations. The unusual addressing scheme, which required a substantial increase in complexity, was the main factor in the protocol's lack of adoption. Maintaining 1,016,127 individual synthesis states was far beyond the capabilities of synthesizer hardware of the time, even though the ZIPI developers noted that there would be practical limits on the number of simultaneously available programs and notes. In comparison, MIDI defined only 16 channels carrying common channel-wide control messages such as program change, volume, and pitch bend, and most digital synthesizers of the time could provide only 12 to 128 simultaneously sounding notes.

As no commercial devices supporting ZIPI were released, the sufficiency of MIDI for most applications and the introduction of FireWire (IEEE 1394) as an alternative physical layer soon led to the practical demise of the project. The ZIPI web site at CNMAT asserts that IEEE 1394 "supersedes ZIPI in every respect", mainly because it has simpler interface requirements: it does not require a hub, supports hot plugging (devices may be added or removed more conveniently), and includes an isolated power distribution scheme.

The developers went on to work on the Open Sound Control (OSC) protocol, which is now supported in a wide variety of musical instruments, sensors, and software.
