RESEARCH ACTIVITIES
Computer music is a multi-disciplinary field. The research summarized in this overview spans such areas as engineering, physics, computer science, psychology, and music (including performance, analysis, and composition). A given research topic may require sophistication in several of these fields. This document can offer only a brief review of the work being done at CCRMA; for a more complete description of the research, a list of CCRMA research publications is included. Copies of reports are available upon request.
Although the research described here is diverse, a rough grouping of the work may be useful. Music composition software development continues (Schottstaedt, Taube, Lopez, Oppenheim), and new systems have been created for rapid prototyping of signal processing and sound synthesis algorithms (Smith, Jaffe, Porcaro, Stilson).
Research in physical modeling of musical instruments and reverberant environments continues to grow (Smith, Fleck, Rocchesso, Van Duyne, Cook, Chafe, Berners, Scavone, Pierce, Su). Innovative signal processing approaches are being studied (Smith, Wang, Levine).
Much attention is currently being given to the problem of human control of computers and of the sound synthesis algorithms running on them (Chafe, Cook, Morrill, Stilson, O'Modhrain, Gillespie, Mathews, Lopez, Jaffe, Tanaka, Putnam).
In the psychoacoustics area, work continues in sound perception research and its applications (Trautmann, Chomyszyn, Pierce, Moreno, Levitin, Shepard, Cook, Bregman). In addition, CCRMA now houses two archival research libraries in electroacoustic music and in acoustics (Mathews, Bauman, Scavone).
The researchers working at CCRMA include graduate students, faculty, staff, and visiting scholars.
REAL-TIME CONTROL USING BIOLOGICAL SIGNALS
Bill Putnam, Ph.D. student, Electrical Engineering
The focus of this research is the use of naturally occurring bioelectric signals in advanced human-computer interfaces. The specific signals used to date are the electroencephalogram (EEG), the electrooculogram (EOG), and the electromyogram (EMG). These three signals arise from the normal electrical activity of the body: the EEG reflects electrical activity in the brain, the EOG arises from the eye-positioning mechanism, and the EMG is produced by the activity of the musculature.
BioMuse is a real-time system used to collect, analyze, and apply biological signals for control purposes. It was developed at CCRMA by Ben Knapp and Hugh Lusted. The current system collects eight channels of biological data; its operation is controlled by host software running on a PC, with communication between the host and the BioMuse taking place over a standard serial interface. In addition, the BioMuse includes a MIDI interface for connecting directly to musical instruments.
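A minimal host-side sketch of this arrangement follows. The frame format shown (a sync byte followed by eight 16-bit samples) is invented for illustration; the actual BioMuse serial protocol is not reproduced here.

    # Hypothetical serial frame parser for an 8-channel biosignal stream.
    # The sync byte and frame layout are assumptions, not the BioMuse protocol.
    import struct

    SYNC = 0xAA                      # assumed frame-sync byte
    N_CHANNELS = 8                   # the BioMuse collects 8 channels
    FRAME_SIZE = 1 + 2 * N_CHANNELS  # sync byte + eight 16-bit samples

    def parse_frames(buf: bytes):
        """Yield 8-tuples of channel samples from a raw serial byte stream."""
        i = 0
        while i + FRAME_SIZE <= len(buf):
            if buf[i] != SYNC:       # resynchronize on the next sync byte
                i += 1
                continue
            yield struct.unpack_from("<8h", buf, i + 1)
            i += FRAME_SIZE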
My work to date has focused on the EMG signal. Pattern recognition techniques have been used to detect and classify dynamic gestures made by the user; a typical example is determining whether the user is opening or closing a hand. Once classified, these gestures have been used to effect volitional control in a computing environment and in musical performance.
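The sketch below illustrates the general approach, not the classifier actually used in this work: windowed RMS features are extracted from each EMG channel and labeled with a nearest-centroid rule trained on example gestures.

    # Illustrative EMG gesture classification: windowed RMS features plus a
    # nearest-centroid classifier (a stand-in for whatever pattern
    # recognition technique the real system uses).
    import numpy as np

    def rms_features(emg, win=128):
        """emg: (samples, channels) array -> (windows, channels) RMS features."""
        n = emg.shape[0] // win
        frames = emg[: n * win].reshape(n, win, -1)
        return np.sqrt((frames ** 2).mean(axis=1))

    def train_centroids(examples):
        """examples: dict mapping a label ('open', 'close') to training EMG."""
        return {lab: rms_features(x).mean(axis=0) for lab, x in examples.items()}

    def classify(segment, centroids):
        """Return the gesture label whose centroid is nearest the segment."""
        f = rms_features(segment).mean(axis=0)
        return min(centroids, key=lambda lab: np.linalg.norm(f - centroids[lab]))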
Two basic application areas are being pursued. The first is musical performance. As mentioned above, pattern recognition techniques can classify physical gestures made by the user, and once classified, these gestures can provide real-time control of a performance. Work in this area focuses on a real-time mixing system that responds to both EMG and EOG signals. Knapp has successfully implemented an eye-tracking device using EOG signals; the eye tracker could be used to select individual instruments simply by looking at them. Once an instrument is selected, a parameter such as volume could be controlled by an appropriate gesture, such as moving the arm up or down.
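A sketch of this mapping, with channel counts and step sizes chosen only for illustration, might look like the following: gaze position selects a mixer channel, and a classified arm gesture nudges that channel's volume.

    # Hypothetical glue between the EOG eye tracker and EMG gestures.
    def select_channel(gaze_x, n_channels=8):
        """Map a normalized horizontal gaze position [0, 1) to a channel index."""
        return min(int(gaze_x * n_channels), n_channels - 1)

    def update_volume(volume, gesture, step=4):
        """Nudge a 0-127 MIDI volume according to a classified arm gesture."""
        if gesture == "arm_up":
            return min(volume + step, 127)
        if gesture == "arm_down":
            return max(volume - step, 0)
        return volume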
The second application area is computer control. Current efforts have focused on cursor control and on control of standard graphical objects such as sliders and scroll bars. The main goal of this effort is to facilitate human-computer interaction for persons with disabilities.
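One plausible mapping, again purely illustrative: antagonistic muscle pairs drive horizontal and vertical cursor velocity, with a dead zone so resting muscle tone leaves the cursor still.

    # Assumed mapping from four normalized EMG envelope levels to cursor motion.
    def cursor_velocity(left, right, up, down, dead=0.05, gain=40.0):
        """Map EMG levels in [0, 1] to a (dx, dy) velocity in pixels/frame."""
        def axis(pos, neg):
            v = pos - neg
            return gain * v if abs(v) > dead else 0.0
        return axis(right, left), axis(down, up)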
[This work has been supported by BioControl Systems Inc.]
Related publications
Patmore, David and William Putnam. "Assistive Cursor Control for a PC Window Environment: Electromyogram and Electroencephalogram Based Control," Proceedings of the Virtual Reality and Persons With Disabilities Conference, June 1994.
Putnam, William and R. Benjamin Knapp. "Real-Time Computer Control Using Pattern Recognition of the Electromyogram," Proceedings of the Engineering in Medicine and Biology Conference, pp. 1236-1237, 1993.
Putnam, William and R. Benjamin Knapp. "The Use of the Electromyogram in a Man-Machine Interface," Proceedings of the Virtual Reality and Persons With Disabilities Conference, June 1993.
BIOCONTROL INTERFACES AS MUSICAL INSTRUMENTS
Atau Tanaka, Ph.D. student, Music
The BioMuse is a neural interface/biocontroller developed by BioControl Systems, Palo Alto, CA. It monitors electrical activity in the body (as EEG and EMG) and translates it into MIDI [Knapp and Lusted 1990]. I have been composing music for, and performing with, the BioMuse. Although such human interface technology can be applied to give nonmusicians the ability to make music, my interest has been to treat the system as a new musical instrument: one that requires "practice" to avoid playing "wrong notes" in performance, and one for which we can imagine developing an idiomatic performance technique [Tanaka 1993].
MIDI output from the BioMuse is passed to software running in the Max environment [Puckette and Zicarelli 1990]. These Max patches configure the BioMuse and also map incoming MIDI control data (representing EMG trajectories) to musical gestures. In this way, a physical muscle gesture can drive melody, rhythm, timbral changes, or combinations of these.
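Since Max patches are graphical, the fragment below only restates the kind of mapping they perform; the scale choice and velocity rule are mine. An incoming controller value tracking the EMG envelope is quantized onto a pentatonic scale, and a note is emitted whenever the quantized pitch changes.

    # Illustrative restatement in Python of an EMG-to-melody Max mapping.
    PENTATONIC = [0, 2, 4, 7, 9]

    def cc_to_pitch(cc_value, base=60):
        """Quantize a 0-127 controller value onto two octaves of a scale."""
        degree = cc_value * len(PENTATONIC) * 2 // 128
        octave, step = divmod(degree, len(PENTATONIC))
        return base + 12 * octave + PENTATONIC[step]

    last_pitch = None

    def on_control_change(cc_value, send_note):
        """Emit a note-on only when the EMG trajectory reaches a new pitch."""
        global last_pitch
        pitch = cc_to_pitch(cc_value)
        if pitch != last_pitch:
            send_note(pitch, velocity=min(cc_value + 16, 127))
            last_pitch = pitch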
There is a certain frustration in connecting the BioMuse output directly to MIDI devices in this way. The source biodata is a rich, continuously changing signal, whereas MIDI is an event-based music control specification. To better suit the nature of the biosignal, I have created Max patches that control sound synthesis directly by sending MIDI System Exclusive messages to the synthesizer.
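The framing of such a message is standard (a 0xF0 start byte, 7-bit data bytes, a 0xF7 end byte), but the manufacturer ID and parameter layout below are placeholders; a real patch would follow the target synthesizer's documented System Exclusive format.

    # Pack a continuous 14-bit value into a MIDI System Exclusive message.
    def make_sysex(value14, param=0x01, manufacturer=0x7D):
        """0x7D is the non-commercial manufacturer ID; param is a placeholder."""
        assert 0 <= value14 < 1 << 14
        lsb, msb = value14 & 0x7F, (value14 >> 7) & 0x7F
        return bytes([0xF0, manufacturer, param, msb, lsb, 0xF7])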
Another area of exploration has been the control of visual images from the BioMuse. Animation and video files are created on the Macintosh as PICS files and QuickTime movies, and Max patches are then built to navigate through these dynamic graphic images under BioMuse control. This allows one bodily gesture to drive changes in the music and the visuals at once, which I feel holds interesting potential for multimedia performance.
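At its simplest the navigation amounts to a mapping like the one below (file handling omitted; the normalization is assumed): the same EMG envelope that shapes the sound also scrubs a position within the animation.

    # Map a normalized EMG envelope in [0, 1] to a movie frame index.
    def envelope_to_frame(envelope, n_frames):
        return min(int(envelope * n_frames), n_frames - 1)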
References
Knapp, R. Benjamin and Hugh S. Lusted. "A Bioelectric Controller for Computer Music Applications," Computer Music Journal, 14(1):42, 1990.
Puckette, Miller and David Zicarelli. MAX - An Interactive Graphic Programming Environment, Opcode Systems, Palo Alto, CA, 1990.
Tanaka, Atau. "Musical Technical Issues in Using Interactive Instrument Technology," Proceedings of the International Computer Music Conference, Tokyo, 1993.
PADMASTER: A RADIO DRUM IMPROVISATION ENVIRONMENT
Fernando Lopez-Lezcano, Scientific Research Programmer
PadMaster is a real-time improvisation environment controlled through the Radio Drum. It was written in the NEXTSTEP environment using Objective-C, with NeXT's Interface Builder for the graphical user interface and the Music Kit as the basic MIDI and music performance class library. PadMaster receives position and force information through MIDI when either baton hits the surface of the drum, and it can also poll continuously for the batons' instantaneous spatial position.
The program enables the composer to define virtual pads on the drum surface (a 5 by 6 grid of rectangular pads in the current implementation) and to assign a behavior to each one. The pads can be grouped in sets, and the performer can step forward or backward through the sets using the batons, completely (or subtly) changing the behavior of the drum and the performer's interaction with the composition. Each pad can trigger the next note in a sequence of notes; trigger, pause, and resume a single sequence; or trigger multiple overlapping instances of a sequence. Each sequence can be a single note, a list of notes, or any other MIDI messages.
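The pad lookup this implies is straightforward; PadMaster itself is written in Objective-C, so the Python below is only a sketch, and the normalized coordinates and pad objects are assumptions.

    # Dispatch a baton hit on the drum surface to one pad in the 5-by-6 grid.
    ROWS, COLS = 5, 6

    def pad_index(x, y):
        """Map a normalized hit position (x, y) in [0, 1) to (row, col)."""
        return min(int(y * ROWS), ROWS - 1), min(int(x * COLS), COLS - 1)

    def on_hit(x, y, force, pads):
        """Trigger the struck pad's configured behavior with the hit force."""
        row, col = pad_index(x, y)
        pads[row][col].trigger(force)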
Each pad can follow a global tempo or have its own; each sequence within a pad can even have a completely independent and variable tempo. In addition, each pad can request that the main controller program send continuous position information, translating that information into tempo control or into continuous MIDI messages such as controllers or pitch bend. The graphical interface gives the performer feedback on the state of each pad (idle, playing, paused, and so on). All of this information can be saved to and loaded from document files, edited with graphical inspectors, or loaded and saved as text in Music Kit score files. Future work will enable the program to use other MIDI controllers in addition to the Radio Drum.
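A sketch of that continuous-control translation, with the scaling chosen only for illustration: polled baton height becomes either a 7-bit control-change message or a 14-bit pitch-bend message for finer resolution.

    # Turn a normalized baton height in [0, 1] into raw MIDI messages.
    def height_to_controller(z, controller=1, channel=0):
        """Control change: status 0xB0 | channel, controller number, 7-bit value."""
        return bytes([0xB0 | channel, controller, int(z * 127)])

    def height_to_pitch_bend(z, channel=0):
        """Pitch bend: status 0xE0 | channel, then 14-bit value as LSB, MSB."""
        v = int(z * 16383)
        return bytes([0xE0 | channel, v & 0x7F, (v >> 7) & 0x7F])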