Real-Time Gesture-Controlled Physical Modelling Music Synthesis with Tactile Feedback
David M. Howard
Media Engineering Research Group, Department of Electronics, University of York, Heslington, York, YO10 5DD, UK
Email: [email protected]

Stuart Rimell
Media Engineering Research Group, Department of Electronics, University of York, Heslington, York, YO10 5DD, UK

Received 30 June 2003; Revised 13 November 2003

Electronic sound synthesis continues to offer huge potential for the creation of new musical instruments. The traditional approach is, however, seriously limited: it incorporates only auditory feedback, and it typically makes use of a sound synthesis model (e.g., additive, subtractive, wavetable, or sampling) that is inherently limited and very often nonintuitive to the musician. In a direct attempt to address these issues, this paper describes a system that provides tactile as well as acoustic feedback, with real-time synthesis that invokes a more intuitive response from players since it is based upon mass-spring physical modelling. Virtual instruments are set up via a graphical user interface in terms of the physical properties of basic, well-understood sounding objects such as strings, membranes, and solids. These can be interconnected to form complex integrated structures. Acoustic excitation can be applied at any point mass via virtual bowing, plucking, striking, a specified waveform, or any external sound source. Virtual microphones can be placed at any point masses to deliver the acoustic output. These aspects of the instrument are described along with the nature of the resulting acoustic output.

Keywords and phrases: physical modelling, music synthesis, haptic interface, force feedback, gestural control.
1. INTRODUCTION
Musicians are always searching for new sounds and new ways of producing sounds in their compositions and performances. Modern desktop computers provide enough processing power to run, in real time, sound synthesis techniques that would have required large dedicated computer systems just a few decades ago. Despite the increased incorporation of computer technology in electronic musical instruments, the search continues for virtual instruments that are closer, in terms of how they are played, to their physical acoustic counterparts. The system described in this paper aims to integrate music synthesis by physical modelling with novel control interfaces for real-time use in composition and live performance. Traditionally, sound synthesis has relied on techniques involving oscillators, wavetables, filters, time envelope shapers, and digital sampling of natural sounds (e.g., [1]). More recently, physical models of musical instruments have been used to generate sounds which have more natural qualities and whose control parameters are less abstract and more closely related to musicians' experiences with acoustic instruments [2, 3, 4, 5].
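To make the mass-spring approach concrete, here is a minimal sketch of a one-dimensional mass-spring string, assuming a chain of point masses with clamped ends, a pluck modelled as an initial displacement of one mass, and a "virtual microphone" that records the displacement of another. All names and parameter values (N_MASSES, STIFFNESS, and so on) are illustrative assumptions, not taken from the system described in this paper.

```python
# Minimal 1-D mass-spring string sketch (illustrative; not the authors' code).
# A chain of point masses is coupled to its neighbours by ideal springs with
# viscous damping, integrated with semi-implicit (symplectic) Euler steps.
import numpy as np

SAMPLE_RATE = 44100     # audio sample rate (Hz); assumed value
N_MASSES = 50           # number of point masses along the virtual string
MASS = 1e-3             # mass of each point (kg); assumed value
STIFFNESS = 1e4         # spring constant between neighbours (N/m); assumed value
DAMPING = 0.5           # viscous damping coefficient (N s/m); assumed value
DT = 1.0 / SAMPLE_RATE  # integration time step (s)

def synthesise(n_samples: int, pluck_at: int = 10, mic_at: int = 40) -> np.ndarray:
    """Pluck one mass, integrate the network, and record the displacement
    of another mass as the output of a 'virtual microphone'."""
    pos = np.zeros(N_MASSES)
    vel = np.zeros(N_MASSES)
    pos[pluck_at] = 1e-3          # a pluck modelled as an initial displacement
    out = np.empty(n_samples)
    for n in range(n_samples):
        # Net spring force on each interior mass from its two neighbours;
        # the end masses are clamped, so they receive no spring force here.
        force = np.zeros(N_MASSES)
        force[1:-1] = STIFFNESS * (pos[:-2] - 2.0 * pos[1:-1] + pos[2:])
        force -= DAMPING * vel    # simple viscous loss
        vel += (force / MASS) * DT
        vel[0] = vel[-1] = 0.0    # keep the end masses fixed
        pos += vel * DT           # position updated with the new velocity
        out[n] = pos[mic_at]      # virtual microphone reads one mass
    return out

audio = synthesise(SAMPLE_RATE)   # one second of output
```

The semi-implicit update (velocity first, then position) is used here because plain explicit Euler artificially adds energy to oscillatory systems. Note that the resulting pitch and decay follow from MASS, STIFFNESS, DAMPING, and the number of masses, which is exactly the kind of physically meaningful parameter set that such models offer musicians in place of abstract synthesis controls.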