Electroacoustic Marionette – Early Prototype

By examining the design of digital musical instruments through the discipline of string puppetry, the Electroacoustic Marionette aims to uncover new approaches to sound synthesis and gestural mapping. In this report I describe the approach taken in developing an early prototype of this instrument, covering the hardware controller, the sound synthesis algorithm, and the mapping between the two.

Introduction

Unlike acoustic musical instruments, whose design and functionality are constrained by the physical nature of sound and the instruments’ construction, digital musical instruments (DMIs) can take any form and produce any sound (Magnusson 2010). For a designer and maker of DMIs, this poses the problem of deciding what form new instruments should take (the construction of an interface), what sounds they should produce (the choice or design of sound and/or music synthesis algorithms), and how the interface relates to the sounds (the mapping) (Nort et al. 2014).

The Electroacoustic Marionette is a proposed digital musical instrument inspired by Michel Waisvisz’s remark (in Battier et al. 2014) that he learned a great deal about playing his DMI, the Hands, from puppeteers: “Puppeteers deal with gesture as a life-giving force all the time. Modern puppeteers are able to give life to the simplest objects by finding ways to let the objects work themselves and just interfering with gesture where it is needed.” The Electroacoustic Marionette seeks to address the issues of DMI design by considering them through the discipline of string puppetry. The hardware interface of the instrument is inspired by and analogous to wooden marionette controllers; it is fitted with a variety of sensors to capture the gestures of the player, who takes on a double role as musician-puppeteer. Sound and music synthesis algorithms were considered as sonic puppets, designed to have a life of their own which the player activates through gesture. Mapping was considered as a way of stringing the sonic puppet, thereby connecting the interface to the sound synthesis algorithm.

The Controller

Marionettes are controlled in two basic ways: by moving the controller from which the strings are suspended, and by moving the strings themselves (Currel 2004). To determine the orientation of the Electroacoustic Marionette controller, the device includes a magnetic, angular-rate, and gravity (MARG) sensor: a combination of an electronic compass, gyroscope, and accelerometer, each with high-resolution sensing along three axes, for a total of 9 degrees of freedom. Used together with a sensor fusion algorithm such as the complementary filter, these sensors can determine the absolute orientation of the Electroacoustic Marionette relative to the earth’s magnetic and gravitational fields. The current prototype, however, does not use sensor fusion; instead, the raw sensor values are interpreted directly by a neural network, as described in the Mapping section below.
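
For reference, the sketch below shows a minimal single-axis complementary filter of the kind mentioned above: it blends the gyroscope's responsive but drift-prone integration with the accelerometer's noisy but stable tilt estimate. The function, its 0.98 weighting, and the axis conventions are illustrative assumptions, not part of the prototype.

```python
# Minimal complementary filter for a single axis (pitch), combining
# gyroscope integration (accurate short-term) with an accelerometer
# tilt estimate (stable long-term). All names and the 0.98 weight are
# illustrative, not taken from the prototype.
import math

def complementary_filter(pitch, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """Return an updated pitch estimate in radians.

    pitch      -- previous pitch estimate (radians)
    gyro_rate  -- angular velocity about the pitch axis (rad/s)
    accel_y/z  -- accelerometer readings (any consistent unit)
    dt         -- time step in seconds
    alpha      -- weighting between gyro (alpha) and accel (1 - alpha)
    """
    gyro_pitch = pitch + gyro_rate * dt          # integrate angular rate
    accel_pitch = math.atan2(accel_y, accel_z)   # tilt from gravity vector
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```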

The manipulation of the strings is more difficult to sense electronically, so in the current prototype this affordance was radically simplified: a single infrared proximity sensor on the left side of the controller represents a single string, which the musician-puppeteer can operate with the hand. While this simplification is a significant reduction of the gestural affordances, it seems appropriate for an early prototype, since beginner puppeteers (such as the author and sole player of the Electroacoustic Marionette) are commonly advised to focus first on manipulating the orientation of the controller.

Many marionette controllers also have mechanisms for accomplishing special effects, ranging from dedicated strings to complicated lever and pulley systems. To adapt these extended gestural affordances to the Electroacoustic Marionette, a capacitive touch sensor was included in the current prototype; unfortunately, its functionality has not yet been implemented.

Sound Synthesis

A string puppet will move in certain ways depending on the way it is made (Currel 2004). The materials used, the shape of the puppet, the way the different body parts are connected, and the places where the strings are attached all have profound effects. The puppet’s own momentum and the interconnection of its parts define which gestures will be natural for the puppet, what constraints will limit its range of motion, and the overall character of its movements. These effects are especially relevant for string puppets because they are controlled indirectly.

To design a sound synthesis algorithm analogous to a string puppet, it was necessary to consider how the momentum and interconnection of physical objects could be adapted to digital sound. The current sound synthesis engine has two main components: a simple FM synthesizer and four delay lines interconnected in a matrix. The FM synthesizer injects sound into the delay lines, which feed back into one another in various ways depending on the matrix parameters. This feedback allows even a brief sound from the FM synthesizer to propagate and evolve within the delay matrix. Thus, feedback was considered analogous to momentum, and feedback between delay lines analogous to interconnection.
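
The sketch below illustrates this structure under assumed settings: a 4x4 matrix of feedback amplitudes routes the output of each delay line back into the inputs of all four, so a short FM burst keeps ringing inside the matrix after the excitation ends. The delay times, matrix values, and FM parameters are placeholders, not the prototype's actual settings.

```python
# Sketch of the delay-matrix idea: four delay lines whose outputs are
# mixed back into each other's inputs through a 4x4 feedback matrix.
# Delay times, matrix values, and the FM excitation are illustrative.
import numpy as np

SR = 44100
delay_times = [0.11, 0.17, 0.23, 0.31]                 # seconds (illustrative)
lines = [np.zeros(int(SR * t)) for t in delay_times]   # circular buffers
write_pos = [0, 0, 0, 0]
feedback = 0.2 * np.ones((4, 4))                       # 16 amplitude values

def tick(input_sample):
    """Advance the delay matrix by one sample and return the mixed output."""
    outs = [line[pos] for line, pos in zip(lines, write_pos)]  # read heads
    for i in range(4):
        # Each line receives the input plus a weighted sum of all outputs.
        fb = sum(feedback[i][j] * outs[j] for j in range(4))
        lines[i][write_pos[i]] = input_sample + fb
        write_pos[i] = (write_pos[i] + 1) % len(lines[i])
    return sum(outs)

# A brief FM burst injected into the matrix keeps evolving via feedback,
# the "momentum" analogy described above.
t = np.arange(SR) / SR
burst = np.sin(2*np.pi*220*t + 3.0*np.sin(2*np.pi*110*t)) * np.exp(-6*t)
output = np.array([tick(s) for s in burst])
```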

Mapping

Using the gyroscope in the MARG unit, any rotation of the Electroacoustic Marionette controller can be measured with high accuracy. The absolute value of the raw angular velocity data was smoothed logarithmically to derive the envelope of the angular velocity for each axis of rotation. The FM synthesizer was configured as three pairs of operators with fixed frequency ratios. For each modulator, one axis of rotation was used to set the modulation index. Finally, the sum of the three angular velocity envelopes was used to set the amplitude of the carrier operators.
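
A sketch of this mapping follows, with a simple one-pole envelope follower standing in for the smoothing step (the exact logarithmic smoothing used in the prototype is not reproduced here); the frequencies, ratios, and index scaling are likewise illustrative.

```python
# Sketch of the gyroscope-to-FM mapping: per-axis envelope followers
# drive the modulation indices of three fixed-ratio operator pairs,
# and the summed envelopes set the carrier amplitude. All constants
# are illustrative placeholders.
import math

SR = 44100

class EnvelopeFollower:
    def __init__(self, coeff=0.99):
        self.coeff = coeff   # closer to 1.0 = slower, smoother envelope
        self.value = 0.0

    def process(self, x):
        self.value = self.coeff * self.value + (1 - self.coeff) * abs(x)
        return self.value

followers = [EnvelopeFollower() for _ in range(3)]      # x, y, z axes
ratios = [(200.0, 1.5), (300.0, 2.0), (500.0, 0.5)]     # (carrier Hz, mod ratio)
phase_c = [0.0] * 3
phase_m = [0.0] * 3

def render_sample(gyro_xyz, index_scale=8.0):
    """One output sample of the three-pair FM voice from raw gyro data."""
    envs = [f.process(g) for f, g in zip(followers, gyro_xyz)]
    amp = sum(envs)                    # total motion sets carrier amplitude
    out = 0.0
    for i, (fc, ratio) in enumerate(ratios):
        phase_m[i] += 2 * math.pi * (fc * ratio) / SR   # fixed ratio per pair
        phase_c[i] += 2 * math.pi * fc / SR
        index = index_scale * envs[i]  # per-axis envelope -> modulation index
        out += amp * math.sin(phase_c[i] + index * math.sin(phase_m[i]))
    return out / 3
```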

The delay matrix has a total of 20 parameters: 4 delay times and 16 amplitude values which determine the feedback and interconnection in the matrix. Due to time constraints, it was not possible to implement sensor fusion for the MARG sensors or an extensive explicit mapping algorithm. Instead, the Wekinator machine learning environment was used to map the orientation of the controller (inferred from the accelerometer and magnetometer in the MARG unit) to the delay matrix parameters. A neural network was trained to recognize 5 different orientations of the controller and to output 5 corresponding presets for the delay matrix. Between these 5 trained orientations, the neural network produces interesting mixes and interpolations of the learned presets. This proved to be a quick and effective way to develop a flexible and interesting mapping.
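
As a sketch of how such a mapping can be wired up, the following uses the python-osc package with Wekinator's documented default OSC addresses and ports (inputs to /wek/inputs on 6448, outputs returned as /wek/outputs on 12000); the six-value input layout and the handler are assumptions for illustration, and the actual prototype's message format may differ.

```python
# Send raw MARG values to Wekinator and receive the 20 delay-matrix
# parameters it outputs, over OSC via python-osc.
from pythonosc.udp_client import SimpleUDPClient
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

client = SimpleUDPClient("127.0.0.1", 6448)   # Wekinator's default input port

def send_orientation(accel, mag):
    """Send the 6 raw values (3-axis accel + 3-axis mag) as one input frame."""
    client.send_message("/wek/inputs", list(accel) + list(mag))

def on_outputs(address, *params):
    # Assumed layout: 4 delay times followed by 16 feedback amplitudes.
    delay_times, amplitudes = params[:4], params[4:]
    print("delays:", delay_times, "matrix:", amplitudes)

dispatcher = Dispatcher()
dispatcher.map("/wek/outputs", on_outputs)
server = BlockingOSCUDPServer(("127.0.0.1", 12000), dispatcher)

send_orientation((0.0, 0.0, 9.8), (0.2, 0.0, 0.4))  # example reading
server.handle_request()                              # wait for one reply
```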
