Making Mappings
Online Appendix
Electroacoustic musical instruments (EAMIs), those new instruments for musical expression that use an electronic transducer such as a loudspeaker to produce sound, are characterized by the separation of control and sound. This modular separation means that EAMI makers must decide how the modules that produce control data should be connected to the modules that expose control parameters. These connections form the mappings in an EAMI, which have been shown to have an important influence on the success of any EAMI design.
This online appendix provides additional detailed documentation related to the research presented in the publications cited below. It is intended to facilitate close examination of the methodology of the research, e.g. with the aim of conducting a similar study, and to allow other researchers to analyse the dataset produced. If you found this page some way other than by reading one of the publications cited below, you may wish to read one of those presentations of the research before returning here.
Feel free to contact Travis West at the email provided in the publications if you have any questions about this documentation.
Publications
West, Travis J. Making Mappings: Examining the Design Process with Libmapper and the T-Stick. 2020. MA Thesis. McGill University, Canada.
West, Travis J., Baptiste Caramiaux, Marcelo M. Wanderley. "Making Mappings: Examining the Design Process." 2020. In Proceedings of the International Conference on New Interfaces for Musical Expression.
West, Travis J., Stéphane Huot, Baptiste Caramiaux, Marcelo M. Wanderley. "Making Mappings: Design Criteria for Live Performance." 2021. In Proceedings of the International Conference on New Interfaces for Musical Expression.
Methodology
Demographics Questionnaire
Participants were asked to rate their familiarity with a handful of subjects, using a five-point scale from "not familiar at all" to "very familiar", and to estimate how many years of experience they had in each of these subjects:
- designing digital musical instruments
- using gestural interfaces with sound synthesizers/effects
- modular synthesizers
- sonification of data
- algorithmic composition
- computer music
- interactive audio and/or visual performance systems
- Max/MSP, Pure Data, Supercollider, and/or related or similar tools
- DMI and/or NIME related research, e.g. Computer Music Journal, Journal of New Music Research, Organised Sound, NIME, ICMC, etc.
Participants were also asked to choose and rank the top three terms that best described them from this brief list of roles:
- Music Technology Researcher
- Producer
- Musical Instrument Designer
- Composer
- Programmer
- Performing Musician
In case none of these were especially descriptive, participants were also allowed to add terms to the list to better characterize their expertise.
Finally, participants' age and gender identity were also recorded.
Interview Guide
The interview was conducted after the participant had finished their mapping design. The purpose of the interview was for the participant to describe the mapping they made, their reasons for choosing to design it in the precise way they did, and the process by which they made those decisions. Ideally this would take the form of an open-ended discussion, and in the case of earlier participants in the study that is exactly what took place. As more participants were interviewed, a guide was gradually developed including questions that were found to evoke interesting responses:
- Would you please demonstrate (explain) how your instrument works?
- According to you, what makes a mapping effective? What did you try to do so that your mapping would be "an effective mapping for a live musical performance"?
- What were some of the things that worked really well? (Why, and how did you know?)
- What were some of the things you tried that didn't really work? (Why didn't they work, or how did you know?)
- Could you describe the process that you used to make your mapping?
- Suppose you had to give someone step-by-step instructions, "this is how you make an effective mapping (regardless of the interface or synth)"; what would you tell them?
- Could you describe a bit how your mapping evolved from start to finish?
- If you had all the time in the world to work on this, what would you do next?
- If I asked you to do it all again from scratch, would you do anything differently?
- At any point did you feel limited by the mapping software, or as if it got in your way?
- If you could change anything or add any functionality that you wanted, that you can imagine, how would you improve the mapping software?
Mapping Environment
Download mapping environment Max/MSP patches
The mapping environment consisted of an interface, a synthesizer, and a tool for making mappings between the two. The interface used was a T-Stick; hardware and software documentation for the T-Stick can be found on GitHub. Webmapper was used to make mappings, and is also available from GitHub. Native Instruments Massive was used as the synthesizer in the 2020 studies, configured as described in the publications above. The Max/MSP reimplementation of the synthesizer was used for the 2021 study.
The remaining software glue, implemented in Max/MSP, is available for download from the link above. This includes the software for reading raw data from the particular T-Stick used in the study, the gestural feature extraction algorithms, and the libmapper bindings for the T-Stick and the synthesizer. It also includes the Max/MSP reimplementation of the synthesizer mentioned above; although not identical to the original, this reimplementation may be considered close enough to be comparable, and it was used in the 2021 study cited above.
Data Recorder
One important feature of the mapping environment is that all raw data and gestural signals from the T-Stick are sent to a dedicated port on the local host, as are all changes to the synthesis parameters (whether made by the user moving a slider or by the mapping). Combined with the public availability of administrative bus traffic on the libmapper network, these provide the foundation for a simple recording mechanism.
The following simple shell script was used in the study to record the T-Stick signals, synthesis parameter changes, and libmapper bus traffic. The script uses oscdump, a simple utility included with the liblo library, which dumps all OSC messages received on a given port as text to the standard output.
#!/bin/bash
echo "Running mapping design experiment"

# Require the participant's name and ID as arguments
if [[ -z "$1" || -z "$2" ]]
then
    echo "Error: Please set participant name and ID with arguments to the script"
    echo "e.g. $0 JohnDoe P1"
    echo "Aborting script"
    exit
fi

echo "Adding participant to the list"
echo "Participant Name: $1 Participant ID: $2"
echo "$1, $2" >> ../Data/participantList.txt

echo "Starting oscdumps in background processes"

# libmapper administrative bus traffic (multicast group 224.0.1.3, port 7570)
echo "BEGIN EXPERIMENT———————————————————————————————————" >> "../Data/$2.mapperdata.txt"
oscdump 7570 224.0.1.3 >> "../Data/$2.mapperdata.txt" &

# T-Stick raw data and gestural signals
echo "BEGIN EXPERIMENT———————————————————————————————————" >> "../Data/$2.tstickdata.txt"
oscdump 11610 >> "../Data/$2.tstickdata.txt" &

# synthesis parameter changes
echo "BEGIN EXPERIMENT———————————————————————————————————" >> "../Data/$2.massivedata.txt"
oscdump 11810 >> "../Data/$2.massivedata.txt" &

echo "Type ctrl+c to save and close data collection when finished"
echo "!*!***# REMEMBER TO SAVE THE MAPPING FROM LIBMAPPER #***!*!"

# Idle until interrupted; ctrl+c stops the script and the background oscdump processes
while true
do
    :
done
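The recordings produced by this script are plain text files with one OSC message per line. The exact line layout depends on the oscdump version and flags used, so the following Python sketch is only illustrative: it assumes each line contains an optional leading timestamp, an OSC address beginning with "/", a type tag string, and the arguments, and the file name P1.tstickdata.txt is a hypothetical example following the naming convention of the script above.

#!/usr/bin/env python3
# Minimal sketch for reading an oscdump text recording.
# Assumed line layout: [timestamp] /address typetags arg1 arg2 ...

def parse_line(line):
    tokens = line.split()
    # Skip any leading tokens (e.g. a timestamp, or the BEGIN EXPERIMENT
    # markers written by the recording script) until the OSC address.
    while tokens and not tokens[0].startswith('/'):
        tokens.pop(0)
    if len(tokens) < 2:
        return None
    address, typetags, args = tokens[0], tokens[1], tokens[2:]
    return address, typetags, args

if __name__ == '__main__':
    # Hypothetical file name following the recording script's convention.
    with open('P1.tstickdata.txt') as recording:
        for line in recording:
            parsed = parse_line(line)
            if parsed is not None:
                print(parsed)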
Dataset
Over the course of this research, two groups of participants joined the study (described in detail in the 2021 paper cited above). The first group was composed of general music technology users with no prior experience using libmapper or the T-Stick. The second group was similar, except that all participants had prior experience using the T-Stick. The dataset from each group is available separately for download from the links below.
Mappings
Download mapping json files from group one
Download mapping json files from group two
Each participant designed a mapping from the T-Stick to the given subtractive synthesizer. After each participant completed their design, the mapping was exported from Webmapper using its built-in save functionality. The exported mappings can be downloaded from the links above. Alternatively, graphical representations of the mappings can be viewed from the links above.
Note that many participants did not make use of every synthesis parameter; in these cases, a static value may have been set using the sliders in the mapping environment GUI. These static values can be retrieved from the synthesizer parameter data described below.
Also note that, because the Webmapper version was upgraded partway through the study, the exported mapping files come in two slightly different formats.
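Since the exported mapping files are JSON documents, they can be inspected with standard tools. The following sketch makes no assumptions about the key names used by either Webmapper export format; it simply loads a file and prints its top-level structure, which is a reasonable first step given the two formats mentioned above.

#!/usr/bin/env python3
# Print the top-level structure of an exported mapping file.
import json
import sys

with open(sys.argv[1]) as f:
    mapping = json.load(f)

# The two Webmapper export formats may differ in their key names,
# so just report whatever is present at the top level.
if isinstance(mapping, dict):
    for key, value in mapping.items():
        print(key, type(value).__name__)
else:
    print(type(mapping).__name__)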
Interview Transcripts
Download interview transcripts from the 2020 studies
Download interview transcripts from the 2021 study
The interview transcripts are broken up by study rather than by group. From the 2020 study, two versions of the transcripts are available: the raw version is simply the transcription of the interview, whereas the annotated version has been indented and marked up in an XML-like format based on the thematic analysis of the interviews. From the 2021 study, only the annotated transcripts are available. This archive also includes the transcripts from the first study, annotated with the additional codes determined during the second study.
Demographic Information
Download demographic information from group one
Download demographic information from group two
Each participant filled out a demographic questionnaire as described above.
Activity Data
Download activity data from group one
Download activity data from group two
View activity timeline diagrams from group one:
Activity timeline diagrams from group two have not yet been generated, since this data was not referenced in the 2021 study.
By recording the network traffic on the libmapper administrative bus, a timeline can be reconstructed of all the changes made to the mapping during a participant's design process. Because this activity data includes additional chatter on the administrative bus which is likely irrelevant to understanding the mapping design process, two versions of the activity data are available: a raw version including all bus traffic, and a clean version including only changes to the mapping.
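The distinction between the two versions amounts to a text filter over the recorded bus traffic. The sketch below gives a rough idea of such a filter; the OSC address names listed in it are placeholders only, since the exact messages that signify mapping changes depend on the libmapper version, and the filtering actually used in the study is implemented in mapperclean.py, described below.

#!/usr/bin/env python3
# Rough sketch of deriving a "clean" activity log from the raw bus traffic.
import sys

# Placeholder addresses assumed to indicate changes to the mapping;
# the real message set depends on the libmapper version in use.
MAPPING_MESSAGES = ('/map', '/map/modify', '/unmap')

def is_mapping_change(line):
    return any(token in MAPPING_MESSAGES for token in line.split())

if __name__ == '__main__':
    # Hypothetical usage: python clean_sketch.py P1.mapperdata.txt > P1.clean.txt
    with open(sys.argv[1]) as raw:
        for line in raw:
            if is_mapping_change(line):
                sys.stdout.write(line)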
Timeline diagrams from group one can also be viewed from the links above. These were generated using the scripts described below. Version A emphasizes the presence of each association, and the duration each association is present during the task. Associations take the form of horizontal bars, beginning at the moment they were added to the mapping, and disappearing if they are removed or replaced by a convergent mapping. Blue associations remain in the final mapping, whereas red ones were ultimately rejected. Changes to the transfer function of an association are denoted with a cyan circle. Finally, a thin grey line follows the designer's activity.
Version B emphasizes the pace of activity, and the time between actions. Each action (creating or removing an association, or editing a transfer function) is illustrated with a glyph and a vertical line. The vertical position of each glyph reflects how long it had been, at the moment of that action, since the previous action took place. Creating an association is shown with a square, removing one with a diamond, and transfer function edits are shown with cyan circles. As with version A, retained mappings are shown in dark blue, while rejected mappings are coloured red.
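To give a concrete sense of how a Version A diagram is constructed, the following matplotlib sketch draws horizontal bars for a few invented associations. The signal names, times, and styling are made up for illustration; the actual figures were produced by mappertimeline.py, listed below.

#!/usr/bin/env python3
# Illustrative sketch of a "Version A" style timeline using made-up data.
import matplotlib.pyplot as plt

# Hypothetical associations: (label, start time, end time, retained in final mapping)
associations = [
    ('/tstick/jab -> /synth/cutoff', 10, 600, True),
    ('/tstick/tilt -> /synth/pitch', 40, 180, False),
    ('/tstick/squeeze -> /synth/amp', 200, 600, True),
]

fig, ax = plt.subplots(figsize=(8, 2.5))
for row, (label, start, end, retained) in enumerate(associations):
    colour = 'tab:blue' if retained else 'tab:red'  # blue = kept, red = rejected
    ax.hlines(row, start, end, colors=colour, linewidth=6)
    ax.text(start, row + 0.2, label, fontsize=8)

ax.set_xlabel('time (s)')
ax.set_yticks([])
plt.tight_layout()
plt.show()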
T-Stick Signals
The T-Stick generates a large number of signals, including the raw output from its sensors, the clean processed version of the sensor data, and a number of gestural features. All of these signals were recorded using the data recording script above. Unfortunately, due to the inefficient text-based encoding of the original data format and the limitations of the hosting server for this website, these recordings are not available to download at this time. If you are interested in analyzing this data in particular, please contact Travis West. Work is ongoing to provide this data in a useful and compact data representation.
Synthesizer Parameters
All changes to the synthesizer parameters (whether manually by the user dragging a slider in the GUI or by the influence of a signal mapped to the parameter) were recorded using the data recording script above. Unfortunately, due to the inefficient text-based encoding of the original data format and the limitations of the hosting server for this website, these recordings are not available to download at this time. If you are interested in analyzing this data in particular, please contact Travis West. Work is ongoing to provide this data in a useful and compact data representation.
Analysis Scripts
A large number of programs were written to parse and analyse the dataset generated in the study. Most of these are written in Python, and are available from the GitHub repository here. Many of the scripts are used to generate a chart; these outputs, produced from the group one dataset only, can also be viewed directly, linked below. There are also a handful of scripts not described below that can be found in the git repo; these represent speculative inquiries that may not have yielded interesting results. You may disregard them or follow the same line of inquiry yourself if you wish. Maybe you'll discover something I missed.
- mapperclean.py
- This script was used to generate the clean versions of the activity data timelines by removing lines with irrelevant information.
- mapperparse.py
- Several other scripts reference this one, which is used to read the activity data files and saved mappings and create native python representations of the data contained in them.
- firstimpressions.py
- First impressions were a recurring theme in the interview transcripts, motivating the creation of this script. This script derives statistics related to how long each mapping association was present during a participant's design process, generating charts visualizing this data. It also analyses how many times each association was tried by each participant.
- plotters.py
- Some graphic subroutines used by firstimpressions.py.
- mappertimeline.py
- This script generates a graphical representation of the activity data, emphasizing each association and the duration it was present during the activity (timeline version A).
- activityperiods.py
- This script generates an alternative graphical representation of the activity data, emphasizing the period of time between actions (timeline version B).
- signal_usage.py
- The number of times each T-Stick signal and synthesis parameter was used can potentially shed light on how useful participants found these inputs and outputs to be. This script analyses this information and generates a histogram to visualize it.
- mappings.py
- This script generates a visual representation of each participant's mapping, as well as several diagrams visualizing the mapping data in aggregate.
- xmlfilter.py
- The interview transcripts were annotated using an XML-like format; this script is used to filter the transcripts and print only those excerpts with a certain requested annotation type. It also extracts simple observational metrics related to the frequency of the requested annotations, such as how many excerpts were marked with them. A minimal sketch of this style of filtering appears after this list.
- xmlverify.py
- This script was used to verify the correctness of the xml annotations when marking up the interview transcripts.
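As a rough illustration of the kind of filtering performed by xmlfilter.py, the following sketch extracts excerpts wrapped in a given tag from an annotated transcript. The tag and file names given in the usage comment are hypothetical, and the annotation format actually used in the study may differ from this simplified example.

#!/usr/bin/env python3
# Rough sketch of extracting tagged excerpts from an XML-like transcript.
import re
import sys

def extract_excerpts(text, tag):
    # Find all spans wrapped in <tag> ... </tag>, including across line breaks.
    pattern = re.compile(r'<{0}>(.*?)</{0}>'.format(re.escape(tag)), re.DOTALL)
    return pattern.findall(text)

if __name__ == '__main__':
    # Hypothetical usage: python xmlfilter_sketch.py P1.annotated.txt first_impressions
    with open(sys.argv[1]) as transcript:
        excerpts = extract_excerpts(transcript.read(), sys.argv[2])
    print('{} excerpts tagged <{}>'.format(len(excerpts), sys.argv[2]))
    for excerpt in excerpts:
        print(excerpt.strip())
        print('---')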