Research

Please visit my new research page at Queen Mary, University of London.

I am currently Lecturer in Digital Media in the Centre for Digital Music at Queen Mary, University of London.

Previously, I was a post-doctoral researcher in the Music & Entertainment Technology Laboratory (MET-lab) at Drexel University, supported by a CIFellows grant from the Computing Research Association and National Science Foundation.

Current Research Interests

My research aims to quantify aspects of creative musical expression, particularly the interaction between performer and instrument. Efforts to establish an artistic role for computing are often hindered by a mutual lack of understanding between musicians and computer scientists. The problem is partly one of terminology: engineers tend to analyze audio signals in terms of amplitude, frequency and spectrum, whereas a musician might hear dynamics, intonation and timbre. The relationship between these qualities can be surprisingly subtle, and other musical parameters, such as gesture or character, are extremely difficult to quantify.
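
As a toy illustration of the engineering side of this vocabulary (my own sketch, not drawn from any particular system), the Python snippet below computes two standard signal measures: RMS amplitude, a rough correlate of dynamics, and the spectral centroid, a rough correlate of perceived brightness, one facet of timbre:

    import numpy as np

    # Synthesize one second of a 440 Hz tone with a quieter octave above it
    fs = 44100
    t = np.arange(fs) / fs
    x = 0.5 * np.sin(2 * np.pi * 440 * t) + 0.1 * np.sin(2 * np.pi * 880 * t)

    # RMS amplitude: a rough engineering correlate of dynamics
    rms = np.sqrt(np.mean(x ** 2))

    # Spectral centroid: a rough correlate of "brightness", one facet of timbre
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    centroid = np.sum(freqs * spectrum) / np.sum(spectrum)

    print(f"RMS amplitude: {rms:.3f}, spectral centroid: {centroid:.1f} Hz")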

I aim to reconcile musical and technical perspectives by using computing to augment, rather than replace, existing musical instruments. My approach is twofold. First, integrating high-resolution sensors into existing instruments allows performance gestures to be analyzed in detail and correlated with expressive intent. Second, electronic actuation can manipulate an instrument's acoustic properties, creating hybrid instruments that blend the natural richness of acoustic sound with the flexibility of electronic synthesis.

Magnetic Resonator Piano

The magnetic resonator piano is a hybrid acoustic-electronic instrument that augments the traditional grand piano. Sound is produced without loudspeakers: electromagnetic actuators directly manipulate the piano strings, expanding the instrument's vocabulary to include infinite sustain, notes that crescendo from silence, harmonics, and new timbres. A sensor on the keyboard reports the continuous position of every key, with temporal and spatial resolution sufficient to capture detailed data about key press, release, pretouch, aftertouch, and other extended gestures. Designed with cost and setup constraints in mind, the system seeks to give pianists continuous control over the musical sound of their instrument.
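
To make the sensing side concrete, here is a minimal sketch (my own, not the actual MRP code) of how a stream of continuous key-position samples might be reduced to discrete gesture states; the normalization and thresholds are assumptions for illustration:

    # Assumed normalization: 0.0 = key at rest, 1.0 = key at the bottom
    # of its travel. Both thresholds are illustrative, not from the MRP.
    PRETOUCH = 0.05  # finger resting lightly on the key surface
    PRESS = 0.9      # key near the bottom of its travel (aftertouch region)

    def classify(position):
        """Map one key-position sample to a coarse gesture state."""
        if position >= PRESS:
            return "pressed"
        if position >= PRETOUCH:
            return "partial"  # pretouch on the way down, release on the way up
        return "idle"

    # Example: a key slowly depressed, held, then released
    samples = [0.0, 0.02, 0.10, 0.40, 0.95, 0.95, 0.50, 0.03]
    state = "idle"
    for pos in samples:
        new_state = classify(pos)
        if new_state != state:
            print(f"position {pos:.2f}: {state} -> {new_state}")
            state = new_state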

More information on the MRP can be found at my MET-lab research page.

Photos:


Complete MRP system

Electromagnetic actuators above strings

Amplifier and signal-router system

Interface based on modified Moog Piano Bar

Demo Video 1: Secrets of Antikythera, from November 2009 concert

Demo Video 2: Extended keyboard techniques on MRP

Multi-Touch Piano Keys

The piano keyboard remains the interface of choice for many digital music tasks, but interest is growing in the musical applications of multi-touch interfaces. Though devices such as the Apple iPad offer continuous, detailed tracking of multiple fingers, touchscreen devices lack the tactile feedback of traditional musical instruments.

This project integrates multi-touch sensing directly into the keyboard itself through specialized key tops. Each key top is a capacitive sensor capable of recording the position and contact area of up to three fingers on the key surface. Black keys sense touch along a single lengthwise axis; the fronts of the white keys also sense horizontal position. Touch data is sent to a computer over USB, and Open Sound Control (OSC) messages are provided for a wide range of low- and high-level gestures, allowing connection to many synthesis platforms. The key tops can be installed atop any existing acoustic or electronic keyboard.
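
As an example of how such OSC output might be consumed on the computer side, the sketch below listens for touch messages using the python-osc package; the address /touchkeys/touch and the argument layout are hypothetical placeholders, since the actual message scheme is defined by the key-top firmware and host software:

    # Minimal OSC listener sketch using python-osc (pip install python-osc).
    # The address and argument layout below are assumptions for illustration.
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    def on_touch(address, key, finger, position, size):
        # Hypothetical arguments: MIDI note number, finger index (0-2),
        # normalized lengthwise position (0.0-1.0), and contact area.
        print(f"key {key}, finger {finger}: pos={position:.2f}, size={size:.2f}")

    dispatcher = Dispatcher()
    dispatcher.map("/touchkeys/touch", on_touch)  # assumed address

    server = BlockingOSCUDPServer(("127.0.0.1", 8000), dispatcher)
    server.serve_forever()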

Demo Video:

Publications

A. McPherson and Y. Kim. Design and applications of a multi-touch musical keyboard. To appear in Proceedings of the 8th Sound and Music Computing Conference (SMC 2011), Padova, Italy.

B. Dolhansky, A. McPherson and Y. Kim. Designing an expressive virtual percussion instrument. To appear in Proceedings of the 8th Sound and Music Computing Conference (SMC 2011), Padova, Italy.

A. McPherson and Y. Kim. Multidimensional gesture sensing at the piano keyboard. In Proceedings of the 29th ACM Conference on Human Factors in Computing Systems (CHI 2011), Vancouver, BC.

A. McPherson. The magnetic resonator piano: electronic augmentation of an acoustic musical instrument. Journal of New Music Research 39 (3), 2010, pp. 189-202.

A. McPherson and Y. Kim. Augmenting the acoustic piano with electromagnetic string actuation and continuous key position sensing. In Proceedings of the 2010 International Conference on New Interfaces for Musical Expression (NIME 2010), Sydney, Australia.

A. McPherson and Y. Kim. Toward a computationally-enhanced acoustic grand piano. In Extended Abstracts of the 28th ACM Conference on Human Factors in Computing Systems (CHI 2010), Atlanta, GA.

Other Presentations

July 4, 2011
I will present a paper, Piano touch as a model for expressive gestural interaction, at the BCS HCI 2011 conference, in the workshop When Words Fail: What can Music Interaction tell us about HCI?

March 4, 2011
Seminar and demo: Electronically Augmenting the Acoustic Piano: Explorations of Resonance and New Techniques for Continuous Note Shaping. NYU, Music and Audio Research Lab, Department of Music.

February 5, 2011
MRP demo and performance at the FAST Festival, MIT Media Lab, celebrating MIT's 150th anniversary.

August 16, 2010
MRP demo and performance, Tanglewood Music Festival.

May 26, 2010
Seminar: Augmenting the acoustic piano with electromagnetic string actuation and continuous key position sensing. Stanford University, Center for Computer Research in Music and Acoustics (CCRMA).

May 4, 2010
Live demo and workshop with composition students at the Curtis Institute of Music.

February 26, 2010
MRP research presentation and demo, Guthman Musical Instrument Competition, Georgia Tech. Chosen as competition finalist.

November 13, 2009
Seminar: Electronic Augmentation of an Acoustic Musical Instrument. Drexel University, Department of Electrical and Computer Engineering.

March 9, 2009
Seminar: Living Resonance: Compositions and Augmented Piano, Wellesley College.

Tools and Previous Projects

MATLAB audio input: MEX plug-in for cross-platform multichannel audio input using PortAudio.
CrossoverUnit: digital loudspeaker crossover design using Apple's AudioUnit architecture.
Function Generator: simple function generator AudioUnit for Mac OS X.
SoundBlocks: Master's thesis, a set of interconnectable blocks for children to explore music and audio manipulation.