School of Electronic Engineering and Computer Science

Music Informatics and Sonic Visualiser

Centre for Digital Music (C4DM)

This Case Study concerns the impact of research into Music Informatics at the Centre for Digital Music (C4DM), in particular through the open-source software platform Sonic Visualiser (SV) and its associated open-source Vamp plugin API. SV has been used by musicologists, the British Library (BL) has agreed to deploy a specially tailored version for use by researchers at the BL, and SV has also been used in dance performance. A new web service, “Yanno”, built on SV’s Vamp plugins, has been used in school music lessons.

[Image: Sonic Visualiser – www.sonicvisualiser.org]

This Case Study focuses on research undertaken in the Centre for Digital Music (C4DM) into Music Informatics. This research – sometimes also known as Music Information Retrieval, Semantic Audio or Intelligent Audio – investigates methods for extracting semantic information from musical audio files and for using this information in the production, distribution and consumption of music. According to EPSRC’s 2010 “Programme landscapes” for ICT, C4DM is one of the top 5 centres in “People and Interactivity”, and in EPSRC’s Music and Acoustic Technology area, C4DM is the largest of the “Current Major EPSRC Research Investments”.

Our Impact

The research has had an economic impact as well as an impact on society, culture and creativity. To make our Music Informatics research available to as many potential users as possible, we release many open-source software tools. Our website isophonics.net, which also hosts videos and screencasts of demos, receives some 1,500 visitors/month (Google Analytics). In Nov 2011 Isophonics was selected as one of eight European music-tech startups invited to pitch to a panel of judges at “TechPitch4.5”, held at EMI Music headquarters [I6].

Sonic Visualiser (SV)

Sonic Visualiser is one of our open-source tools. It is an easy-to-use application for viewing and analysing the contents of musical audio files, incorporating a wide range of our music informatics research (e.g. chord analysis, beat tracking, audio segmentation) in the form of “plugins”. It has been developed mainly as part of our EU- and EPSRC-funded projects (IST-FP6-507142 SIMAC €2.9M; IST-FP6-033902 EASAIER €2.1M; GR/S84743/01 SeMMA £195k; EP/E017614/1 OMRAS2 £2.5M; EP/H043101/1 SoundSoftware.ac.uk £947k).

Sonic Visualiser is available for Linux, Mac OS X and Windows, and has been downloaded over 330,000 times since its release in 2007 (SourceForge and code.soundsoftware.ac.uk download statistics, June 2013). In a survey of 821 users (11 Oct 2010 to 25 Apr 2013), 49% reported that they were non-academic users, using it professionally (9%) or for personal use (40%). Usage is truly international, with users from 66 countries including the USA (32%), UK (25%), Germany (10%) and France (8%). An overwhelming majority (82%) report that they enjoy using SV [I7].

[Video: Sonic Visualiser chord labelling example, from Matthias Mauch on Vimeo]

Sonic Visualiser (SV) in the British Library (BL)

A special version of SV, Sonic Visualiser Library Edition [Im9], has been developed with evidence-based adaptations both for the workflow of musicologists and to satisfy strict BL requirements, such as no export of audio (for copyright reasons). It is currently available to BL staff members and Edison Fellows, and the BL has also approved its installation in BL reading rooms. Once some legacy technology issues are dealt with, it is anticipated that it will be deployed on the computers equipped with the British Library Sound Server, allowing any reader to use SV for playback, analysis and visualisation. Results of analyses can be exported via email, allowing readers to make further use of their analyses in other settings (but without the audio, for copyright reasons).

Repurposing Sonic Visualiser technologies for Schools: Yanno

[Image: Yanno]

The “Yanno” system [I12], above, repurposes SV-based Vamp plugin technology as a web service for music teaching in schools. Its development was underpinned by observational research [R5] on how to achieve the strongest impact in the high-school music classroom, and it uses Mauch & Dixon’s chord analysis plugin [R3].
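
Yanno itself is a bespoke service, but the general pattern it illustrates, wrapping a Vamp feature extractor behind a simple web endpoint, can be sketched as follows. This is an illustrative sketch only, not the Yanno implementation: the Flask framework, the upload handling and the “nnls-chroma:chordino” plugin key (Chordino, the plugin built on [R3]) are assumptions made for the example.

    # Illustrative sketch only (not the Yanno implementation): a minimal web
    # service that runs a Vamp chord-extraction plugin over an uploaded audio
    # file. Assumes the Python "vamp" host module, librosa, Flask and the
    # Chordino plugin ("nnls-chroma:chordino") are installed.
    from flask import Flask, request, jsonify
    import librosa
    import vamp

    app = Flask(__name__)

    @app.route("/chords", methods=["POST"])
    def chords():
        # Decode the uploaded audio at its native sample rate, mixed to mono.
        audio, rate = librosa.load(request.files["audio"], sr=None, mono=True)

        # Run the chord extractor; the result structure depends on the plugin's
        # output type (assumed here: a list of labelled chord-change events).
        result = vamp.collect(audio, rate, "nnls-chroma:chordino")
        events = [{"time": str(f["timestamp"]), "chord": f.get("label", "")}
                  for f in result.get("list", [])]
        return jsonify(events)

    if __name__ == "__main__":
        app.run(port=5000)   # e.g. POST an audio file to /chords

A real deployment would also need to fetch and decode the audio for a requested video, queue long-running analyses and cache results, which is where most of the engineering effort in a service like Yanno would lie.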

Teachers and students gave very positive feedback (verbally and by email), and the tool was subsequently used in classroom teaching in our partner schools. One school enthusiastically forwarded details to a music education network, leading to significant interest in the service. In one month (March 2012) Yanno had 7,637 views, and since its launch in early 2012 it has analysed 3,236 distinct YouTube videos, currently receiving around 2,800 hits/month (website statistics, Jan-Jun 2013).

Public Engagement and use in Performance

SV has been used in a variety of public engagement events, including at the Science Museum Dana Centre (over 100 attendees) and the EPSRC Pioneers ’09 event (over 700 attendees) [Im8]. SV has also been used to create visualisations of music as a backdrop for a dance performance during the Cats Meet Show at Latitude Festival 2011 (Henham Park, 14-17 July 2011) [I10]. The show was held on the Music and Film stage at the Festival, and attracted over 500 people.

Popular Audio

Sonic Visualiser was featured (Sep 2011) in “SoundStage! HiFi”, an online magazine for audiophiles. The article showed how music enthusiasts could use Sonic Visualiser to examine recordings sold as “high resolution” (i.e. sampled at a higher sample rate), to see whether they were really produced at a higher sampling rate or are merely traditional 44.1 kHz or 48 kHz sources upsampled to 88.2 kHz or 96 kHz. [I11]
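
The check described in the article can also be reproduced outside Sonic Visualiser. The sketch below compares spectral energy above 24 kHz with the 10-22 kHz band: a genuinely high-resolution recording usually retains some content above the 44.1/48 kHz Nyquist limit, whereas an upsampled file shows a hard cutoff there. The file name, frequency bands and rule-of-thumb threshold are illustrative assumptions, not taken from the article.

    # Minimal sketch of the "is it really high resolution?" check described
    # above, using NumPy and the soundfile library. File name, bands and the
    # -60 dB rule of thumb are illustrative assumptions.
    import numpy as np
    import soundfile as sf

    audio, rate = sf.read("allegedly_96k.flac")   # hypothetical example file
    if audio.ndim > 1:
        audio = audio.mean(axis=1)                # mix down to mono
    if rate <= 48_000:
        raise SystemExit("Not a high-rate file: nothing above 24 kHz to inspect.")

    # Magnitude spectrum of the whole (windowed) signal.
    spectrum = np.abs(np.fft.rfft(audio * np.hanning(len(audio))))
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / rate)

    low_band = spectrum[(freqs >= 10_000) & (freqs <= 22_000)].mean()
    high_band = spectrum[freqs >= 24_000].mean()

    ratio_db = 20 * np.log10((high_band + 1e-12) / (low_band + 1e-12))
    print(f"Energy above 24 kHz relative to 10-22 kHz: {ratio_db:.1f} dB")
    # A very large negative ratio (e.g. below about -60 dB) suggests a 44.1/48 kHz
    # source upsampled to 88.2/96 kHz rather than a true high-resolution recording.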

Use in Massive Open Online Courses (MOOCs)

Sonic Visualiser was featured in the Coursera course “Introduction to Digital Sound Design” [I4], with a 20-minute tutorial dedicated to SV (Jan 2013). This course has had 6,800 Facebook likes and led to a “spike” of approximately 6,000 additional downloads of SV by users of the course.

Teaching in HEIs

SV is used to support the teaching of audio and music at Queen Mary and at other institutions. Examples of non-QM courses include:

  • Universitat Pompeu Fabra, Barcelona: Module “Audio and Music Processing”, Master in Sound and Music Computing [I2]
  • University of Montreal, Canada: Module IFT6808 “Music and Machine Learning” (2009) [I3]


Underpinning Research


C4DM research into Music Informatics is grounded in digital signal processing (DSP) for extracting features from musical audio. In 1998 we published one of the field’s earliest papers to tackle automatic music genre analysis [R1]. This contributed to a JISC/NSF Digital Libraries co-funded project, OMRAS (On-line Music Recognition and Search, www.omras.org, 1999-2003) with Sandler, Oxford University and University of Massachusetts (Amherst). Sandler (Professor of Signal Processing) and other key researchers (Bello, Reiss) and academics (Plumbley) moved to Queen Mary in 2001. The OMRAS project (now at Queen Mary and Goldsmiths) produced pioneering work using audio queries to search symbolic music databases, summarized in [R2]. It also established the ISMIR conference series (ismir2000.ismir.net), with 260 delegates in 2012.

The EU FP6 SIMAC project (mtg.upf.edu/static/semanticaudio, 2004-2006) was one of the first major European music informatics projects. C4DM was responsible for developing and defining feature extraction algorithms, including rigidly defined structured semantics for audio features. This led to a major EPSRC ICT “Large Grant” project OMRAS2 (www.omras2.org, 2006-2010, £2.5M).

The SIMAC and OMRAS2 projects (and others) supported the development of our open-source, cross-platform tool “Sonic Visualiser”, designed to allow our music informatics research methods to be used by a wide range of users (see Impact). Examples of our methods available as Sonic Visualiser “Vamp” plugins (www.vamp-plugins.org) include: note onset detector, beat and barline tracker, tempo estimator, key estimator, tonal change detector, structural segmenter, timbral and rhythmic similarity estimator, adaptive spectrogram, note transcription, chromagram, harmony extraction, chord extraction [R3], and audio alignment.
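
These plugins can be run in any Vamp host, not only in Sonic Visualiser itself. As a minimal sketch, the example below runs one of the plugins from Python, assuming the Python “vamp” host module, librosa and the QM Vamp Plugins set are installed; the plugin key and audio file name are illustrative.

    # Minimal sketch of running a Vamp feature extractor outside Sonic
    # Visualiser, assuming the Python "vamp" host module, librosa and the
    # QM Vamp Plugins are installed. Plugin key and file name are examples.
    import librosa
    import vamp

    audio, rate = librosa.load("example.wav", sr=None, mono=True)

    # List the Vamp plugins the host can find on this machine.
    print(vamp.list_plugins())

    # Run the key estimator over the whole file. The exact result structure
    # depends on the plugin's output type; label-bearing outputs are assumed
    # here to come back as a list of timestamped features.
    result = vamp.collect(audio, rate, "qm-vamp-plugins:qm-keydetector")
    for feature in result.get("list", []):
        print(feature["timestamp"], feature.get("label", feature.get("values")))

The same host call works for any of the plugins listed above; Sonic Annotator offers an equivalent batch command-line workflow.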

For particular audiences we have undertaken ethnographic research into potential users of our music informatics visualisation technology during the development of tailored systems, specifically for musicologists at the British Library [R4] and for music lessons in schools [R5].

To help promote take-up of the research and maximize impact, the Sonic Visualiser website includes a range of tutorial material, including videos on user-oriented tasks such as Mapping Rubato and Loudness, Mapping Melody and Adding in Bar and Beat Numbers.

C4DM research in Music Informatics has a major impact in other research fields. For example, the work on classification in the wavelet domain [R1] has received 105 citations in a wide range of fields including medical, finance, oil & gas, semiconductor and sports (Google Scholar, March 2012), and also has 50 patent citations [I5].


References


  • [R1] T. Lambrou, P. Kudumakis, R. Speller, M. Sandler and A. Linney. Classification of audio signals using statistical features on time and wavelet transform domains. Proc. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Seattle, WA, USA, 12-15 May 1998, Vol. 6, pp. 3621-3624. ISBN 0-7803-4428-6. GS: 130 citations.
  • [R2] J. Pickens, J. P. Bello, G. Monti, M. Sandler, T. Crawford, M. Dovey and D. Byrd. Polyphonic score retrieval using polyphonic audio queries: A harmonic modeling approach. Journal of New Music Research, Vol. 32, No. 2, pp. 223-236, 2003. GS: 92 citations.
  • [R3] M. Mauch and S. Dixon. Simultaneous estimation of chords and musical context from audio. IEEE Transactions on Audio, Speech and Language Processing, 18(6), pp. 1280-1289, 2010. DOI: 10.1109/TASL.2009.2032947. GS: 73 citations.
  • [R4] M. Barthet and S. Dixon. Ethnographic observations of musicologists at the British Library: implications for music information retrieval. Proc. 12th International Society for Music Information Retrieval Conference (ISMIR), Miami, Florida, USA, 2011. GS: 5 citations.
  • [R5] D. Stowell and S. Dixon. Integration of informal music technologies in secondary school music lessons. British Journal of Music Education, published online 28 Aug 2013. DOI: 10.1017/S026505171300020X.
  • [R6] J. P. Bello et al. A tutorial on onset detection in music signals. IEEE Transactions on Speech and Audio Processing, 13(5), pp. 1035-1047, 2005. GS: 488 citations.


Impact Corroboration