Confirmed Keynote Speakers
Beat tracking evaluation from the MIR perspective: How we try to measure if computer algorithms can tap the beat
Matthew Davies – Sound and Music Computing Group, INESC TEC
In my talk I would like to address the topic of beat tracking evaluation as currently undertaken within the music information retrieval (MIR) community. To this end, I will provide an overview of the typical MIR beat tracking evaluation pipeline, from the selection of relevant test material, through the means of annotating ground truth, to the array of evaluation measures currently used to calculate beat tracking accuracy. For each aspect I will seek to highlight the specific open challenges and, in particular, focus on the need for “evaluating the evaluation measures”. My overarching goal will be to engage with members of the RPPW community towards a deeper understanding of the multidisciplinary aspects of rhythm perception and production, and hence their implications for MIR evaluation.
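To make one commonly used evaluation measure concrete, below is a minimal Python sketch of the F-measure for beat tracking, in which an estimated beat counts as a hit if it falls within a tolerance window (commonly ±70 ms) of an as-yet-unmatched annotation. The function name and example values are illustrative, not taken from the talk.

```python
def f_measure(reference, estimated, tolerance=0.07):
    """F-measure for beat tracking: an estimated beat is a hit if it
    falls within +/- tolerance seconds of an unmatched annotation."""
    matched = 0
    used = set()  # annotation indices already matched
    for est in estimated:
        for i, ref in enumerate(reference):
            if i not in used and abs(est - ref) <= tolerance:
                used.add(i)
                matched += 1
                break
    precision = matched / len(estimated) if estimated else 0.0
    recall = matched / len(reference) if reference else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# annotations every 0.5 s; three estimates are close, one is far off
ref = [0.5, 1.0, 1.5, 2.0]
est = [0.52, 1.01, 1.48, 2.6]
print(f_measure(ref, est))  # prints 0.75
```

Note that even this simple measure embeds evaluation choices, such as the size of the tolerance window and the one-to-one matching of beats to annotations, which is exactly the kind of decision the talk proposes to scrutinise.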
Matthew Davies is a music information retrieval researcher with a background in digital signal processing. His main research interests include the analysis of rhythm in music, MIR evaluation methodology, creative-MIR, harmony, music therapy and reproducible research. Matthew coordinates the Sound and Music Computing Group in the Centre for Telecommunications and Multimedia at INESC TEC.
Peter E. Keller – The MARCS Institute for Brain, Behaviour and Development, Western Sydney University
Basic sensorimotor synchronisation skills support complex displays of rhythmic interpersonal coordination in group behaviours including musical ensemble performance. One venerable research tradition, stemming largely from the field of experimental psychology, has focussed on sensorimotor coupling processes that enable interpersonal entrainment in such contexts. Another relevant tradition, pursued mainly in the field of cognitive neuroscience, has focussed on mechanisms that enable the prediction of others’ actions in social settings. My recent research has aimed to bring together these different perspectives, resulting in a computational model of temporal adaptation and anticipation in sensorimotor synchronisation. I will present key findings from a program of research that tests this model in controlled laboratory experiments, naturalistic studies involving musical ensemble performance, and neuroscientific investigations of how temporal adaptation and anticipation are linked in the human brain. The findings of this research program have implications for applications in music pedagogy and clinical settings.
Peter Keller conducts research that is aimed at understanding the behavioural and brain bases of human interaction in musical contexts. His specific interests include the cognitive and motor processes that enable ensemble musicians to coordinate with one another. Peter has served as Editor of the interdisciplinary journal ‘Empirical Musicology Review’ (2010-2012) and as a member of the Editorial Board at ‘Advances in Cognitive Psychology’ (2005-2015). He is currently an Associate Editor at ‘Royal Society Open Science’ and a Consulting Editor for ‘Music Perception’ and ‘Psychomusicology: Music, Mind, and Brain’.
Automatic beat and downbeat tracking
Sebastian Böck, Department of Computational Perception, Johannes Kepler University
Jason Hockman, Digital Media Technology Lab, Birmingham City University
Automatic analysis of rhythmic structure in music has been an active research field since the 1970s. It is of prime importance for musicology and for tasks such as music transcription, automatic accompaniment, expressive performance analysis, music similarity estimation, and music segmentation. In this talk we will give an overview of the state of the art in computational rhythm description systems, with a special focus on beat and downbeat tracking. These systems are usually built, tuned, and evaluated on music of low rhythmic complexity. Using difficult examples, we will analyse the capabilities of these systems, investigate their shortcomings, and discuss challenges and ideas for future research with regard to music of higher complexity.
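As a deliberately reduced illustration of rhythm description, the sketch below derives a tempo estimate from a list of beat times via the median inter-beat interval. Real beat and downbeat trackers infer such quantities jointly from the audio signal itself; the names and values here are ours, purely for illustration.

```python
import statistics

def tempo_from_beats(beat_times):
    """Rough tempo estimate in BPM from beat times (seconds):
    the median inter-beat interval is robust to a few outliers."""
    iois = [b - a for a, b in zip(beat_times, beat_times[1:])]
    return 60.0 / statistics.median(iois)

# slightly jittered beats roughly 0.5 s apart
beats = [0.50, 1.01, 1.50, 2.02, 2.50]
print(tempo_from_beats(beats))  # prints 120.0
```

This reduction also hints at why rhythmically complex music is hard: once the beat times themselves are uncertain or non-isochronous, no single summary interval captures the rhythmic structure.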
Birgitta Burger, Finnish Centre for Interdisciplinary Music Research, University of Jyväskylä
This workshop will introduce the MoCap Toolbox, a freely available Matlab toolbox developed at the University of Jyväskylä, Finland, for analyzing and visualizing motion capture data. The toolbox offers about 70 functions for handling various kinds of mocap data. The workshop will give an overview of the structure of the toolbox and its data representations, as well as of its use for research and analysis purposes, covering general data handling, the creation of stick-figure images and animations, kinematic and kinetic analysis, and methods for periodicity and synchronization estimation.
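The toolbox itself is written in Matlab, but the idea behind periodicity estimation for movement data can be sketched in Python: the dominant period of a roughly periodic signal can be read off the first peak of its autocorrelation. Function and variable names below are illustrative and are not part of the MoCap Toolbox API.

```python
import numpy as np

def estimate_period(signal, fs):
    """Estimate the dominant period (seconds) of a periodic movement
    signal sampled at fs Hz, via the first autocorrelation peak.
    Assumes the signal actually contains a clear periodicity."""
    x = signal - np.mean(signal)
    ac = np.correlate(x, x, mode='full')[len(x) - 1:]  # lags 0..N-1
    ac /= ac[0]                                        # normalise lag 0 to 1
    d = np.diff(ac)
    # first local maximum after the central peak marks the period
    rising = np.where((d[:-1] > 0) & (d[1:] <= 0))[0] + 1
    return rising[0] / fs

fs = 100                              # e.g. a 100 Hz capture rate
t = np.arange(0, 4, 1 / fs)
bounce = np.sin(2 * np.pi * 2.0 * t)  # 2 Hz bouncing movement
print(estimate_period(bounce, fs))    # prints 0.5 (the 0.5 s period)
```

Synchronization between two movement signals, or between movement and a musical beat, can be examined along similar lines by cross-correlating the two signals instead.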
Exploring rhythm through the movements of the body
Arnould Massart, Conservatoire Royal de Bruxelles
Western music teaching methods have led to an abstract conception of musical rhythm. In many cases the basic link between rhythm and bodily experience is overlooked or simply ignored. This results in a general withering of rhythmic sense and feeling and, hence, of potential musical significance. An alternative approach is illustrated in this practical workshop, in which all participants can experiment with the interactions between body movement and rhythm perception and production. We will step, clap, bounce, twist, and sway together in a circle, using our legs, our arms, and our voices to explore various rhythmic situations in a relaxed and creative atmosphere.
Bayesian approaches to time and timing
Darren Rhodes, Sackler Centre for Consciousness Science, University of Sussex
Max Di Luca, School of Psychology, University of Birmingham
Time is a fundamental dimension of perception, action and cognition. In recent years, Bayesian principles have been applied to the perception of time and timing. Bayesian models have been applied to a variety of temporal dimensions, including perceived event timing, duration, and the relative timing between two events. The fundamental principle of these models is that the perception of temporal properties results from combining temporal expectations with current sensory evidence. A broad distinction exists between summary and dynamic Bayesian models of time. Summary models have been tied to the internal clock model, whilst dynamic models have been connected to the idea that timing can be accomplished in the absence of clocks. The aim of this symposium is to bring together these different approaches to time in order to facilitate research and theory towards a general unified model of human time perception.
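The core principle, combining a prior temporal expectation with noisy sensory evidence, can be sketched for the simple Gaussian case, where the posterior mean is a precision-weighted average of the two. The numbers below are illustrative only; they show the characteristic regression of estimates towards the expected duration.

```python
def bayes_duration_estimate(measured, prior_mean, prior_var, sensory_var):
    """Posterior mean for a Gaussian prior combined with a Gaussian
    likelihood: a precision-weighted average of expectation and
    evidence. All quantities in the same units (here, milliseconds)."""
    w = prior_var / (prior_var + sensory_var)  # weight on the evidence
    return prior_mean + w * (measured - prior_mean)

# context of intervals around 600 ms; a noisy measurement reads 800 ms;
# with equally reliable prior and evidence the estimate lands midway
print(bayes_duration_estimate(800, prior_mean=600,
                              prior_var=100**2,
                              sensory_var=100**2))  # prints 700.0
```

When sensory noise grows (larger `sensory_var`), the weight on the evidence shrinks and estimates are drawn more strongly towards the prior, one way such models account for central-tendency effects in duration judgements.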