23 January 2014
Motion in Musical Interaction
Meeting Room 10, 2nd Floor, JLB
13:00 - 14:15
Dr Baptiste Caramiaux - Goldsmiths, University of London
In this talk, we will present our research on motion for expressive human-computer interaction, using specific use cases from NIME. First, we will cover theoretical aspects of motion and motion in interaction, borrowed from the cognitive sciences. Then we will focus on the design of motion-based interaction relying on advanced machine learning techniques. Such an approach provides powerful tools for understanding the expressive aspects of gesture data and for linking gesture to sound in a meaningful way. Finally, we will present advances in the design of interfaces for expressive musical performance involving embedded sensors and multimodal inputs with biosensors.
Baptiste Caramiaux is a post-doctoral fellow at Goldsmiths, University of London, funded by the European Research Council (ERC) project MetaGesture Music. He is interested in machine learning for interaction design, i.e. how a system that automatically learns expressive information can be suited to designing interaction. His research is grounded in Embodied Cognition theory, which gives the body a predominant role in interaction. He is particularly interested in creative and artistic applications, for example designing new gesture-sound interactions for music. Before joining Goldsmiths, he received a PhD from IRCAM – Centre Pompidou and University Paris VI.