- 02.12.2016 to 04.12.2016
»Model-Based Sonification / Interactive Functional Sound«
Model-Based Sonification (MBS) is a technique for sonifying data based on the data’s inherent structure. In contrast to Parameter Mapping Sonification, in MBS the data defines a dynamical system, akin to a physical model, which users can excite interactively, receiving the system’s response as the auditory representation. The ability to hear the inherent features of a dataset makes MBS a suitable option for exploratory data analysis, where explicit knowledge of the data is absent. The talk will demonstrate a technique called Particle Trajectory Sonification as an example of MBS for analysing cluster information in high-dimensional data.
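The idea behind Particle Trajectory Sonification can be illustrated with a minimal sketch: data points define a potential field (here, Gaussian wells of assumed width `sigma`), a test particle is injected and follows damped Newtonian dynamics, and its kinetic energy over time forms the audio signal. Dense clusters trap the particle in faster oscillations, which is audible as higher pitch. All function names, parameters, and values below are illustrative assumptions, not the speaker’s implementation.

```python
import numpy as np

def particle_trajectory_sonification(data, n_steps=2000, dt=0.01,
                                     sigma=0.5, friction=0.02, seed=0):
    """Sketch of Particle Trajectory Sonification (illustrative only).

    Each data point contributes a Gaussian potential well of width sigma.
    A test particle moves under the resulting attractive force with
    velocity damping; its kinetic energy per step is the audio sample.
    """
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)
    # inject the particle near the data's centre of mass, slightly perturbed
    pos = data.mean(axis=0) + rng.normal(scale=1.0, size=data.shape[1])
    vel = np.zeros(data.shape[1])
    signal = np.empty(n_steps)
    for t in range(n_steps):
        diff = data - pos                                   # vectors to all data points
        w = np.exp(-np.sum(diff**2, axis=1) / (2 * sigma**2))
        force = (diff * w[:, None]).sum(axis=0) / sigma**2  # -grad of summed potential
        vel += dt * force - friction * vel                  # damped Newtonian update
        pos += dt * vel
        signal[t] = 0.5 * np.sum(vel**2)                    # kinetic energy -> sample
    return signal - signal.mean()                           # remove DC offset
```

Played back at audio rate (or used to modulate an oscillator), the signal’s spectral content reflects the local curvature of the potential, i.e. the cluster structure of the data, which is what makes the method useful for exploratory analysis.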
The term ‘Functional Sound’ refers to sound designed to serve a specific purpose. Most sound can be regarded as functional: language is used for communication, an alarm bell triggers a warning, and film soundtracks enhance the emotional experience. Our mission is to expand the spectrum in which sound can be useful and interacted with. This talk will present several examples of how we can develop interactive sound systems that serve particular functions, such as a communicative tool for emotional expression or the augmentation of speech intelligibility.
ZKM | Institut für Bildmedien
Camera: Frenz Jordt
Editing: Frenz Jordt
Live editing: Martina Rotzal