EEG Sonification for enhanced analysis and diagnosis of epilepsy
Dynamically complex diseases with distributed, multi-scale interacting physiological rhythms require a more refined temporal investigation than is currently provided by visual displays of physiological data. We argue that sonification, the auditory inspection of experimental data, provides a unique approach to representing the temporal, spectral and spatial aspects of EEG data, as it addresses the human sense of listening. The ear's capacity to detect temporal patterns of sameness and difference, of coincidence and coordination – widely exploited in listening to music and spoken language – opens new temporal perspectives for the scientific observer. Since listeners constantly apply auditory learning to establish concepts, this can be exploited for sound-based (differential) diagnosis of pathologies. Since the sound represents multivariate time series as they unfold in time, sonification can be used for peripheral monitoring tasks, leaving the eyes free for other activities. Since sonification shifts complex information that would otherwise be presented visually into the auditory domain, it can be combined with visual displays – or simply with direct interaction with the patient – as a multimodal tool. Over the past 15 years, we have developed a multitude of methods for EEG sonification, from audification via parameter-mapping sonification to event-based sonification and even vocal EEG sonification, which turns the multivariate time series into an articulated vowel stream. This is not only easier to memorize but also enables listeners to reproduce heard patterns with their own vocal tract, facilitating communication (e.g. between doctors).
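To make the basic idea concrete, the sketch below illustrates two of the simplest strategies named above – audification (time-compressed playback of the raw signal) and parameter-mapping sonification (signal amplitude driving the pitch of an oscillator) – applied to a synthetic EEG-like trace. This is not the project's implementation; the sampling rates, speedup factor and pitch range are illustrative assumptions.

```python
# Illustrative sketch only (not the project's implementation): audification and
# a simple parameter-mapping sonification of a synthetic, EEG-like signal.
# Sampling rates, speedup factor and pitch range are arbitrary assumptions.
import numpy as np
from scipy.io import wavfile

AUDIO_RATE = 44100   # audio sampling rate in Hz (assumed)
EEG_RATE = 256       # EEG sampling rate in Hz (assumed)

# Synthetic stand-in for one EEG channel: 10 s of alpha-like background
# with a crude 3 Hz "spike-wave" burst between 4 s and 6 s.
t = np.arange(0, 10, 1 / EEG_RATE)
eeg = 0.3 * np.sin(2 * np.pi * 10 * t)
burst = (t > 4) & (t < 6)
eeg[burst] += np.sign(np.sin(2 * np.pi * 3 * t[burst]))

def audify(x, speedup=40):
    """Audification: play the raw signal itself, time-compressed so that
    EEG rhythms of a few Hz are shifted into the audible range."""
    n_out = int(len(x) * AUDIO_RATE / (EEG_RATE * speedup))
    y = np.interp(np.linspace(0, len(x) - 1, n_out), np.arange(len(x)), x)
    return y / (np.max(np.abs(y)) + 1e-12)

def parameter_map(x, f_lo=200.0, f_hi=1000.0):
    """Parameter-mapping sonification: instantaneous signal amplitude
    drives the pitch of a sine oscillator (arbitrary 200-1000 Hz range)."""
    amp = np.abs(x) / (np.max(np.abs(x)) + 1e-12)
    n_out = int(len(x) / EEG_RATE * AUDIO_RATE)          # keep real-time duration
    ctrl = np.interp(np.linspace(0, len(x) - 1, n_out), np.arange(len(x)), amp)
    freq = f_lo + (f_hi - f_lo) * ctrl
    phase = 2 * np.pi * np.cumsum(freq) / AUDIO_RATE     # phase accumulation
    return 0.5 * np.sin(phase)

wavfile.write("audification.wav", AUDIO_RATE, audify(eeg).astype(np.float32))
wavfile.write("parameter_mapping.wav", AUDIO_RATE, parameter_map(eeg).astype(np.float32))
```

In the audified version the burst stands out as a sudden change in pitch and timbre; in the parameter-mapped version it is heard as a pitch rise, which hints at why listeners can learn to recognise such patterns by ear.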
People involved
- Thomas Hermann
- Gerold Baier
Publications
Supplementary Material