Universität Bielefeld, CITEC, Ambient Intelligence Group

MoveSound

2008/2009 by Till Bovermann.

MoveSound

MoveSound is an azimuth panning interface for up to 16 sources in a ring of an arbitrary number of loudspeakers. Both the position and the width of arbitrary sound sources can be adjusted with it. By providing the user with an interface to select one or more sources to operate on, the system allows several such sources to be controlled at the same time. Together with the integrated azimuth and width panning controls, this functionality opens the field of dynamic sound spatialisation to untrained users as well. MoveSound was designed as a software system that can easily be attached to human interface devices. Its usage scenarios range from spatial control of unobtrusive ambient soundscapes to dynamic spatial control of sound sources in artistic contexts.
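The text above does not spell out MoveSound’s actual panning law, but the underlying idea of azimuth and width panning on a speaker ring can be sketched as follows: a window of the chosen angular width is centred on the source azimuth, sampled at every loudspeaker position, and the resulting gains are power-normalised so the overall loudness stays constant while the source moves. The Python sketch below illustrates this for an equidistant ring; the function name ring_pan_gains and the raised-cosine window are illustrative assumptions, not MoveSound’s implementation or API.

    import math

    def ring_pan_gains(azimuth, width, num_speakers):
        """Per-speaker amplitude gains for one source on an equidistant speaker ring.

        azimuth      -- source direction in radians (0 = first speaker)
        width        -- angular spread of the source in radians (> 0)
        num_speakers -- number of loudspeakers in the ring

        A raised-cosine window of the given width is centred on the source
        azimuth and sampled at each speaker position; the gains are then
        power-normalised. (Illustrative sketch, not MoveSound's panning law.)
        """
        spacing = 2 * math.pi / num_speakers
        gains = []
        for i in range(num_speakers):
            # shortest signed angular distance between source and speaker i
            delta = (azimuth - i * spacing + math.pi) % (2 * math.pi) - math.pi
            if abs(delta) < width / 2:
                gains.append(0.5 * (1 + math.cos(2 * math.pi * delta / width)))
            else:
                gains.append(0.0)
        norm = math.sqrt(sum(g * g for g in gains)) or 1.0
        return [g / norm for g in gains]

    # Example: one source at 45 degrees, spread over half the ring, 8 speakers
    print(ring_pan_gains(math.radians(45), math.pi, 8))

Widening the source spreads its energy over more adjacent speakers, while the normalisation keeps the summed power constant, which is what allows the width control to be changed independently of the azimuth.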

Case Study

We conducted a case study to gain insights into the usefulness of MoveSound as an interface for controlling spatial parameters of soundscapes. Its primary goal was to find out whether the minimalist tangible auditory interface (TAI) realised with MoveSound is sufficient for people to control the spatial distribution of sound, and how they feel while operating it. Although the survey was designed to be explorative, we particularly looked for indicators regarding the following questions:

  • Do people understand MoveSound’s capabilities?
  • Do they experience any limits of control?
  • Does their behaviour differ depending on the level of detail of the visual feedback?

It turned out that people do understand MoveSound’s capabilities, at least as far as the challenges of this case study are concerned. In most cases, they experienced limits of control only with regard to sound source selection, a problem we want to address in future extensions of the interface. Although all participants were able to fulfil the challenges to their satisfaction, we found differences in their behaviour depending on the visual feedback system. While the full display exclusively attracted the users’ gaze, the reduced visual display let the participants’ gaze wander around. We believe that this makes them more present in the soundscape itself, rather than focusing on the model provided by the MoveSound interface.


People involved in the Production Process

Till Bovermann, René Tünnermann.

Contributions

For more information and videos, see TangibleAuditoryInterfaces.de.