Tangible Active Objects

Acronym: 
TAOs
Research Areas: 
B
C
Abstract: 

In this project, a novel Human-Computer Interaction (HCI) approach is developed. Small-sized reconfigurable robots, so-called Tangible Active Objects (TAOs), with multi-modal input and output capabilities will act as an embodiment of data.
In this way, existing Tangible User Interfaces (TUIs) can be extended with active feedback capabilities, or entirely new applications can be developed.


Methods and Research Questions: 

The field of Tangible Interaction is a relatively young subfield of Human-Computer Interaction (HCI), and much fundamental research remains to be done. Actuated Tangible User Interface Objects (TUIOs) are a very recent concept, and considerable effort must still be invested in interaction design, interaction concepts, and user experience, mainly through user studies.

The basic approach of Tangible User Interfaces (TUIs) is to externalize digital data into the real world so that it can be manipulated naturally. Most table-top TUIs cannot easily represent dynamically changing scenarios because they cannot control the physical properties of their TUIOs, such as position and orientation. This is why we incorporated small-sized mobile platforms into our TUIOs, which we call Tangible Active Objects (TAOs).
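To illustrate what the mobile platforms enable, the following minimal sketch (Python) drives a single TAO to a target position on the tabletop with a simple proportional controller. It assumes a differential-drive platform tracked from above; the Tao object and its get_pose/set_wheel_speeds methods are hypothetical names for illustration, not the project's actual API.

    import math

    def drive_to_target(tao, target_x, target_y, tolerance=0.01,
                        k_turn=2.0, k_drive=1.0):
        """Move a differential-drive TAO toward a target tabletop position.

        Hypothetical sketch: 'tao' is assumed to expose get_pose() -> (x, y,
        heading) from the table-top tracking system and set_wheel_speeds(left,
        right) for its two wheels. Returns True once the target is reached.
        """
        x, y, heading = tao.get_pose()
        dx, dy = target_x - x, target_y - y
        distance = math.hypot(dx, dy)
        if distance < tolerance:
            tao.set_wheel_speeds(0.0, 0.0)   # target reached, stop
            return True
        # Angular error toward the target, wrapped to [-pi, pi]
        error = math.atan2(dy, dx) - heading
        error = math.atan2(math.sin(error), math.cos(error))
        turn = k_turn * error
        # Drive forward only when roughly facing the target
        forward = k_drive * distance * max(0.0, math.cos(error))
        tao.set_wheel_speeds(forward - turn, forward + turn)
        return False

Called once per tracking frame, this closed loop lets the system set a TUIO's position itself, which is exactly the capability ordinary passive TUIOs lack.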

Most users are not familiar with Human Interface Devices (HIDs) that actually move (beyond force feedback). This is where our studies come into play: we want to investigate how users interact with such novel devices, how they perform in certain tasks, and how well they accept these new interaction concepts. For this, we conduct user studies to measure performance, gather gestures made with the TAOs, and evaluate applications that implement our new concepts.


Outcomes: 

So far, we have presented a novel approach for combined auditory and haptic interactive rendering of scatter plots. Through Interactive Sonification and TAOs, it was possible to create a rich exploratory data analysis interface for the visually impaired. As a novel contribution, we introduced a hybrid interaction schema in which a continuous density sonification and a model-based sonification using data sonograms are tightly interwoven, yielding a rich repertoire of exploratory interactions in a multi-modal user interface. The system was evaluated in a first user study and was shown to enable non-visual exploration of scatter plots.
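To make the density half of this hybrid schema concrete, the following sketch (Python) estimates the local density of the scatter plot at the TAO's tracked position and maps it to a pitch, so that denser regions sound higher. The Gaussian kernel, its bandwidth, and the pitch mapping are illustrative assumptions, not the parameters of the published system.

    import math

    def local_density(points, probe_x, probe_y, bandwidth=0.1):
        """Gaussian kernel density estimate of the scatter plot at the
        probe position (e.g. the TAO's tracked tabletop coordinates)."""
        total = 0.0
        for x, y in points:
            d2 = (x - probe_x) ** 2 + (y - probe_y) ** 2
            total += math.exp(-d2 / (2.0 * bandwidth ** 2))
        return total / len(points)

    def density_to_pitch(density, base_hz=200.0, span_hz=800.0):
        """Map density monotonically to a frequency in Hz: denser
        regions of the data yield a higher pitch."""
        return base_hz + span_hz * (1.0 - math.exp(-5.0 * density))

Re-evaluating this mapping continuously while the user moves the TAO gives the continuous density stream; the model-based data sonograms would be triggered as discrete excitations on top of it.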

Another approach combines the TAOs with gestural input for triggering actions. We conducted a study to investigate whether, how, and which gestures are suitable for interacting with a social network through TAOs. We have already identified tendencies toward suitable gestures and, moreover, received valuable feedback from our subjects on improvements to the system design and on our assumptions about the interaction design.


Publications: