When A Robot Politely Requests Your Attention

CITEC researcher studying socially appropriate behavior for assistance systems

In human interaction, what is perceived as polite and appropriate depends on the situation. Assistance systems, however, do not make such distinctions. For people to accept assistive systems in their everyday lives – and not see them as a nuisance – it helps if these systems behave in a socially appropriate manner. And this is what Dr. Ricarda Wullenkord studies.

Whether it is a cleaning robot or just the calendar in your smartphone reminding you of an appointment, assistance systems now play a significant role in the daily lives of many people – and this is what Dr. Ricarda Wullenkord’s research delves into at Bielefeld University’s Cluster of Excellence CITEC. She is a member of the “Applied Social Psychology and Gender Research” group, which belongs to CITEC and the Department of Psychology at Bielefeld University. The research group is headed by Professor Dr. Friederike Eyssel.

Wullenkord is participating in “poliTE”, a joint project with researchers from the Cluster of Excellence CITEC at Bielefeld University as well as FokoS (Forschungskolleg ‘Zukunft menschlich gestalten’ [Research Center ‘Shaping the future’]) at the University of Siegen. They are studying the ways in which such assistance systems can behave in a socially appropriate manner – and what this even means in the first place.

“For one thing, most systems are very dominant and demand attention in every situation,” says Wullenkord. Assistance systems do not distinguish between different situations in their behavior. “A system like this is always chiming in,” explains the researcher. However, this may not always be appropriate: if, for example, you are giving a lecture or doing a job interview, it would make sense at that moment for the respective assistance system to move into the background on its own – without prior intervention by the user – and remain silent.
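The behavior described here – a system deciding on its own to stay silent in certain situations – can be pictured as a simple context rule. The following is a minimal, hypothetical Python sketch; the situation labels and the `should_notify` function are illustrative assumptions, not part of the poliTE project:

```python
from enum import Enum, auto

class Situation(Enum):
    """Coarse labels for the user's current context (illustrative only)."""
    IDLE = auto()
    EATING = auto()
    LECTURE = auto()
    JOB_INTERVIEW = auto()

# Situations in which, per the article's examples, a system
# should move into the background without user intervention.
QUIET_SITUATIONS = {Situation.EATING, Situation.LECTURE, Situation.JOB_INTERVIEW}

def should_notify(situation: Situation, urgent: bool = False) -> bool:
    """Suppress non-urgent notifications in socially sensitive situations."""
    if urgent:
        return True
    return situation not in QUIET_SITUATIONS

print(should_notify(Situation.LECTURE))  # → False: stay silent during a lecture
print(should_notify(Situation.IDLE))     # → True: chiming in is acceptable
```

The hard part, of course, is not this rule table but recognizing the situation in the first place – which is exactly where the project’s question of what counts as socially appropriate comes in.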

Investigating What Social Appropriateness Means
“In this project, we first want to find out what social appropriateness actually is, and which rules govern it,” says the psychologist. To do so, the researchers are surveying the academic literature on the subject. “We are first doing basic research,” says Wullenkord. A second step, following the project, could be to apply the rules derived from this research to assistive systems in empirical studies.

“The question of what social appropriateness is often can’t even be answered in general terms,” explains Wullenkord. For example, it can simply be a matter of etiquette: “If people do not want to be disturbed while eating, then an assistive system should not interrupt them.”

The researchers have already analyzed more than 3,000 articles on the subject from a wide range of disciplines. “The topic doesn’t just come up in philosophy, sociology, or psychology,” says Wullenkord. Linguistics, for example, also deals with social appropriateness. The project will initially run until 2020. “Our goal is to have written a manual by then.”

This manual is meant to define which aspects are important for assistive systems so that their users will perceive them to be socially appropriate in their behavior. “Assistance systems are constantly evolving,” explains Wullenkord: a robot that cleans the apartment and helps with household chores should not be a disturbance.

Appropriateness Can Already Be Seen in Seemingly Small Things
Take a virtual fitness coach, for example: it, too, is meant to interact appropriately with its users. “It starts with things that sound minor at first,” says Wullenkord. It is important, for example, that the coach greets its users, saying things like “hi” and “bye.” “Beyond this, it is also of course important that the coach communicates with the athletes in a friendly manner and motivates them, instead of being disparaging.”

The CITEC researcher says that what is socially appropriate is usually that which is perceived to be normal in social interaction. “This has also been demonstrated by the findings of our literature survey to date,” says Wullenkord: “Common is moral.” The conversion of these findings into concrete behavioral instructions will ultimately be a task for computer scientists. “A goal in the future will be for robots to someday be able to learn what is socially appropriate on their own.” If this happens, it may help robots to be more readily accepted as companions in everyday life.

In order to gain further insight into this topic, a workshop will be held for experts at Bielefeld University’s Cluster of Excellence CITEC from 28–29 November (https://polite.fokos.de/workshop-2018).
At the workshop, researchers will also be discussing the question of what constitutes socially appropriate behavior and how this is codified in cultural behavioral practices. Perspectives from social psychology, cultural theory, philosophy, and anthropology as well as other disciplines will help shed light on the topic.

Dr. Ricarda Wullenkord, Bielefeld University
Cluster of Excellence CITEC / Faculty of Psychology and Sports Science
Telephone: +49 521 106-12133
Email: rwullenk@cit-ec.uni-bielefeld.de

Written by: Maria Berentzen