Robots that interact with humans in everyday situations need to be able to interpret the nonverbal social signals of their human interaction partners. We show that humans use body posture and head pose as social signals to initiate and terminate interaction when ordering drinks at a bar. To this end, we record and analyze 108 interactions between human customers and a human bartender. Based on these findings, we train a Hidden Markov Model (HMM) using automatic body posture and head pose estimation. With this model, the bartender robot of the JAMES project can recognize typical social signals of human customers. Evaluation shows a recognition rate of 82.9 % across all implemented social signals and, in particular, a recognition rate of 91.2 % for bartender attention requests, which allows the robot to interact with multiple humans in a robust and socially appropriate way.
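To make the recognition approach concrete, the following is a minimal, illustrative sketch of HMM-based decoding of social-signal states from discretized posture/head-pose observations. The state names, observation symbols, and all probabilities below are invented placeholders, not the paper's learned parameters (which were estimated from the 108 recorded interactions); the sketch only shows the kind of Viterbi decoding such a model performs.

```python
# Hypothetical discrete HMM for social-signal recognition at a bar.
# States and all probabilities are illustrative assumptions, not the
# parameters learned in the JAMES project.

STATES = ["not_interacting", "requesting_attention", "ordering"]

# Initial state distribution (assumed).
start_p = {"not_interacting": 0.8, "requesting_attention": 0.15, "ordering": 0.05}

# State transition probabilities (assumed).
trans_p = {
    "not_interacting":      {"not_interacting": 0.7, "requesting_attention": 0.25, "ordering": 0.05},
    "requesting_attention": {"not_interacting": 0.1, "requesting_attention": 0.5,  "ordering": 0.4},
    "ordering":             {"not_interacting": 0.2, "requesting_attention": 0.1,  "ordering": 0.7},
}

# Emission probabilities over discretized posture/head-pose cues (assumed).
emit_p = {
    "not_interacting":      {"body_away": 0.7,  "body_toward_bar": 0.2,  "head_to_bartender": 0.1},
    "requesting_attention": {"body_away": 0.1,  "body_toward_bar": 0.4,  "head_to_bartender": 0.5},
    "ordering":             {"body_away": 0.05, "body_toward_bar": 0.35, "head_to_bartender": 0.6},
}

def viterbi(observations):
    """Return the most likely state sequence for a list of observation symbols."""
    # Each column maps state -> (best path probability, best path so far).
    V = [{s: (start_p[s] * emit_p[s][observations[0]], [s]) for s in STATES}]
    for obs in observations[1:]:
        col = {}
        for s in STATES:
            prob, prev = max(
                (V[-1][p][0] * trans_p[p][s] * emit_p[s][obs], p) for p in STATES
            )
            col[s] = (prob, V[-1][prev][1] + [s])
        V.append(col)
    best = max(STATES, key=lambda s: V[-1][s][0])
    return V[-1][best][1]

# A customer turns toward the bar and then looks at the bartender:
path = viterbi(["body_away", "body_toward_bar", "head_to_bartender", "head_to_bartender"])
print(path)
# → ['not_interacting', 'requesting_attention', 'ordering', 'ordering']
```

In a real system the observation symbols would come from the automatic body posture and head pose estimation the abstract mentions, and the probabilities would be trained on the annotated recordings rather than set by hand.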