Nao programming II: Get social with Nao

Workshop coordinators: Erik Billing (University of Skövde) and Paul Baxter (Plymouth University)

Get social with Nao

In this workshop you will get into some real robot interaction. As a participant, you will develop an interaction scenario in which both humans and the Nao take part, and consider both how the robot should best behave (desiderata) and how that behaviour can be implemented with Nao (constraints).

A number of Nao robots will be available for use during the workshop. We ask all participants to install required software (see below) on their own computers prior to the workshop. 

Please read the following scenario descriptions and consider what you would like to focus on during the workshop. We encourage all participants to read the related literature for the selected scenario beforehand; this leaves more time for practical work during the workshop.

You may choose a scenario from the following three options:

Companion Nao I: Tactile Buddy

Tactile communication is important for humans, especially for children. In this scenario we look at how a robot should react to touch, and how this is perceived by humans. How can a robot differentiate between a friendly pet and an aggressive push, and should it?

You are to use the existing programming components for the Nao robot to get some tactile interaction going. Think about which HRI aspects come into play, ground your work in the resources provided below, and suggest how open questions could be investigated in a longer study.
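If you want a quick feel for the sensors before designing a full behaviour, a minimal polling sketch along the following lines should work. This is a sketch under stated assumptions, not the workshop's provided code: the robot address nao.local is a placeholder for your robot's address, and the ALMemory keys used are the standard ones for the head touch sensors.

```python
# Minimal sketch: poll Nao's head tactile sensors via ALMemory and
# react to touch. Assumes the NaoQI Python SDK is installed and the
# robot is reachable at ROBOT_IP.
import time
from naoqi import ALProxy

ROBOT_IP = "nao.local"  # assumption: replace with your robot's address
PORT = 9559             # default NaoQI port

memory = ALProxy("ALMemory", ROBOT_IP, PORT)
tts = ALProxy("ALTextToSpeech", ROBOT_IP, PORT)

# Standard ALMemory keys for the three head touch sensors.
HEAD_KEYS = ["FrontTactilTouched", "MiddleTactilTouched", "RearTactilTouched"]

while True:
    for key in HEAD_KEYS:
        if memory.getData(key):  # 1.0 while the sensor is being touched
            tts.say("That tickles!")
            time.sleep(1.0)      # crude debounce
    time.sleep(0.1)
```

Distinguishing a friendly pet from an aggressive push would require looking at, for example, the duration and repetition of such touch events rather than single readings.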

Companion Nao II: Desktop Buddy

Could the Nao robot be a modern form of the old-time butler delivering your mail? It could certainly deliver some of it!

Nao will here connect to your email and inform you about incoming messages. But what exactly is the best way for a robot to call for your attention? Should it read the email out loud, or simply make a subtle gesture with the arm? Motivate your choice of action from an HRI perspective, and implement it to see how it works!

There are a number of different aspects to consider when designing the behaviour of the robot, and the decision you make for each aspect should influence your design. These aspects include, but are not limited to:

  1. Context of your envisaged application (home, office, public,…),
  2. Role of the robot with respect to you (assistant, friend/peer, ‘mere’ machine),
  3. The desired personality of the robot (e.g. assertive/dominant, reticent/shy,…),
  4. Sensitivity (to what cues should the robot be sensitive?).

There are many published works that touch on these issues (e.g. persuasiveness). For example, Sidner et al. (2005) and Chidambaram et al. (2012) showed the important role of robot non-verbal cues for persuading people, with Ham et al. (2011) showing the particular role of gaze. Also, the type of robot behaviour (human-like/machine-like) brings with it some trade-offs (e.g. Hinds et al., 2004).

Nao and you: Design your own scenario

You may propose your own scenario that, for example, links to some of your own research. The scenario does not need to be complex from a programming point of view, but it should have a social component and comprise some interesting interaction between Nao and humans.

Please confirm with the workshop leaders that the scenario is suitable and realistic to, at least partly, implement within the scope of the workshop.

Prerequisites

Familiarity with programming Nao is expected, either from attending the first Nao workshop or from previous experience. Some general programming skills are also recommended.

Participants are welcome to use any language for programming the Nao during this workshop. Some software components will be provided, implemented using Python and Choregraphe. When necessary, we will also look at how code written in different languages can be linked using NaoQI modules, and how software can be distributed over several computers.

Preparations

Before attending the workshop, install the following software on your computer: Choregraphe and the NaoQI Python SDK (see the notes below).

Make sure to download the right version of Choregraphe/NaoQI; other versions may not work with the Nao robots available during the workshop. To access software from Aldebaran, you need to create a free Aldebaran account. We recommend that each participant of the workshop has access to an account, since this is also necessary for accessing the documentation and other resources useful when programming Nao.

NOTE: In order to download the NaoQI Python SDK, you need a physical Nao robot registered with your account. For those of you who do not have your own robot, NaoQI will be provided for installation during the summer school.

Documentation

The middleware for Nao is called NaoQI. It is the software library used when programming Nao, and it is what Choregraphe uses under the hood to communicate with the robot.
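As a quick orientation, everything in NaoQI is reached through a proxy to a named module. A hello-world sketch of the pattern looks as follows; the robot address is an assumption, so use the one shown in Choregraphe's connection dialog:

```python
# Minimal NaoQI hello world: connect to a module proxy and make Nao speak.
from naoqi import ALProxy

# Assumption: replace "nao.local" with your robot's address.
# 9559 is the default NaoQI port.
tts = ALProxy("ALTextToSpeech", "nao.local", 9559)
tts.say("Hello from the NaoQI Python SDK")
```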

The NaoQI documentation and API are very useful when programming the Nao. I find these a bit messy to navigate, so here are some guidelines:

The documentation of the Python SDK is limited, to say the least, but it does match the C++ specifications. Please refer to the lists of C++ Classes and Functions. If you are looking for how to implement a specific functionality, these are probably the best places to look.

Examples

Before you get too far and spend time on implementations, please have a look at the following examples:

Email Butler (download)

A simple Choregraphe model that makes Nao listen for emails and react when a new one arrives. It can be used as a starting point for the Desktop Buddy scenario.
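If you prefer to start from plain Python, a sketch of the same idea could look as follows. This is not the provided model: the IMAP server, account details, and robot address are placeholders, and the spoken announcement is just one of the possible reactions discussed above.

```python
# Sketch of an email butler: poll an IMAP inbox for unread mail and
# let Nao announce new messages.
import time
import imaplib
from naoqi import ALProxy

ROBOT_IP = "nao.local"            # assumption: your robot's address
IMAP_HOST = "imap.example.com"    # assumption: your mail server
USER, PASSWORD = "you@example.com", "secret"  # assumption: your account

tts = ALProxy("ALTextToSpeech", ROBOT_IP, 9559)

seen = 0
while True:
    # Reconnect on every poll; this keeps the sketch simple and
    # avoids IMAP session timeouts.
    mail = imaplib.IMAP4_SSL(IMAP_HOST)
    mail.login(USER, PASSWORD)
    mail.select("inbox")
    _, data = mail.search(None, "UNSEEN")
    unread = len(data[0].split())
    if unread > seen:
        tts.say("You have %d unread messages." % unread)
    seen = unread
    mail.logout()
    time.sleep(60)  # poll once a minute
```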

Hi Five! (download)

A Choregraphe model that makes Nao do a Hi Five! The ALMotionProxy is used to listen for changes in the joint angle of the elbow, allowing the robot to react when the five is given. It can be used as a starting point for the Tactile Buddy scenario.
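A rough sketch of the underlying idea, reading the sensed elbow angle through ALMotion, might look like this. The pose angles and the detection threshold are illustrative guesses, not the calibrated values of the provided model.

```python
# Sketch of high-five detection by watching the right elbow joint:
# raise the arm, record a baseline sensed angle, and treat a sudden
# deflection as the human slapping the hand.
import time
from naoqi import ALProxy

motion = ALProxy("ALMotion", "nao.local", 9559)  # address is an assumption

motion.wakeUp()
# Raise the right arm to a high-five pose (angles in radians, illustrative).
motion.angleInterpolationWithSpeed(
    ["RShoulderPitch", "RElbowRoll"], [-1.0, 0.3], 0.3)

baseline = motion.getAngles("RElbowRoll", True)[0]  # sensed, not commanded
while True:
    angle = motion.getAngles("RElbowRoll", True)[0]
    if abs(angle - baseline) > 0.15:  # illustrative threshold, in radians
        print("High five detected!")
        break
    time.sleep(0.05)
```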

Postures (download)

A simple Python script that uses the NaoQI API to connect to Nao and set a specified posture.
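A minimal version of such a script, assuming the robot address and posture name are passed on the command line, could look like this:

```python
# Sketch: connect to Nao and go to a named predefined posture
# (e.g. Stand, Sit, Crouch).
import sys
from naoqi import ALProxy

robot_ip = sys.argv[1] if len(sys.argv) > 1 else "nao.local"  # assumption
posture_name = sys.argv[2] if len(sys.argv) > 2 else "Stand"

motion = ALProxy("ALMotion", robot_ip, 9559)
posture = ALProxy("ALRobotPosture", robot_ip, 9559)

motion.wakeUp()                          # set stiffness so Nao can move
posture.goToPosture(posture_name, 0.5)   # 0.5 = relative speed
```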

Keyboard module (download)

A NaoQI module written in Python, allowing a user to type in some text in response to a question. The module can be called, as a remote procedure call (RPC), from a Python Box in Choregraphe, or from any other program using the NaoQI API, even one running on a different computer.
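The general pattern for such a module looks roughly as follows. This is a sketch, not the actual code of the provided module: the module and method names are illustrative, and nao.local stands in for your robot's address.

```python
# Sketch of a NaoQI module in Python: a broker registers this process
# with the robot's NaoQI, and an ALModule subclass exposes methods
# that other NaoQI programs can call remotely.
import time
from naoqi import ALBroker, ALModule

class KeyboardHandlerModule(ALModule):
    def __init__(self, name):
        ALModule.__init__(self, name)

    def ask(self, question):
        """Print a question on this computer and return the typed answer."""
        return raw_input(question + " ")  # Python 2, as used by the NaoQI SDK

# Connect to the robot's main broker (address is an assumption).
broker = ALBroker("keyboardBroker", "0.0.0.0", 0, "nao.local", 9559)

# By NaoQI convention the instance must live in a global variable whose
# name matches the registered module name.
global KeyboardHandler
KeyboardHandler = KeyboardHandlerModule("KeyboardHandler")

# From a Python Box in Choregraphe (or any other NaoQI program),
# the module could now be called like this:
#   kb = ALProxy("KeyboardHandler")
#   answer = kb.ask("What is your name?")

try:
    while True:
        time.sleep(1)  # keep the broker alive while serving calls
except KeyboardInterrupt:
    broker.shutdown()
```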

Resources for Nao Companion I

Please refer to Cooney et al. (2014) and Gibbons et al. (2012) for background on this scenario.

Below is a standard index for types of touch (Weiss & Niemann, 2011):

Squeezing, Stroking, Rubbing, Pushing, Pulling, Pressing, Patting, Tapping, Shaking, Pinching, Trembling, Poking, Hitting, Scratching, Massaging, Tickling, Slapping, Lifting, Picking, Hugging, Finger interlocking, Swinging, Tossing.

In an ongoing study (Andreasson et al., 2015), participants were asked to touch Nao in order to communicate some emotion, e.g. anger, love, or friendship. Several of the touch types listed above were observed specifically in these interactions with Nao.

In the same study, people touched shoulders, arms, back, chest, cheeks, hands, and fingers of Nao. Note that none of the tactile sensors of Nao, located on top of the head, on the toes, and at the back of the hands, were touched during these interactions!

References

Andreasson, R., Alenljung, B., & Lowe, R. (2015). Communicating emotions to a robot via tactile interaction. Ongoing work. University of Skövde, Skövde, Sweden.

Chidambaram, V., Chiang, Y.-H., & Mutlu, B. (2012). Designing Persuasive Robots: How Robots Might Persuade People Using Vocal and Nonverbal Cues. In 7th Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI ’12) (pp. 293–300). doi:10.1145/2157689.2157798

Cooney, M., Nishio, S., & Ishiguro, H. (2014). Affectionate Interaction with a Small Humanoid Robot Capable of Recognizing Social Touch Behavior. ACM Transactions on Interactive Intelligent Systems, 4(4), 1–32. doi:10.1145/2685395

Gibbons, P., Dahl, T. S., & Jones, O. (2012). Identification and Production of “Simple” Tactile Gestures. In 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI 2012), Workshop on Advances in Tactile Sensing and Touch-based Human-Robot Interaction. Boston, USA. doi:10.13140/2.1.3165.3441

Ham, J., Bokhorst, R., Cuijpers, R., van der Pol, D., & Cabibihan, J.-J. (2011). Making Robots Persuasive: The Influence of Combining Persuasive Strategies (Gazing and Gestures) by a Storytelling Robot on Its Persuasive Power. In Social Robotics, Lecture Notes in Computer Science, Vol. 7072 (pp. 71-83).

Hinds, P., Roberts, T., & Jones, H. (2004). Whose Job Is It Anyway? A Study of Human-Robot Interaction in a Collaborative Task. Human-Computer Interaction, 19(1), 151–181. doi:10.1207/s15327051hci1901&2_7

Sidner, C. L., Lee, C., Kidd, C. D., Lesh, N., & Rich, C. (2005). Explorations in engagement for humans and robots. Artificial Intelligence, 166(1-2), 140–164. doi:10.1016/j.artint.2005.03.005

Weiss, S. J., & Niemann, S. K. (2011). Measurement of touch behavior. In Hertenstein, M. J., & Weiss, S. J. (Eds.), The Handbook of Touch: Neuroscience, Behavioral, and Health Perspectives (pp. 245-270).
