Importance of Touch for Conveying Affection in a Multimodal Interaction with a Small Humanoid Robot
Hiroshi Ishiguro Laboratory, Advanced Telecommunications Research Institute International (ATR), 2-2-2 Hikaridai, Keihanna Science City, Kyoto 619-0288, Japan & Department of Systems Innovation, Graduate School of Engineering Science, Osaka University, 1-3 Machikaneyama, Toyonaka, Osaka, Japan. ORCID iD: 0000-0002-4998-1685
Hiroshi Ishiguro Laboratory, Advanced Telecommunications Research Institute International (ATR), 2-2-2 Hikaridai, Keihanna Science City, Kyoto, Japan.
Hiroshi Ishiguro Laboratory, Advanced Telecommunications Research Institute International (ATR), 2-2-2 Hikaridai, Keihanna Science City, Kyoto 619-0288, Japan & Department of Systems Innovation, Graduate School of Engineering Science, Osaka University, 1-3 Machikaneyama, Toyonaka, Osaka, Japan.
2015 (English) In: International Journal of Humanoid Robotics, ISSN 0219-8436, Vol. 12, no 1, article id 1550002. Article in journal (Refereed). Published.
Abstract [en]

To be accepted as a part of our everyday lives, companion robots will require the capability to communicate socially, recognizing people's behavior and responding appropriately. In particular, we hypothesized that a humanoid robot should be able to recognize affectionate touches conveying liking or dislike because (a) a humanoid form elicits expectations of a high degree of social intelligence, (b) touch behavior plays a fundamental and crucial role in human bonding, and (c) robotic responses providing affection could contribute to people's quality of life. The hypothesis that people will seek to affectionately touch a robot needed to be verified because robots are typically not soft or warm like humans, and people can communicate through various other modalities such as vision and sound. The main challenge faced was that people's social norms are highly complex, involving behavior in multiple channels. To deal with this challenge, we adopted an approach in which we analyzed free interactions and also asked participants to rate short video-clips depicting human–robot interaction. As a result, we verified that touch plays an important part in the communication of affection from a person to a humanoid robot considered capable of recognizing cues in touch, vision, and sound. Our results suggest that designers of affectionate interactions with a humanoid robot should not ignore the fundamental modality of touch.

Place, publisher, year, edition, pages
Singapore: World Scientific, 2015. Vol. 12, no 1, article id 1550002
Keywords [en]
Affection, humanoid robot, touch, multimodal, human–robot interaction
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
URN: urn:nbn:se:hh:diva-27442
DOI: 10.1142/S0219843615500024
ISI: 000351225400002
Scopus ID: 2-s2.0-84928501145
OAI: oai:DiVA.org:hh-27442
DiVA, id: diva2:777609
Available from: 2015-01-08 Created: 2015-01-08 Last updated: 2021-05-11. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Cooney, Martin
