Chie Hieida
2020 – today
- 2024
  - [i3] Kazuki Tsurumaki, Chie Hieida, Kazuki Miyazawa: Study of Emotion Concept Formation by Integrating Vision, Physiology, and Word Information using Multilayered Multimodal Latent Dirichlet Allocation. CoRR abs/2404.08295 (2024)
- 2023
  - [j4] Chie Hieida, Tomoaki Yamamoto, Takatomi Kubo, Junichiro Yoshimoto, Kazushi Ikeda: Negative emotion recognition using multimodal physiological signals for advanced driver assistance systems. Artif. Life Robotics 28(2): 388-393 (2023)
- 2022
  - [j3] Chie Hieida, Takayuki Nagai: Survey and perspective on social emotions in robotics. Adv. Robotics 36(1-2): 17-32 (2022)
- 2021
  - [i2] Chie Hieida, Takayuki Nagai: Survey and Perspective on Social Emotions in Robotics. CoRR abs/2105.09647 (2021)
- 2020
  - [j2] Chie Hieida, Kasumi Abe, Takayuki Nagai, Takashi Omori: Walking Hand-in-Hand Helps Relationship Building Between Child and Robot. J. Robotics Mechatronics 32(1): 8-20 (2020)
  - [j1] Kasumi Abe, Takayuki Nagai, Chie Hieida, Takashi Omori, Masahiro Shiomi: Estimating Children's Personalities Through Their Interaction Activities with a Tele-Operated Robot. J. Robotics Mechatronics 32(1): 21-31 (2020)
2010 – 2019
- 2018
  - [c9] Chie Hieida, Takato Horii, Takayuki Nagai: Decision-Making in Emotion Model. HRI (Companion) 2018: 127-128
  - [c8] Chie Hieida, Takato Horii, Takayuki Nagai: Toward Empathic Communication: Emotion Differentiation via Face-to-Face Interaction in Generative Model of Emotion. ICDL-EPIROB 2018: 66-71
  - [c7] Anh-Tuan Nguyen, Chie Hieida, Takayuki Nagai: A Model of Generating and Predicting Intention toward Human-Robot Cooperation. RO-MAN 2018: 113-120
  - [c6] Chie Hieida, Takato Horii, Takayuki Nagai: Emotion Differentiation based on Decision-Making in Emotion Model. RO-MAN 2018: 659-665
  - [c5] Akihito Shimazu, Chie Hieida, Takayuki Nagai, Tomoaki Nakamura, Yuki Takeda, Takenori Hara, Osamu Nakagawa, Tsuyoshi Maeda: Generation of Gestures During Presentation for Humanoid Robots. RO-MAN 2018: 961-968
  - [i1] Chie Hieida, Takato Horii, Takayuki Nagai: Deep Emotion: A Computational Model of Emotion Using Deep Neural Networks. CoRR abs/1808.08447 (2018)
- 2017
  - [c4] Chie Hieida, Takayuki Nagai: A Model of Emotion for Empathic Communication. HRI (Companion) 2017: 133-134
- 2016
  - [c3] Chie Hieida, Hiroaki Matsuda, Shunsuke Kudoh, Takashi Suehiro: Action Elements of Emotional Body Expressions for Flying Robots. HRI 2016: 439-440
- 2014
  - [c2] Kasumi Abe, Chie Hieida, Muhammad Attamimi, Takayuki Nagai, Takayuki Shimotomai, Takashi Omori, Natsuki Oka: Toward playmate robots that can play with children considering personality. HAI 2014: 165-168
  - [c1] Chie Hieida, Kasumi Abe, Muhammad Attamimi, Takayuki Shimotomai, Takayuki Nagai, Takashi Omori: Physical embodied communication between robots and children: An approach for relationship building by holding hands. IROS 2014: 3291-3298