Brief Research Overview  

Human-robot interaction (HRI) is the study of how robots can interact with humans flexibly and smoothly. The SMART Lab's research on HRI began with work on assistive robots for blind travelers while Dr. Min was working on an NSF/NRI project at Carnegie Mellon University, and it has continued throughout our independent research on robotics and AI, including collaborations with other research groups in interactive computing, cognitive science, psychology, and human factors. Our HRI research is both in-depth and broad, spanning topics such as human multi-robot systems, human multi-robot/swarm interaction, human-robot teams, affective computing, social robots, human-machine interfaces, and assistive technology and robotics.

You can learn more about our current and past research on Human-Robot Interaction below.

Human Multi-robot Systems (2018 - Present)

Description:  Human multi-robot systems and multi-human multi-robot interaction constitute a relatively new area of research focused on interaction and collaboration between humans and multiple robots. Well-designed systems can enable a team of humans and robots to work together effectively on complex and sophisticated tasks such as exploration, monitoring, and search and rescue operations. The SMART Lab has accumulated considerable knowledge in this area while studying multi-robot systems and robot swarms, human-robot interaction, and assistive technology/robotics. Currently, we design algorithms and systems that enable multiple robots to collaborate with each other in a distributed way and to interact flexibly with any humans, in any situation, anywhere; we also develop applications that leverage the advantages of human multi-robot systems. Through this research, we anticipate a future where anyone (e.g., people without any experience in robot control, or people with disabilities) and any number of robots can work together on a wide range of practical tasks.
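
As a concrete (and deliberately simplified) illustration of the distributed coordination described above, the following Python sketch shows a market-based, auction-style task allocation loop in which each robot bids its travel cost for an open task and the lowest bidder wins. The robot names, task coordinates, and cost function are invented for illustration; this is a hypothetical sketch, not the lab's published algorithm.

    # Hypothetical sketch: greedy auction-style task allocation for a robot team.
    # All names and values are illustrative; a deployed system would run this
    # negotiation in a distributed fashion over a network (e.g., ROS topics).
    import math

    robots = {"r1": (0.0, 0.0), "r2": (5.0, 5.0), "r3": (10.0, 0.0)}  # robot positions
    tasks = [("survey_A", (1.0, 2.0)), ("survey_B", (9.0, 2.0)), ("survey_C", (4.0, 6.0))]

    def travel_cost(pos, goal):
        """Bid = straight-line distance from the robot's position to the task."""
        return math.dist(pos, goal)

    assignments = {}
    for task_id, goal in tasks:
        # Every robot submits a bid; the cheapest bid wins the task.
        bids = {rid: travel_cost(pos, goal) for rid, pos in robots.items()}
        winner = min(bids, key=bids.get)
        assignments[task_id] = winner
        robots[winner] = goal  # the winner ends up at the task location

    print(assignments)  # e.g., {'survey_A': 'r1', 'survey_B': 'r3', 'survey_C': 'r2'}

A human operator can sit on top of such a loop by injecting new tasks or vetoing assignments, which is one simple way a human multi-robot system can keep the human in the loop without micromanaging individual robots.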

Grant: NSF (IIS), NSF (CMMI)
People: Wonse Jo, Go-Eum Cha, Ruiqi Wang, Jeremy Pan, Revanth Krishna Senthilkumaran, Vishnunandan Venkatesh
Project Website: https://polytechnic.purdue.edu/ahmrs

Selected Publications:

  • Wonse Jo*, Ruiqi Wang*, Su Sun, Revanth Senthilkumaran, Daniel Foti, and Byung-Cheol Min (* equal contribution), "MOCAS: A Multimodal Dataset for Objective Cognitive Workload Assessment on Simultaneous Tasks", arXiv preprint 2210.03065. Paper Link, Video Link
  • Ahreum Lee, Wonse Jo, Shyam Sundar Kannan, and Byung-Cheol Min, "Investigating the Effect of Deictic Movements of a Multi-robot", International Journal of Human-Computer Interaction, Vol 37, No. 3, pp. 197-210, 2021. Paper Link, Video Link
  • Wonse Jo, Robert Wilson, Jaeeun Kim, Steve McGuire, and Byung-Cheol Min, "Toward a Wearable Biosensor Ecosystem on ROS 2 for Real-time Human-Robot Interaction Systems", 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Workshop on HMRS 2021: Cognitive and Social Aspects of Human Multi-Robot Interaction, Prague, Czech Republic, Sep 27 – Oct 1, 2021. Paper Link, Video Link, GitHub Link
  • Tamzidul Mina, Shyam Sundar Kannan, Wonse Jo, and Byung-Cheol Min, "Adaptive Workload Allocation for Multi-human Multi-robot Teams for Independent and Homogeneous Tasks", IEEE Access, Vol. 8, pp. 152697-152712, 2020. Paper Link, Video Link
Affective Computing (2019 - Present)

Description:  The estimation of human affective states, such as emotional states and cognitive workload, for effective human-robot interaction has been gaining increasing interest. The emergence of robotics middleware such as ROS has also broadened HRI research that integrates robotic systems with affective computing. We believe that human affective states play an important role in human-robot interaction, especially in human-robot collaboration, and accordingly we conduct a range of research on affective computing, from framework design to dataset design/creation and estimation algorithm development. For example, we recently developed a ROS-based framework that monitors various human physiological and behavioral signals together with robot states and shares them simultaneously for human-robot collaboration. We also developed and published a ROS-friendly multimodal dataset comprising physiological data measured with wearable devices and behavioral data recorded with external devices. Currently, we explore machine learning and deep learning methods (e.g., Transformer-based models) for predicting human affective states in real time.
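
To make the modeling approach above concrete, here is a minimal Python/PyTorch sketch (an assumed illustration, not the lab's published Husformer architecture) of a Transformer encoder that fuses windowed features from several physiological and behavioral modalities and predicts a discrete affective state such as a cognitive workload level. The dimensions, modality names, and number of classes are placeholders.

    # Hypothetical sketch: Transformer-based fusion of multimodal signals for
    # affective state classification. Shapes and names are illustrative only.
    import torch
    import torch.nn as nn

    class AffectTransformer(nn.Module):
        def __init__(self, modality_dims, d_model=64, n_heads=4, n_layers=2, n_classes=3):
            super().__init__()
            # One linear projection per modality (e.g., EEG, heart rate, gaze features).
            self.proj = nn.ModuleList([nn.Linear(d, d_model) for d in modality_dims])
            layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
            self.head = nn.Linear(d_model, n_classes)  # e.g., low/medium/high workload

        def forward(self, modalities):
            # modalities: list of tensors, each shaped (batch, seq_len, feature_dim_i)
            tokens = torch.cat([p(x) for p, x in zip(self.proj, modalities)], dim=1)
            fused = self.encoder(tokens)          # (batch, total_seq, d_model)
            return self.head(fused.mean(dim=1))   # pool over time, then classify

    # Toy usage: 8 windows of EEG-like (32-dim) and wearable (6-dim) features.
    model = AffectTransformer(modality_dims=[32, 6])
    logits = model([torch.randn(4, 8, 32), torch.randn(4, 8, 6)])  # (4, 3) class scores

In a deployed pipeline, the input windows would come from streaming sensor topics (for example, via the ROS-based framework mentioned above), and the predicted state would be shared back with the robots so they can adapt their behavior.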

Grant: NSF
People: Wonse Jo, Go-Eum Cha, Ruiqi Wang, Revanth Krishna Senthilkumaran
Project Website: https://polytechnic.purdue.edu/ahmrs

Selected Publications:

  • Wonse Jo*, Ruiqi Wang*, Su Sun, Revanth Senthilkumaran, Daniel Foti, and Byung-Cheol Min (* equal contribution), "MOCAS: A Multimodal Dataset for Objective Cognitive Workload Assessment on Simultaneous Tasks", arXiv preprint 2210.03065. Paper Link, Video Link
  • Ruiqi Wang*, Wonse Jo*, Dezhong Zhao, Weizheng Wang, Baijian Yang, Guohua Chen, and Byung-Cheol Min (* equal contribution), "Husformer: A Multi-Modal Transformer for Multi-Modal Human State Recognition", arXiv preprint 2209.15182. Paper Link, GitHub Link
  • Go-Eum Cha and Byung-Cheol Min, "Correlation between Unconscious Mouse Actions and Human Cognitive Workload", 2022 ACM CHI Conference on Human Factors in Computing Systems - Late-Breaking Work, New Orleans, LA, USA, April 30–May 6, 2022. Paper Link, Video Link
  • Wonse Jo, Robert Wilson, Jaeeun Kim, Steve McGuire, and Byung-Cheol Min, "Toward a Wearable Biosensor Ecosystem on ROS 2 for Real-time Human-Robot Interaction Systems", 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Workshop on HMRS 2021: Cognitive and Social Aspects of Human Multi-Robot Interaction, Prague, Czech Republic, Sep 27 – Oct 1, 2021. Paper Link, Video Link, GitHub Link
  • Wonse Jo, Shyam Sundar Kannan, Go-Eum Cha, Ahreum Lee, and Byung-Cheol Min, "ROSbag-based Multimodal Affective Dataset for Emotional and Cognitive States", 2020 IEEE International Conference on Systems, Man and Cybernetics (SMC), Toronto, Canada, 11-14 October, 2020. Paper Link
Human-Delivery Robot Interaction (2019 - Present)

Description:  As delivery robots become more capable of delivering goods quickly and economically, interest in using robots for last-mile delivery continues to grow. However, existing research and services involving delivery robots remain far from adequate to meet the growing demand in this area, let alone to be fully incorporated into our daily lives. The SMART Lab explores various practical and theoretical topics in robot delivery, including vehicle routing for drones, localization of a requested delivery spot, and social interaction between package recipients and delivery robots. To this end, we use mathematical methods to tackle the underlying optimization problems and conduct user studies for the experimental questions. We expect that this research will play a major role in allowing delivery robots to deliver packages as intelligently and effectively as a professional human courier, and that it will improve human-delivery robot interaction while increasing robot autonomy.
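
As a small, self-contained illustration of the routing side of this work, the Python sketch below orders a set of drop-off points with a simple nearest-neighbor heuristic. It is a toy example with invented coordinates, not the two-echelon vehicle routing formulation studied in the publication listed below.

    # Hypothetical sketch: nearest-neighbor heuristic for ordering delivery stops.
    # Coordinates are invented; the actual research uses richer optimization
    # models (e.g., two-echelon truck-and-drone routing).
    import math

    depot = (0.0, 0.0)
    stops = [(2.0, 3.0), (5.0, 1.0), (6.0, 4.0), (1.0, 7.0)]

    def nearest_neighbor_route(start, points):
        """Repeatedly visit the closest unvisited stop, starting from the depot."""
        route, current, remaining = [], start, list(points)
        while remaining:
            nxt = min(remaining, key=lambda p: math.dist(current, p))
            route.append(nxt)
            remaining.remove(nxt)
            current = nxt
        return route

    print(nearest_neighbor_route(depot, stops))
    # -> [(2.0, 3.0), (5.0, 1.0), (6.0, 4.0), (1.0, 7.0)]

Such a greedy route is cheap to compute but generally suboptimal, which is why exact and metaheuristic formulations matter for realistic last-mile delivery problems.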

Grant: Purdue University
People: Shyam Sundar Kannan, Ahreum Lee

Selected Publications:

  • Shyam Sundar Kannan and Byung-Cheol Min, "Autonomous Drone Delivery to Your Door and Yard", 2022 International Conference on Unmanned Aircraft Systems (ICUAS), Dubrovnik, Croatia, June 21-24, 2022. Paper Link, Video Link
  • Shyam Sundar Kannan and Byung-Cheol Min, "Investigation on Accepted Package Delivery Location: A User Study-based Approach", 2021 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Virtual, Melbourne, Australia, 17-20 October, 2021. Paper Link
  • Shyam Sundar Kannan, Ahreum Lee, and Byung-Cheol Min, "External Human-Machine Interface on Delivery Robots: Expression of Navigation Intent of the Robot", 2021 30th IEEE International Conference on Robot & Human Interactive Communication (RO-MAN), Virtual, Vancouver, Canada, 8-12 August, 2021. Paper Link, Video Link
  • Patchara Kitjacharoenchai, Byung-Cheol Min, and Seokcheon Lee, "Two Echelon Vehicle Routing Problem with Drones in Last Mile Delivery", International Journal of Production Economics, Vol. 225, 2020. Paper Link
Assistive Technology and Robots for People Who are Blind or Visually Impaired (2014 - 2018)

Description: The World Health Organization (WHO) estimates that 285 million people in the world are visually impaired, of whom 39 million are blind. Although safe and independent mobility is a critical element of modern life, traveling in unfamiliar environments can be challenging and often daunting for people who are blind or visually impaired due to the lack of appropriate navigation aids. To address this challenge, the SMART Lab explored practical and theoretical research topics on human-machine and human-robot interaction in the context of assistive technology and robotics. Our primary research goal was to empower people with disabilities to travel to and navigate unfamiliar environments safely and independently. To this end, we developed navigation aids that enable visually impaired people to travel in unfamiliar environments safely and independently with minimal training and effort. We also introduced an indoor navigation application that lets a blind user request help in both emergency and non-emergency situations.

Grant: Purdue University
People: Yeonju Oh, Manoj Penmetcha, Arabinda Samantaray

Selected Publications:

  • Yeonju Oh, Wei-Liang Kao, and Byung-Cheol Min, "Indoor Navigation Aid System Using No Positioning Technique for Visually Impaired People", HCI International 2017 - Poster Extended Abstract, Vancouver, Canada, 9-14 July, 2017. Paper Link, Video Link
  • Manoj Penmetcha, Arabinda Samantaray, and Byung-Cheol Min, "SmartResponse: Emergency and Non-Emergency Response for Smartphone based Indoor Localization applications", HCI International 2017 - Poster Extended Abstract, Vancouver, Canada, 9-14 July, 2017. Paper Link
  • Byung-Cheol Min, Suryansh Saxena, Aaron Steinfeld, and M. Bernardine Dias, "Incorporating Information from Trusted Sources to Enhance Urban Navigation for Blind Travelers", 2015 IEEE International Conference on Robotics and Automation (ICRA), pp. 4511-4518, Seattle, USA, May 26-30, 2015. Paper Link
  • Byung-Cheol Min, Aaron Steinfeld, and M. Bernardine Dias, "How Would You Describe Assistive Robots to People Who are Blind or Low Vision?", Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI) Extended Abstracts, Portland, USA, Mar. 2-5, 2015. Paper Link
Assistive Technology and Robots for Children with Autism Spectrum Disorder (ASD) (2015 - 2017)

Description: Autism spectrum disorder (ASD) is one of the most significant public health concerns in the United States and worldwide. Children with ASD have impaired abilities in social interaction, social communication, and imagination, and often also lack verbal ability. Several approaches have been used to help them, among which humanoid robots have recently emerged as a new teaching tool, since they offer simplified physical features and a controllable environment that children with autism tend to prefer. At the same time, a robot can offer a human-friendly conversational environment appropriate for learning emotional and social skills. The SMART Lab designed a set of robot body movements intended to express different emotions, together with a robot-mediated instruction prototype, to explore the potential of robots to teach emotional concepts to children with autism. We also studied a low-cost technical methodology, based on embedded devices and semantic information, that can be easily deployed in the daily environment of children with ASD to teach them language and that can be extended to a cyber-physical system in the future. This method provides verbal descriptions of objects and adapts the level of description to the child's learning achievements.

Grant: Purdue University
People: Huanhuan Wang, Pai-Ying Hsiao, Sangmi Shin

Selected Publications:

  • Sangmi Shin, Byung-Cheol Min, Julia Rayz, and Eric T. Matson, "Semantic Knowledge-based Language Education Device for Children with Developmental Disabilities", IEEE International Conference on Robotic Computing (IRC) 2017, Taichung, Taiwan, April 10-12, 2017. Download PDF
  • Huanhuan Wang, Pai-Ying Hsiao, and Byung-Cheol Min, "Examine the Potential of Robots to Teach Autistic Children Emotional Concepts", The Eighth International Conference on Social Robotics (ICSR), Kansas City, USA, Nov. 1-3, 2016. Download PDF