Brief Research Overview
Human-robot interaction (HRI) is the study of how robots can
interact with humans in a flexible and smooth manner. The
SMART Lab's research on HRI started with work on assistive
robots for blind travelers when Dr. Min worked on the NSF/NRI
project at Carnegie Mellon University, and has continued
through our independent research on robotics and AI, including
collaborations with other research groups in interactive
computing, cognitive science, psychology, and human factors.
Our research on HRI covers a wide range of topics, including
human multi-robot systems, human multi-robot/swarm
interaction, human-robot teams, affective computing, social
robots, human-machine interfaces, and assistive technology and
robotics.
You can learn more about our current and past research on human-robot interaction below.
Human Multi-robot Systems (2018 - Present)
Description: Human multi-robot systems and multi-human multi-robot interaction form a relatively new research area that focuses on the interaction and collaboration between humans and multiple robots. Well-designed systems can enable a team of humans and robots to work together effectively on complex and sophisticated tasks such as exploration, monitoring, and search and rescue operations. The SMART Lab has gained extensive knowledge in this area through research on multi-robot systems and robot swarms, human-robot interaction, and assistive technology/robotics. Currently, we design algorithms and systems that enable multiple robots to collaborate with each other in a distributed manner and to interact flexibly with humans across a wide range of situations and environments. We also develop applications that take advantage of human multi-robot systems. Through this research, we envision a future where anyone (e.g., people without experience in robot control or people with disabilities) can work together with robots, whether few or many, on various practical tasks.
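A recurring problem in this research is deciding which robot should handle which task. As a minimal illustration only (not the attention-based deep reinforcement learning or adaptive workload allocation methods described in the publications below), the following sketch assigns robots to tasks with the classic Hungarian algorithm over a hypothetical travel-time cost matrix:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical cost matrix: cost[i, j] = travel time (s) for robot i to reach task j
cost = np.array([
    [12.0,  7.5, 20.0],
    [ 9.0, 15.0,  6.5],
    [14.0,  8.0, 11.0],
])

# The Hungarian algorithm returns the minimum-cost one-to-one assignment
robots, tasks = linear_sum_assignment(cost)
for r, t in zip(robots, tasks):
    print(f"robot {r} -> task {t} (cost {cost[r, t]:.1f} s)")
print("total cost:", cost[robots, tasks].sum())
```

Centralized baselines of this kind are mainly useful for comparison; the methods in the publications below instead target allocations that are computed in a distributed manner and adapt to changing human and robot workloads.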
Grant: NSF (IIS), NSF (CMMI)
People: Wonse Jo, Go-Eum Cha, Ruiqi Wang, Jeremy Pan, Revanth Krishna Senthilkumaran, Vishnunandan Venkatesh
Project Website: https://polytechnic.purdue.edu/ahmrs
Selected Publications:
- Ruiqi Wang, Dezhong Zhao, and Byung-Cheol Min, "Initial Task Allocation for Multi-Human Multi-Robot Teams with Attention-based Deep Reinforcement Learning", 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2023), Detroit, USA, October 1-5, 2023. Paper Link, Video Link
- Wonse Jo*, Ruiqi Wang*, Su Sun, Revanth Senthilkumaran, Daniel Foti, and Byung-Cheol Min (* equal contribution), "MOCAS: A Multimodal Dataset for Objective Cognitive Workload Assessment on Simultaneous Tasks", arXiv preprint 2210.03065. Paper Link, Video Link
- Ahreum Lee, Wonse Jo, Shyam Sundar Kannan, and Byung-Cheol Min, "Investigating the Effect of Deictic Movements of a Multi-robot", International Journal of Human-Computer Interaction, Vol 37, No. 3, pp. 197-210, 2021. Paper Link, Video Link
- Wonse Jo, Robert Wilson, Jaeeun Kim, Steve McGuire, and Byung-Cheol Min, "Toward a Wearable Biosensor Ecosystem on ROS 2 for Real-time Human-Robot Interaction Systems", 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Workshop on HMRS 2021: Cognitive and Social Aspects of Human Multi-Robot Interaction, Prague, Czech Republic, Sep 27 – Oct 1, 2021. Paper Link, Video Link, GitHub Link
- Tamzidul Mina, Shyam Sundar Kannan, Wonse Jo, and Byung-Cheol Min, "Adaptive Workload Allocation for Multi-human Multi-robot Teams for Independent and Homogeneous Tasks", IEEE Access, Vol. 8, pp. 152697-152712, 2020. Paper Link, Video Link
Affective Computing (2019 - Present)
Description: The estimation of human affective states, such as emotional states and cognitive workload, for effective human-robot interaction has gained increasing attention. The emergence of new robotics middleware such as ROS has also contributed to the growth of HRI research that integrates affective computing with robotic systems. We believe that human affective states play a significant role in human-robot interaction, especially human-robot collaboration, and we conduct research on many aspects of affective computing, from framework design to dataset design/creation and algorithm development. For example, we recently developed a ROS-based framework that enables the simultaneous monitoring of various human physiological and behavioral data and robot conditions for human-robot collaboration. We also developed and published a ROS-friendly multimodal dataset comprising physiological data measured using wearable devices and behavioral data recorded using external devices. Currently, we are exploring machine learning and deep learning-based methods (e.g., using Transformers) for real-time prediction of human affective states.
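To make the algorithm-development thread concrete, the sketch below shows, in PyTorch, the general idea behind Transformer-based affective state prediction: each physiological stream is projected into a shared embedding space, fused by a Transformer encoder, and classified into workload levels. The layer sizes, feature dimensions, and class labels are illustrative assumptions, not those of our framework, models, or dataset:

```python
import torch
import torch.nn as nn

class WorkloadTransformer(nn.Module):
    """Toy multimodal fusion model: each sensor stream is projected to a shared
    embedding, concatenated along the time axis, and passed through a
    Transformer encoder before classification (illustrative only)."""
    def __init__(self, modal_dims, d_model=64, n_heads=4, n_layers=2, n_classes=3):
        super().__init__()
        self.proj = nn.ModuleList([nn.Linear(d, d_model) for d in modal_dims])
        enc_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                               batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, streams):
        # streams: list of tensors, each of shape (batch, time, feature_dim_i)
        tokens = torch.cat([p(s) for p, s in zip(self.proj, streams)], dim=1)
        encoded = self.encoder(tokens)          # (batch, total_time, d_model)
        return self.head(encoded.mean(dim=1))   # pool over time, then classify

# Hypothetical 10-step windows of ECG-derived (4) and EDA-derived (2) features
ecg = torch.randn(8, 10, 4)
eda = torch.randn(8, 10, 2)
model = WorkloadTransformer(modal_dims=[4, 2])
logits = model([ecg, eda])   # (8, 3) scores, e.g., low/medium/high workload
```

In a deployed system, the input windows would come from the ROS-based monitoring framework and wearable-sensor dataset described above rather than random tensors.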
Grant: NSF
People: Wonse Jo, Go-Eum Cha, Ruiqi Wang, Revanth Krishna Senthilkumaran
Project Website: https://polytechnic.purdue.edu/ahmrs
Selected Publications:
- Go-Eum Cha, Wonse Jo, and Byung-Cheol Min, "Implications of Personality on Cognitive Workload, Affect, and Task Performance in Robot Remote Control", 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2023), Detroit, USA, October 1-5, 2023. Paper Link, Video Link
- Wonse Jo*, Ruiqi Wang*, Su Sun, Revanth Senthilkumaran, Daniel Foti, and Byung-Cheol Min (* equal contribution), "MOCAS: A Multimodal Dataset for Objective Cognitive Workload Assessment on Simultaneous Tasks", arXiv preprint 2210.03065. Paper Link, Video Link
- Ruiqi Wang*, Wonse Jo*, Dezhong Zhao, Weizheng Wang, Baijian Yang, Guohua Chen, and Byung-Cheol Min (* equal contribution), "Husformer: A Multi-Modal Transformer for Multi-Modal Human State Recognition", arXiv preprint 2209.15182. Paper Link, GitHub Link
- Go-Eum Cha and Byung-Cheol Min, "Correlation between Unconscious Mouse Actions and Human Cognitive Workload", 2022 ACM CHI Conference on Human Factors in Computing Systems - Late-Breaking Work, New Orleans, LA, USA, April 30–May 6, 2022. Paper Link, Video Link
- Wonse Jo, Robert Wilson, Jaeeun Kim, Steve McGuire, and Byung-Cheol Min, "Toward a Wearable Biosensor Ecosystem on ROS 2 for Real-time Human-Robot Interaction Systems", 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Workshop on HMRS 2021: Cognitive and Social Aspects of Human Multi-Robot Interaction, Prague, Czech Republic, Sep 27 – Oct 1, 2021. Paper Link, Video Link, GitHub Link
- Wonse Jo, Shyam Sundar Kannan, Go-Eum Cha, Ahreum Lee, and Byung-Cheol Min, "ROSbag-based Multimodal Affective Dataset for Emotional and Cognitive States", 2020 IEEE International Conference on Systems, Man and Cybernetics (SMC), Toronto, Canada, 11-14 October, 2020. Paper Link
Human-Delivery Robot Interaction (2019 - Present)
Description: As delivery robots become more capable and necessary for the quick and economical delivery of goods, there is increasing interest in using robots for last-mile delivery. However, current research and services involving delivery robots are still far from meeting the growing demand in this area, let alone being fully integrated into our lives. The SMART Lab investigates various practical and theoretical topics in robot delivery, including vehicle routing for drones, localization of a requested delivery spot, and social interaction between package recipients and delivery robots. To do this, we use mathematical methods to solve optimization problems and conduct experiments based on user studies. We expect that this research will play a major role in enabling delivery robots to deliver packages more intelligently and effectively, like professional human couriers, and that it will improve human-delivery robot interaction while increasing robot autonomy.
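As a deliberately simplified example of the routing side of this work, the sketch below computes a single-drone delivery route with a greedy nearest-neighbor heuristic. It is a toy baseline under assumed 2D coordinates, not the two-echelon vehicle routing formulation studied in the publication below:

```python
import math

def nearest_neighbor_route(depot, stops):
    """Toy single-drone routing heuristic: repeatedly fly to the closest
    unvisited drop-off point, then return to the depot."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    route, remaining, current = [depot], list(stops), depot
    while remaining:
        nxt = min(remaining, key=lambda p: dist(current, p))
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    route.append(depot)
    total = sum(dist(route[i], route[i + 1]) for i in range(len(route) - 1))
    return route, total

# Depot at the origin and three hypothetical delivery spots (x, y) in meters
route, length = nearest_neighbor_route((0, 0), [(50, 20), (10, 80), (60, 60)])
print(route, round(length, 1))
```

Greedy tours like this are only a baseline; the routing research cited below formulates richer optimization problems, such as the two-echelon vehicle routing problem that combines trucks and drones.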
Grant: Purdue University
People: Shyam Sundar Kannan, Ahreum Lee
Selected Publications:
- Shyam Sundar Kannan and Byung-Cheol Min, "Autonomous Drone Delivery to Your Door and Yard", 2022 International Conference on Unmanned Aircraft Systems (ICUAS), Dubrovnik, Croatia, June 21-24, 2022. Paper Link, Video Link
- Shyam Sundar Kannan and Byung-Cheol Min, "Investigation on Accepted Package Delivery Location: A User Study-based Approach", 2021 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Virtual, Melbourne, Australia, 17-20 October, 2021. Paper Link
- Shyam Sundar Kannan, Ahreum Lee, and Byung-Cheol Min, "External Human-Machine Interface on Delivery Robots: Expression of Navigation Intent of the Robot", 2021 30th IEEE International Conference on Robot & Human Interactive Communication (RO-MAN), Virtual, Vancouver, Canada, 8-12 August, 2021. Paper Link, Video Link
- Patchara Kitjacharoenchai, Byung-Cheol Min, and Seokcheon Lee, "Two Echelon Vehicle Routing Problem with Drones in Last Mile Delivery", International Journal of Production Economics, Vol. 25, 2020. Paper Link
Assistive Technology and Robots for People who are Blind or Visually Impaired (2014 - 2018)
Description: The World Health Organization (WHO) estimates that 285 million people in the world are visually impaired, 39 million of whom are blind. While safe and independent mobility is essential in modern life, traveling in unfamiliar environments can be challenging and daunting for people who are blind or visually impaired due to a lack of appropriate navigation aids. To address this challenge, the SMART Lab investigates practical and theoretical research topics on human-machine interaction and human-robot interaction in the context of assistive technology and robotics. Our primary research goal is to empower people with disabilities to travel to and navigate unfamiliar environments safely and independently. To achieve this, we have developed improved navigation aids that enable visually impaired people to travel in unfamiliar environments safely and independently with minimal training and effort. We have also introduced an indoor navigation application that allows a blind user to request help in both emergency and non-emergency situations.
Grants: Purdue University
People: Yeonju Oh, Manoj Penmetcha, Arabinda Samantaray
Selected Publications:
- Yeonju Oh, Wei-Liang Kao, and Byung-Cheol Min, "Indoor Navigation Aid System Using No Positioning Technique for Visually Impaired People", HCI International 2017 - Poster Extended Abstract, Vancouver, Canada, 9-14 July, 2017. Paper Link, Video Link
- Manoj Penmetcha, Arabinda Samantaray, and Byung-Cheol Min, "SmartResponse: Emergency and Non-Emergency Response for Smartphone based Indoor Localization applications", HCI International 2017 - Poster Extended Abstract, Vancouver, Canada, 9-14 July, 2017. Paper Link
- Byung-Cheol Min, Suryansh Saxena, Aaron Steinfeld, and M. Bernardine Dias, "Incorporating Information from Trusted Sources to Enhance Urban Navigation for Blind Travelers", Robotics and Automation (ICRA), 2015 IEEE International Conference on, pp. 4511-4518, Seattle, USA, May 26-30, 2015. Paper Link
- Byung-Cheol Min, Aaron Steinfeld, and M. Bernardine Dias, "How Would You Describe Assistive Robots to People Who are Blind or Low Vision?", Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI) Extended Abstracts, Portland, USA, Mar. 2-5, 2015. Paper Link
Assistive Technology and Robots for Children with Autism Spectrum Disorder (ASD) (2015 - 2017)
Description: Autism spectrum disorder (ASD) is one of the most significant public health concerns in the United States and globally. Children with ASD have impaired abilities in social interaction, social communication, and imagination, and often have poor verbal ability. Several approaches have been used to help them, including the use of humanoid robots as a new teaching tool. Robots can offer the simplified physical features and controllable environment preferred by autistic children, as well as a human-like conversational environment suitable for learning about emotions and social skills. The SMART Lab is designing a set of robot body movements that express different emotions and a robot-mediated instruction prototype to explore the potential of robots to teach emotional concepts to autistic children. We are also studying a technical methodology, based on embedded devices and semantic information that can later be extended to a cyber-physical system, that can be easily deployed in the daily environment of children with ASD and used to teach them language at low cost. This method will provide verbal descriptions of objects and adapt the level of description to the child's learning achievements.
Grants: Purdue University
People: Huanhuan Wang, Pai-Ying Hsiao, Sangmi Shin
Selected Publications:
- Sangmi Shin, Byung-Cheol Min, Julia Rayz, and Eric T. Matson, "Semantic Knowledge-based Language Education Device for Children with Developmental Disabilities", IEEE Robotic Computing (IRC) 2017, Taichung, Taiwan, April 10-12, 2017. Download PDF
- Huanhuan Wang, Pai-Ying Hsiao, and Byung-Cheol Min, "Examine the Potential of Robots to Teach Autistic Children Emotional Concepts", The Eighth International Conference on Social Robotics (ICSR), Kansas City, USA, Nov. 1-3, 2016. Download PDF