
Robotics Track
Track Chair: Hongwei Hsiao, Texas A&M University
Track Sponsor: Liberty Mutual Insurance

Hybrid Workforce in Robotics in the Workplace

  • Exploring the current hybrid workforce in robotics and the future of work

  • Uncovering human-robot communication technologies for safety and efficiency

  • Finding insights in robotics methods and their various applications

  • Discussing challenges and best practices in hybrid robotics for inclusiveness with experts, such as robot trustworthiness, safety awareness, and worker stress

  • Networking for partnerships on hybrid robotics research for worker health and well-being, such as remote operation, virtual simulation, and mixed work scheduling

Keynote Presenter
Human and Robot, a Collaborative Workforce

Nia Jetter, Amazon

Nia Jetter is passionate about changing the world through innovation, technology planning, teaching, mentoring, and solving tough problems in autonomy and AI that can be applied across different platforms. She has a dedicated focus on helping people who may not have easy access to educational materials understand topics like artificial intelligence. Nia is enthusiastic about working on the human-AI interface as artificial intelligence is further integrated into our society.


Nia is an aerospace engineer with 20 years of experience in the aerospace industry. She has developed key algorithms supporting a variety of programs across the product lifecycle, from design and development to mission and anomaly resolution, and through customer delivery and support. In January 2021, Nia left the aerospace industry as a Technical Fellow to join Amazon as a Senior Principal Technologist in Robotics. In this role, as a leader in technical development for autonomy as well as strategic planning for robotics and other autonomous applications, Nia has led an autonomous mobile robot destined for deployment in an unstructured environment through Preliminary Construction Review for safety certification, and has established best engineering practices for the product as Chief Engineer.


Nia has a bachelor’s degree in Math with Computer Science and a minor in Earth, Atmospheric, and Planetary Sciences from MIT, as well as a master’s degree in Aeronautical and Astronautical Engineering from Stanford. Nia enjoys reading (especially science fiction), astronomy, baking, travelling, and dancing. For more information, please see her website.

Track Program
*Please see the Presenters & Bios Page for presenter biographies.

Saturday, October 15th


8:30 AM - 9:30 AM

10:30 AM - 11:30 AM

1:00 PM - 2:00 PM

2:00 PM - 3:00 PM

3:30 PM - 4:30 PM



Trustworthiness of Human-Robot Interface
Moderator: Robert Radwin, University of Wisconsin

Toward A Safer Work Environment: Robot Posture Adaptation in Human-Robot Collaboration

Karen Chen, North Carolina State University

While collaborative robots (cobots) possess various safety features, such as limited end-effector speed and torque sensors, the OSHA Severe Injury Reports database has documented a number of robot-related mishaps, one-third of which occurred during normal, uninterrupted automated operations. This research aims to mitigate human-robot collisions through computer-vision-based human activity recognition. A collision avoidance scheme employing a two-level hazard zone was defined around a cobot’s end-effector. Upon detection of a worker, the worker’s position and movement speed, determined by VideoPose3D (a computer vision algorithm), activated a series of visible and audible warnings and cobot end-effector retraction. Evaluation of this collision avoidance scheme demonstrated successful end-effector retraction and increased intensity and frequency of the visual and auditory warnings as a worker continued to approach the cobot. In sum, this collision avoidance scheme is a plausible safety mechanism, but it should be noted that it is primarily designed to be active when interactions between a worker and a cobot are not supposed to take place.
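The decision logic of such a two-level hazard zone can be sketched as follows. This is a minimal illustration only; the zone radii, speed threshold, and response labels are assumptions for exposition, not the parameters used in the study.

```python
# Illustrative two-level hazard-zone policy around a cobot end-effector.
# All thresholds below are assumed values for illustration, not the
# parameters reported in the research described above.

OUTER_ZONE_M = 1.5   # outer (warning) zone radius, meters (assumed)
INNER_ZONE_M = 0.7   # inner (retraction) zone radius, meters (assumed)
FAST_APPROACH_MPS = 0.5  # speed above which warnings escalate (assumed)

def collision_response(distance_m: float, speed_mps: float) -> str:
    """Map a detected worker's distance and approach speed to a response."""
    if distance_m > OUTER_ZONE_M:
        # Worker outside both zones: continue normal operation.
        return "normal_operation"
    if distance_m > INNER_ZONE_M:
        # Level 1: worker in outer zone; issue visual/audible warnings,
        # escalating intensity when the worker approaches quickly.
        return "warn_escalated" if speed_mps > FAST_APPROACH_MPS else "warn"
    # Level 2: worker in inner zone; retract the end-effector.
    return "retract_end_effector"
```

In this sketch, warnings intensify with proximity and approach speed, and retraction triggers only in the inner zone, mirroring the escalating response described in the abstract.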

Ergonomics in Robotic-Assisted Manufacturing and Remanufacturing

Boyi Hu, University of Florida

Traditionally, e-waste recycling is labor intensive and poses multiple safety threats to workers. To reduce safety risks and enhance working efficiency, collaborative robots (cobots) might be a viable option. Therefore, the feasibility of deploying cobots in e-waste disassembly operations needs to be investigated. The major objective of this study is to evaluate the effects of working with a cobot during e-waste disassembly processes on human workload and ergonomics through a lab-based human subject experiment. Statistical results revealed that using a cobot to assist a desktop disassembly task significantly reduced the sum of the NASA-TLX scores compared to the human-only disassembly condition (p = 0.001). A significant reduction was also observed in participants’ mean L5/S1 flexion angle as well as mean shoulder flexion angle on both sides when working with the cobot (p < 0.001). However, participants took significantly longer to accomplish the disassembly task when working with the cobot (p < 0.001). Results from this study could advance knowledge of how human workers would behave and react during human-robot collaborative e-waste disassembly tasks, and shed light on the design of better human-robot collaboration (HRC) for this specific context.

Understanding Safety and Trust of Human-Robot Interaction

Marvin Cheng, NIOSH

Multi-robot-multi-worker (MRMW) workspaces have become common in the manufacturing and transportation industries, making safe and effective human-robot interaction in these collaborative environments a critical issue. Unfortunately, current safety standards for industrial robotic applications still leave a large gap between regulated operations and the working efficiency of robotic devices while human workers are present in the same workspace. Human-robot trust can affect performance in such an environment. To ensure a safe collaborative environment for human workers, safe human-robot interaction (HRI), proactive collision-prevention strategies between robotic devices and human workers, and analysis of robot dynamics in human-robot collaboration are the major research activities in the Robotics Research Lab at NIOSH. By integrating machine vision and deep learning approaches to motion recognition, collaborative robotic devices can actively avoid unscheduled contact to prevent potential injuries. With the assistance of biosensor measurements, quantitative models of human trust levels can be investigated.

Human-Drone Collaboration
Moderator: Craig Schlenoff, NIST

Safe Human-Drone Collaboration in Construction

Masoud Gheisari, University of Florida

We have seen a tremendous increase in the use of drones on construction sites. While beneficial, integrating drones in construction raises novel occupational safety and health issues for construction workers, which are critical to identify, understand, and evaluate. Risks arise from unintended physical contact between drones and human workers or the cognitive interaction between workers and drones that may affect workers’ attentional and psychological states. Every interaction, cognitive or physical, creates the potential for an accident. This presentation discusses a series of studies on the safety challenges of human workers working with or near drones. We first envision a future of construction work where human workers constantly interact and collaborate with these aerial robots to do their work on construction jobsites. We then discuss the novel health and safety risks that result from these worker-drone interactions. We will mainly focus on the physical risks, attentional costs, and psychological impacts of such interactions on human workers who work directly or indirectly with or around these robots. Finally, we will provide recommendations and guidelines to ensure the safe integration of these robots in construction.

The Lack of Realistic Workload Models for Single Human Supervising Multiple (Semi)Autonomous Drones

Julie A. Adams, Oregon State University

Models of a single human supervising multiple (semi)autonomous drones were developed to provide insight into the factors that impact human performance, identify research gaps, and inform future human-in-the-hardware-loop evaluations. A loosely coupled delivery drone task and a tightly coupled wildland firefighting task were developed and modeled. Neither the human factors nor the human-robot interaction literature provides concrete evidence of realistic workload models to support model development for multiple-drone scenarios; rather, the literature focuses on generic en route scenarios. This presentation will discuss the limitations of existing models and some associated gaps to be addressed in order to accurately model realistic applied domains.

Human-Machine Interaction Strategies For Controlling Unmanned Aerial Vehicles

Jose Baca, Texas A&M University

Over the years, unmanned autonomous systems (UAS) such as ground robots and aerial robots, a.k.a. drones, have gained increasing popularity within science and engineering. They can be used for exploration and data acquisition over areas that are difficult to access and/or for a wide variety of missions. UAS have proven useful in reducing human risk and time consumption. Previous studies have shown the feasibility of one operator controlling one robot, as well as one operator controlling multiple robots via a joystick or predefined waypoint navigation. However, when it comes to controlling a multi-robot system in real time by a single operator, several challenges arise; e.g., it is difficult to control the direction of a team of robots or a swarm, the right timing for the execution of tasks, or simply the selection of certain robots within the team. In this research project, we propose the development of an intuitive way to control multi-robot systems via human body language. We have defined a human-machine interaction strategy where body posture and gestures are used to control the UAS team in a more intuitive manner; this is called semi-autonomous mode. The focus of this research is to broaden the capabilities of UAS and their integration into our everyday lives. The outcome of the project can easily be extended to other fields, such as search and rescue operations and the monitoring and inspection of large facilities by private companies.

Human-Centered Design
Moderator: Lixiao Huang, Arizona State University

Advanced Robotics and its Impact on Safety and Health at Work

Sascha Wischniewski, Federal Institute for Occupational Safety and Health, Germany

The impact of new technologies and digitalization on occupational safety and health (OSH) has become an increasingly researched topic over the last decades. The introduction of systems that provide physical work assistance, like advanced robotics, has changed modern workplaces. Innovations in AI-based software have further increased the breadth of possible applications. Advanced robotic systems can nowadays perform increasingly complex physical tasks, with more autonomy than previous technologies. This also expands the range of OSH dimensions that need to be considered when working with them.


Three major OSH dimensions within a workplace context can be distinguished: physical, psychosocial, and organizational. The effects of applying the technology can present both opportunities and challenges. Physical OSH impacts are most commonly observed as a result of physical alterations to the workplace introduced by new technology. These can include increased workplace safety and hazard reduction, but also new risks emerging from unexpected malfunctions or collisions. Organizational effects are most often related to, and dependent on, the technology introduction process, change management, and training. Positive effects can be achieved through employee participation and fostering upskilling, while deskilling and misuse can result from mishandling these processes. The psychosocial effects on workers can be observed in the context of function allocation, task and interaction design, and the mode of operation and supervision in relation to the technology. Here, factors like acceptance, motivation, and social support can influence OSH, just like fear of job loss, perceived monitoring, and automation bias.


The presented results are based on a research project performed by the German Federal Institute for Occupational Safety and Health (BAuA) on behalf of the European Agency for Safety and Health at Work (EU-OSHA), together with the Universities of Leicester and Essex as well as Milieu Consulting. The knowledge base for the project was created through a systematic literature review that included 183 publications, expert interviews, and an EU-OSHA focal point consultation.

The Challenges of Collaboration in a Hybrid Human-Robot Workforce

Gwen Bryan, Florida Institute for Human and Machine Cognition

The challenges of teamwork in a hybrid human-robot workforce are the same as those at the heart of human-centered design: designing and building systems that work effectively with people. This involves understanding both the machines and the people they are being designed to work with. Specifically, it requires an understanding of teamwork to guide the design, techniques to build systems that support teamwork, and ways to measure the effectiveness of teamwork to ensure we are achieving our performance goals.

The Present and Future of Collaborative Robotics

Matthew Gombolay, Georgia Tech

Robot teams are increasingly being deployed into human-robot teaming environments, such as manufacturing and disaster response, to enhance the safety and productivity of human workers. Adaptive decision-making algorithms are essential to satisfy and optimize domain-specific temporospatial constraints and human factors considerations. Unfortunately, exact methods do not scale to real-world problem sizes, and ad hoc heuristics need domain-expert knowledge that is difficult to solicit and codify. In this talk, I will share how we are developing novel architectures and optimization methods for graph neural networks to model and dynamically coordinate human-robot teams. I will show how our techniques can learn rich representations of complex scheduling problems without the need for ad hoc, manual feature and reward engineering. Finally, I will discuss human-factors insights we have gleaned through human-subject experimentation for how robots can explore the latent capabilities of their human teammates to maximize human-robot team fluency.

Human-Robot Interface Simulation
Moderator: Menekse Barim, NIOSH

Modeling Computer-Based Agent Behaviors According to Rasmussen's Abstraction-Decomposition Space in Various Hybrid Work Contexts

Joseph Manganelli, Xplr Design

This presentation summarizes a work-in-progress modeling framework showing that cognitive work analysis and Rasmussen's Abstraction-Decomposition Space analytic models map cleanly onto, and integrate usefully into, the Object Management Group's Systems Modeling Language (SysML), providing a technical basis for representing HART concerns in a shared representational framework built from methods and tools that already exist. Case studies show examples of healthcare and industrial hybrid workflows. Current thoughts on how to represent computer-based agent behaviors according to Rasmussen's Skills, Rules, Knowledge framework are also shared.

Human Gait and Motor Performance in a Physics-Based Virtual Reality Simulation Testbed

Suman Chowdhury, Texas Tech University

Virtual reality (VR) systems have been widely used for sensorimotor training and rehabilitation. Recent advancements in VR research, using VR trackers or VR-integrated motion capture systems, can assist researchers in developing a physics-based VR system that provides both physical and cognitive interaction with objects. We recently developed a low-cost, physics-based VR testbed that provides real cutaneous and kinesthetic haptic feedback from objects instead of computer-generated haptic feedback. This study aimed to evaluate the system by comparing users’ motor control biomechanics (i.e., neuromuscular and visuomotor performance) while they performed three human-robot collaborative (HRC) sequential pick-and-place lifting tasks. We determined the efficacy of the VR testbed on the sensorimotor performance of human participants by comparing their joint movement kinematics and kinetics for the same tasks performed in virtual and real environments. We hypothesized that biomechanical parameters, such as joint angles, electromyography-based motor performance, and joint reaction forces, would exhibit limited discrepancies between the tasks performed in virtual and real environments. Results showed minimal discrepancies between the two environments. The innovation of our physics-based VR testbed lies in providing actual haptics from a real object while subjects are immersed in the simulated virtual environment, thereby overcoming the limitations of computer-generated haptics.

Applications of Digital Twin for Modern Industrial Robotics

Jeremy Marvel, NIST

A defining characteristic of modern industrial robotic applications is the increased integration of information to enable more flexible and intelligent applications. From sensor-driven robotic operations at the workcell level to data-enabled, full-factory workflow optimization, the utility of procedural information has had a dramatic impact on the manufacturing process. As we move toward a fifth industrial revolution, in which human involvement and connectivity become even more crucial, presenting this information in a meaningful way becomes even more critical. The concept of the digital twin thus becomes prominent as a mechanism for providing crucial information about physical systems in a digital world. In this presentation, Dr. Jeremy Marvel will present various mechanisms by which a digital twin can be leveraged in human-robot systems to improve system performance, provide system verification and validation, and present transparent process and health monitoring of robotic systems.

Mobile Robot Applications
Moderator: Hongwei Hsiao, Texas A&M University - Corpus Christi

Safe and Efficient Human-Robot Collaboration: Enabling Technologies
Fahrudin Alagic, Amazon Robotics; Mikell Taylor, Amazon Robotics; and Justin Croyle, Amazon Robotics


Enabling a hybrid human/robot workforce requires design for collaboration at the system level, which in turn requires design expertise in many functional areas, including proper interpretation and implementation of relevant functional safety and regulatory standards, safety-rated hardware and software development, human-robot interaction, virtual simulation, and sensing technologies. The highest priorities when developing collaborative robots and robotic systems are human safety, human perception of those products, efficacy, and efficiency. Unsafe products are simply unacceptable. Beyond that, products perceived by users as unsafe, even if they meet safety regulations, will struggle to achieve acceptance, and systems whose safety design leads to inefficient operation result in long ROIs or may be completely infeasible. The balance of safety and solution efficiency is not well addressed in the market today. In this talk, we discuss a different approach for seamless human-robot interaction and, in turn, highly efficient robotic systems.

Mobile Manipulation for Healthcare 

Charlie Kemp, Georgia Tech

Mobile robots with arms (mobile manipulators) can meaningfully benefit people with disabilities, but versatile mobile manipulators have been too big, heavy and expensive to be practical. Charlie Kemp, director of the Healthcare Robotics Lab at Georgia Tech and co-founder of Hello Robot, will present research that led to Hello Robot’s Stretch, a compact and lightweight mobile manipulator that achieves a new level of affordability. He’ll also show examples of work by the growing community of researchers and developers using Stretch.

Achieving Functional Independence in Daily Activities with the Stretch Mobile Manipulator

Vy Nguyen, Hello Robot Inc. 

For people with and without a disability, participation in everyday activities is essential to health and well-being. Achieving functional independence promotes an individual’s ability to engage fully in life situations that are purposeful and meaningful. However, people with severe motor impairments, such as quadriplegia, may experience greater physical, social, and environmental barriers limiting their ability and access to participate in their activities more independently. Furthermore, they frequently rely on their care partners to assist them with everyday activities. Subsequently, the care partners who provide complete assistance to their loved ones may experience significant caregiver strain that impacts their health, well-being, and ability to engage in meaningful activities. As a multidisciplinary team composed of an occupational therapist, roboticists, and human factors specialists, we believe these challenges can be confronted and relieved by robotic technology designed for assistive use cases in the home, workplace, clinic, or community. We present a study that takes place in a home context. It explores how the Stretch mobile manipulator, created by Hello Robot Inc., was used by a non-speaking older adult with quadriplegia to perform his desired daily activities, promoting autonomy, independence, and social participation while reducing the level of assistance required from his care partner. We will also present how using Stretch can enrich the personal needs of his care partner and assist with caregiving demands, such as engaging in exercise and meal delivery/clean-up.
