Human-Robot Physical Interaction

Our work on human-robot interaction has focused on understanding, communicating, and controlling tasks that require physical interaction between robots and humans.

Tactile Manipulation - Input Methods

Robotic Assistive Motor Skills Therapy

Distributed Macro-Mini Actuation

Recent Work

  • ICRA/RAL 2021 – Corrective Shared Autonomy for Addressing Task Variability. Abstract: Many tasks, particularly those involving interaction with the environment, are characterized by high variability, making robotic autonomy difficult. One flexible solution is to introduce the input of a human with superior experience and cognitive abilities as part of a shared autonomy policy. However, current methods for shared autonomy are not designed to address the wide ...
  • NASA University Leadership Initiative Lunch and Learn. Industrial systems engineering professor Robert Radwin and REACH lab Ph.D. student Mike Hagenow recently traveled to Washington, DC to take part in the NASA University Leadership Initiative (ULI) project lunch and learn. They were joined by the other awardees from this round, from the University of Illinois Urbana-Champaign and Carnegie Mellon University, as well as staff from the NASA Aeronautics ...
  • NASA Project Funded (Aviation Manufacturing). A new project has been funded through the National Aeronautics and Space Administration’s (NASA) University Leadership Initiative. Greater detail is provided below: University of Wisconsin-Madison engineering professor Michael Zinn is working to improve the efficiency, flexibility and safety of the aviation manufacturing industry as part of a new NASA-funded project. Zinn, an associate professor of mechanical engineering, is collaborating with computer ...
  • CoRL 2018 – Inferring geometric constraints in human demonstrations. Abstract: This paper presents an approach for inferring geometric constraints in human demonstrations. In our method, geometric constraint models are built to create representations of kinematic constraints such as fixed point, axial rotation, prismatic motion, planar motion, and others across multiple degrees of freedom. Our method infers geometric constraints using both kinematic and force/torque information. The ...
  • NSF National Robotics Initiative Project Funded (Communicating Physical Interactions). A new project, entitled “Communicating Physical Interactions,” has been funded through the National Science Foundation’s (NSF) National Robotics Initiative (NRI). An overview of the project is given below: PI: Prof. Michael Gleicher (Department of Computer Science, University of Wisconsin–Madison); Co-PI: Prof. Bilge Mutlu (Department of Computer Science, University of Wisconsin–Madison); Co-PI: Prof. Michael Zinn (Department of Mechanical Engineering, University of Wisconsin–Madison). Overview: In ...
  • NSF National Robotics Initiative Project Funded (High-Power Physically Interactive Human-Robot Collaboration through Balanced Active-Passive Hybrid Actuation). A new project, entitled “High-Power Physically Interactive Human-Robot Collaboration through Balanced Active-Passive Hybrid Actuation,” has been funded through the National Science Foundation’s (NSF) National Robotics Initiative (NRI). An overview of the project is given below: PI: Prof. Peter Adamczyk (Department of Mechanical Engineering, University of Wisconsin–Madison); Co-PI: Prof. Michael Zinn (Department of Mechanical Engineering, University of Wisconsin–Madison). Overview: The research will ...
  • ICRA/RAL 2018 – Recognizing Geometric Constraints in Human Demonstrations using Force and Position Signals. Abstract: This paper introduces a method for recognizing geometric constraints from human demonstrations using both position and force measurements. Our key idea is that position information alone is insufficient to determine that a constraint is active and that reaction forces must also be considered to correctly distinguish constraints from movements that just happen to follow a particular ...
  • Instrumented Tongs – An input method for tactile manipulations. Because most interactions with everyday objects involve complicated force interactions, we designed and built instrumented tongs that measure dynamic interaction with objects in the environment. This novel approach allows accurate measurement of interactions in a controlled environment while still maintaining dexterity. These tongs measure grasp forces using two OptoForce force sensors. We measure the location ...
  • IROS 2017 – Recognizing Actions during Tactile Manipulations through Force Sensing. Abstract: In this paper we provide a method for identifying and temporally localizing tactile force actions from measured force signals. Our key idea is to use the continuous wavelet transform (CWT) with the complex Morlet wavelet to transform force signals into feature vectors amenable to machine learning algorithms. Our method uses these feature vectors to train ...
  • IEEE Haptics Symposium 2008: Large Workspace Haptic Devices – A New Actuation Approach. Large workspace haptic devices have unique requirements: increased power capabilities along with heightened safety considerations. While numerous haptic devices are available, large-workspace systems are hampered by the limitations of current actuation technology. To address this, the Distributed Macro-Mini (DM2) actuation method has been applied to the design of a large workspace ...
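The CWT-based force-feature idea from the IROS 2017 entry above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the wavelet parameter `w0`, the scale list, and the synthetic force trace are all assumptions made for the example.

```python
import numpy as np

def morlet(t, w0=6.0):
    """Complex Morlet wavelet at unit scale (w0 is an assumed center frequency)."""
    return np.pi**-0.25 * np.exp(1j * w0 * t) * np.exp(-t**2 / 2)

def cwt_features(force, scales, w0=6.0):
    """Magnitude of a complex-Morlet CWT of a 1-D force signal.

    Returns an array of shape (len(scales), len(force)); each column is a
    feature vector describing the local frequency content of the force signal.
    """
    n = len(force)
    out = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        # Truncate the wavelet support to ~±4 standard deviations of its envelope.
        m = int(min(8 * s, n - 1))
        t = np.arange(-(m // 2), m // 2 + 1)
        psi = morlet(t / s, w0) / np.sqrt(s)  # L2-normalized scaled wavelet
        # CWT at shift b is correlation with conj(psi); expressed as convolution.
        out[i] = np.abs(np.convolve(force, np.conj(psi)[::-1], mode="same"))
    return out

# Toy force trace: a short oscillatory burst (e.g., a tap) on a quiet baseline.
n = 512
t = np.arange(n)
force = 0.05 * np.random.default_rng(0).standard_normal(n)
force[200:230] += np.sin(2 * np.pi * 0.2 * t[200:230])

feats = cwt_features(force, scales=[2, 4, 8, 16])
print(feats.shape)  # → (4, 512)
```

The CWT magnitudes rise sharply around the burst at the scales matching its frequency, so stacking the per-scale magnitudes at each time step yields the kind of feature vectors the abstract describes feeding to a classifier.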