WS02 - How to make collaborative robots more collaborative for real environments

Workshop organized by

Pedro Dinis Gaspar, University Beira Interior, Portugal, Juan Antonio Corrales Ramón, University of Santiago de Compostela, Spain, Chedli Bouzgarrou, Clermont-Auvergne INP, France, Daniel Sánchez, AIJU Technological Institute, Spain.


Workshop type

Full-day workshop.


Focus

The organizers of this workshop collaborate in the research project ROBOTA-SUDOE, which defines the main topics of this workshop. The project's central challenge is the technological modernization of traditional economic sectors of the SUDOE area (in particular, the agri-food industry, fruit and meat, and plastics/toy manufacturing) where robots have not been applied until now. The main objective of the project is therefore to improve the competitiveness and growth of SMEs through collaborative robotics solutions that respond to their endogenous challenges: a high number of sick leaves among workers due to hard working conditions, reduced productivity, and a lack of attractiveness for new workers. To achieve this objective, ROBOTA proposes to increase the adaptability of current collaborative robots by adding four main elements, which are the topics covered by this workshop:

  • Self-sensing soft grippers
    Self-sensing soft grippers can increase the dexterity of current collaborative robots. Task-specific designs, enabled by the flexibility of additive manufacturing, enhance the inherent compliance of soft gripping technologies and reduce handling pressures through increased surface contact. These grippers exploit the piezoresistive properties of commercially available materials to embed self-sensing capabilities, enabling the monitoring of grasp status and advancing the collaborative potential of soft robotics. Advances in the development of cost-effective soft gripper fingers with customizable, task-specific designs will be presented. Material properties and multimaterial approaches to additive manufacturing will be analyzed to facilitate customized and localized sensor integration. Practical applications in agri-food packaging demonstrate industrial performance and the remaining challenges. Participants will thus gain insights into current advancements and future research directions in self-sensing soft gripping technologies for industrial automation.
  • Perception of humans working together with collaborative robots
    The focus is on enhancing the perception capabilities of collaborative robots to improve their interactions with human workers. This includes developing advanced vision systems, tactile sensing, and AI-driven algorithms that enable robots to accurately detect and interpret human actions, intentions, and safety zones. By improving perception, robots can adapt their movements and tasks dynamically, ensuring safe and efficient collaboration. The discussion will also cover how these perception technologies can be effectively integrated into existing industrial and healthcare environments, making robots more intuitive and responsive. This approach bridges the gap between autonomous operation and seamless human-robot interaction, advancing the practical deployment of collaborative robotics. Applications in the three use cases of the ROBOTA project (fruit, toy/plastics, and meat) will show the generalization capabilities of these techniques across different users and products. In the project, smooth cooperation between human operators and robots will be enabled through extensive multimodal sensing, including force-torque sensing, tactile sensing, stereo vision, and human motion capture using IMU systems.
  • Learning from demonstration (LfD) of complex manipulation tasks
    Focusing on the acquisition of complex manipulation skills, this approach allows collaborative robots to learn directly from human demonstrations. By observing and analysing human movements, force dynamics, and decision-making patterns, robots can replicate tasks accurately and adapt to variations without extensive programming. In industry, LfD has successfully enabled robots to perform tasks such as grasping, assembly, and human-robot collaboration, with kinesthetic teaching being widely used for its simplicity. However, challenges remain, including generalizing to new scenarios and handling suboptimal demonstrations. Recent advancements, such as combining LfD with reinforcement learning, have improved sample efficiency and robustness for complex tasks. Additionally, dynamical movement primitives (DMPs) enhance generalization across platforms, further expanding industrial applications. Overall, the emphasis lies on integrating machine learning techniques to enable robots to generalize these skills effectively to different scenarios and tasks. Enhancing scalability and adaptability in this way aims to improve the flexibility and autonomy of collaborative robots, making them more capable of managing diverse tasks in industrial environments alongside human workers. Real tasks in the toy/plastics sector (doll assembly and demoulding) will demonstrate the applicability of the proposed techniques.
  • Force control strategies for human assistance and HRI (Human–Robot Interaction)
    For tasks in which the human expert’s experience in contact handling is essential and difficult to replicate with fully autonomous robots, collaborative robotic arms can act as physical assistants within a Human–Robot Interaction (HRI) framework. In this context, cobots support the human operator by reducing the overall physical effort required to perform the task, while preserving human control and expertise. Classical force control strategies for robot arms typically consider only static relationships between the robot end-effector position and the applied forces. To enable effective and natural HRI, new AI-based algorithms must be integrated into these control strategies to dynamically adapt to different users, account for diverse ergonomic behaviors, and predict human motion during close collaboration. This allows robotic assistance to remain unobtrusive and intuitive, ensuring smooth cooperation while both human and robot jointly manipulate the tool required for task execution. Three real-world applications of these techniques will be presented: meat cutting, doll demoulding and patient rehabilitation.
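The piezoresistive self-sensing principle of the first topic can be sketched as a simple signal chain: the finger's resistance is read through a voltage divider, and a first-order calibration maps its relative change to an estimated grasp force. All constants below (supply voltage, reference resistor, unloaded resistance, gain) are illustrative assumptions, not values from the ROBOTA project.

```python
# Minimal sketch (hypothetical values): estimating grasp force from a
# piezoresistive soft-gripper finger read through a voltage divider.

V_SUPPLY = 5.0      # divider supply voltage [V] (assumed)
R_REF = 10_000.0    # fixed reference resistor [ohm] (assumed)
R0 = 12_000.0       # unloaded sensor resistance [ohm] (assumed)
GAIN = 0.8          # calibration gain [N per unit relative resistance change]

def sensor_resistance(v_out: float) -> float:
    """Invert the voltage divider to recover the piezoresistive element."""
    return R_REF * v_out / (V_SUPPLY - v_out)

def grasp_force(v_out: float) -> float:
    """First-order force estimate from the relative resistance change."""
    delta = (sensor_resistance(v_out) - R0) / R0
    return GAIN * abs(delta)

def grasp_detected(v_out: float, threshold: float = 0.05) -> bool:
    """Report a grasp when the estimated force exceeds a threshold [N]."""
    return grasp_force(v_out) > threshold
```

In practice the resistance-to-force map of printed piezoresistive materials is nonlinear and hysteretic, so a per-finger calibration curve would replace the single gain used here.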
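The safety-zone monitoring mentioned in the perception topic can be illustrated with a minimal speed-and-separation scheme in the spirit of ISO/TS 15066: the allowed end-effector speed scales with the measured human-robot distance. The zone boundaries and speed limit below are illustrative assumptions, not project values.

```python
import math

# Minimal speed-and-separation sketch: scale the cobot's allowed speed
# with the measured distance between the human and the tool.

STOP_DIST = 0.3        # [m] below this distance the robot must stop (assumed)
FULL_SPEED_DIST = 1.5  # [m] beyond this distance full speed is allowed (assumed)
V_MAX = 0.25           # [m/s] nominal collaborative speed limit (assumed)

def human_distance(human_xyz, tool_xyz) -> float:
    """Euclidean distance between tracked human and end-effector [m]."""
    return math.dist(human_xyz, tool_xyz)

def speed_limit(distance: float) -> float:
    """Linearly interpolate the allowed speed between the two zones."""
    if distance <= STOP_DIST:
        return 0.0
    if distance >= FULL_SPEED_DIST:
        return V_MAX
    return V_MAX * (distance - STOP_DIST) / (FULL_SPEED_DIST - STOP_DIST)
```

A real deployment would fuse the multimodal sensors listed above (stereo vision, IMU motion capture) into the distance estimate and account for robot stopping time when sizing the zones.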
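As a concrete illustration of the dynamical movement primitives mentioned in the LfD topic, the sketch below learns a one-degree-of-freedom forcing term from a single demonstration and reproduces the motion toward a new goal. Gains, basis-function counts, and the omission of goal scaling are simplifications for brevity, not the project's implementation.

```python
import numpy as np

# Minimal one-DOF discrete DMP: transformation system
#   ydd = alpha * (beta * (g - y) - yd) + f(x),
# with the forcing term f learned from one demonstration and gated by
# the canonical phase x, so the motion always converges to the goal g.

class DMP1D:
    def __init__(self, n_basis=20, alpha=25.0, beta=6.25, alpha_x=4.0):
        self.n, self.alpha, self.beta, self.ax = n_basis, alpha, beta, alpha_x
        self.c = np.exp(-alpha_x * np.linspace(0, 1, n_basis))  # centers in x
        self.h = 1.0 / np.gradient(self.c) ** 2                 # widths
        self.w = np.zeros(n_basis)

    def _psi(self, x):
        return np.exp(-self.h * (x - self.c) ** 2)

    def fit(self, y_demo, dt):
        """Fit forcing-term weights to one demonstrated trajectory."""
        yd = np.gradient(y_demo, dt)
        ydd = np.gradient(yd, dt)
        g = y_demo[-1]
        x = np.exp(-self.ax * dt * np.arange(len(y_demo)))  # canonical phase
        f_target = ydd - self.alpha * (self.beta * (g - y_demo) - yd)
        for i in range(self.n):  # locally weighted regression per basis
            psi = np.exp(-self.h[i] * (x - self.c[i]) ** 2)
            self.w[i] = np.sum(x * psi * f_target) / (np.sum(x**2 * psi) + 1e-10)

    def rollout(self, y0, g, dt, steps):
        """Reproduce the learned motion toward a (possibly new) goal g."""
        y, yd, x, traj = y0, 0.0, 1.0, []
        for _ in range(steps):
            psi = self._psi(x)
            f = x * (psi @ self.w) / (psi.sum() + 1e-10)
            ydd = self.alpha * (self.beta * (g - y) - yd) + f
            yd += ydd * dt
            y += yd * dt
            x += -self.ax * x * dt
            traj.append(y)
        return np.array(traj)
```

Because the forcing term is gated by the decaying phase x, the reproduced trajectory converges to the new goal even when it differs from the demonstrated one, which is the generalization property exploited for the doll assembly and demoulding tasks.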
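The force-control assistance described in the last topic is often realized with an admittance scheme: the measured human interaction force drives a virtual mass-damper whose output is a velocity command for the cobot end-effector. The sketch below shows a single Cartesian axis; the virtual mass, damping, and control period are illustrative assumptions.

```python
# Minimal admittance-control sketch for physical human-robot assistance:
# one Euler step of the virtual dynamics  M * a + D * v = f_human,
# returning the next velocity command for one Cartesian axis.

M_VIRT = 2.0    # virtual mass [kg] (assumed)
D_VIRT = 20.0   # virtual damping [N*s/m] (assumed)
DT = 0.002      # control period [s] (assumed)

def admittance_step(v_prev: float, f_human: float) -> float:
    """Integrate the virtual mass-damper one control cycle."""
    a = (f_human - D_VIRT * v_prev) / M_VIRT
    return v_prev + a * DT
```

Under a constant applied force f the commanded velocity settles at f / D_VIRT, so lowering the virtual damping makes the shared tool feel lighter to the operator; the AI-based adaptation described above would adjust these virtual parameters per user and task phase.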


Contact for more details

If you would like to know more about the workshop, please contact Daniel Sánchez (danielsanchez@aiju.es).