From elder companionship and assistance to exoskeleton rehabilitation to autonomous surgery, robotics is playing a key role in emerging healthcare technology. These new applications demand seamless interaction between the robot and the user, requiring new models that both improve utility and mitigate the inherent risks.
This webinar will discuss the state of the art in UX/UI for scenarios where a human supervises a cobotic cell or a platoon of multi-functional robots performing a variety of tasks: streaming data back to the supervisor, prioritizing process steps, assisting in decision making, and more.
We'll discuss a use case for coordinated motion of multiple robotic arms in surgical robotics. In this use case, the surgeon works at a console, commanding one or more robotic arms. The motion control system is responsible for moving the surgical tool precisely while maintaining constraints such as a remote center of motion at the insertion point and collision avoidance with the other robotic arms and obstacles in the environment. We'll explain how this motion control is achieved in real time while minimizing the surgeon's task load.
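To make the remote-center-of-motion (RCM) constraint concrete, here is a minimal geometric sketch: given a fixed insertion point and a desired tool-tip position inside the patient, the tool shaft must lie on the line through both points. The function name and interface below are hypothetical illustrations, not taken from any particular surgical robotics platform; a real controller would solve this at the joint level with collision avoidance, but the underlying geometric constraint is the same.

```python
import numpy as np

def rcm_constrained_axis(rcm_point, tip_target):
    """Given a fixed remote center of motion (the insertion point)
    and a desired tool-tip position, return the unit shaft axis the
    instrument must align with, plus the insertion depth.

    The RCM constraint means the shaft may pivot and slide at the
    insertion point but never translate away from it, so the shaft
    axis is fully determined by the line from RCM to tip."""
    rcm = np.asarray(rcm_point, dtype=float)
    tip = np.asarray(tip_target, dtype=float)
    shaft = tip - rcm
    depth = np.linalg.norm(shaft)
    if depth < 1e-9:
        raise ValueError("tip target coincides with the RCM point")
    return shaft / depth, depth

# Example: trocar at the origin, tip commanded 3 cm lateral
# and 4 cm deep -> shaft axis (0.6, 0, -0.8), depth 5 cm.
axis, depth = rcm_constrained_axis([0.0, 0.0, 0.0], [0.03, 0.0, -0.04])
```

In a full system, the motion controller converts this constrained shaft pose to joint commands via inverse kinematics and re-solves it at each control cycle as the surgeon moves the console input.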
In addition, we'll cover several examples as a guide to building an intuitive, easy-to-use communication system with the robot through speech, gestures, and voice and facial recognition.
Discussion topics include:
• Evolving design principles
• Improving ease of use during the design process
• Managing multimodal interfaces, including haptic feedback
• Achieving precision motion control