
Collaborative Robotics: The Complexity of Mimicking Humans is Just the Beginning

In Marvel Entertainment's first Iron Man movie, the protagonist, Tony Stark, is aided by a large mechanical assistant known as Dum-E. A scaled-down and much safer version of today's industrial-grade manufacturing robots, Dum-E has enough voice and gesture recognition, plus motion control, to help Tony with his many projects.

Collaborative Robotics

Dum-E not only illustrates the potential of a cobot but also highlights its key functions. For example, this cobot works side by side with Tony, helping with tasks like holding one of Tony's electromechanical boots during a repair. Later, Tony yells at the cobot to stop spraying fire retardant everywhere, and the robot communicates its understanding by sympathetically lowering its arm and emoting a sigh. The cobot's obsession with human safety, keeping Tony safe from fire, highlights its adherence to Asimov's first law of robotics: A robot may not injure a human being or, through inaction, allow a human being to come to harm.

Developing collaborative robots requires creating many complex systems to sense, communicate, and move alongside humans safely and effectively.

Complex Sensing Requirements

To assist humans, cobots use a combination of technologies that mimic the basic human senses to perceive their environment. However, the five senses must work in combination with interoception (sensing internal states) and proprioception (sensing relative position) for the entire range of human motions and actions to be possible. Additionally, cobots must communicate and move, requiring yet another set of systems to talk with, understand, and assist their human coworkers.

Sensing Their Environment: Exteroception

All cobots use some combination of technologies that mimic the basic human senses: Sight/vision, hearing, taste, smell, and touch (Figure 1). These five senses belong to the realm of exteroception—that is, sensitivity to stimuli outside of the body.

Figure 1: This illustration depicts the sensory receptors: Seeing (eye), hearing (ear), smell (nose), taste (tongue), and touch (finger).

To be useful to humans, cobots must have a range of environmental sensors to perform their tasks and stay out of trouble. Common exteroceptive sensors in cobots include vision, hearing, touch, smell, taste, temperature, acceleration, range finding, and more.
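As a rough illustration, the sketch below polls a few exteroceptive channels and halts motion when a reading suggests a human is too close. The sensor fields, thresholds, and read_sensors() stub are assumptions for illustration, not any particular cobot's API.

```python
from dataclasses import dataclass

# A minimal sketch of an exteroceptive safety check. All fields,
# thresholds, and the read_sensors() stub are illustrative assumptions.

@dataclass
class ExteroceptiveReading:
    range_m: float           # range finder: distance to nearest obstacle
    contact_force_n: float   # touch: measured contact force
    ambient_temp_c: float    # temperature of the surrounding workspace

def read_sensors() -> ExteroceptiveReading:
    # Placeholder: a real cobot would poll its hardware drivers here.
    return ExteroceptiveReading(range_m=0.45, contact_force_n=1.2,
                                ambient_temp_c=24.0)

def safe_to_move(r: ExteroceptiveReading) -> bool:
    # Halt if anything (likely a human) is too close or contact force spikes.
    return r.range_m > 0.30 and r.contact_force_n < 5.0

print("proceed" if safe_to_move(read_sensors()) else "halt")
```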

Sensing Their Internal State: Interoception

To be self-maintaining, robots must also be able to know their internal state. This corresponds to interoception in humans, or the ability to perceive innate states of the body like digestion, breathing, and fatigue. For example, a cobot must know when its batteries need charging and autonomously seek out a charger. Another example is a cobot's ability to sense when its internal temperature is too high to work safely next to humans. Other interoception examples involve optical and haptic mechanisms, which we'll cover shortly.
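A minimal sketch of that kind of self-monitoring might look like the following; the battery and temperature thresholds are assumed values, not figures from any real cobot.

```python
# A hedged sketch of interoceptive self-monitoring. The thresholds and
# the two inputs are illustrative assumptions, not a vendor API.

BATTERY_LOW_PCT = 20.0
MOTOR_TEMP_MAX_C = 70.0        # assumed limit for working beside humans

def next_action(battery_pct, motor_temp_c):
    if motor_temp_c >= MOTOR_TEMP_MAX_C:
        return "pause_and_cool"    # too hot to keep operating near people
    if battery_pct <= BATTERY_LOW_PCT:
        return "seek_charger"      # self-maintenance before task work
    return "continue_task"

print(next_action(battery_pct=15.0, motor_temp_c=55.0))  # -> seek_charger
```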

Sensing Their Relative Position: Proprioception

Awareness of the external and internal is critical for the operation and maintenance of a cobot, but to be useful to humans, most cobots must also have proprioception. It is proprioception that allows the human body to move and control limbs without looking at them, thanks to position signals that the brain continuously interprets.

In humans, this results in an awareness of the relative position of body parts and of the strength and effort needed for motion. Human proprioceptors reside in the muscles, tendons, and joints. In cobots, the functions of proprioceptors are mimicked mostly by electromechanical actuators and motors. Proprioceptive measures consist of joint positions, joint velocities, and motor torques.
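The sketch below shows one way such proprioceptive state might be represented and used to confirm, from encoder readings alone, that an arm has reached a target pose. The JointState fields and tolerance are illustrative assumptions.

```python
import math
from dataclasses import dataclass

# An illustrative joint-state record mirroring the proprioceptive
# measures named above: joint positions, joint velocities, and motor
# torques. Field names and the tolerance are assumptions.

@dataclass
class JointState:
    position_rad: float
    velocity_rad_s: float
    torque_nm: float

def at_target(joints, target_positions, tol_rad=0.01):
    # Like human proprioception, the cobot knows its pose from internal
    # encoders alone, without looking at its own arm.
    return all(math.isclose(j.position_rad, t, abs_tol=tol_rad)
               for j, t in zip(joints, target_positions))

arm = [JointState(0.52, 0.0, 1.8), JointState(-1.05, 0.0, 3.2)]
print(at_target(arm, [0.52, -1.05]))   # -> True
```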

Communicating with Humans

Voice and motion are not senses, but they are necessary for humans and robots to communicate and perform tasks. Cobots need voice communication to clarify what they hear and to alert humans to potential dangers. Speech-synthesis hardware and software artificially reproduce human speech.
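As a minimal sketch, the following uses the open-source pyttsx3 text-to-speech library (one option among many) to voice a safety alert; the alert wording itself is an invented example.

```python
# A minimal text-to-speech sketch using the open-source pyttsx3 library.
import pyttsx3

engine = pyttsx3.init()             # select the platform's speech driver
engine.setProperty("rate", 150)     # slower, clearer delivery for alerts
engine.say("Caution: please step back from the work envelope.")
engine.runAndWait()                 # block until the utterance finishes
```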

Today, artificial intelligence (AI) is beginning to enable actual conversations between humans and cobots. Robots can understand the nuances of human speech, such as small talk, half-finished phrases, laughter, and even noncommittal backchannel responses like "uh huh." Sharing resources, such as the conversational floor, is another concept that robots are learning: to prevent talking over one another, robots are taught that only one speaker can "seize the floor" at a time.
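The toy sketch below captures that floor-sharing rule: the robot checks whether a human currently holds the floor before speaking, and treats backchannels like "uh huh" as acknowledgements rather than turn claims. The phrase list and class design are illustrative assumptions.

```python
# A toy sketch of the "conversational floor". The robot speaks only when
# no human holds the floor; backchannel phrases do not seize the floor.

BACKCHANNELS = {"uh huh", "mm-hmm", "yeah", "right"}

class ConversationalFloor:
    def __init__(self):
        self.holder = None                    # None: the floor is free

    def on_human_utterance(self, speaker, text):
        if text.lower().strip() not in BACKCHANNELS:
            self.holder = speaker             # a real turn seizes the floor

    def on_human_pause(self):
        self.holder = None                    # silence releases the floor

    def robot_may_speak(self):
        return self.holder is None            # never talk over a human

floor = ConversationalFloor()
floor.on_human_utterance("Tony", "hold the boot steady")
print(floor.robot_may_speak())                # False: Tony holds the floor
floor.on_human_pause()
print(floor.robot_may_speak())                # True: floor is free
```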

Complexities of Collaboration

Humans use a combination of senses to move, operate, and communicate. One common example is body language, such as hand gestures accompanied by voice commands. For cobots, this type of collaboration requires vision for gesture recognition, speech recognition to perceive commands, and some level of AI to interpret the context of human communication. Tony uses this technique as the primary way to interface with Dum-E.
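A hedged sketch of that fusion step follows: a deictic command like "pick that up" is resolved by pairing it with a simultaneous pointing gesture. The gesture label and object name here stand in for real vision and speech recognizers.

```python
# A sketch of multimodal fusion: the gesture label, pointed-at object,
# and command string are stand-ins for real vision and speech pipelines,
# which would also supply confidence scores.

def fuse_intent(speech_cmd, gesture, pointed_object):
    # Deictic words ("that", "there") are ambiguous from speech alone;
    # a simultaneous pointing gesture supplies the missing referent.
    if "that" in speech_cmd.split() and gesture == "pointing" and pointed_object:
        return {"action": speech_cmd, "target": pointed_object}
    return {"action": speech_cmd, "target": None}

print(fuse_intent("pick that up", "pointing", "torque wrench"))
# -> {'action': 'pick that up', 'target': 'torque wrench'}
```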

Continuing this point is the example of combined sensory input through vision and haptic (or touch) feedback. Consider the real-world example of a surgeon running a simulation of an operation before the actual event (Figure 2). The simulation can create a virtual reality (VR) in which the surgeon can see and rehearse the procedure. However, the surgeon has no way to feel the scalpel's contact with human tissue. This is where haptic feedback helps, because it mimics the sense of touch and force during a computer simulation.

Figure 2: Haptic feedback can make surgical simulations feel more real.

How can a machine communicate through touch? The most common form of haptic feedback uses vibration, such as the buzz of a mobile phone set to silent. In the case of the surgeon, a linear actuator might replace a vibration motor. As the surgeon puts pressure on the simulated scalpel, a linear actuator that moves up and down applies a corresponding pressure to a portion of the surgeon's body via a headband.
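A minimal sketch of that pressure-to-displacement mapping might look like this; both scaling constants are assumed values, not figures from a real simulator.

```python
# A sketch of the mapping described above: simulated scalpel pressure
# drives the headband actuator's displacement. Both constants are
# illustrative assumptions.

ACTUATOR_MAX_MM = 4.0       # assumed travel of the linear actuator
PRESSURE_MAX_KPA = 50.0     # assumed peak pressure in the simulation

def actuator_position_mm(scalpel_pressure_kpa):
    # Clamp the input, then map pressure linearly onto actuator travel.
    p = max(0.0, min(scalpel_pressure_kpa, PRESSURE_MAX_KPA))
    return ACTUATOR_MAX_MM * (p / PRESSURE_MAX_KPA)

print(actuator_position_mm(25.0))   # -> 2.0 mm at half of peak pressure
```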

In the case of a cobot, a parallel example of haptic feedback is found in the cobot’s grippers (or hands). These grippers will often contain a wrist camera for recognition of a grasped object along with force-torque sensors that provide input for a sense of touch.
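As a rough illustration, the loop below closes a gripper until its force-torque reading crosses a target grip force, using a toy stand-in for the hardware; all of the numbers are assumptions.

```python
# A simplified grasp loop that uses a force-torque reading as the sense
# of touch: keep closing until a target grip force is felt. The force
# target, step size, and toy simulation are illustrative assumptions.

GRIP_TARGET_N = 8.0     # firm enough to hold without crushing
STEP_MM = 0.5

def grasp(read_force_n, close_by_mm, max_travel_mm=40.0):
    travelled = 0.0
    while travelled < max_travel_mm:
        if read_force_n() >= GRIP_TARGET_N:
            return True              # felt the object: grasp complete
        close_by_mm(STEP_MM)         # close the jaws in small increments
        travelled += STEP_MM
    return False                     # closed fully without contact

# Toy stand-in hardware: force ramps up once the jaws meet the object.
state = {"closed_mm": 0.0}
def close_by_mm(mm): state["closed_mm"] += mm
def read_force_n(): return max(0.0, (state["closed_mm"] - 12.0) * 2.0)

print(grasp(read_force_n, close_by_mm))   # -> True
```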

Figure 3: Gesture and voice recognition could replace cobot operator interfaces.

Most workers communicate and control cobots by using buttons, joysticks, keyboards, or digital interfaces (Figure 3). However, just as they are for humans, speech and haptics can be effective communication mediums for cobots.

Haptics and eye movement are another way sensory combinations can improve interplay between humans and cobots. As humans point toward an object, they first look in the direction of the object. This anticipatory action can be picked up by a cobot’s vision to provide a tip-off regarding the intention of the human collaborator. Similarly, technology can help a cobot communicate its intentions to humans. Robots now have projectors that highlight target objects or routes that the cobot will take.
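A simple sketch of that gaze tip-off: pick the workspace object whose bearing best matches the measured gaze direction. The object layout and angular threshold are assumed for illustration.

```python
import math

# A sketch of gaze-based intent prediction: choose the workspace object
# whose bearing best matches the human's gaze direction. The object
# coordinates and angular threshold are illustrative assumptions.

OBJECTS = {"wrench": (1.0, 0.2), "boot": (0.3, 1.1), "panel": (-0.8, 0.5)}

def likely_target(gaze_angle_rad, max_err_rad=0.3):
    best, best_err = None, max_err_rad
    for name, (x, y) in OBJECTS.items():
        err = abs(math.atan2(y, x) - gaze_angle_rad)
        if err < best_err:
            best, best_err = name, err
    return best            # None if no object lies near the gaze line

# The human glances toward the wrench before pointing at it:
print(likely_target(0.20))   # -> "wrench"
```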

Summary

As with any emerging technology, there are still many challenges that face the world of collaborative robots as they work side-by-side with humans. Like Tony Stark in Iron Man, will humans find cobots more frustrating than useful?

Today, most robots have become quite good at combining voice and visual recognition to assist humans. What is still lacking is the cobot's ability to understand context and respond to complex situations. AI will be essential to enable cobots to truly interact, anticipate, and communicate, especially when they need to hand off certain complex tasks to humans, which remains an ongoing problem in autonomous vehicles.
On an optimistic note, toward the end of the Iron Man movie, at a critical juncture where Tony lay dying because he couldn't reach his artificial heart on a nearby table, Dum-E saved his life by figuring out what Tony wanted and performing the right task. Tony then looked up at Dum-E and simply said, "Good boy." Establishing this level of trust between cobots and humans is perhaps the hardest but most worthwhile goal.

About the Author

John Blyler

John Blyler is a technology professional with expertise in multi-discipline Systems Engineering, technical program life-cycle management (PLM), content development, and customer-facing projects. He is an experienced physicist, engineer, manager, journalist, textbook author, and professor who continues to speak at major conferences and before the camera. John has many years of experience leading interdisciplinary (mechanical-electronic, hardware-software) engineering teams in both the commercial and Mil/Aero semiconductor and electronics industries. Additionally, he has served as editor-in-chief for technical trade journals and the IEEE professional engineering society publications. He was the founding advisor and affiliate professor for Portland State University’s online graduate program in systems engineering. Finally, John has co-authored several books on systems engineering, RF wireless design, and automotive hardware-software integration for Wiley, Elsevier, IEEE, and SAE.

Source: Mouser Electronics


About Mouser Electronics

Mouser Electronics, a Berkshire Hathaway company, is an award-winning, authorized semiconductor and electronic component distributor focused on rapid New Product Introductions from its manufacturing partners for electronic design engineers and buyers. The global distributor’s website, Mouser.com, is available in multiple languages and currencies and features more than 5 million products from over 750 manufacturers. Mouser offers 23 support locations around the world to provide best-in-class customer service and ships globally to over 600,000 customers in more than 220 countries/territories from its 750,000 sq. ft. state-of-the-art facility south of Dallas, Texas.
