Consistent behavior and fluid motion
Smooth motion conveys a sense of predictability, safety, and intelligence in the robot. Sudden changes in direction, shaking, or erratic motion make people apprehensive and give the appearance that the robot is out of control. The UR5 from Universal Robots is an example of fast, smooth motion with submillimeter accuracy. Fluid motion is achieved by paying attention to overall system design, which incorporates the selection of motors and gears, mechanical design, and control algorithms; this typically increases the total cost substantially.
Figure 6-22. UR5 © 2013 Universal Robots
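One of the simplest control-algorithm ingredients of fluid motion is a velocity profile that ramps smoothly instead of jumping. The sketch below (an illustrative example, not Universal Robots' actual controller) generates a trapezoidal velocity profile for a one-dimensional move: velocity is continuous and acceleration is bounded, so the motion never exhibits the sudden changes in direction that make a robot look out of control. The function name and parameters are assumptions for illustration.

```python
import math

def trapezoidal_profile(distance, v_max, a_max, dt=0.01):
    """Sample a trapezoidal velocity profile for a 1-D move of `distance` meters.

    Accelerates at a_max, cruises at v_max, then decelerates at a_max.
    Velocity is continuous and acceleration bounded, so there are no
    jerky jumps in the commanded motion.
    """
    t_ramp = v_max / a_max                  # time to reach cruise speed
    d_ramp = 0.5 * a_max * t_ramp ** 2      # distance covered while ramping
    if 2 * d_ramp > distance:
        # Short move: triangular profile, never reaches v_max
        t_ramp = math.sqrt(distance / a_max)
        v_peak = a_max * t_ramp
        t_cruise = 0.0
    else:
        v_peak = v_max
        t_cruise = (distance - 2 * d_ramp) / v_max
    total = 2 * t_ramp + t_cruise

    times, velocities = [], []
    t = 0.0
    while t <= total:
        if t < t_ramp:                      # accelerating
            v = a_max * t
        elif t < t_ramp + t_cruise:         # cruising
            v = v_peak
        else:                               # decelerating
            v = max(0.0, v_peak - a_max * (t - t_ramp - t_cruise))
        times.append(t)
        velocities.append(v)
        t += dt
    return times, velocities
```

Real arm controllers go further (jerk-limited S-curves, multi-joint synchronization), but the principle is the same: bound the derivatives of motion so every transition is gradual.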
Natural feedback and adaptive behavior
This is where the real connection is made between human and robot. Rethink Robotics’ Baxter (Figure 6-23) is a forerunner in collaboration with its force sensing to detect when a human bumps into it or grabs the cuff on its wrist. Haptic feedback guides the user while training the arms, so the arms do not collide with any other part of the robot. The robot also gazes at a location that it intends to move toward, so people can anticipate its next move. Baxter is trained by demonstration, which is a natural way that people teach each other a new task.
Figure 6-23. Getting up close and personal with Baxter (© 2013 Rethink Robotics)
The point about human-robot communication is even more compelling when you consider the driverless car from Google, which you can see in Figure 6-24. Imagine you are stopped in your car and see a driverless Google car stopped at the opposite side of an intersection. When do you move forward? The safest decision is to wait for the Google car to go first because you have no idea what it will do next.
Speech recognition is a viable interface for in-home assistive robots because their mobility gives them the capability to overcome the big challenge of microphone acoustics. The robot can bring the microphone closer to the talker, which increases the strength of the received signal. The robot can also directionally point the microphone at the person speaking to focus on their speech and ignore noises coming from other parts of the room. The Kompai robot from Robosoft shown in Figure 6-25 uses speech recognition as its main interface.
Figure 6-24. The Google driverless car © 2013 Google
Figure 6-25. Robosoft’s Kompai (© 2013 Robosoft)
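The acoustic advantage of mobility is easy to quantify. For a point source in the free field, received sound pressure follows the inverse-square law, so level rises about 6 dB each time the mic-to-talker distance is halved. The helper below is a minimal sketch of that relationship (the function name is an assumption; real rooms add reverberation and so deliver somewhat less than this ideal gain).

```python
import math

def speech_level_gain_db(d_far, d_near):
    """Approximate gain (dB) in received speech level when a mobile robot
    moves its microphone from d_far to d_near meters from the talker,
    assuming free-field inverse-square spreading from a point source."""
    return 20 * math.log10(d_far / d_near)

# Driving from 4 m away to 1 m away yields roughly a 12 dB stronger signal.
print(round(speech_level_gain_db(4.0, 1.0), 2))
```

Directional pickup adds to this: a microphone (or beamforming array) aimed at the talker attenuates noise arriving from other directions, further improving the signal-to-noise ratio before recognition even begins.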
The da Vinci Surgical System depicted in Figure 6-26 is a tremendous example of collaborative robotics. On a personal yet related note, my father-in-law underwent surgery to remove a cancerous tumor on the base of his tongue. The procedure was performed by a doctor at the Boston Medical Center using the da Vinci system. The da Vinci system made it possible for the doctor to access the base of my father-in-law’s tongue without needing to break his jaw open. The robot gave visibility and provided precise movement to minimize damage and recovery time. The human in this collaboration (the doctor) observed data, made decisions, and controlled the motions.
Figure 6-26. The da Vinci Surgical System (© 2013 Intuitive Surgical, Inc.)
The Robonaut shown in Figure 6-27 is a joint development of NASA and General Motors. It can detect when somebody is reaching for an object in its grasp and release the object when the person tugs it. Robonaut stops when somebody puts a hand in the way of its motion and continues when the hand is removed.
Figure 6-27. Robonaut, developed by NASA and GM (courtesy of GM and NASA and Marty Linn)
Figure 6-28. Robonaut demonstrating a familiar human gesture (courtesy of GM and NASA and Marty Linn)