Big Data Digest
Compiled by Yuan Yuan Wang, Lei Li, and Xin Yi Song
Have you ever dreamed of Mobile Suit Gundam, fantasizing about implanting human consciousness into a machine?
The Massachusetts Institute of Technology recently unveiled HERMES, a humanoid robot that can be operated remotely for agile mobility.
Researchers hope it can replace humans in search-and-rescue missions: when facing extremely dangerous environments, an operator can control it from a first-person perspective through a head-mounted display.
A wake-up call from the tragedy: the importance of rescue robots
The disaster at the Fukushima Daiichi nuclear power plant, caused by the 2011 earthquake and tsunami in Japan, was a wake-up call. In the disaster, high-risk radiation prevented workers from taking emergency measures, and they couldn't even operate the pressure valves. This is a task that would actually be best done by robots, but at the time neither Japan nor the rest of the world had the capacity to make it a reality.
The Fukushima disaster made many in the robotics community realize that rescue robots needed to go from the lab to the world.
Since then, rescue robots have made significant progress. Research teams around the world have demonstrated unmanned ground vehicles that can drive through debris, robotic snakes that can squeeze through narrow gaps, and drones that map disaster sites from the air. Researchers are also building bionic robots that can assess damage and perform critical tasks, such as operating instrument panels or carrying emergency equipment.
Despite this progress, building robots with the motor and decision-making abilities of emergency workers remains a challenge. Pushing open heavy doors, operating fire extinguishers, and other simple-sounding but difficult tasks require a level of coordination that no robot has yet mastered.
Putting the human brain inside the machine
The ideal rescue robot would be agile and highly autonomous. It would, for example, be able to enter a burning building on its own to find a victim, or locate a valve that needs to be shut off in a damaged industrial facility.
But disaster sites are unpredictable, and walking in these complex environments requires a high degree of adaptability that current rescue robots are not capable of. If an autonomous robot encounters a doorknob but can't find a match in the doorknob database, the mission fails. If the robot gets its arm stuck and doesn't know how to save itself, the mission fails.
Humans can cope with this easily: we can adapt and learn all the time, we can recognize changes in the shape of objects, cope with poor visibility, and learn how to use new tools on the fly in the field. The same goes for our motor skills. When running with weights, for example, we may run slower or not as far, but we can still run, and our bodies can easily adapt to new changes.
Wouldn't it be easier to put a human brain into a machine?
One solution to this shortcoming is teleoperation: an operator remotely controls the robot, either continuously or just for specific tasks, helping it do more than it is capable of on its own.
Remote-controlled robots have long been used in industrial, aerospace, and underwater environments. More recently, some researchers have experimented with using motion capture systems to transfer human movements to bionic robots in real time: you wave your arm and the robot mimics your posture. For a fully immersive experience, special goggles allow the operator to see what the robot sees through the camera, and haptic undershirts and gloves provide the operator's body with a sense of touch.
At MIT's Biomimetic Robotics Laboratory, the research team is pushing human-robot integration even further, developing teleoperation systems in the hope of accelerating the arrival of practical rescue robots. They are building a teleoperated robotic system with two parts: a bionic robot capable of agile, dynamic behavior, and a new two-way human-machine interface that relays human and robot movements to each other.
By linking robots to humans, the researchers have fully combined the strengths of both: the endurance and strength of robots, and the versatility and perception of humans. If the robot steps on debris and starts to lose its balance, the operator senses the same instability and reacts instinctively to avoid falling. That physical reaction is then captured and sent back to the robot, which helps avoid the fall. Through this human-robot interaction, the robot can utilize the operator's innate motor skills and split-second reactions to stay on its feet.
Improvements over previous bionic robots
One particular limitation of existing robots is that they can't perform what we might call forceful manipulation: labor-intensive skills like cracking open a chunk of concrete or swinging an axe at a door. Most robots can perform only a few fine, precise maneuvers.
HERMES, a bionic robot from the MIT labs, can perform heavy-duty maneuvers. The robot weighs only 45 kilograms, but it's strong and powerful. It's about 90 percent the size of a normal human body, which is enough to allow it to maneuver naturally in a human environment.
Instead of conventional DC motors, the actuators that power HERMES' joints pair brushless DC motors with planetary gearboxes, so named because three "planetary" gears rotate around a "sun" gear; they generate a lot of torque for their weight. The robot's shoulders and hips are driven directly, while the knees and elbows are driven by metal rods attached to the actuators. This makes HERMES more resilient than other bionic robots, able to absorb mechanical shocks without its gears shattering.
The human-machine interface that controls HERMES also differs from traditional ones in that it relies on the operator's reflexes to improve the robot's stability; it is known as the Balance Feedback Interface, or BFI for short. The BFI took months and multiple iterations to develop, and the initial concept was a full-body virtual-reality suit, similar to the ones featured in Steven Spielberg's 2018 film "Ready Player One."
Specific experimental tests
With HERMES, the operator stands on a square platform about 90 centimeters on a side, and load cells measure the forces on the platform's surface, from which the system determines where the operator's feet are pushing down. A set of linkages is attached to the operator's limbs and waist, and rotary encoders accurately measure their displacements, down to the centimeter. The linkages are not only used for sensing; they also contain motors that apply forces and torques to the operator's torso. While the operator is strapped into the BFI, these rods can push and pull on the operator's body.
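The idea of recovering where the feet push down from platform load cells can be sketched as a simple center-of-pressure calculation. This is an illustrative assumption, not MIT's actual sensing code: the four-corner layout, function name, and variable names are all invented here.

```python
# Hypothetical sketch: estimating where the operator's feet push down
# from four load cells at the corners of a 90 cm square platform.
# Corner layout and names are invented for illustration.

def center_of_pressure(forces, side=0.9):
    """forces: downward forces (N) at corners (0,0), (side,0),
    (0,side), (side,side). Returns the (x, y) center of pressure
    in meters, as a force-weighted average of corner positions."""
    f00, f10, f01, f11 = forces
    total = f00 + f10 + f01 + f11
    if total <= 0:
        raise ValueError("no load on platform")
    x = side * (f10 + f11) / total
    y = side * (f01 + f11) / total
    return x, y

# An evenly distributed load lands in the middle of the platform.
print(center_of_pressure([100.0, 100.0, 100.0, 100.0]))
```

Shifting weight toward one corner moves the computed point toward that corner, which is all the balance controller needs to know about the operator's stance.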
The researchers prepared two separate computers to control HERMES and the BFI, each with its own control loop that constantly exchanged data. At the beginning of each loop, HERMES collects data on its own posture and compares it with data obtained from the BFI about the operator's posture. Based on the difference between the two, the robot adjusts its actuation program and immediately sends the new pose data to the BFI, which then performs a similar control loop to adjust the operator's pose. This is repeated 1,000 times per second.
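The pose-exchange cycle described above can be sketched roughly as follows. The proportional-correction rule, the gain, and the two-element pose are illustrative assumptions, not the lab's actual controller; the only figure taken from the article is the 1,000-cycles-per-second rate.

```python
# Rough sketch of one side of the 1 kHz bilateral loop (illustrative,
# not MIT's code): each cycle, the robot compares its pose with the
# operator's pose received from the BFI and nudges itself toward it.

DT = 0.001  # one control cycle per millisecond -> 1,000 cycles per second

def robot_loop_step(robot_pose, operator_pose, gain=0.1):
    """One control cycle: move the robot pose a fraction of the way
    toward the operator's reference pose. In the real system the
    updated pose is then sent back to the BFI for its own loop."""
    return [r + gain * (o - r) for r, o in zip(robot_pose, operator_pose)]

# Toy run: over repeated cycles the robot converges on the reference pose.
robot, operator = [0.0, 0.0], [1.0, 0.5]
for _ in range(100):
    robot = robot_loop_step(robot, operator)
```

The mirror-image loop on the BFI side would apply the same kind of correction to the operator's body through the motorized linkages.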
In order for both sides to operate at high speed, the information exchanged between them must be compressed. For example, instead of sending detailed data about the operator's posture, the BFI simply sends the position of the operator's center of gravity and the relative positions of his arms and legs. The computer controlling the robot then scales these measurements proportionally to the dimensions of the HERMES, which then reproduces that reference pose.
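The scaling step might look something like the sketch below. The data layout and function name are assumptions made for illustration; only the roughly 90 percent human-to-robot size ratio comes from the article.

```python
# Hedged sketch of the compressed message: instead of a full joint map,
# the BFI sends the operator's center of mass and the relative positions
# of the limbs, and the robot computer rescales them to HERMES'
# proportions. Field names here are illustrative assumptions.

SCALE = 0.9  # HERMES is about 90 percent of an average human's size

def scale_reference_pose(com, limb_offsets, scale=SCALE):
    """com: operator's center of mass (x, y, z) in meters;
    limb_offsets: dict mapping limb name -> offset from the center
    of mass. Returns the same quantities scaled to robot dimensions."""
    scaled_com = tuple(scale * c for c in com)
    scaled_limbs = {name: tuple(scale * v for v in off)
                    for name, off in limb_offsets.items()}
    return scaled_com, scaled_limbs

com, limbs = scale_reference_pose((0.0, 0.0, 1.0),
                                  {"left_hand": (0.4, 0.2, 0.3)})
```

Sending only a handful of scaled quantities per cycle keeps each message small enough for both loops to run at full speed.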
As with any other bi-directional teleoperation loop, the coupling between the BFI and the HERMES can lead to oscillations or instabilities, which are minimized by fine-tuning the scaling parameters of the mapping between the human body and the robot poses.
In the initial experiments, the researchers gave HERMES an early balancing algorithm to understand how human and robot behave together. In one test, a researcher struck HERMES' upper body with a rubber mallet. With each blow, the BFI delivered a corresponding jolt to the operator, who would instinctively shift sideways to regain balance, and the robot mirrored that reaction and kept its footing.
In another round of experiments, HERMES managed to swing an axe and split gypsum wallboard. It also put out a fire with a fire extinguisher under the supervision of the local fire department. Rescue robots require more than brute force, so HERMES also performed tasks that demand more dexterity, such as pouring water from a kettle into a cup.
In each case, as the operator wearing the BFI mimed the task, the researchers watched how faithfully the robot reproduced the same actions, noting which of the operator's reactions helped the robot perform better. For example, when HERMES split the gypsum wallboard, its torso recoiled backward. Almost simultaneously, the BFI applied a similar push to the operator, who instinctively leaned forward, and this in turn helped HERMES recover its posture.
Little HERMES to help
HERMES, however, is too big and cumbersome for some experiments. While it can perform practical tasks, moving it around is time-consuming and getting it to move takes a great deal of care, so the researchers gave HERMES a little brother.
Little HERMES is a smaller version of HERMES. Like its big brother, it uses customized high-torque actuators mounted close to the body rather than on the legs, which lets the legs swing more freely. For a more compact design, the number of axes of motion on each limb, its degrees of freedom in robotics terms, was reduced from six to three, and the two-toed feet of HERMES were replaced with simple rubber balls, each fitted with a three-axis force sensor.
Attaching the BFI to Little HERMES required adjustments. An adult human and the tiny robot differ greatly in size, so when the researchers tried to map the movements of one directly onto the other, such as mapping the position of the human knee to the robot's knee, the result was very jerky robot motion.
Little HERMES required a different mathematical model than HERMES, and in the new model the researchers added tracking parameters such as ground contact forces and the operator's center of gravity. This allows the new model to predict the movements the operator intends to make, and thus control Little HERMES to perform those movements.
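The intent-prediction idea can be illustrated with a toy classifier on the tracked quantities the article mentions: ground contact forces and the operator's weight distribution. The decision rule, threshold, and labels below are invented for illustration and are not the researchers' actual model.

```python
# Illustrative sketch (not the lab's model): infer the operator's
# intended motion from how vertical ground reaction force is split
# between the two feet. A weight shift onto one foot suggests a step
# with the other; no contact force suggests a jump.

def infer_intent(left_force, right_force, threshold=0.7):
    """left_force, right_force: vertical forces (N) under each foot.
    Returns a coarse intent label based on the force split."""
    total = left_force + right_force
    if total <= 0:
        return "airborne"      # no ground contact: operator jumped
    if left_force / total > threshold:
        return "step_right"    # weight on the left foot -> swing right leg
    if right_force / total > threshold:
        return "step_left"
    return "stand"

print(infer_intent(600.0, 100.0))  # weight mostly on the left foot
```

A real model would predict continuous trajectories rather than discrete labels, but the principle is the same: drive the robot from the operator's inferred intent instead of a joint-by-joint copy.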
In one experiment, the operator walked slowly, step by step, before picking up speed, and Little HERMES could be seen walking in the same way. When the operator jumped, so did Little HERMES. It is still early days, though, and Little HERMES cannot yet stand or walk around on its own.
The researchers are also expanding its abilities, hoping to let it stroll around the lab and even go outdoors, as the lab's other pair of siblings, Cheetah and Mini Cheetah, already do.
Next research goals
A number of challenges need to be addressed next. One is operator fatigue after prolonged use of the BFI or highly focused tasks. Experiments have shown that the brain fatigues rapidly when the operator is directing not only his own body but also the machine. This is especially noticeable in tasks that require fine manipulation, where the operator has to take a break after three consecutive repetitions of the experiment.
The current solution is to make the operator and an autonomous controller share responsibility for stabilizing the robot's movements. If HERMES is performing a task that demands more of the operator's attention, the operator does not have to help keep the robot balanced; the autonomous controller can take over balance control. One way to recognize such situations is to track the operator's gaze: a fixed gaze indicates a high level of concentration, in which case the autonomous balance mode is activated.
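A gaze-based mode switch could be sketched as below. The fixation metric, radius threshold, and mode names are assumptions invented for illustration; the article only says that gaze tracking signals when to hand balance to the autonomous controller.

```python
# Hypothetical sketch of the shared-control switch: if recent gaze
# points stay within a small radius, assume the operator is deeply
# focused on a manipulation task and let an autonomous controller
# handle balance. Threshold and names are illustrative assumptions.

def balance_mode(gaze_samples, fixation_radius=0.05):
    """gaze_samples: recent (x, y) gaze points on the display,
    in normalized screen coordinates. Returns which side should
    be responsible for keeping the robot balanced."""
    xs = [p[0] for p in gaze_samples]
    ys = [p[1] for p in gaze_samples]
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    spread = max(((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
                 for x, y in gaze_samples)
    return "autonomous" if spread < fixation_radius else "operator"

print(balance_mode([(0.50, 0.50), (0.51, 0.50), (0.50, 0.51)]))
```

A wandering gaze would flip the mode back to the operator, restoring full human control of balance.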
As with any teleoperation system, another challenge is transmission latency. When controlling a robot remotely, if there's a 1-second delay between the command being sent and the robot's response, it's still possible to operate it remotely, but if the delay becomes longer, it may not be possible to operate it smoothly. The current plan is to rely on new wireless technologies, such as 5G, to ensure low-latency and high-throughput transmissions.
Finally, the researchers also plan to merge the lab's technology from Cheetah, a quadrupedal robot, with that of HERMES, producing a fast-moving robot that can quickly enter a disaster site on four legs and then rear up into an upright posture, so that disaster responders can apply their experience, skills, and reflexes through the robot to carry out rescue missions.
Link to original article:
https://spectrum.ieee.org/robotics/humanoids/human-reflexes-help-mits-hermes-rescue-robot-keep-its-footing