By Mike Oitzman | October 30, 2024
Boston Dynamics Inc. released a new video of its Atlas humanoid robot today. The video shows the electric robot handling large automotive parts autonomously. According to the company, the robot uses machine learning to execute its tasks and 3D vision to perceive the world around it.
There are no prescribed or teleoperated movements; all motions are generated autonomously online, said Boston Dynamics. The Atlas humanoid is able to detect and react to changes in the environment using a combination of vision, force, and proprioceptive sensors, the company explained.
For example, it could detect moving fixtures and react to failures such as a missed cover insertion, tripping, and collisions with the environment (visible at 1:24 in the video).
In the video, Atlas demonstrates some unique motions, including turning its head through a range of motion beyond that of a human and walking backward with its hip joint rotated 180 degrees while the torso turns mid-motion to orient itself for the next operation.
This design gives the robot mobility and joint ranges that exceed those of the human body, and the video shows some of the ways Atlas can take advantage of that extra range of motion.
The electric model follows in the very big footsteps of its larger sibling, the hydraulic Atlas, which handled heavy automotive parts in an industrial setting.
Atlas humanoid grasps with three-fingered hand
In the video, you can also watch the Atlas humanoid use its three-fingered hand, which includes a rotating digit, to pull items off a shelf and then grip them for transfer. The gripper appears to be designed to handle large and heavy items.
It’s likely that the end effector deployed with the electric robot will depend on the types of work it does and the characteristics of the parts it handles. This release follows another video in which the Atlas humanoid performed various calisthenics.
Boston Dynamics also announced this week the first European deployment of its Stretch and Spot robots, at Otto Group.