MIT News - Machine Learning, September 25
A new generation of robots learns skills more easily, and can be taught in multiple ways

Programming a robot has traditionally required specialized expertise, but MIT engineers are developing a new robot-teaching tool that lets almost anyone teach a robot new skills through "learning from demonstration." The tool, called a "versatile demonstration interface," combines teleoperation, physical guidance, and imitation of a human demonstration in one device, letting users choose whichever teaching style best suits the task and themselves. Tests show the interface significantly improves teaching flexibility and could broaden robots' applications in manufacturing, the home, and caregiving, making robots smarter, more effective human assistants.

🤖 **Diverse teaching modes:** The new "versatile demonstration interface" breaks the constraints of traditional programming by integrating three teaching methods: teleoperation, physical guidance, and imitation of a demonstration. Robots can pick up new skills through more natural, intuitive "learning from demonstration," lowering the barrier to teaching them.

💡 **Greater flexibility and applicability:** The interface attaches to many common collaborative robotic arms, offering great flexibility across tasks and user preferences. Teleoperation suits hazardous-material handling, physical guidance allows precise adjustment of the robot's posture, and direct demonstration is better for delicate operations, broadening the range of skills a robot can learn.

📈 **Enabling broader applications:** By making teaching more convenient, the technology could help robots spread beyond manufacturing into settings such as home services and caregiving, turning robots into smarter, more capable human assistants that raise productivity and quality of life.

🤝 **Stronger human-robot collaboration:** The research aims to create highly intelligent, highly skilled robot "teammates" that can work effectively with humans on complex jobs. Flexible demonstration tools are seen as key to that goal, helping robots integrate far more widely into human work and daily life.

Teaching a robot new skills used to require coding expertise. But a new generation of robots could potentially learn from just about anyone.

Engineers are designing robotic helpers that can “learn from demonstration.” This more natural training strategy enables a person to lead a robot through a task, typically in one of three ways: via remote control, such as operating a joystick to remotely maneuver a robot; by physically moving the robot through the motions; or by performing the task themselves while the robot watches and mimics.
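
To make the taxonomy concrete, here is a minimal Python sketch of the three demonstration modes as a data structure. The names (`DemoMode`, `Demonstration`) and sample shapes are illustrative assumptions, not code from the MIT project.

```python
# A minimal sketch of the three learning-from-demonstration (LfD) modes
# described above. All names here are hypothetical illustrations and
# not drawn from any published MIT codebase.
from dataclasses import dataclass, field
from enum import Enum, auto


class DemoMode(Enum):
    TELEOPERATION = auto()  # human drives the robot, e.g., with a joystick
    KINESTHETIC = auto()    # human physically moves the robot through the motions
    NATURAL = auto()        # human performs the task while the robot observes


@dataclass
class Demonstration:
    mode: DemoMode
    # Each sample: (timestamp in seconds, tool pose, applied force in newtons)
    samples: list[tuple] = field(default_factory=list)


demo = Demonstration(mode=DemoMode.KINESTHETIC)
demo.samples.append((0.0, (0.10, 0.02, 0.35), 1.2))
print(demo.mode.name, "demo with", len(demo.samples), "sample(s)")
```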

Learning-by-doing robots usually train in just one of these three demonstration approaches. But MIT engineers have now developed a three-in-one training interface that allows a robot to learn a task through any of the three training methods. The interface is in the form of a handheld, sensor-equipped tool that can attach to many common collaborative robotic arms. A person can use the attachment to teach a robot to carry out a task by remotely controlling the robot, physically manipulating it, or demonstrating the task themselves — whichever style they prefer or best suits the task at hand.

The MIT team tested the new tool, which they call a “versatile demonstration interface,” on a standard collaborative robotic arm. Volunteers with manufacturing expertise used the interface to perform two manual tasks that are commonly carried out on factory floors.

The researchers say the new interface offers increased training flexibility that could expand the type of users and “teachers” who interact with robots. It may also enable robots to learn a wider set of skills. For instance, a person could remotely train a robot to handle toxic substances, while further down the production line another person could physically move the robot through the motions of boxing up a product, and at the end of the line, someone else could use the attachment to draw a company logo as the robot watches and learns to do the same.

“We are trying to create highly intelligent and skilled teammates that can effectively work with humans to get complex work done,” says Mike Hagenow, a postdoc at MIT in the Department of Aeronautics and Astronautics. “We believe flexible demonstration tools can help far beyond the manufacturing floor, in other domains where we hope to see increased robot adoption, such as home or caregiving settings.”

Hagenow will present a paper detailing the new interface at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) in October. The paper’s MIT co-authors are Dimosthenis Kontogiorgos, a postdoc at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL); Yanwei Wang PhD ’25, who recently earned a doctorate in electrical engineering and computer science; and Julie Shah, MIT professor and head of the Department of Aeronautics and Astronautics.

Training together

Shah’s group at MIT designs robots that can work alongside humans in the workplace, in hospitals, and at home. A main focus of her research is developing systems that enable people to teach robots new tasks or skills “on the job,” as it were. Such systems would, for instance, help a factory floor worker quickly and naturally adjust a robot’s maneuvers to improve its task in the moment, rather than pausing to reprogram the robot’s software from scratch — a skill that a worker may not necessarily have.

The team’s new work builds on an emerging strategy in robot learning called “learning from demonstration,” or LfD, in which robots are designed to be trained in more natural, intuitive ways. In surveying the LfD literature, Hagenow and Shah found that the training methods developed so far fall generally into three main categories: teleoperation, kinesthetic training, and natural teaching.

One training method may work better than the other two for a particular person or task. Shah and Hagenow wondered whether they could design a tool that combines all three methods to enable a robot to learn more tasks from more people.

“If we could bring together these three different ways someone might want to interact with a robot, it may bring benefits for different tasks and different people,” Hagenow says.

Tasks at hand

With that goal in mind, the team engineered a new versatile demonstration interface (VDI). The interface is a handheld attachment that can fit onto a typical collaborative robotic arm. The attachment is equipped with a camera and markers that track the tool’s position and movements over time, along with force sensors that measure the pressure applied during a given task.
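
As a rough illustration of what such a tool might log, here is a hypothetical record for one sensor reading, combining the marker-based pose tracking and force sensing the article mentions. Field names, types, and units are assumptions, not the actual VDI data format.

```python
# A hypothetical record for one VDI sensor reading: marker-based pose
# tracking plus force measurement. Field names and units are
# illustrative assumptions, not the actual VDI data format.
from dataclasses import dataclass


@dataclass
class VDISample:
    t: float                                        # seconds since demo start
    position: tuple[float, float, float]            # tool position in meters
    orientation: tuple[float, float, float, float]  # unit quaternion (w, x, y, z)
    force: tuple[float, float, float]               # applied force in newtons


sample = VDISample(
    t=0.0,
    position=(0.10, 0.02, 0.35),
    orientation=(1.0, 0.0, 0.0, 0.0),
    force=(0.0, 0.0, 2.5),
)
print(sample)
```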

When the interface is attached to a robot, the entire robot can be controlled remotely, and the interface’s camera records the robot’s movements, which the robot can use as training data to learn the task on its own. Similarly, a person can physically move the robot through a task with the interface attached. The VDI can also be detached and held by a person to perform the desired task directly; the camera records the VDI’s motions, which the robot can then use to mimic the task once the VDI is reattached.
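
One way to picture the "three-in-one" idea is that however a demonstration is captured, it reduces to the same trajectory format for the learner. The sketch below assumes dictionary-shaped samples and is purely illustrative, not the authors' pipeline.

```python
# A sketch of the "three-in-one" idea: however a demonstration was
# captured (teleoperation, kinesthetic guidance, or the detached tool),
# it reduces to one common trajectory format that a single learner can
# consume. All names and sample shapes are illustrative assumptions.

def to_trajectory(samples):
    """Keep only what every mode shares: time, tool pose, applied force."""
    return [(s["t"], s["pose"], s["force"]) for s in samples]


def pool_demonstrations(demos_by_mode):
    """Merge demonstrations from all three teaching modes into one dataset."""
    dataset = []
    for samples in demos_by_mode.values():
        dataset.extend(to_trajectory(samples))
    # A real system would fit a policy to `dataset`; here we just pool it.
    return dataset


demos = {
    "teleoperation": [{"t": 0.0, "pose": (0.10, 0.00, 0.30), "force": 0.0}],
    "kinesthetic":   [{"t": 0.0, "pose": (0.10, 0.00, 0.30), "force": 1.5}],
    "natural":       [{"t": 0.0, "pose": (0.10, 0.00, 0.30), "force": 2.0}],
}
print(len(pool_demonstrations(demos)), "samples pooled for learning")
```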

To test the attachment’s usability, the team brought the interface, along with a collaborative robotic arm, to a local innovation center where manufacturing experts learn about and test technology that can improve factory-floor processes. The researchers set up an experiment where they asked volunteers at the center to use the robot and all three of the interface’s training methods to complete two common manufacturing tasks: press-fitting and molding. In press-fitting, the user trained the robot to press and fit pegs into holes, similar to many fastening tasks. For molding, a volunteer trained the robot to push and roll a rubbery, dough-like substance evenly around the surface of a center rod, similar to some thermomolding tasks.

For each of the two tasks, the volunteers were asked to use each of the three training methods, first teleoperating the robot using a joystick, then kinesthetically manipulating the robot, and finally, detaching the robot’s attachment and using it to “naturally” perform the task as the robot recorded the attachment’s force and movements.

The researchers found the volunteers generally preferred the natural method over teleoperation and kinesthetic training. The users, who were all experts in manufacturing, did offer scenarios in which each method might have advantages over the others. Teleoperation, for instance, may be preferable in training a robot to handle hazardous or toxic substances. Kinesthetic training could help workers adjust the positioning of a robot that is tasked with moving heavy packages. And natural teaching could be beneficial in demonstrating tasks that involve delicate and precise maneuvers.
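
The volunteers' scenarios amount to a simple mode-selection heuristic. A toy version, with made-up task flags, might look like this:

```python
# A toy heuristic mirroring the scenarios the volunteers described.
# The flags and the mapping are purely illustrative assumptions.

def suggest_mode(hazardous: bool, heavy: bool, delicate: bool) -> str:
    if hazardous:
        return "teleoperation"  # keep the human away from toxic material
    if heavy:
        return "kinesthetic"    # reposition the robot for heavy handling
    if delicate:
        return "natural"        # show fine maneuvers with the detached tool
    return "natural"            # the mode volunteers generally preferred


print(suggest_mode(hazardous=False, heavy=False, delicate=True))  # -> natural
```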

“We imagine using our demonstration interface in flexible manufacturing environments where one robot might assist across a range of tasks that benefit from specific types of demonstrations,” says Hagenow, who plans to refine the attachment’s design based on user feedback and will use the new design to test robot learning. “We view this study as demonstrating how greater flexibility in collaborative robots can be achieved through interfaces that expand the ways that end-users interact with robots during teaching.”

This work was supported, in part, by the MIT Postdoctoral Fellowship Program for Engineering Excellence and the Wallenberg Foundation Postdoctoral Research Fellowship.
