AI Robots Trained to Make Decisions and Adjust Movement Trajectories in Real-Time

A team of researchers from the School of Engineering of Far Eastern Federal University (FEFU), the Institute of Marine Technology Issues, and the Institute of Automation and Management Processes of the Far Eastern Branch of the Russian Academy of Sciences has developed software that enables industrial AI robots with technical vision to plan and modify the movement trajectories of their tools in real time without falling below a specified precision level.

The team’s report was recognized as the best at the ICCAD'19 conference in Grenoble, France, held from July 2nd to 4th.

The FEFU team created and applied a new approach to the control of smart industrial robots, based on the management of program signals. Under this approach, robots can set and adjust the trajectories and speeds of their tools on their own while processing workpieces under uncertain conditions and in a changing working environment.
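As a rough illustration only (the FEFU software itself has not been published), a supervisory control loop of this kind might resemble the hypothetical Python sketch below. The sensor reading, the correction rule, and all speed limits and thresholds are assumed placeholders, not the team's actual interfaces or algorithms.

```python
import time

# Hypothetical sketch of a supervisory control loop that adjusts a tool's
# feed rate and path in real time. All names and numeric values here are
# illustrative assumptions, not the FEFU team's actual software.

NOMINAL_SPEED_MM_S = 20.0   # planned tool speed
MIN_SPEED_MM_S = 2.0        # slowest allowed feed rate
MAX_DEVIATION_MM = 0.5      # precision target cited in the article


def read_tracking_error_mm():
    """Placeholder: current deviation of the tool from its programmed path,
    e.g. reported by encoders or a vision system."""
    return 0.1  # stub value for illustration


def correction_vector(error_mm):
    """Placeholder: a small corrective offset applied to the next waypoint."""
    return -error_mm * 0.5  # simple proportional correction


def supervisory_step(current_speed):
    """One cycle: slow down when the error grows, recover speed when it falls."""
    error = read_tracking_error_mm()
    if abs(error) > MAX_DEVIATION_MM:
        # Error too large: reduce the feed rate and correct the path.
        new_speed = max(MIN_SPEED_MM_S, current_speed * 0.8)
    else:
        # Error acceptable: move back toward the nominal feed rate.
        new_speed = min(NOMINAL_SPEED_MM_S, current_speed * 1.1)
    return new_speed, correction_vector(error)


if __name__ == "__main__":
    speed = NOMINAL_SPEED_MM_S
    for _ in range(5):          # a few control cycles for demonstration
        speed, offset = supervisory_step(speed)
        print(f"speed={speed:.1f} mm/s, path offset={offset:+.3f} mm")
        time.sleep(0.01)        # real controllers run at fixed cycle times
```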

The new software enabled the team to achieve a precision of around 0.5 mm in the operation of robotic tools, including operations that require the application of extra force. However, many high-accuracy processes require precision in the 0.1-0.2 mm range.

The issue lies in the imprecise technology used to manufacture the robots themselves, and it hasn't been resolved anywhere in the world yet. We've already developed a method to eliminate this defect based on special test movements. It proved to be efficient in models, and right now we are working to implement it in practice.

Vladimir Filaretov, Professor and Head of Department for Automation and Management, School of Engineering, FEFU

Professor Filaretov adds, “If we obtain positive results, it would be a breakthrough in the practical application of robots in general. If not, we will continue to work until we have a positive result. Generally, this is a working method.”

Professor Filaretov is an Honored Science Worker of Russia, an Honored Inventor of Russia, and an Honored Engineer of Russia.

With a technical vision system, the machine builds a virtual image of its workspace, identifies each workpiece, and determines its exact position. The robot can also detect distortions in large workpieces that occur when they are clamped in place. Based on this virtual image, it plans the paths of its working tools.
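The underlying path-correction idea can be sketched, under stated assumptions, as follows: estimate how the real part deviates from its nominal placement, then apply the same transform to the programmed toolpath. The pose-estimation function, the toy toolpath, and every numeric value below are hypothetical and do not reflect the researchers' actual algorithms.

```python
import numpy as np

# Hypothetical sketch of vision-based toolpath correction: estimate how the
# clamped workpiece is shifted and rotated relative to its nominal placement,
# then map the programmed waypoints into the measured part frame.


def estimate_workpiece_pose():
    """Placeholder for the vision system: returns a rotation matrix and a
    translation (in mm) describing the part's deviation from nominal."""
    angle = np.deg2rad(1.5)                      # small rotation about Z
    rotation = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                         [np.sin(angle),  np.cos(angle), 0.0],
                         [0.0,            0.0,           1.0]])
    translation = np.array([0.8, -0.4, 0.0])     # small shift of the part
    return rotation, translation


def correct_toolpath(nominal_path, rotation, translation):
    """Transform every nominal waypoint into the actual (measured) frame."""
    return nominal_path @ rotation.T + translation


if __name__ == "__main__":
    # A toy nominal path: a straight cut along X at fixed Y and Z (in mm).
    nominal = np.array([[x, 10.0, 5.0] for x in np.linspace(0.0, 100.0, 5)])

    R, t = estimate_workpiece_pose()
    corrected = correct_toolpath(nominal, R, t)

    for before, after in zip(nominal, corrected):
        print(f"{before} -> {np.round(after, 3)}")
```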

It's important to emphasize that the methods, algorithms, and software we have developed are universal in nature. They can be used to control almost any type of robot: industrial robots, underwater vehicles, unmanned ground vehicles, flying vehicles, and many promising agricultural robots. Only minor adjustments, already built into the software, are needed to account for their specific features.

Vladimir Filaretov, Professor and Head of Department for Automation and Management, School of Engineering, FEFU

Professor Filaretov continued, “Our developments, including smart VR-based control, take full advantage of the capabilities of modern technologies and can increase the efficiency of technological processes severalfold while preserving product quality.”

The new smart control technique has already been implemented at the Dalpribor plant in Vladivostok and is currently being tested and refined in view of the latest industrial challenges. The most recent update of the technology was presented at the IEEE International Conference on Control, Automation and Diagnosis 2019 (ICCAD'19) in Grenoble, where it received special recognition.

Based on the outcome of the research, a group of five scientists under the supervision of Professor Filaretov applied for the Russian Federation Government Prize.
