
UC San Diego Computer Scientists Participate in International RoboCup Competition

A team of computer scientists from the University of California San Diego is taking part for the first time in the international RoboCup @ Home competition, which takes place July 27 to 31 in Nagoya, Japan.

Computer scientists are training the Toyota Human Support Robot for the RoboCup @ Home competition. The campus is taking part in the competition for the first time. (Credit: Photo courtesy of Toyota)

The UC San Diego team has spent the last three months refining and testing algorithms to train a Toyota Human Support Robot (or HSR) to complete two tasks: picking up and putting away groceries, and helping someone to carry groceries from their car to their home. The students are part of the research groups of computer science professors Laurel Riek and Henrik Christensen at the Contextual Robotics Institute at UC San Diego.

The competition aims to test a robot’s ability to perform everyday tasks, help people around the house and establish naturalistic human-robot communication and interaction. Teams are also tested on their ability to control robotic arms to manipulate objects and to use a wide range of sensors for voice interaction and image recognition.

“This is a great competition because it challenges researchers to think about the practicalities of deploying robots in real-world environments, like people’s homes,” said Riek, a professor at the Jacobs School of Engineering at UC San Diego. “In a very short time, our students have accomplished remarkable work, and we are really proud of them.”


The Tritons will face off against 10 teams in their category, including teams from the University of Texas at Austin and Northeastern University. All teams are using Toyota’s HSR, a robot designed to assist people with daily tasks. It is compact and lightweight, and equipped with an articulated arm and flexible hand, as well as a telescoping body. The robot can be programmed and trained to fetch and carry objects, recognize and interact with people, and perform household tasks.

The UC San Diego team is made up of computer science Ph.D. students Angelique Taylor, Darren Chan, Priyam Parashar and Ruffin White. Taylor and Chan are members of the Healthcare Robotics Lab, led by Riek, which focuses on human-robot teaming research in safety-critical environments such as hospitals, homes and factories. They design learning algorithms that enable robots to understand people, and are currently focused on applications in critical care and neurorehabilitation. Riek leads multiple projects, supported by the National Science Foundation, the Air Force Office of Scientific Research and industry, that focus on new computational methods for human-robot teaming.

Parashar and White are members of the Cognitive Robotics Lab, led by Christensen, which focuses on systems integration, human-robot interaction, mapping and robot vision. Christensen leads the Contextual Robotics Institute at UC San Diego, which aims to solve today’s pressing challenges in key areas including healthcare and autonomous vehicles. The research is sponsored by the National Science Foundation, the Army Research Laboratory and industry. The ultimate goal is to build robots that empower people in their daily lives.

