This year, RoboCup will be held from June 22nd to June 28th as a fully remote event, with competitions and activities taking place at various locations worldwide.
One of the latest additions to RoboCup is the RoboCup@Work league, which focuses on the application of robots in work-related scenarios. Drawing inspiration from other RoboCup competitions, RoboCup@Work addresses key research challenges in industrial and service robotics.
We caught up with Asad Norouzi, a member of the executive committee, to discuss the league, its competition format, and the adaptations made for its virtual execution.
Could you provide an overview of the league and what participants can expect?
Certainly. RoboCup@Work falls under the umbrella of RoboCup Industrial, which comprises two leagues: RoboCup@Work and Logistics. The primary goal of RoboCup@Work is to establish a standard for robots operating in industrial environments.
During the competition, participants face various benchmarking tests featuring autonomous mobile robots navigating through simulated industrial settings, maneuvering around obstacles, and transporting industry-relevant items like screws and nuts.
Competitors engage in tasks encompassing autonomous navigation, object detection, and manipulation, with some teams delving into advanced learning techniques.
At the start of each task, a central agent known as the Referee Box assigns objectives to the robot, which must then plan an efficient route and order of actions to maximize its score.
How are points allocated in the competition?
Points are awarded based on the successful completion of assigned tasks. For instance, a task may involve picking up a specified object from Station A and delivering it to Station C. Upon reaching Station A, the robot earns navigation points.
Upon correctly identifying and retrieving the designated object, the team receives detection points. Proper handling and placement of the object onto the robot’s tray garner additional points in the object manipulation category.
Navigating to the destination station and placing the object accurately, without it falling off, earns placement points.
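To make the stage-by-stage scoring concrete, here is a minimal sketch in Python; the stage names and point values are illustrative assumptions, not the official figures from the rule book.

```python
# Hypothetical point values for illustration only; the official values are in the rule book.
STAGE_POINTS = {
    "navigate_to_source": 50,       # robot reaches Station A
    "detect_object": 50,            # correct object identified
    "grasp_and_stow": 75,           # object picked and placed on the robot's tray
    "navigate_to_destination": 50,  # robot reaches Station C
    "place_object": 75,             # object placed without falling off
}

def score_run(completed_stages):
    """Sum the points for the stages a robot completed during one run."""
    return sum(STAGE_POINTS[stage] for stage in completed_stages if stage in STAGE_POINTS)

# Example: a robot navigates, detects and grasps the object, reaches the destination,
# but drops the object during placement.
print(score_run(["navigate_to_source", "detect_object", "grasp_and_stow",
                 "navigate_to_destination"]))  # -> 225
```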
Additionally, there are more complex tasks available for teams seeking additional challenges.
What are the details of these more complex tasks?
Among the advanced challenges is precision placement, where the robot must accurately position an object onto a table with cavities matching its shape. The object must fit precisely into the designated hole; even a slight deviation can cause it to fail.
Another task is the rotating table challenge, where the robot must retrieve an object from a table rotating at varying speeds, undisclosed until the competition. Teams must adapt on the fly, aligning the robot, locating the object, and devising a strategy to pick it up amidst the rotation.
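One simplified way to think about the timing problem is sketched below: estimate the table's angular velocity from two timed sightings of the object, then predict when it will next pass the gripper. This is an illustrative idealization, not any particular team's method.

```python
import math

def estimate_angular_velocity(angle1, t1, angle2, t2):
    """Estimate the table's angular velocity (rad/s) from two timed observations
    of the object's angle around the table centre."""
    dtheta = (angle2 - angle1) % (2 * math.pi)
    return dtheta / (t2 - t1)

def next_pass_time(current_angle, grasp_angle, omega, now):
    """Predict the next time the object will reach the gripper's angular position."""
    remaining = (grasp_angle - current_angle) % (2 * math.pi)
    return now + remaining / omega

# Example: the object is seen at 0 rad at t = 0 s and at 1.0 rad at t = 0.5 s
# (omega = 2 rad/s); the gripper waits at pi rad, so the object is predicted
# to arrive at roughly t = 1.57 s.
omega = estimate_angular_velocity(0.0, 0.0, 1.0, 0.5)
print(next_pass_time(1.0, math.pi, omega, 0.5))
```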
Certain stations pose heightened difficulty, such as shelves with multiple levels. Typically, teams use a camera mounted on the robotic arm to detect objects, but this approach presents challenges for lower shelves due to clearance issues. Robots must employ alternative perspectives and techniques for object detection and retrieval, avoiding penalties for collisions within the arena.
Can you elaborate on the obstacles encountered on the course?
The obstacles present unique challenges. Simple obstacles, like boxes placed randomly during the competition, force teams to adapt their strategies on the fly.
An intriguing obstacle is tape on the ground, mimicking restricted zones in industrial settings. Robots must recognize the tape as a barrier, using their cameras to identify and navigate around it—a task that simulates real-world challenges in industrial environments.
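As a rough illustration of the kind of perception this involves, the sketch below uses OpenCV colour thresholding to flag an assumed yellow-tape region in a camera image; actual team pipelines are more sophisticated, and the colour bounds and decision threshold here are assumptions.

```python
import cv2
import numpy as np

def find_tape_mask(bgr_image):
    """Return a binary mask of pixels in an assumed yellow-tape colour range.
    The HSV bounds below are illustrative and would need tuning for real lighting."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    lower = np.array([20, 100, 100])   # assumed lower HSV bound for yellow tape
    upper = np.array([35, 255, 255])   # assumed upper HSV bound
    mask = cv2.inRange(hsv, lower, upper)
    # Remove small speckles so only continuous tape strips remain.
    kernel = np.ones((5, 5), np.uint8)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

def tape_ahead(mask, roi_rows=slice(-120, None)):
    """Treat the bottom rows of the image as 'directly ahead' and report whether
    enough tape pixels are present there to warrant planning around the area."""
    region = mask[roi_rows, :]
    return region.mean() > 10  # threshold chosen arbitrarily for illustration
```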
Have you made any changes to the tasks as the competition has evolved?
Regarding object detection, we’ve raised the bar since last year. Previously, all stations had standard surfaces. However, we’ve introduced arbitrary surfaces, such as grass, paper, or black tape, to challenge object detection further. These variations can confuse the robot’s image processing system, prompting teams to rely more on deep learning techniques to tackle this issue.
Previously, when directed to Station A, robots were informed of its height (e.g., ground level, 5cm, 10cm, or 15cm). Now, we’ve eliminated that information. The Referee Box instructs the robot to proceed to Station A without specifying its height. Consequently, robots must employ sensors to ascertain the height of the station autonomously.
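As an example of how a robot might infer the station height on its own, the sketch below takes depth-sensor points on the station's top surface, expressed in a frame whose z = 0 plane is the floor (an assumption for this sketch), and snaps the estimate to the set of heights mentioned above.

```python
import numpy as np

KNOWN_HEIGHTS_CM = [0, 5, 10, 15]  # the discrete station heights mentioned above

def estimate_station_height(points_xyz_m):
    """Estimate a station's height from 3D points on its top surface.

    `points_xyz_m` is an (N, 3) array in metres, in a frame whose z = 0 plane
    is the floor. The result is snapped to the nearest allowed height."""
    top_surface_z_cm = np.median(points_xyz_m[:, 2]) * 100.0  # median is robust to outliers
    return min(KNOWN_HEIGHTS_CM, key=lambda h: abs(h - top_surface_z_cm))

# Example: noisy points around 9.7 cm are classified as a 10 cm station.
pts = np.column_stack([np.random.rand(200), np.random.rand(200),
                       np.random.normal(0.097, 0.003, 200)])
print(estimate_station_height(pts))  # -> 10
```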
Do the teams all use the same basic hardware for their robots?
At the inception of RoboCup@Work, a sponsor supplied the robotic base and arm. However, production of these robots ceased approximately five or six years ago. Consequently, most teams are now transitioning away from these robots, with many opting to manufacture their own mobile bases or procure them from alternative vendors. This shift has resulted in increasing divergence among teams, as they no longer rely on identical robots as they did in previous years.
Team Flexibility in Robot Selection
Teams are granted considerable freedom in choosing their robots, as long as they adhere to certain size constraints outlined in the rule book to ensure compatibility with the arena. Beyond these parameters, teams are encouraged to exercise creativity in their robot selections.
Adapting to the Virtual Environment
This year marks the second time the competition has been run virtually, following the precedent set during the 2020 RoboCup Asia Pacific event. This time, however, with larger global participation, adjustments have been made to make the competition more competitive.
Unlike other leagues within RoboCup, we are among the few that will conduct the competition at full scale. While the format remains akin to physical competitions, teams will use their own laboratory spaces as arenas instead of a shared physical space. To ensure fairness, the technical committee deliberated extensively on how to standardize the competition experience: we gathered information from participating teams regarding their university facilities, field dimensions, and object and station availability, and then designed a uniform arena that every team can set up.
Virtual Competition Operation
The competition will be conducted virtually, mirroring the flow of a physical event. Teams will be assigned tasks just before their scheduled turn, simulating the real-time task allocation process in a physical arena. Each team possesses a local Referee Box, enabling them to commence their tasks promptly upon receiving instructions. This setup ensures that teams do not have prior knowledge of the field layout or the specific tasks generated by the Referee Box, fostering a fair and autonomous competition environment.
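To illustrate the idea of a locally run Referee Box issuing tasks only at the start of a run, here is a highly simplified sketch; the real Referee Box has its own message format and network protocol, so the station names, object names, and JSON structure below are purely hypothetical.

```python
import json
import random
import time

STATIONS = ["WS01", "WS02", "WS03", "SH01"]           # hypothetical station names
OBJECTS = ["M20_bolt", "M20_nut", "bearing", "axis"]  # hypothetical object names

def _random_transport(rng):
    """Pick an object, a source station, and a different destination station."""
    source = rng.choice(STATIONS)
    destination = rng.choice([s for s in STATIONS if s != source])
    return {"object": rng.choice(OBJECTS), "source": source, "destination": destination}

def generate_task(num_transports=3, seed=None):
    """Randomly generate a transportation task the moment the run starts,
    so teams cannot know it in advance."""
    rng = random.Random(seed)
    task = {
        "issued_at": time.time(),
        "transports": [_random_transport(rng) for _ in range(num_transports)],
    }
    return json.dumps(task)

# In the virtual format, each team's local Referee Box would emit something like
# this to the robot at the scheduled start of the run.
print(generate_task(seed=42))
```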
Collision Detection and Time Zone Challenges
Ensuring accurate collision detection poses a significant challenge, especially in a virtual setting where referees cannot physically observe the arena from all angles. To address this, two methods are employed. Firstly, teams are required to set up cameras so that collisions can be spotted on the video feed. Additionally, teams must provide logs detailing their robot's movements, which aid in collision assessment.
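As a rough illustration of how movement logs could support that assessment, the sketch below checks logged robot poses against arena wall segments; the log format, wall geometry, and circular-footprint assumption are hypothetical, not the league's actual tooling.

```python
import math

def min_distance_to_wall(x, y, wall):
    """Distance from point (x, y) to a wall segment ((x1, y1), (x2, y2))."""
    (x1, y1), (x2, y2) = wall
    dx, dy = x2 - x1, y2 - y1
    t = max(0.0, min(1.0, ((x - x1) * dx + (y - y1) * dy) / (dx * dx + dy * dy)))
    px, py = x1 + t * dx, y1 + t * dy
    return math.hypot(x - px, y - py)

def flag_possible_collisions(pose_log, walls, robot_radius=0.35):
    """Return timestamps at which the logged robot centre came closer to a wall
    than the robot's (assumed circular) footprint allows."""
    return [t for (t, x, y) in pose_log
            if any(min_distance_to_wall(x, y, w) < robot_radius for w in walls)]

# Example with a single wall along x = 0 and one suspicious log entry.
walls = [((0.0, 0.0), (0.0, 5.0))]
log = [(0.0, 1.0, 1.0), (1.0, 0.3, 2.0), (2.0, 1.2, 3.0)]
print(flag_possible_collisions(log, walls))  # -> [1.0]
```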
Another logistical hurdle is accommodating teams across different time zones. Efforts have been made to identify a suitable time slot that accommodates all participating teams.
Competition Scheduling
Teams are allocated specific time slots for their competition runs. While all teams need to be prepared during the designated time slot, they take turns to compete sequentially. For example, Team A completes their run first, followed by Team B, and so forth, ensuring an organized and orderly competition schedule.
Task Preparation and Timing
When teams receive tasks from the Referee Box, are they responsible for setting up obstacles?
Yes, teams receive obstacle placement instructions five minutes prior to beginning their tasks.
Duration of Events
Will events be held throughout the entire week (22-28 June)?
Yes, events are scheduled across the competition week, featuring various tests. Difficulty levels escalate as the week progresses.
Scoring System and Participation
How are points awarded, and can all teams participate in every challenge?
Teams accrue points for each benchmarking test, with all teams eligible to partake in every challenge. At the conclusion of the week, teams with the highest points are ranked first, second, and third.
Live Streaming and Event Duration
Will there be live video streams, and what is the duration of each run?
Live video streams will be available for participants and spectators. The duration of initial tests is approximately five minutes per run, while more advanced tests span 10 minutes each. The final test on the last day permits a maximum of 20 minutes.
How many teams will be participating this year and where are they from?
This year we have 10 teams. These are:
AutonOHM, Nuremberg Institute of Technology, Germany
b-it-bots, Bonn-Rhein-Sieg University of Applied Sciences, Germany
Democritus Industrial Robotics (DIR), DIR Research Group, Greece
luhbots, Leibniz University of Hanover, Germany
MRL@Work, Qazvin Azad University, Iran
Robo-Erectus@Work, Singapore Polytechnic, Singapore
RoboHub Eindhoven, Fontys University of Applied Sciences, Netherlands
robOTTO, Otto von Guericke University Magdeburg, Germany
SWOT, University of Applied Sciences Würzburg-Schweinfurt, Germany
Team Tyrolics, University of Innsbruck, Austria
Each team consists of roughly 5-10 people.
Performance Enhancements Over Time
What kinds of performance improvements have you witnessed since the league’s inception?
Over the years, we’ve observed substantial advancements in performance, particularly concerning object detection. Approximately two years ago, a majority of teams transitioned from traditional image processing techniques to deep learning methods. Some teams have even openly shared their data, enabling others to train their models, resulting in significant enhancements in object detection accuracy. Despite the increased difficulty we’ve introduced, teams have adeptly adapted due to these technological strides.
Challenges in System Engineering
Another challenge lies in providing computational resources for deep learning within the constraints of limited battery power and space on the mobile base. This presents a system engineering hurdle, necessitating innovative solutions. Fortunately, significant progress has been made in addressing this issue.
Advancements in Autonomous Navigation
We’ve also witnessed remarkable improvements in autonomous navigation. While most teams utilize open-source packages, they extensively fine-tune them to suit their needs. Moreover, a few teams have developed their own navigation packages, underscoring the ongoing research and innovation in this domain.
In summary, there have been numerous noteworthy advancements since the league’s inception.