A break in play during a junior league game.
Today, July 21st, the competition concluded with a thrilling finale. In this third and final installment of our recaps, we give you a taste of the action from the final day. If you missed them, you can find our first two recaps here: July 19th | July 20th.
The first stop for me this morning was the Standard Platform League, where Dr. Timothy Wiley and Tom Ellis from Team RedbackBots at RMIT University in Melbourne, Australia, showed me an interesting development of theirs: an augmented reality (AR) system designed to enhance the understanding and explainability of the on-field action.
The 2024 RedbackBots travel team (from left to right: Murray Owens, Sam Griffiths, Tom Ellis, Dr Timothy Wiley, Mark Field and Jasper Avice Demay). Photo courtesy of Dr Timothy Wiley.
“What our students proposed to the league at the end of last year’s tournament was to develop an augmented reality (AR) visualization of what the league calls the Team Communication Monitor,” explains Timothy, the team’s academic lead. “This is software displayed on a TV screen to the audience and the referees, showing where each robot thinks it is, information about the game state, and where the ball is. We decided to make this an AR system because we thought it would be much better to see that information superimposed on the pitch. What AR allows us to do is project all of this information live onto the field as the robots move.”
The team demonstrated the system to the league at the event and received very positive feedback. In fact, one team even found a bug in their software while trying out the AR system during a game, and Tom said he received a lot of ideas and suggestions for further development from other teams. This is one of the first (if not the first) AR systems to be trialed at the competition, and a first for the Standard Platform League. I was lucky enough to get a demo from Tom, and it definitely added a new level to the viewing experience. It will be very interesting to see how the system evolves.
Mark Field setting up the Meta Quest 3 for use with the augmented reality system. Photo courtesy of Dr Timothy Wiley.
From the main soccer area, I headed to the RoboCupJunior zone, where Executive Committee member Rui Baptista gave me a tour of the arena and introduced me to some of the teams that had used machine learning models to support their robots. RoboCupJunior is a student competition divided into three leagues: Soccer, Rescue, and OnStage.
I first met four teams competing in the Rescue league. Their robots identify “victims” in simulated disaster scenarios, with challenges ranging from following a line on a flat surface to navigating obstacles on uneven terrain. There are three different strands in the league: 1) Rescue Line, where the robots follow a black line to a victim; 2) Rescue Maze, where the robots must search a maze and identify victims; and 3) Rescue Simulation, which is a simulated version of the maze competition.
Team Skollska Knijgia, who took part in the Rescue Line event, used a YOLO v8 neural network to detect victims in the evacuation zone, training the network themselves on around 5,000 images. Team Overengineering2, who also competed in Rescue Line, likewise used YOLO v8, in their case for two elements of the system: a first model to detect victims and walls in the evacuation zone, and a second model used while following the line, allowing the robot to detect when the black line (used for most of the course) changes to a silver line, which marks the entrance to the evacuation zone.
Left: Team Skollska Knijgia. Right: Team Overengineering2.
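To give a flavour of the kind of detection pipeline these teams described, here is a minimal sketch of running a custom-trained YOLO v8 model on a camera frame with the ultralytics package. The weights file, class labels, and camera index are illustrative assumptions, not the teams' actual setup.

```python
# Minimal sketch: custom-trained YOLOv8 inference on one camera frame.
# Model path, class names and camera index are assumptions for illustration.
from ultralytics import YOLO
import cv2

model = YOLO("victims_yolov8n.pt")  # hypothetical weights trained on labelled victim images

cap = cv2.VideoCapture(0)           # robot's onboard camera (assumed index 0)
ret, frame = cap.read()
if ret:
    results = model(frame)          # run detection on a single frame
    for box in results[0].boxes:
        cls_name = model.names[int(box.cls)]     # e.g. "victim" or "wall" (assumed labels)
        conf = float(box.conf)
        x1, y1, x2, y2 = box.xyxy[0].tolist()    # bounding box corners in pixels
        print(f"{cls_name} ({conf:.2f}) at [{x1:.0f}, {y1:.0f}, {x2:.0f}, {y2:.0f}]")
cap.release()
```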
Team Tanorobo! took part in the maze competition. They also trained a machine learning model for victim detection, using 3,000 photos of each type of victim (represented by different letters placed in the maze), plus photos of walls and obstacles to avoid misclassification. Team New Age competed in the simulation competition. They used a graphical user interface to train their machine learning model and to debug their navigation algorithm. They have three navigation algorithms of varying computational cost, which the robot switches between depending on its location in (and the complexity of) the maze.
Left: Team Tanorobo! Right: Team New Age.
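As a purely hypothetical illustration of that last idea, switching between planners of different cost might look something like the sketch below; the strategy names and the complexity heuristic are assumptions for illustration, not Team New Age's actual implementation.

```python
# Hypothetical sketch: pick a cheaper or more expensive navigation strategy
# depending on how complex the robot's current surroundings are.
def wall_follow(cell, maze):
    ...  # cheap: hug a wall through corridors

def flood_fill(cell, maze):
    ...  # moderate: distance map towards the target

def full_search(cell, maze):
    ...  # expensive: exhaustive search of unexplored branches

def pick_planner(cell, maze):
    """Choose the cheapest planner that can handle the local complexity."""
    open_neighbours = maze.count_open_neighbours(cell)  # assumed helper
    if open_neighbours <= 1:     # corridor or dead end
        return wall_follow
    elif open_neighbours == 2:   # simple junction
        return flood_fill
    else:                        # open area with many branches
        return full_search
```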
Next, I met two teams who presented at the OnStage event. Team Medic Bot's performance was based on a medical scenario, and the team included two machine learning elements: speech recognition to communicate with the “patient” robot, and image recognition to classify X-rays. Team Jam Session's robot reads American Sign Language signs and uses them to play the piano. They used MediaPipe hand-landmark detection to find key points on the hand, and a random forest classifier to determine which sign was being shown.
Left: Team Medic Bot. Right: Team Jam Session.
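A minimal sketch of the kind of pipeline Team Jam Session described, landmarks from MediaPipe fed into a scikit-learn random forest, might look as follows; the feature layout, label set, and training data are assumptions for illustration rather than the team's actual code.

```python
# Sketch: MediaPipe hand landmarks as features for a random forest sign classifier.
import cv2
import mediapipe as mp
import numpy as np
from sklearn.ensemble import RandomForestClassifier

hands = mp.solutions.hands.Hands(static_image_mode=True, max_num_hands=1)

def hand_features(image_bgr):
    """Return a flat vector of the 21 (x, y, z) hand landmarks, or None if no hand is found."""
    results = hands.process(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB))
    if not results.multi_hand_landmarks:
        return None
    landmarks = results.multi_hand_landmarks[0].landmark
    return np.array([[p.x, p.y, p.z] for p in landmarks]).flatten()

# Train on labelled example images of each sign (hypothetical data X_train, y_train,
# where y_train holds the corresponding letters).
clf = RandomForestClassifier(n_estimators=100)
# clf.fit(X_train, y_train)

# At performance time, each recognised letter could then be mapped to a piano note:
# letter = clf.predict([hand_features(frame)])[0]
```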
The next stop was the Humanoid League, where the final match was underway. The stadium was packed with a crowd eager to see the action.
Standing room only for the adult-size humanoid final.
The last final of the day was in the Middle Size League, where home team Tech United Eindhoven beat BigHeroX by a convincing scoreline of 6-1. The live stream of the final day's matches can be viewed here.
The grand finale saw the winners of the Middle Size League (Tech United Eindhoven) take on a team of five RoboCup directors. The humans won 5-2, their excellent passing and movement proving too much for Tech United.
AIhub is a non-profit organization dedicated to connecting the AI community to the public by providing free, high-quality information about AI.
Lucy Smith is the editor-in-chief of AIhub.