Mike De Decker

Mike is currently studying for an applied informatics degree. He enjoys assembling pieces of code to create new functional implementations, testing them, and debugging problems. Mike's bachelor thesis on an AI jump rope judge allowed him to combine his passions for programming and jump rope, and it strengthened his interest in the field of computer vision.


Lightning Talk at TNC25 | AI Judge Assistant for Recognition of Jump Rope Skills in Videos

Judging jump rope freestyle routines at the highest competitive level has become increasingly challenging due to the evolution of the sport. Both the number of skills included in a routine and the speed at which they are executed keep increasing. This is particularly evident in so-called Double Dutch Freestyle routines, which is why these freestyles are scored through a combination of live and delayed evaluation. The creativity of a routine (including its variation and musicality) is scored in real time, but the appropriate difficulty level is assigned based on a recording of the routine replayed at half speed immediately after it is performed. Even though this helps reduce errors in difficulty scoring, a certain variability in the assigned scores can still be observed.

To make scoring in gymnastics more objective and robust, Fujitsu has been collaborating with the International Gymnastics Federation since 2017 to develop a Judging Support System (JSS). It was first deployed at the 2019 Artistic Gymnastics World Championships, a first in the field. Since then, even more accessible AI tools, better computational resources, and pre-trained models have emerged. Inspired by this example and others, such as sign-language recognition or the NextJumps speed counter (2023), which outperforms judges in counting speed steps, this study sets out to explore the creation of an AI jump rope assistant capable of recognising skills from video recordings, in contrast to the sensor input the JSS relies on.

Mike's participation in the Future Talent Programme 2025 was supported by BELNET.
