Paper accepted at CV4WS@WACV 2023
The paper "Detecting Arbitrary Keypoints on Limbs and Skis with Sparse Partly Correct Segmentation Masks" by Katja Ludwig, Daniel Kienzle, Julian Lorenz, and Rainer Lienhart has been accepted at the Computer Vision for Winter Sports workshop of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV) 2023. In this paper, the authors describe how arbitrary points on the limbs and skis of ski jumpers can be detected. The presented method only requires a dataset that contains a small number of segmentation masks, and these masks need to be only partly correct.
Abstract

Analyses based on the body posture are crucial for top-class athletes in many sports disciplines. If at all, coaches label only the most important keypoints, since manual annotations are very costly. This paper proposes a method to detect arbitrary keypoints on the limbs and skis of professional ski jumpers that requires a few, only partly correct segmentation masks during training. Our model is based on the Vision Transformer architecture with a special design for the input tokens to query for the desired keypoints. Since we use segmentation masks only to generate ground truth labels for the freely selectable keypoints, partly correct segmentation masks are sufficient for our training procedure. Hence, there is no need for costly hand-annotated segmentation masks. We analyze different training techniques for freely selected and standard keypoints, including pseudo labels, and show in our experiments that only a few partly correct segmentation masks are sufficient for learning to detect arbitrary keypoints on limbs and skis.
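The abstract states that segmentation masks are used only to generate ground truth labels for freely selectable keypoints. The sketch below illustrates one plausible way such a label could be derived; it is a simplification and not taken from the paper's code. It assumes that a point on a limb or ski is described by its relative position t along the line between two annotated joints and a relative thickness offset d, and that the segmentation mask is only consulted locally to find the limb boundary. The function name sample_limb_keypoint and all parameter names are illustrative.

import numpy as np

def sample_limb_keypoint(mask, joint_a, joint_b, t, d):
    """Derive a ground-truth location for an arbitrary keypoint query.

    mask    : (H, W) boolean segmentation mask of the limb or ski
    joint_a : (x, y) pixel coordinates of the first annotated joint (e.g. knee)
    joint_b : (x, y) pixel coordinates of the second annotated joint (e.g. ankle)
    t       : relative position along the joint-to-joint axis, in [0, 1]
    d       : relative thickness offset, in [-1, 1] (0 = on the axis,
              +1 / -1 = on the limb boundary found in the mask)
    Returns the (x, y) coordinates of the queried keypoint.
    """
    joint_a = np.asarray(joint_a, dtype=float)
    joint_b = np.asarray(joint_b, dtype=float)
    h, w = mask.shape

    # Point on the axis between the two annotated joints.
    axis = joint_b - joint_a
    if np.linalg.norm(axis) < 1e-6:
        return joint_a  # degenerate case: both joints coincide
    center = joint_a + t * axis

    # Unit normal of the axis; the limb boundary is searched along it.
    normal = np.array([-axis[1], axis[0]]) / np.linalg.norm(axis)

    def boundary(sign):
        # Walk outwards pixel by pixel until the mask is left; the last
        # foreground position approximates the limb boundary here.
        step, last = 0.0, center
        while True:
            p = center + sign * step * normal
            x, y = int(round(p[0])), int(round(p[1]))
            if not (0 <= x < w and 0 <= y < h) or not mask[y, x]:
                return last
            last, step = p, step + 1.0

    # Interpolate between the axis point and the boundary according to d.
    edge = boundary(1.0 if d >= 0 else -1.0)
    return center + abs(d) * (edge - center)

Querying, for example, t = 0.5 and d = 1.0 would yield a point on the limb boundary halfway between the two joints. Because the mask is only sampled locally around the queried position, masks that are correct only in parts can still provide usable labels, which matches the motivation stated in the abstract.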
Reference

Katja Ludwig, Daniel Kienzle, Julian Lorenz and Rainer Lienhart. 2023. Detecting arbitrary keypoints on limbs and skis with sparse partly correct segmentation masks. In IEEE/CVF Winter Conference on Applications of Computer Vision Workshops (WACVW), Jan. 3-7, 2023, Waikoloa, HI, USA. IEEE, Piscataway, NJ, 1-10. DOI: 10.1109/WACVW58289.2023.00051
PDF | BibTeX | RIS | DOI