🏀 In this video, you'll learn how to use machine learning, computer vision, and deep learning to create an NBA basketball analysis system. This project extracts a wide range of insights from a basketball game using multiple AI models, such as YOLO for object detection. These insights include the number of passes and interceptions each team made, each team's ball acquisition percentage, the distance covered and speed of each player, and even a translation of the camera view into a tactical top-down view, allowing the system to fully understand what's happening on the court.
This project utilizes YOLO, a state-of-the-art object detector, to detect the players and the ball. It also utilizes trackers to track those objects across frames. We also train our own object detector to enhance the output of the state-of-the-art models. Additionally, we assign players to teams using zero-shot image classifiers from the Hugging Face library. Furthermore, we detect court keypoints in a frame to understand player location relative to the court. Finally, we implement perspective transformation to account for the scene's depth and perspective, allowing us to measure a player's movement in meters rather than pixels. This helps us calculate a player's speed and the distance covered. This project covers various concepts and addresses real-world problems, making it suitable for both beginners and experienced machine learning engineers.
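To illustrate how passes and interceptions can be derived from tracking output, here is a minimal sketch. It assumes a hypothetical per-frame possession list (the `(team, player)` pair holding the ball, or `None` when possession is unclear); the actual project derives possession from detections, but the counting logic would look something like this:

```python
# Sketch: count passes and interceptions from a per-frame possession sequence.
# A pass is a possession change between two players on the same team;
# an interception is a possession change across teams.

def count_passes_and_interceptions(possession):
    passes = {1: 0, 2: 0}          # passes completed by each team
    interceptions = {1: 0, 2: 0}   # interceptions made by each team
    prev = None
    for entry in possession:
        if entry is None:          # skip frames with no clear possession
            continue
        team, player = entry
        if prev is not None and (team, player) != prev:
            prev_team, _ = prev
            if team == prev_team:
                passes[team] += 1
            else:
                interceptions[team] += 1
        prev = (team, player)
    return passes, interceptions
```

Skipping `None` frames means a pass still counts even when the ball is in flight for several frames between holders.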
💡 Important Note about the court keypoint dataset used:
================================
In the video, I trained the model on the wrong dataset. Please use the dataset linked below to reproduce my output exactly. Sorry for the inconvenience.
💡 What You’ll Learn:
================================
1. 🏀 YOLO Object Detection and tracking: Detect basketball players and the ball across video frames.
2. 🎯 Fine-tune YOLO on your own custom basketball dataset for maximum accuracy.
3. 🎨 Assign players to teams based on jersey color using Zero-Shot Image Classification.
4. 🔄 Track ball possession to detect passes and interceptions.
5. 📍 Train a Keypoint Detector to identify court landmarks automatically.
6. 🧠 Apply Perspective Transformation to convert the camera view into a clean top-down tactical map.
7. ⏱️ Calculate player speed and distance covered using real-world court coordinates.
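Steps 6 and 7 can be sketched in a few lines: apply a 3x3 homography to map pixel positions onto court coordinates in meters, then compute distance and speed from consecutive positions. The matrix values below are made up for illustration; in practice the homography would be estimated from the detected court keypoints matched against a reference top-down court diagram:

```python
import math

def apply_homography(H, point):
    """Map a pixel (x, y) to court coordinates using a 3x3 homography H."""
    x, y = point
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return (xh / w, yh / w)  # perspective divide

def distance_and_speed(pixel_track, H, fps):
    """Total distance (meters) and average speed (km/h) for one player's track."""
    court = [apply_homography(H, p) for p in pixel_track]
    total = sum(math.hypot(x1 - x0, y1 - y0)
                for (x0, y0), (x1, y1) in zip(court, court[1:]))
    elapsed = (len(pixel_track) - 1) / fps  # seconds spanned by the track
    speed_kmh = (total / elapsed) * 3.6 if elapsed > 0 else 0.0
    return total, speed_kmh

# Toy homography: a pure scale of 0.1 meters per pixel (no perspective),
# just to show the mechanics.
H = [[0.1, 0.0, 0.0],
     [0.0, 0.1, 0.0],
     [0.0, 0.0, 1.0]]
```

With a real homography the bottom row is non-trivial, which is what makes far-away players cover more meters per pixel than nearby ones.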
🔗 Links:
================================
Github Repo: https://github.com/abdullahtarek/basketball_analysis/tree/main
Basketball detection Dataset: https://universe.roboflow.com/workspace-5ujvu/basketball-players-fy4c2-vfsuv
Zero shot classifier: https://huggingface.co/patrickjohncyh/fashion-clip
Basketball court keypoint Dataset: https://universe.roboflow.com/fyp-3bwmg/reloc2-den7l
🔑 TIMESTAMPS
================================
0:00 - Introduction
2:08 - Object detection (YOLO) and tracking
2:36:36 - Ball interpolation
2:58:17 - Player Team assignment with Zero shot classification
3:40:03 - Ball Acquisition
4:55:55 - Passes and interceptions detection
5:29:50 - Court Keypoint Detection
5:54:49 - Tactical View / Perspective Transformation
7:07:20 - Speed and distance calculator