An advanced computer vision system that performs real-time detection, tracking, and bidirectional counting of individuals in a monitored space. This project uses YOLO11 for high-precision detection and polygon ROI (Region of Interest) logic to determine entry and exit events.
- Real-time Object Tracking: Uses the `persist=True` parameter in YOLO11 to maintain unique identities (IDs) for every person across frames.
- Bidirectional Counting: Uses dual-polygon logic to distinguish between people entering and exiting the premises.
- Live Occupancy Analytics: Dynamically calculates the current number of people inside as `Entered - Exited`.
- Visual Debugging: Renders bounding boxes, unique track IDs, and counting boundaries directly onto the video stream.
- Automated Export: Processed footage is automatically saved with all overlays (counters, polygons, boxes) using OpenCV's `VideoWriter` module.
- Core Engine: YOLO11 (Ultralytics) for object detection.
- Processing: Python, NumPy (for coordinate handling).
- Vision & UI: OpenCV (cv2) for video I/O and CVZone for stylized text/UI overlays.
- Geometric Logic: `cv2.pointPolygonTest` to accurately detect when a person's reference point crosses specific spatial boundaries.
The system defines two polygonal areas (`area1` and `area2`) placed near a doorway or threshold:
- Detection: The model identifies a `person` and assigns a `track_id`.
- Point of Interest: The system tracks the bottom coordinate $(x_1, y_2)$ of the bounding box, approximating the point where the person's feet touch the ground.
- Cross-Verification:
  - Entry: If a `track_id` is first detected in `area2` and subsequently moves into `area1`, the system increments the Entered counter.
  - Exit: If a `track_id` is first detected in `area1` and subsequently moves into `area2`, the system increments the Exited counter.
- State Management: Python dictionaries (`enter` and `exits`) store the initial state of each unique ID to prevent double-counting.
- Clone the repository:

```bash
git clone https://github.com/[Your-Username]/people-counting-yolo11.git
cd people-counting-yolo11
```

- Install dependencies:

```bash
pip install ultralytics opencv-python cvzone numpy
```

- Run the application:

```bash
python main.py
```
You can customize the detection area by modifying the polygon coordinates in `main.py`:

```python
area1 = [(250, 444), (211, 448), (473, 575), (514, 566)]
area2 = [(201, 449), (177, 453), (420, 581), (457, 577)]
```
Use the integrated RGB mouse callback feature to find the exact $(x, y)$ coordinates for your specific camera angle.
The system generates an `output_people_count.mp4` file containing:
- Real-time counters for Entered, Exited, and Current occupancy.
- Unique Tracking IDs for every detected individual.
- Visual boundaries (pink polygons) showing the active detection zones.
