Dataset for Track 5: UAV Tracking and Pose Estimation
Track 5: UAV Tracking and Pose Estimation
About the Dataset
The MMAUD [1] dataset is dedicated to unmanned aerial vehicle (UAV) tracking and position estimation. It provides fisheye camera images, millimeter-wave radar data, and lidar data captured by a Livox Mid360 and a Livox Avia, with ground truth provided by a Leica Nova MS60 Multi-Station. The goal of this track is to fuse data from the different modalities to achieve robust UAV position estimation and classification even under challenging conditions.
- Dataset and baseline report: MMAUD Dataset Report
Note that for this challenge track we have updated the dataset described in the report: the data used in this challenge were collected in a different scene, but with the same experimental equipment.
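To make the fusion goal above concrete, below is a minimal late-fusion sketch in PyTorch that combines image features from a fisheye frame with point features from the lidar/radar returns to regress a 3D position and predict the UAV type. The network layout, input shapes, and the `SimpleFusionNet` name are illustrative assumptions; this is not the official MMAUD baseline.

```python
# Minimal late-fusion sketch (PyTorch). Layer sizes, input shapes, and the
# four-class output are illustrative assumptions, not the MMAUD baseline.
import torch
import torch.nn as nn

class SimpleFusionNet(nn.Module):
    def __init__(self, num_classes: int = 4):
        super().__init__()
        # Image branch: small CNN over a fisheye frame (assumed resized to 3x128x128).
        self.img_branch = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),   # -> (B, 32)
        )
        # Point branch: PointNet-style shared MLP over lidar/radar points (B, N, 3).
        self.pt_branch = nn.Sequential(
            nn.Linear(3, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
        )
        # Fused head: concatenated features -> 3D position + class logits.
        self.head = nn.Sequential(nn.Linear(32 + 64, 128), nn.ReLU())
        self.pos_head = nn.Linear(128, 3)            # (x, y, z) in metres
        self.cls_head = nn.Linear(128, num_classes)  # Mavic 3, M30, M300, Phantom 4

    def forward(self, image, points):
        img_feat = self.img_branch(image)                # (B, 32)
        pt_feat = self.pt_branch(points).max(dim=1)[0]   # max-pool over points -> (B, 64)
        fused = self.head(torch.cat([img_feat, pt_feat], dim=1))
        return self.pos_head(fused), self.cls_head(fused)

# Dummy forward pass with assumed tensor shapes.
model = SimpleFusionNet()
pos, logits = model(torch.randn(2, 3, 128, 128), torch.randn(2, 1024, 3))
print(pos.shape, logits.shape)  # torch.Size([2, 3]) torch.Size([2, 4])
```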
Training & Evaluation
This dataset comprises 102 training sequences and 16 validation sequences, spanning approximately 20 seconds and 5 seconds each, respectively. The data collection involves four distinct UAV types: Mavic 3, M30, M300, and Phantom 4. The final 3D position estimation and classification test will be performed on a hold-out test set of multimodal data from the MMAUD dataset. The hold-out test set contains the same modalities and drone types as the provided training set.
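As a rough sanity check while developing on the training and validation splits, the helpers below compute a mean Euclidean position error and a classification accuracy over a sequence. The official Codabench evaluation may weight or aggregate results differently, so treat this only as a local approximation.

```python
# Illustrative evaluation helpers: mean Euclidean position error and
# classification accuracy. The official Codabench metrics may differ.
import numpy as np

def position_error(pred_xyz: np.ndarray, gt_xyz: np.ndarray) -> float:
    """Mean Euclidean distance (metres) between predicted and ground-truth
    positions. pred_xyz, gt_xyz: arrays of shape (T, 3), one row per frame."""
    return float(np.linalg.norm(pred_xyz - gt_xyz, axis=1).mean())

def classification_accuracy(pred_labels, gt_labels) -> float:
    """Fraction of frames (or sequences) with a correctly predicted UAV type."""
    pred_labels = np.asarray(pred_labels)
    gt_labels = np.asarray(gt_labels)
    return float((pred_labels == gt_labels).mean())

# Example with dummy values for a 5-frame sequence.
pred = np.array([[0.1, 2.0, 10.0]] * 5)
gt = np.array([[0.0, 2.0, 10.5]] * 5)
print(position_error(pred, gt))                                       # ~0.51 m
print(classification_accuracy(["M300"] * 5, ["M300"] * 4 + ["M30"]))  # 0.8
```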
- Paper: ArXiv
- Release Date: February 2024
- Download (Google Drive): Track 5 Training and Validation Data
- Download (Baidu Netdisk): Track 5 Training and Validation Data
- Codabench: Codabench Link. Please view the Rules for more details.
- Submission Format Example: README!
If you have any questions about this challenge track, please feel free to email ug2.uav.track.ntu@gmail.com and cvpr2024.ug2challenge@gmail.com.
References:
[1] S. Yuan, Y. Yang, T. H. Nguyen, et al., "MMAUD: A Comprehensive Multi-Modal Anti-UAV Dataset for Modern Miniature Drone Threats," arXiv preprint arXiv:2402.03706, 2024.