Spatially and Temporally Aligned Visible and Infrared UAV images (labelled) and videos (not labelled)
Description
This dataset was created for the sensing of UAVs in the context of the Counter-UAS problem. To implement data fusion methodologies for imaging sensors, namely at the pixel level, the data from the visible and infrared sensors must be spatially and temporally aligned. To this end, flight tests were conducted at the University of Victoria's Center for Aerospace Research (UVIC-CfAR) using a TASE 200 gimbal (visible sensor: SONY FCB-EX1020 PAL; infrared sensor: FLIR TAU 640 PAL). Additional data, collected at the Universidade de Lisboa - Instituto Superior Técnico (IST) using a TeAx ThermalCapture Fusion Zoom, were also provided. This resulted in two separate sub-datasets: one of labelled UAV images and one of unlabelled UAV videos. All data from the visible and infrared sensors are spatially and temporally aligned.

The labelled dataset includes real frames of a DJI Mavic 2, the VTOL Mini-E (developed at UVIC-CfAR), the hybrid multirotor MIMIQ (developed at UVIC-CfAR), a DJI Mini 3 Pro, and a Zeta FX-61 Phantom Wing, as well as artificial frames of quadcopters, a hexacopter, and a fixed-wing aircraft. It covers a variety of operational conditions and characteristics, namely range, lighting, blurred and partially cut-off UAVs, the presence of birds, and background texture. Images are labelled in the YOLO format, and folders are organized following the YOLO convention with an 80-10-10 partition into training, test, and validation sets; images were randomly assigned to each folder.

The unlabelled video dataset includes videos of a DJI Mavic 2, the VTOL Mini-E (developed at UVIC-CfAR), and a DJI Inspire 1. Some videos are provided in their original, unprocessed form; others are split into videos of interest, containing the segments with the best spatial and temporal alignment and with isolated operational conditions and characteristics.
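For orientation, a YOLO-format label file contains one line per object: a class index followed by the normalized box centre coordinates and box size. The sketch below is a minimal example of how such a label file could be read and converted to pixel coordinates; the file names and folder layout shown are hypothetical, chosen only to match the described training/test/validation organization, and are not taken from the dataset itself.

    # Minimal sketch: parse one YOLO-format label file and convert its
    # normalized boxes to pixel coordinates for the matching image.
    from pathlib import Path
    from PIL import Image

    def read_yolo_labels(label_path, image_path):
        """Return a list of (class_id, x_min, y_min, x_max, y_max) in pixels."""
        width, height = Image.open(image_path).size
        boxes = []
        for line in Path(label_path).read_text().splitlines():
            class_id, xc, yc, w, h = line.split()
            xc, yc, w, h = (float(v) for v in (xc, yc, w, h))
            boxes.append((
                int(class_id),
                (xc - w / 2) * width,   # x_min
                (yc - h / 2) * height,  # y_min
                (xc + w / 2) * width,   # x_max
                (yc + h / 2) * height,  # y_max
            ))
        return boxes

    # Example usage (hypothetical paths):
    # boxes = read_yolo_labels("train/labels/frame_0001.txt", "train/images/frame_0001.jpg")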
Funding
NSERC Discovery and Canada Research Chair Programs
Fundação para a Ciência e a Tecnologia