A dataset for multi-sensor drone detection
Swedish Armed Forces, Sweden.
Halmstad University, Sweden.
RISE Research Institutes of Sweden, Digital Systems, Mobility and Systems. Halmstad University, Sweden. ORCID iD: 0000-0002-1043-8773
2021 (English). In: Data in Brief, E-ISSN 2352-3409, Vol. 39, article id 107521. Article in journal (Refereed). Published
Abstract [en]

The use of small and remotely controlled unmanned aerial vehicles (UAVs), referred to as drones, has increased dramatically in recent years, both for professional and recreational purposes. This goes in parallel with (intentional or unintentional) misuse episodes, posing an evident threat to the safety of people and facilities [1]. As a result, the detection of UAVs has also emerged as a research topic [2]. Most existing studies on drone detection fail to specify the type of acquisition device, the drone type, the detection range, or the employed dataset. The lack of proper UAV detection studies employing thermal infrared cameras is also acknowledged as an issue, despite their success in detecting other types of targets [2]. Besides, we have not found any previous study that addresses the detection task as a function of distance to the target. Sensor fusion is likewise indicated as an open research issue for achieving better detection results than a single sensor, although research in this direction is scarce [3–6]. To help counteract the mentioned issues and allow fundamental studies with a common public benchmark, we contribute an annotated multi-sensor database for drone detection that includes infrared and visible videos and audio files. The database includes three different drones: a small-sized model (Hubsan H107D+), a medium-sized drone (DJI Flame Wheel in quadcopter configuration), and a performance-grade model (DJI Phantom 4 Pro). It also includes other flying objects that can be mistakenly detected as drones, such as birds, airplanes, or helicopters. In addition to using several different sensors, the number of classes is higher than in previous studies [4]. The video part contains 650 infrared and visible videos (365 IR and 285 visible) of drones, birds, airplanes, and helicopters. Each clip is ten seconds long, resulting in a total of 203,328 annotated frames.
The database is complemented with 90 audio files of the classes drones, helicopters, and background noise. To allow studies as a function of the sensor-to-target distance, the dataset is divided into three categories (Close, Medium, Distant) according to the industry-standard Detect, Recognize and Identify (DRI) requirements [7], built on the Johnson criteria [8]. Given that the drones must be flown within visual range due to regulations, the largest sensor-to-target distance for a drone in the dataset is 200 m, and acquisitions are made in daylight. The data has been obtained at three airports in Sweden: Halmstad Airport (IATA code: HAD/ICAO code: ESMT), Gothenburg City Airport (GSE/ESGP), and Malmö Airport (MMX/ESMS). The acquisition sensors are mounted on a pan-tilt platform that steers the cameras towards the objects of interest. All sensors and the platform are controlled with a standard laptop via a USB hub.
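The abstract describes a dataset organized along three axes: sensor modality (infrared, visible, audio), object class (drone, bird, airplane, helicopter, plus audio-only classes), and DRI distance category (Close, Medium, Distant). A minimal sketch of how such an index could be queried is shown below; the folder layout, clip counts, and field names are illustrative assumptions, not the database's actual structure.

```python
# Hypothetical index over a multi-sensor drone-detection dataset with
# the axes described in the abstract. All names and the toy clip count
# are assumptions for illustration; the real database layout may differ.
from dataclasses import dataclass

SENSORS = ("infrared", "visible")
CLASSES = ("drone", "bird", "airplane", "helicopter")
DISTANCE_BINS = ("Close", "Medium", "Distant")  # DRI-based categories

@dataclass(frozen=True)
class Clip:
    sensor: str
    label: str
    distance: str
    duration_s: float = 10.0  # each video clip is ten seconds long

def filter_clips(clips, sensor=None, label=None, distance=None):
    """Select clips matching the given sensor, class label, and DRI bin."""
    return [
        c for c in clips
        if (sensor is None or c.sensor == sensor)
        and (label is None or c.label == label)
        and (distance is None or c.distance == distance)
    ]

# Toy index: one clip per (sensor, class, distance) combination.
index = [Clip(s, l, d) for s in SENSORS for l in CLASSES for d in DISTANCE_BINS]

close_ir_drones = filter_clips(index, sensor="infrared",
                               label="drone", distance="Close")
print(len(index), len(close_ir_drones))  # → 24 1
```

Splitting evaluation along the `distance` axis is what enables the detection-versus-range studies the authors highlight as missing from prior work.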

Place, publisher, year, edition, pages
Elsevier Inc., 2021. Vol. 39, article id 107521
Keywords [en]
Anti-drone systems, Drone detection, UAV detection
National Category
Computer Systems
Identifiers
URN: urn:nbn:se:ri:diva-56907
DOI: 10.1016/j.dib.2021.107521
Scopus ID: 2-s2.0-85118496043
OAI: oai:DiVA.org:ri-56907
DiVA, id: diva2:1613478
Note

Funding details: VINNOVA; Vetenskapsrådet (VR); Högskolan i Halmstad (HH). Funding text: This work has been carried out by Fredrik Svanström in the context of his Master's Thesis at Halmstad University (Master's Programme in Embedded and Intelligent Systems). Author F. A.-F. thanks the Swedish Research Council and VINNOVA for funding his research.

Available from: 2021-11-22. Created: 2021-11-22. Last updated: 2021-11-22. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text · Scopus

Authority records

Englund, Cristofer
