Unmanned Aircraft Vehicle
(UAV) Pilot Identification Using
Machine Learning
Ahmed Saeed Al Shemeili
PhD Thesis
February 2019
A thesis submitted to Khalifa University of Science and Technology in accordance with the
requirements of the degree of PhD in Engineering in the Department of Electrical and
Computer Engineering.
Unmanned Aircraft Vehicle (UAV) Pilot Identification Using Machine Learning
by
Ahmed Saeed Al Shemeili
A thesis submitted in partial fulfillment of the
requirements for the degree of
PhD in Engineering
at
Khalifa University
Thesis Committee
Dr. Abdulhadi Shoufan (Supervisor),
Khalifa University
Prof. Ernesto Damiani (Co-Supervisor)
Khalifa University
Dr. VWY (Industrial Supervisor),
EBTIC
Prof. XYZ (External Examiner &
Committee Chair),
University of TTT
Dr. MNV (Internal Examiner)
Khalifa University
February 2019
Abstract
This thesis focuses on the identification of Unmanned Aerial Vehicle (UAV) pilots using a novel machine learning approach. Three classification models, namely logistic regression, random forest, and a neural network, are applied to radio control (RC) measurement features to detect rogue drones. The findings suggest that the proposed machine learning approach performs more effectively at high altitudes than at lower altitudes, since detection degrades as the altitude decreases.
Indexing Terms: UAV, machine learning, random forest, ensemble, bagging, MATLAB.
Acknowledgement
Undertaking this PhD has been a major career change for me, shifting from networking to cyber security and artificial intelligence, and it was possible thanks to the support I received from many people and from Khalifa University.
I would first like to thank my supervisor, Dr. Abdulhadi Shoufan, for all the support he gave me and for the continuous feedback that helped me throughout my PhD study.
Many thanks to Prof. Ernesto Damiani for his support and guidance in ensuring that my PhD study would be successful and of benefit to our society.
I gratefully acknowledge the funding received towards my PhD from Khalifa University. Thanks to Prof. Mahmoud Al Qutairi for supporting my study at Khalifa University. Thanks to my father, mother, brothers, and sisters.
Thank you to my wife and kids.
Declaration and Copyright
Declaration
I declare that the work in this thesis was carried out in accordance with the regulations
of Khalifa University of Science and Technology. The work is entirely my own except where
indicated by special reference in the text. Any views expressed in the thesis are those of the
author and in no way represent those of Khalifa University of Science and Technology. No
part of the thesis has been presented to any other university for any degree.
Author Name: Ahmed Saeed Al Shemeili
Author Signature:
Date: 24-02-2019
Copyright ©
No part of this thesis may be reproduced, stored in a retrieval system, or transmitted,
in any form or by any means, electronic, mechanical, photocopying, recording, scanning or
otherwise, without prior written permission of the author. The thesis may be made available
for consultation in Khalifa University of Science and Technology Library and for inter-
library lending for use in another library and may be copied in full or in part for any bona fide
library or research worker, on the understanding that users are made aware of their
obligations under copyright, i.e. that no quotation and no information derived from it may be
published without the author's prior consent.
Table of Contents
Abstract
Acknowledgement
Declaration and Copyright
    Declaration
    Copyright ©
List of Figures
List of Tables
List of Abbreviations
CHAPTER 1
    1.0 Introduction
    The Aim of the Project
    Project Objective
CHAPTER 2
    Related Works
        Fundamental technological framework of UAV
        Drones and cyber security
CHAPTER 3
    3.0 Research Methodology
        Techniques to be used
        Data Collection
        Tools and Applications
CHAPTER 4
    Preliminary Results
        Five Pilots Dataset
CHAPTER 5
    Conclusion
        Moving Forward
    Project Time Plan
List of Figures
Figure 1: Navigation system architecture
Figure 2: UAV directions
Figure 3: General architecture of a drone including the ground control station
Figure 4: UAV detection
Figure 5: Cross validation
Figure 6: Performance of all classifiers on the same dataset
Figure 7: Features test accuracy with different crosses starting from 50% up to 98%
Figure 8: Single features output compared to all features output accuracy
Figure 9: All Data vs. RC Data Classification Performance
Figure 10: Pilot performance
Figure 11: Project timeline plan
List of Tables
Table 1: Pilots data
Table 2: Different classifiers accuracy test
Table 3: Cross validation test
Table 4: Single feature test
Table 5: Effectiveness order of the features
Table 6: All data vs RC data performance
Table 7: Pilot performance
Table 8: Features accuracy test
List of Abbreviations
GCS: Ground Control Station
GPS: Global Positioning System
IMU: Inertial Measurement Unit
IVDR: In-Vehicle Data Recorders
SDN: Software Defined Network
RC: Remote Control
UAV: Unmanned Aircraft Vehicle
UE: User Equipment
SIM: Subscriber Identity Module
BSs: Base Stations
CHAPTER 1
1.0 Introduction
A conspicuous proliferation of drones, or Unmanned Aerial Vehicles (UAVs), has been seen in recent years. Besides commercial applications such as precision agriculture, where drones are used to survey farms during crop spraying and pest control, they are also used for military strikes, scientific research, journalism, environmental protection, and border security, among other purposes. Alongside these potential benefits, there is concern that UAVs are already causing many problems, as they are associated with risks to privacy and safety.
Reports of drones violating public privacy and the security of critical facilities, including airports and nuclear plants, have become common in recent times [1]. As reported in [2], a UAV was intentionally crashed into a nuclear power plant in France in 2018. Moreover, the Federal Aviation Administration reports that safety incidents involving UAVs exceed 250 [3]. Most of these events arise from rogue drones that violate no-fly zone restrictions. Additionally, terror groups have exploited UAVs for smuggling explosive devices and chemicals; [4] reports that two UAVs carrying explosives were detonated near the Venezuelan president at an outdoor event. This is a clear indication that there is an urgent need to protect the national airspace from such unusual threats, which can be achieved by accurately identifying rogue drones with the help of artificial intelligence.
Various techniques have been proposed to mitigate such unconventional problems, with little success. Conventional radar-based approaches, which have been widely used to identify airplanes, mostly fail to detect drones. Similarly, sound- and video-based detection techniques have been tried without success, as they are only suitable for short-range scenarios. Some of these challenges can be addressed with a machine learning approach. In addition to applying machine learning, part of the analysis is to examine all the steps a random forest takes in deciding which features identify the pilot. This will enable us to identify possible enhancements to feature selection by going through each tree and observing how each node or leaf is selected by the random forest. In doing so, there is a chance to discover common behaviour between the trees, which in turn might make it easier to
eliminate part of the selection process. This analysis is applicable to any dataset (not limited to our UAV test), which will help in the development and enhancement of many artificial intelligence applications. This thesis seeks to propose a novel machine learning approach to detect rogue UAVs in airspace based on the radio waves sent by User Equipment (UE) to Base Stations (BSs).
The Aim of the Project
UAVs are normally flown at high altitudes, where interfering signals may be strong if not managed properly [5]. It is essential to identify whether a drone is legitimate or not, and mobile networks can be used to distinguish legitimate from non-legitimate UAVs. For legitimate UAVs, a standard mechanism such as Subscriber Identity Module (SIM) cards can be enforced so that the UAVs can be identified by the network. Nevertheless, it is very challenging to detect drones that are not legitimate, i.e. those that are not registered with any network. This case has drawn much attention, as flying an unregistered UAV may cause excessive interference to networks and is not permitted by policy in certain regions. It may also lead to security and privacy issues such as the drone-aided terror attacks and privacy breaches mentioned earlier [20], [21]. As a result, it is important to detect and identify these non-legitimate drones. The aim of this thesis is to propose a novel machine learning approach to detect rogue UAVs in airspace based on the radio waves sent by UEs to BSs.
Project Objective
Our primary objective is to develop a robust and highly integrated system that can detect and identify UAVs using a machine learning approach.
CHAPTER 2
Related Works
Fundamental technological framework of UAV
The machine learning strategies for detecting and identifying UAVs can be applied without detailed knowledge of the technological framework of the drones. This framework is nevertheless classified here under three subheadings: navigation, control, and communication. This section presents related work for each of these aspects of UAV technology.
Navigation
Drone navigation is the process by which a drone makes a plan that helps it arrive safely at its target destination. Navigation mostly depends on the current
environment and location. The drone must be fully aware of its state, including its geographical location, direction of movement, and speed, in order to achieve its mission. This can be accomplished through various methods, including map-less and map-based systems. The navigation function is founded on the coordinated-turn function. The system determines the current course of the drone based on waypoints. It then measures and calculates the lateral deviation between the flight path and the aircraft, the deviation angle, and the ground velocity of the craft. Subsequently, it resolves the lateral steering signal based on the navigation control law and sends back the angle that helps to control the aircraft. Figure 1 [7] illustrates the navigation system architecture used in most UAVs.
Figure 1: Navigation system architecture [7]
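To make the quantities in this description concrete, the short MATLAB sketch below computes the lateral (cross-track) deviation, the deviation angle, and a simple proportional steering command between two waypoints. It is only an illustration of the terms used above, not the navigation law of any particular autopilot, and the waypoint coordinates, the current state, and the gains are all assumed values.

% Illustrative only: cross-track deviation and a proportional steering signal
% for a drone flying from waypoint A to waypoint B (local east/north, metres).
wpA = [0; 0];                 % previous waypoint (assumed)
wpB = [400; 300];             % next waypoint (assumed)
pos = [150; 80];              % current drone position (assumed)
heading = deg2rad(35);        % current course over ground, rad (assumed)

pathVec = (wpB - wpA) / norm(wpB - wpA);                    % unit vector along the desired path
posVec  = pos - wpA;
crossTrack = pathVec(1)*posVec(2) - pathVec(2)*posVec(1);   % signed lateral deviation, m
pathCourse = atan2(pathVec(2), pathVec(1));                 % course of the desired path, rad
deviationAngle = atan2(sin(heading - pathCourse), cos(heading - pathCourse));

% Simple proportional law: steer against the lateral deviation and the angle error.
Kd = 0.01; Ka = 0.8;                                        % assumed gains
turnRateCmd = -(Kd*crossTrack + Ka*deviationAngle);         % commanded turn rate, rad/s
fprintf('Cross-track = %.1f m, commanded turn rate = %.3f rad/s\n', crossTrack, turnRateCmd);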
1. Map-less system
With this strategy, navigation can be done without maps. Navigation here is aided by distinct features in the environment that the drone has identified. Feature tracking and optical flow are the techniques used by drones in map-less systems. In addition, Santos-Victor [6] proposed a map-less navigation method inspired by the flight behaviour of bees, which estimates the movement of the drone from cameras mounted on both of its sides. This approach works well for navigation because the velocity of the cameras relative to the surrounding walls can be used to compute the drone's motion easily. Its downside, however, is that it mostly fails when navigating in a texture-less environment.
2. Map-based system
This system uses a plan of the environment, which allows the drone to move with a motion-planning capability. In his study, [7] used a three-dimensional sensor to map an urban environment effectively with a drone. [8], on the other hand, developed an open-source framework for representing the three-dimensional model. This approach is useful because it allows potential errors to be corrected by filtering the sensor information.
UAV Directions
Figure 2 below shows the three main directions of a UAV, together with thrust.
Figure 2: UAV directions [18]
Yaw
Rotating the quadcopter head to the right or left.
Pitch
Moving the quadcopter forward or backward.
Roll
Moving the quadcopter sideways (right or left).
Thrust
The force needed from all engines to move the quadcopter through the air.
Control
There are various flight control systems applied in UAVs. These control systems normally include integrated sensors, such as barometric sensors used to measure the state of the drone, radio sensors, and GPS sensors, among others. Chao et al. [9] present a study on autopilots for small drones. The study explains the autopilot control system and the radio control system from both the software and the hardware perspective, and compares various autopilot navigation techniques and controller strengths. The presented system is highly reliable for flight control and highly accurate for autonomous navigation. Its disadvantage, however, is that it does not cover sensor-fusion software algorithms or the hardware sensors. Figure 3 illustrates the general architecture of a drone, including the ground control station. The signals from the ground control station allow the drone to fly along a predefined path through the commands entered. The controls can allow an autonomous flying mode, a stabilized mode, and a manual mode.
Figure 3: General architecture of a drone including the ground control station [4]
Communication
Communication is used not only to disseminate tasks, observations, and control information but also to coordinate drones. The main issues addressed in this block include communication link models, connectivity, routing and scheduling, and data transmission.
A large body of literature exists on routing and medium access protocols. The routing protocols consider quality-of-service metrics, especially energy limitations, mainly for wireless networks. According to [10], both hierarchical and location-based
protocols exist, in which the network is partitioned into various structures. However, the author does not present a comprehensive analysis of the applications of these routing protocols.
Drones and cyber security
Security concerns
The potential dangers of drones have been highlighted by a few incidents. In 2015, a drone crashed in the White House compound [11]; the incident occurred after the drone escaped the White House radar that is set up to detect threats. While such incidents were harmless, they are a wake-up call to the potential threat of UAVs in terrorists' hands. [12] highlights the features of drones that make them attractive to terrorists, including their capability to reach targets that are not reachable by land, the possibility of causing a massive attack, and the limited effectiveness of air defences against them, among several other features. [14] presents three classes of drone threat: accidental intrusion, intentional intrusion by sophisticated users, and intentional intrusion by non-sophisticated users.
Security approaches
These security concerns can be mitigated by various approaches [15]. The authors in [15] presented several techniques, including the use of detection systems such as radar, geo-fencing, and UAV capture and control through spoofing. Despite exploring these approaches explicitly, that work does not acknowledge one of the most important techniques for dealing with such threats: the machine learning approach. UAV detection by machine learning is a noteworthy approach and is the one proposed here to address the security concerns posed by drones.
For instance, Figure 4 below shows detection by a sensing device; the detection technique is a machine-learning-aided approach that involves detecting signals from both the UAV and its controller.
Figure 4: UAV detection
Random Forest
Random forest (RF) is a machine learning algorithm that supports classification, regression, and dimensionality reduction tasks. The method has similarities with bagging because it builds decision trees on bootstrapped samples, considering a random sample of predictors at every split. In bagging, m (the random sample of predictors) is equal to the entire set of predictors p (m = p). The RF method will be applied in this study.
The classification of datasets in RF depends on the features or attributes. In this study, each tree votes for a class during the dataset classification, and the winning class is determined from the accumulated votes of the tree outputs. RF has several advantages compared to bagging and boosting. First, it supports a multiplicity of tasks, including classification and regression. Second, it copes well with high-dimensional data. Third, the method provides a platform for balancing unbalanced classes when estimating the error on a dataset. Nevertheless, the approach also has disadvantages that should be noted; in particular, there are only limited opportunities for controlling how the algorithm works.
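As a concrete, hedged illustration of the voting described above, the MATLAB sketch below (Statistics and Machine Learning Toolbox) grows a small forest with TreeBagger and takes the majority vote of the trees as the predicted pilot. The feature matrix, the pilot labels, and the number of trees are placeholders, not the exact settings used in this study.

% Random-forest sketch: each tree is grown on a bootstrap sample and the
% predicted class is the majority vote over all trees.
rng(1);                                    % for reproducibility
X = rand(500, 5);                          % placeholder features (e.g. pitch, roll, yaw, thrust, time)
pilotLabels = randi(5, 500, 1);            % placeholder pilot IDs 1..5

forest = TreeBagger(100, X, pilotLabels, ...
    'Method', 'classification', ...
    'OOBPrediction', 'on');                % keep out-of-bag votes for error estimation

oobErr = oobError(forest);                 % out-of-bag misclassification rate per tree count
fprintf('OOB error with all trees: %.3f\n', oobErr(end));

% Majority-vote prediction for one new sample: scores give the fraction of
% tree votes received by each class.
[label, scores] = predict(forest, rand(1, 5));
disp(label); disp(scores);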
CHAPTER 3
3.0 Research Methodology
Techniques to be used
A machine learning approach will be used in this study. Machine learning refers to the application of artificial intelligence that enables a system to learn automatically and improve from experience without being explicitly programmed. In this section, we discuss the proposed machine learning approach to be used for UAV detection.
This prediction will be carried out by comparing three machine learning models:
a) Logistic regression: this emphasizes the relationship between dependent and independent variables. The probability of the outcome is given by
p = 1 / (1 + e^(−(α + β1x1 + ... + βnxn)))
b) Random forest: this model will be used to predict the value of a target variable through simple decision rules derived from the dataset.
c) Artificial neural network: the artificial neural network, on the other hand, connects to an external server, which helps provide the compute power required during prediction. It is applied in the same way as the decision tree method.
All of these approaches are referred to as machine learning models.
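As a small worked example of the logistic formula in a), the MATLAB lines below evaluate p for a single observation. The intercept α, the coefficients β, and the feature values are arbitrary numbers chosen only to show how the probability is computed; for the multi-pilot case, the multinomial form of the same model is available through mnrfit and mnrval in the Statistics and Machine Learning Toolbox.

% Evaluate p = 1 / (1 + exp(-(alpha + beta1*x1 + ... + betan*xn)))
alpha = -2.0;                       % assumed intercept
beta  = [0.8; -0.3; 1.5];           % assumed coefficients for three features
x     = [1.2;  0.4; 0.9];           % one observation (assumed feature values)

z = alpha + beta' * x;              % linear predictor
p = 1 / (1 + exp(-z));              % probability of the positive class
fprintf('Linear predictor z = %.3f, probability p = %.3f\n', z, p);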
The accuracy of the prediction process and methods is crucial in validating the classifiers, and cross-validation can be applied to check a classifier's accuracy. The decision tree has often been used in cross-validation over the feature values. A decision tree is a machine learning algorithm in which the nodes represent the attributes of interest for classification and the branches leaving a node represent the values that attribute can assume. The root node is the point at which the classification of instances starts, with the features sorted based on their values. The algorithm is straightforward and allows easy exploration because it requires minimal data cleaning. However, the approach can also lead to over-fitting when it has to handle continuous variables.
Ensemble classification methods can be used to minimize these disadvantages and to strengthen the feature analysis. The method combines weak decision trees, which in turn yields strong ensembles. The bagged tree model comprises trees that are trained independently on data bootstrapped from the input data. Ensembles combine multiple models to yield higher accuracy and stability in the prediction process, and the ensemble method provides the largest boost to decision tree models. Boosting and bagging are the most common ensemble methods.
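A hedged MATLAB sketch of this comparison is given below: a single decision tree is set against a bagged ensemble and a boosted ensemble, and each is scored with 5-fold cross-validation. The synthetic data, the number of learning cycles, and the fold count are assumptions for illustration rather than the settings used in the experiments reported later.

% Compare one decision tree against bagged and boosted tree ensembles
% using 5-fold cross-validated accuracy.
rng(1);
X = rand(1000, 5);                                  % placeholder features
Y = categorical(randi(5, 1000, 1));                 % placeholder pilot labels

models = {
    fitctree(X, Y)                                                  % single decision tree
    fitcensemble(X, Y, 'Method', 'Bag', 'NumLearningCycles', 100)   % bagging
    fitcensemble(X, Y, 'Method', 'AdaBoostM2', ...                  % boosting (multiclass)
        'NumLearningCycles', 100, 'Learners', templateTree())
};
names = {'Single tree', 'Bagged trees', 'Boosted trees'};

for k = 1:numel(models)
    cv = crossval(models{k}, 'KFold', 5);           % 5-fold cross-validation
    fprintf('%-14s accuracy: %.3f\n', names{k}, 1 - kfoldLoss(cv));
end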
Data Collection
To train an ensemble of bagged classification trees, or to explore other ensemble-learning options, the Classification Learner App in MATLAB R2017b is used. The 'Bag' algorithm applies to all methods; 'Bag' can be used for a classifier or a regressor with the type name-value
pair set to 'classification' or 'regression'. Twenty datasets were collected from 20 different pilots, with a total of 52,000 instances for each flight type (random, vertical, horizontal), which gives at least 156,000 instances for analysis.
Tools and Applications
Several methods are available for melding the results from many weak learners into one high-quality ensemble predictor. These methods closely follow the same syntax; therefore, different methods can be tested with minor changes in the commands.
MATLAB R2017b was installed to conduct the UAV pilot identification. Data from five random UAV pilots were provided by Khalifa University. The data were loaded into MATLAB as CSV files and imported as a table (Table 1).
Table 1: Pilots data
A cross-validation option was selected (it varies from test to test), as shown in Figure 5.
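The Classification Learner App performs the import and cross-validation setup interactively; a programmatic sketch of the same workflow is shown below. The CSV file name and the Pilot label column are hypothetical, since the real file and column names are not listed here.

% Programmatic version of the app workflow: load the pilot data from a CSV
% file, separate features and labels, and train a cross-validated bagged ensemble.
T = readtable('pilot_flights.csv');                    % hypothetical file name
labels   = categorical(T.Pilot);                       % hypothetical label column
features = T(:, setdiff(T.Properties.VariableNames, {'Pilot'}));

cvp = cvpartition(labels, 'KFold', 5);                 % stratified 5-fold partition
mdl = fitcensemble(features, labels, ...
    'Method', 'Bag', 'NumLearningCycles', 100, ...
    'CVPartition', cvp);                               % returns a cross-validated ensemble
fprintf('5-fold accuracy: %.3f\n', 1 - kfoldLoss(mdl));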
Figure 5: Cross validation
The 22 classifiers used to run the analysis are presented in Table 2.
Model                              Performance
Tree (Fine tree)                   83.70%
Tree (Medium tree)                 73%
Tree (Coarse tree)                 58.30%
Linear discriminant                29.70%
Quadratic discriminant             34.10%
SVM (Linear)                       31.90%
SVM (Quadratic)                    40.60%
SVM (Cubic)                        24.80%
SVM (Fine Gaussian)                80.30%
SVM (Medium Gaussian)              60.50%
SVM (Coarse Gaussian)              35.70%
KNN (Fine)                         85.10%
KNN (Medium)                       77.10%
KNN (Coarse)                       61.40%
KNN (Cosine)                       75.60%
KNN (Cubic)                        75.40%
KNN (Weighted)                     84.10%
Ensemble (Boosted trees)           80.20%
Ensemble (Bagged trees)            96.10%
Ensemble (Subspace discriminant)   29.80%
Ensemble (Subspace KNN)            90.70%
Ensemble (RUSBoosted trees)        73.30%
Table 2: Different classifiers accuracy test
CHAPTER 4
Preliminary Results
This section presents results based on two trained machine learning models, logistic regression and decision tree models; the first results cover 5 pilots and the second test covers 20 pilots. Many UAVs work in the same way as radio-controlled (RC) devices. The frequency band usually allocated by the FCC for RC devices ranges from 27 MHz to 49 MHz. It is therefore important to note that transmitters and receivers operate at varying frequencies.
Five Pilots Dataset
The ensemble bagged tree model provided the best accuracy among all classifiers (96.1%) (Figure 6). In addition, Table 3 demonstrates that increasing the cross validation from 1 to 50 folds raised the accuracy of the bagged tree from 95.1% to 96.5%. The table also shows the impact on accuracy of removing one feature while keeping the rest. Removing thrust dropped the accuracy to 82.8%, the largest impact. In contrast, time carried the least information: selected alone it scored only 1.4%, while thrust alone scored 76.2%.
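The 'remove one feature, keep the rest' experiment summarized above can be scripted with a short loop such as the sketch below. The file name, the column names (Pitch, Roll, Thrust, Time, Yaw, Pilot), and the ensemble settings are assumptions used only to illustrate the procedure.

% Leave-one-feature-out test: drop each feature in turn, retrain the bagged
% ensemble, and record the 5-fold cross-validated accuracy.
T = readtable('pilot_flights.csv');                        % hypothetical file name
featNames = {'Pitch', 'Roll', 'Thrust', 'Time', 'Yaw'};    % assumed feature columns
Y = categorical(T.Pilot);                                  % assumed label column

for k = 1:numel(featNames)
    keep = setdiff(featNames, featNames(k), 'stable');     % all features except one
    cvmdl = fitcensemble(T(:, keep), Y, ...
        'Method', 'Bag', 'NumLearningCycles', 100, 'KFold', 5);
    fprintf('Without %-7s accuracy = %.3f\n', featNames{k}, 1 - kfoldLoss(cvmdl));
end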
Figure 6: Performance of all classifiers on the same dataset
Case           Number of Crosses    Accuracy
All features   1                    95.10%
All features   5                    96.10%
All features   10                   96.30%
All features   20                   96.50%
All features   50                   96.50%
No pitch       5                    95%
No roll        5                    95%
No thrust      5                    82.80%
No time        5                    92.20%
No yaw         5                    95.10%
Only pitch     5                    34%
Only roll      5                    40%
Only thrust    5                    76.20%
Only time      5                    1.40%
Only yaw       5                    33.20%
Table 3: Cross validation test
Figure 7 illustrates the classifier performance when different numbers of crosses and different feature selections are used. Table 4 and Figure 8 show the single-feature test output, while Table 5 shows the order of feature impact on the classifier output, where thrust has the primary impact. There are various implementations of bagging models, such as the random forest.
Figure 7: Features test accuracy with different crosses starting from 50% up to 98%.
Case           Accuracy
All features   96.10%
No pitch       95%
No roll        95%
No thrust      82.80%
No time        92.20%
No yaw         95.10%
Only pitch     34%
Only roll      40%
Only thrust    76.20%
Only time      1.40%
Only yaw       33.20%
Table 4: Single feature test
Figure 8: Single features output compared to all features output accuracy.
Order   Effect of feature
1       Thrust
2       Roll
3       Pitch
4       Yaw
5       Time
Table 5: Effectiveness order of the features
Twenty Pilots Dataset
Table 6 and Figure 9 show an increase in the performance of data classification when the random forest uses all the data, reaching a highest value of 99.80%, compared with 89.10% when only the radio control signal is used.
Classifier             Vertical    Triangular    Random
RF All                 97.00%      99.80%        94.40%
RF RC                  89.10%      87.80%        81.60%
Performance Increase   7.90%       12.00%        12.80%
Table 6: All data vs RC data performance
Figure 9: All Data Vs. RC Data Classification Performance
Table 7 and Figure 10 show the overall increase in classification performance when the full data are used (98.92% on average) compared with the radio signal alone (85.15%).
Pilot      All Data Average    RC Average    Performance Increase
Pilot 1 99.33% 87.00% 12.33%
Pilot 2 99.33% 93.00% 6.33%
Pilot 3 99.67% 97.00% 2.67%
Pilot 4 91.67% 75.00% 16.67%
Pilot 5 99.00% 77.00% 22.00%
Pilot 6 99.67% 94.00% 5.67%
Pilot 7 99.67% 98.00% 1.67%
Pilot 8 99.67% 87.00% 12.67%
Pilot 9 99.67% 83.00% 16.67%
Pilot 10 99.00% 87.00% 12.00%
Pilot 11 99.33% 86.00% 13.33%
Pilot 12 99.00% 84.00% 15.00%
Pilot 13 99.33% 82.00% 17.33%
Pilot 14 99.00% 73.00% 26.00%
Pilot 15 99.00% 83.00% 16.00%
Pilot 16 99.67% 77.00% 22.67%
Pilot 17 98.67% 71.00% 27.67%
Pilot 18 99.67% 87.00% 12.67%
Pilot 19 99.33% 92.00% 7.33%
Pilot 20 98.67% 90.00% 8.67%
Table 7: Pilot performance
Figure 10: Pilot performance
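The per-pilot figures in Table 7 are per-class accuracies. A small sketch of how such values can be derived from cross-validated predictions and a confusion matrix is given below (Python/scikit-learn; the file pilots_dataset.csv and the exact column split are assumptions).

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix

df = pd.read_csv("pilots_dataset.csv")               # hypothetical twenty-pilot dataset
y = df["pilot"]
rc_cols = ["time", "thrust", "roll", "pitch", "yaw"]
all_cols = [c for c in df.columns if c != "pilot"]   # RC signal plus IMU columns

def per_pilot_accuracy(X):
    """Per-class accuracy: correct samples of a pilot divided by all samples of that pilot."""
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    y_pred = cross_val_predict(clf, X, y, cv=5)
    cm = confusion_matrix(y, y_pred)                 # rows follow the sorted class labels
    return cm.diagonal() / cm.sum(axis=1)

acc_all = per_pilot_accuracy(df[all_cols])
acc_rc = per_pilot_accuracy(df[rc_cols])
for pilot, (a, r) in enumerate(zip(acc_all, acc_rc), start=1):
    print(f"Pilot {pilot}: all={a:.2%}  rc={r:.2%}  increase={a - r:.2%}")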
Table 8 shows the accuracy of each feature, where testing all features together scored 99.00%. During the test, it was identified that combining seven selected features (Thrust, AngleYawdeg, Roll, Pitch, Yaw, Heightmm, and TransaccYg) provides 99.30% accuracy. Most of the weight comes from Thrust and AngleYawdeg, which together reach 96.30%. A sketch of this forward-selection procedure is given after Table 8.
Feature Accuracy
All 99.00%
AnglePitchdeg 27.50%
AngleRolldeg 26.90%
AngleYawdeg 88.50%
AngvelPitchdegs 22.50%
AngvelRolldegs 24.50%
AngvelYawdegs 22.60%
TotalaccXg 24.30%
TotalaccYg 26.20%
TotalaccZg 28.00%
TransaccXg 23.90%
TransaccYg 28.30%
TransaccZg 27.10%
Totalaccg 28.90%
Heightmm 38.90%
dHeightmms 31.80%
Pitch 34.00%
Roll 40.00%
Yaw 33.20%
Thrust 76.20%
Thrust and AngleYawdeg 96.30%
Thrust, AngleYawdeg, and Roll 97.70%
Thrust, AngleYawdeg, Roll, and Pitch 98.40%
Thrust, AngleYawdeg, Roll, Pitch, and Yaw 98.80%
Thrust, AngleYawdeg, Roll, Pitch, Yaw, and Heightmm 99.00%
Thrust, AngleYawdeg, Roll, Pitch, Yaw, Heightmm, and dHeightmms 99.20%
Thrust, AngleYawdeg, Roll, Pitch, Yaw, Heightmm, and TransaccYg 99.30%
Table 8: Features accuracy test
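The incremental combinations at the bottom of Table 8 amount to a greedy forward feature selection: start from the strongest single feature and repeatedly add the feature that raises the cross-validated accuracy the most. A minimal Python/scikit-learn sketch of that procedure is given below; the file imu_dataset.csv is an assumption, and the column names are those listed in Table 8.

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

df = pd.read_csv("imu_dataset.csv")                  # hypothetical full IMU + RC dataset
y = df["pilot"]
candidates = [c for c in df.columns if c != "pilot"]

def cv_accuracy(cols):
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    return cross_val_score(clf, df[cols], y, cv=5).mean()

selected = []
while candidates and len(selected) < 7:              # Table 8 stops at a seven-feature set
    # Add the candidate whose inclusion gives the best cross-validated accuracy.
    best = max(candidates, key=lambda c: cv_accuracy(selected + [c]))
    selected.append(best)
    candidates.remove(best)
    print(f"{' + '.join(selected)} -> {cv_accuracy(selected):.3f}")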
AngleYawdeg scored the best single-feature accuracy on the random-flight data, so another test was conducted to confirm that this feature is important. Using the vertical-flight data, this feature alone scored 95.90%, while all features together scored 99.40%.
CHAPTER 5
Conclusion
In summary, this thesis has proposed a machine learning technique for detecting and identifying UAVs in a mobile network. The study examined three classification models: logistic regression, random forest, and artificial neural network. The findings suggest that the proposed drone detection approach is effective for drones flying at sixty meters while achieving a zero false-positive rate; however, detection accuracy decreases as the altitude reduces.
The machine learning approach can also be used to identify discriminative features in raw datasets. One of the methods used in this UAV study is the ensemble of bagged trees, which provided the best accuracy in identifying pilots flying in different flight modes. Nevertheless, the bagged tree in MATLAB requires substantial processing power and time. This study reported accuracy results for the bagged tree under different settings: thrust was identified as the prime feature driving accuracy on the RC-signal dataset, while on the full IMU dataset AngleYawdeg was the prime feature for identifying the pilots.
This research provides a stepping stone for future work. It should inspire future researchers to investigate more sophisticated features to optimize the proposed approach. Additionally, future work should determine what action should be taken once a rogue drone has been located.
Moving Forward
All analysis was conducted in MATLAB, which paved the way for this research. To analyze how the random forest selects each feature from the dataset and assigns a weight to it, I will use Python, which makes it possible to debug each tree-building stage and examine the overall weighting and selection criteria. Once this is completed, I will train a new model with a modified random forest algorithm and compare its performance with the regular random forest algorithm.
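As a starting point for that plan, the sketch below shows how Python's scikit-learn exposes both the forest-level feature importances and the per-tree importances, which is the kind of per-stage weighting information that is awkward to inspect from MATLAB's bagged trees. The dataset file is the same hypothetical CSV as in the earlier sketches; a modified selection rule would still have to be implemented separately.

import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

df = pd.read_csv("imu_dataset.csv")                  # hypothetical dataset
X, y = df.drop(columns="pilot"), df["pilot"]

forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Forest-level importances (mean decrease in impurity), sorted from strongest to weakest.
for name, imp in sorted(zip(X.columns, forest.feature_importances_),
                        key=lambda t: t[1], reverse=True):
    print(f"{name:>15s}: {imp:.3f}")

# Per-tree importances: how much each individual tree relied on each feature.
per_tree = np.array([tree.feature_importances_ for tree in forest.estimators_])
print("importance spread across trees:", per_tree.std(axis=0).round(3))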
Project Time Plan
UAV Pilot Identification Using Machine Learning (plan spanning 26 periods)
Activity (Plan Start, Plan Duration, Actual Start, Actual Duration, Percent Complete)
Related Works 1 5 1 4 70%
Data Collection 1 6 1 6 100%
Data Analysis 2 4 2 5 100%
Main features 4 8 4 6 100%
Random Forest 4 2 4 8 30%
Train the model 4 3 4 6 0%
New Feature Test 5 4 5 3 0%
RF Python simulation 5 2 5 5 0%
Train the model 5 2 5 6 0%
Test 6 5 6 7 0%
Figure 11: Project timeline plan (Gantt chart showing plan duration, actual start, actual duration, and percent complete per activity)