2018 AUVSI SUAS COMPETITION ANADOLU UNIVERSITY SUAS ANADOLU TEAM ZAFER JOURNAL PAPER

ABSTRACT

This document presents the analysis and the system engineering approach followed during the design of the Unmanned Aircraft System (UAS) that we, the SUAS Anadolu Team, named ZAFER. The platform for autonomous flight, ground object detection, air delivery, and avoidance of virtual obstacles was designed by our team, which was formed through the cooperation of undergraduate students of the Anadolu University Faculty of Engineering and the Faculty of Aerospace Sciences. The system and its development process are described in detail under the main sections of this document: (1) system engineering approach, (2) system design, and (3) safety, risks, and mitigation, followed by a conclusion. Following successful simulations, the first test flights show that system stabilization, autonomous flight, image acquisition, secure image transfer, and image processing on the ground monitoring system have been accomplished successfully.

CONTENT

ABSTRACT
1. SYSTEM ENGINEERING APPROACH
   1.1. MISSION REQUIREMENT ANALYSIS
   1.2. DESIGN RATIONALE
2. SYSTEM DESIGN
   2.1. AIRCRAFT
   2.2. AUTOPILOT
   2.3. OBSTACLE AVOIDANCE
   2.4. IMAGING SYSTEM
   2.5. AUTONOMOUS OBJECT DETECTION, CLASSIFICATION AND LOCALIZATION (ADCL): Detection; Classification (Color Detection; Alphanumeric Character Recognition and Shape Orientation; Shape Detection); Localization
   2.6. SEARCH AREA INTERFACE
   2.7. COMMUNICATIONS: Telemetry Link; Imagery Link; RC Link; Interoperability; Summary
   2.8. AIR DELIVERY
   2.9. CYBER SECURITY
   2.10. SUMMARY
3. SAFETY, RISKS, & MITIGATION
   3.1. DEVELOPMENTAL RISKS & MITIGATION
   3.2. MISSION RISKS & MITIGATION
4. CONCLUSION

1. System Engineering Approach

The 2018 AUVSI SUAS Competition requires that all of the indicated mission tasks be performed autonomously within a given time limit. Therefore, Zafer was designed and built to quickly perform an autonomous flight including automatic takeoff and landing, stationary obstacle avoidance, air delivery, searching the area, taking pictures of the search area, and autonomous target detection. The major aim of the SUAS Anadolu Team is to minimize the development and safety risks and to maximize the overall task performance of Zafer using an integrated system engineering approach.

1.1. Mission Requirement Analysis

When the 2018 SUAS Competition period began, the first step was to analyze and understand the mission requirements and then to determine a strategy for the successful execution of each mission. The complete mission is split into score-weighted components, as shown in Table 1.

Timeline (10%)
- Mission analysis: Mission Time (80%): 45 minutes are provided to complete all missions, used in two parts: flight time and post-processing time. Timeout (20%): one timeout can be used to stop the mission clock, either a timeout-in-place (10 minutes) or a timeout-in-pit (20 minutes).
- Strategy: Plan possible worst-case scenarios early. Create checklists for each sub-team. To avoid exceeding the mission time, plan for a mission time of no more than 35 minutes.

Autonomous Flight (30%)
- Mission analysis: Autonomous Flight (AF) (40%): at least 3 minutes of autonomous flight are required to earn AF points, and the team loses 10% of the points for each safety-pilot takeover into manual flight. Waypoint Capture (10%): passing within 100 ft of the given waypoints. Waypoint Accuracy (50%): points are awarded in proportion to proximity to the waypoints.
- Strategy: Prepare the entire mission plan autonomously to remove the human factor as much as possible. Use simulation to mitigate possible crashes and to avoid spending unnecessary time on flight tests.

Obstacle Avoidance (20%)
- Mission analysis: Stationary Obstacle Avoidance (50%): avoid the stationary obstacles, shaped as cylinders with given geometric characteristics. Moving Obstacle Avoidance (50%): avoid the moving obstacles, shaped as spheres with given geometric characteristics and airspeed.
- Strategy: Perform a risk analysis of the moving obstacles over the entire mission and, if necessary, do not attempt this task.

Object Detection, Classification, Localization (ODCL) (20%)
- Mission analysis: Characteristics (20%): identify each object's characteristics, such as shape, shape color, alphanumeric, alphanumeric color, and alphanumeric orientation. Geolocation (30%): accurately provide the GPS location of the objects. Actionable (30%): submit the objects in the given object file format during the first flight without any delay. Autonomy (20%): submit the objects autonomously.
- Strategy: Increase the likelihood of object detection by filtering out soil and grass. Perform object color detection by testing the effect of daylight. Verify locations with 3D projection and orientation, and submit to the judges' server with minimal human involvement.

Air Delivery (10%)
- Mission analysis: The bottle must hit close to the precise point, and the water lock must open upon landing so that the water is dispersed at the given location.
- Strategy: Drop the water bottle from above the minimum altitude; the bottle must open upon landing and the water must be dispersed at the given location.

Operational Excellence (10%)
- Mission analysis: Team performance will be graded subjectively by the judges.

Table 1: Mission Analysis and Determined Strategy for Successful Mission Execution

After the mission task analysis, the second step was to perform a self-assessment within the SUAS Anadolu Team to define which main mission tasks could be attempted with the team's know-how, abilities, and budget (see Table 2).
Main Mission | Attempted Tasks | Importance | Difficulty
Autonomous Flight | All tasks | 5/5 | 2/5
Obstacle Avoidance | All tasks except moving obstacle avoidance | 3/5 | 4/5
Object Detection, Classification, Localization (ODCL) | All tasks except off-axis | 3/5 | 5/5
Air Delivery | All tasks | 2/5 | 2/5

Table 2: Mission Attempts, Importance and Difficulty Analysis

In order to successfully complete the attempted tasks, four sub-teams (structural, autopilot, communication, and imagery) were formed, and the related tasks

were assigned to these sub-teams in a systematic order. In the final step, an optimization process was initiated after ensuring the safety and reliability of the systems developed by the relevant sub-teams.

Figure 1: Flow Chart of the Development Process

1.2. Design Rationale

The SUAS Anadolu Team consists of ten undergraduate students from various departments of the Faculty of Engineering (computer engineering, electrical-electronics engineering, materials science and engineering) and the Faculty of Aerospace Engineering. The team is mostly financed by Anadolu University. However, this budget only partially covers the travel and accommodation expenses for the competition in the US and the construction expenses for a high-cost platform. For this reason, companies were contacted to seek sponsorship; limited support has been received from a few companies, and the search for sponsors is ongoing.

Zafer was designed for the 2018 SUAS Competition to meet the requirements of all tasks that will be attempted, within a limited time and with constrained budget and resources, according to the team's capabilities. To define the priorities of the tasks, a scale from 1 to 5 was used for task significance and difficulty, as shown in Table 2. The most important task was designated as the autonomous flight mission, which is directly related to many sub-tasks, as can be seen in Table 1. Therefore, the first goal was to achieve this mission completely, and for that purpose a flight controller was needed. Unlike the controller used last year, the Pixhawk 2.1 Cube was selected as the autopilot controller due to its enhanced sensors and its capability to better fulfill the requirements of the autonomous flight mission. In addition, the Pixhawk flight controller is readily available in our country and better suited to the team's budget. The Pixhawk 2.1 Cube requires an interface to configure it remotely and to observe the instantaneous state of the aircraft during flight. After some research, Mission Planner was chosen as the autopilot interface, since it is well documented and easy to use. The same hardware can also be used for the air delivery and stationary obstacle avoidance tasks; however, different interfaces will be created and used by the autopilot sub-team for these implementations.

Since several team members change every year and the students have various other responsibilities, there was not enough time to develop the moving obstacle avoidance task; the team concentrated instead on the stationary obstacle avoidance task, interface design, and the other tasks. Consequently, the moving obstacle avoidance task will not be attempted this year.

The image processing task, which has the highest degree of difficulty, will be performed last. A camera is obviously needed for this task. Due to its availability from previous years, a Canon EOS 100D kit was chosen

as the input for this task. This camera is capable of continuous shooting at high resolution (18 MP). It is also supported by the libgphoto2 library, which allows images to be captured remotely and automatically. Ideally, the camera should have improved mobility for off-axis object detection, and a gimbal system was designed to provide this mobility. However, due to the bulky size and weight of the camera, the required gimbal would also have to be relatively large and heavy, so the gimbal was abandoned.

The communication section is important because it is directly related to all other sections and operations: if the communication systems fail, penalties are incurred and all subsequent tasks are affected negatively. The communication sub-team is responsible for managing the telemetry and imagery link traffic between the UAS, the Ground Control Station (GCS), and the judges' server. An XBee module was selected for the telemetry system; it transmits telemetry data from the UAS to the GCS, provides a sufficient data rate and enough range to maintain a healthy link, and proved its reliability in previous years. In addition, communication hardware was needed to transfer the photos acquired by the camera to the ground station. The onboard computer controls the camera, takes pictures, and transmits them to the GCS via a NanoBeam. A Raspberry Pi 3 Model B is used as the onboard computer, and a Ubiquiti NanoBeam is used as the transmitter. The NanoBeam was chosen because it provides the bandwidth needed for reliable image transmission over long physical ranges. The Raspberry Pi offers a good performance/cost ratio, is compatible with a wide array of sensors, and has low power consumption despite its performance.

As a result of the mission requirement analysis and the design rationale for the 2018 AUVSI SUAS Competition, several key points were gathered to decide on an aircraft type and size. It was decided that the payload (consisting of the sub-system hardware, mechanical manifolds, and the drop payload) would weigh approximately 4.5 pounds. In addition, it was calculated that the cruise speed needs to be between 55 and 65 ft/s for optimum image capturing.

2. System Design

2.1. Aircraft

With the necessary hardware, the UAV's total weight reaches about 7 pounds (the payload being about 4.5 pounds), and the UAV should fly at around 55-65 ft/s. To meet the mission requirements, the aerodynamics of the UAV must be determined accurately. Based on research by the structural sub-team, the NACA 4412 airfoil was chosen. For the most favorable value of L/D (lift over drag), the wingspan was set to 7.87 ft and the wing setting angle to 4°. Furthermore, the wings are given enough sweep to bring the root chord to 1 ft 2.9 in and to taper the tip chord, in order to increase the wing loading and the maximum speed of the UAV. The center of mass and the center of lift (determined by the airfoil, setting angle, and sweep angle) coincide at a point within the UAV body, so the tail generates no lift in cruising flight.
Because the tail does not generate any lift, the symmetrical NACA 0009 airfoil is used for it. In previous competitions the team participated with a Trainer 60, a ready-to-fly model chosen because it is easy to obtain, simple to transport, straightforward to take through customs, and for other similar reasons. However, the Trainer 60 is designed for hobby purposes and is not a suitable platform for the increasing demands of competitions like AUVSI SUAS. To obtain a UAV with the specific properties needed to meet the mission requirements, and based on the experience gained in previous years, it was decided to participate in the 2018 SUAS Competition with a UAV designed from scratch by the SUAS Anadolu Team. Table 3 compares our design, Zafer, with the Trainer 60.

TRAINER 60:
- Difficult to modify, since it is a ready-to-fly model.
- Not designed to lift a payload.
- The fuselage is not large enough to hold the payloads, and it has a fixed center of gravity; when the aircraft is loaded, it is hard to restore the center of gravity.

ZAFER:
- The design can be modified as desired to meet the needs of the tasks.
- Designed around the weight of the payload needed for the tasks, so no problems lifting the payload are expected.
- The fuselage is designed large enough to hold the payload, so the center of gravity remains essentially fixed with the desired payloads.

Table 3: Comparison of Trainer 60 and Zafer

Several aircraft configurations were examined according to their advantages and disadvantages. Eventually, the designed UAV was decided to be a pusher model that can be assembled and disassembled easily for transportation. Three prototypes have been manufactured during the design process according to the specifications below.

FLIGHT STATISTICS
Max speed: 72.1 ft/s
Cruise speed: 55.7 ft/s
Stall speed: 45.9 ft/s
Table 4: Flight Statistics

WING STRUCTURE
Airfoil: NACA 4412
Area: 8.5 ft²
Tip chord: 0 ft  in
Root chord: 1 ft 2.9 in
Angle of wing setting: 4°
Table 5: Wing Structure

The first prototype

This prototype was intended to be an easy, convenient, and safe model to produce, following the needs stated in the design rationale. It was designed with an inverted V-tail made of foam board. The fuselage was made of a combination of plywood and birch. The wings were made of a combination of plywood, birch, balsa, and foam board: balsa was used for the ribs, plywood and birch were used as support elements, and foam board was used for the coating. Carbon tubes are used as the main and rear spars of the wings and for the connections between the tail and the wings. During the tests of the first prototype, several problems occurred, leading us to change the aerodynamics of the UAV to avoid tail stretching and loss of aerodynamic control due to sudden thrust changes.

The second prototype

This prototype was designed and constructed according to the results obtained from the tests of the first prototype. The design of the wings was modified for easier production and to lighten the UAV; in the new design, the wings are covered with coating paper instead of foam board. The tail was also changed based on the test results of the first prototype: in the second prototype, an H-tail was designed from scratch instead of the inverted V-tail. The modified tail design maintains the center of gravity of the UAV, and the stretching loads at the tail connection are prevented by the new tail type. Due to the change of coating material in the new tail and wing designs, XPS foam, which was not used in previous designs, is used to ensure adhesion of the coating paper on the control surfaces.

Figure 2: Dimensions of the Second Prototype
Figure 3: Second Prototype Configuration

Legend for Figure 3: 1. Autopilot battery, 2. Steering, 3. Power module, 4. Batteries, 5. Access point, 6. Camera, 7. Airdrop, 8. Motor, 9. Tail boom, 10. Servo, 11. Tail, 12. ESC, 13. Landing gear, 14. Autopilot.
Table 6: Definition of Figure 3

The third prototype

Due to problems with the workshop machines this year, production of the second prototype fell behind the originally specified timeline. At the same time, a new sponsorship made it possible to produce a composite fuselage, which had not been feasible under the existing workshop conditions. The production plan was therefore changed, and prototypes 2 and 3 are being produced in parallel. The final model will be selected according to flight performance, aerodynamics, and structural strength. The third prototype's fuselage is shaped as an airfoil, which is intended to create extra lift and reduce drag. Importantly, the fuselage is designed to be compatible with the wing and tail structure of the second prototype, so that parts can be interchanged when necessary.

Figure 4: Dimensions of the Third Prototype
Figure 5: Third Prototype Configuration

Legend: 1. Propulsion system, 2. ESC, 3. Autopilot, 4. Camera, 5. Main batteries, 6. Sub-system battery, 7. Airspeed sensor, 8. Steering, 9. Access point, 10. Telemetry module, 11. Air delivery system, 12. Back landing gear.
Table 7: Definition of Figure 4

The OMA OS motor, OCA-1100HV 100 A ESC, 16x10 APC propeller, and 6000 mAh 6S LiPo battery combination was selected for this model based on price, weight, and performance. The motor is mounted at the rear so that it does not interfere with the antenna's working area. The wingspan of the aircraft is 7.87 ft, and the design weight is 15.4 lbs with a high carrying capacity. The airdrop system has been designed to fit entirely inside the fuselage, since the fuselage is wide enough. The UAV was also designed to allow a gimbal to be incorporated to satisfy the requirements of the competition. The overall specifications are listed in Table 8.

TECHNICAL SPECIFICATIONS
Wingspan: 7.87 ft | Max safe wing loading: 30.31 lbs
Length: 5.51 ft | Aspect ratio: 7.27
Height: 1.8 ft | Fuselage volume: 8.05 ft³
Power: 1.29 kW / 1.73 HP | Max takeoff weight: 18 lbs
Last flight weight: 16.55 lbs | Drop system weight: 1.65 lbs
Payload capacity: 5.51 lbs
Table 8: Technical Specifications of the Aircraft

INTERNAL COMBUSTION ENGINE:
- Lighter than an electric motor and its systems.
- As fuel runs low, the center of gravity shifts.
- Flight time is more expensive.
- Causes vibration during flight, which affects cable connections in the payload.
- A fuel engine is harmful to the environment.

ELECTRIC MOTOR:
- Heavier than an internal combustion engine.
- Fixed center of gravity.
- Flight time is less expensive.
- Vibration is low enough not to affect cable connections in the payload.
- Not harmful to the environment.

Table 9: Comparison of Internal Combustion Engine and Electric Motor
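As a quick consistency check of the geometry figures quoted above, the short sketch below recomputes the aspect ratio from the wingspan in Table 8 and the wing area in Table 5, assuming the standard definition AR = b²/S. The script, its variable names, and the W/S wing-loading figure it prints are illustrative only and are not part of the team's design tools.

```python
# Quick consistency check of the airframe geometry quoted in Tables 5 and 8.
# Assumes the standard definitions AR = b^2 / S and wing loading = W / S.

wingspan_ft = 7.87            # b, Table 8
wing_area_ft2 = 8.5           # S, Table 5
max_takeoff_weight_lb = 18.0  # Table 8

aspect_ratio = wingspan_ft ** 2 / wing_area_ft2       # ~7.29, close to the 7.27 listed
wing_loading = max_takeoff_weight_lb / wing_area_ft2  # W/S at max takeoff weight

print(f"Aspect ratio: {aspect_ratio:.2f}")
print(f"Wing loading at MTOW: {wing_loading:.2f} lb/ft^2")
```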

2.2. Autopilot

An APM unit was initially considered two years ago by previous team members, but it is a discontinued board: its final firmware has already been published and there will be no further updates for fixed wings. Pixhawk products will therefore be used from now on, and the Pixhawk 2.1 Cube was eventually chosen as the UAS's autopilot. It provides capabilities such as fully autonomous flight, autonomous takeoff and landing, and failsafe modes, and it carries three IMUs for extra redundancy and a GPS accuracy of about three meters. An important advantage of the Pixhawk ecosystem is that it is open source, which allows changes to be made in the source code, and its powerful processor is expected to make operations easier.

Since the autopilot is distributed as open-source code, firmware is needed to carry out the missions; ArduPilot was chosen because it is very popular, stable, and has a well-documented library. At the ground control station, Mission Planner is used as the interface for ArduPilot. All required flight data, such as aircraft speed, altitude, and yaw, are shown on the display, and parameters and the flight plan are entered easily in Mission Planner. The Mission Planner interface is shown in Figure 7.

Figure 6: Autopilot System (color legend: autopilot sensors, control surfaces, manual flight modules, telemetry communication modules, autopilot, ground station communication, ground station interoperability)

To cover the additional competition tasks, an additional interface is planned that removes the human factor and generates a fully autonomous mission plan. This interface will take specific inputs, and its output will be a mission plan consisting of the required waypoints; this output will be uploaded to the UAS through Mission Planner. One possible form of this output is sketched below.

Figure 7: Mission Planner Flight Control Interface
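The sketch below illustrates one possible shape for that output: a plain-text waypoint list in the "QGC WPL 110" format that Mission Planner can load. The coordinates, file name, and helper function are illustrative assumptions, not values or code from the team's actual interface.

```python
# Illustrative sketch: generate a Mission Planner-loadable waypoint file
# ("QGC WPL 110" plain-text format) from a list of waypoints, so the mission
# plan can be produced without manual entry. Coordinates below are made up.

MAV_CMD_NAV_TAKEOFF = 22
MAV_CMD_NAV_WAYPOINT = 16
FRAME_GLOBAL_RELATIVE_ALT = 3  # altitude relative to home

def write_mission(path, home, waypoints, takeoff_alt_m=30.0):
    """home = (lat, lon, alt_m); waypoints = [(lat, lon, alt_m), ...]."""
    rows = []
    # Item 0 is conventionally the home position (absolute frame).
    rows.append((0, 1, 0, MAV_CMD_NAV_WAYPOINT, 0, 0, 0, 0, *home, 1))
    rows.append((1, 0, FRAME_GLOBAL_RELATIVE_ALT, MAV_CMD_NAV_TAKEOFF,
                 0, 0, 0, 0, home[0], home[1], takeoff_alt_m, 1))
    for i, (lat, lon, alt) in enumerate(waypoints, start=2):
        rows.append((i, 0, FRAME_GLOBAL_RELATIVE_ALT, MAV_CMD_NAV_WAYPOINT,
                     0, 0, 0, 0, lat, lon, alt, 1))
    with open(path, "w") as f:
        f.write("QGC WPL 110\n")
        for row in rows:
            f.write("\t".join(str(v) for v in row) + "\n")

if __name__ == "__main__":
    write_mission("mission.waypoints",
                  home=(38.750, -76.420, 0.0),
                  waypoints=[(38.751, -76.421, 45.0), (38.752, -76.423, 45.0)])
```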

2.3. Obstacle Avoidance

Obstacle avoidance starts by detecting obstacles on the flight route and continues by creating a new path that does not intersect the obstacles. To sense a collision candidate between the waypoint legs and the obstacles, the altitudes are checked first: if there is no overlap in altitude, the route is not altered. If an intersection between the route and an obstacle is foreseen, the avoidance process begins. The avoidance process consists of an algorithm designed by the team itself, which determines the positions of the obstacles over the mission area and creates an efficient alternative path (Figure 8: Projection of Obstacles). This process is repeated for each obstacle until there is no overlap. If the alternative route generation produces a previously generated but invalid route, a random perturbation is applied to avoid deadlock. A minimal sketch of this check is given at the end of this subsection.

Due to technical problems experienced during the production phases, a sufficient number of flight tests could not be performed on schedule. Therefore, it is planned to use the SITL simulator for obstacle avoidance and autonomous flight: SITL allows realistic simulated flights and runs ArduPilot directly on a PC. The generated algorithms are being tested in this simulator for both stationary and moving obstacles. Nevertheless, for an actual attempt at obstacle avoidance during the competition, airborne tests must be completed.

For moving obstacle avoidance a similar strategy will be adopted: the avoidance task will run continuously for the moving obstacles, and the checks will be updated instantaneously during flight. Since this part is still in development, a risk analysis will be carried out based on the system tests, and whether it is actually attempted at the competition will be decided according to the results.
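The sketch below is a minimal illustration of the stationary-obstacle check described above: each waypoint leg is tested for altitude overlap and horizontal intersection against a cylindrical obstacle, and an intersecting leg can be detoured by inserting a laterally offset waypoint, with a random jitter available to escape repeated invalid routes. The geometry is expressed in a local flat-earth frame, and all names, margins, and the detour rule are illustrative assumptions rather than the team's actual implementation.

```python
# Illustrative sketch of a stationary-obstacle check in a local x/y (meters) frame.
# Obstacles are cylinders (center, radius, height); margins and the simple
# sideways detour are placeholder choices, not the team's tuned algorithm.

import math
import random

def leg_hits_cylinder(p1, p2, obstacle, margin=10.0):
    """p1, p2 = (x, y, alt); obstacle = dict with keys x, y, radius, height."""
    # Altitude check first: if the whole leg stays above the cylinder, no conflict.
    if min(p1[2], p2[2]) > obstacle["height"] + margin:
        return False
    # Horizontal check: distance from the obstacle center to the leg segment.
    ax, ay, bx, by = p1[0], p1[1], p2[0], p2[1]
    cx, cy = obstacle["x"], obstacle["y"]
    abx, aby = bx - ax, by - ay
    denom = abx * abx + aby * aby
    t = 0.0 if denom == 0 else max(0.0, min(1.0, ((cx - ax) * abx + (cy - ay) * aby) / denom))
    dist = math.hypot(ax + t * abx - cx, ay + t * aby - cy)
    return dist < obstacle["radius"] + margin

def detour_waypoint(p1, p2, obstacle, margin=10.0, jitter=0.0):
    """Return a waypoint placed beside the obstacle, offset perpendicular to the leg."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    norm = math.hypot(dx, dy) or 1.0
    px, py = -dy / norm, dx / norm  # unit vector perpendicular to the leg
    # Offset sideways from the obstacle center by its radius plus a safety margin;
    # a small random jitter can be added to escape repeated invalid routes.
    offset = obstacle["radius"] + margin + random.uniform(0.0, jitter)
    alt = (p1[2] + p2[2]) / 2.0
    return (obstacle["x"] + px * offset, obstacle["y"] + py * offset, alt)
```

In the spirit of the iteration described above, a leg that fails leg_hits_cylinder would be split at the returned detour waypoint and the new legs re-checked against all obstacles until no overlap remains.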

2.4. Imaging System

The imaging system is designed to reduce the human factor when performing the ODCL mission: it captures images from the air and transfers them to the ground. A layout of the imaging system is shown in Figure 9.

Figure 9: Imagery System

The camera choice is the same as in previous years, a Canon EOS 100D kit. The combination of lens view angle, acquisition period, and UAS speed is calculated so that the search area is traced efficiently. Setting the shutter speed high is critical during flight, because motion blur is not desired. Moreover, since the UAS flies at high altitude, the camera must be capable of continuous shooting at high resolution. The current camera has experimentally satisfied these requirements. It also supports the libgphoto2 library, which enables remote and automatic control of image capture.

The onboard computer is a Raspberry Pi 3, which fetches sensor data, controls the camera, captures images, and transmits these data to the Ground Control Station (GCS). The computing power of the Raspberry Pi is sufficient for these tasks, which makes it a plausible choice; furthermore, it is lightweight and consumes little power. Knowing the coordinates and orientation of the UAS is important for registering the positions of the acquired images, because the detected targets need these tags. An accurate GPS is used to obtain the coordinates of the search-area targets, and a compass is used to obtain the heading angle of the UAS precisely. These sensors are connected to the onboard computer.

As the UAS moves away from the GCS, the image transmission speed decreases and the connection weakens. The signal quality of the link depends on four main factors: transmit power, receiver sensitivity, antenna gains, and path loss. The link margin, calculated from these factors, is an indicator of the signal quality. The highest link margin was obtained with the NanoBeams, considering the path loss calculated for the estimated farthest point of the search area and the signal strength characteristics of the access points on the market. For that reason, the NanoBeam is preferred for communication with the GCS.

In summary, each component works synchronously: the camera captures images and transfers them to the onboard computer; the onboard computer fetches the data from the sensors and merges them with the captured images to create a packet; and finally these packets are transmitted to the GCS.

2.5. Autonomous Object Detection, Classification and Localization (ADCL)

At this stage, the image processing algorithm runs at the ground station. It autonomously takes the transferred images and processes them sequentially. A plain and simple approach is adopted for object detection: with the help of OpenCV library functions (such as 2D random field analysis and salient-point density), the outliers are detected automatically. The detected standard targets are then classified automatically in the user interface at the ground control station using edges, shape libraries, and OCR tools. Lastly, the image crop corresponding to the object located on the ground is sent directly to the interoperability interface without any human assistance.

Detection

Thanks to the improved communication system, which provides a wider bandwidth, full-resolution images are used for object detection this year. Regular pre-processing steps are applied to these images for de-noising and conversion into the HSV color space. After the conversion, possible candidates are extracted using the MSER blob detection algorithm, which essentially groups similar pixels. The candidates are then filtered by size and aspect ratio, and candidates that pass this step continue to further processing, as sketched below. In Figure 10, three detection results are given as examples. The detection crops are then sent for further processing according to color, shape, and letter content.

Figure 10: Object Detection
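As an illustration of the detection stage just described, the sketch below applies denoising, HSV conversion, and MSER region extraction with OpenCV, then filters the candidates by size and aspect ratio. The thresholds, function structure, and file names are illustrative assumptions, not the team's tuned parameters.

```python
# Illustrative sketch of the detection stage: denoise, convert to HSV, extract
# MSER regions, and keep candidates with plausible size and aspect ratio.
# Thresholds are placeholders, not the team's tuned values.

import cv2

def detect_candidates(image_bgr, min_area=200, max_area=20000, max_aspect=3.0):
    blurred = cv2.GaussianBlur(image_bgr, (5, 5), 0)    # basic de-noising
    hsv = cv2.cvtColor(blurred, cv2.COLOR_BGR2HSV)      # HSV color space
    mser = cv2.MSER_create()
    regions, _ = mser.detectRegions(hsv[:, :, 2])       # group similar pixels (value channel)

    crops = []
    for region in regions:
        x, y, w, h = cv2.boundingRect(region)
        area = w * h
        aspect = max(w, h) / max(1, min(w, h))
        if min_area <= area <= max_area and aspect <= max_aspect:
            crops.append(image_bgr[y:y + h, x:x + w].copy())  # candidate crop for classification
    return crops

# Example: crops = detect_candidates(cv2.imread("frame_0001.jpg"))
```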
Classification

Color Detection

Color detection includes a background elimination process that uses a trained dataset of possible background objects (grass, soil, road, etc.). After background elimination, color detection is performed on the remaining blobs simply by using histograms of HSV values. This step produces two binary images, one for the alphanumeric character and one for the shape of the target.

Alphanumeric Character Recognition and Shape Orientation

To detect and recognize the alphanumeric character, a simple Optical Character Recognition (OCR) algorithm is applied at various rotations. If a character is recognized at a certain rotation, the character value and the compass-corrected shape orientation are recorded. The algorithm works as follows: first, HOG (Histogram of Oriented Gradients) features of the characters are extracted from a dataset containing images of each character at all 8 rotations. These features are then classified with the popular k-Nearest Neighbor (k-NN) classifier, which yields both the character and the shape orientation. The orientation of the shape, corrected according to the yaw angle of the UAS (measured from north), is recorded, and the alphanumeric character and the shape orientation angle are finally written to the JSON file. A minimal sketch of this classifier is given below.

Figure 11: Color Detection
Figure 12: Object Orientation
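The sketch below illustrates the HOG plus k-NN classification described above, using OpenCV's HOGDescriptor and scikit-learn's KNeighborsClassifier. The descriptor geometry, the value of k, the label encoding, and the training-data handling are illustrative assumptions, not the team's actual code.

```python
# Illustrative HOG + k-NN sketch for the alphanumeric/orientation classifier.
# The descriptor geometry, k value, and label encoding are placeholders.

import cv2
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# 64x64 window, 16x16 blocks, 8x8 stride, 8x8 cells, 9 orientation bins (assumed geometry).
hog = cv2.HOGDescriptor((64, 64), (16, 16), (8, 8), (8, 8), 9)

def hog_features(binary_char_img):
    """Expects an 8-bit single-channel crop of the character."""
    resized = cv2.resize(binary_char_img, (64, 64))
    return hog.compute(resized).flatten()

def train_classifier(train_images, train_labels, k=3):
    """train_labels encode character and rotation, e.g. 'A_090'."""
    features = np.array([hog_features(img) for img in train_images])
    clf = KNeighborsClassifier(n_neighbors=k)
    clf.fit(features, train_labels)
    return clf

def classify(clf, binary_char_img, uas_yaw_deg):
    label = clf.predict([hog_features(binary_char_img)])[0]
    char, rotation = label.split("_")
    # Compass-corrected orientation: rotation seen in the image plus UAS heading.
    orientation_deg = (float(rotation) + uas_yaw_deg) % 360.0
    return char, orientation_deg
```

The same feature pipeline carries over to the shape stage described next, with the k-NN classifier swapped for an SVM.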

Shape Detection

Using an approach similar to character recognition, the team has created a dataset of shape images to train and test the shape detection algorithm. The classifier is trained on the HOG features of the images with an SVM classifier, similar to the alphanumeric character stage. The shapes extracted by color detection and thresholding are pre-processed with morphological operations before their feature characteristics are calculated. The HOG features are then compared with the trained data, and the best match is accepted as the true shape. A sample pre-processing stage for a circular shape is illustrated in Figure 13.

Figure 13: Shape Detection

Localization

In the localization step, the pixel-per-meter value is first calculated from the focal length and the camera lens angles (Figure 14: Lens Angle Effect). The image back-projection algorithm is then applied using the roll and pitch angles. The projected image location on the earth's surface is calculated from the recorded UAS position and the pixel-per-meter value, and the calculated localization parameters are written to the JSON file.

2.6. Search Area Interface

The graphical user interface (GUI) for the search-area mission results is currently being developed in C++ with the Qt API, as shown in Figure 15. This GUI can display the images and the target results for both automatic and manual detections. Additionally, it has a description box for the detected target, and the JSON tags are also displayed in the GUI.

Figure 15: Search Area GUI

2.7. Communications

Telemetry Link

A link is established between the UAS and the GCS to transmit telemetry data. A 900 MHz XBee is used this year, since it allows long-range communication without disconnections and has proven its reliability and performance in previous years. The XBee module has high transmission power and receiver sensitivity, and it is efficient thanks to its low power consumption.

Imagery Link

The aim of the imagery link is to provide robust transfer of the captured images from the UAS to the GCS. To achieve this, Ubiquiti NanoBeams (located at both the UAS and the GCS) are used. They operate at 5.8 GHz and provide reliable network packet delivery over TCP, as illustrated in the sketch below.
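As a concrete illustration of the packet described in Sections 2.4 and 2.7.2, the sketch below bundles a captured image with its position and heading tags and pushes it to the GCS over a TCP socket. The GCS address, port, framing, and metadata field names are illustrative assumptions, not the team's actual protocol.

```python
# Illustrative onboard-side sketch (not the team's actual protocol): bundle a
# captured image with its position/heading tags and push it to the GCS over TCP.
# The GCS address, port, and metadata field names are assumptions.

import json
import socket
import struct

GCS_ADDR = ("192.168.1.10", 5001)  # assumed GCS IP/port on the NanoBeam link

def send_image_packet(image_path, lat, lon, alt_m, yaw_deg):
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    meta = json.dumps({
        "image": image_path,
        "lat": lat, "lon": lon, "alt_m": alt_m, "yaw_deg": yaw_deg,
    }).encode("utf-8")
    with socket.create_connection(GCS_ADDR, timeout=10) as sock:
        # Length-prefixed framing: 4-byte metadata length, metadata,
        # 4-byte image length, image. TCP provides ordered, reliable delivery.
        sock.sendall(struct.pack("!I", len(meta)) + meta)
        sock.sendall(struct.pack("!I", len(image_bytes)) + image_bytes)

# Example call with made-up values:
# send_image_packet("img_0001.jpg", 38.7512, -76.4221, 65.0, 143.0)
```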

RC Link

In the case of manual flight, the safety pilot can control the UAS with a Futaba 14SG 2.4 GHz transmitter and a Futaba R6014FS receiver. These devices use the standard 2.4 GHz band, and a secure pairing is established prior to the start of the flight.

Interoperability

The interoperability interface communicates with the interoperability server to provide control. It collects all data from the GCS and is the component privileged to send data to the interoperability server. It also downloads the obstacle data from the interoperability server and forwards it to the GCS. The interoperability interface communicates with the server using HTTP over TCP, and it works with the ground control interfaces, such as the ODCL interface and the flight control interface. Flight data is streamed over UDP from the flight control interface to the interoperability interface.

Summary

Figure 16 illustrates the communication links, their corresponding frequencies, and their connections.

Figure 16: Communication System with Links

2.8. Air Delivery

A major priority when designing the air delivery system was to create the simplest possible system that satisfies the requirements. Based on last year's experience and the change in the rules, the payload bottle must burst on impact and leave water traces on the ground. In tests without any additional mechanism, the bottle bursts when dropped from a height of 30 feet onto an asphalt road or from 50 feet onto grass. In addition to the obstacle avoidance algorithm, an algorithm has been constructed that takes into account the wind speed and the cruise speed of the aircraft at the moment of release, in order to increase the accuracy of the payload's impact point, as sketched below.

Figure 17: Air Delivery System
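As an illustration of that release-point correction, the sketch below uses a simple ballistic model: the bottle falls freely from the release altitude while being carried forward by the aircraft's ground-speed vector and drifted by the wind, with drag neglected. It is a simplified model for illustration only, and the function, its names, and the example values are assumptions rather than the team's exact algorithm.

```python
# Simplified release-point sketch: neglecting drag, the bottle falls for
# t = sqrt(2h/g) seconds while moving with the aircraft's ground-speed vector
# plus the wind vector, so it should be released that displacement before the target.
# This is an illustrative model, not the team's exact algorithm.

import math

G_FT_S2 = 32.174  # gravitational acceleration, ft/s^2

def release_offset_ft(altitude_ft, ground_speed_ft_s, heading_deg,
                      wind_speed_ft_s, wind_from_deg):
    """Return (north_ft, east_ft): how far before the target to release."""
    t_fall = math.sqrt(2.0 * altitude_ft / G_FT_S2)

    # Aircraft velocity in a north/east frame.
    vn = ground_speed_ft_s * math.cos(math.radians(heading_deg))
    ve = ground_speed_ft_s * math.sin(math.radians(heading_deg))

    # Wind blows toward (wind_from + 180 deg); it drifts the falling bottle.
    wn = -wind_speed_ft_s * math.cos(math.radians(wind_from_deg))
    we = -wind_speed_ft_s * math.sin(math.radians(wind_from_deg))

    # Total horizontal travel of the bottle during the fall.
    return ((vn + wn) * t_fall, (ve + we) * t_fall)

# Example: release from 100 ft at 60 ft/s heading north, with a 10 ft/s wind from the west.
# print(release_offset_ft(100.0, 60.0, 0.0, 10.0, 270.0))
```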

2.9. Cyber Security

Security is a concern for every communicating device, and all links should be protected against malicious takeover attacks. Security is addressed separately for the telemetry, imagery, and RC links and for interoperability. Table 10 lists each link and the mitigations of the threats related to it.

Telemetry Link: The XBee module has an on-board AES encryption option, and it is enabled. AES was chosen because it can easily be implemented in both hardware and software; XCTU allows the built-in AES encryption to be configured on both modules.

Imagery Link: To protect the imagery link, the two NanoBeams are bound to each other by their MAC addresses, so they communicate over a dedicated path. The link is also protected by Wi-Fi Protected Access 2 (WPA2) with AES; WPA2 uses the CCMP encryption method, which is based on AES.

RC Link: By default, the RC transmitter has a unique and permanent ID code that defines its identity and ensures reliable RC communication. Once linked with the receiver, the transmitter responds only to that receiver. The RC system also supports FHSS, which hops between frequencies in an unpredictable way during communication.

Interoperability: To secure the network from external cyber-attacks, all communication between computers except the imagery link is carried over Ethernet cables, and Wi-Fi communication is disabled. Firewalls are activated on all interfaces.

Table 10: Cyber security mitigations according to the links

2.10. Summary

Figure 18 illustrates the design schema of the UAS and the GCS.

Figure 18: Design schema of the UAS and the GCS

3. Safety, Risks, & Mitigation

3.1. Developmental Risks & Mitigation

Table 11 describes possible developmental risks and the corresponding mitigation plans, together with their occurrence probabilities and short-term impact levels.

Production accidents: tools such as the laser cutter, hot glue gun, and fretsaw, or inhalation of carbon particles, may damage the user's skin or lungs. (Probability: High, Impact: Low)
Mitigation: A compressor, protective masks, and gloves should be used to prevent injury.

Budget insufficiency: because material prices are high, the budget may restrict the team. (Probability: Low, Impact: High)
Mitigation: To support the team budget, sponsorship agreements should be made with willing companies.

Material purchases: the purchasing process is one of the most important issues and may disrupt the entire schedule; a lack of material can bring UAV production to a standstill. (High)
Mitigation: To accelerate purchasing, specifications and decisions should be made early. An urgency team is designated to proceed with alternative tools.

Production cannot meet design expectations: the performance of the UAV may stay below expectations due to inaccuracies in the design or faults in production. (High)
Mitigation: Starting design and production early helps prevent these problems; faults can be predicted through detailed work supported by simulations. Our team is constructing two designs simultaneously.

Unexpected production issues: breakdown of equipment and electronic devices adversely affects the production process; for instance, failure of the laser cutting device stops production for a while.
Mitigation: Alternative production routes should be sought so that time is not wasted if any device breaks down. Our team is constructing two designs simultaneously.

Crashes in tests: accidents can occur during the tests and can cause loss of material or even loss of aircraft. (High)
Mitigation: All tests should be planned as well as possible; checklists should be created, and spare parts and spare production should be provided for possible accidents.

Table 11: Developmental risks & mitigation

3.2. Mission Risks & Mitigation

Table 12 describes possible mission risks, their occurrence probabilities, their short-term impacts, and the corresponding mitigation plans.

Loss of the telemetry link (Probability: Low): The safety pilot takes control using the safety switch.

Loss of the RC link (Probability: Low): When the RC link is lost, the autopilot takes control and the same procedure runs.

Loss of the imagery link (Probability: Low): The images are retrieved from the onboard computer after the UAS lands.

Air delivery failure (Probability: Low, Impact: Low): None; the team tries to accomplish the rest of the missions.

Structural problems (Probability: Low): Before the flight, the functionality of the system is verified and visible defects are checked.

Hardware error: If an error of the onboard computer is encountered, the onboard computer is replaced with a new one. If an error occurs in the autopilot hardware, the safety pilot takes control.

LiPo battery failure (Probability: Low): During pre-flight checks, the safety of the batteries and their charge levels are verified.

Obstacle avoidance failure (Probability: Low): Before the flight, the obstacle avoidance system is checked by the GCS. If any problem occurs during the flight, the GCS operator can intervene between the UAS and the autopilot.

Human error and injuries (Probability: Low, Impact: High): Every process is controlled with checklists; team members are informed about the safety rules and how to use the checklists. A first aid kit is ready at every flight, and equipment is checked pre-flight to avoid injuries.

Fire (Probability: Low, Impact: High): All wired connections are checked before the flight to avoid short circuits and fire. A fire extinguisher is ready for the fire risk at every flight.

Table 12: The mission risks & mitigation plans

4. Conclusion

Over the past year, the SUAS Anadolu Team has created a set of new aircraft designed by the team members to meet the needs of the AUVSI SUAS 2018 competition. Building on the experience of previous years, image processing, obstacle avoidance, and autonomous image transfer are being improved continuously. A long and exhaustive testing process has been carried out to ensure that the designed system is safe and operational. At this stage, the team aims to complete the majority of the competition tasks successfully.