CONTROL SYSTEM DEVELOPMENT FOR SMALL UAV GIMBAL

A Thesis Presented to the Faculty of California Polytechnic State University, San Luis Obispo

In Partial Fulfillment of the Requirements for the Degree Master of Science in Aerospace Engineering

by Nicholas J. Brake August 2012

 

© 2012 Nicholas J. Brake ALL RIGHTS RESERVED


COMMITTEE MEMBERSHIP

TITLE:

Control System Development for Small UAV Gimbal

AUTHOR:

Nicholas J. Brake

DATE SUBMITTED:

August 2012

COMMITTEE CHAIR:

Eric A. Mehiel, Ph.D., Associate Professor, Aerospace Engineering Department

COMMITTEE MEMBER:

Daniel J. Biezad, Ph.D., Professor, Aerospace Engineering Department

COMMITTEE MEMBER:

Rob A. McDonald, Ph.D., Associate Professor, Aerospace Engineering Department

COMMITTEE MEMBER:

Alexander Bogdanov, Ph.D., AME Unmanned Air Systems


ABSTRACT

Control System Development for Small UAV Gimbal

Nicholas J. Brake

The design process for unmanned ISR systems has typically been driven in the direction of increasing system mass to increase stabilization performance and imagery quality. However, through the use of new sensor and processor technology, high performance stabilization feedback is now available for control on new small, low mass stabilized platforms that can be placed on small UAVs. This project develops and implements a LOS stabilization controller design, typically seen on larger gimbals, onto a new small stabilized gimbal, the Tigereye, and demonstrates the application on several small UAV aircraft. The Tigereye gimbal is a new 2 lb, 2-axis gimbal intended to provide high performance closed loop LOS stabilization through the use of inertial rate gyros, electronic video stabilization, and host platform state information. Ground and flight test results of the LOS stabilization controller on the Tigereye gimbal have shown stabilization performance improvements over legacy systems. However, system characteristics identified in testing still limit stabilization performance; these include host system vibration, gimbal joint friction and backlash, joint actuation compliance, payload CG asymmetry, and gyro noise and drift. The control system design has been highly modularized in anticipation of future algorithm and hardware upgrades to address the remaining issues and extend the system's capabilities.



ACKNOWLEDGEMENTS

I would like to thank my friends and colleagues who have helped me along the way in completing my thesis; without these people this project would not have been possible. I'd like to thank my advisor, Dr. Mehiel, for his advice and support in focusing my efforts over the years on what I was trying to solve. I'd like to thank my friends and colleagues at AeroMech Engineering Inc. for providing the resources and drive to develop a new gimbal and for giving me the opportunity to develop the control system. To my parents, for their support and encouragement even when they were tired of hearing about my thesis. I'd also like to thank my loving wife Krystal, who has supported and driven me to finish.


Table of Contents

List of Tables
List of Figures
1  Introduction
   1.1  Topic area
   1.2  General problem
   1.3  Project statement & goals
   1.4  Thesis layout
2  Background Information
   2.1  Line of Sight Stabilization
        2.1.1  Dampening Vs. Stabilization
        2.1.2  Active Vs. Passive
   2.2  Airborne stabilized platforms
        2.2.1  UAV system integration
   2.3  Tigereye Design Overview
        2.3.1  Control System Goals
        2.3.2  Operating environment
        2.3.3  Electro-Mechanical Overview
        2.3.4  Camera Mechanical Design
        2.3.5  Sensors
   2.4  Coordinate systems
   2.5  Dynamics model
        2.5.1  Kinematic constraints
        2.5.2  Ideal LOS Definition
        2.5.3  Equations of motion
   2.6  Control Architecture Review
3  Simulation Development
   3.1  Equations of motion mechanization
   3.2  Mass & Inertia
   3.3  Friction
   3.4  Drive System
        3.4.1  Actuators
        3.4.2  Pan Drivetrain
        3.4.3  Tilt Drivetrain
   3.5  Sensors
        3.5.1  Inertial – MEMS Gyros
        3.5.2  Absolute – Magnetic Encoder
4  Control Development
   4.1  Overview
   4.2  Requirements
   4.3  Primary inner/outer loop
        4.3.1  Inner loop
        4.3.2  Outer loop
        4.3.3  PID detail
        4.3.4  No-Go position limit functions
   4.4  Gimbal navigation
        4.4.1  Euler Lock
        4.4.2  GPS Lock
        4.4.3  Visual Target Tracking
   4.5  Sensor & Actuator Processing
5  Implementation and Test
   5.1  Hardware & Software development
        5.1.1  Development environment
        5.1.2  Ground test software
        5.1.3  Key issues
   5.2  Ground Testing & Calibration
        5.2.1  Alignment
        5.2.2  Thermal Calibration
        5.2.3  Motion table
   5.3  Flight testing
        5.3.1  Manned
        5.3.2  Unmanned
6  Results
   6.1  Ground Test Disturbance Rejection
   6.2  Flight Test
        6.2.1  Inertial dampening
        6.2.2  GPS lock
        6.2.3  Target tracking
7  Summary
   7.1  Conclusions
   7.2  Future Work
8  Bibliography


List of Tables

Table 2-1  Primary EO/IR camera payloads
Table 2-2  Relevant External Coordinate Systems
Table 2-3  Coordinate systems
Table 3-1  Mass model assumptions
Table 4-1  Gimbal Control system modes overview
Table 5-1  Alignment fixture dimensions and tolerances
Table 5-2  Alignment accuracy w/ perfect alignment to center FOV
Table 5-3  Alignment accuracy w/ center FOV tolerance
Table 6-1  Mission stabilization performance estimate
Table 6-2  Raw GPS Lock CEP
Table 6-3  Bias corrected GPS Lock CEP


List of Figures

Figure 2-1  Definition of sensor Line of Sight
Figure 2-2  LOS stabilization reference frames
Figure 2-3  Key mechanical sub-assemblies of an airborne gimbal
Figure 2-4  Classes of airborne stabilized gimbals
Figure 2-5  Example gimbal UAS integration
Figure 2-6  Design process for Tigereye
Figure 2-7  Tigereye dual imager 4-view & picture
Figure 2-8  Histogram of total vehicle body rate sampled @ 10Hz
Figure 2-9  Key mechanical sub-assemblies of an airborne gimbal
Figure 2-10  External gimbal reference frames
Figure 2-11  Gimbal Coordinate systems
Figure 2-12  Gimbal free body diagrams
Figure 2-13  Example of Direct stabilization system architecture
Figure 2-14  GIT 3-axis gimbal on GTmax helicopter
Figure 2-15  Adaptive control for a two axis gimbal - Experimental Setup
Figure 3-1  Gimbal EOM Mechanization
Figure 3-2  Mass distribution
Figure 3-3  CAD inertia tensor calculation equations
Figure 3-4  Friction model
Figure 3-5  Motor Steady State Characteristics (Vin = 12V)
Figure 3-6  Pan Bearing Comparison - 10deg Position Step Response
Figure 3-7  Tilt driven pulley
Figure 3-8  Sensor Sub-System
Figure 3-9  MEMS Gyro characteristics [12], [11]
Figure 3-10  Single-axis gyro model, overview
Figure 3-11  Gyro dynamics, detail
Figure 3-12  Absolute position encoder diagram
Figure 3-13  Error sources for hall-effect encoder
Figure 4-1  Control system overview
Figure 4-2  Primary Inner/Outer controller overview
Figure 4-3  Inner loop detail
Figure 4-4  Outer loop detail
Figure 4-5  PID implementation in Simulink
Figure 4-6  Joint No-Go error functions plot
Figure 4-7  GPS lock block diagram
Figure 4-8  Sensor processing subsystem
Figure 4-9  Deadzone soft inverse comparisons
Figure 4-10  Deadzone inverse implementations (Hard vs. Soft)
Figure 5-1  Electronics block diagram
Figure 5-2  Software block diagram
Figure 5-3  Example dataset from MEMS gyros @ 1KHz sample rate
Figure 5-4  Gimbal bench test software
Figure 5-5  Desktop development kit
Figure 5-6  Alignment fixture conceptual diagram
Figure 5-7  Temperature control chamber
Figure 5-8  Gyro calibration command profile
Figure 5-9  Single axis test stand with pivot
Figure 5-10  Dual axis test stand (inverted operation left, CAD model right)
Figure 5-11  Manned platform integration
Figure 5-12  UAV platforms
Figure 5-13  Vibration test matrix
Figure 5-14  Gimbal view from ROTM at EFR range
Figure 6-1  Pan disturbance rejection performance to 5deg sine wave disturbance
Figure 6-2  Tilt disturbance rejection performance to 5deg sine wave disturbance
Figure 6-3  Target motion = f(% of flight time, zoom level)
Figure 6-4  Max slant range = f(allowable target movement, aircraft angular rate)
Figure 6-5  Max zoom = f(allowable target movement, aircraft angular rate)
Figure 6-6  Education Flying Research facility at Cal Poly
Figure 6-7  Long distance view w/ overview (slant range ~ 3,600ft)
Figure 6-8  Flight plan using Cloud Cap's PCC ground station software
Figure 6-9  GPS lock target and center FOV axes
Figure 6-10  GPS lock performance summary
Figure 6-11  Target tracking screenshots


1 Introduction

1.1 Topic area

The main objective of an Intelligence, Surveillance, and Reconnaissance (ISR) platform is to return the highest quality information possible, often in the form of a real-time video stream. There are many important factors beyond image quality to be considered when developing an ISR system, including response time, portability, operating costs, detection footprint (radar, visual, acoustic), and overall reliability. An increasing number of ISR systems are now selecting small Unmanned Aerial Vehicles (UAVs) as the platform of choice because of their ability to exceed the performance of manned and large unmanned aircraft in cost, portability, response time, and detection footprint. One of the most significant limitations of small UAV ISR systems is their ability to carry a stabilized gimbal capable of delivering the stabilization performance required for high target resolution while the platform stays outside of its detection footprint. Large, high-mass stabilized gimbal systems can provide excellent stabilized imagery; however, they require large aircraft with significant infrastructure to carry these larger gimbals to their target.

To give an example of the drive toward smaller and smaller systems, consider the design spiral for a traditional ISR platform on a manned full-scale aircraft. Full-scale aircraft carrying heavy payloads require large runways and infrastructure, a dedicated human pilot, and usually a separate payload operator. They also have significant acoustic, visual, environmental, and radar signatures that can affect the quality of the information collected. These larger vehicle signatures require long slant ranges between the target and the platform to avoid detection. This large standoff range requires very high resolution cameras with narrow fields of view to get the required target resolution. With the narrow field of view, the stabilization performance requirements of the gimbal increase significantly and can only be achieved by large, heavy gimbals, thus driving the aircraft size up.

This design spiral can be reversed through increased capability on small, low-mass gimbal systems, now possible through the use of new MEMS gyros and high-performance microcontrollers. Enabling high-performance stabilization on small gimbal/UAV systems can reduce system cost, complexity, and infrastructure requirements, giving the operator much more flexibility in gathering information. To give an example of this reversal in the design spiral, consider a gimbal small enough that a small electric or gas powered UAV, less than 30 lb GTOW, can be used. These small UAVs can be launched by field operators in rough terrain at a moment's notice. The smaller host vehicles can get closer to the target due to their reduced signatures. By getting close to the target, the imaging device can use a smaller lens, reducing the weight of the payload and allowing even smaller vehicles to carry the imager. Getting closer to the target also allows the stabilization requirements to be reduced for the same quality of imagery. The enabling technology for getting the required stabilization performance out of a small, lightweight gimbal is using modern inertial rate sensors and microcontrollers and developing a control system to take full advantage of the new technology. This brings us back to the topic area of this paper: the control system development for a small UAV gimbal.


1.2 General problem

Stabilized imaging platforms on small, low-cost systems (UAV + turret) have significantly lagged behind the LOS stabilization performance offered by larger systems. In part this performance gap is due to the biggest advantage these systems have over their larger competition: they are low cost, and have thus suffered from limited research and development effort as well as limited available technology. Being low cost, these smaller stabilized gimbals are limited to inexpensive commercial-off-the-shelf (COTS) components and have had to wait for the advanced technology utilized in larger designs to trickle down. The geometry and weight restrictions of small UAV gimbals have also restricted the type of inertial rate sensors capable of fitting inside to MEMS gyros, which have lagged in performance behind other inertial rate sensing technologies such as fiber optic and ring laser gyros.

With the performance increases in MEMS inertial sensors, EO and IR cameras, and high speed processors over the last decade, these advanced technologies are now available in the size, weight, power, and performance ranges needed to make significant improvements to stabilization on small gimbal designs. Integrating this technology into these smaller stabilized platforms fills the current performance gap of small airborne stabilized imaging platforms and has the potential to significantly increase the effectiveness of the small UAS. However, the integration of this newly available technology has revealed significant technical challenges to high stabilization performance due to additional system limitations not yet fully considered on small UAS platforms. Presenting a way to address this stabilization problem with new enabling technology, using the Tigereye gimbal, is the goal of this paper.

 

1.3 Project statement & goals

The scope of this work is to develop and implement a control system that combines the inertial stabilization capabilities traditionally seen on large gimbals within a compact 2 lb gimbal, the Tigereye (section 2.3), which is capable of being carried by many of today's small UASs. The goals of the combined system are:

- Stabilization performance increase over legacy systems
- Reduction of operator workload through the implementation of additional outer-loop control

Each of the stated goals is tied to increasing the overall mission effectiveness of the ISR system by filling the stabilization performance gap between small UAV gimbals and their larger cousins. The system will then be flight tested on several different aircraft representing a wide variety of applications, followed by a discussion of the performance in each application. Advanced algorithms for Euler lock, GPS lock, and optical target tracking will be discussed and implemented for the purpose of reducing user workload. The resulting gimbal system's stabilization will be evaluated based on its ability to stabilize the payloads such that the remaining LOS inertial disturbances do not degrade the imagery quality at the payload's narrowest field of view.

This project contributes to the field by discussing the design and implementation requirements for a stabilized optical ISR payload. Starting with a base conceptual mechanical design and target UAV platform, this paper shows the development of control algorithms from simulation to full deployment on an embedded control system. This project also identifies the important system characteristics limiting the system's overall performance. Testing and analysis of the physical gimbal has been done to demonstrate the resulting system's capabilities and limitations. Finally, the outer-loop algorithms, GPS lock and visual target tracking, are integrated, and flight performance is demonstrated for the complete system. With new enabling technology integrated into the Tigereye gimbal, this investigation shows the development of a small, high-performance, inertially stabilized imaging platform. With the increased computing power of modern processors and high-performance micro-electro-mechanical (MEMS) inertial sensors, inertially stabilized imaging platforms can now be made small enough to be carried by small, inexpensive UAVs weighing less than 30 lbs.

1.4 Thesis layout

This work is laid out in seven chapters: chapters 1 and 2 cover background information, chapters 3 through 6 cover the system development and test, and the final chapter covers the conclusions and future work. Chapter 1, Introduction, has introduced the topic area and the general problem and motivation for the project, as well as stating the project statement and goals. Chapter 2, Background Information, provides in-depth information on the details of stabilized gimbals and their application to UAV ISR systems, and introduces the relevant definitions. Chapter 3, Simulation Development, lays out the work done in the simulation environment and key concepts for the accurate simulation of the Tigereye gimbal. Chapter 4, Control Development, provides in-depth information on the control system and system requirements, lays out the primary inner and outer loop control architecture, and introduces the advanced secondary outer loops implemented in this project. Chapter 5, Implementation and Test, covers the software implementation and development, test equipment development, and flight test platforms. Chapter 6, Results, covers the results from testing on each of the platforms and what key performance limitations can be identified from each test. Chapter 7, Summary, summarizes the key findings of the project and provides the jumping-off points for additional work. This is the most important chapter of the work in that it provides multiple points from which to continue work on each of the key performance limiting characteristics of the Tigereye small UAV gimbal.


2 Background Information

This chapter serves as an introduction to the details of line of sight stabilization, its application to UAV payloads, and the details of the Tigereye gimbal system. This project assumes that the payload being stabilized by the gimbal is a video camera; however, the LOS stabilization concepts can be applied to any directional payload, such as a directional radio antenna. The goal in limiting the scope is to stay focused on information specific to the Tigereye gimbal, whose primary payloads are EO or IR video cameras. This chapter also defines the coordinate systems, equations of motion, and performance metrics used in this project. A significant amount of work already exists in this field, and this paper will capitalize on existing developments to fill the performance gap in small UAV gimbals.

2.1 Line of Sight Stabilization

To define the line of sight, the payload must first be directional, meaning that its Field of Regard¹ (FOR, synonymous with Field of View, FOV, for sensing payloads) is less than a 360 degree sphere. The center of this field of regard is the look direction, and the ray² originating at the sensor and extending through the center of the field of regard off to infinity defines the payload's line of sight (LOS). For this work it is assumed that any curvature of this line of sight between the payload and its target³ over the distances considered is small and can be neglected. A diagram of a directional sensor and its associated FOV is shown in Figure 2-1.

¹ Field of Regard is associated with generic directional payloads, both transmitting and sensing type payloads. The term Field of View, FOV, is a field of regard more specifically associated with sensing type payloads.
² Ray: "a line which starts at a point with given coordinates, and goes off in a particular direction to infinity, possibly through a second point" [8]
³ A sensor target is also commonly referred to as the Sensor Point of Interest, abbreviated either SPoI or SPI.

Figure 2-1 Definition of sensor Line of Sight (sensor, target, horizontal and vertical FOV, line of sight, and ideal line of sight)

Line of sight stabilization is the act of maintaining the target in the sensor's center field of view under arbitrary host platform and target motion. The platform and target are assumed to be free to move in all six degrees of freedom; however, the line of sight vector only has two degrees of freedom. This is because LOS stabilization only constrains the target to the center field of view of the sensor. Stabilization in this context allows the target to translate to/from the sensor and rotate about the LOS vector while still satisfying the intent of stabilization. The 2-axis gimbal is an example of a mechanical system capable of maintaining the two Euler angles which define the ideal LOS vector. The 2-axis gimbal does this by rotating the payload about a pair of orthogonal revolute joints; an example diagram is shown in Figure 2-13.

Inertial space and the sensor's FOV are two common reference frames in which the stabilization mechanism can measure the error between the LOS and the nominal LOS that centers the target in the FOV; these are displayed in Figure 2-2. The most common form of active LOS stabilization is to measure the sensor's LOS disturbances in the inertial frame through the use of inertial sensors. This information is then used in the control system to drive the joint angles of the stabilization mechanism to zero the estimated LOS error. One major drawback of this method is that the ideal LOS vector is only estimated and is subject to drift over time with non-perfect sensors. Because of this drift, an absolute reference needs to be in place to stabilize the system for long durations. Without an absolute reference the estimated ideal LOS vector will drift unbounded; in this situation the control system is no longer a stabilization control system but a LOS dampening control system.

Figure 2-2 LOS stabilization reference frames (sensor frame O/X/Y/Z_sensor with line of sight, ideal line of sight, and field of view; global frame O/X/Y/Z_global with target)
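Because the ideal LOS has only two rotational degrees of freedom, it can be expressed directly as the pair of gimbal joint angles that point the sensor at the target. The sketch below shows a minimal version of that mapping; the frame convention (x forward, y right, z down, pan about z followed by tilt about the panned y axis) and the function name are illustrative assumptions only, since the thesis defines its own coordinate systems in section 2.4.

import math

def los_pan_tilt(target_vec):
    """Pan/tilt angles (rad) that align the LOS with target_vec.

    target_vec is the target direction in the gimbal mount frame,
    assumed here to be x-forward, y-right, z-down, with pan about +z
    followed by tilt about the panned +y axis (positive tilt looks up).
    """
    x, y, z = target_vec
    pan = math.atan2(y, x)
    tilt = math.atan2(-z, math.hypot(x, y))
    return pan, tilt

# Example: target 100 m ahead, 30 m right, 50 m below the gimbal mount
pan, tilt = los_pan_tilt((100.0, 30.0, 50.0))
print(math.degrees(pan), math.degrees(tilt))   # ~16.7 deg pan, ~-25.6 deg tilt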

Directly measuring the target's deviation from the center field of view via the information provided in the sensor's video or data stream is called target tracking. While this method of direct measurement seems to be the simplest solution, since it directly measures the LOS error, it requires accurate knowledge of the field of view of the sensor, significant processing power to track the target in real time under a variety of conditions, and a transformation of the error measurement into the required joint positions for feedback control. This method is also subject to external influences such as clouds obstructing the view of the target. Several methods for the estimation of motion from video as well as target tracking are discussed in [1]. Modern camera stabilization gimbals combine measurement information from GPS, inertial sensors, joint positions, air vehicle state solutions, and target tracking information from a video processing board to generate a robust estimate of the current LOS and of the joint angles required to reach the ideal LOS.

2.1.1 Dampening Vs. Stabilization

For the scope of the control system being developed, an important distinction between inertial stabilization and inertial dampening needs to be made. Inertial dampening focuses on the short-term dynamics and cannot indefinitely maintain the LOS due to sensor drift rates. An inertially stabilized imager can indefinitely maintain LOS stabilization. Inertial stabilization includes such capabilities as GPS lock, target tracking, and Euler lock. These operating modes provide corrections for long term drifting of inertial rate sensors. Inertial stabilization requires inertial dampening; however, inertial dampening does not have to be inertially stabilized. The definitions below are intended to provide differentiation between the two. In the context of this project the turret's control system will be designed to provide inertial dampening in all situations and, when aircraft state data is available, provide inertial stabilization.

Inertial stabilization is the long term alignment of the LOS vector from an imaging device subject to inertial disturbances.

Inertial dampening is the short term stabilization of the LOS vector from an imaging device subject to inertial disturbances, without guaranteeing long term pointing.
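The practical consequence of these definitions can be illustrated with a simple drift calculation. The sketch below assumes a residual gyro bias in the range of consumer MEMS parts (not a measured Tigereye value) and shows why a rate-only dampening loop drifts without bound while an absolute reference bounds the error.

import numpy as np

bias_dps = 0.01          # assumed uncompensated gyro bias, deg/s
dt = 0.001               # 1 kHz rate loop
t = np.arange(0.0, 600.0, dt)

# Pure inertial dampening: the LOS estimate integrates the rate measurement,
# so a constant bias integrates into an unbounded pointing error.
drift_deg = np.cumsum(np.full_like(t, bias_dps * dt))
print(f"LOS drift after 10 min: {drift_deg[-1]:.1f} deg")   # ~6 deg

# Inertial stabilization: an absolute reference (GPS lock, target tracking,
# Euler lock) periodically re-zeros the error, bounding the drift.
correction_period_s = 1.0
bounded = drift_deg % (bias_dps * correction_period_s)
print(f"Worst-case error with 1 s corrections: {bounded.max():.4f} deg")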

2.1.2 Active Vs. Passive

Active stabilization is subject to the limitations of the mechanical characteristics of the gimbal and must be robust to structural flexibility, joint misalignment, backlash, actuator rate limits, linear and non-linear friction forces, etc. To achieve high levels of performance the gimbal design must also maximize its passive stabilization characteristics: low friction joints and high inner axis inertia. The passive stabilization characteristics are intended to take advantage of the fact that the platform, sensor, and target move within inertial space. By maximizing the inertia of the innermost gimbal frame, the frame to which the sensor is fixed, and minimizing the system's frictional forces, disturbances to the platform will only minimally disturb the LOS vector with respect to the inertial reference frame.
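A rough order-of-magnitude calculation shows how the two passive characteristics trade against each other: the friction torque coupled from the base produces a LOS acceleration equal to that torque divided by the inner-axis inertia. The numbers below are placeholders chosen only to illustrate the trend, not Tigereye measurements.

import math

friction_torque_nm = 0.002      # assumed bearing friction torque, N*m
event_duration_s = 0.05         # assumed duration of a base-rate transient

for inertia_kgm2 in (1e-4, 5e-4):            # low vs. high inner-axis inertia
    alpha = friction_torque_nm / inertia_kgm2            # rad/s^2
    los_error_rad = 0.5 * alpha * event_duration_s ** 2  # constant-torque kinematics
    print(f"J = {inertia_kgm2:.0e} kg*m^2 -> LOS error {math.degrees(los_error_rad):.2f} deg")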

2.2 Airborne stabilized platforms

Airborne stabilized platforms come in a variety of shapes and sizes and are matched to a host aircraft to meet a wide variety of missions. Common payloads include:

- Laser payloads (range finders, designators, and illuminators)
- IR cameras (sub-classes divided into long, medium, and short wave)
- Electro-optical cameras for the visible spectrum (still and motion)
- Directional antennas

The most common configuration of 2-axis gimbal systems for airborne applications is with the first axis, or outer axis, allowing for pan stabilization and the second axis, or inner axis, allowing for tilt stabilization. These designs have three major sub-assemblies: the mount, the pan yoke, and the tilt ball; these are shown in Figure 2-3. The base is usually lightweight and provides structural support as well as vibration isolation from the mount's dynamic motion. The first axis pans the camera's image left and right. The next axis rotates the camera about its pitch axis and moves the camera's image up and down. Common terms for these motions include azimuth/elevation, pan/tilt, and yaw/pitch. The azimuth/elevation combination is typically related to the earth's horizon, and the yaw/pitch combination is typically used for an Euler angle reference in a local-level North East Down coordinate frame. For this paper we will use pan/tilt to refer to the joint angles of the turret.

Figure 2-3 Key mechanical sub-assemblies of an airborne gimbal (mount, pan yoke, tilt ball)

The connection between the mount and the tilt ball is called the pan yoke and provides an offset between the mount and the tilt axis of rotation. The distance of this offset is defined by the radius of the tilt ball as well as the size and shape of the pan axis slip ring. The pan/tilt order of the axes allows the gimbal to pan independently of the aircraft's heading throughout 360 degrees of motion without obstructing the payload's LOS vector to the target. This is made possible by the use of an electrical slip ring which allows for continuous panning without having to "unwind" the gimbal and potentially interrupt the operator's view of the target. This section of the gimbal also often houses the gimbal's actuation system, usually two electric motors and a series of belts and pulleys that transmit the stabilizing torques to the mount and tilt ball.

The tilt ball houses the sensor and payload assembly. The tilt volume of the gimbal is often the limiting factor on the size and number of payloads the gimbal can carry. The tilt volume also often defines the rough height and diameter of the gimbal. This is where the connection between mission requirements, aircraft size, and gimbal size often comes together in defining the overall UAS. As the mission requirements go up, they often increase the number of payloads that must be stabilized. The number of payloads defines the size of the gimbal, which can be a key driver in the payload volume needed on the aircraft. As the aircraft's available payload volume increases, so does the size of the overall aircraft; this in turn increases the standoff distances required due to the larger aircraft signatures. The larger standoff distances then increase the size of the optics needed in the imagers and increase the gimbal size required. To break this design spiral it is necessary to drive high performance stabilization into the smaller gimbals.

There is a wide spectrum of gimbals which can be classified based on their total weight into superlight, small, medium, and large classes; these are shown in Figure 2-4. Superlight gimbals, those averaging 1 lb or less, are typically carried by hand launched UAVs with MGTOWs of around 5 to 10 lbs. These gimbals can stabilize two small CCD board type cameras or a single block camera with variable zoom. These gimbals are very specific to their platform and their shape is often part of the existing aerodynamic shape of the vehicle. LOS stabilization performance is typically no better than +/-0.5 deg. This disadvantage is overcome by their short slant ranges between the host platform and the intended target.


Figure 2-4 Classes of airborne stabilized gimbals
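The trade between stabilization accuracy and slant range quoted above can be made concrete with a small projection: the apparent target wander on the ground is roughly the slant range times the tangent of the LOS error. The ranges below are illustrative values, not flight-test data.

import math

stabilization_error_deg = 0.5          # superlight-class LOS error from the text
for slant_range_ft in (500.0, 1200.0, 3600.0):
    smear_ft = slant_range_ft * math.tan(math.radians(stabilization_error_deg))
    print(f"{slant_range_ft:6.0f} ft slant range -> target wander ~{smear_ft:4.1f} ft at 0.5 deg LOS error")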

Small gimbals, the focus of this work, fill the gap between the superlight and medium classes. These gimbals still have tight restrictions on their size and weight but are more cylindrically shaped to allow for the full range of motion seen in larger systems. The gimbals in this class often carry one to two payloads offering interesting combinations of sensor resolutions and focal lengths. Some of the standard resolution cameras with longer focal lengths can deliver lower ground sample distances (GSDs)⁴ and a sharper image than high definition cameras with their available lens combinations. LOS stabilization performance is on the order of +/-0.5 to +/-0.1 deg.

Medium and large gimbals, those weighing 10-20 lbs and greater than 50 lbs respectively, serve the purposes of legacy UAS systems, offering a wide variety of multi-sensor combinations. These gimbals are used on vehicles with on-station endurances in the 8-24+ hour range and need to provide a variety of video options for the operator to deal with changing light conditions. These gimbals can provide LOS stabilization performance to less than +/-0.1 deg but are usually operated at long slant ranges because of the large signatures of their host aircraft.

⁴ Ground Sample Distance is the distance measured on the ground between the centers of the sensor's pixels.
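The ground sample distance in footnote 4 can be estimated from the simple pinhole-camera relation GSD ≈ slant range × pixel pitch / focal length, which is why a standard-definition sensor behind a longer lens can out-resolve a high-definition sensor at the same range. The sensor parameters below are generic examples, not the Tigereye payload specification.

pixel_pitch_m = 5e-6        # 5 um pixels (assumed)
focal_length_m = 0.025      # 25 mm lens (assumed)
slant_range_m = 400.0       # ~1,300 ft

gsd_m = slant_range_m * pixel_pitch_m / focal_length_m
print(f"GSD ~ {gsd_m * 100:.0f} cm per pixel")   # ~8 cm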

2.2.1 UAV system integration

Integration of a stabilized gimbal into an unmanned aircraft brings up some important additional system integration issues. For illustrative purposes consider the conceptual integration shown in Figure 2-5. UAVs rely on a communications link to send command and control messages to the gimbal. Due to latency and link quality, the commands may be significantly delayed from the time the operator sends them to the time the gimbal receives them. This has led to the development of more autonomy in the gimbal to reduce the operator's workload in tracking the target. Features such as pointing to a GPS coordinate, target tracking, and even target triangulation⁵ are common on large gimbal systems and are just now starting to trickle down to smaller and smaller gimbals as their available computing power increases.

⁵ Target triangulation is the act of estimating a target's position by tracking the target through feedback from the sensor's field of view, estimating a series of ideal LOS vectors, and using the intersection point of the LOS vectors as the target's position. [7]

Figure 2-5 Example gimbal UAS integration
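A GPS-coordinate stare mode of the kind shown in Figure 2-5 ultimately reduces to converting the aircraft and target coordinates into a pointing vector. The sketch below is a hypothetical, flat-earth illustration of that first step only; it ignores aircraft attitude, which a real implementation must also remove using the host state solution.

import math

EARTH_RADIUS_M = 6_371_000.0

def gps_lock_los(ac_lat, ac_lon, ac_alt_m, tgt_lat, tgt_lon, tgt_alt_m):
    """Flat-earth NED vector from the aircraft to a commanded GPS coordinate.

    A small-area equirectangular approximation, adequate for the short
    slant ranges discussed here; illustrative only.
    """
    d_north = math.radians(tgt_lat - ac_lat) * EARTH_RADIUS_M
    d_east = (math.radians(tgt_lon - ac_lon)
              * EARTH_RADIUS_M * math.cos(math.radians(ac_lat)))
    d_down = ac_alt_m - tgt_alt_m
    return d_north, d_east, d_down

# Aircraft at 300 m AGL, target ~0.005 deg of latitude to the north
vec = gps_lock_los(35.300, -120.660, 300.0, 35.305, -120.660, 0.0)
print(vec)   # ~(556 m north, 0 m east, 300 m down)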

The UAS must provide a bi-directional data link between the operator and the gimbal for command and control as well as health monitoring of the gimbal. The UAS must also provide a data link that can transmit the video stream from the gimbal's imagers to the operator in real time. It was determined through testing that latency above 100-250 ms between a command being issued and the response being displayed in the video begins to significantly reduce the operator's effectiveness during manual control of the system. There are several ways to address this issue: one is to improve the data links to reduce the latency, and the other is to add additional autonomy to the gimbal in order to increase the maximum allowable latency. The additional autonomy in the gimbal takes the form of GPS lock and target tracking algorithms that provide longer term stabilization above the pure inertial stabilization provided under manual control.

Another key area in system integration is the vibration environment to which the gimbal is subjected. Aircraft that have the payload weight and volume capacities to carry medium sized gimbals are often powered by 2- or 4-stroke internal combustion engines which produce large torque pulses due to the non-continuous nature of their operation. These torque pulses are often in the range of 50-80 Hz and, without specific gimbal vibration isolation, can cause significant image blurring and/or excitation of jitter in the gimbal's control system. Electric aircraft propulsion offers a continuous torque propulsion system with common vibrations at much higher frequencies which are easier to dampen and have less of an effect on the image quality. Aircraft with electric propulsion are often limited to carrying only small payloads due to the energy limitations of their batteries. A side benefit of electric propulsion is a significantly quieter acoustic signature, allowing the UAV to get closer to its target and reducing the size of the imager optics and the overall gimbal stabilization requirements.

Next to video cameras, directional antennas and transceiver devices, such as lasers and laser detectors, also require platform stabilization. With equal fields of view the camera payload is one of the more challenging payloads because the camera must be kept still while the shutter is open so as not to blur the image, as well as provide adequate robustness to jitter. Directional antennas have the advantage of being insensitive to jitter as long as the LOS stays within requirements. This allows for reduced jitter margins and increased stabilization performance.
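The image-blur concern can be quantified by comparing the LOS angle swept during one exposure with the angle subtended by a single pixel (the IFOV). All of the numbers below are assumptions chosen for illustration, not Tigereye or flight measurements.

residual_rate_dps = 2.0        # assumed uncorrected LOS rate during the exposure
exposure_s = 1.0 / 60.0        # NTSC-style field time
hfov_deg = 5.0                 # assumed narrow zoom setting
h_pixels = 720                 # standard-definition line width

ifov_deg = hfov_deg / h_pixels          # angle per pixel
blur_deg = residual_rate_dps * exposure_s
print(f"blur ~ {blur_deg / ifov_deg:.1f} pixels")   # ~4.8 pixels of smear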

2.3 Tigereye Design Overview

The gimbal system for which the control system will be developed is the Tigereye turret developed by AeroMech Engineering Inc. The Tigereye gimbal was started as a clean-sheet design to provide high performance stabilization in the small gimbal class. One of the key design goals was to take advantage of COTS components as much as possible. The design process, shown in Figure 2-6, was followed for the overall system design in parallel with the development of a new small UAS. The development of the gimbal control system played an important role in each phase of the design.

Figure 2-6 Design process for Tigereye

The resulting system was a 2lb gimbal that could be configured to carry single or dual imager payloads. A 4-view and picture of the Tigereye dual imager gimbal is shown in Figure 2-7 Tigereye dual imager 4-view & picture. Some key design features of the Tigereye include:

- Command and control over CAN bus
- Continuous pan and tilt
- Single sensor hot swap capability
- Low friction joints
- <+/-0.3deg LOS stabilization
- Video processing for image stabilization and target tracking


 

Figure 2-7 Tigereye dual imager 4-view & picture

The intended host platform for the Tigereye is a small UAV required to track a person-sized target with a minimum 1,200ft standoff distance. The small UAV would be operated by a single operator and controlled via a low latency line-of-sight data link. On the host platform, command, control, and gimbal telemetry are provided over the Controller Area Network (CAN) bus interface. This is the same bus implemented by the other avionics systems on board the aircraft, allowing multiple different modules to interact with the gimbal. A single analog video output for standard definition video in NTSC format is also provided. For the dual imager payload a video mux device is included, allowing instant switching between the two different video streams without having to wait for an imager to power up or re-focus; both are always on.

2.3.1 Control System Goals

The primary goal of the Tigereye control system is to fill the performance gap between legacy small UAV gimbal systems and the LOS stabilization performance seen on larger gimbals. The control system is designed to reduce the workload of the small UAV system operator and increase the video quality through increased stabilization performance. To meet this goal the control system will utilize sensor information available on a small UAV platform, such as the host state information, on-gimbal inertial rate gyros, and target tracking information, to implement a long term stare capability that allows the user to focus on the video imagery content and not on stabilizing the imagery.

The goal of any airborne LOS stabilization system is to enable the full use of the sensors contained inside the gimbal's payload bay. "Full use" is defined as the ability of the gimbal to deliver stabilization performance such that the image quality returned by the sensor is not adversely affected by the motion of the host platform. If this can be satisfied, then the sensor, not the gimbal's stabilization, becomes the limiting factor on performance. For Tigereye, full use of the imagers is seen as a long term objective and not a requirement of the initial control system.

An additional goal for the control system is to make the gimbal a production ready system. Requirements derived from this goal are to develop supporting alignment and calibration algorithms to aid assembly technicians during production, as well as both low and high level command and control functionality to give the customer the greatest flexibility during ISR system integration. Low level control shall be provided through direct servo motor control as well as closed loop joint position and joint rate control so the user can integrate custom control loops around the gimbal system. High level control shall be provided in the form of indefinite stare at a GPS coordinate through the use of additional host state information. Intermediate level control shall be provided in the form of short term inertial dampening without the use of additional host information.


 

2.3.2 Operating environment

The system is designed to be operated on small UAV platforms with 2lb payload capacities. This translates to vehicles with maximum gross takeoff weights in the range of 15lbs to 45lbs. Typical cruise altitudes for these vehicles range between 500 and 2000ft AGL, with loiter airspeeds from 25 to 60 knots. While this represents a fairly small section of airspace, it is also the section of airspace most susceptible to unpredictable turbulence. The air is affected by geography, manmade obstructions, and surface heating, in addition to most of the weather effects seen at other altitudes [2]. The implication here is that the smaller the air vehicle, the more susceptible it is to turbulence, which drives stabilization performance requirements up. For small UAVs the amount of flight time during a given mission with high body angular accelerations and rates goes up significantly. The reduced mass, inertia, and wing loading of the typical small UAV add to the vehicle's vulnerability to turbulence. At the typical cruise speeds of these small UAVs a 5 knot change in airspeed represents a significant change in the aircraft's state, where a larger vehicle would not be affected. Figure 2-8 Histogram of total vehicle body rate sampled @ 10Hz shows a histogram of the total angular rate magnitude of a small UAS, developed from empirical data collected by an autopilot at 10Hz under light turbulence conditions. Notice that 99% of the flight time is spent at angular rates of 100deg/sec or less.


 

Figure 2-8 Histogram of total vehicle body rate sampled @ 10Hz (frequency and cumulative percentage of total angular rate, in deg/sec, for two flights; 99% of mission flight time falls at or below 100 deg/sec)

2.3.3 Electro-Mechanical Overview

The Tigereye electromechanical system contains seven key subsystems involved in the control and stabilization of the payload. These components are: two position sensors, two MEMs inertial rate gyros, a microcontroller, and two drive assemblies. The general layout of these subsystems is shown on the conceptual gimbal in Figure 2-9 Key mechanical sub-assemblies of an airborne gimbal. To save space in the tilt ball, the tilt gyro was the only component placed there, allowing the maximum volume to be used by the imager. The rest of the components were placed in the pan yoke. One advantage of this arrangement is that it increases the inertia of the pan yoke, allowing for a maximum amount of passive stabilization.


 

Figure 2-9 Key mechanical sub-assemblies of an airborne gimbal (mount, gyros, position encoders, and microcontroller; pan assembly: microcontroller, pan gyro, pan position encoder, tilt position encoder, pan drive assembly, tilt drive assembly; tilt assembly: tilt gyro and sensor payloads)

All digital communication, command, control, and telemetry reporting is done via the CAN bus which runs through both the pan and tilt slip rings to give CAN bus command and control access to the camera payloads.

2.3.4 Mechanical Design

The Tigereye gimbal mechanical design was a combination of many lessons learned from previous gimbal mechanisms for small UAVs. The electromechanical system was designed to be as light as possible and to bias any parasitic (required) weight toward the stabilized axes, with the goal of increasing the inertia and thus the passive stabilization characteristics of the assembly. Taken to the extreme, an object with infinitely high inertia and very small friction will be naturally resistant to inertial disturbances seen by the gimbal mounts. The goal is to drive the system to a high inertia-to-friction ratio while still maintaining a low mass. By choosing a high inertia, low friction design the system will have a high amount of passive stabilization. The active inertial dampening is designed to take care of the low frequency, less than 5Hz, disturbances. As the frequency of the disturbance increases, between 4 and 20Hz, the mechanical design provides a significant amount of passive inertial dampening. At higher frequencies the mechanical drive system transmits the disturbances to the imager; at these frequencies it becomes the responsibility of the gimbal mounting system to dampen out disturbances such as engine vibration.

Along with placing more mass on the stabilized portion of the system, the turret was designed to have a smooth, symmetric shape to avoid aerodynamic buffeting of the camera pod. This helps reduce the chance of the exterior acting as a sail, generating disturbance torques about the gimbal's joint axes and reducing the stabilization performance.

The mechanical drive mechanism for the pan axis uses a small rubber driven wheel mounted on the motor shaft. The motor is mounted perpendicular to the pan rotation axis and the driven wheel runs along the pan race, which is fixed to the base. The pan yoke assembly is supported by 6 wheels in the pan race to locate the center of the yoke at the center of rotation. Vertical play is taken up by the motor shaft preload onto the pan race and resisted by 3 of the 6 wheels. To locate the pan yoke horizontally and account for manufacturing variances, one of the 3 remaining wheels is spring loaded against the pan race. This design has shown to be very responsive with very little friction. Both joint axes use slip rings that allow for continuous >360degree motion. This simplifies the control algorithm complexity and allows the gimbal to move from one look direction to another without worrying about unwinding or avoiding a stop.

2.3.5 Camera Sensors

The tilt ball payload bay of the Tigereye gimbal is capable of being configured for a single EO or IR imager or a dual EO/IR imager combination. The primary camera payloads are summarized in Table 2-1.

Table 2-1 Primary EO/IR camera payloads

SONY FCB-EX980S: Optical zoom = 26x; Horiz. Field of View = 42.0° (wide) to 1.6° (tele); S/N ratio > 50dB; Electronic shutter = 1/1 to 1/10,000 s; Min. illumination = 2.0lx; Mass = 230g; Size (WxHxD) = 55.3x57.5x88.5 mm

FLIR Photon 640 w/ 50mm lens: Optical zoom = fixed; Field of View (HxV) = 14° x 11°; Nominal wavelength = 8.0 to 14.0 microns; Mass = 251g; Core size (WxHxD) = 51.4x49.8x34.0mm; Lens size (Diam. x Length) = 45. x 66.9mm

FLIR Photon 640 w/ 35mm lens: Optical zoom = fixed; Field of View (HxV) = 20° x 15°; Nominal wavelength = 8.0 to 14.0 microns; Mass = 209g; Core size (WxHxD) = 51.4x49.8x34.0mm; Lens size (Diam. x Length) = 42. x 43.4mm

The data in Table 2-1 is provided by the sensor manufacturer data sheets, Sony [3] and FLIR [4]. The Tigereye gimbal is capable of carrying many of the SONY FCB family of imagers as well as IR sensors from FLIR's Photon family. For this project the EO/IR imagers were limited to the SONY FCB-EX980S and the FLIR Photon 640 with two different lens options, with the smaller 35mm lens being used in the dual imager configuration.

 

2.4 Coordinate systems

The LOS also gives a starting point for the definition of the sensor's body coordinates, with the x-axis aligned coincident with the LOS ray. The sensor and target positions and orientations are given in global coordinates. The sensor's body axes are defined with respect to the local tangent plane via a position vector and the three Euler angles defining the rotation from the NED directions. For a camera type payload the FOV is further broken down into its horizontal and vertical components.

Figure 2-10 External gimbal reference frames


 

Table 2-2 Relevant External Coordinate Systems

Symbol OECEF

Origin location Center of the earth

OLocal Tangent

Fixed to the ground

Plane

OAircraft

Fixed to the aircraft CG

OAutopilot

Fixed either at AP IRU or GPS antennae

Orientation X+ = Y+ = Z+ = X+ = North Y+ = East Z+ = Down X+ = Nose Y+ = Right wing Z+ = Bottom of vehicle *defined by autopilot navigation system

Description Earth Centered Earth Fixed

Local level, local tangent plane, z direction is parallel to the gravity vector Standard aircraft body coordinates

The navigation solution of the AP is usually parallel to the aircraft body coordinates but may be translated due to GPS and IRU antennaae placement and orientation

To define an inertial reference frame, this project assumes that the Earth is fixed in inertial space. This implies that any coordinate system fixed with respect to the earth is also fixed in inertial space, including earth centered earth fixed (ECEF) and the local tangent plane (LTP). The local tangent plane coordinates are defined with the x axis pointed north, the y axis pointed east, and the z axis pointed down, parallel with the gravity vector.

The coordinate systems associated with the gimbal's various body axes are as follows. The Base coordinate system is fixed to the mounting holes, x-axis pointing forward, z-axis pointing down coincident with the pan axis of rotation. The xy-plane of the Pan coordinate system is parallel with the xy-plane of the Base coordinate system and fixed to the gimbal pan yoke. The angle between the x-axis of the base and the x-axis of the pan is called the pan angle, indicated by the symbol α. The x-axis of the Tilt coordinate system is aligned with the nominal sensor LOS, and the y-axis is coincident with the tilt axis of rotation. The joint angles, α and ε, and the positive joint rotation directions are also shown in Figure 2-11 Gimbal Coordinate systems.

Figure 2-11 Gimbal Coordinate systems

An additional coordinate system not shown in Figure 2-11 Gimbal Coordinate systems is the imager LOS coordinate system. The imager's x-axis points along the imager LOS with the yz-plane parallel to the image plane. All of these coordinate systems are described in Table 2-3 Coordinate systems. Although the imager and tilt coordinate systems are closely aligned, there is typically a fixed non-zero rotation between the imager and tilt axes. By accounting for the imager coordinate frame, the advanced pointing modes can align the current imager's LOS with the target in a multiple imager gimbal where the operator is switching between imagers. The rotation from the tilt axis to the imager is typically captured during production and helps aid imager interchangeability.

Table 2-3 Coordinate systems

O_base: Origin = center of gimbal base. Orientation: X+ = out the connector; Y+ = 90deg from x in the plane of the base; Z+ = out the center of the tilt ball. Description: origin of the base of the turret, fixed to the host aircraft payload mount.

O_pan: Origin = center of gimbal base. Orientation: X+ = out the 0deg encoder position; Y+ = out the 90deg encoder position, parallel to the tilt joint; Z+ = out the center of the tilt ball. Description: same origin as the base but rotates with the pan axis. Rotation is about the z axis; when the pan angle = 0deg, O_base = O_pan.

O_tilt: Origin = center of tilt ball. Orientation: X+ = out the lens cap; Y+ = parallel to the tilt joint axis of rotation; Z+ = out the bottom of the tilt ball. Description: origin placed at the volumetric center of the tilt assembly, with the y axis aligned with the axis of rotation.

O_imager: Origin = center of imager. Orientation: X+ = aligned with the center of the imager FOV; Y+ = 90deg from x, parallel to the tilt joint; Z+ = down through the base of the imager. Description: defines camera body coordinates, aligned so the x axis is along the LOS of the imager and the y axis is parallel to the tilt axis of rotation.

2.5 Dynamics model

The following section provides background on the key points of the dynamics model (kinematic constraints and equations of motion) used in this project; additional details can be found in the Direct Vs. Indirect LOS Stabilization paper [5] as well as [6]. Adaptations to the mathematical model specific to the Tigereye will also be identified in this section. For simplicity, the explicit time dependence (t) has been dropped from the derivation of the equations of motion. Constants will be explicitly identified; otherwise it can be assumed that all symbols are functions of time.

2.5.1 Kinematic constraints

To account for the joint axis constraints of the 2-axis gimbal, the general 6-DoF equations of motion of the tilt and pan bodies are subject to the following kinematic relationships. The coordinate transformation from the base frame to the pan frame is a rotation about the base z-axis by the pan angle α:

    T_P/B = [ cos α   sin α   0 ;  -sin α   cos α   0 ;  0   0   1 ]

Applying the transformation to the angular rate vector results in the following expression for the pan angular rate as a function of the base angular rate and the pan joint axis velocity α̇:

    ω_P = T_P/B ω_B + [0, 0, α̇]^T

The coordinate transformation from the pan frame to the tilt frame is a rotation about the pan y-axis by the tilt angle ε (positive tilt toward the nadir direction):

    T_T/P = [ cos ε   0   sin ε ;  0   1   0 ;  -sin ε   0   cos ε ]

Following a similar application of the transformation matrix to the angular rate vector of the tilt axis results in the equation below:

    ω_T = T_T/P ω_P + [0, ε̇, 0]^T

Expansion of this equation yields:

    ω_T,x = ω_P,x cos ε + ω_P,z sin ε
    ω_T,y = ω_P,y + ε̇
    ω_T,z = -ω_P,x sin ε + ω_P,z cos ε

Taking the time derivative results in:

    ω̇_T,x = ω̇_P,x cos ε + ω̇_P,z sin ε + ε̇ (-ω_P,x sin ε + ω_P,z cos ε)
    ω̇_T,y = ω̇_P,y + ε̈
    ω̇_T,z = -ω̇_P,x sin ε + ω̇_P,z cos ε - ε̇ (ω_P,x cos ε + ω_P,z sin ε)

 

2.5.2 Ideal LOS Definition

As stated before, ideal LOS stabilization keeps the target in the center of the field of view at all times. This can be represented mathematically by requiring the inertial angular rate of the sensor LOS about its two cross-LOS axes to be zero:

    ω_sensor,y = 0,   ω_sensor,z = 0

With arbitrary rotation of the base coordinate frame, and assuming the following:

- the slant range from the base to the target >> the distance from the base center of rotation to the origin of the sensor
- sensor frame to tilt frame alignment error is small
- rigid body motion
- the stabilization initial condition is with the target in the center of the FOV

Substituting the pan axis angular rate equation into the tilt rate expression and expanding the result, then solving for the joint rotation rates as functions of the base angular velocity and joint angles, gives:

    ω_T,y = -ω_B,x sin α + ω_B,y cos α + ε̇
    ω_T,z = -(ω_B,x cos α + ω_B,y sin α) sin ε + (ω_B,z + α̇) cos ε

Setting the left hand side of the above equations to the value for ideal stabilization, ω_sensor,y = ω_sensor,z = 0, and solving for the joint axis rates, the relationships for ideal stabilization are derived as functions of the base angular rates:

    ε̇ = ω_B,x sin α - ω_B,y cos α
    α̇ = -ω_B,z + (ω_B,x cos α + ω_B,y sin α) tan ε

With the cos ε term in the denominator of the pan axis rate equation, it can be seen that at tilt angles close to 90deg, ε ≈ 90°, the pan joint rate approaches infinity. This is defined as the 'nadir' direction for the gimbal and is in the direction of the mount Z-axis. Applied to the UAV application, this prevents perfect LOS stabilization during direct over flight of the target. Through careful flight path planning this condition can be avoided without requiring additional gimbal axes or a reconfiguration of the mount position.
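The ideal-stabilization relationships above are straightforward to evaluate numerically. The short Python sketch below, written purely for illustration (the function name and variable names are chosen here and are not from the thesis), computes the pan and tilt joint rates required to hold the LOS inertially fixed for a given base angular rate, and guards against the nadir singularity.

```python
import math

def ideal_joint_rates(w_base, pan_angle, tilt_angle, tilt_limit_deg=85.0):
    """Joint rates (pan_dot, tilt_dot) in rad/s that null the cross-LOS
    inertial rates, following the ideal-stabilization equations above.

    w_base     : (wx, wy, wz) base angular rate in rad/s
    pan_angle  : pan joint angle alpha in rad
    tilt_angle : tilt joint angle epsilon in rad
    """
    wx, wy, wz = w_base
    # Tilt rate cancels the base rate component about the tilt (y) axis.
    tilt_dot = wx * math.sin(pan_angle) - wy * math.cos(pan_angle)
    # Pan rate includes a tan(epsilon) term: it grows without bound near nadir.
    if abs(math.degrees(tilt_angle)) >= tilt_limit_deg:
        raise ValueError("tilt angle too close to nadir; pan rate is unbounded")
    pan_dot = -wz + (wx * math.cos(pan_angle) + wy * math.sin(pan_angle)) * math.tan(tilt_angle)
    return pan_dot, tilt_dot

# Example: 20 deg/s roll disturbance with the gimbal panned 30 deg and tilted 45 deg down.
print(ideal_joint_rates((math.radians(20), 0.0, 0.0), math.radians(30), math.radians(45)))
```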

2.5.3 Equations of motion

In this section the gimbal equations of motion are summarized. They have been derived from the Euler moment equations for general rigid body 6DoF motion with the application of the kinematic constraints from 2.5.1 to define the joint axes. The gimbal equations of motion used in this project closely follow the equations of motion given in [5]; for a complete derivation see the previously referenced paper.

Euler's equation states that the sum of the moments about a body, ΣM, is equal to the rate of change of its angular momentum, H = I ω. Expressed in a body-fixed frame rotating with angular velocity ω:

    ΣM = Ḣ = I ω̇ + ω × (I ω)

The gimbal is broken up into two independent bodies, Pan and Tilt, represented by the free body diagrams shown in Figure 2-12 Gimbal free body diagrams.

 

Figure 2-12 Gimbal free body diagrams (pan and tilt bodies with their centers of gravity, angular rates, and gravity forces; the joint axis torques are T_Oz = T_Friction + T_Drive + [R_IO·T_I]_z on the outer/pan axis and T_Iy = T_Friction + T_Drive on the inner/tilt axis; T_Ix and T_Iz are reaction torques exerted by the inner/tilt body onto the outer/pan body)

Assuming that each body's axes are aligned with its principal inertia axes, the gimbal moment equations can be written in matrix form. For the Inner/Tilt axes:

    I_T ω̇_T + ω_T × (I_T ω_T) = T_I,    T_I = [T_Ix, T_Friction + T_Drive + T_Gravity, T_Iz]^T

Solving this system for the unknowns, the tilt angular acceleration about the joint (y) axis together with the joint reaction torques T_Ix and T_Iz, gives the equations of motion of the Inner/Tilt axis.

The moment equations for the Outer/Pan axis, written in matrix form, are shown below:

    I_P ω̇_P + ω_P × (I_P ω_P) = T_O,    T_O = [T_Ox, T_Oy, T_Friction + T_Drive + T_Gravity + [R_IO T_I]_z]^T

Note that the inner axis reaction torques are accounted for in the [T]_IO term, the tilt body reaction torque rotated into the pan frame. Solving this system for the unknowns, the pan angular acceleration about the joint (z) axis together with the reaction torques T_Ox and T_Oy, gives the equations of motion of the Outer/Pan axis.

T_Ox and T_Oy are reaction torques of the gimbal onto the base. For the scope of this project it is assumed that the inertia of the base, or host aircraft, is much larger than that of the gimbal, allowing any base disturbances caused by the gimbal's reaction torques to be ignored. The term T_Gravity represents the mass imbalance torques on the gimbal due to the force of gravity. To simplify the gimbal dynamics, it is assumed that the center of gravity of the inner (tilt) body lies on the inner axis of rotation and that the center of gravity of the outer body lies on the outer axis of rotation. This assumption requires that the real gimbal system be balanced with counterweights (refer to section 5.1 for how this was achieved). Applying the CG constraint to the outer axis requires the inner axis CG to lie not only on its own axis of rotation but also along the outer axis of rotation. This implies that these two rotational axes intersect, putting an additional constraint on the mechanical design. By carefully aligning the CG locations, the torque induced by gravity can be canceled out, significantly simplifying the dynamics and the control system complexity.

2.6 Control Architecture Review

The focus of this work will be to implement a simple PID control system for the Tigereye gimbal and evaluate the resulting performance as it applies to small UAV ISR applications. It is important for the reader to understand the various control architectures that have been developed for 2-axis stabilized gimbals. This section discusses the application of three different control architectures that provide a representative sample of current technology.

Direct Versus Indirect Line of Sight Stabilization [5]: this paper discusses the controller implications of mounting the inertial sensors directly on the LOS stabilization axes versus sensing the motion of the base and transforming the sensed disturbances into the LOS axes to calculate the required control signal for stabilization. The paper derives the control equations for both cases, including terms for sensor error and linear and non-linear plant dynamics. A simple PI controller is used in both cases. It is shown that without the sensor and plant noise terms the loop gain for both architectures is equivalent; however, the indirect approach is much more susceptible to sensor noise than the direct approach. Sensor sampling errors and gimbal structural rigidity dynamics were not considered in the simulation of either approach. It was concluded that, given an equal design effort, the indirect approach would result in reduced stabilization performance. A diagram of the direct stabilization approach is shown in Figure 2-13 Example of Direct stabilization system architecture.


 

Figure 2-13 Example of Direct stabilization system architecture

This thesis uses a hybrid of the indirect and direct approaches discussed in [5]. Instead of mounting the inertial sensors on the LOS axes in the tilt body, each joint has an inertial sensor for its axis of rotation: the azimuth/pan axis has a joint position encoder and an analog MEMs gyro, and the elevation/tilt axis has an identical joint position encoder and analog MEMs gyro.

Control Architecture for a UAV-Mounted Pan/Tilt/Roll Camera Gimbal [7]: this is a very basic implementation of joint position control for a 3-axis gimbal. The controller used was a basic PID with the addition of integrator anti-windup to handle actuator saturation and derivative filtering of the position encoders. The gimbal was actuated with hobby quality servos and joint positions were sensed with optical encoders.


 

Figure 2-14 GIT 3-axis gimbal on GTmax helicopter

Adaptive Control of a Two Axis Gimbal [8]: this paper explores the implementation of adaptive control for a large desktop mounted experimental gimbal. The adaptive control scheme used is a Model Reference Adaptive Controller. The gimbal base is fixed in the earth frame and does not contain any inertial sensors. The position state of each joint is measured directly, and the velocity is calculated from the position derivative and then filtered. The performance of the adaptive controller was compared to the performance of a PD controller under the same commanded trajectory. The paper resulted in a successful implementation of a simple adaptive control algorithm to follow a specified trajectory, and when combined with visual feedback the authors were able to track a ball moving through space. Performance of the system was hampered by a cable extending from the camera, required for communication with the gimbal, which added un-modeled dynamics. A diagram of the experimental setup is provided in Figure 2-15 Adaptive control for a two axis gimbal - Experimental Setup. The investigation found that for accurate parameter estimation of the system using adaptive control, the dynamics models need to incorporate the following elements:

- an "exciting" trajectory that will excite all modes of the system in which the parameters are to be estimated
- an accurate model of all dynamic elements of the system

Figure 2-15 Adaptive control for a two axis gimbal - Experimental Setup


 

3 Simulation Development

The 2-axis gimbal was modeled from the top down using engineering judgment and best practices to add simulation detail as the project progressed. The dynamics simulation of the Tigereye gimbal was developed in parallel with the production of the prototype Tigereye. As experience was gained with the actual hardware, various subsystems and details were added to the simulation model. As the prototype went through several iterations during its development, so did the simulation, to keep up with the constantly changing hardware. Due to the very rapid pace of development, the simulation was used to prototype a tunable controller and was not intended to be a place where the system dynamics were rigorously modeled.

3.1 Equations of motion mechanization

A two phase development of the equations of motion was completed by first modeling the tilt, 'inner', dynamics; the tilt ball dynamics model was then 'mounted' to the pan yoke, 'outer', dynamics model. This strategy keeps the simulation very modular and focused on one subsystem at a time to minimize the development risk. Both the inner dynamics model and the outer dynamics model have axis torques and their respective 'base' angular rates as inputs; for example, the tilt ball's 'base' is the outer gimbal coordinate frame. Tilt ball reaction torques are communicated back as torque disturbances to the outer gimbal dynamics. These torques are necessary to account for generic base motion and the off diagonal terms in the tilt inertia tensor. The implementation of these equations is shown in Figure 3-1 Gimbal EOM Mechanization.

 

Figure 3-1 Gimbal EOM Mechanization (Simulink block diagram: the Outer Gimbal Dynamics block takes the base body rates, external pan torque, and pan motor torque and outputs the outer rates W_oo, accelerations W_oo_dot, Euler angles, joint rate and angle, and DCMs; the Inner Gimbal Dynamics block takes the outer rates, external tilt torque, and tilt motor torque and outputs the inner rates W_ii, W_ii_dot, Euler angles, joint rate and angle, DCMs, and the tilt-generated pan axis reaction torque fed back to the outer dynamics)
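For readers without access to the Simulink model, the following Python sketch illustrates the same mechanization idea under simplifying assumptions (rigid bodies, torques already resolved into each body's axes): each body integrates Euler's rigid body equation given its applied torque, and the inner body treats the outer body as its "base" while returning a reaction torque. The function name, forward-Euler integration, and all numeric values are illustrative placeholders, not the thesis model.

```python
import numpy as np

def euler_rigid_body_step(I, w, torque, dt):
    """One forward-Euler step of I*w_dot + w x (I*w) = torque for a single body.

    I      : 3x3 inertia tensor (kg m^2) in body axes
    w      : 3-vector body angular rate (rad/s)
    torque : 3-vector applied torque in body axes (N m)
    dt     : time step (s)
    """
    w_dot = np.linalg.solve(I, torque - np.cross(w, I @ w))
    return w + w_dot * dt, w_dot

# Modular use mirroring the inner/outer structure: the outer (pan) body is driven by
# base motion and drive/friction torque, and the inner (tilt) body sees the outer
# body's motion as its "base" while feeding a reaction torque back to the outer body.
I_pan = np.diag([2.0e-3, 2.0e-3, 1.5e-3])    # placeholder inertias, not Tigereye values
I_tilt = np.diag([8.0e-4, 6.0e-4, 8.0e-4])
dt = 1.0e-3
w_pan, w_tilt = np.zeros(3), np.zeros(3)
pan_torque = np.array([0.0, 0.0, 0.01])      # drive + friction + tilt reaction, pan frame
tilt_torque = np.array([0.0, 0.005, 0.0])    # drive + friction, tilt frame
w_pan, _ = euler_rigid_body_step(I_pan, w_pan, pan_torque, dt)
w_tilt, _ = euler_rigid_body_step(I_tilt, w_tilt, tilt_torque, dt)
```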

3.2 Mass & Inertia

The mass moment of inertia is a measure of the distribution of an object's mass relative to its distance from the object's CG. Objects with high inertia require more torque to change their angular velocity than objects with low inertia. The ideal mass distribution for a LOS stabilized gimbal concentrates the mass of the gimbal along the stabilization axes. By doing this the stabilization axis is less susceptible to external disturbance forces, and high angular accelerations of the outer gimbal axis remain possible at elevation angles close to +/-90deg; see Figure 3-2 for an illustration.


 

Figure 3-2 Mass distribution

The mass and inertia model of the gimbal was taken from the detailed CAD assembly. Initially the off diagonal terms in the inertia matrix were set to zeros to simplify the development of the simulation. The final simulation uses the complete inertia matrices for the tilt and pan assemblies. The modeling method of the CAD system uses the following equations, shown in Figure 3-3 CAD inertia tensor calculation equations, to generate the inertia tensor [9].

Figure 3-3 CAD inertia tensor calculation equations


 

The sensor's own inertia was estimated by modeling its outer shape and applying a constant density chosen to match its total weight. An exact CAD model of the sensor's internal parts was not completed. The final key component of the gimbal mass properties is the balance weight. The purpose of the balance weight is to bring the CG of the complete tilt ball, with the sensor installed, in line with the tilt axis of rotation. The exact weight was found by trial and error with the actual system and found to be unique for the different payload configurations. With the balance weight, the Ixx inertia of the final inner axis assembly is slightly increased and the gravity induced torques are kept small enough to ignore, simplifying the control laws. A summary of the assumptions made to simplify the simulation and control architecture can be found in Table 3-1 Mass model assumptions.

Table 3-1 Mass model assumptions

Assumption: CG of the tilt ball is along the tilt axis of rotation. Justification: the tilt ball is balanced during manufacturing. Motivation: eliminates gravity induced torques about the tilt axis, simplifying the sim and controller complexity.

Assumption: gravity induced torques on the pan axis are small and can be neglected. Justification: the distance between the pan axis and the CG of the pan and tilt components is small, and the angle between the turret pan axis of rotation and the gravity vector is small. Motivation: eliminates gravity induced torques about the pan axis, simplifying the sim and controller complexity.

Assumption: inertia of the drivetrain components is small compared to the gimbal. Justification: forces generated by the rotational momentum of the motors are small relative to the friction and momentum of the rest of the system. Motivation: the rotational inertia of the drivetrain is neglected.

Assumption: the payload sensor is modeled as a constant density mass. Justification: this is a close approximation and matches the measured mass. Motivation: with the available information this is the closest approximation possible.


 

3.3 Friction

To conceptualize the impact of friction on a gimbal LOS stabilization system, consider the gimbal base undergoing a sinusoidal tilt rotational disturbance, i.e. the base is rotating back and forth about an axis that is parallel to the tilt axis of rotation. From the equations of motion, the inner gimbal LOS is affected by both the external and internal torques transmitted to the inner gimbal. These moments include torques from the drive motors and joint friction. For this type of disturbance a gimbal with zero friction would not need any inputs from the drive motors to stabilize the axis. By reducing friction in the system the passive LOS stabilization characteristics can be maximized, requiring minimal control input to achieve high performance stabilization.

Frictional forces can be broken down into two different types: Coulomb and viscous friction. Viscous friction is proportional to the relative velocity of the two objects and is linear in nature. In the simulation the viscous friction is represented as a gain on the joint axis rate:

    T_viscous = K_viscous · ω_joint

The Coulomb friction model is based on the frictional component between two objects due to the normal force applied. In the case of the Tigereye, the drivetrain components on each axis have a fixed preload, making the Coulomb friction constant in magnitude. As the gimbal changes direction, the direction of the Coulomb friction must also change; it is because of this that the friction force, as a function of velocity, is nonlinear.

 

It was found through flight test telemetry that a significant portion of the motion seen by the vehicle is at lower body rates, meaning that most of the time the gimbal will be traveling at low angular velocities constantly switching in direction. This requires modeling the coulomb frictional component in the gimbal dynamics to account for the start/stop transition. The implementation of the friction model is shown in Figure 3-4 Friction model.

Figure 3-4 Friction model
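A minimal Python sketch of a combined viscous plus Coulomb joint friction torque, as described above, is shown below. The smoothing of the sign function near zero velocity is an illustrative choice (not from the thesis model) to keep the Coulomb term continuous through the start/stop transitions that the flight data showed to be common.

```python
import math

def joint_friction_torque(joint_rate, k_viscous, t_coulomb, eps=1e-3):
    """Friction torque (N m) opposing joint motion.

    joint_rate : joint angular rate (rad/s)
    k_viscous  : viscous coefficient (N m s/rad)
    t_coulomb  : constant Coulomb magnitude set by the drivetrain preload (N m)
    eps        : small rate (rad/s) over which the Coulomb term is smoothed
    """
    viscous = k_viscous * joint_rate
    # tanh approximates sign() but stays continuous through zero rate,
    # capturing the direction reversal without a hard discontinuity.
    coulomb = t_coulomb * math.tanh(joint_rate / eps)
    return -(viscous + coulomb)
```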

3.4 Drive System

The drive system for each axis of the Tigereye gimbal is made up of a brushless DC servomotor and a custom set of belts, pulleys, and gears to transfer the motor torque to the gimbal axis. The tilt axis uses a belt system to carry the motor torque from the motor, mounted near the top of the gimbal, down to the tilt axis. The driven belt wheel was slotted to act as a belt tensioner and allow for the required manufacturing tolerances; however, this compliance added another 'spring' to the dynamics of the system. The pan axis went through several design iterations. The final design was to use the motor without a gearbox (no zero-backlash gearboxes were available at the time), driving a small rubber wheel directly on an interior bearing surface on the pan axis. This allowed the turret to maintain the necessary gear reduction ratios while having zero backlash in the system. An important advantage is that the resultant system had very little friction, increasing the passive stability of the system.

 

3.4.1 Actuators

The gimbal actuators are small DC servo motors controlled by a Pulse Width Modulation signal. These can be modeled as either a simple torque input or as a more complex servo motor. For the initial control development a simple torque input proportional to the PWM signal was chosen; this was later transitioned to a higher fidelity servo motor model, which included the steady state torque-speed relationships shown in Figure 3-5 Motor Steady State Characteristics (Vin=12V). The detailed motor coefficients were provided by the manufacturer, MicroMo [10].

Figure 3-5 Motor Steady State Characteristics (Vin=12V)
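As a point of reference for the simple actuator model, the steady state torque of a voltage-driven DC motor can be approximated from its torque constant, back-EMF constant, and winding resistance, which reproduces the linear torque-speed roll-off seen in Figure 3-5. The Python sketch below uses generic placeholder coefficients; it is not the MicroMo datasheet model used in the thesis.

```python
def dc_motor_torque(duty_cycle, omega, v_supply=12.0, k_t=0.0098, k_e=0.0098, r_winding=8.6):
    """Steady state shaft torque (N m) of a voltage-driven DC motor.

    duty_cycle : PWM command, -1.0 to 1.0 (applied voltage = duty_cycle * v_supply)
    omega      : shaft speed (rad/s)
    k_t, k_e   : torque constant (N m/A) and back-EMF constant (V s/rad), placeholders
    r_winding  : terminal resistance (ohm), placeholder
    """
    v_applied = max(-1.0, min(1.0, duty_cycle)) * v_supply
    current = (v_applied - k_e * omega) / r_winding   # steady state armature current
    return k_t * current                              # torque falls linearly with speed
```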

3.4.2 Pan Drivetrain

Beyond the motor, the pan drive system is a direct drive between the motor output shaft and a track fixed to the base, against which a rubber drive wheel applies force. This allowed for a large gear reduction between the motor output and the pan axis that was simple, light weight, and low friction. However, during initial testing it was found that this drive system had a significant amount of backlash, resulting in damped non-linear oscillations during position control, shown in Figure 3-6 Pan Bearing Comparison - 10deg Position Step Response, and a limit cycle during inertial dampening. The slop was reduced through mechanical design iteration on the pan bearing and the stabilization performance was significantly improved.

Figure 3-6 Pan Bearing Comparison - 10deg Position Step Response (position in radians versus time in milliseconds for the old and new pan bearings; backlash in the old bearing resulted in undesirable oscillation)

3.4.3 Tilt Drivetrain

The tilt drivetrain utilizes a belt drive system to achieve the necessary gear reduction. The tilt axis is constrained with off-the-shelf bearings and the belt is a low stretch, off-the-shelf smooth belt. The low stretch belt and bearings provide a system with low drivetrain spring constants; however, keeping adequate belt tension required the use of other mechanical features to take up the manufacturing variances. The initial solution was to use a driven pulley of the shape shown in Figure 3-7 Tilt driven pulley.


 

Figure 3-7 Tilt driven pulley

This pulley provides a good spring to allow the system to flex and take up manufacturing tolerances; however, it added a non-linear spring constant whose value changed based on the direction of the torque being applied when under belt tension. This was found to be a primary limitation on the stabilization performance and the design was changed.

3.5 Sensors

Feedback signals to the control system are provided by two sets of position and inertial rate sensors; the simulation implementation is shown in Figure 3-8 Sensor Sub-System. For each axis the joint position, sensed by an absolute position encoder, and the inertial rate, sensed by a MEMs rate gyro, are sampled at 10KHz over the digital Serial Peripheral Interface, SPI.


 

Figure 3-8 Sensor Sub-System

3.5.1 Inertial - MEMS Gyros

The inertial rate sensors for the Tigereye gimbal were selected based on fitting within the physical dimensions of the gimbal and providing a low noise, low drift, high sensitivity signal for inertial rates around 0deg/sec. The gyro down-selected was the Analog Devices ADXRS614. This gyro is based on MEMs technology and fit all of the selection criteria. The ADXRS operates by electrostatically vibrating a silicon structure to resonance and uses capacitive pick-off fingers to sense the effect of the Coriolis forces on the structure [11]. The output is then conditioned into an analog voltage from ~0.25 to 4.75V. Reference voltage and temperature are also output to help with calibration.


 

Figure 3-9 MEMs Gyro characteristics [12], [11]

To simulate the MEMs gyro system on the Tigereye, a single axis gyro model was developed based on the 3-axis gyro in Matlab/Simulink's Aerospace toolbox. The angular rates and accelerations of the body under motion (pan axis or tilt axis) are passed in and then transformed into the local body coordinates of the gyro, gyrospace, through a direction cosine matrix. Gyrospace is defined with the gyro Z-axis as the rate sensing axis. By first transforming into gyrospace, the lateral acceleration effects on the gyro can be applied to both the pan and tilt gyros consistently. Within the gyro model, second-order dynamics, white noise, and constant biases are also applied to the output signal. At this point all three gyroscope measurements are output and the z-axis measurement is selected for conversion to an analog voltage, then to digital through an idealized 12-bit quantization block, and then back to radians/sec before being delivered to the control system.


 

Figure 3-10 Single-axis gyro model, overview


Figure 3-11 Gyro dynamics, detail
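The following Python sketch mirrors the measurement chain just described (bias, white noise, analog conversion, and idealized 12-bit quantization). The scale factor, bias, and noise values are placeholders chosen for illustration, not the ADXRS614 datasheet values, and the second-order sensor dynamics are omitted for brevity.

```python
import random

def gyro_measurement(true_rate, bias=0.005, noise_std=0.002,
                     scale_v_per_rad_s=0.43, v_ref=2.5, v_range=5.0, bits=12):
    """Simulated single-axis MEMS gyro sample, returned in rad/s.

    true_rate : true inertial rate about the sense axis (rad/s)
    bias      : constant bias (rad/s), placeholder value
    noise_std : white noise standard deviation (rad/s), placeholder value
    """
    # Corrupt the true rate, then convert to an analog voltage about the reference.
    corrupted = true_rate + bias + random.gauss(0.0, noise_std)
    voltage = v_ref + corrupted * scale_v_per_rad_s
    # Idealized ADC: quantize the 0..v_range span into 2**bits levels.
    counts = round(max(0.0, min(v_range, voltage)) / v_range * (2**bits - 1))
    # Convert counts back to rad/s exactly as the controller would.
    return (counts / (2**bits - 1) * v_range - v_ref) / scale_v_per_rad_s
```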

3.5.2 Absolute - Magnetic Encoder

Each axis also has a hall-effect absolute rotary encoder. The principle of operation is to detect the orientation of the poles of a round magnet placed just above the sense chip. As the magnet rotates, the magnetic field through the chip rotates as well, allowing the chip to report the absolute position of the magnet. Figure 3-12 shows the relative placement of the magnet with respect to the chip, along with the sensed vertical field component of the magnet and the rotation direction. The sensor used for the Tigereye application provides 12 bits of resolution, nominally 0.0879deg. Information provided in this section is based on the AS5145 encoder datasheet [13].

Figure 3-12 Absolute position encoder diagram

The hall-effect sensor is subject to several different sources of error, including angular and translational misalignment of the magnet over the center of the sense chip and external magnetic sources. The typical error in position across the measurement domain has a sinusoidal profile; see the actual vs. ideal position plots in Figure 3-13 (figure provided by [13]). For the Tigereye application the position encoders are used primarily for pointing at a GPS coordinate, and these error sources do not affect the inertial damping capability of the system. For these reasons a detailed error model of the encoders was left out of the dynamics simulation.

Integral Non-Linearity (INL) is the maximum deviation between the actual position and the indicated position. Differential Non-Linearity (DNL) is the maximum deviation of the step length from one position to the next. Transition Noise (TN) is the repeatability of an indicated position. (Definitions provided by [13].)

Figure 3-13 Error sources for hall-effect encoder


 

4 Control Development

4.1 Overview

The control system development for the Tigereye gimbal is centered on the need to reduce the operator's workload when doing surveillance with a low cost ISR system. Inertial damping on legacy gimbals for small UAV systems was done either purely through operator feedback or from a coordinate transformation of the autopilot body axis rates. In the case where the system uses the autopilot rate estimates, the operator commands the inertial rate of the pan and tilt axes. This control methodology is referred to as indirect stabilization in [5]. The Tigereye gimbal control system uses the direct measurement of the joint axis inertial rate, typically only found on larger gimbals, to increase stabilization performance and reduce the operator's workload to stare at a target. This method is less susceptible than the indirect method to structural misalignments and flexing that could lead to unobserved stabilization errors.

The Tigereye control system is broken up into primary inner/outer loops, gimbal navigation, sensor processing, and actuator processing functions. The primary inner loop is a direct feedback on joint inertial rate. The primary outer loop is a second PID loop level that provides the operator with two levels of inertial dampening on the joint position and joint velocity controls. The gimbal navigation component calculates either joint position or joint velocity commands. The relationship of all of these components is shown in Figure 4-1 Control system overview.


 

Figure 4-1 Control system overview

This chapter will go through the details of each of these loops and the design considerations that led to the current control system. There are many stabilization modes provided to the user; each mode is summarized in Table 4-1 along with its input requirements.

Table 4-1 Control system modes overview

Mode 0, Joint Velocity: user commands a joint referenced velocity.
Mode 1, Joint Position: user commands a joint referenced position.
Mode 2, Joint Velocity, Damped: Mode 0 plus inertial dampening.
Mode 3, Joint Position, Damped: Mode 1 plus inertial dampening.
Mode 4, Inertial Velocity: user commands an inertial referenced velocity.
Mode 5, Euler Lock: user commands NED (North, East, Down) referenced Euler angles.
Mode 6, GPS Lock: user commands a GPS coordinate to look at.
Mode 7, Target Tracking, Velocity based: user commands a target pixel coordinate; the gimbal follows the target using the target tracking information, with inertial dampening provided through gyro feedback.
Mode 8, Target Tracking, Position based: user commands a target pixel coordinate; the gimbal follows the target using the target tracking information, with inertial dampening provided through gyro feedback.

For each mode the original table also marks which feedback sources are required among the joint encoders, the rate gyros, host position and attitude, and the target pixel position.

4.2 Requirements

Explicit stabilization performance requirements have been left out of the control system design, as this project seeks to determine what performance is possible with a conventional PID inner/outer control loop architecture. Additional requirements will be placed on the system once the proof of concept has demonstrated in-flight performance improvements over legacy systems. The major gimbal control system requirements derived from the goals above are as follows:

1. The control system shall improve upon legacy system inertial dampening performance.
2. The control system shall be capable of maintaining LOS stabilization to a GPS position given host attitude and position information.
3. The control system shall allow the operator to send steering commands to the gimbal while maintaining inertial dampening in the absence of host information.
4. The control system shall track a visual target given its pixel location from the center field of view and the necessary camera state information.
5. The control system shall allow for joint position and joint velocity commands both with and without inertial dampening enabled.
6. The control system shall allow direct feed-through of actuator commands.
7. The control system shall provide a configurable no-go range.
   a. The no-go range shall be defined by a center and width.
   b. The no-go range shall be effective in all modes.

4.3 Primary inner/outer loop

The primary controller uses PID control loops to control the system's mechanical motions. The controller receives sensor inputs from the turret's two MEMs gyros and its two absolute position encoders. The control system uses an "inner-outer" loop architecture, with each loop containing a PID controller that sends commands to the next inner loop. Outputs from the innermost loop are then used to command the servo motors. All of the modes use this basic control strategy.

Figure 4-2 Primary Inner/Outer controller overview

The inner/outer loop structure, seen in the diagram above, is used so that rate based control runs on the inner loop and position based control runs on the outer loop. This method has been shown to best provide smooth gimbal motion for the Tigereye. The organization of the controller is also key when developing a new system. Just like well commented code, a well-organized diagram helps modularity and is self-documenting, enabling quick development of future control improvements.

4.3.1 Inner loop

The inertial velocity feedback is used as the inner loop control in all feedback modes. This was done after issues with movement smoothness were observed and found to be caused by taking the derivative of the relatively low resolution position encoder combined with the quick response time of the gimbal. The inner loop can also be commanded directly through feed-through of the commands given to the outer loop. This functionality gives the operator direct inertial rate control and was found to be one of the primary modes of operation during flight test when the operator is conducting a search.


 

Figure 4-3 Inner loop detail (simplified loop model)

The inner loop allows for two special cases where the inertial velocity feedback is bypassed: pass-through and control off modes. The pass-through subsystem allows the controller to be configured such that any of the outer loop subsystems can send commands directly to the actuator processing subsystem. This was found to be necessary when the gimbal is used in non-inertial stabilized applications and inertial gyro information is not available.

4.3.2 Outer loop

The outer loop controller is comprised of three modes: ramped position, position, and pass-through. The ramped position mode provides the ability for the operator to command a joint rate. Instead of using joint velocity as the feedback signal, due to the previously discussed issues with taking the derivative of the absolute encoder signal, the loop integrates the user's command and sends position commands to the position control feedback loop. This was demonstrated to produce smooth gimbal motion. This loop has two preconfigured gainsets that output commands to the inertial velocity inner loop. The resulting controller is an inertially damped joint velocity mode with two levels of inertial dampening.

The second controller in the outer loop accepts joint position commands with feedback on joint position. This loop also uses the inertial velocity inner loop with two different gainsets to provide a weakly damped and a strongly damped joint position mode. For GPS pointing the strongly damped joint position mode is preferred and provides a "hands off" mode for the operator. The final subsystem in the outer loop is a simple pass-through. This allows for direct inner loop control, sending commands either to the inertial rate, pass-through, or control-off paths.


 

Figure 4-4 Outer loop detail (simplified joint rate and joint position loops)

4.3.3 PID detail

The PID loop used in each of the controllers is a discrete time version of the standard parallel PID. The transfer function for this controller is shown below in both the continuous and discrete time domains:

    C(s) = K_p + K_i / s + K_d s

    C(z) = K_p + K_i T_s / (1 - z^-1) + K_d (1 - z^-1) / T_s

During implementation it was found that the use of a 2nd order filter on the derivative term helped reduce the detrimental effects of sensor quantization errors. The filter parameters were set with a cutoff frequency, ω_c, of 100Hz and a damping ratio, ζ, of 0.7. The modified PID transfer function is shown below:

    C(s) = K_p + K_i / s + K_d s · ω_c² / (s² + 2 ζ ω_c s + ω_c²)

Figure 4-5 PID implementation in Simulink

Additionally the integral and derivative terms are subject to saturation limits before summation into the final control signal.
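A compact discrete-time implementation of the parallel PID with a filtered derivative and term saturation of this kind might look like the following Python sketch. The specific discretization (backward Euler), the class structure, and the example gains are illustrative choices, not the Tigereye firmware.

```python
import math

class FilteredPID:
    """Parallel PID with a 2nd-order low-pass filtered derivative and term saturation."""

    def __init__(self, kp, ki, kd, dt, f_cut=100.0, zeta=0.7, i_limit=1.0, d_limit=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0
        # State of the 2nd-order low-pass filter applied to the raw derivative.
        self.wc = 2.0 * math.pi * f_cut
        self.zeta = zeta
        self.d_filt = 0.0
        self.d_filt_dot = 0.0
        self.i_limit, self.d_limit = i_limit, d_limit

    def update(self, error):
        # Integral term with a simple saturation clamp (anti-windup).
        self.integral = max(-self.i_limit,
                            min(self.i_limit, self.integral + self.ki * error * self.dt))
        # Raw backward-difference derivative, then 2nd-order low-pass filter.
        d_raw = (error - self.prev_error) / self.dt
        self.prev_error = error
        d_ddot = self.wc ** 2 * (d_raw - self.d_filt) - 2.0 * self.zeta * self.wc * self.d_filt_dot
        self.d_filt_dot += d_ddot * self.dt
        self.d_filt += self.d_filt_dot * self.dt
        d_term = max(-self.d_limit, min(self.d_limit, self.kd * self.d_filt))
        return self.kp * error + self.integral + d_term

# Example: 1 kHz rate loop with illustrative gains.
pid = FilteredPID(kp=0.8, ki=2.0, kd=0.02, dt=0.001)
command = pid.update(error=0.05)
```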

4.3.4 No-Go position limit functions

To limit the gimbal's motion during operation, for instance to accommodate camera sensors that extend beyond the tilt ball OML and prevent continuous tilt operation, two additional subsystems were added to the controller. First, for the control loops that use position feedback, a check against the nearest no-go edge limits the error signal so that the system cannot be commanded into the no-go range. For the inertial velocity loops a more complicated algorithm was used that smoothly generates additional error as the system gets close to the edge of the no-go range by using a cosine function. The function was set such that within the transition zone the additional error smoothly builds until the maximum error is reached, before the error signal is sent into the PID controller. The maximum error is defined as the error which would generate a 100% command signal when multiplied by the proportional PID gain:

    e_max = u_max / K_p

Within a transition zone of width w, the additional error is blended in with a half-cosine profile,

    e_nogo = e_max · (1 - cos(π d / w)) / 2

where d is the penetration depth into the transition zone, so that e_nogo rises smoothly from 0 at the outer edge of the transition zone to e_max at the no-go boundary.

Figure 4-6 Joint No-Go error functions plot
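A minimal Python sketch of the cosine-blended no-go error shaping described above is given below, using the (1 - cos)/2 blend and the symbol names introduced in the reconstruction; the actual Simulink implementation in the thesis may differ in detail.

```python
import math

def nogo_extra_error(distance_to_edge, transition_width, e_max):
    """Additional error pushing the joint away from the no-go range.

    distance_to_edge : distance from the current joint angle to the no-go edge (rad);
                       negative values mean the joint is inside the no-go range
    transition_width : width of the blending zone outside the no-go edge (rad)
    e_max            : error that yields a 100% command through the proportional gain
    """
    if distance_to_edge >= transition_width:
        return 0.0                      # far from the edge: no shaping
    if distance_to_edge <= 0.0:
        return e_max                    # inside the no-go range: full repelling error
    depth = transition_width - distance_to_edge
    return e_max * 0.5 * (1.0 - math.cos(math.pi * depth / transition_width))
```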

4.4 Gimbal navigation

The gimbal navigation subsystem performs the functions necessary to provide the next higher level of inertial stabilization for "hands-off" operation of the gimbal. The goal of the primary inner/outer loop controller is only to dampen inertial disturbances; it is not intended to provide long term stabilization. The goal of the gimbal navigation subsystem is to provide long term inertial stabilization by using the host state information provided to the gimbal over the CAN bus.


 

4.4.1 Euler Lock

In this mode the gimbal remains fixed in orientation with respect to the local North East Down coordinate frame. The gimbal receives the host attitude information and transforms the Euler angle commands into gimbal mount coordinates. The desired joint angles are then calculated and sent to the inertially damped joint position mode of the primary controller. This mode of operation is useful when the air vehicle is flying parallel to a road and the operator wants to scan the road. The air vehicle will work to maintain its flight path parallel to, and at constant altitude with respect to, the road, making it possible for a constant NED orientation to maintain the LOS on the road as the vehicle responds to disturbances. This mode is also subject to host attitude accuracy errors; see the discussion in the GPS lock section.

4.4.2 GPS Lock

The GPS lock mode is used to point the gimbal at a specific target position in 3D space given by a set of target GPS coordinates. This mode requires the host attitude and position information to be continuously updated. Once the host attitude and position are known, the ideal look vector from the host to the target is calculated in NED coordinates. This unit vector is then transformed into gimbal mount coordinates and the joint angles necessary to point the gimbal's LOS at the target are calculated. These joint angles are then sent as commands to the inertially damped joint position mode to maintain short term stabilization of the LOS vector until the next update of host state information.


 

Figure 4-7 GPS lock block diagram

The accuracy of this mode is highly dependent on the host state solution provided to the gimbal, specifically the host attitude estimate. The error in the sensor FOV is proportional to the slant range multiplied by the angle between the ideal LOS vector and the current LOS of the sensor; for example, a 1deg error produces a 17.45ft target error at a 1000ft slant range. The small UAVs that the Tigereye is intended for have simple automotive grade MEMs IMUs onboard that produce state solutions good enough for autopilot control, but with errors on the order of 1-2 degrees in pitch and roll and 5 degrees or more in heading. These small UAVs also often do not have an absolute heading reference and instead derive heading from GPS information as an approximation. When flying in non-zero wind conditions the difference between the heading of the aircraft's body axes and its ground track can become significant.
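The geometry of the GPS lock mode can be sketched as follows in Python. The sketch assumes a flat-earth NED approximation, a 3-2-1 (yaw-pitch-roll) host Euler sequence, and a gimbal mount aligned with the aircraft body axes; the frame conventions and the function name are illustrative and are not taken directly from the Tigereye software.

```python
import math

def gps_lock_joint_angles(host_ned, target_ned, yaw, pitch, roll):
    """Pan and tilt joint angles (rad) that point the LOS at a target position.

    host_ned, target_ned : (north, east, down) positions in meters
    yaw, pitch, roll     : host attitude (rad), 3-2-1 Euler sequence
    Assumes the gimbal base axes are aligned with the aircraft body axes.
    """
    # Unit look vector from host to target, in NED.
    d = [t - h for t, h in zip(target_ned, host_ned)]
    r = math.sqrt(sum(c * c for c in d))
    n, e, dn = (c / r for c in d)
    # Rotate the NED vector into body (mount) axes: v_body = R1(roll) R2(pitch) R3(yaw) v_ned.
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    x1, y1, z1 = cy * n + sy * e, -sy * n + cy * e, dn          # yaw rotation
    x2, y2, z2 = cp * x1 - sp * z1, y1, sp * x1 + cp * z1       # pitch rotation
    xb, yb, zb = x2, cr * y2 + sr * z2, -sr * y2 + cr * z2      # roll rotation
    pan = math.atan2(yb, xb)                   # rotation about the mount z (down) axis
    tilt = math.atan2(zb, math.hypot(xb, yb))  # positive tilt looks below the mount xy-plane
    return pan, tilt
```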

4.4.3 Visual Target Tracking

The final navigation mode provides the operator with the ability to use a video processor to track a target in the video signal of the gimbal's payload and send the pixel offset from the center of the FOV to the controller for mechanical stabilization onto the target. This mode is not susceptible to the host attitude errors of the GPS or Euler lock modes described previously. In addition to the pixel error, the controller needs to be able to calculate the LOS error angles represented by the pixel errors. To do this the controller is preprogrammed with either the fixed FOV of the sensor or the equation to get the FOV from the current zoom level of the camera, along with the resolution and aspect ratio of the sensor. Once the LOS error angles are calculated with respect to the sensor coordinate system, they are transformed into the gimbal mount coordinate system. Inner loop joint angle commands are then used to point the LOS of the gimbal at the target. Inertial dampening modes are used on the inner loop while waiting for new target pixel positions to be calculated, which arrive at the video frame rate of 30Hz.
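A small-angle sketch of the pixel-offset-to-LOS-angle conversion described above is shown below in Python; the linear mapping assumes the pixel error is small relative to the FOV and ignores lens distortion, and the parameter names are illustrative.

```python
import math

def pixel_error_to_los_angles(px_err_x, px_err_y, hfov_deg,
                              image_width, image_height, aspect_ratio=4.0 / 3.0):
    """Approximate LOS error angles (rad) from a target pixel offset.

    px_err_x, px_err_y : target offset from the image center, in pixels
    hfov_deg           : current horizontal field of view of the imager (deg)
    image_width/height : sensor resolution in pixels
    """
    vfov_deg = hfov_deg / aspect_ratio                 # vertical FOV from the aspect ratio
    az_err = math.radians(hfov_deg) * px_err_x / image_width
    el_err = math.radians(vfov_deg) * px_err_y / image_height
    return az_err, el_err

# Example: target 40 px right and 25 px below center on a 720x480 frame at 42 deg HFOV.
print(pixel_error_to_los_angles(40, 25, 42.0, 720, 480))
```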

4.5 Sensor & Actuator Processing

The sensor processing subsystem provides all of the conversion from encoder counts and gyro ADC counts into engineering units. This subsystem also implements the encoder alignment and gyro calibration tables, which correct for the encoder rotation and for gyro temperature effects on scale and bias. The sensor processing also allows for the application of low pass filters to remove some of the sensor noise before it reaches the controller. The gimbal samples each sensor at 10 kHz; while this rate is overkill, the processor is able to handle it. By sampling the sensors extremely fast, the Nyquist frequency for filtering is kept very high, 2-3 orders of magnitude above the critical disturbance frequency range being stabilized, 5-50 Hz. To optimize the speed of the code, only the 1 kHz tasks are done in the Simulink controller model and the 10 kHz filtering tasks are done in optimized C code.
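A minimal sketch of the 10 kHz per-channel processing path described above is shown below: ADC counts are scaled into deg/sec, corrected with the temperature calibration scale and bias, and low-pass filtered before being handed to the 1 kHz control tasks. The class layout, the specific conversion constants, and the first-order filter form are assumptions of this sketch, not the production code.

// One gyro channel of the sensor processing subsystem (illustrative sketch).
struct GyroCal { double scale; double bias; };      // from the temperature calibration table

class GyroChannel {
public:
    GyroChannel(double voltsPerCount, double degPerSecPerVolt, double cutoffHz, double sampleHz)
        : vPerCount_(voltsPerCount), dpsPerVolt_(degPerSecPerVolt), y_(0.0)
    {
        // First-order low-pass coefficient: alpha = dt / (tau + dt).
        double dt  = 1.0 / sampleHz;
        double tau = 1.0 / (2.0 * 3.14159265358979 * cutoffHz);
        alpha_ = dt / (tau + dt);
    }

    // Called from the 10 kHz sensor interrupt.
    double update(int adcCounts, const GyroCal& cal)
    {
        double raw = adcCounts * vPerCount_ * dpsPerVolt_;   // ADC counts -> deg/sec
        double corrected = raw * cal.scale + cal.bias;       // apply temperature calibration
        y_ += alpha_ * (corrected - y_);                     // low-pass filter
        return y_;                                           // filtered deg/sec to the controller
    }

private:
    double vPerCount_, dpsPerVolt_, alpha_, y_;
};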


Figure 4-8 Sensor processing subsystem

The actuator processing subsystem is where the inner loop command gets turned into the PWM signal sent to the motor control driver. This block also applies a soft deadzone inverse to compensate for the effects of the Coulomb friction. The soft deadzone inverse was chosen because it keeps a continuous curve, allowing smooth motion of the gimbal, and it allows the control signal to pass through 0, unlike a hard deadzone inverse. The hard deadzone inverse is undefined at 0 and does not let the controller settle to 0 control power, resulting in high frequency jitter. The soft deadzone inverse is parameterized by a sharpness term, K_Soft, and a width term, K_Width; the comparisons below show the effect of the inverse deadzone feature.

Figure 4-9 Deadzone soft inverse comparisons (output command vs. input command for a range of K_Soft and K_Width values)

Figure 4-10 Deadzone inverse implementations (Hard vs. Soft)
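The exact equation printed in the thesis is not reproduced here; the following is a hedged stand-in showing one common smooth inverse-deadzone form with the same two parameters (a sharpness term K_Soft and a width term K_Width) and the same qualitative behavior described above: continuous through zero, and approaching the hard inverse deadzone as K_Soft becomes small.

#include <cmath>

// A common smooth (soft) inverse-deadzone; an illustrative stand-in, not the Tigereye's exact form.
double softDeadzoneInverse(double cmd, double kSoft, double kWidth)
{
    // tanh(cmd / kSoft) approaches sign(cmd) as kSoft -> 0, recovering the hard
    // inverse deadzone, but remains continuous and passes through zero.
    return cmd + kWidth * std::tanh(cmd / kSoft);
}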


5 Implementation and Test

This chapter discusses the implementation of the control system on the actual Tigereye gimbal. The limitations of implementing the control algorithm on a real time operating system and working with the actual sensors and drive mechanisms created significant hurdles that needed to be overcome. There were also several key UAV platform specific integration issues that required creative test methods to ensure the system was safe and ready for flight on an autonomous vehicle.

5.1 Hardware & Software development

The control algorithms developed in chapter 4 were implemented on a Blackfin 537 digital signal processor. The Blackfin 537 is a blended 16/32-bit processor with many high speed digital signal processing and microcontroller capabilities. This makes it ideally suited for quickly sampling and filtering sensor data for the control loops and handling communications to the host system and video processor.

Figure 5-1 Electronics block diagram


Figure 5-2 Software block diagram

5.1.1 Development environment

The VisualDSP++ Integrated Development Environment, IDE, was used to program the processor in C++. VisualDSP++ also provided a real time data collection and debugging tool for bench top testing and initial software development. The IDE was used to evaluate initial communications between the processor and its peripherals; an example of the data collected from the ADC is shown below.

Figure 5-3 Example dataset from MEMS gyros @ 1 kHz sample rate (pan gyro, tilt gyro, and combined PSD)

5.1.2 Ground test software

During the development it was found that software specific to testing and tuning of the gimbal needed to be put in place to separate the software development from the testing, tuning, and calibration work done for production. The "TurretCanComm" software was developed in Visual C++ Express to perform the test, tune, and calibration functions for the gimbal. This executable also acted as the primary control software for the motion tables and simulated host system messages to verify gimbal navigation functionality. Below is the primary screen for the command and control of the gimbal.

Figure 5-4 Gimbal bench test software

For test and control functionality checks a joystick interface was integrated into the TurretCanComm application. This was a very convenient feature, as it was the primary method for the operator to control the gimbal functions during manned aircraft flights. This program also communicated with the video processing board via relayed communication through the 537 processor and with the camera sensors through the CAN bus interface. Each setting on the control processor, video processor, and camera control board is settable from this interface.
For tuning, a high speed data collection method was developed in which the gimbal would collect 1 to 10 seconds worth of sensor data and downlink the data to the operator in non-real time across the CAN bus. This communication worked well and was found to be a very valuable tool for graphically assessing whether the gimbal was jittering. This method of data collection was found to be very effective, and the data capture point was eventually moved from the control interrupt on the control processor to the sensor interrupt, which samples each sensor at 10 kHz.
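As a rough sanity check on why the snapshot-and-downlink approach is needed (the channel count and sample size here are illustrative assumptions, not message definitions from the flight software): four 2-byte sensor channels captured at 10 kHz amount to 4 × 2 × 10,000 = 80 kB/s, or about 640 kbit/s of payload, which exceeds the roughly 50-60% payload efficiency available on a 1 Mbit/s CAN bus once frame overhead is included. Buffering a 1-10 second burst in memory and streaming it down in non-real time sidesteps that limit.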

Figure 5-5 Desktop development kit

5.1.3 Key issues

A significant number of software and hardware issues were encountered during the development of the Tigereye system, most in some way related to the use of a new-to-market Blackfin processor. The unfamiliarity of the processor to the development team and its use in a prototype system made it difficult to isolate issues and find their root cause. Once the base system was communicating with its peripherals, the processor's computing capability allowed un-optimized code to run very quickly and development progressed much more smoothly. To decrease development time, a key mitigation was to leave most calculations in single precision floating point. This allowed for the use of engineering units throughout the software and reduced the debug time.
On the hardware side another set of significant issues needed to be solved. A key issue specific to UAV applications was found during integration and pre-flight testing. When powered on, the Tigereye gimbal produced enough electromagnetic interference, EMI, to prevent the aircraft from keeping or obtaining a lock on the GPS satellites. It was found that this interference was due to the processor's internal clock speeds, originally set to 600 MHz and 133 MHz for maximum performance. A test matrix of GPS and processor clock speeds found that a core clock speed of 550 MHz and a system clock speed of 110 MHz did not affect GPS reception and still provided adequate performance for the control system.

5.2 Ground Testing & Calibration

Ground testing and calibration of the Tigereye gimbal system mainly consisted of: joint position encoder alignments, temperature calibration of the gyros, and the development of two dynamic motion table systems to check stabilization performance. The encoder alignment and gyro calibration were required for each gimbal and helped keep the gimbal's performance consistent from unit to unit. Additional calibration and built-in test features were also programmed into the gimbal system, such as gyro direction detection, control loop step time calibration, and automatic deadzone estimation. These were used with varying success and were not utilized on every gimbal unit.

5.2.1 Alignment

Each Tigereye gimbal requires alignment of its joint position report to enable accurate mount-to-sensor coordinate frame rotations and the use of all modes that depend on this rotation (Euler and GPS pointing). The alignment process applies an angular offset to the joint position encoder readings to compensate for the unknown installation angle of the sensed magnet. The alignment fixture conceptual layout is shown in Figure 5-6.

Figure 5-6 Alignment fixture conceptual diagram (turret mount and target coordinate systems parallel, horizontal planes of both coordinate systems coplanar, laser look vector to target)

The procedure developed for aligning the gimbal uses an alignment laser mounted to the motion table and pre-aligned to be parallel to the turret mount coordinate system. The laser is turned on and a gridded target is set approximately 25 ft away; the larger the distance, the less translational error will exist in the alignment angles. The gimbal is then manually steered to align the center FOV of the sensor with a position on the gridded target that is the same translational distance from the laser's reflection as the distance between the laser and the sensor on the motion table. The achievable tolerance for aligning the center FOV with the target is +/-2 pixels as observed on a standard definition TV. The ability of the imager to zoom in on the target can significantly reduce the angular error between the center FOV and the target. Values and tolerances for the linear offsets are shown in Table 5-1.

Table 5-1 Gimbal alignment fixture dimensions and tolerances

Linear measurement     value    tolerance    units
Y_laser2sensor         8.0      0.25         in
Z_laser2sensor         10.0     0.25         in
dist to target         25.0     0.5          ft
dist to target         300      6            in

Taking into account the measurement tolerances, the expected alignment accuracy is <+/-0.077 deg, or approximately 0.9 encoder counts, with a maximum allowable sensor FOV of 1.78 deg. With the FLIR Photon IR camera installed, the alignment accuracy is reduced to approximately +/-0.12 deg due to its larger fixed FOV of 11 deg. Although the error in this method is still observable by the gimbal with some of the intended sensor packages, it has been reduced to less than 1/40th of the driving system error (the heading report from the autopilot is ~ +/-5 deg). To reduce the alignment error further, one option is to increase the distance to the gridded target to 77 ft, which reduces the alignment error to approximately 0.5 encoder counts. Beyond this, additional decreases to the alignment error are non-functional until the joint position system increases in resolution. Note that additional alignment errors may be introduced into the system by the autopilot-to-mount attitude measurements and structural stiffness.

Table 5-2 Alignment accuracy w/ perfect alignment to center FOV

Parameter               value       units
combined offset error   0.35        in
alignment accuracy      0.068902    deg
alignment accuracy      0.783949    encoder counts

Table 5-3 Alignment accuracy w/ center FOV tolerance

Parameter                    value       units
Allowable alignment error    0.077       deg
Allowable alignment error    0.88        encoder counts
Screen alignment uncert.     2           pixels
Horiz. resolution (NTSC)     483         pixels
Vert. resolution (NTSC)      440         pixels
max HFOV                     1.955733    deg
max VFOV                     1.78162     deg
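The error budget in Tables 5-2 and 5-3 can be approximated with a short back-of-the-envelope calculation like the one below. This is a sketch only: it reproduces the tabulated values to within a few thousandths of a degree (the thesis' exact tolerance stack evidently includes terms not repeated here), and the 4096 count/rev encoder resolution is an assumption inferred from the reported counts-per-degree, not a value quoted in the text.

#include <cmath>
#include <cstdio>

int main()
{
    const double rad2deg = 180.0 / 3.14159265358979;

    const double combinedOffsetIn = 0.35;       // combined linear offset error, Table 5-2
    const double distToTargetIn   = 25.0 * 12.0;
    const double screenUncertPx   = 2.0;        // +/- pixels observed on the monitor
    const double maxHfovDeg       = 1.955733;   // Table 5-3
    const double horizResPx       = 483.0;      // Table 5-3
    const double countsPerRev     = 4096.0;     // assumed 12-bit encoder (not quoted in the text)

    double offsetTermDeg = std::atan(combinedOffsetIn / distToTargetIn) * rad2deg;  // ~0.067 deg
    double screenTermDeg = screenUncertPx * (maxHfovDeg / horizResPx);              // ~0.008 deg
    double totalDeg      = offsetTermDeg + screenTermDeg;                           // ~0.075 deg (Table 5-3 lists 0.077)
    double totalCounts   = totalDeg * countsPerRev / 360.0;                         // ~0.85 counts (Table 5-3 lists 0.88)

    std::printf("alignment error: %.3f deg, %.2f encoder counts\n", totalDeg, totalCounts);
    return 0;
}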

5.2.2 Thermal Calibration

For calibration of the MEMS gyros across the design temperature range, the gimbal was placed in a temperature chamber, allowed to thermally soak for 1 hour, and a calibration routine was run. The gimbal was programmed with a preset calibration routine to calculate a 1st order calibration: a scale and an offset (bias). To calculate the gyro scale and offset, the turret was assumed to have its inner most loop tuned to be stable and able to maintain a steady state velocity. During the calibration the gimbal was mounted to a fixed stand inside a temperature controlled chamber and, using its own axes and joint position sensors as a motion table for the gyros, performed a series of constant velocity motions.

Figure 5-7 Temperature control chamber

Steps:

1. Mount turret to stationary reference (joint velocity = true inertial velocity)
2. Allow to thermally soak for 10 minutes once the temperature chamber has reached steady state.
------- start of automated section -------
3. Single axis data collection
   a. Turn off other axis
   b. Calculate gyro calibration command array
   c. Send ith command to the inner inertial velocity loop (gimbal should hold a constant gyro velocity)
   d. Wait for settle
   e. Collect high speed data
   f. Calculate average gyro velocity
   g. Calculate average joint velocity
   h. Record gyro reported temperature reference value
   i. Return to step 3 and repeat until all commands have been sent
4. Calculate linear least squares 1st order fit for Vgyro_calibrated = Vgyro*M + B
5. Record scale and offset for the average gyro temp reference value
6. Return to step 3 and repeat for the 2nd axis
------- end of automated section -------
7. Return to step 2 for additional temperature conditions

Application of the gyro calibration during normal operation of the gimbal is done by interpolating the table of scale and offset values at the current value of the gyro temperature reference. This calibration routine produced very good results and was found to be very user friendly, allowing additional calibration data points to be inserted into the temperature calibration database along with the ability to reset the entire table; for instance, if a gyro is replaced the table needs to be recollected. The process is also fully automated with the exception of waiting for the thermal chamber and initiating the temperature calibration routine.
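The two numerical pieces of the procedure, the 1st order least squares fit from step 4 and the run-time temperature interpolation described above, can be sketched as follows. The data structures and names are illustrative assumptions, not the production code; the table is assumed sorted by temperature and non-empty.

#include <vector>
#include <cstddef>

struct CalPoint { double temp, scale, bias; };   // one row of the temperature calibration table

// Least squares fit of joint velocity (truth) vs. gyro velocity: V_cal = M * V_gyro + B.
void fitScaleBias(const std::vector<double>& gyroVel, const std::vector<double>& jointVel,
                  double& M, double& B)
{
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    std::size_t n = gyroVel.size();
    for (std::size_t i = 0; i < n; ++i) {
        sx  += gyroVel[i];              sy  += jointVel[i];
        sxx += gyroVel[i] * gyroVel[i]; sxy += gyroVel[i] * jointVel[i];
    }
    M = (n * sxy - sx * sy) / (n * sxx - sx * sx);
    B = (sy - M * sx) / n;
}

// Run-time lookup: linear interpolation of (scale, bias) at the current gyro temperature reference.
void lookupCal(const std::vector<CalPoint>& table, double temp, double& M, double& B)
{
    if (temp <= table.front().temp) { M = table.front().scale; B = table.front().bias; return; }
    if (temp >= table.back().temp)  { M = table.back().scale;  B = table.back().bias;  return; }
    for (std::size_t i = 1; i < table.size(); ++i) {
        if (temp <= table[i].temp) {
            double f = (temp - table[i-1].temp) / (table[i].temp - table[i-1].temp);
            M = table[i-1].scale + f * (table[i].scale - table[i-1].scale);
            B = table[i-1].bias  + f * (table[i].bias  - table[i-1].bias);
            return;
        }
    }
}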

Figure 5-8 Gyro calibration command profile (inertial velocity command, deg/sec, vs. time, sec)

5.2.3 Motion table

To evaluate the gimbal's stabilization characteristics in a controlled manner, without the need for expensive flight testing, two different motion tables were developed. The first motion table had a single axis of actuation driven by a computer controlled stepper motor through a belt drive system. The cabling for the gimbal passes through the center of the motion table's axis of rotation. A unique feature of this test stand is the pivot mounting system shown in Figure 5-9, which allows testing of the tilt axis as well as combined axis motion. Figure 5-9 shows the single axis test stand set up for combined axis rotations.

Figure 5-9 Single axis test stand with pivot

To run specific disturbance profiles the motor is capable of running motion scripts and responding to real time commands through a serial interface. The real time command interface was found to be the easiest and most flexible interface for sine wave disturbance profile commands. The motor command interface was integrated into the TurretCanComm control software, creating a complete motion table control, gimbal tune, and data collection interface. The physical design of the single axis test stand and its inability to produce smooth sine waves at a high enough update rate were found to be inadequate and drove the development of a second test stand. To solve these issues, a two axis test stand was created with fixed aluminum push rods connected to eccentric wheels that, when driven, create very sine-like motion. By implementing the sine wave disturbance profile in the mechanics of the motion table, a single command could be sent to the motor, eliminating the data rate limits of the real-time command interface. The resulting system produced smooth profiles that could be adjusted in frequency through software commands and in magnitude through adjustment of the eccentric drive wheels. Figure 5-10 shows the dual axis test stand in use in the inverted orientation and the CAD model in the normal orientation with a Tigereye gimbal.

Figure 5-10 Dual axis test stand (inverted operation left, CAD model right)

5.3 Flight testing

Flight testing was conducted on both manned and unmanned platforms. The manned flights were integrated into the flight test program of the Tigereye in an effort to gather operational data on the gimbal without being subject to data latency or range/UAV availability issues. Unmanned testing was done primarily at the Camp Roberts McMillan airstrip within restricted airspace, R-2504. The EFR, Educational Flight Research facility, was also used for flight testing with the ROTM platform. Both test platforms presented unique integration issues as well as the ability to test different aspects of the system performance.

5.3.1 Manned

For quick iteration testing the manned platform provided a short time to flight due to the close proximity of the San Luis Obispo airport and the short lead time for mission planning. Both manned platforms, the Cessna 150 and Van's RV-7 aircraft, were 2-place aircraft with the pilot in the left seat and the gimbal operator in the right seat. In both cases the gimbal was in full view of the free stream airflow and mounted to a vibration isolation unit. The biggest drawback of the manned test flights was the lack of host state data on both aircraft flown, which prevented testing of the GPS and Euler lock modes. To keep the installation simple, the gimbal was connected directly to a laptop running TurretCanComm and was operated with the use of an Xbox controller.

Figure 5-11 Manned platform integration (Tigereye gimbal indicated)

Initial manned flight testing was done on the Cessna 150, which provided airspeeds in the 60-70 knot range during simulated operations. Issues found during this flight testing included susceptibility to jitter due to mount vibration, degradation of pan drive system stability due to engine oil, and aerodynamic effects on early single-imager gimbals. Changes integrated into the Cessna test hardware to improve the image quality included: an improved vibration isolation mount to reduce the jitter tendencies, addition of a clear dome to eliminate aerodynamic effects, and moving the mount location to avoid the engine oil vent. The biggest limitation of the Cessna testing was the mounting of the gimbal on the main landing gear strut. Significant mount vibration problems were seen by the gimbal because the natural frequency of the landing gear was very low and the gear leg was cantilevered off of the aircraft into the free-stream flow.
During additional testing the program changed the manned platform to an experimental RV-7 airplane. This allowed the gimbal to be mounted to structure supported by the wing spar. This new mount had a significantly higher natural frequency than the gear leg mount of the Cessna 150. This aircraft is powered by a 6-cylinder Subaru-based automotive engine, which provided a smoother vibration environment for the gimbal. Testing on the RV-7 was limited to only a few flights.

5.3.2 Unmanned

The Tigereye gimbal was also flown on 4 different UAV platforms, referred to as: Rise of the Machines (ROTM, gas and electric), T-16, and Electric UAS; pictures are shown in Figure 5-12 [14], [15], [16]. Each test aircraft used a Cloud Cap Piccolo II autopilot as the primary flight control system. This autopilot provides very reliable host attitude information at 10 Hz. It also provides host LLA (latitude, longitude, and altitude) GPS position at 4 Hz, with the ability to incorporate DGPS to increase the system's positional accuracy. Command and control of the gimbal was done through the communications link provided by the Piccolo autopilot and AeroMech Engineering's custom ground control software, Sharkfin.

Figure 5-12 UAV platforms: ROTM (gas & electric), Fury UAS, and Electric UAS (not shown due to proprietary information)

Both ROTM and T-16 have internal combustion based propulsion systems that created significant airframe vibration environments. These were mitigated during testing through the use of vibration isolation mounts, gain tuning of the Tigereye to avoid jitter, and the use of high shutter speed camera settings to avoid blurring of the image. As long as the image stayed clear and focused, the digital image stabilization and track algorithms were able to track the target; as soon as the image went blurry the track was lost and had to be manually reset. Significant ground testing of the vibration environment was done by suspending the UAS from a metal frame using bungee cords attached near the center of pressure of each lifting surface. This allowed the engine to be run with minimal damping of the vibrations by an aircraft stand. Several test matrices were completed to compare side by side video performance. In the test matrices the engine rpm, engine vibration isolation method, gimbal vibration isolation method, and camera settings were varied while the gimbal maintained active inertial dampening while looking at a target ~40 ft away. Below is an example of the side by side comparison done on the ROTM aircraft.

Figure 5-13 Vibration test matrix

ROTM and Electric UAS were one-off vehicles built specifically for the development of the Tigereye system and required flight control simulation and tuning of the flight control laws to produce a stable host system for the Tigereye gimbal. The T-16 UAV already had a developed set of flight control laws for its Piccolo system and did not require further adjustment for gimbal testing.

Figure 5-14 Gimbal view from ROTM at EFR range


6 Results

This section discusses the performance results of the prototype system completed at the end of this project. The Tigereye gimbal performance was evaluated in two major categories: ground test and flight test. During ground testing the disturbance rejection performance was evaluated and tuned for best performance while staying away from any jitter limit cycles. The gimbal was then flight tested, and the performance of the 3 primary operational modes, inertial dampening, GPS lock, and target tracking, was evaluated. The testing phase of the program was ongoing, with continual improvements being worked into the gimbal and host systems.

6.1 Ground Test Disturbance Rejection

During ground testing the gimbal was set up in the dual axis test stand and the gimbal mount frame was subjected to a sine wave rotational disturbance profile. The test setup used the same gridded target as the alignment procedure, positioned approximately 25 ft from the test stand. The test was initiated by first pointing the gimbal at the center of the target and zeroing the gyro bias with the test stand stationary. A constant velocity command was then sent to the test stand motor. If there was significant drift between the center FOV of the sensor and the center of the target after the system reached steady state, the gimbal was steered so that any motion was approximately centered on the target. Changes to the motor velocity were used to adjust the frequency of the disturbance, and the magnitude of the disturbance was set to 5 deg. During the test both gimbal axis control loops were active but only one axis of the test stand was disturbed; this was done to avoid artificially high stabilization performance with one axis off. Disturbance error amplitudes were determined by measuring the peak error between the center of the target and the sensor's center FOV in the sensor's real-time video feed.
The observed motion was a very characteristic 'tic-toc' motion, with peak image velocities occurring when the joint axis velocity changes sign. It was found that increasing the integral gain on the inertial rate error reduced this tic-toc, along with increasing the sharpness and width of the deadzone compensation to allow the system to quickly compensate for the step change in Coulomb friction force. It is during this transition period that the joint rate is approximately 0 deg/sec, and the magnitude of the LOS error is equal to the time integral of the disturbance velocity over the period of zero joint velocity. The proportional, P, and derivative, D, gains were used only enough to stabilize the integral gain and were found to be the biggest contributors to initiating axis jitter. Continuing to increase the P and D gains beyond the point at which jitter occurred did continue to improve the low frequency disturbance rejection performance; however, any amount of jitter caused the image quality to deteriorate rapidly due to blurring, making those settings impractical when the gimbal is carrying imaging sensors.
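As a note on how the curves below can be read (a presumed convention consistent with the figures, not quoted from them), the disturbance rejection at each frequency is the ratio of the peak LOS error to the 5 deg disturbance amplitude expressed in decibels, DR (dB) = 20*log10(theta_LOS,peak / theta_dist,peak); for example, a 0.5 deg peak LOS error against the 5 deg input corresponds to -20 dB.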

Figure 6-1 Pan disturbance rejection performance to 5 deg sine wave disturbance (disturbance rejection, dB, vs. frequency, Hz; legacy error vs. Tigereye)

Figure 6-2 Tilt disturbance rejection performance to 5 deg sine wave disturbance (disturbance rejection, dB, vs. frequency, Hz; legacy error vs. Tigereye)

These results can be applied to mission performance parameters by assuming that the worst case angular displacement is proportional to the maximum angular velocity of the vehicle. The justification for this assumption comes from the 'tic-toc' nature of the LOS motion, whose error magnitude is proportional to the time integral of the disturbance velocity while the gimbal joint rate changes direction.

Table 6-1 Mission stabilization performance estimate

Parameter                        Value       Units     Notes
% mission                        99          %         Design goal
design margin                    20          %         Design margin
Physical res                     0.69        ft/pixel  calculated from NTSC resolution to find human
Max target motion                20          %         Target allowed to move +/-20% of FOV
Maximum HFOV length on ground    440         ft        calculated from NTSC resolution to find human
Min FOVhoriz                     1.623077    deg       Camera spec
Max Zoom Level                   26          x         Camera spec
d                                15531.28    ft        Analytical limit of stationary camera
Total angular rate               100         deg/sec   Aircraft total angular rate
Worst case stabilization         1.58        deg       Estimated performance

Figure 6-3 Target motion = f(% of flight time, zoom level) (target motion, % of screen, vs. sensor zoom level, for 50-99% of flight time; requirement: 20% max target motion)

Figure 6-4 Max slant range = f(allowable target movement, aircraft angular rate) (estimated max slant range, ft, vs. total aircraft angular rate, deg/sec, for 10-40% target movement, with min slant range camera limit)

Figure 6-5 Max zoom = f(allowable target movement, aircraft angular rate) (max zoom level, x, vs. total aircraft angular rate, deg/sec, for 10-40% target movement, with camera limit)
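A sketch of how trade curves of this kind can be generated from the Table 6-1 assumptions is given below. It is a hedged reconstruction, not the thesis' worksheet: the proportionality constant between aircraft angular rate and worst-case LOS error (1.58 deg at 100 deg/sec), the linear zoom-to-FOV relation, and the assumed 1x horizontal FOV are all assumptions of this sketch.

#include <cmath>
#include <algorithm>

const double kLosPerRate   = 1.58 / 100.0;   // deg of LOS error per deg/sec of aircraft rate (from Table 6-1)
const double kMinHfovDeg   = 1.623077;       // narrowest HFOV at max zoom, Table 6-1
const double kWideHfovDeg  = 42.2;           // assumed 1x HFOV of the zoom camera
const double kGroundHfovFt = 440.0;          // max HFOV footprint that still resolves a human, Table 6-1
const double deg2rad       = 3.14159265358979 / 180.0;

// Largest zoom level whose FOV keeps the worst-case target motion inside the allowed screen fraction.
double maxZoomLevel(double aircraftRateDps, double allowedMotionFrac)
{
    double losErrDeg  = kLosPerRate * aircraftRateDps;
    double neededHfov = std::max(losErrDeg / allowedMotionFrac, kMinHfovDeg);
    return std::min(26.0, kWideHfovDeg / neededHfov);   // assuming HFOV ~ (1x HFOV) / zoom
}

// Longest slant range at which a human is still resolvable with that usable FOV.
double maxSlantRangeFt(double aircraftRateDps, double allowedMotionFrac)
{
    double losErrDeg = kLosPerRate * aircraftRateDps;
    double usedHfov  = std::max(losErrDeg / allowedMotionFrac, kMinHfovDeg);
    return kGroundHfovFt / std::tan(usedHfov * deg2rad);
}

At zero aircraft rate this reproduces the stationary-camera limit of roughly 15,500 ft slant range at the 26x maximum zoom listed in Table 6-1, while increasing angular rate forces a wider FOV and therefore a lower usable zoom and shorter slant range.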

6.2 Flight Test

The goals of flight testing were to qualitatively evaluate the real world performance of the gimbal and its stabilization algorithms. Flying on an actual aircraft subjected the system to real vibration and aerodynamic loads as well as rotations in all three axes. The performance of the Tigereye system was adequate to meet the mission requirements; however, the stabilization performance was still very far below the capabilities of the sensor. The Cal Poly EFR was used for a significant portion of the flight testing of this project; the center of the runway is located at lat = 35.328461°, lon = -120.752403°.

Figure 6-6 Educational Flight Research facility at Cal Poly (runway center marked)

6.2.1 Inertial dampening

Inertial dampening evaluations were done on both the manned and unmanned platforms. This mode was the primary mode used by the operators to search and investigate an area of interest. For command and control of the turret a Microsoft Xbox controller was used and found to provide satisfactory performance for an inexpensive COTS controller.

Figure 6-7 Long distance view w/ overview (slant range ~ 3,600ft)

6.2.2 GPS lock

To evaluate the GPS lock performance of the gimbal and host system at a slant range of 1200 ft, an 850 ft radius orbit was set up around the center of the runway at an altitude of 850 ft above ground level, AGL. Before flight the center of the runway was surveyed using the vehicle's GPS system; this was done to reduce the number of error sources in the GPS lock test. Once surveyed, the vehicle was launched and established in the orbit, and the gimbal was commanded to look at the center of the orbit. Under smooth, zero-wind atmospheric conditions the gimbal joint angles would be maintained at constant values to look at the center of the orbit. By flying this geometry, the effect of misalignments between the autopilot reference frame and the sensor reference frame appears as a mean bias in the LOS error with respect to the target. Stabilization errors show up as relatively high frequency noise in the LOS error. The effects of constant non-zero wind show up as sinusoidal errors at the orbit frequency, which is relatively low.

Figure 6-8 Flight plan using Cloud Cap's PCC ground station software

To quantitatively evaluate the system performance, the recorded video was post-processed with the video stabilization toolbox from MATLAB. This toolbox was used to track the surveyed center of the orbit, indicated by the circle in the center of the runway in Figure 6-9, and calculate the LOS error in degrees. The gimbal's camera was operated at a constant 5x zoom level with an 8.8° HFOV. The system performance is summarized in Figure 6-10.

Figure 6-9 GPS lock target and center FOV axes (GPS lock target: lat 35.328409°, lon -120.752435°)

Figure 6-10 GPS lock performance summary

The X, Y, and total LOS error magnitudes were calculated; statistical information for the total error is shown in the right two subplots of Figure 6-10. The errors were calculated from a 210 sec video clip representative of the overall performance of the system. From the cumulative distribution, the 50% circular error probable (CEP) was found to be 1.671°. Correcting for the average error to remove the alignment biases, the 50% CEP value reduces to 1.323°. The remaining results are summarized in the following tables.

Table 6-2 Raw GPS Lock CEP

Raw      50% CEP (deg)    Distance @ 1200 ft slant range (ft)
Total    1.61             70.0
X        1.49             62.3
Y        0.58             24.3

Table 6-3 Bias corrected GPS Lock CEP

Bias Removed    50% CEP (deg)    Distance @ 1200 ft slant range (ft)
Total           1.32             55.4
X               1.23             51.7
Y               0.34             14.0

The bias errors for this test were calculated to be approximately 0.3° on each axis. The large discrepancy in error magnitude between the X and Y errors, approximately 3x, can be attributed to the lack of a true measurement of the aircraft's heading with respect to the NED coordinate system. On a Piccolo based autopilot without a magnetometer or compass, the reported heading of the vehicle is derived from the ground track velocity vector. This attitude error is not critical for flight safety of the autopilot system, but it limits a significant portion of the usable zoom level of the gimbal system under fully hands-off operation.
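To put a rough number on that heading error (an illustrative calculation, not a measurement from these flights): when ground track is used in place of heading, the error is essentially the crab angle, approximately asin(crosswind / airspeed). At the 60-70 knot airspeeds flown here, a 10 knot crosswind gives asin(10/65) ≈ 8.8 deg of heading error, above the 5 deg+ heading uncertainty quoted earlier and well above the 1-2 deg pitch and roll errors, consistent with the X-axis LOS error dominating the GPS lock results.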

6.2.3 Target tracking

The target tracking control loop was tested on both the manned and unmanned platforms. The video processing algorithms implemented on the Tigereye are described in [1]. The electronic target tracking algorithm identifies features inside the white rectangle and transmits their pixel X,Y location to the mechanical stabilization board. The gimbal then calculates a set of inertial velocity commands, as described in 4.4.3, and attempts to center the target. For the images shown below there was a software bug that centered the top right corner of the white rectangle in the sensor's FOV rather than the center of the white rectangle. The image stabilization algorithm then electronically offsets the image to eliminate any remaining errors before the image is displayed to the operator. This gives the operator a very clear and stable "hands off" video stream from which to observe the target.

Figure 6-11 Target tracking screenshots (tracker ON, image stabilization ON: the video image is electronically moved to center the target; tracker ON, image stabilization OFF: the target is centered through mechanical stabilization only)

7 Summary

7.1 Conclusions

The Tigereye gimbal system has been equipped with inner loop inertial dampening and high level outer loop controls in an effort to reduce the gimbal operator's workload when controlling a small UAS gimbal system. Hands off performance of the target tracking and GPS lock algorithms has been demonstrated on several aircraft platforms, both manned and unmanned. Inertial dampening performance improvements over legacy gimbal systems designed for other small UAS have also been demonstrated. The control system development for the Tigereye has successfully brought stabilization technology utilized in larger gimbal systems to the small UAS and has filled the identified gap in performance between small and large gimbal systems. From this perspective the project has been a success in meeting a significant number of the goals set out at the beginning of the development.
However, the Tigereye gimbal system is not without its limitations. The mechanical stabilization performance still limits overall gimbal system performance against the goal of making full use of the sensor payload and returning high quality, clear video at the narrowest FOV. The Tigereye gimbal was found to be very susceptible to control system jitter and mount vibration. The gimbal system in its current form also requires that each gimbal spend significant time at the factory undergoing test and tuning in order to achieve the desired performance while accounting for manufacturing differences between gimbals. With improved control system and mechanical designs these issues can be addressed.

Considerable time was also spent mitigating aircraft vibration and its effects on image quality. To achieve the best image quality the gimbal should be isolated as much as possible from aircraft vibration, and where possible electric propulsion systems should be used because of their lower vibration levels. The Tigereye gimbal was never designed to address vibration, but the project required significant effort to isolate the system from vibration so that an accurate assessment of the stabilization performance, with respect to aircraft attitude disturbances, could be made.

7.2 Future Work

There are many areas of future work for the Tigereye gimbal system, from stabilization performance enhancements to advanced applications. The first area to be addressed is to correct the architecture design flaw that makes disturbances in the sensor's HFOV unobservable at non-zero tilt angles, an effect that is especially significant at tilt angles greater than 30°. The solution is to move the pan gyro from the pan axis, where it senses rotations about Z_pan, and place it inside the tilt ball, where it will measure rotations about Z_sensor. This will correctly convert the system to a full direct LOS stabilization system as referenced in [5]. The design modification will make the cross-elevation axis disturbances, those along the horizontal view axis of the image, observable in all orientations. An additional coordinate system transformation will also be required to calculate the pan axis rotational velocity needed to zero the cross-elevation axis disturbances.
Utilizing modern adaptive control techniques is one area that can make significant improvements to the production gimbal system while still utilizing the existing mechanical design. It was found in section 6.1 that application of inverse deadzone compensation provided significant performance improvements but was not fully utilized in the final control system because of jitter limitations and the robustness levels required to deal with plant variability. An adaptive control law could also significantly reduce the time spent tuning the gimbal and extend the interval between re-tunings.
In addition to increasing the inner loop stabilization of the gimbal, many additional applications of the gimbal can also be explored now that the basic stabilization architecture exists. Extensions of the video processing algorithms, such as those described in [17], can be used to calculate the GPS location of a tracked target. This capability can be applied to ground systems used to track aircraft in the local airspace, auto-land systems for aircraft that have a gimbal installed, as well as navigation and attitude estimation during GPS and IRU failure conditions.

8 Bibliography

[1] Nicholas S. Cross, "On Board Video Stabilization for Unmanned Air Vehicles," San Luis Obispo, 2011.
[2] Department of Defense, "MIL-HDBK-1797, Flying Qualities of Piloted Aircraft," 1997.
[3] Sony Corporation. (2004) Sony Professional. [Online]. http://www.pro.sony.eu/biz/lang/en/eu/product/fcbcseries/fcbex980s/technicalspecs
[4] FLIR Inc. (2008, December) FLIR corporation website. [Online]. http://www.corebyindigo.com/files/Documents/Photon640_UsersGuide.pdf
[5] Peter J. Kennedy and Rhonda L. Kennedy, "Direct Versus Indirect Line of Sight (LOS) Stabilization," IEEE Transactions on Control Systems Technology, vol. 11, no. 1, pp. 3-15, January 2003.
[6] Martin Ernesto Orejas, "UAV Stabilized Platform," Kiruna, 2007.
[7] Ole C. Jakobsen and Eric N. Johnson, "Control Architecture for a UAV-Mounted Pan/Tilt/Roll Camera Gimbal," Georgia Institute of Technology, Atlanta, GA, 30332-0150, AIAA 2005-7145, 2005.
[8] Won Hong, "Adaptive Control of a Two Axis Camera Gimbal," Massachusetts Institute of Technology, 1994.
[9] Dassault Systemes. (2011) SolidWorks Help Site. [Online]. http://help.solidworks.com/2011/english/SolidWorks/sldworks/LegacyHelp/Sldworks/Parts/HIDD_MASSPROPERTY_TEXT_DLG.htm
[10] FAULHABER Group. (2012) MicroMo. [Online]. http://www.micromo.com/Micromo/DCMicroMotors/1331_SR_DFF.pdf
[11] Analog Devices. (2010, February) Analog Devices. [Online]. http://www.analog.com/static/imported-files/data_sheets/ADXRS614.pdf
[12] Jonathan Bernstein. (2003, February) Sensors Magazine. [Online]. http://www.sensorsmag.com/sensors/acceleration-vibration/an-overview-mems-inertial-sensing-technology-970
[13] Austria Microsystems. (2011) A Digi-Key Corporation Site. [Online]. http://www.austriamicrosystems.com/eng/content/download/12890/229418
[14] ARCTURUS-UAV Inc. (2012, August) ARCTURUS-UAV. [Online]. http://www.arcturus-uav.com/docs/products_brochure08.pdf
[15] Peter Abbe. (2004, July) Hangar-9 Web site. [Online]. http://www.hangar9.com/Articles/Article.aspx?ArticleID=1347&Page=2
[16] Gary Mortimer. (2010, December) SUAS News. [Online]. http://www.suasnews.com/2010/12/2884/aeromech-engineering-inc-to-expand-into-new-facility-in-san-luis-obispo-ca/
[17] D. Blake Barber, "Accurate Target Geolocation And Vision-Based Landing With Application To Search And Engage Missions For Miniature Air Vehicles," Department of Mechanical Engineering, Brigham Young University, Masters Thesis, 2007.
[18] John D. Page. Math Open Reference. [Online]. http://www.mathopenref.com/coordray.html
[19] James L. Meriam, L. G. Kraige, and William J. Palm, Engineering Mechanics Dynamics, 5th ed. New York: John Wiley & Sons Inc., 2001.
[20] Per Skoglar, "Modeling and control of IR/EO-gimbal for UAV surveillance applications," Electrical Engineering, Linkoping University, Linkoping, Sweden, Thesis ISRN LITH-ISY-EX-3258-2002, 2002.
[21] Luca Zaccarian. Department of Computer Science, Systems and Production of Civil Engineering. [Online]. http://control.disp.uniroma2.it/zack/LabRob/DCmotors.pdf
