Constance D. Hendrix, Michael J. Veth, and Ryan W. Carr
Introduction
The need to conduct warfare in urban environments presents a new frontier for navigation and control of small UAVs in locations where GPS may be degraded or denied. The Advanced Navigation Technology (ANT) Center Indoor Micro Aerial Vehicle project seeks to build a small UAV with the capability to operate autonomously in these constrained environments. The first step in the development of a robust, non-GPS guidance, navigation, and targeting system for small indoor air vehicles is to design and validate a flight control system for micro aerial vehicles using an externally-mounted optical tracking system. For this project, the controller was designed using a modified Linear Quadratic Gaussian (LQG) technique which incorporates an Unscented Kalman Filter (UKF) for state estimation. The resulting controller was successfully flight tested in the Air Force Research Laboratory's (AFRL) Micro Air Vehicle (MAV) Indoor Flight Facility (IFF) in September 2008. In this article, the top-level design and flight testing approach are presented.
Design and Testing Approach
Helicopters without the benefit of a stability augmentation system tend to possess unstable dynamics, and the small electric helicopter chosen for this project (see Figure 1) is no exception: maintaining a stable hover requires continuous control updates. As such, many previous research efforts have been dedicated to the design and testing of automatic flight control algorithms that stabilize small helicopters. Flight control techniques used for this application include, but are not limited to, vision-only navigation and control using a UKF, loosely coupled GPS/INS integration using a UKF, and camera and IMU updates to a UKF driving a trajectory generator.
Although the end goal is to autonomously navigate unknown GPS-denied environments, the first step is to build a model-based hover controller which utilizes the external optical tracking system recently installed in AFRL's MAV-IFF. The IFF uses motion capture cameras to provide accurate position and orientation information to a real-time controller, which in turn relays commands through an RC transmitter to any off-the-shelf or uniquely-designed vehicle. This real-time motion capture facility is one of only a handful in the US. Other organizations using similar facilities for controller design and testing include MIT (Aerospace Controls Laboratory - RAVEN), Georgia Tech, and Boeing, most of whose published results have appeared within the past year.
The helicopter flight control system design process consisted of modeling, simulation, and flight tests. To begin, the helicopter model was built from the bottom up using measurements collected in the ANT laboratory. The vehicle and sensor models were then implemented in Simulink and used to derive the Unscented Kalman Filter (UKF) state estimator and the LQG controller.
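The regulator half of an LQG design reduces to solving an algebraic Riccati equation for a full-state feedback gain. A minimal sketch of that step in Python, assuming a toy single-axis double-integrator hover model and illustrative weights, not the helicopter model identified in the ANT laboratory:

```python
# Sketch of the LQR design step. The dynamics, weights, and dimensions
# below are illustrative assumptions, not the article's identified model.
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy single-axis hover dynamics: state x = [position, velocity],
# input u = commanded acceleration.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

# LQR weights: penalize position error more heavily than control effort.
Q = np.diag([10.0, 1.0])
R = np.array([[1.0]])

# Solve the continuous-time algebraic Riccati equation and form the gain.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)   # K = R^-1 B^T P

# Full-state feedback law: u = -K @ (x - x_ref)
```

The same gain computation carries over unchanged when the toy model is replaced by a linearized multi-axis helicopter model; only the dimensions of A, B, Q, and R grow.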
The system was then tested in Simulink using a combination of simulated and pre-recorded flight data. After adjusting the models for acceptable performance in the simulation environment, the design effort proceeded to hardware tests in the IFF. The hardware testing was accomplished in two steps. First, the controller was tested in the LQR configuration (i.e., no state estimator in the loop). After verifying the performance of the LQR controller, the state estimator was inserted into the loop, thus representing the full LQG controller.
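The state estimator inserted in the second hardware step can be sketched as a minimal additive-noise UKF. Everything here (the unaugmented sigma-point form, the scaling parameter, the generic process and measurement models) is an illustrative assumption rather than the article's implementation:

```python
# Minimal additive-noise UKF sketch (not the article's implementation).
import numpy as np

def sigma_points(mean, cov, kappa):
    """Symmetric sigma-point set with standard weights."""
    n = len(mean)
    S = np.linalg.cholesky((n + kappa) * cov)
    pts = [mean] + [mean + S[:, i] for i in range(n)] \
                 + [mean - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def ukf_step(x, P, z, f, h, Q, R):
    """One predict/update cycle: process model f, measurement model h."""
    n = len(x)
    X, w = sigma_points(x, P, kappa=3.0 - n)
    # Predict: push sigma points through the process model.
    Xp = np.array([f(s) for s in X])
    xp = w @ Xp
    Pp = Q + sum(wi * np.outer(d, d) for wi, d in zip(w, Xp - xp))
    # Update: push predicted points through the measurement model.
    Zp = np.array([h(s) for s in Xp])
    zp = w @ Zp
    Pzz = R + sum(wi * np.outer(d, d) for wi, d in zip(w, Zp - zp))
    Pxz = sum(wi * np.outer(dx, dz)
              for wi, dx, dz in zip(w, Xp - xp, Zp - zp))
    K = Pxz @ np.linalg.inv(Pzz)
    return xp + K @ (z - zp), Pp - K @ Pzz @ K.T
```

On a linear model this reduces exactly to the standard Kalman update, which makes the LQR-first, estimator-second test sequence a natural way to isolate faults: any closed-loop degradation seen after inserting the filter is attributable to the estimator, not the gains.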
AFRL/RB Indoor Flight Facility
The MAV IFF is a 30’ x 30’ x 20’ chamber outfitted with a suite of 36 Vicon motion capture cameras. The lenses on the cameras are encircled by LED strobe lights, which serve to illuminate the flight volume. Retro-reflective markers are placed on an object (such as a MAV) and each camera sees a 2D image of the markers. These images are processed using the Vicon IQ software on a quad-core computer to produce accurate 3D position and attitude data of the object within the capture volume. When properly calibrated, the Vicon motion capture system produces position measurements of the target markers with repeatability on the order of millimeters.
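Conceptually, once the cameras have triangulated the markers into 3D, the object's position and attitude follow from a least-squares rigid-body fit of the measured marker positions to their known body-frame layout. A sketch of that fit using the standard Kabsch/Procrustes algorithm (Vicon's actual processing pipeline is proprietary, so this is only the underlying idea):

```python
# Rigid-body pose from marker positions (Kabsch/Procrustes via SVD).
# Illustrates the principle only; not Vicon's proprietary pipeline.
import numpy as np

def rigid_pose(body_pts, world_pts):
    """Least-squares rotation R and translation t such that
    world ≈ R @ body + t, given matched (n, 3) marker sets."""
    cb, cw = body_pts.mean(axis=0), world_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (body_pts - cb).T @ (world_pts - cw)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cw - R @ cb
```

At least three non-collinear markers are required for the fit to determine attitude uniquely, which is why tracked objects carry an asymmetric constellation of markers.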
The Vicon motion capture data are transmitted to a National Instruments real-time processor via an Ethernet connection. LabVIEW is used to design and run stability and navigation controls in real time. Controller commands are then sent to the vehicle over a wireless datalink. A diagram of the system is displayed in Figure 2. The control software has been designed to allow easy interfacing with controllers not written in LabVIEW: researchers can write code in Matlab Simulink, C, or C++ and simply compile it into a Microsoft Windows Dynamic Link Library (DLL), which can then, in principle, be bound with the LabVIEW interface. This allows quick integration of a variety of controllers.
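As an illustration of the kind of entry point such a DLL might export, a toy proportional controller step in C is sketched below. The function name, signature, and units are assumptions for illustration only, not the facility's actual binding interface:

```c
/* Hypothetical controller entry point of the kind that could be compiled
 * into a Windows DLL and bound to the LabVIEW interface.  The name and
 * signature are illustrative assumptions, not the facility's actual API. */
#ifdef _WIN32
#define EXPORT __declspec(dllexport)
#else
#define EXPORT
#endif

/* Single-axis proportional hover controller: maps a position error to a
 * stick command, saturated to a normalized transmitter range of [-1, 1]. */
EXPORT double controller_step(double pos, double pos_ref, double gain)
{
    double u = -gain * (pos - pos_ref);
    if (u > 1.0)  u = 1.0;
    if (u < -1.0) u = -1.0;
    return u;
}
```

A full controller would export one such step function per control axis (or a vector-valued equivalent) and be called once per motion-capture frame by the real-time loop.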
Results
The control system was tested in the IFF using a combination of static and dynamic profiles. The vehicle tracked the desired trajectory with errors of less than 20 cm in position and less than five degrees in heading; the largest errors occurred during transient position commands from the operator. A video of the flight test evaluation is also available.

Unmanned aerial vehicles have changed the modern battlefield, and as they continue to shrink, their operating environment is changing as well. The controller designed and tested in this article is a first step toward addressing the challenges of that environment, specifically the denial of the most accurate navigation solution available, GPS. In designing the LQG controller, AFRL’s MAV-IFF was used to verify each design step through hardware testing while eliminating the need to place a microprocessor and sensors onboard the vehicle. Future efforts using the same process will include incorporating inertial measurements into the model, providing image updates to the UKF, and integrating the system onboard to make the Walkera truly autonomous.