Abstract
The United States Navy (USN) intends to increase the number of uncrewed aircraft in a carrier air wing. To support this increase, carrier-based uncrewed aircraft will require some level of autonomy, as there will be situations where a human cannot be in or on the loop. However, no approved method currently exists to certify autonomy within Naval Aviation. In support of generating certification evidence for autonomy, the United States Naval Academy (USNA) has created a training and evaluation system (TES) that provides quantifiable metrics of feedback performance in autonomous systems. The preliminary use case for this work is autonomous aerial refueling. Prior demonstrations of autonomous aerial refueling have leveraged a deep neural network (DNN) that processes visual feedback to approximate the relative position of an aerial refueling drogue. The TES proposed in this work uses industrial robotics to simulate the relative motion between the aerial refueling drogue and the feedback camera system. Ground-truth measurements of the pose between the camera and the drogue are provided by a commercial motion capture system. Preliminary results demonstrate calibration methods that provide ground-truth measurements with millimeter precision. Leveraging this calibration, the proposed system can generate large-scale datasets for DNN training and evaluation against a precise ground truth.