Researchers and collaborators at the Naval Postgraduate School (NPS) are applying artificial intelligence (AI) to automate critical parts of the tracking system used by laser weapon systems (LWS) fitted on US Navy ships.
Improving target classification, pose estimation, aimpoint selection and aimpoint maintenance greatly increases an LWS's ability to assess and neutralise a hostile unmanned aerial system (UAS).
The tracking system of an LWS follows a sequence of demanding steps to successfully engage an adversarial UAS. When conducted by a human operator, the steps can be time-consuming, especially when facing numerous drones in a swarm.
And when an adversary's missiles and rockets are travelling at hypersonic speeds, mounting a proper defence becomes even more complicated.
By automating and accelerating the sequence for targeting drones with an AI-enabled LWS, a research team from NPS, Naval Surface Warfare Center Dahlgren Division, Lockheed Martin, Boeing and the Air Force Research Laboratory (AFRL) developed an approach that keeps the operator on the loop, overseeing the tracking system, rather than in the loop, manually controlling it.
“Defending against one drone isn’t a problem,” said Distinguished Professor Brij Agrawal of the NPS Department of Mechanical and Aerospace Engineering, who leads the NPS team. “But if there are multiple drones, then sending million-dollar interceptor missiles becomes a very expensive tradeoff because the drones are very cheap.
“The Navy has several LWS being developed and tested. LWS are cheap to fire but expensive to build. But once it’s built, it can keep on firing, at a few dollars per shot.”
To achieve this level of automation, the researchers generated two datasets containing thousands of drone images and trained an AI model on them. The resulting model was validated in the laboratory and then transferred to Dahlgren for field testing with its LWS tracking system.
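The article does not detail the team's training pipeline, but a standard transfer-learning setup gives a sense of how thousands of labelled drone images become a classifier. In this minimal PyTorch sketch, the dataset path, the class count and the ResNet-18 backbone are all illustrative assumptions rather than details from the NPS work.

```python
# Minimal transfer-learning sketch (illustrative, not the NPS team's actual pipeline).
# Assumes a folder-per-class image dataset, e.g. data/train/<drone_type>/*.jpg
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_DRONE_TYPES = 5  # assumption: number of drone classes in the dataset

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

train_set = datasets.ImageFolder("data/train", transform=preprocess)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

# Start from an ImageNet-pretrained backbone and replace the classifier head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_DRONE_TYPES)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(10):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```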
Funded by the Directed Energy Joint Transition Office (DE-JTO) and the Office of Naval Research (ONR), this research addresses the advanced AI and directed energy technology applications cited in the CNO NAVPLAN.
During a typical engagement with a hostile drone, radar makes the initial detection and the contact information is then handed off to the LWS. The operator uses the LWS's infrared sensor, which has a wide field of view, to begin tracking the drone.
Next, the high-energy laser (HEL) telescope, with its high magnification and narrow field of view, takes over the track while fast-steering mirrors maintain the lock on the drone.
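Maintaining that lock is, at its core, a closed-loop pointing problem: keep the tracked point on the telescope boresight as the drone moves. The sketch below illustrates the idea with a simple proportional controller on pixel error; the gain, units and coordinates are assumptions for illustration, and a fielded LWS controller would be far more sophisticated (feedforward, jitter rejection, and so on).

```python
# Illustrative aimpoint-maintenance loop: a proportional controller on pixel
# error. Gain and coordinates are invented for the example.

def mirror_command(aimpoint_px, boresight_px, gain=0.002):
    """Convert the pixel error between the tracked aimpoint and the telescope
    boresight into small tip/tilt corrections (notionally in milliradians)."""
    ex = aimpoint_px[0] - boresight_px[0]
    ey = aimpoint_px[1] - boresight_px[1]
    return gain * ex, gain * ey

# Each video frame: locate the aimpoint, then nudge the mirrors to re-centre it.
tip, tilt = mirror_command(aimpoint_px=(512, 390), boresight_px=(480, 360))
print(f"tip={tip:.3f} mrad, tilt={tilt:.3f} mrad")
```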
With a video screen showing the image of the distant drone, the operator compares the image to a target reference to classify the type of drone and identify its unique aimpoints. Each drone type has different characteristics, and its aimpoints are the locations where that particular drone is most vulnerable to incoming laser fire.
Along with determining the drone type and aimpoints, the operator must identify the drone's pose, or its orientation relative to the LWS, which is necessary for locating the aimpoints. The operator looks at the drone's image on the screen to determine where to point the LWS and then fires the laser beam.
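Automating this step amounts to turning the classified drone type and estimated pose into an aimpoint the tracker can hold. As a hedged illustration, the sketch below uses a hypothetical lookup table of per-type, per-pose aimpoint offsets; the class names, poses and pixel offsets are invented for the example and do not come from the NPS research.

```python
# Hypothetical aimpoint selection: map the classified drone type and estimated
# pose to a vulnerable point. All values below are illustrative assumptions.

AIMPOINT_TABLE = {
    # (drone_type, pose) -> aimpoint offset from the image centroid, in pixels
    ("fixed_wing", "broadside"): (-40, 5),
    ("fixed_wing", "head_on"): (0, -10),
    ("quadcopter", "broadside"): (0, 15),
}

def select_aimpoint(drone_type, pose, centroid):
    """Return pixel coordinates of the aimpoint for this type/pose estimate."""
    dx, dy = AIMPOINT_TABLE[(drone_type, pose)]
    return centroid[0] + dx, centroid[1] + dy

print(select_aimpoint("fixed_wing", "broadside", centroid=(512, 384)))
```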
Long distances and atmospheric conditions between the LWS and the drone can degrade image quality, making all of these identifications more challenging and time-consuming.