Faster, Lighter, Smarter: DARPA Gives Small Autonomous Systems a Tech Boost

The Defense Advanced Research Projects Agency (DARPA) issued the following announcement on July 18.

DARPA’s Fast Lightweight Autonomy (FLA) program recently completed Phase 2 flight tests, demonstrating advanced algorithms designed to turn small air and ground systems into team members that could autonomously perform tasks dangerous for humans – such as pre-mission reconnaissance in a hostile urban setting or searching damaged structures for survivors following an earthquake.

Building on Phase 1 flight tests in 2017, researchers refined their software and adapted commercial sensors to achieve greater performance with smaller, lighter quadcopters. Conducted in a mock town at the Guardian Centers training facility in Perry, Georgia, aerial tests showed significant progress in urban outdoor as well as indoor autonomous flight scenarios, including:

Flying at increased speeds between multi-story buildings and through tight alleyways while identifying objects of interest;

Flying through a narrow window into a building and down a hallway searching rooms and creating a 3-D map of the interior; and

Identifying and flying down a flight of stairs and exiting the building through an open doorway.

Begun in 2015, the FLA applied research program has focused on developing advanced autonomy algorithms—the smart software needed to yield high performance from a lightweight quadcopter weighing about five pounds with limited battery power and computer processing capability onboard. FLA’s algorithms have been demonstrated so far on air vehicles only, but they could be used on small, lightweight ground vehicles as well.

“The outstanding university and industry research teams working on FLA honed algorithms that in the not too distant future could transform lightweight, commercial-off-the-shelf air or ground unmanned vehicles into capable operational systems requiring no human input once you’ve provided a general heading, distance to travel, and specific items to search,” said J.C. Ledé, DARPA program manager. “Unmanned systems equipped with FLA algorithms need no remote pilot, no GPS guidance, no communications link, and no pre-programmed map of the area – the onboard software, lightweight processor, and low-cost sensors do all the work autonomously in real-time.”

FLA’s algorithms could lead to effective human-machine teams on the battlefield, where a small air or ground vehicle might serve as a scout, autonomously searching unknown environments and bringing back useful reconnaissance information to a human team member. Because the scout needs no communications link back to the launch vehicle, the chances of an adversary detecting troop presence based on radio transmissions are reduced, which adds further security and safety, Ledé said. This could be particularly important in a search-and-rescue scenario, where an FLA-equipped platform could search in radio silence behind enemy lines for a downed pilot or crew member.

During Phase 2, a team of engineers from the Massachusetts Institute of Technology and Draper Laboratory reduced the number of onboard sensors to lighten their air vehicle for higher speed.

“This is the lightweight autonomy program, so we’re trying to make the sensor payload as light as possible,” said Nick Roy, co-leader of the MIT/Draper team. “In Phase 1 we had a variety of different sensors on the platform to tell us about the environment. In Phase 2 we really doubled down trying to do as much as possible with a single camera.”
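
Relying on a single camera means the vehicle must estimate its own motion from imagery alone. The team’s actual estimator is not described here; purely as an illustration of the underlying idea, the sketch below uses OpenCV to recover frame-to-frame camera rotation and scale-free translation from matched image features. The camera intrinsics and all parameter values are assumptions, not the team’s code.

```python
# Illustrative monocular visual-odometry step (not the MIT/Draper system):
# match features between consecutive frames and recover relative camera motion.
import cv2
import numpy as np

# Assumed pinhole intrinsics for the onboard camera (placeholder values).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def relative_pose(prev_gray, curr_gray):
    """Estimate rotation R and unit-scale translation t between two grayscale frames."""
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return None
    matches = matcher.match(des1, des2)
    if len(matches) < 8:                         # too few matches to estimate motion reliably
        return None
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    if E is None:
        return None
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
    return R, t                                  # translation direction only; scale is unobservable
```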

A key part of the team’s task was for the air vehicle to build not only a geographically accurate map as it traversed the cityscape but also a semantic one.

“As the vehicle uses its sensors to quickly explore and navigate obstacles in unknown environments, it is continually creating a map as it explores and remembers any place it has already been so it can return to the starting point by itself,” said Jon How, the other MIT/Draper team co-leader.
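
How describes two capabilities: building a map on the fly and remembering where the vehicle has been so it can fly itself back to the start. The real system plans over its full map; the toy sketch below illustrates only the “remember where you’ve been” part, storing spaced waypoints that can be replayed in reverse to return home without GPS. All names and values are hypothetical.

```python
# Toy "breadcrumb" return-to-start illustration (hypothetical; the real FLA
# mappers plan over a full 3-D map rather than retracing a raw trail).
from dataclasses import dataclass, field

def _dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

@dataclass
class BreadcrumbTrail:
    min_spacing: float = 1.0                  # metres between stored waypoints (assumed)
    trail: list = field(default_factory=list)

    def record(self, position):
        """Store the current estimated position if far enough from the last waypoint."""
        if not self.trail or _dist(self.trail[-1], position) >= self.min_spacing:
            self.trail.append(position)

    def route_home(self):
        """Waypoints back to the starting point, newest first."""
        return list(reversed(self.trail))

trail = BreadcrumbTrail()
for pose in [(0.0, 0.0, 1.5), (2.0, 0.1, 1.5), (4.1, 0.3, 1.6), (4.2, 2.5, 1.6)]:
    trail.record(pose)
print(trail.route_home())   # fly these in order to get back to the launch point
```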

Using neural nets, the onboard computer recognizes roads, buildings, cars, and other objects and identifies them as such on the map, providing clickable images as well. The human team member could download the map and images from the onboard processor after the mission is completed.
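
The article does not name the detector or map representation used; the minimal sketch below simply illustrates the idea of attaching each recognized object, its map position, and a saved snapshot to a semantic layer that a teammate can browse after the mission. The data structure and threshold are assumptions.

```python
# Hypothetical semantic-map annotation: attach detector outputs to map
# coordinates along with an image crop the operator can click on later.
from dataclasses import dataclass

@dataclass
class MapAnnotation:
    label: str          # e.g. "car", "building", "road" from the onboard neural net
    confidence: float
    position: tuple     # (x, y, z) in the vehicle's map frame
    image_path: str     # saved snapshot shown when the map icon is clicked

semantic_map = []

def annotate(label, confidence, map_position, snapshot_file):
    """Record one recognized object on the semantic layer of the map."""
    if confidence >= 0.5:                        # assumed acceptance threshold
        semantic_map.append(MapAnnotation(label, confidence, map_position, snapshot_file))

annotate("car", 0.91, (12.4, -3.0, 0.0), "snapshots/car_0001.jpg")
```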

Additionally, the MIT/Draper team incorporated the ability to sync data collected by the air vehicle with a handheld app called the Android Tactical Assault Kit (ATAK), which is already deployed to military forces. Using an optional Wi-Fi link from the aircraft (that the human team member could turn on or off as desired), the air vehicle can send real-time imagery of objects of interest. During the flight tests, researchers successfully demonstrated autonomous identification of cars positioned in various locations around the mock town. With “exploration mode” on, the air vehicle identified the cars and provided their locations with clickable high-resolution images in real-time via Wi-Fi, appearing as an overlay on the ATAK geospatial digital map on the handheld device.
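
ATAK has its own message formats, which the article does not detail. Purely as a hypothetical stand-in for the data flow described (a detection, its position, and a link to a high-resolution image pushed over an optional Wi-Fi link), the sketch below broadcasts a small JSON packet over UDP; every field, address, and value is an assumption rather than the actual ATAK interface.

```python
# Hypothetical illustration of pushing a detection to a handheld device over
# Wi-Fi. ATAK uses its own message formats; this JSON/UDP packet is only a
# stand-in for the general "detection -> geospatial overlay" data flow.
import json
import socket
import time

GROUND_STATION = ("192.168.1.50", 4242)   # assumed address of the handheld device

def send_detection(sock, label, lat, lon, image_url):
    packet = {
        "type": "detection",
        "label": label,                   # e.g. "car"
        "lat": lat,
        "lon": lon,
        "image": image_url,               # link to the high-resolution snapshot
        "timestamp": time.time(),
    }
    sock.sendto(json.dumps(packet).encode("utf-8"), GROUND_STATION)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_detection(sock, "car", 32.4583, -83.7312, "http://uav.local/img/0001.jpg")
```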

A separate team of researchers from the University of Pennsylvania reduced their air vehicle’s size and weight to be able to fly autonomously in small, cluttered indoor spaces. UPenn’s air vehicle took off outside, identified and flew through a second-story window opening with just inches of clearance, flew down a hallway looking for open rooms to search, found a stairwell, and descended to the ground floor before exiting back outside through an open doorway.

The platform’s reduced weight and size brought new challenges, since the sensors and computers used in Phase 1 were too heavy for the smaller vehicle.

“We ended up developing a new integrated single-board computer that houses all of our sensors as well as our computational platform,” said Camillo J. Taylor, the UPenn team lead. “In Phase 2 we flew a vehicle that’s about half the size of the previous one, and we reduced the weight by more than half. We were able to use a commercially available processor that requires very little power for the entirety of our computational load.”

A key feature of the UPenn vehicle is its ability to create a detailed 3-D map of unknown indoor spaces, avoid obstacles, and fly down stairwells.

“That’s very important in indoor environments,” Taylor said. “Because you need to actually not just reason about a slice of the world, you need to reason about what’s above you, what’s below you. You might need to fly around a table or a chair, so we’re forced to build a complete three-dimensional representation.”
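
One common way to reason about what is above and below the vehicle, rather than a single horizontal slice, is a 3-D voxel occupancy grid, in which each cell of a volume is marked free or occupied so the planner can check clearance in any direction. The sketch below is a minimal, hypothetical version of that idea, not UPenn’s actual mapping code.

```python
# Minimal 3-D occupancy grid sketch (hypothetical; UPenn's actual mapping
# representation is not described in the article). Each voxel is marked
# occupied or free so the planner can reason above and below the vehicle.
import numpy as np

class OccupancyGrid3D:
    def __init__(self, size=(100, 100, 30), resolution=0.1):
        self.resolution = resolution                 # metres per voxel (assumed)
        self.grid = np.zeros(size, dtype=np.int8)    # 0 = unknown/free, 1 = occupied

    def _index(self, point):
        return tuple(int(c / self.resolution) for c in point)

    def mark_occupied(self, point):
        """Mark the voxel containing a sensed obstacle point (x, y, z), in metres."""
        self.grid[self._index(point)] = 1

    def is_free(self, point):
        """True if the voxel at (x, y, z) has not been marked as an obstacle."""
        return self.grid[self._index(point)] == 0

grid = OccupancyGrid3D()
grid.mark_occupied((1.2, 0.8, 0.4))      # e.g. the surface of a table
print(grid.is_free((1.2, 0.8, 1.5)))     # free space above the table -> True
```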

The next step, according to Taylor, is packing even more computation onto smaller platforms, potentially making a smart UAV for troops or first responders that is small enough to fit in the palm of the hand.

Algorithms developed in the FLA program are scheduled to transition to the Army Research Laboratory for further development for potential military applications.




