Air Force Wants AI Assistant That Talks Back

The Air Force wants to take the idea of a virtual assistant to the next level, with a system that not only draws from existing information to answer questions, but puts some additional thought into helping airmen make better decisions. This is accomplished by quizzing them about what, precisely, they plan to do.

The idea behind the Multi-Source Exploitation Assistant for the Digital Enterprise (MEADE) project is to have a virtual wingman that can perform advanced analytics on the spot for essentially any user, regardless of their technical expertise, the Air Force Research Laboratory (AFRL) said in a Broad Agency Announcement kicking off what could be a five-year, $25 million initiative.

While most artificial intelligence analytic ideas have dealt with large-scale data or image processing, this one would be more focused, tackling each job as it comes up. The project aims to develop the software tools, interfaces, and processes to put automated analytics to practical use.

MEADE would respond to voice, text, or file upload inputs, and the Air Force acknowledges that it would operate similarly to virtual assistants like Apple’s Siri, Amazon’s Alexa, or Google Assistant. Sure, it would answer questions, do calculations, and readily provide basic information such as weather conditions. But rather than simply offering a ranked list of possible answers, MEADE would perform another level of interaction, asking follow-up questions to better define the user’s intent and reduce ambiguity. AFRL plans to develop multi-INT tools (multiple intelligence, combining signals, open-source, and other intelligence inputs) in support of the project.
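The announcement doesn't spell out how that clarification step would work, but the basic pattern — ask a follow-up question when a query is ambiguous rather than returning a ranked list of guesses — can be sketched in a few lines. Everything here (the term list, the `ask` callback) is hypothetical, purely to illustrate the interaction style described above:

```python
# Illustrative sketch only, not MEADE's actual design: a minimal
# clarification loop that asks a follow-up question when a query
# contains an ambiguous term, instead of guessing the user's intent.

AMBIGUOUS_TERMS = {
    # Hypothetical mapping from an ambiguous term to candidate intents.
    "range": ["weapon range", "radio range", "fuel range"],
    "strike": ["air strike planning", "labor strike reports"],
}

def answer(query: str, ask) -> str:
    """Resolve ambiguity via a follow-up question, then answer.

    `ask` is a callback that poses a question to the user and
    returns their reply (here it stands in for a voice/text UI).
    """
    for term, intents in AMBIGUOUS_TERMS.items():
        if term in query.lower():
            choice = ask(f"By '{term}', do you mean: {', '.join(intents)}?")
            return f"Searching multi-INT sources for '{choice}'..."
    return f"Searching multi-INT sources for '{query}'..."

# Usage: simulate a user who answers the follow-up with "fuel range".
result = answer("what is the range?", ask=lambda question: "fuel range")
print(result)  # Searching multi-INT sources for 'fuel range'...
```

A production system would of course infer ambiguity statistically rather than from a fixed term list, but the dialogue shape is the same.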

“Ultimately, an airman would be armed with a richer set of information, ‘connecting more dots’ across air, space, cyber, land, sea, and undersea to provide a more comprehensive situation understanding,” AFRL said.

In theory, MEADE could work across the board, on the ground, or in the air. If used in cockpits, its ability to analyze incoming data might help pilots avoid the “helmet fire” that results when a barrage of too much information creates mental overload, and actually reduces situational awareness. But in any setting, it would assist with any task at hand, while also helping to inform military decision-making and command and control functions.

The project has two focus areas. Real-Time Operator-Driven Gist Exploration and Response (ROGER) would combine multi-INT search and retrieval, natural language processing, recommendation engines, applied analytics, and a question-and-answer system that can execute in a cloud or distributed computing environment. While ROGER will be able to answer questions concerning multiple domains, it also would interact with the users to tailor and optimize results.

Interactive Analytics and Contextual Fusion (IACF), working in tandem with ROGER, would seek to add context plus anticipatory and prescriptive analytics (determining the best course of action) to the process. It would add elements such as behavior analytics and a hybrid information extraction process that would help with “patterns of life” anomaly detection, among other features.
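"Patterns of life" anomaly detection generally means flagging behavior that breaks an established routine. As a toy illustration of the idea (not IACF's actual method, which the announcement does not detail), one could flag any observation far from the historical mean using a simple z-score threshold:

```python
# Illustrative sketch only: flag a "pattern of life" anomaly as any
# observation more than `threshold` standard deviations from the mean
# of the series. Real behavior analytics would be far more sophisticated.
from statistics import mean, stdev

def anomalies(history, threshold=2.0):
    """Return indices of observations whose z-score exceeds `threshold`."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return []  # no variation, nothing stands out
    return [i for i, x in enumerate(history)
            if abs(x - mu) / sigma > threshold]

# Usage: hypothetical daily vehicle counts at a site; day 6 (index 6)
# breaks the routine and gets flagged.
counts = [12, 14, 13, 15, 12, 14, 60, 13]
print(anomalies(counts))  # [6]
```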

By taking advantage of advances in artificial intelligence, natural language processing, and other fields, AFRL hopes to add a new dimension in the use of these technologies. “MEADE involves creating the foundation for a fully functional ‘Information Wingman’ by adding the element of mutual support to enhance analytics,” AFRL said.