
Pentagon looks to acquire generative AI for influence activities: RAND

With the Pentagon’s acquisition of deepfake capabilities & history of assisting Hollywood, distinguishing PSYOPs from reality will be more difficult than ever before: perspective

The Pentagon is looking to acquire generative AI for its influence activities as it faces “increasing scrutiny” over affecting US citizens and acquiring publicly available information, according to a RAND Corporation report.

Commissioned by the Irregular Warfare Technical Support Directorate (IWTSD), the RAND report, “Acquiring Generative Artificial Intelligence to Improve US Department of Defense Influence Activities,” takes a look at what it would take for the Pentagon to acquire generative AI for influence activities at scale, along with possible implications of its use.

An influence activity is “a deliberate attempt to affect a person’s or group’s thoughts, feelings, or behavior,” and the US Department of Defense (DoD) sees great opportunities in leveraging the power of generative AI not only for creating deepfakes and propaganda campaigns for its own influence activities, but also for gaining unique insights into its targeted audiences for planning purposes.

“Generative AI can be used to create videos that purport to show well-known public figures or members of particular racial or ethnic groups making statements that support or refute particular political viewpoints or committing particular acts”

RAND, Acquiring Generative Artificial Intelligence to Improve US Department of Defense Influence Activities research brief, July 2025
Source: RAND

The advent of generative AI presents significant opportunities for enhancing DoD’s influence activities by improving the quality, agility, speed, and scale of operations

RAND, Acquiring Generative Artificial Intelligence to Improve US Department of Defense Influence Activities, July 2025

For the report, RAND and the IWTSD interviewed subject matter experts and held a workshop to develop recommendations for the Pentagon’s acquisition of generative AI.

Since influence activities are aimed at specific audiences for specific outcomes, the US military is constantly monitoring social media and other types of Open Source Intelligence (OSINT) and all-source intelligence channels to discover emerging patterns and narratives.

One thing the authors note is that as the digital environment spreads across national borders, it is becoming more difficult not to affect US persons who get caught in the dragnet of US military psychological operations (PSYOPs).

“DoD’s influence activities have faced increasing scrutiny as it becomes increasingly challenging to avoid affecting US persons in the hyperconnected information age”

RAND, Acquiring Generative Artificial Intelligence to Improve US Department of Defense Influence Activities, July 2025
Source: RAND

“Operational forces (particularly at the tactical level) require the ability to clone voices to convey messages. A team may want to mimic an enemy commander as part of an effort to get them to surrender”

RAND, Acquiring Generative Artificial Intelligence to Improve US Department of Defense Influence Activities, July 2025

Generative AI can help the military better understand the audiences it is looking to manipulate.

One of the recommendations made was that the Pentagon should “obtain publicly available information (PAI) and fuse it with all-source intelligence” to make sense of the operational environment (OE).

In doing so, “Generative AI can enhance the capability to identify and acquire relevant PAI; several industry partners offer ‘narrative intelligence’ products that use generative AI processes to perform or supplement PAI analysis.”

On this front, the US intelligence community is already setting up a portal for the government to purchase PAI through the Intelligence Community (IC) Data Consortium (ICDC), previously known as the “IC Data Co-op,” which seeks solutions “to manage a commercial data consortium that unifies commercial data acquisition.”

Fusing publicly available information with all-source intelligence and generative AI would allow the Pentagon to surveil multiple channels in near real-time, and with that knowledge create models of the target audience for its influence activity planning.

DoD could maintain a common operating picture (COP) or baseline of the OE […] This means near-real-time fusing of different types of data received in different formats on different rhythms.

These types of data can include social media posts, 24-hour television news, blog posts, and academic journal articles as much as all-source intelligence and situation reports from tactical units.

A true information COP would distill these feeds into a digestible format, such as a dashboard

RAND, Acquiring Generative Artificial Intelligence to Improve US Department of Defense Influence Activities, July 2025

This common operating picture (COP) would allow the DoD to experiment with different messaging and behavior modification strategies “in ways that are not yet possible.”

For example, the report suggests “building a ‘sandbox’ for testing messages and thinking about how they may resonate with different target audiences. It may also be possible to model a particular target audience. For example, one idea included a chatbot connected to PAI to test messages against a representation of a target audience.”

Going further, “This COP would be enhanced by capabilities to identify and understand emerging themes in the media ecosystem. Influencing target audiences entails identifying specific themes and narratives as they emerge.

Operational forces need to understand what is being discussed and how to get ‘ahead of the target.’ This could include understanding how a troop deployment or joint exercise is being discussed in news interviews or on social media.”

As we can see, the use of generative AI in the military isn’t just about creating deepfakes and planting propaganda — a huge chunk is dedicated to surveillance, intel gathering, and data collection in order to understand everything before carrying out influence activities.

“As generative AI expands the scale and scope of influence activities, it could increase the damage caused by a poor decision by users or unanticipated consequences of thoroughly planned activities.

These might include influencing the wrong target audiences, working at cross-purposes with other national objectives, harming civilians, or undermining US credibility”

RAND, Acquiring Generative Artificial Intelligence to Improve US Department of Defense Influence Activities, July 2025

As the report frequently mentions, generative AI is a tool and not a panacea. Mistakes can be made.

When your whole reason for existing is to manipulate others into modifying their behavior, you had better make sure your targeting hits its mark.

And planning for influence activities is particularly challenging.

For example, the report says that “Planners must determine the desired effect on relevant actors (typically a behavioral outcome, not just an attitudinal outcome) and then assess outcomes, which often lag.

Generative AI can help planners understand the OE (whether emerging themes in online discourse or the physical movements of people) and aid in development of appropriate MOPs [measures of performance] and MOEs [measures of effectiveness], as well as in assessment of effects.”

The report lists countries like China and Russia as examples of target audiences, but the US government has a habit of turning its operations inward for domestic purposes.

“Generative AI offers information personnel the potential to analyze large volumes of data and to generate high quality content far more efficiently than with the tools they currently possess”

RAND, Acquiring Generative Artificial Intelligence to Improve US Department of Defense Influence Activities research brief, July 2025
Source: RAND

“As one PSYOP officer mentioned, effective influence activities are an art form. An effective MISO [Military Information Support Operations] product should be appealing, even ‘beautiful’ to its target audience”

RAND, Acquiring Generative Artificial Intelligence to Improve US Department of Defense Influence Activities, July 2025

In the end, influence activities are all about delivering the right message to the right audience in the most effective way possible.

The audience could be an individual, a group, a population, or even an automated system.

Generative AI can provide the Pentagon with a wide range of tools for executing influence campaigns. The RAND report lists several examples:

  • Create images and videos at the desired level of detail and fidelity:
    • Generative AI has already shown promise in creating text, graphics, and video; audio content generation is the “furthest behind” in terms of creating original content.
  • Create products in an austere environment:
    • A small unit must be internet-optional, able to use a stand-alone laptop or even sketch pads to design and produce basic messages. As soon as the unit is able to gain connectivity, it must be able to disseminate this content instantly.
  • Deliver tailored messages with precision:
    • Identifying the associates of terrorist fighters through human networks and then targeting messages specifically tailored to resonate with those individuals to get them to assist in deradicalization efforts.
  • Deliver more products faster:
    • Generative AI can “balance the battlefield” by allowing fewer US personnel to produce more content faster and compete at scale.
  • Clone voices:
    • Operational forces (particularly at the tactical level) require the ability to clone voices to convey messages. A team may want to mimic an enemy commander as part of an effort to get them to surrender.
  • Manage signatures:
    • AI might help to integrate multiple data streams and present information warfare officers and tactical commanders in the field with a clear picture of their tactical situation.
  • Translate human voices in real or near-real time:
    • Near-real-time translation of bridge-to-bridge communications would help maximize US influence with allies during a freedom of navigation operation.
  • Translate text quickly:
    • A Security Force Assistance Brigade might be trying to understand a piece of equipment for which the manual is in French, German, or Korean and then convey specific influence products to the partner forces. This might involve translating hundreds of pages of highly technical information into English and then into the host nation language.

All of these capabilities are painted as ways to help the US military conduct influence operations on foreign audiences for benevolent purposes.

However, who is to say that these generative AI capabilities won’t be leveraged for domestic purposes?

This is where public-private partnerships come into play.

The DoD is already deeply embedded in Hollywood and the entertainment industry, with over 2,500 war-themed movies and TV programs produced with Pentagon assistance, according to a Brown University report published earlier this year.

What better way for the Pentagon to conduct domestic influence activities than by partnering with the entertainment industry?

Even more alarming is that in 2022, “Twitter and Meta both identified networks of fake accounts believed to be connected to the US military.”

As The Verge reported at the time, “Though researchers were not able to conclusively attribute the origin of the associated influence campaigns, the accounts ‘consistently advanced narratives promoting the interests of the United States and its allies’ while linking to news sites that were backed by the US government and military.”

And the IWTSD’s stated mission is “to identify and develop capabilities for DoD to conduct Irregular Warfare against all adversaries, including Great Power competitors and non-state actors, and to deliver those capabilities to DoD components and interagency partners through rapid research and development, advanced studies and technical innovation, and provision of support to US military operations.” The directorate partners with both foreign and domestic performers from countries like Australia, Canada, Congo, Israel, the Netherlands, Singapore, Spain, and the UK.

They include academic institutions, private companies, national and international research labs, and government agencies.

With generative AI in the hands of the Pentagon, distinguishing what is real from what is a PSYOP will be more difficult than ever before.



Tim Hinchliffe

The Sociable editor Tim Hinchliffe covers tech and society, with perspectives on public and private policies proposed by governments, unelected globalists, think tanks, big tech companies, defense departments, and intelligence agencies. Previously, Tim was a reporter for the Ghanaian Chronicle in West Africa and an editor at Colombia Reports in South America. These days, he is only responsible for articles he writes and publishes in his own name. tim@sociable.co
