
You can find the current article at its original source at https://www.rt.com/usa/415609-us-army-ai-language-bot/


Pentagon bots in your comments? US Army wants AI tool for social networks
The US Army wants a new intelligence tool able to understand social media posts in languages including Russian, Arabic and French. It must also be able to answer on its own – just like those pesky “Kremlin bots” we hear about.
The description of what the US military wants from the future software is outlined in a request for the submission of white papers published on the Federal Business Opportunities website on Wednesday.
The self-improving AI tool is meant to work with text, voice, images and other content on social media in Arabic, French, Pashtu, Farsi, Urdu, Russian and Korean. It should understand colloquial phrasing, spelling variations, social media brevity codes and emojis, and also recognize various dialects.
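As a purely illustrative aside, handling brevity codes and emojis typically begins with a normalization pass; the short Python sketch below uses invented lookup tables to show the idea and is not drawn from the Army's actual request.

# Hypothetical lookup tables for expanding shorthand and emojis
# before any downstream language analysis.
BREVITY_CODES = {"brb": "be right back", "imo": "in my opinion", "smh": "shaking my head"}
EMOJI_WORDS = {"🙂": "happy", "😡": "angry", "😢": "sad"}

def normalize(post: str) -> str:
    """Expand shorthand and replace emojis with plain words, token by token."""
    tokens = []
    for token in post.split():
        token = EMOJI_WORDS.get(token, token)
        tokens.append(BREVITY_CODES.get(token.lower(), token))
    return " ".join(tokens)

print(normalize("imo this policy 😡"))  # -> "in my opinion this policy angry"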
The content will be automatically analyzed for sentiment – at a minimum distinguishing positive, neutral and negative emotions, and preferably identifying anger, pleasure, sadness and excitement. It should also have the “capability to suggest whether specific audiences could be influenced based on derived sentiment.”
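For illustration only: the minimum bar described here (positive, neutral or negative) can be shown with a toy, rule-based Python classifier. The word lists below are invented for the example; a real system would rely on trained multilingual models.

# Toy sentiment classifier: positive / neutral / negative only.
# The word lists are invented for illustration, not taken from the solicitation.
POSITIVE = {"great", "love", "win", "happy"}
NEGATIVE = {"bad", "hate", "angry", "sad"}

def sentiment(post: str) -> str:
    words = set(post.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("love this plan"))       # positive
print(sentiment("hate this plan"))       # negative
print(sentiment("the plan was posted"))  # neutral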
Additionally, the tool must be able to translate content into English and back into the original language, and automatically generate “at least three, and up to 10, unique statements derived from one original social media statement, while retaining the meaning and tone of the original.” The responses should be customized according to whatever slang and emojis the original contained. The software is also required to monitor and analyze the impact of the message on the target audience.
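Again for illustration only, that paraphrasing requirement could be sketched as a toy Python function that produces several rewordings by swapping in synonyms from an invented table, rather than the far more capable language models an actual vendor would propose.

from itertools import islice, product

# Invented synonym table; a stand-in for real paraphrase generation.
SYNONYMS = {
    "great": ["great", "excellent", "fantastic"],
    "news": ["news", "report", "update"],
}

def variants(statement: str, limit: int = 10) -> list[str]:
    """Return up to `limit` unique rewordings of one statement."""
    choices = [SYNONYMS.get(word.lower(), [word]) for word in statement.split()]
    combos = (" ".join(words) for words in product(*choices))
    return list(islice(combos, limit))

print(variants("great news today", limit=3))
# ['great news today', 'great report today', 'great update today']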
According to former MI5 officer Annie Machon, the Pentagon could be attempting to use the allegations of ‘Russian troll farms’ to justify and deflect attention away from the fact that US military intelligence has been engaged in exactly this kind of activity for years.
“The most obvious interpretation would be that this is a pushback against the allegations that have been made consistently for the last 18 months about so-called Russian troll farms influencing elections across the West, and it’s interesting to see the languages they are advertising for are the languages of Iran, and of course North Korea and Russia, so that would be a giveaway about which countries they want to be targeting,” Machon told RT.
“Having said that, the timing to me is interesting, because for sure the West has been running these so-called troll farms against other countries as well for a long time, so are they just trying to expand their operations by developing this new software? Or are they trying to disingenuously suggest to people that actually they haven’t done it before and only the Big Bad Russians, or the Big Bad Chinese, have run troll farms?”
The US military’s involvement in social media communications in other countries is hardly surprising. The Pentagon was among the pioneers of state ‘astroturfing’ campaigns – online propaganda and social media manipulation through ‘sockpuppets,’ which are fake online personas purporting to be real people advocating whatever views the US military wants them to.
“Such software is essential for any army, any commander that is involved in [such] an intelligence operation,” Pierluigi Paganini, Chief Technology Officer at CSE Cybsec, told RT. “For example, such kind of software is very important for social media activities and intelligence. That means it can also be used against terrorism.”
“Of course, in this specific case we can imagine also that such kind of software can be used for propaganda or to influence the sentiment of other countries. Basically, they can use this software exactly in the same way they say Russia is using it.”
This particular software request came from the United States Army Intelligence and Security Command (INSCOM), which is primarily tasked with collecting intelligence. The bot-like tool described seems more along the lines of psychological warfare, the domain of the United States Army Special Operations Command (USASOC). It is possible, however, that the spooks need to convince their sources to cooperate with a little help from AI-generated messages.