
Russian web brigades



Russian web brigades,[a] also called Russian trolls, Russian bots, Kremlinbots, or Kremlin trolls, are state-sponsored anonymous Internet political commentators and trolls linked to the Russian government.[1][2] Participants report that they are organized into teams and groups of commentators who participate in Russian and international political blogs and Internet forums, using sockpuppets, social bots, and large-scale orchestrated trolling and disinformation campaigns to promote pro-Putin and pro-Russian propaganda.[1][3][4][5][6]

Kremlin trolls are closely tied to the Internet Research Agency, a Saint Petersburg-based company that was run by Yevgeny Prigozhin until his death in 2023; Prigozhin was a close ally of Putin and head of the mercenary Wagner Group, which is known for committing war crimes.[7] Articles on the Russian Wikipedia concerning the MH17 crash and the Russo-Ukrainian War were targeted by Russian Internet propaganda outlets.[1][8][9][10] In June 2019, a group of 12 editors introducing coordinated pro-government and anti-opposition bias was blocked on the Russian-language Wikipedia.[11] During the Russian invasion of Ukraine in 2022, Kremlin trolls remained active on many social platforms, spreading disinformation about the war.[12]

Background


The earliest documented allegations of the existence of "web brigades" appear in the April 2003 Vestnik Online article "The Virtual Eye of Big Brother" by French journalist Anna Polyanskaya (a former assistant to assassinated Russian politician Galina Starovoitova[13]) and two other authors, Andrey Krivov and Ivan Lomako. The authors claim that up to 1998, contributions to forums on Russian Internet sites (Runet) predominantly reflected liberal and democratic values, but that after 2000 the vast majority of contributions reflected totalitarian values. They attributed this sudden change to the appearance of teams of pro-Russian commenters who appeared to be organized by the Russian state security service.[14][15][16][17] According to the authors, about 70% of Russian Internet posters held generally liberal views prior to 1998–1999, while a surge of "antidemocratic" posts (about 60–80%) suddenly occurred on many Russian forums in 2000. The shift could also reflect the fact that Internet access, until then available only to certain sections of society, spread rapidly among the general Russian population during this period.

In January 2012, a hacktivist group calling itself the Russian arm of Anonymous published a massive collection of emails allegedly belonging to former and present leaders of the pro-Putin youth organization Nashi (including a number of government officials).[18] Journalists who investigated the leaked information found that the pro-Putin movement had engaged in a range of activities, including paying commentators to post content and hijacking blog ratings in the fall of 2011.[19][20] The e-mails indicated that members of the "brigades" were paid 85 rubles (about US$3) or more per comment, depending on whether the comment received replies; some were paid as much as 600,000 rubles (about US$21,000) for leaving hundreds of comments on negative press articles, and were presented with iPads. A number of high-profile bloggers were also mentioned as being paid to promote Nashi and government activities. The Federal Youth Agency, whose head (and former leader of Nashi) Vasily Yakemenko was the highest-ranking individual targeted by the leaks, refused to comment on the authenticity of the e-mails.[18][21]

In 2013, a Freedom House report stated that 22 of the 60 countries examined were using paid pro-government commentators to manipulate online discussions, and that Russia had been at the forefront of this practice for several years, along with China and Bahrain.[22][23] In the same year, Russian reporters investigated the St. Petersburg-based Internet Research Agency, which employed at least 400 people. They found that the agency covertly hired young people as "Internet operators" paid to write pro-Russian postings and comments, smearing opposition leader Alexei Navalny and disparaging U.S. politics and culture.[24][25]

Each commenter was to write no less than 100 comments a day, while people in the other room were to write four postings a day, which then went to the other employees whose job was to post them on social networks as widely as possible.[24]

Some Russian opposition journalists state that such practices create a chilling effect on the few independent media outlets remaining in the country.[23]

Further investigations were carried out by the Russian opposition newspaper Novaya Gazeta and the Institute of Modern Russia in 2014–15, prompted by the peak of activity of the pro-Russian brigades during the Russo-Ukrainian War and after the assassination of Boris Nemtsov.[26][27][28][29] The use of "troll armies" to promote Putin's policies is reported to be a multimillion-dollar operation.[30] According to an investigation by the British newspaper The Guardian, the flood of pro-Russian comments is part of a coordinated "informational-psychological war operation".[31] One Twitter bot network was documented to use more than 20,500 fake Twitter accounts to spam negative comments after the death of Boris Nemtsov and during events related to the Ukrainian conflict.[32][33]

An article based on the original Polyanskaya article, authored by the Independent Customers' Association, was published in May 2008 on Expertiza.Ru. In this article, the term web brigades is replaced by the term Team "G".[34][35]

During his presidency, Donald Trump retweeted a tweet from a fake account operated by Russians. In 2017, he was among almost 40 celebrities and politicians, along with over 3,000 global news outlets, identified as having inadvertently shared content from Russian troll-farm accounts.[36]

Methods


Web brigade commentators sometimes leave hundreds of postings a day that criticize the country's opposition and promote Kremlin-backed policymakers.[20][23][24][25][37][38] Commentators simultaneously react to discussions of "taboo" topics, including the historical role of Soviet leader Joseph Stalin, political opposition, dissidents such as Mikhail Khodorkovsky, murdered journalists, and cases of international conflict or rivalry (with countries such as Estonia, Georgia, and Ukraine, as well as with the foreign policies of the United States and the European Union).[20] Prominent journalist and Russia expert Peter Pomerantsev believes Russia's efforts are aimed at confusing the audience rather than convincing it. He states that they cannot censor information but can "trash it with conspiracy theories and rumours".[25]

To avert suspicion, the users sandwich political remarks between neutral articles on travelling, cooking, and pets.[25] They overwhelm the comment sections of media outlets to render meaningful dialogue impossible.[39][40]

The effect created by such Internet trolls is not very big, but they manage to make certain forums meaningless because people stop commenting on the articles when these trolls sit there and constantly create an aggressive, hostile atmosphere toward those whom they don’t like. The trolls react to certain news with torrents of mud and abuse. This makes it meaningless for a reasonable person to comment on anything there.[24]

A collection of leaked documents, published by Moy Rayon, suggests that work at the "troll den" is strictly regulated by a set of guidelines. Any blog post written by an agency employee, according to the leaked files, must contain "no fewer than 700 characters" during day shifts and "no fewer than 1,000 characters" on night shifts. Use of graphics and keywords in the post's body and headline is also mandatory. In addition to general guidelines, bloggers are provided with "technical tasks" – keywords and talking points on specific issues, such as Ukraine, Russia's internal opposition, and relations with the West.[25] On an average working day, the workers are to post comments on news articles 50 times. Each blogger is to maintain six Facebook accounts, publishing at least three posts a day and discussing the news in groups at least twice a day. By the end of the first month, they are expected to have won 500 subscribers and to get at least five posts on each item a day. On Twitter, the bloggers are expected to manage 10 accounts with up to 2,000 followers and tweet 50 times a day.[38] There are suggestions that Russian troll networks may also be operating on Reddit, with certain boards inundated with posts claiming that Ukraine has been losing since Russia launched its full-scale invasion.[41]

Timeline


In 2015, Lawrence Alexander disclosed a network of propaganda websites sharing the same Google Analytics identifier and domain registration details, allegedly run by Nikita Podgorny of the Internet Research Agency. The websites were mostly meme repositories focused on attacking Ukraine, Euromaidan, the Russian opposition, and Western policies. Other websites from this cluster promoted President Putin and Russian nationalism, and spread alleged news from Syria presenting anti-Western and pro-Assad viewpoints.[42][43]
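
The open-source technique behind this finding – grouping websites that embed the same web-analytics tracking identifier – can be sketched in a few lines of code. The following Python snippet is only an illustrative sketch of the general approach, not the tooling used in the cited investigation; the regular expression targets classic "UA-…" Google Analytics property IDs, and the domain list is a hypothetical placeholder.

```python
# Illustrative sketch: cluster domains by shared Google Analytics IDs.
# Hypothetical example; not the tooling used in the cited investigation.
import re
from collections import defaultdict
from urllib.request import urlopen

# Matches classic "UA-12345678-1"-style Google Analytics property IDs.
GA_ID_PATTERN = re.compile(r"UA-\d{4,10}-\d{1,4}")

def extract_ga_ids(html: str) -> set:
    """Return all Google Analytics property IDs found in a page's HTML."""
    return set(GA_ID_PATTERN.findall(html))

def cluster_by_ga_id(domains):
    """Group domains that embed the same analytics ID, a possible hint of common ownership."""
    clusters = defaultdict(set)
    for domain in domains:
        try:
            html = urlopen("http://" + domain, timeout=10).read().decode("utf-8", "ignore")
        except OSError:
            continue  # unreachable site: skip it
        for ga_id in extract_ga_ids(html):
            clusters[ga_id].add(domain)
    # Only IDs shared by more than one domain suggest a linked network.
    return {ga_id: sites for ga_id, sites in clusters.items() if len(sites) > 1}

if __name__ == "__main__":
    # Placeholder domain names for illustration only.
    print(cluster_by_ga_id(["example.org", "example.net", "example.com"]))
```

Domain-registration (WHOIS) details can be cross-checked in a similar way to corroborate a suspected cluster.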

In August 2015, Russian researchers correlated Google search statistics for specific phrases with their geographic origin, observing that searches for politically loaded phrases (such as "Poroshenko", "Maidan", and "sanctions") began increasing in 2013 and originated from very small, peripheral locations in Russia such as Olgino, which is also the location of the Internet Research Agency's headquarters.[44] The Internet Research Agency also appears to be the primary sponsor of the anti-Western exhibition Material Evidence.[45]
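
As a rough illustration of this kind of geographic anomaly check, the sketch below flags regions whose share of searches for a phrase greatly exceeds their share of overall search volume. The data, threshold, and function names are hypothetical; the cited researchers' methodology was not published as code.

```python
# Illustrative sketch: flag regions over-represented in searches for a phrase.
# Hypothetical data and threshold; not the cited researchers' actual method.

def anomalous_regions(phrase_counts, total_counts, ratio_threshold=10.0):
    """Return (region, over-representation ratio) pairs sorted by ratio, descending."""
    phrase_total = sum(phrase_counts.values())
    overall_total = sum(total_counts.values())
    flagged = []
    for region, count in phrase_counts.items():
        phrase_share = count / phrase_total            # region's share of searches for the phrase
        overall_share = total_counts.get(region, 0) / overall_total  # region's share of all searches
        if overall_share > 0 and phrase_share / overall_share >= ratio_threshold:
            flagged.append((region, phrase_share / overall_share))
    return sorted(flagged, key=lambda pair: pair[1], reverse=True)

if __name__ == "__main__":
    # Toy numbers for illustration only.
    searches_for_phrase = {"Moscow": 5000, "Saint Petersburg": 3000, "Olgino": 2500}
    all_searches = {"Moscow": 4_000_000, "Saint Petersburg": 1_500_000, "Olgino": 9_000}
    print(anomalous_regions(searches_for_phrase, all_searches))
```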

Since 2015, Finnish reporter Jessikka Aro has investigated web brigades and Russian trolls.[46] Other Western journalists have also reported on the phenomenon, and press organizations have supported colleagues targeted by Russian smear campaigns.[47][48]

In May 2019, it was reported that a study from George Washington University had found that Russian Twitter bots tried to inflame the United States' anti-vaccination debate in 2018 by posting opinions on both sides.[49]

In June 2019, a group of 12 editors introducing coordinated pro-government and anti-opposition bias was blocked on the Russian-language Wikipedia.[11] In July 2019, two operatives of the Internet Research Agency were detained in Libya and charged with attempting to influence local elections.[50] They were reportedly employees of Alexander Malkevich, manager of the propaganda website USA Really.[51]

In 2020, the research firm Graphika published a report detailing one particular Russian disinformation group, codenamed "Secondary Infektion" (alluding to the 1980s Operation Infektion), that had been operating since 2014. Over six years the group published more than 2,500 items in seven languages across more than 300 platforms, including social media (Facebook, Twitter, YouTube, Reddit) and discussion forums. The group specialized in highly divisive topics such as immigration, the environment, politics, and international relations, and frequently used fake images presented as "leaked documents".[52]

Starting in February 2022, a particular effort was made to support the Russian war in Ukraine online, with Facebook and YouTube being especially targeted.[citation needed]

Russian invasion of Ukraine


In May 2022, during the Russian invasion of Ukraine, trolls allegedly hired by the Internet Research Agency (IRA) were reported to have extended their foothold onto TikTok, spreading misinformation about war events and attempting to sow doubt about the war in Ukraine.[12] Authentic-looking profiles allegedly had hundreds of thousands of followers.[12] The IRA was reported to be active across other platforms as well, including Instagram and Telegram.[12]

See also


Opinion-influencing operations in other countries


Other countries and businesses have also used paid Internet commenters to influence public opinion abroad; some examples are below.


Notes

  1. ^ Russian: Русские веб-бригады

References

  1. ^ a b c Stukal, Denis; Sanovich, Sergey; Bonneau, Richard; Tucker, Joshua A. (February 2022). "Why Botter: How Pro-Government Bots Fight Opposition in Russia" (PDF). American Political Science Review. 116 (1). Cambridge and New York: Cambridge University Press on behalf of the American Political Science Association: 843–857. doi:10.1017/S0003055421001507. ISSN 1537-8633. LCCN 08009025. OCLC 805068983. S2CID 247038589. Archived (PDF) from the original on 1 April 2022. Retrieved 10 March 2022.
  2. ^ Sultan, Oz (Spring 2019). "Tackling Disinformation, Online Terrorism, and Cyber Risks into the 2020s". The Cyber Defense Review. 4 (1). West Point, New York: Army Cyber Institute: 43–60. ISSN 2474-2120. JSTOR 26623066.
  3. ^ Shaun Walker (2 April 2015). "Salutin' Putin: inside a Russian troll house". the Guardian. Archived from the original on 2 April 2015. Retrieved 6 September 2024.
  4. ^ Paul Gallagher (27 March 2015). "Revealed: Putin's army of pro-Kremlin bloggers". The Independent. Archived from the original on 26 November 2015. Retrieved 6 September 2024.
  5. ^ Daisy Sindelar (12 August 2014). "The Kremlin's Troll Army". The Atlantic. Archived from the original on 12 August 2014. Retrieved 6 September 2024.
  6. ^ Olga Khazan (9 October 2013). "Russia's Online-Comment Propaganda Army". The Atlantic. Archived from the original on 9 October 2013. Retrieved 6 September 2024.
  7. ^ "Wagner boss Prigozhin confirmed dead in plane crash - Moscow". 27 August 2023. Archived from the original on 27 August 2023. Retrieved 13 January 2024.
  8. ^ Sorokanich, Robert (18 July 2014). "A Tweetbot Caught the Russian Gov't Editing Flight MH17 Wikipedia Info". Archived from the original on 15 November 2016. Retrieved 3 December 2016.
  9. ^ Dewey, Caitlin (21 July 2014). "Flight MH17's Wikipedia page edited by Russian government; An IP address associated with Vladimir Putin's office has made multiple edits to the Wikipedia page for the MH17 flight page". Toronto Star. The Washington Post. Archived from the original on 12 June 2018. Retrieved 10 August 2016.
  10. ^ Zeveleva, Olga (6 August 2014). "Knowledge is power: why is the Russian government editing Wikipedia?". The Calvert Journal. Retrieved 3 December 2016.
  11. ^ a b Kovalev, Alexey (5 July 2019). "Revenge of the editors". Meduza. Translated by Hilah Kohen. Archived from the original on 11 January 2020. Retrieved 8 July 2019.
  12. ^ a b c d "Russia's trolling on Ukraine gets 'incredible traction' on TikTok". the Guardian. 1 May 2022. Archived from the original on 13 May 2022. Retrieved 9 July 2022.
  13. ^ (in Russian) "They are killing Galina Starovoitova for the second time" Archived 2018-08-17 at the Wayback Machine, by Anna Polyanskaya
  14. ^ (in Russian) Virtual Eye of the Big Brother Archived 2019-12-19 at the Wayback Machine by Anna Polyanskaya, Andrei Krivov, and Ivan Lomko, Vestnik online, April 30, 2003
  15. ^ "Russian-American Russian Language biweekly magazine "Vestnik": Main Page [English]". Vestnik.com. Archived from the original on 26 September 2014. Retrieved 19 May 2014.
  16. ^ Vestnik online, April 30, 2003
  17. ^ (in Russian) Eye for an eye Archived 13 January 2013 at archive.today by Grigory Svirsky and Vladimir Bagryansky, publication of the Russian Center for Extreme Journalism [1] Archived 16 April 2013 at archive.today
  18. ^ a b Miriam Elder (7 February 2012). "Polishing Putin: hacked emails suggest dirty tricks by Russian youth group". the Guardian.
  19. ^ (in Russian) "Kremlin's Blogshop" Archived 3 March 2016 at the Wayback Machine by Anastasia Karimova. Kommersant Dengi, February 13, 2012
  20. ^ a b c "Russia - Country report - Freedom on the Net - 2013". Archived from the original on 5 February 2017. Retrieved 3 December 2016.
  21. ^ (in Russian) "Kommersant Director General Files Complaint against Nashi Spokesperson". Izvestia, February 9, 2012.
  22. ^ "A global assessment of internet and digital media" (PDF). Archived (PDF) from the original on 29 November 2023. Retrieved 21 June 2024.
  23. ^ a b c Russia's Online-Comment Propaganda Army Archived 9 October 2013 at the Wayback Machine, The Atlantic, by Olga Khazan, 9 October 2013
  24. ^ a b c d "Internet Troll Operation Uncovered in St. Petersburg" Archived 6 October 2013 at the Wayback Machine, The St. Petersburg Times, by Sergey Chernov, 18 September 2013
  25. ^ a b c d e Ukraine conflict: Inside Russia's 'Kremlin troll army' Archived 19 August 2024 at the Wayback Machine, BBC
  26. ^ "The Menace of Unreality: How the Kremlin Weaponizes Information, Culture and Money". The Interpreter Magazine. 22 November 2014. Archived from the original on 11 December 2016. Retrieved 13 March 2015.
  27. ^ "Documents Show How Russia's Troll Army Hit America". BuzzFeed. 8 July 2014. Archived from the original on 10 October 2017. Retrieved 13 March 2015.
  28. ^ "Novaya Gazeta Publishes List of Kremlin Trolls, Finds Further Information About 'Troll Farm'". The Interpreter Magazine. 6 March 2015. Archived from the original on 4 November 2019. Retrieved 13 March 2015.
  29. ^ Dmitry Volchek, Daisy Sindelar (26 March 2015). "One Professional Russian Troll Tells All". Radio Liberty. Archived from the original on 23 September 2016. Retrieved 26 March 2015.
  30. ^ Sindelar, Daisy (12 August 2014). "The Kremlin's Troll Army". The Atlantic. United States: Atlantic Media. Archived from the original on 12 August 2014. Retrieved 6 June 2015.
    Seddon, Max (2 June 2014). "Documents Show How Russia's Troll Army Hit America". BuzzFeed. Archived from the original on 10 October 2017. Retrieved 5 June 2015.
  31. ^ Pomerantsev, Peter (9 April 2015). "Inside the Kremlin's hall of mirrors". The Guardian. Archived from the original on 2 November 2019. Retrieved 11 April 2015.
  32. ^ Lawrence Alexander (2 April 2015). "Social Network Analysis Reveals Full Scale of Kremlin's Twitter Bot Campaign". Global Voices Online. Archived from the original on 25 August 2015. Retrieved 13 April 2015.
  33. ^ "#KremlinTrolls and Other Acquaintances of RU EMB Canada". kremlintrolls.com. Archived from the original on 17 September 2015. Retrieved 12 September 2015.
  34. ^ Team "G" (How to unveil agents of siloviks at popular forums in the Internet) Archived 29 May 2008 at the Wayback Machine, May 25, 2008
  35. ^ "The Kremlin's virtual squad". openDemocracy. Archived from the original on 7 August 2018. Retrieved 6 September 2024.
  36. ^ "Russian Trolls Duped Global Media Thousands of Times". NBC News. 4 November 2017. Archived from the original on 4 June 2024. Retrieved 6 September 2024.
  37. ^ "Russia - Country report - Freedom on the Net - 2014". Archived from the original on 24 April 2019. Retrieved 3 December 2016.
  38. ^ a b Documents Show How Russia’s Troll Army Hit America Archived 10 October 2017 at the Wayback Machine, BuzzFeed
  39. ^ The readers' editor on… pro-Russia trolling below the line on Ukraine stories Archived 27 November 2016 at the Wayback Machine, the Guardian, 4 May 2014
  40. ^ Putin's G20 Snub Archived 25 September 2015 at the Wayback Machine, The Moscow Times, 18 November 2014
  41. ^ "Pro-Putin Disinformation Warriors Take War of Aggression to Reddit". 12 December 2023.
  42. ^ Entous, Adam; Nakashima, Ellen; Jaffe, Greg (25 December 2017). "Kremlin trolls burned across the Internet as Washington debated options". The Washington Post. Archived from the original on 30 December 2017. Retrieved 12 January 2018.
  43. ^ "Open-Source Information Reveals Pro-Kremlin Web Campaign". Global Voices. 13 July 2015. Archived from the original on 17 July 2015. Retrieved 19 July 2015.
  44. ^ "Google выдал логово кремлевских троллей". 19 August 2015. Archived from the original on 29 January 2019. Retrieved 20 August 2015.
  45. ^ "Emails Link Kremlin Troll Farm to Bizarre New York Photography Exhibit". StopFake.org. 20 August 2015. Archived from the original on 17 April 2024. Retrieved 13 September 2015.
  46. ^ "Finnish journalist Jessikka Aro's inquiry into Russian trolls stirs up a hornet's nest". The Sydney Morning Herald. Archived from the original on 16 May 2024. Retrieved 6 September 2024.
  47. ^ "Article December 2017". Archived from the original on 23 May 2023. Retrieved 6 September 2024.
  48. ^ "Press association supports EL PAÍS official targeted by Russian smear campaign". EL PAÍS. Archived from the original on 6 December 2019. Retrieved 6 September 2024.
  49. ^ O'Kane, Caitlin (31 May 2019). "Russian trolls fueled anti-vaccination debate in U.S. by spreading misinformation on Twitter, study finds". CBS News. Archived from the original on 23 July 2019. Retrieved 1 June 2019.
  50. ^ "Libya Uncovers Alleged Russian Plot to Meddle in African Votes". Bloomberg.com. 5 July 2019. Archived from the original on 19 February 2022. Retrieved 6 September 2024.
  51. ^ Mackinnon, Amy (10 July 2019). "The Evolution of a Russian Troll". Foreign Policy. Archived from the original on 11 July 2019. Retrieved 14 July 2019.
  52. ^ "Executive Summary". secondaryinfektion.org. Archived from the original on 23 February 2024. Retrieved 17 June 2020.

Literature

  • Jolanta Darczewska: The Anatomy of Russian Information Warfare: The Crimean Operation, a Case Study. Centre for Eastern Studies, Warsaw 2014, ISBN 978-83-62936-45-8 (PDF)
  • Peter Pomerantsev & Michael Weiss: The Menace of Unreality: How the Kremlin Weaponizes Information, Culture and Money. The Institute of Modern Russia, New York 2014 (PDF)