Category: Disinformation

Leading governmental bodies and support organizations worldwide have become aware of the significant current threats posed by disinformation. Disinformation is deployed by state actors and, more insidiously, enabled by a growing industry of Disinformation as a Service (DaaS) vendors. The impacts extend well beyond the political realm: corporations and their employees, for example, are targeted with messaging discouraging COVID-19 vaccination, a threat to business continuity and public health.

Malign actors use false, misleading, and manipulated information to trigger mass changes in the public’s narratives, beliefs, emotions, and actions, to weaken societal infrastructure, and to undermine the populace’s confidence in its government. Disinformation campaigns exploit enterprise platforms, data, and communities, but these tech platforms, coupled with organizations that confer credibility on domains, can be part of the defense against disinformation. These organizations can successfully monitor for and respond to relevant disinformation using the framework provided in a new white paper from FiveBy Solutions.

The white paper, entitled “Enterprise Appropriate Responses to Disinformation Risks,” describes an open-source framework of taxonomy, classification, labeling, and data interchange developed by data science industry experts as building blocks for a detection and remediation program that integrates into enterprise client platforms. The paper is intended for a business audience, including security, finance, and marketing executives planning cognitive security responses and resilience measures.

The paper is structured around the tasks needed to create a sustainable disinformation risk management program.


Download the Whitepaper Here

FiveBy is a specialized risk intelligence services firm. We give you the insight you need to move faster and further, with the confidence to transform your risks into opportunity: the opportunity to grow your profits, strengthen your brand, and exceed your customers’ expectations.

Our unique point of view brings together expertise spanning security, technology, data science, and business operations to connect your dots. By turning data into an enabler, FiveBy designs adaptable responses, whether responding to an ongoing incident or implementing preventive measures, tailored to your business needs and always with a human touch.

Click here for PDF

Executive Summary

A flood of proposed legislation to increase regulation over online content and recent contentious hearings on the subject indicate an appetite on Capitol Hill to hold tech companies accountable for disinformation and other harmful content hosted on their platforms. To mitigate possible liabilities, tech companies would be wise to increase content monitoring and analysis by linguistic, regional, cultural, and disinformation experts. Rising volumes of disinformation originating in Russia, Iran, and China underscore the necessity of specialized analytic expertise to supplement existing measures.

See Something, Say Something


On January 22, Senator Joe Manchin (D-WV) proposed the See Something, Say Something Online Act of 2021, which would require tech companies to report suspicious content to support “criminal investigations and counterintelligence activities relating to international terrorism.” The proposal would require tech platforms to monitor and report suspicious content through Suspicious Transmission Activity Reports (STARs)—similar to the Suspicious Activity Reports (SARs) financial institutions must file in accordance with the Bank Secrecy Act.

  • The legislation would amend Section 230 of the Communications Decency Act to strip tech companies of their protections from being held legally liable for user activity if they do not file STARs to report suspicious social media content.
  • The bill mandates the creation of an online system to file STARs administered by a Justice Department agency that would also be established under the legislation. Tech companies would have 30 days to submit a STAR and would need to keep the report on file for five years.
  • The components of the STAR would identify the user’s name and location and the date and nature of the user’s post or other content, as well as the time, origin, and destination of the transmission and any relevant information or metadata related to it.
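Purely as an illustration, the fields the bill enumerates could be modeled as a simple record. The class and field names below are our own assumptions for the sketch; the legislation does not define a schema.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Illustrative sketch only: these field names are assumptions based on the
# bill's description of a STAR, not a schema defined in the legislation.
@dataclass
class SuspiciousTransmissionActivityReport:
    user_name: str                # identity of the posting user
    user_location: str            # user's location, as known to the platform
    content_date: datetime        # date of the post or other content
    content_nature: str           # description of the suspicious content
    transmission_time: datetime   # time of the transmission
    origin: str                   # origin of the transmission
    destination: str              # destination of the transmission
    metadata: dict = field(default_factory=dict)  # any relevant metadata

# Per the bill: platforms would have 30 days to file a STAR and would
# need to keep the report on file for five years.
FILING_WINDOW_DAYS = 30
RETENTION_YEARS = 5
```

The sketch simply makes the bill’s reporting requirements concrete; an actual filing system would be defined by the Justice Department agency the legislation would establish.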

More Possible Regulation on the Horizon


Skopos Labs projects that Manchin’s bill has only a 4 percent chance of becoming law, but Congress and state legislatures clearly have an appetite for more regulation to counter social media disinformation and to amend Section 230 to hold tech platforms at least partially accountable for what users post on their websites. The January 6 riot in Washington, DC, amplified legislators’ inclination to hold tech companies responsible for posts published on their sites. Platforms will likely seek to mitigate their liability by increasing monitoring of user content or by implementing controls that restrain engagement, reducing users’ ability or willingness to repost disinformation they see on the platforms.

  • Republicans have expressed discontent with social media companies targeting conservative content as disinformation. Section 230 does not obligate social media companies to conduct politically neutral content moderation. Legislation such as the Platform Accountability and Consumer Transparency (PACT) Act, reintroduced by John Thune (R-SD) and Brian Schatz (D-HI) in March, would require tech companies to justify removing content and demonstrate that their moderation is neutral through biannual transparency reports. Analysts need disinformation expertise to moderate content and create these reports.
  • Republicans in the Iowa Senate have advanced a proposal to bar tax breaks for, and contracts with, companies that censor free speech, and several other states are considering measures that would allow social media companies to be sued for censorship.

Regional and Language Expertise Needed


Although some disinformation originates in the United States, most of the content comes from Russia, China, and Iran. Regional, cultural, and linguistic expertise would almost certainly allow tech platforms to recognize the origin of disinformation through language and regional analysis, helping stop disinformation campaigns before they spread across US audiences and inflict reputational damage on tech company brands.

  • Russia’s disinformation strategy involves more than just social media accounts. An August 2020 State Department report found that Russia also spreads disinformation via state-funded global messaging, official government communications, proxy sources such as news websites that republish disinformation, ghostwriters who cite proxy sources to build credibility, and cyber-enabled disinformation, which includes hack-and-release cyberattacks, cloned websites, and fake satellite imagery. Social media helps spread and amplify disinformation from these channels.
  • China and Russia are both targeting Latin America with COVID disinformation campaigns, working to undermine confidence in the safety of the Pfizer vaccine and in the ability of the United States to manage the pandemic, according to SOUTHCOM Commander Craig Faller’s Senate testimony in March. China is combining its disinformation campaigns with “mask and vaccine” diplomacy by distributing millions of Sinovac vaccines and masks in Latin America to bolster Beijing’s political influence there. The Kremlin is targeting Latin America through coronavirus conspiracy theories and promotion of its Sputnik V vaccine on social media. Russia’s goal is to build closer ties with Latin American countries, which could “help Russian military and security actors gain access to the United States’ backyard,” according to the Carnegie Endowment for International Peace.
  • In November, the US Justice Department seized 92 domains linked to Iran’s Revolutionary Guard Corps (IRGC) that were spreading disinformation about US foreign policy in Iran and the Middle East. At least one of the websites had a social media presence on Twitter, Facebook, Instagram, and YouTube and claimed to operate in the United States while listing an Iranian phone number. Similarly, in May 2020, Facebook dismantled eight Iran-linked networks involving more than 500 accounts that coordinated pro-Iran messaging campaigns targeted at Western voters to support Iran’s geopolitical interests. Iran’s PressTV network has also spread coronavirus conspiracy theories, along with criticism of the US maximum pressure campaign.
  • Analysts can apply their regional and language expertise to uncover linguistic patterns, disinformation networks, and typologies. An analyst with expertise in historical Russian disinformation techniques and strategies can expose Russian methodologies, such as the use of sleeper accounts that gain long-term audience trust and then spread false news. Linguistic, cultural, and regional expertise can also help identify disinformation methods such as those Russia employed before the 2016 US presidential election or during the 2016 “Lisa case,” in which Russian media promoted a false narrative about a missing young Russian-German girl being raped by Arab immigrants to accuse Germany of tolerating child abuse, provoking protests by Russian Germans.



Although opposing political forces have different visions for social media reform, both are pushing for more regulation of online disinformation. Whether through STARs or other regulatory requirements, legislators have shown a significant appetite to track and mitigate disinformation that aims to spread online hate, manipulate information, and undermine US society’s confidence in its government structures. The recent Global Trends report from the Office of the Director of National Intelligence highlights the growing power of social media, which during the next 20 years will produce “content that could overtake expertise in shaping the political and social effects engendered by a hyperconnected information environment.” Power will increasingly be wielded by the generators of content as well as the arbiters of who gets to see it, which legislators will almost certainly cite as justification for further regulation of online platforms. Analyzing data from social network communities for insight into the origination, funding, and proliferation of disinformation will require a deep understanding of language and financial transfer methodologies, as well as cultural and regional expertise.




Disinformation about the COVID-19 vaccine—and the resultant refusal of many to get vaccinated—presents a significant threat to the US economy, especially to companies operating in service industries and critical infrastructure sectors. Employees of many of these companies are more vulnerable to the virus because of extensive interactions with the general public and prolonged proximity to one another in the performance of their jobs, making countering false vaccination narratives even more critical for these firms. Foreign influencers seek to disrupt the US economy and are actively posting disinformation on social media, injecting fear of vaccines into the workforce’s feeds. Although some will not take the vaccine for moral or health reasons, the refusal of too many workers to vaccinate could slow the economy’s reopening. Proactively engaging disinformation experts to counter anti-vaccination messaging can help limit its negative impact.

Vaccination Key to Economic Recovery

In the time it has taken to develop a vaccine, several service and infrastructure sectors have experienced devastating impacts. As of early February, at least 28,700 grocery workers had been infected with or exposed to the virus, and at least 134 had died from the disease. A study released in October found that nearly a quarter of New York City transit workers reported having contracted COVID-19. Another study on the impacts of the virus on the manufacturing industry indicates that the 10 most affected states could see a decline in manufacturing revenue of $400 billion. Women have disproportionately dropped out of the workforce, and gross domestic product declined in every state except Utah in the first three quarters of 2020.

Employers are key stakeholders in supporting vaccine adoption and are well positioned to facilitate vaccine distribution within their communities and among their employees, potentially accelerating herd immunity and economic recovery. However, disinformation campaigns aimed at the COVID-19 vaccine may diminish the effectiveness of these efforts. Disinformation is a significant factor in vaccine hesitancy in the United States, where, according to a recent survey, only 51 percent of unvaccinated adults have said they will definitely get vaccinated against COVID-19.

  • Even before the onset of the COVID pandemic, the World Health Organization (WHO) had identified vaccine hesitancy—the reluctance or refusal to vaccinate despite the availability of vaccines—as one of the top 10 threats to global health. Various experts estimate the threshold for herd immunity, historically achieved through vaccination, to be anywhere between 60 and 90 percent for the coronavirus.
  • Several companies have already started engaging in efforts to help their communities reach the required level of vaccination. Amazon recently announced its intention to begin administering vaccines to nearly 20,000 of its warehouse and grocery workers in Washington state. Some businesses are considering making the vaccine mandatory for returning to the office, while others, such as Google, are strongly encouraging it. Big-name companies such as Walmart, Starbucks, and Microsoft are partnering with local governments and medical providers to distribute vaccines within their communities.

Disinformation Foments Distrust

The historic speed of the COVID-19 vaccine development process and mistrust of the medical community are among the factors contributing to vaccine hesitancy in the United States. Disinformation campaigns exploit these uncertainties and seek to amplify doubt, encompassing a wide range of topics from safety concerns to distrust of government.

  • Concerns about personal health and safety during a pandemic are natural, and much of the disinformation around the vaccine seeks to exacerbate these fears. Various conspiracy theories about the vaccine causing infertility (debunked here and here) or claims that the vaccine alters human DNA (debunked here) have propagated on the Internet.
  • One of the top myths about the vaccines is that high level officials, such as infectious disease expert Dr. Anthony Fauci (debunked here), and “corrupt” billionaires, such as Bill Gates (debunked here) are profiting from their hasty release. One conspiracy theory claims that the Gates Foundation could make nearly $43.5 billion from a COVID vaccine in the UK.
  • Disinformation campaigns also target the religious and moral convictions of significant sections of the US population. Several myths spreading across the Internet claim that the vaccine contains aborted human fetal tissue (debunked here and here).
  • Some disinformation campaigns rely on shock value, such as the assertion likely spread by Russia that one of the vaccine variants will turn people into monkeys because it relies on a modified chimpanzee adenovirus.

Disinformation Tool for Foreign Adversaries

Foreign adversaries have played a significant role in spreading COVID-19 vaccine disinformation in the United States, exploiting government and vaccine distrust among the US population. Russia and China have been the two major sources of foreign disinformation campaigns while working to promote their domestically developed COVID-19 vaccines.

  • Even pre-pandemic, Russia was a significant source of vaccine-related disinformation. An international time trends analysis, which examined attitudes toward vaccines on social media from 2018-2019, determined that a preponderance of social media campaigns promoting anti-vaccine content “originate from within Russia or via pseudo-state actors informally associated with Russia.”
  • The Russian COVID-19 disinformation strategy aims to undermine trust in Western vaccines while promoting the commercial success of the Russian vaccine. One such campaign, aimed at discrediting the AstraZeneca COVID-19 vaccine, targeted Western nations as well as countries where Russia hopes to sell its own Sputnik V vaccine. Another false report, which quickly spread from a small Kremlin-friendly website to US-based Facebook groups, claimed five Ukrainians had died after getting the US vaccine. Although driven in part by the desire to promote Russia’s Sputnik V vaccine, these disinformation campaigns are also consistent with Russia’s long-term strategy of sowing chaos, distrust, and division in Western societies.
  • China’s disinformation campaigns promote its own vaccines while seeking to undermine trust in the vaccine developed by US pharmaceutical firm Pfizer and Germany’s BioNTech. Defenders of the Chinese Communist Party known as the “Wolf Warriors” have led efforts to cast doubt on the safety of US vaccines, spreading disinformation across social media. A propaganda network that owns thousands of fake YouTube, Twitter, and Facebook accounts has also carried out campaigns combining criticism of the US COVID response with vaccine disinformation—efforts that have recently been amplified by Chinese diplomats, as well as influencers in Latin America, Pakistan, and Hong Kong.
  • The Chinese propaganda network has deployed tens of thousands of bot accounts since 2018, although each account had relatively few followers and the content was often removed by tech platforms. However, the network has improved the ability of its bots to mimic humans, including by posting photos and videos and interacting with real humans in comments, making its disinformation efforts more sophisticated and harder to track.

Conclusion and Remediation

Companies can take several measures to encourage and facilitate COVID-19 vaccinations among their employees. Providing on-site vaccination and offering additional paid time off to get vaccinated during the workday are just two strategies that may encourage personnel who are hesitant to take time off to get inoculated. Companies such as Kroger have recently announced monetary incentives for employees who get vaccinated. Once logistical and financial barriers to vaccination have been addressed, employers can mitigate the risks of foreign disinformation campaigns by building disinformation literacy using trusted resources, especially from the educational and nonprofit sectors, that can be adapted to meet their needs. FiveBy can help companies develop processes to address disinformation promptly and decisively, and in a manner that informs longer-term prevention.




In a press release on July 24, 2020, National Counterintelligence and Security Center (NCSC) Director William Evanina highlighted foreign threats to the 2020 US election, specifically focusing on China, Iran, and Russia’s efforts to undermine US citizens’ confidence in their government institutions. Influence measures on social and traditional media to sway US voters’ preferences during a particularly contentious election year have been pervasive and diverse. However, we assess that foreign actors’ efforts this election season are just the tip of the iceberg.
