China’s Changing Disinformation and Propaganda Targeting Taiwan

Since early August, evidence has emerged of a new type of CCP influence operation that diverges sharply from traditional messaging and tactics.

The fallout from U.S. Speaker of the House Nancy Pelosi’s visit to Taiwan in early August has raised several questions over the state of cross-strait relations. Following the People’s Liberation Army’s (PLA) announcements of military action in the Taiwan Strait, as well as fiery rhetoric from Chinese officials, many wondered whether there would be another Taiwan Strait missile crisis, or indeed whether a Chinese invasion of Taiwan might occur in the near future.

However, amid the controversy stirred by Pelosi’s trip to the island, few have picked up on a more subtle development in China’s strategy toward Taiwan: a shift in its disinformation and propaganda approach. Taiwan has long been inundated with Chinese Communist Party (CCP) pro-unification rhetoric and flooded with fake news. These activities are part of the CCP’s decades-long campaign to persuade Taiwanese people to support unification with China, and a key aspect of “United Front work,” a strategy of propaganda and alliance-building that aims to influence civil societies overseas in ways that support the CCP’s central foreign policy goals.

In recent weeks, fresh indicators have emerged that the CCP is actively gearing up for a new series of United Front activities toward Taiwan. Due to its Chinese-speaking population as well as its cultural similarities, shared history, and people-to-people exchanges with China, the island has long been treated by party officials as a testing site for various Chinese-language influence operations. These have often manifested in the form of disinformation campaigns, which seek to weaken the Taiwanese public’s trust in the government or promote more China-friendly candidates in local elections. The 2018 suicide of Taiwanese diplomat Su Chii-cherng, following public backlash roused by fake news reports circulated by Chinese sources, and the election of Kaohsiung mayoral candidate Han Kuo-yu are prime examples of the potential impact of these campaigns.

Similarly, during the COVID-19 pandemic, CCP disinformation sought to undermine trust in the current administration through rumors that the government was deliberately allowing an outbreak to occur and concealing the true number of COVID-related deaths from the public. Such activities have often been accompanied by pro-unification rhetoric referring to Taiwanese people as “compatriots,” urging “both sides of the Taiwan Strait to come together,” and claiming that the people of Taiwan and China are part of “one family.”

Since early August, evidence has emerged of a new type of CCP influence operation, one that 1) diverges sharply from traditional messaging, 2) seeks to remain resilient against single points of failure, and 3) utilizes new types of digital platforms, such as the news aggregation and discussion site Reddit. This remains an ongoing campaign, and the findings outlined in this article are preliminary. Nevertheless, these developments raise questions over the effectiveness and feasibility of the CCP’s disinformation and propaganda campaigns, and they warrant close attention as they unfold in the coming weeks and months.

A Shift in Tone

On the heels of Pelosi’s visit to Taiwan, a Chinese-language essay titled “Urging Tsai Ing-wen and her military and political leaders to surrender” (敦促蔡英文及其军政首脑投降书) was published online. Authored by Chen Xianyi, a former deputy director of the PLA Daily, the essay is seemingly addressed to Taiwanese President Tsai Ing-wen and Taiwan’s political and military leaders, and it stresses that a “new stage” in cross-strait relations has begun.

The essay opens with vivid descriptions of wartime scenarios in which “the deafening sounds of artillery fire surround you on all sides” and “the PLA’s modern fighter jets have surrounded all of Taiwan like an iron barrel.” Chen also declares that the median line between China and Taiwan no longer exists and threatens to enforce a blockade, the duration of which remains at the whim of the CCP and the Chinese people. Compared to previous pro-unification rhetoric from Chinese officials, such as “Chinese people do not fight Chinese people,” this marks a dramatic shift in tone in the CCP’s messaging toward the Taiwanese public.

The sharp shift in tone has been accompanied by a perhaps more alarming development: the emergence of a new digital disinformation campaign whose distribution methods deviate from the previous, well-established mold. Chen’s essay was turned into a video with traditional Chinese subtitles, narrated in Southern Min (Minnan hua), a dialect spoken in Taiwan but not as widely as Mandarin. The video also features a montage of PLA military footage.

Based on the CCP’s past propaganda and disinformation campaigns, one would expect this content to be shared, circulated, and promoted on platforms like Twitter by prominent Chinese officials, so-called “wolf warriors.” A key goal of this campaign style is public opinion management, as previous research indicates.

However, following Pelosi’s visit, it appears the CCP is evolving this strategy. Back in the summer of 2020, Twitter removed over 23,000 accounts for being part of a CCP disinformation campaign, along with a further 150,000 accounts that sought to boost the tweet traction of the former group. The CCP’s new approach to circulating disinformation and propaganda, by contrast, builds in a degree of resiliency to avoid quick detection and removal.

The new campaign style appears to adopt a two-stage process, the first of which involves uploading video content onto Google-owned platforms such as YouTube through fake accounts with generic or gibberish names. For example, the video of Chen’s essay has been uploaded through YouTube accounts under the names of Araceli Kihn, Analisa Stradley, Eliseo Aimee, bshdh snbdg, and so on.

Here is the list of accounts that we have uncovered, along with their channel creation date (note that some might have already been taken down):

The second stage of this process involves circulating links to these YouTube videos on non-Google platforms such as Reddit – a news aggregation and discussion website consisting of subreddits, communities that revolve around a particular topic or theme. Fake Reddit accounts are then created to serve as a distribution mechanism by referencing (via hyperlink) the videos posted on the fake YouTube channels. Below are some of the fake Reddit accounts that we detected, along with their account creation date:

Building Resiliency Into Digital Disinformation Campaigns

The use of these separate digital platforms in conjunction (Google-owned and non-Google-owned) appears to demonstrate the CCP’s desire to strengthen the resilience and longevity of its digital disinformation framework. By contrast, previous disinformation and propaganda campaigns tended to rely on a single platform, such as Twitter, for both content hosting and content distribution. That method risked having a “single point of failure,” as the mass removal of CCP-affiliated Twitter accounts in 2020 highlights. Platform administrators could easily uncover a web of fake accounts that constituted a link in the broader disinformation network simply by tracing the digital activity and footprint of a single fake account (i.e., whom it followed, retweeted, or engaged with).

In this new style of disinformation and propaganda distribution, using sites like YouTube for content hosting and platforms like Reddit for content distribution builds in resilience by ensuring that no single break can sever the chain. Under this set-up, removing CCP-affiliated accounts would require two successful takedowns: one on the hosting site and another on the distribution site. If one account is taken down by administrators, it only partially disrupts the overall chain of disinformation. For example, if one of the aforementioned YouTube channels were removed, the Reddit account would likely remain untouched and could simply reference another campaign-affiliated YouTube channel, and vice versa.
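
To make the logic of that redundancy concrete, the toy Python sketch below models the set-up as distribution accounts holding pointers to a pool of hosting channels. All channel and account names (channel_a, distributor_1, and so on) are invented for illustration, and the code is a minimal sketch of the structural argument rather than a depiction of the actual campaign infrastructure.

```python
# Toy model of the two-stage hosting/distribution setup described above.
# All names are invented placeholders, not real channels or accounts.

hosting_channels = {"channel_a", "channel_b", "channel_c"}   # e.g., YouTube uploads
distributors = {                                             # e.g., Reddit accounts
    "distributor_1": "channel_a",   # each distributor links to one hosted video
    "distributor_2": "channel_b",
}

def take_down(channel: str) -> None:
    """Simulate moderators removing one hosting channel."""
    hosting_channels.discard(channel)
    # Distributors whose link is now dead re-point to a surviving channel.
    for account, linked in distributors.items():
        if linked == channel and hosting_channels:
            distributors[account] = next(iter(hosting_channels))

take_down("channel_a")
print(distributors)            # distributor_1 now references a surviving channel
print(bool(hosting_channels))  # True: the chain is degraded, not broken
```

Fully dismantling such a chain would require coordinated takedowns on both the hosting and the distribution side, which is precisely the resiliency described above.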

Unlike previous Chinese disinformation and propaganda campaigns, which featured a great deal of mutual interaction between accounts, these YouTube channels notably do not engage in any form of platform interaction (for example, commenting on other videos), suggesting they are intended to be as inconspicuous as possible. As long as these accounts maintain “interaction silence,” the content they host can continue slipping under the radar. The chance of these channels being discovered is minimized because their existence can only be detected if they are referenced by the content distribution site (in this case, Reddit). By avoiding interaction with each other, the accounts avoid generating a footprint of digital activity that would easily identify them as part of a wider disinformation network, greatly reducing their likelihood of being identified and removed by platform administrators. This again appears to signify an important shift in Beijing’s approach to digital disinformation and propaganda circulation, one that could apply to other cases beyond Taiwan.
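
A rough way to see why this “interaction silence” matters is to treat platform interactions as edges in a graph and trace outward from a single flagged account, much as a moderator might. The Python sketch below is a minimal illustration of that reasoning, using invented account names and a simple breadth-first walk; it does not represent any platform’s actual moderation tooling.

```python
from collections import deque

def trace_network(interactions, seed):
    """Breadth-first walk over an interaction graph (follows, retweets,
    comments), starting from one flagged account."""
    found, queue = {seed}, deque([seed])
    while queue:
        account = queue.popleft()
        for neighbor in interactions.get(account, set()):
            if neighbor not in found:
                found.add(neighbor)
                queue.append(neighbor)
    return found

# Old style: accounts that retweet and comment on one another form a connected
# web, so flagging a single account exposes the whole cluster.
chatty = {"bot_1": {"bot_2", "bot_3"}, "bot_2": {"bot_1"}, "bot_3": {"bot_1"}}
print(trace_network(chatty, "bot_1"))      # {'bot_1', 'bot_2', 'bot_3'}

# New style: "interaction silence" means no edges, so tracing one account
# reveals only that account.
silent = {"channel_x": set(), "channel_y": set(), "channel_z": set()}
print(trace_network(silent, "channel_x"))  # {'channel_x'}
```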

An Incoherent or Evolving Digital Disinformation Strategy?

While the CCP’s attempt to build greater digital resiliency into its disinformation campaigns is worth examining further, these recent activities also raise questions over the coherence and feasibility of this strategy. The first question involves the objectives of these campaigns. If the goal is to instill fear of a Chinese invasion in the Taiwanese population, then the video of Chen’s essay would likely have been more effective had it been voiced in Mandarin, Taiwan’s lingua franca, rather than Southern Min.

Second, if the target audience of this latest disinformation campaign is indeed the Taiwanese population, as it appears, using Reddit as a distribution mechanism would likely have minimal effect. Most Reddit users neither read Mandarin nor understand Southern Min, and Reddit is not as popular as other discussion forums among Taiwanese people.

Furthermore, some of the fake Reddit accounts within this disinformation campaign have also posted the video in forums with no apparent relevance to its content – such as this post, which appeared on a subreddit for pet adoption. It is therefore no surprise that such posts are deemed irrelevant to those subreddits and deleted by auto-filters. In response, the fake accounts resort to posting the video on their own Reddit profile pages, which is unlikely to gain much digital traction.

All these developments point to either an incoherent digital disinformation strategy, or an attempt to utilize YouTube and Reddit as testing grounds to further improve and refine future CCP-led digital disinformation campaigns. For policymakers, researchers, and the people of Taiwan, this is certainly a development worth our careful attention over the coming weeks and months. By examining the CCP’s attempts to innovate its disinformation and propaganda tactics toward Taiwan, we can better understand and identify similar activities unfolding in other countries and regions.