Former U.S. Defense Intelligence Agency (DIA) Director Vincent R. Stewart has observed that modern war has entered the age of fifth-generation warfare, which combines military and non-military means and employs information and public opinion control to seize the strategic high ground. From this emerges a new term: hybrid warfare. Russia’s 2014 Crimea campaign is a typical example, in which non-conventional methods (subversive activities and cyberattacks) were combined with conventional forces to achieve geopolitical strategic goals.
Just as Russia combined feints and real moves in its 2014 Crimea operations, a similar situation could occur in Taiwan. For example, in a crisis in the Taiwan Strait, pro-China elements cooperating from within the island might set up an interim government and announce that it has taken over the military and replaced the current government. The announcement, regardless of its legitimacy, could create considerable turbulence and even sap the military’s will to fight. With the cooperation of certain media outlets, the interim government could tell local people that it controls the whole country while declaring to the international community that the current government has surrendered. Such measures are meant to break the people’s will to resist and to reduce the willingness of other countries to intervene. In such a scenario, the media – not military might – is likely to be the decisive factor. Whether Taiwan’s existing crisis management mechanism can restore media functions amid chaos and secure the legitimacy of the current government remains to be seen. The military should not make light of a war scenario of this kind. Publicity through the media, not direct warfare, is now the main field of competition between nations.
Unrestricted Warfare, a book written by two senior colonels of the PLA, attracted much attention after its release in 1999. Today, with the assistance of technology, new variations of unrestricted warfare have emerged, which can be used together with the “three warfares” concept China unveiled in 2003 to gain an advantage in the struggle to control public opinion.
With its technological progress, China has moved beyond its past internet blockade tactics to actively initiating public opinion warfare on the basis of its technological advantages in many areas. It spares no effort in promoting its image in the international community via official and unofficial channels. The most important element of this digital war for public opinion is knowledge of the target’s computer usage habits, preferred topics, ideology, and political orientation. Personal data of this kind provides an ideal sample for designing computer viruses and tactics to manipulate public opinion.
Apart from implanting malware through advanced persistent threat (APT) attacks, China can also obtain computer users’ personal information through a combined use of hardware and software. China’s progress in information technology in recent years has allowed its consumer electronics to enter the global market thanks to their lower prices. Electronic products manufactured in China, including routers, mobile phones, and laptop computers, might all be embedded with backdoor programs that automatically send user information – largely about individual usage habits – back to servers in China. With respect to software, people outside of China often use certain communication apps in their contacts with Chinese people, or register as users of Chinese-developed social media platforms or apps. These apps might likewise contain backdoor programs that automatically send user information back to such servers. That information can then be analyzed and used for future cyberattacks.
The most common form of digital public opinion warfare is the content farm: a website or web-based company that seeks profit by attracting large numbers of page views. Content farms publish large numbers of articles, promote them through accounts under their control, and send them via communication apps to a wide range of users. Many of these users forward the articles without reading them through, becoming unwitting accomplices in disseminating fake news.
Content farm articles have also been weaponized. They often attack certain celebrities or scholars as part of a smear campaign. For example, after an interview with a media organization with a specific political stance, one scholar saw his remarks misinterpreted by content farms that purposefully took his words out of context. He immediately posted a clarifying statement on social media, but it was too late: the misinterpreted remarks had already gone viral.
Content farm articles contain more than public opinion warfare propaganda rewritten from other articles. They are often padded with seemingly irrelevant text or pictures – greetings, “life hacks,” tips on staying healthy, show business gossip – meant to attract readers and lure them into sharing the articles. It is also a way of winning trust. Once that trust is won, the general public can be fed misinformation, resulting in, for example, delays in medical treatment or misunderstandings about government policies.
At critical moments such as elections and government announcements of major decisions, large amounts of fake news could be released to sway public opinion. At opportune times, distributed denial of service (DDoS) attacks could also be launched against government websites to stop the general public from accessing correct information. People might be misled by rumors into creating internal turbulence or even taking violent action. As the country becomes divided, the enemy achieves the strategic goal of subduing the opponent without fighting.
As a democratic country, Taiwan provides legal protection for freedom of speech. Because content farm articles are not entirely false, they are difficult to crack down on; moreover, the websites usually post disclaimers to exempt themselves from liability, making it hard for law enforcement to take legal action against them.
With the spread of internet technology, the combined use of digital technology and public opinion warfare online will become a focus of attention for every country. How to stay one step ahead of the enemy, guard against possible war scenarios, and even launch a preemptive strike to gain the high ground in digital public opinion warfare are questions Taiwan will need to probe in the future.
In developing conventional and non-conventional tactics to pose military threats, the PLA has notably adopted more diversified and flexible means. Although the chances of the PLA using force against Taiwan are low, extra-military means – such as sending military aircraft and ships to patrol around Taiwan and amplifying the effect by publicizing photos of the patrols – could still sway public opinion in Taiwan while achieving an intimidating effect. Digital public opinion warfare is another option. All these tactics may affect the morale of both civilians and soldiers in Taiwan, and their combined effect could exceed that of the regular military drills directed toward the island; the threat fake news poses to Taiwan’s security cannot be ignored.
In response, Taiwan should strengthen its public communication: provide correct information and establish channels to clarify rumors and stop misinformation from spreading. The government should publicize its policy orientations and relevant information on a regular basis, which would help it gain an advantageous position in information control. Relevant agencies should develop their spokespeople’s crisis management capabilities and build communication channels with conventional and social media so that they can immediately issue clarifications, halt the spread of misinformation, and prevent Taiwan from being manipulated by outside forces seeking to create internal unrest on the island. An independent third-party fact-checking mechanism could even be established to add credibility to the effort.
Dr. Ying Yu Lin is assistant professor at the Institute of Strategic and International Affairs, National Chung Cheng University, Taiwan.