Taiwan is under attack from the inside and out
“I think the situation is more serious than in previous years,” said Lin, adding that the impact of information manipulation on the public has only grown. He divides the manipulation into two types. The first spreads from Chinese social media – such as Weibo, or other web groups and sites – into Taiwanese web groups, news media, and political talk shows, and from there into the public consciousness. The second has no clear source, such as the recent supply scare over locally produced garlic, the peanut supply scare in Yunlin county, and other unfounded local rumours. The news then becomes a battle over public opinion, where fact and fiction mix and confuse the public.
The first type is the same as the one documented in IORG’s report, Analysis of China’s Information Operations and Interpersonal Infiltration, which traces the spread of disinformation across seven incidents and shows how local proxies deepened its impact. IORG believes these local proxies play a critical role in the process of information manipulation: whether intentional or not, their high degree of participation leads to wider diffusion and greater impact for specific narratives, and a larger possibility of cognitive manipulation.
“Place of origin” labelling to expose the source of information, and raising awareness of what’s fact and what’s fiction
In order to properly monitor the influence of local proxies and reduce the damage they cause, Lin believes the issue should be discussed on two levels – ‘information recognition literacy’ and ‘information management.’ With regard to the former, Lin notes that Taiwan still lacks methods and tools for information literacy. “LINE has a mini-program called ‘Aunt Meiyu’ that helps clarify information, but the amount it can clarify is very limited.” Such a mechanism requires a back-end database that can effectively detect and clarify false information, and building one takes time. In addition, turning information recognition literacy into a habit is a long-term project; otherwise, people who believe false information won’t click on fact-checking tools like Aunt Meiyu in the first place.
As for regulations, Lin believes the most urgent task is establishing a “place of origin” labelling system, similar to the disclosure placed under the title of a YouTube video when it features sponsored content or other commercial activity. Twitter puts warning labels under controversial posts, and Lin believes such labels can serve as a short-term response to information manipulation attacks. That way, the public can know the origin of the content, raising awareness of whether the information is authentic.
The DPP’s role as a forum
Since Taiwan’s 2018 local elections, there has been an increase in the number and influence of political incidents caused by disinformation; the seven incidents analyzed by IORG all occurred between 2018 and 2020. Lin says the DPP has spent a lot of effort over the past year studying and reviewing how to deal with information manipulation attacks. The party has since accelerated its response speed and improved how it clarifies information. In the past, press releases were text-only; now they’ve evolved into simple illustrated “memes” that can be easily forwarded to others and can spread fast enough to compete with the false information.
As the DPP’s Deputy Secretary-General, Lin sees the party’s role as a forum where academics, civil society groups, and the media can come up with workable solutions to present to the government for review. He also calls on the government to respond to this new kind of information warfare, stressing that there is not enough discussion at the legislative level and that more effort is needed to frame potential solutions early on.