CNN [excerpt]: According to a report by Stockholm University’s Varieties of Democracy Project, published in March this year, #Taiwan for the 10th consecutive year received the greatest amount of #disinformation from outside its borders, highlighting the need for effective fact-checking mechanisms on the island.

A growing security risk

Disinformation is something Taiwan’s security agencies are particularly alert to. At a recent closed-door security briefing attended by CNN, Taiwan’s intelligence community warned that #China has been working to influence Taiwan’s upcoming #election through a series of disinformation, military and economic operations, with the goal of boosting the chances of opposition candidates who favor improving ties with Beijing. According to Taiwanese intelligence, Wang Huning, the fourth-ranking leader in the Chinese Communist Party, recently convened a meeting to coordinate efforts to influence the election while reducing the likelihood that external parties could find evidence of such interference.

“They hope that the party they dislike will lose the election,” said a senior Taiwanese security official, referring to the ruling Democratic Progressive Party (DPP), which views Taiwan as a de facto sovereign nation and has prioritized elevating Taipei’s ties with Western powers since taking office in 2016. The DPP’s candidate, Vice President Lai Ching-te, is currently leading in the polls and is openly loathed by Chinese officials. Lai is ahead of two other candidates – Hou Yu-ih of the Kuomintang and Ko Wen-je of the Taiwan People’s Party – who are seen as favoring closer relations with Beijing.

Among the different strategies deployed by Beijing, Taiwan believes China’s cognitive warfare operations – which include spreading disinformation in Taiwan and magnifying talking points that favor China-friendly candidates – are the most sophisticated, multiple officials said at the briefing.
Besides operating content farms and fake accounts on #socialmedia, the officials alleged that China’s information operations are multifaceted. Other tactics attributed to Beijing include working with private companies to impersonate genuine #news websites; handpicking soundbites that fit Beijing’s narratives from Taiwanese television programs and repackaging them into short social media videos; and illicitly funding small news organizations in Taiwan that mostly report on local livelihood issues but occasionally post content that casts doubt on candidates unfavorable to Beijing.

...Besides spreading rumors, Beijing has also been exerting pressure on Taiwanese businesses with investments in mainland China to toe the party line, and luring Taiwanese politicians with discounted trips to mainland cities in an attempt to generate support for candidates lobbying for closer ties with Beijing, the officials claimed. #geopolitics
Political Propaganda Techniques
Explore top LinkedIn content from expert professionals.
-
Microsoft’s second Threat Intelligence Election report for the USA, published today.

BLUF: Russian efforts are focused on undermining U.S. support for Ukraine, while China seeks to exploit societal polarization and diminish faith in U.S. democratic systems.

🇷🇺 For example, the actor Microsoft tracks as Storm-1516 has successfully laundered anti-Ukraine narratives into U.S. audiences using a consistent pattern across multiple languages. Typically, this group follows a three-stage process:
1️⃣ An individual presents as a whistleblower or citizen journalist, seeding a narrative on a purpose-built video channel.
2️⃣ The video is then covered by a seemingly unaffiliated global network of covertly managed websites.
3️⃣ Russian expats, officials, and fellow travellers then amplify this coverage.
4️⃣ Ultimately, U.S. audiences repeat and repost the disinformation, likely unaware of its original source.

🇨🇳 China is using a multi-tiered approach in its election-focused activity. It capitalizes on existing socio-political divides and aligns its attacks with partisan interests to encourage organic circulation.

💻 China’s increasing use of AI in election-related influence campaigns is where it diverges from Russia. While Russia’s use of AI continues to evolve in impact, People’s Republic of China (PRC) and Chinese Communist Party (CCP)-linked actors leverage generative AI technologies to effectively create and enhance images, memes, and videos.

🤡 Audiences do fall for generative AI content on occasion, though the scenarios that succeed have considerable nuance.
The following factors contribute to generative AI risk to elections in 2024:
✔️ AI-enhanced content is more influential than fully AI-generated content
✔️ AI audio is more impactful than AI video
✔️ Fake content purporting to come from a private setting, such as a phone call, is more effective than fake content from a public setting, such as a deepfake video of a world leader
✔️ Disinformation messaging has more cut-through during times of crisis and breaking news
✔️ Impersonations of lesser-known people work better than impersonations of very well-known people such as world leaders

Report: https://lnkd.in/d-DjesN6
-
Russia is employing new tactics to spread disinformation in Scotland, now targeting small language communities through automated websites in minority languages. One recent example is the Pravda Alba website, which publishes fake news in Gaelic – a language spoken by only 1 in 40 Scots – to fuel ethnic tensions and discredit local politicians such as Scottish Labour leader Anas Sarwar. The site falsely claims, without evidence, that he is working to allow Pakistani Muslims to dictate what is taught in schools, among other racist and xenophobic insinuations: https://lnkd.in/dUdcYy3p.

This campaign is part of a broader Russian strategy to destabilize Western democracies through disinformation, now using artificial intelligence for automated translation and mass dissemination of content in dozens of languages. The network of sites, bearing the name Pravda (“truth” in Russian), covers more than 80 countries and 130 different platforms, including those targeting Maori in New Zealand and Welsh speakers in the UK, and is already well known to information operations and disinformation researchers. The goal is to flood the internet ecosystem with pro-Russian fake news so that even chatbots and search engines start reproducing these narratives as credible information.

Experts note that the choice of small language communities is deliberate: disinformation spreads more easily where high-quality content in those languages is scarce, and automated translation allows for the rapid, cheap production of large volumes of material. The strategy also relies on the idea that even a small portion of the affected community might pass these false narratives on to the English-speaking majority. In the case of Pravda Alba, the content is machine-translated from Russian, often with grammatical and semantic errors, highlighting the campaign’s mass-produced rather than personalized nature.
Although the site does not have a large audience, specialists warn that flooding the internet with such materials has a long-term effect, undermining trust in the media and making it easier for disinformation to penetrate automated systems and search engines. Russian disinformation operations in Scotland are not new. Moscow has previously attempted to influence public opinion through outlets like Sputnik, social media, and campaigns to sow distrust in institutions and societal division. What’s new is that, with the help of artificial intelligence, this strategy can now be applied even to the smallest linguistic and cultural communities, making it even harder to counter.
-
𝗪𝗶𝗹𝗹 𝘁𝗵𝗲 🇺🇸 𝟮𝟬𝟮𝟰 𝗨𝗦 𝗣𝗿𝗲𝘀𝗶𝗱𝗲𝗻𝘁𝗶𝗮𝗹 𝗘𝗹𝗲𝗰𝘁𝗶𝗼𝗻 𝗕𝗲 𝗟𝗶𝗸𝗲 𝟮𝟬𝟭𝟲 🐘 𝗼𝗿 𝟮𝟬𝟮𝟬 🐎 ?

In our latest study, published in the Journal of Business Research, Prof. Dr. Koen Pauwels, Dr. Kai Manke, and I analyze over 200 million social media posts to map the dynamic system and "echoverse" of political marketing. By combining campaign advertising data with media coverage, online and offline word-of-mouth (WoM) data, disinformation, and candidates’ own social media posts, we demonstrate that the political marketing system is far more dynamic than the traditional marketing echoverse (as shown by Hewett et al., 2016). Our empirical analysis uncovers numerous bidirectional effects among the various stakeholders in this system, which can be leveraged to generate attention, engagement, media coverage, and ultimately, support for a candidate.

Our key findings include:
💯 Both social media and traditional TV campaigns influence polls; however, social media is becoming more impactful than traditional TV.
📱 Candidates’ social media actions drive online discussions, which in turn influence polling numbers.
📣 💩 Social media chatter and polling data significantly drive disinformation volume, which draws more media attention and further impacts polls.
📰 Traditional media coverage is predominantly driven by social media discussions and disinformation, amplifying online debates and ultimately enabling disinformation to influence polls.
🛑 While external events do affect political support, these impacts are much smaller than those driven by marketing, media, and WoM effects.

🗳 What should we expect in 2024? It seems the media still hasn’t learned its lesson. Trump continues to receive significantly more media coverage and attention than Harris. Once again, media coverage is heavily driven by social media interest and outcry, amplifying the MAGA narrative due to the rivalry between traditional and social media.
Both candidates are seeing high social media engagement, but Trump benefits from a larger established user base, while Harris has lost momentum over the past three to four weeks. Where does that leave us? Certainly, in a very close race that is difficult to predict, but with a slight advantage for Trump. Study link in the first comment!
-
Major respect to Saman Nazari, Maria V. and Pavlo Kryvenko, and contributor Aleksandra Wójtowicz for their incredible work furthering research into the Doppelganger disinformation network, specifically on #Poland 🇵🇱. Shout out to Marie-Doha Besancenot for highlighting this article this morning.

➤ What Alliance4Europe and Debunk.org Exposed:
• Russia’s Doppelganger network reactivated – targeting the Polish presidential elections with anti-EU, anti-Ukraine, anti-establishment narratives.
• 279 coordinated inauthentic posts pushing divisive, high-emotion narratives on social media (mostly on X).
• False amplification tactics: fake Polish citizen personas, mass spam via bot networks, and real Polish news articles repurposed to inject disinfo.
• The Social Design Agency (a sanctioned Russian entity) is directly behind the operation.
• Clear election interference intent: inflame divisions, degrade political trust, and facilitate state propaganda.

➤ Tactics Identified (per the DISARM Framework):
✴ Divide society (using cost of living, climate, immigration).
✴ Degrade adversaries (smear campaigns against Donald Tusk and EU leaders).
✴ Amplify pro-Kremlin narratives under the guise of local discourse.
✴ Fabricate legitimacy by using stolen or purchased X accounts dating back years.

➤ The Broader Implications:
⚡ Influence operations are moving from fake-news creation to hijacking legitimate media and manufacturing grassroots voices.
⚡ Cognitive warfare now involves persistent, low-attribution manipulation – the information space is under continuous hostile shaping.
⚡ Platform defenses (like X’s) are lagging. Rapid detection and disruption capabilities are critical, especially during elections. The vignette about X’s lack of reaction once alerted to the bot network is particularly alarming.
⚡ Defense tech must integrate agentic cognitive defenses – we can’t just monitor narratives anymore; we need dynamic counteraction at machine speed.
The battlefield isn't just physical or cyber anymore — it's inside public opinion, inside societies, and inside democracies. Read the article here: https://lnkd.in/emGssaba #CognitiveWarfare #InformationWarfare #DefenseTech #Disinformation #HybridWarfare #NationalSecurity #Doppelganger #Alliance4Europe #VannevarLabs #CounterInfluence