"Disinformation campaigns aimed at undermining electoral integrity are expected to play an ever larger role in elections due to the increased availability of generative artificial intelligence (AI) tools that can produce high-quality synthetic text, audio, images and videos and their potential for targeted personalization. As these campaigns become more sophisticated and manipulative, the foreseeable consequence is further erosion of trust in institutions and heightened disintegration of civic integrity, jeopardizing a host of human rights, including electoral rights and the right to freedom of thought. These developments are occurring at a time when the companies that create the fabric of digital society should be investing heavily in, but instead are dismantling, the “integrity” or “trust and safety” teams that counter these threats. Policy makers must hold AI companies liable for the harms caused or facilitated by their products that could have been reasonably foreseen. They should act quickly to ban using AI to impersonate real people or organizations, and require the use of watermarking or other provenance tools to allow people to differentiate between AI-generated and authentic content." By David Evan Harris and Aaron Shull of the Centre for International Governance Innovation (CIGI).
Election Security Practices
Explore top LinkedIn content from expert professionals.
-
Online misinformation is frequently highlighted as a blight that threatens to undermine the fabric of society, polarize opinions and even destabilize elections. In the latest issue of Nature, a collection of articles probes the scourge of misinformation and tries to assess the real risks. In one research paper, David Lazer and colleagues examine the effects of Twitter deplatforming 70,000 traffickers of misinformation in the wake of violent scenes at the US Capitol in January 2021. In a second paper, Wajeeha Ahmad and co-workers explore the relationship between advertising revenue and misinformation. A Comment article by Ullrich Ecker and colleagues discusses the risks posed by misinformation to democracy and elections, and an accompanying Comment article by Kiran Garimella and Simon Chauchard assesses the prevalence of AI-generated misinformation in India. And David Rothschild and colleagues put the harms of misinformation into perspective, highlighting common misperceptions that exaggerate its threat and suggesting steps to improve evaluation of both the effects of misinformation and the efforts made to combat it. In our accompanying editorial we call for more data availability for researchers and greater transparency from online platforms. https://lnkd.in/eWi6f_qt Nature Portfolio
-
The article discusses a recent development in Georgia's election system, where election-denying MAGA supporters have gained control of the State Elections Board. Their influence has led to the implementation of a controversial ballot-counting measure aimed at disrupting future elections. This measure is perceived as a strategic move to hinder the counting and certification of election results, particularly in the event of a Democratic victory. The ultimate goal, as suggested by the article, is to create delays that could potentially allow the election outcome to be decided by the House of Representatives, a scenario that could favor the former president. The article highlights concerns about the integrity of Georgia's election process, pointing out that these changes are part of a broader effort to undermine the electoral system. The author argues that the hand-counting measure, while framed as a way to improve accuracy, is in reality a tactic to create confusion and chaos in the vote-counting process. This could lead to significant delays, casting doubt on the legitimacy of the election results and potentially paving the way for further manipulation. Overall, the piece portrays these developments as a calculated attempt by election-deniers to exert control over Georgia's elections, with the potential to disrupt the democratic process. The author expresses alarm at the implications of these actions, warning that they could severely undermine public confidence in the electoral system and the peaceful transition of power.
-
5 Ways AI Is Impacting the U.S. Election

As AI advances, its influence on democracy grows, reshaping campaign strategies, voter outreach, and information dissemination. Embodying the duality of technology, AI has the potential to enhance democracy but also introduces risks, particularly around misinformation and privacy.

1. CREATOR AND DESTROYER OF MISINFORMATION
A primary concern is AI's role in fueling misinformation. Deepfakes and AI-generated text spread rapidly, polarizing the public and eroding trust. Recently, Grok, an AI chatbot on X, misled voters with outdated polling hours, taking 10 days to correct. This incident underscored the risk of unmonitored AI delivering misinformation directly, rather than misinformation merely being created by users. Yet AI also combats misinformation: it is increasingly deployed to detect the very deepfakes it can generate, and platforms like Facebook now integrate AI fact-checking, providing real-time verification essential for journalists and voters alike.

2. THE ILLUSION OF CONNECTION: CHATBOTS IN POLITICS
Chatbots and AI-powered communication tools allow campaigns to automate interactions, responding to voter inquiries on social media or through direct messaging at scale. The risk is that these interactions create an illusion of accessibility and engagement, giving voters the sense that they are interacting directly with a candidate when they are not. There are obvious philosophical implications to reducing politics to superficial interactions rather than meaningful civic engagement.

3. PRECISION OR MANIPULATION? MICRO-TARGETING
AI's data analysis powers micro-targeting in campaigns, analyzing demographic data, social media trends, and voting patterns. This lets campaigns tailor messages, for instance climate change for younger voters and Social Security for older ones, which is critical in swing states. However, this precision raises ethical concerns around manipulation and privacy, even as it improves campaign alignment with voter priorities.

4. DETECTING FOREIGN INFLUENCE
AI is the primary tool for countering foreign influence online. FBI Director Christopher Wray noted that generative AI tools lower the barrier for foreign actors to create and distribute disinformation, such as deepfakes. The FBI is collaborating with social media platforms to detect interference and deactivate bot accounts aimed at swaying public opinion.

5. SECURING THE VOTE
Cybersecurity has been bolstered, with AI now detecting anomalies that could indicate hacking. The Department of Homeland Security is using advanced AI-driven monitoring in states like Pennsylvania and Arizona in response to rising threats to election databases. In Georgia, AI has helped identify potential voter suppression patterns, especially in marginalized communities.

This split between progress and concern underscores the need for scrutiny. Political philosophers, AI companies, and legislators alike must ensure AI's democratic potential is balanced against the risks it poses.
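The post does not describe how the "AI-driven monitoring" it mentions actually works, but the core idea of anomaly detection can be sketched in a few lines. The sketch below is purely illustrative (the data, threshold, and scenario are all invented for this example, not drawn from DHS practice): it flags hours whose request count to a hypothetical election database deviates sharply from the baseline.

```python
# Illustrative sketch only -- thresholds and data are assumptions, not DHS's
# actual methods. Flags hours whose request count sits far above baseline,
# the kind of spike a credential-stuffing or scraping burst might produce.
from statistics import mean, stdev

def flag_anomalies(hourly_counts, threshold=2.0):
    """Return indices of hours whose count exceeds `threshold` standard
    deviations above the mean, a crude stand-in for 'AI-driven monitoring'."""
    mu = mean(hourly_counts)
    sigma = stdev(hourly_counts)
    if sigma == 0:
        return []  # perfectly flat traffic: nothing to flag
    return [i for i, c in enumerate(hourly_counts)
            if (c - mu) / sigma > threshold]

# Quiet traffic with one suspicious spike at hour 6.
counts = [120, 115, 130, 118, 125, 122, 900, 119]
print(flag_anomalies(counts))  # [6]
```

Real systems use far richer features (source IPs, query shapes, time-of-day baselines), but the flag-what-deviates principle is the same.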
-
The Trump administration has initiated the dismantling of crucial federal defenses against foreign interference in U.S. elections, raising significant concerns:

- Closure of the FBI's Foreign Influence Task Force
- Reduction of over 100 positions at the U.S. Cybersecurity and Infrastructure Security Agency (CISA)
- Absence of federal partners at the National Association of Secretaries of State winter meeting

States cannot address this issue independently. Pennsylvania's Republican Secretary of the Commonwealth, Al Schmidt, is on record stating: "It is foolish and inefficient to think that states should each pursue this on their own."

Why this matters: foreign meddling in U.S. elections is not hypothetical but a documented fact. The Senate Intelligence Committee's bipartisan report revealed that Russian operatives targeted election systems in all 50 states in 2016. The Department of Justice confirmed similar attempts by Iran, China, and Russia in recent elections. There is no reason to believe they will stop.

What can we reasonably expect to happen as a consequence?

1. Heightened vulnerability: state election systems will lack federal backing against sophisticated foreign actors
2. Fragmented defenses: states may adopt inconsistent security measures without unified federal support
3. Loss of expertise: disruption of years of institutional knowledge and security partnerships
4. Erosion of public trust: visible security measures are crucial to maintaining trust in election integrity

The crucial question is: what do we stand to gain by weakening these protections, and at what expense to our democratic processes? CISA has played a vital role in providing essential services to states, including vulnerability assessments, security evaluations, and Election Day crisis readiness. These services have bolstered election infrastructure nationwide, irrespective of political affiliations.
#ElectionSecurity #CyberSecurity #VoterProtection #DemocracyMatters #NationalSecurity #CISA #CriticalInfrastructure https://lnkd.in/eqiJRn36
-
This week, I had the privilege of testifying before Congress on how combating financial crime is critical to safeguarding our elections and democracy. Foreign actors are exploiting vulnerabilities in our financial systems, using shell companies, crowdfunding platforms, and cryptocurrencies to fund election interference. These tactics aren’t just undermining democracy; they’re part of a broader epidemic of financial crime that cost Americans $12.5 billion in fraud losses last year alone and drives an estimated $2 trillion in money laundering globally each year.

Here’s what’s needed to close the gaps:

✔️ Expand 314(b) protections: include crowdfunding sites, FinTech platforms, and crypto exchanges, paired with safe harbor to encourage information sharing.
✔️ Incentivize collaboration: offer tax breaks or grants for financial institutions and tech companies that actively participate in sharing intelligence.
✔️ Close regulatory gaps: mandate AML and KYC compliance across emerging payment systems.
✔️ Leverage advanced technology: use AI to detect fraud while adopting privacy-preserving tools like fully homomorphic encryption.

Solving financial crime doesn’t just protect consumers; it disrupts election interference and strengthens our democracy. Together, we can safeguard both our financial systems and electoral integrity. https://lnkd.in/eNxeQk6U 5OH Consulting LLC Christine O'Neill
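The testimony names fully homomorphic encryption but gives no implementation details, so the sketch below illustrates the underlying goal (sharing intelligence between institutions without exposing raw customer data) with a much simpler, well-known pattern: both parties tokenize identifiers with a keyed hash (HMAC) under a shared key and compare only the tokens. Everything here, including the key and account names, is invented for illustration.

```python
# Hypothetical sketch -- NOT homomorphic encryption, but a simpler
# privacy-preserving pattern that conveys the information-sharing idea:
# two institutions HMAC their account identifiers with a shared key and
# compare the irreversible tokens to find overlap without revealing lists.
import hashlib
import hmac

SHARED_KEY = b"negotiated-out-of-band"  # placeholder key for illustration

def blind(identifiers, key=SHARED_KEY):
    """Map raw identifiers to irreversible keyed tokens."""
    return {hmac.new(key, i.encode(), hashlib.sha256).hexdigest()
            for i in identifiers}

bank_a = blind(["acct-1001", "acct-2002", "acct-3003"])
bank_b = blind(["acct-2002", "acct-9999"])

# Overlap found without either side disclosing its full customer list.
shared_suspects = bank_a & bank_b
print(len(shared_suspects))  # 1
```

Production privacy-preserving record linkage and FHE schemes are considerably more involved; this is only the shape of the idea.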
“American Confidence in Elections: Prohibiting Foreign Interference”
https://www.youtube.com/
-
The UK general election in 2024: cybersecurity comes to the fore

It’s no secret that the United Kingdom has been bracing itself for a general election at some stage in 2024, with later in the year the suggested timeframe. The shock announcement this week that the election will take place as early as July 4th therefore came as a surprise to many. Journalists and political watchers have highlighted this year as unique from a global electoral perspective, with circa 64 countries, potentially 49% of the world’s democratic population (Time Magazine, 2023), holding some form of major government or political election during 2024. This confluence of elections and political communication circulating from the UK and across the globe to influence nations and potential voters creates a perfect storm for cyber attacks, intentional and unintentional misinformation, disinformation and content manipulation.

So what, now what?

It is essential that both personal IT users and organisations review and enhance email security and phishing defences, to reduce the volume of inbound content that may look legitimate but is in fact a well-formed cyber attack masked as electoral news or insight, capitalising on the human target’s need for information. This may also be the first time organisations must consider the implications of GenAI-created digital misinformation or deepfakes designed to influence groups of the population and present “seemingly” legitimate information, with the user wholly unaware it is misleading or fake. The IT and cyber industry, in tandem with the UK government, must amplify discussions and actions to tackle the risks of digital and cyber attacks that will be used throughout this year to target individuals and influence electoral outcomes, and must publicly increase messaging and education to raise awareness of cyber attacks and encourage all users of IT or digital systems to be on their guard.

- Don’t forget to maintain brilliant basics (patching, MFA, EDR, etc.), because ransomware may be the final payload of a cyber attack campaign.
- Consider protection using digital platforms and human fact checkers to minimise the flow of compromised or malicious information into an organisation.
- Commence updates to IT users, cascaded to family members, with tips on what to look out for when assessing potentially deepfake content or manipulated information designed to breach, misinform or misdirect.
- Keep up to date with insight from the NCSC and other cyber threat intel sources to remain close to ongoing developments and potential cyber threats.
- Elevate the importance of ongoing identity hygiene reviews to highlight anomalies and deliver enhanced visibility and control of both human and non-human machine identities as they are dynamically used. More on this one as the stories unfold.

Dr. Colin
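The phishing-defence advice above can be made concrete with a toy scoring heuristic. This is an illustrative sketch only, not a real filter: the indicators (urgency wording, links pointing somewhere other than the claimed sender, hyphen-heavy lookalike domains) are well-known red flags, but the weights, thresholds, and example domains are all invented.

```python
# Toy phishing heuristics -- illustrative assumptions only, not a substitute
# for the email-security and EDR controls the post recommends.
import re

URGENCY = re.compile(r"\b(urgent|verify now|act immediately|suspended)\b", re.I)

def phishing_score(subject, body, link_domains, claimed_domain):
    score = 0
    if URGENCY.search(subject) or URGENCY.search(body):
        score += 1   # pressure language pushing the reader to act fast
    if any(d != claimed_domain for d in link_domains):
        score += 2   # links point somewhere other than the claimed sender
    if any(d.count("-") >= 2 for d in link_domains):
        score += 1   # hyphen-heavy lookalike domains are a common tell
    return score

# Hypothetical election-themed lure (domains invented for illustration):
score = phishing_score(
    subject="URGENT: verify now your postal vote",
    body="Click below or your registration is suspended.",
    link_domains=["electoral-commission-verify.example"],
    claimed_domain="electoralcommission.org.uk",
)
print(score)  # 4 -- all three heuristics triggered
```

Real gateways combine hundreds of such signals with sender reputation and ML models; the point here is only that "looks legitimate" is checkable.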
-
For my latest, I explored how the Justice Department has been curtailing its customary election year coordination to protect state-run voting processes from threats. I interviewed numerous state election officials and former DOJ election crimes attorneys who warned that the cumulative effect of this pullback is heightened risks of the Trump administration interfering in the November midterms or unwittingly exposing precincts to threats. Among the changes: DOJ leaders have eliminated a centralized command post responding to potential election crimes nationwide 24/7 on election weeks, discontinued mandatory election law training for prosecutors, and restricted access to threat briefings for state officials, said people briefed on the situation. Trump-appointed US attorneys will now control the responses to threats in their districts, rather than career DOJ veterans from the public integrity section and civil rights criminal section - offices that have both been gutted. “That is now a significant risk—that the political ideology of US attorneys may stymie election enforcement in a way that department guidelines have tried to avoid,” said Mark Blumberg, who left the department in February after spending the last 20 years leading the centralized election response on civil rights threats. https://lnkd.in/eFqpeEdf
-
A new Global Witness investigation – a headline item on Radio 4’s Today Programme this morning – reveals how social media accounts that appear to be bots are spewing divisive content ahead of the UK’s elections on Thursday. Our Digital Threats to Democracy team uncovered how a small group of these accounts have posted more than 60,000 tweets since the general election was announced. It is estimated that these tweets have been seen a staggering 150 million times. Bots that are particularly dangerous for our democracies hide the fact that they are bots and spread political messages that are frequently and intentionally divisive and hateful. Accounts like these threaten our democracies by drowning out the voices of real voters and subverting the conversation. In our research, we focused on two topics that are central to the current UK election debates: climate change and migration. We gathered all the tweets since the election was announced that used specific hashtags related to those topics. The overwhelming majority of the bot-like accounts we found are overtly party political – they don’t just express political opinions, but clearly align themselves for or against a particular political party. Some of the bot-like accounts spread extreme and violent Islamophobia and homophobia. Some spread anti-Semitism and transphobia. Some state that climate change is a “hoax”, that vaccines have created a “genocide” and that the so-called great replacement theory is a fact. The responsibility lies with the social media corporations to make sure that their platforms are not being manipulated and our democracies are not being put at risk. The EU has recently passed legislation, the Digital Services Act, that requires platforms to mitigate any risks that their platforms pose to electoral processes, with the threat of huge fines if they don’t. 
The UK’s new Online Safety Act requires platforms to protect against the risk of foreign interference, and bots are one of the ways that a foreign power can attempt to sow division. The major social media platforms already have policies banning harmful bots. As our findings show, however, these standards aren’t being sufficiently enforced. We are calling on X to investigate whether the accounts on the list of potential bots that we have provided violate its policies, and to invest more in protecting our democratic debate from manipulation. We wrote to X to give them the opportunity to comment on these findings, but they did not respond. https://lnkd.in/eQPZ68VF
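Global Witness does not publish its exact bot-detection criteria, but one of the simplest signals researchers use is posting volume over the observation window: the post describes a small group of accounts producing more than 60,000 tweets since the election was announced (22 May 2024). The sketch below is an illustrative assumption, with an invented threshold, of how such a rate check might look.

```python
# Illustrative volume heuristic -- the 60-posts/day threshold is an
# assumption for this sketch, not Global Witness's actual methodology.
from datetime import date

def tweets_per_day(tweet_count, first_seen, last_seen):
    days = max((last_seen - first_seen).days, 1)
    return tweet_count / days

def looks_bot_like(tweet_count, first_seen, last_seen, threshold=60):
    """Flag accounts averaging more than `threshold` posts per day,
    a rate that is hard to sustain manually."""
    return tweets_per_day(tweet_count, first_seen, last_seen) > threshold

# The ~6-week window from the election announcement to polling day:
campaign = (date(2024, 5, 22), date(2024, 7, 4))
print(looks_bot_like(4000, *campaign))  # True  (~93 tweets/day)
print(looks_bot_like(150, *campaign))   # False (~3.5 tweets/day)
```

In practice volume is combined with account age, content duplication, and coordination signals, since prolific humans exist too.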
-
Excellent (and terrifying) new analysis by Accountable Tech on the election preparedness of social media platforms, including a scorecard measuring to what extent their policies meet the recommendations made in the Democracy By Design roadmap: actionable, high-impact, and content-agnostic steps to protect the integrity of elections. Download here: https://lnkd.in/eNWHnHRE Thanks Nicole Gill and team for your hard work and insights.

TOPLINES:

1. Out of a possible 100% match to the Democracy By Design recommendations, no platform scores above 62%. Nextdoor performs the worst, with a 17% preparedness score.
2. There are insufficient guardrails to stop the spread of manipulated content depicting public figures, like deepfakes: just 20% of platforms (TikTok and Snapchat) have policies on the books that would prohibit deceptively manipulated media of public figures.
3. Platform features enable AI-generated political ads to be micro-targeted to voters: nearly every social media platform that allows political advertising does not explicitly prohibit AI-generated ads from being micro-targeted to voters.
4. There is a lack of transparency on performance and engagement related to election-related posts: no platform provides transparent access to data on the highest-performing and highest-engagement election-related posts, advertisements, accounts, URLs, and groups. That means voters, independent researchers, and election officials are left in the dark about how election-related information spreads across platforms.
5. There is insufficient "friction" to stop the spread of misleading election information: a majority of platforms do not have policies in place to put posts containing misleading or unverified election information behind click-through warning labels that include clear context and facts. Without these labels, election misinformation spreads more quickly and magnifies threats.
6. There is a lack of transparency around policy enforcement and safety teams: some platforms, like Meta, which have previously come under intense scrutiny for their role in amplifying electoral disinformation narratives, have numerous policies, but it is impossible to know how they are being enforced. Platforms have wide latitude when it comes to enforcement, and there is reason for skepticism that they meaningfully follow through. This is made more concerning by industry-wide layoffs and cuts to election integrity and safety teams, including the complete dismantling of X's election integrity team.
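A checklist-style scorecard like the one described, where each platform's "preparedness score" is its percentage match against a fixed set of recommendations, is straightforward to tally. The policy items and boolean values below are invented for illustration; they are not Accountable Tech's actual rubric.

```python
# Minimal sketch of a percentage-match scorecard. The checklist items and
# their values are hypothetical, not the report's real criteria.

def preparedness(checks):
    """Percentage of recommendations a platform satisfies, rounded."""
    return round(100 * sum(checks.values()) / len(checks))

# A hypothetical platform evaluated against five invented criteria:
platform = {
    "prohibits_manipulated_media_of_public_figures": True,
    "bans_microtargeted_ai_political_ads": False,
    "provides_researcher_data_access": False,
    "labels_unverified_election_claims": True,
    "maintains_election_integrity_team": False,
}
print(f"{preparedness(platform)}%")  # 40% -- 2 of 5 checks met
```

The real scorecard presumably weights recommendations differently; an unweighted tally is the simplest version of the idea.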