Wednesday, May 08, 2024

"The Human Mind Becomes a Battlefield"--5th Cyber Power Symposium on Hybrid Conflict/Warfare (CHP) theme: "The Cyber and Hybrid Aspects of Cognitive Warfare/Superiority" and Sascha Dov Bachmann on "Hamas-Israel: TikTok And The Relevance of The Cognitive Warfare Domain"

 

The Defence Horizon Journal has recently published a series of quite remarkable essays around the theme "aspects of cognitive superiority"--an issue at the heart of modern warfare--as the recent coherent eruptions on US campuses and elsewhere evidence quite brilliantly: Aspects of Cognitive Superiority: Shaping Beliefs and Behaviours (26 April 2024; free download HERE).   The issue was built around the 5th Cyber Power Symposium on Hybrid Conflict/Warfare (CHP) and its theme: "The Cyber and Hybrid Aspects of Cognitive Warfare/Superiority."  The object is to continue to advance the study of cognitive warfare, and in the process to consider some of its implications for traditional approaches to the protection of human rights, including speech.

All of the essays produced for that event are worth a careful read.  Dr. Teija Tiilikainen, Director of the European Centre of Excellence for Countering Hybrid Threats, set the tone:

The recent focus on the cognitive dimension addresses a specific target of cyber threats, which may be far more difficult to protect than physical systems or structures. When threats are directed against the cognitive dimension, it is the mental structures, or the human mind in general, that become the target. This is nothing new, as conflicts and war have always included a strong ideational dimension. Apart from physical objectives, political conflicts deal with ideas, ideologies, and narratives. The novelty of current threats to the cognitive dimension is linked to modern technologies and the cyber capabilities they provide to influence and manipulate the human mind. In this way, the cyber and cognitive dimensions become a perilous combination that requires the immediate attention of the security policy community. (Teija Tiilikainen, 'The Cyber and Hybrid Aspects of Cognitive Warfare/Superiority,' Aspects of Cognitive Superiority, supra, p. 4)

Major General Stefano Cont, Capability, Armament and Planning Director, European Defence Agency, noted in his keynote address:

In my view, cognitive warfare integrates cyber, information, psychological, and social engineering capabilities to achieve its ends. It exploits the internet and social media to target influential individuals, specific groups, and large numbers of citizens selectively and serially in society. Cognitive warfare therefore means that the human mind becomes a battlefield. The aim is not only to change what people think, but also how they think and act. When waged successfully, cognitive warfare shapes and influences individual and group beliefs and behaviours in favour of the tactical or strategic objectives of the attacker. We have to find the right answers to how we can strengthen our resilience against cognitive threats, and who we should educate, train and conduct exercises with to enhance our capacity to resist and respond. (Stefano Cont, 'The Cyber and Hybrid Aspects of Cognitive Warfare/Superiority,' Aspects of Cognitive Superiority, supra, p. 6)

All of this, of course, has the potential to upend the carefully crafted and quite vibrant foundational premises on which liberal democracy operates and that have served as its great strength. Premises crafted and so successfully applied in a prior historical era may, in the face of technological, moral, and strategic revolutions in the present historical era, be reshaped, whether the great liberal democratic institutions themselves drive that reshaping or it is driven by events. The discussion, with additional non-military contemporary origins in the COVID mis-, dis-, and mal-information policy debates, is in its infancy. Its consequences, however, already are being felt.  


At the same time it is necessary to pause for a moment to recall that the programming of people is an ancient science, one the domination of which has served as a foundation on which all social relations are built.  The process and its instruments were usually quite transparent, or at least easy enough for humans to track and expose. What makes the issue much more interesting now, and therefore far more useful as a critical instrument of warfare, is the way that technology has both enhanced and disguised the forms and projection of that programming. "Rudimentary capabilities previously limited CW-like operations to masses, nations, organizations, and occasionally high-priority leaders. Today, however, disruptive ICT has made identifying thousands—even millions—of specific individuals, analyzing their behaviors and traits, and targeting their cognition possible." (Majors Andrew MacDonald and Ryan Ratcliffe, U.S. Marine Corps, "Cognitive Warfare: Maneuvering in the Human Dimension," U.S. Naval Institute Proceedings, Vol. 149/4/1,442 (April 2023)).


Bernard Claverie and François du Cluzel noted that "Cognitive warfare is now seen as its own domain in modern warfare. . . Cognitive warfare is the art of using technological tools to alter the cognition of human targets, who are often unaware of any such attempt - as are those entrusted with countering, minimizing, or managing its consequences, whose institutional and bureaucratic reactions are too slow or inadequate." (Bernard Claverie and François du Cluzel, 'The Cognitive Warfare Concept' (December 2023)). The programming of discursive contests around social relations has moved from the human to the virtual--and from physical reality to its simulacra. And, given the way in which the regulation of generative artificial intelligence and big data tech is moving, it is likely to produce fundamental conflict between principles of control and the realities of the weaponization of technology for the control of the minds of physical beings who can then be deployed within the target political community (eg here, and here; more theoretical discussion here).

It is in this context that the essays in this issue are most useful. These include essays by (1) Sascha Dov Bachmann (TikTok And The Relevance Of The Cognitive Warfare Domain, pp. 7-10); (2) Peter B.M.J. Pijpers (On Cognitive Warfare: The Anatomy of Disinformation, pp. 11-17); (3) Matthias Wasinger (The Highest Form of Freedom and the West’s Best Weapon to Counter Cognitive Warfare, pp. 18-25); (4) Maria Papadaki (The Role Of Cyber Security In Cognitive Warfare, pp. 26-31); (5) Josef Schröfl and Sönke Marahrens (The Russia-Ukraine Conflict From a Hybrid Warfare Cognitive Perspective, pp. 32-40); (6) Chris Bronk (New Problems in Hybrid Warfare: Cyber Meets Cognition, pp. 41-47); (7) Gazmend Huskaj (Future Elections and AI-Driven Disinformation, pp. 48-59); (8) Matthew Warren (Hybrid Threats – The Chinese Focus On Australia, pp. 60-64); and (9) Bernard Siman (AI and Microtargeting Disinformation As A Security Threat To The Protection Of International Forces, pp. 65-69). 

Each is worth a careful read; the abstracts/summaries (in the form of abstract, problem statement, and 'so what?') of each follow below. I want to briefly highlight one of the contributions for its connection with recent events in the United States.


 

Sascha Dov Bachmann (TikTok And The Relevance Of The Cognitive Warfare Domain, pp. 7-10; see also here) argues that the Hamas-Israel war is being fought not only on the battlefield but also in the domain of cognitive warfare.  The cognitive objective is narrative control--to manage the way that foreigners see, understand, and approach the war and its combatants, and in this way to sway the governments that might ally themselves with one (the Israeli) side. Narrative premises are embedded into the interpretive processes that surround events, making it possible to have a target group believe the unbelievable and to reject as specious the facts offered to refute false assertions, precisely because those premises have changed the group's cognitive processes. The success of this cognitive war was illustrated by the affair of the bombing of the Al-Ahli Hospital in Gaza. 

Bachmann notes that in the Gaza context, the global network managing cognitive operations uses a combination of disinformation, robustly leveraged through supportive traditional news organs and other reliable mouthpieces with some influence among target collectives, and the management of the cognitive processes through which target collectives are trained to receive and process information. To that end, cognitive rewiring is more successful when existing cognitive and interpretive processes are redirected or developed rather than replaced with something else. In the Gaza case, according to Bachmann, that is an easier proposition. 

Recycling old historical positions and facts regarding colonialism and oppression are part of the new cognitive warfare approach. . . . Targeting Western audiences with anti-Ukraine and anti-Israel content on TikTok is highly sophisticated and shockingly successful. Young Australian and U.S. audiences have become convinced that Israel is a foreign coloniser of indigenous land and is waging a genocidal war against the Palestinians as the land’s indigenous people. . . . TikTok’s targeting of Generation ‘Z’ in the context of the Palestinian–Israel conflict highlights the role this generation is being accredited for in going against the political and diplomatic position their governments would take.

The management of the story of the Al-Ahli Hospital suggests the utility of cognitive techniques. A New York Times analysis, in its own tentative way, framed the issue and the consequences:

The hospital explosion is important in its own right: It was the biggest news story in the world for days and sparked protests across the Middle East. The explosion also has a larger significance: It offers clues about how to judge the claims about civilian casualties that are central to Hamas’s war message. * * * This evidence, in turn, suggests that the Gaza Ministry of Health, controlled by Hamas, has deliberately told the world a false story. U.S. officials believe that the health ministry also inflated the toll when it announced 500 deaths; the actual number appears to be closer to 100. This episode doesn’t mean that Gazan officials always mislead or that Israeli officials always tell the truth. * * * But the hospital explosion offers reason to apply particular skepticism to Hamas’s claims about civilian deaths — which are an undeniable problem in this war. Hamas’s record on the war’s most closely watched incident does not look good. (Revisiting the Gaza Hospital Explosion)

The role of TikTok and its imitators is especially interesting. It suggests a number of strategic considerations. The first is that the transmission of knowledge is increasingly detached from its older forms. That, in turn, suggests that the old premises on which authoritativeness and relevance for knowledge production and transmission rested--premises grounded in text and its sources in conventional institutional voices and their bureaucracies (including academia)--are giving way to new markers of authority. The third is that notions of transparency are increasingly failing, as the voices projected virtually may not be who they represent themselves to be, nor, for that matter, are the virtual representations human in the sense of recordings of natural persons. It follows that the transmitters of knowledge and their legitimacy are masked within expectations of legitimacy--grounded in age and the emerging identitarian categories. But in the virtual anyone can be anything. The fifth is that while the performance of knowledge production and transmission through virtual spaces may be detached from its sources, the forms and objectives of that production and dissemination are easier to obscure. It follows that obscurity can hide both the contestation and the guidance of baseline premises and modes of thinking out of which it is possible to direct the "correct" or "intended" interpretation of data. The seventh is that the forms of knowledge transmission, and the tending to the structures of cognition, have shifted in form as well as content. The visualization of knowledge, and the processes of making sense of imagery, connected to the shifting of cognition as something felt rather than thought, substantially shifts as well the way in which a subject population can be made to think in a particular way or to approach the understanding of stimuli (now visual and aural, supported by text but not driven by it) in a way that predictive modeling can help produce. For example, images of dead babies and blasted civilian apartment complexes may be connected to a bombing, or they may be connected to a decision to embed combatants in a space reserved for the medical care of civilians. The pathways to cognition as a function of that imagery will depend on the hard work of managing predicate presumptions about the justification of belief--presumptions that guide targeted groups, in this example, toward "knowing" the moral unworthiness of striking an opponent while "knowing" that the decision to conduct military operations from under a civilian apartment building is a matter of moral indifference, or in this case a positive moral stance.

Two last points are worth mentioning as perhaps consequences of these movements.  The first is that the management of cognition permits the management of emotion (currently the politics of rage, though that also has a long pedigree in the pre-virtual world) in a way that reduces the necessity of the rational by substituting pre-packaged analytical pathways. Cognition cultivates feeling rather than understanding. Triggering, fear, elation, anger, rage, and the like are the cognitive pathways not just to knowledge, but to its interpretation. An anti-rationalism, long in the making from the time of the first efforts at psychoanalysis, permits an alignment of cognition and pre-digested meaning. In one sense this takes one back in time to the pre-modern; but in another it is a quite post-modern project, fusing psychologies of the self with collective meaning making that is driven by (eventually generative) big data tech.  That, in turn, permits the insertion of interpretive conclusions that make sense even against physical data. It is not merely that facts don't matter; it is the control of the cognition of facticity in virtual realms that makes it possible to detach ourselves from the limits of the observable in the physical in space, place, and time. In a sense, these are well evidenced by the Al-Ahli affair. 

The second is that, especially when combined with strategic political agitation, the stress on the 18th century construction of notions of speech, speech acts, and engagement within a political collective may require, or may make inevitable, some substantial redevelopment of the core premises of those structures.  The semiotics of speech and speech acts already point in that direction in the era of the virtual. Speech is an object, it is an embodiment of signification, and its meaning and power are a function of collective rules for its interpretation and consequences. The 18th century construct presumed an (idealized) identity between the three, as well as its circularity, self-referencing character, and the identification and differentiation of the internal and external. What is emerging as cognitive warfare principles and practice merely suggests what semiotics has understood for some time: that this ideal within political systems can be managed to guide, in turn, the stability and solidarity of groups presumed to be engaged in self-referencing dialectical experiences. TikTok, in this sense, is a semiotic representation of the 20th century age of the therapeutic and of self-actualization within identitarian categories, now digitized and (re)manufactured within manufactured virtual spaces from which new ways of embracing group feeling can be projected; what appears free, again, is managed. Again, one partial way to begin to think about how/why this is different from the past is both (1) the utility of the virtual in detaching physical from digitalized spaces; and (2) the diffusion of presumptions of authority from its old institutions (from which pre-modern cognition wars were fought in the form of religious and cultural wars) downward (to individual speakers), inward (to virtual sources), and outward (toward interlinked and masked generators of strategic hard rewiring of cognitive processes to specific ends) (longer and more theoretical discussion of implications and sources here). Where that takes social structures, at this point, is anybody's guess.

All of these are quite preliminary thoughts. Prof. Bachmann provides a quite useful way to organize that thinking. What makes for cognitive success is implied in the analysis, though perhaps unconsciously--the willingness of target collectives to presume the proclivities of one side and the other; the premise that one side must be both capable of and morally indifferent enough to undertake that action; and the willingness to believe in the power of that amorality to produce substantial death. Even after investigation, there is a reluctance to credit the facts--because they get in the way of the underlying premise, and with it the cognitive certainty that there is a villain in the story and it is always the Israelis. The reluctance is always expressed in the form of something like: well, they were right this time, but we know they are otherwise morally suspect in what they do and say; they cannot be believed, at least without substantial proof (eg here). When generalized we come back to Prof. Bachmann's thesis--that segments of the population can be trained to approach reality in a specific way, one that can be managed to advance the interests of those doing the managing. In this case it is Hamas that has co-opted control of the cognitive development of important segments of targeted populations to serve its own interests, in ways that may have consequences for cognitive solidarity structures within the targeted state. And in the domains of cognitive warfare the combatants include the governing institutions of the United States, the stability and control of which is the subject of the intervention (here, here, and here).

 







Hamas–Israel: TikTok And The Relevance Of The Cognitive Warfare Domain

Author: Sascha Dov Bachmann is Professor in Law and Co-Convener of the National Security Hub at the University of Canberra, and a Research Fellow with the Security Institute for Governance and Leadership in Africa, Faculty of Military Science, Stellenbosch University. He is also a Fellow with NATO SHAPE – ACO Office of Legal Affairs, where he works on Hybrid Threats and Lawfare. The views contained in this article are the author’s alone.

Abstract: As much as the current Hamas-Israel war occurs on the battlefield, it is being fought in the domain of cognitive warfare. The current conflict highlights the use of cognitive warfare – to influence public support for either side. In cognitive warfare, the human mind becomes the battlefield. The aim is to change what people think and how they think and act. Cognitive warfare as information warfare is what we see again in the current Hamas–Israel conflict: the bombing of the Al-Ahli Arab Hospital in Gaza and the question of attribution and its exploitation have shown the power of both influence operations and disinformation as key elements of cognitive warfare.

Problem statement: How to understand antagonist power's efforts targeting young audiences in the cognitive domain?

So what?: The West is on a trajectory to lose its youth to such malicious foreign influence in the cognitive domain. This undermining of Western resilience will only benefit the new global order of authoritarian regimes and despotism, with the PRC and Russia being the two main geopolitical players. A comprehensive whole of government plus society approach involving all stakeholders (both public and private) is needed to raise awareness and work towards both deterrence and resilience.


* * * 

On Cognitive Warfare: The Anatomy of Disinformation

Author: Dr Peter B.M.J. Pijpers is Associate Professor of Cyber Operations at the Netherlands Defence Academy in Breda and a Researcher at the Amsterdam Centre for International Law (University of Amsterdam). A Colonel in the Netherlands Army, he has been deployed four times to mission areas, including Iraq and Afghanistan, and was seconded to the European External Action Service for three years. Dr Pijpers has (co-)authored articles on the legal and cognitive dimension of influence operations in cyberspace and how armed forces can manoeuvre in the information environment. See also Orcid ID 0000-0001-9863-5618. The author can be reached at b.m.j.pijpers@uva.nl. The views expressed in this article are solely those of the author.

Abstract: Cognitive warfare entails narrowing down the execution of warfare to the cognitive dimension. While presented as a new notion, cognitive warfare as a concept articulates the essence of warfare, namely changing an opponent’s attitude and will – and hence their cognition. Although the concept is not new, the resurgence in attention and relevance is due to the inception of cyberspace (and social media), as well as knowledge of cognitive psychology. This renewed focus is particularly evident in the use of disinformation in influence operations.

Problem statement: How is disinformation used to influence the cognition of other geopolitical actors?

So what?: Societies need to be aware of the dangers of cognitive warfare, and become acquainted with its techniques. However, cognitive warfare alone will not win wars; its effectiveness is maximised in combination with and synchronised with other instruments of state power.
* * *


The Highest Form of Freedom and the West’s Best Weapon to Counter Cognitive Warfare

Author: Matthias Wasinger is a Colonel (GS) in the Austrian Armed Forces. He holds a Magister in Military Leadership (Theresan Military Academy), a Master’s degree in Operational Studies (US Army Command and General Staff College), and a PhD in Interdisciplinary Studies (University of Vienna). He has served both internationally and nationally at all levels of command. He is also the founder and editor-in-chief of The Defence Horizon Journal. Since 2020, he has served at the International Staff/NATO Headquarters in Brussels. The views expressed in this paper are the author’s alone.

Abstract: “Our remedies oft in ourselves do lie, which we ascribe to heaven.” Shakespeare’s timeless words echo throughout history and find validity in contemporary struggles. Modern warfare is waged in the human domain more than ever, with the human mind becoming the battlefield in cognitive warfare. The aim is to change not only what people think – but how they think and act. Waged successfully, cognitive warfare shapes and influences individual and group beliefs and behaviours to favour one’s objectives. Whereas information warfare seeks to control pure information in all its forms, cognitive warfare seeks to control how individuals and populations react to the information presented. Therefore, achieving and preserving cognitive superiority is key. However, this prized end does not justify using all given means. 

Problem statement: How to understand the correlation between cognitive warfare and limitations imposed by Western values?

So what?: State actors can achieve cognitive superiority, either inductively through regulations and laws or through educational empowerment. Whereas restrictions might serve as a temporary solution to mitigate immediate risks, only education provides democracies with a sustainable solution. Any total defence approach has to be built on understanding, common values and the educated willingness to fight for the respective identity.


* * *

The Role Of Cyber Security In Cognitive Warfare

Author: Dr Maria Papadaki is an Associate Professor in Cyber Security at the Data Science Research Centre, University of Derby, UK. She has been an active researcher in the cyber security field for more than 15 years, focusing on incident response, threat intelligence, maritime cybersecurity, and human-centred security. Her research outputs include 70+ international peer-reviewed publications in this area. Dr Papadaki holds a PhD in Network Attack Classification and Automated Response, an MSc in Networks Engineering, a BSc in Software Engineering, and professional certifications in intrusion analysis and penetration testing. The views contained in this article are the author’s alone and do not represent those of the University of Derby.

Abstract: Cognitive warfare has taken advantage of the 21st century’s technological advances to evolve and alter the way humans think, react, and make decisions. Several stages utilise cyber security technological infrastructure, especially in the initial stages of content creation, amplification and dissemination. In fact, evidence points to the use of cognitive threats as a means of inciting wider cyberattacks and vice versa. The war in Ukraine has accounted for 60% of observed cognitive incidents, with Russia being the main actor in this context. The DISARM framework outlines the two nations’ prominent Tactics, Techniques and Procedures (TTPs) as the development of image-based and video-based content, the impersonation of legitimate entities, degrading adversaries, and the use of formal diplomatic channels. Combining the DISARM and ATT&CK frameworks could enhance the analysis and exchange of threat intelligence information.

Problem statement: How to analyse the relationship between cyber security and cognitive warfare and what lessons can we learn from the war in Ukraine?

So what?: Although the DISARM framework is still in the early stages of its development, it provides an invaluable step towards opening up the dialogue on and understanding of FIMI behaviours across the community. Adopting cyber security concepts for the rapid build-up of capacity and resilience in the cognitive domain would be beneficial. In parallel, educating a multi-disciplinary workforce (and society as a whole) against combined scenarios of cognitive warfare and cyberattacks would help to improve their resilience.

* * *

The Russia-Ukraine Conflict From a Hybrid Warfare Cognitive Perspective

Author: Josef Schröfl started his career in the Austrian Armed Forces in 1982 and has since worked in various areas of the military, including several military operations/UN tours, e.g., to Syria. Since 2006 he has served in the Austrian MoD, heading: “Comprehensive Approach”, “Hybrid threats”, and “Cyber Security/Cyber Defence”. He holds a B.A. in Computer Technology, an M.A. in International Relations from the University of Delaware/US, and a PhD in International Politics from the University of Vienna. He has several publications/books on Asymmetric/Cyber/Hybrid threats, crisis, conflict and warfare, and is a Peer Board member/reviewer for several magazines, e.g., “The Defence Horizon Journal”. Current position: Deputy Director for CoI Strategy & Defence at the Hybrid CoE in Helsinki/Finland, leading the Cyber workstrand there. Sönke Marahrens is a Colonel i.G. (GER AF), Dipl. Inform (univ), MPA, and Director, COI Strategy & Defence. He is a career Air Force officer, previously serving as head of research for Strategy and Armed Forces at the German Institute for Defence and Strategic Studies in Hamburg. As well as a Full Diploma in Computer Science, he holds a master’s degree from the Royal Military College in Kingston, Canada, and another from the University of the Federal Armed Forces in Hamburg. He was deployed with NATO to Bosnia and Kosovo, and in 2020 served as Branch Head for Transition at HQ Resolute Support in Kabul, Afghanistan. The views expressed in this article are solely those of the authors and do not represent the views of Hybrid CoE, or the Austrian and/or the German Federal Armed Forces.

Abstract: Russia’s invasion of Ukraine has had major global consequences, ranging from a humanitarian crisis resulting in millions of refugees, to food crises in the Near East and Africa, followed by a worldwide energy crisis with economic shocks triggering geopolitical realignments, ultimately affecting all military domains, including cyberspace. Specifically, since the Russian invasion on 24 February 2022, Moscow has tried to bring Kyiv to its knees in the cyberspace domain. Accordingly, this paper analyses how hybrid and non-hybrid, cyber and information warfare have worked in Russia’s favour, and where these tools and techniques might have failed. It highlights how the electromagnetic spectrum cannot be fully separated from the cyber and information spaces.

Problem statement: How to analyse the relationship between cyber security and cognitive warfare and what lessons can we learn from the war in Ukraine?

So what?: Whoever has the edge in cyberspace has the ability to shape what people and societies perceive as the truth, as well as control the narrative about what is happening physically on the ground. Lessons from the war in Ukraine call for a coordinated and comprehensive strategy from Western states to strengthen defences against the full range of cyber-destructive acts, espionage, and influence operations.
* * *

New Problems in Hybrid Warfare: Cyber Meets Cognition

Author: Chris Bronk is an associate professor at the University of Houston and director of its cybersecurity graduate program. He has conducted research on the politics and diplomacy of cyberspace; critical infrastructure protection; propaganda and disinformation; counter-terrorism; and cybersecurity. He has served as both a Foreign Service Officer and Senior Advisor at the U.S. Department of State. The views expressed in this article are the author’s alone and do not represent those of the University of Houston nor the State of Texas.

Abstract: Hybrid warfare encompasses the area of adversarial relations between war and peace. In this space, questions have emerged about how cyber action, which involves the subversion of confidentiality, integrity, and availability of data, intersects with information operations (also known as propaganda or influence). While definitions of these phenomena remain imprecise and emergent, terms such as social and cognitive cyber security are gaining currency amongst scholars and practitioners.

Problem statement: How are cyber techniques used to disseminate information designed to influence publics, elites, and leaders?

So what?: The most open societies are likely the most vulnerable to data manipulation and information operations. The community of democratic states, namely those populating the OECD or the NATO alliance and its Pacific analogues, must erect defences against malign information influence delivered through cyberspace.

* * *

Future Elections and AI-Driven Disinformation

Author: Gazmend Huskaj is the Head of Cyber Security at the Geneva Centre for Security Policy (GCSP) and a doctoral candidate focusing on offensive cyberspace operations at the Department of Computer and Systems Sciences, Stockholm University. Previously, he was a full-time doctoral student at the Swedish Defence University, and before that, he served as the Director of Intelligence for Cyber-related issues in the Swedish Armed Forces. He is a military and UN veteran with over five years of field experience in conflict and post-conflict areas. At the GCSP, his focus areas include Executive Education, Diplomatic Dialogue, and Policy Research & Analysis. The views expressed in this article are the author’s alone and do not represent those of the Geneva Centre for Security Policy (GCSP) nor DSV.

Abstract: This paper conceptualises the impact of Artificial Intelligence (AI) on disinformation campaigns, contrasting AI-driven operations with traditional human-operated methods. Utilising a Human Intelligence Collector Operations (HUMINT) and Offensive Cyberspace Operations (OCO) framework, the research analyses the advancements in AI technology in terms of speed, efficiency, content generation, and adaptability. The findings reveal that AI-driven operations, particularly those with billions of tokens, significantly outperform human-operated disinformation campaigns in speed and efficiency, demonstrating an ability to process vast datasets and complex scenarios almost instantaneously.

Problem statement: How to understand the need to develop AI-driven strategies to protect democratic processes against disinformation campaigns?

So what?: Governments, tech companies, and academic researchers must collaborate on advanced AI countermeasures to combat AI-driven disinformation campaigns.

* * *

Hybrid Threats – The Chinese Focus On Australia

Author: Matthew Warren, Centre of Cyber Security Research and Innovation, RMIT University, Melbourne, Australia. The views contained in this article are the author’s alone.

Abstract: Cognitive operations affect people’s perception of reality and decision-making, guiding groups of people and targeted audiences towards conditions desired by a geopolitical adversary. What we are seeing is social media being used in a disinformation context by authoritarian governments against the West, in direct and indirect ways, to change societies. But we are also seeing those perception approaches being used by authoritarian governments internally and externally. It is difficult to change people’s perceptions once they have been altered. This paper explores the hybrid threat relationship between Australia and the People’s Republic of China (PRC).

Problem statement: How to understand that dealing with hybrid threats is not a standalone matter but should be part of a greater strategy? 

So what?: Understanding that hybrid threats are not standalone but could be part of a greater strategy is vital. This paper highlights the importance of understanding hybrid threats against Australia from the PRC: the nature of those threats, how they can be interconnected, and how they have evolved over time.



* * *

Emerging Hybrid Threats: AI And Microtargeting Disinformation As A Security Threat To The Protection Of International Forces

Author: Bernard Siman is a Senior Associate Fellow at Egmont Royal Institute for International Relations in Belgium, where he is responsible for hybrid threats and warfare. He teaches at the Royal Military Academy in Belgium, and the European Security and Defence College. He also heads Cyber Diplomacy at the Brussels Diplomatic Academy of the Vrije Universiteit Brussel (VUB). Geographically, he specialises in the Mediterranean and Black Sea regions, including the Middle East, and in global maritime geopolitics. He has authored various publications on hybrid threats and global geopolitics. The views expressed in this article are the author’s alone.

Abstract: Disinformation has mainly been viewed as a communication challenge. For entities like the UN, the EU and NATO, it has evolved into a security threat and a Force Protection (FP) challenge, as well as a threat to the well-being of deployed individuals and their families overseas. Feasibly, this threat will only grow with the combination of AI-enabled “deepfakes” and microtargeting.

Problem statement: What role does strategic communication play in ensuring that peacekeeping and EU missions continue to have enhanced protection of their military forces overseas?

So what?: Strategic, emotive communication must urgently become an integral part of the planning and execution of mission security, which should expand in scope to include civil society organisations in the areas where personnel are deployed.


