The challenge: How to advise on complex phenomena (creating cybersecurity norms) when the narrative surrounding them is oversimplified.
- In particular, it is difficult to explain to policymakers the intersubjectivity of norms and their sociological basis in a way that is immediately policy relevant.
- The “problem” in question often involves many dimensions not related to norms at all.
- The degree of success of policy depends not only on knowledge about norms but on the application of that knowledge to particular situations.
The background and policy setting:
Martha Finnemore is a noted expert on norms in international relations. Around 2010 she was enlisted to help develop norms around cybersecurity. Recognizing the difficulty of generating “harder” law or treaties in this space, policymakers interested in governing cybersecurity turned to norms. They approached Finnemore to help them understand how to develop norms.
This is something Finnemore has experienced before. Policymakers assume that academics know the theories and policymakers know the issue areas. So, they imagine that if scholars can just explain the theories, policymakers can combine them with their own issue knowledge to generate the changes they want.
But many in the policy community talk about norms in a way that belies any understanding of what they are. In particular, the narrative surrounding norms often treats them as reified objects that one can create and use irrespective of others. But scholars of norms recognize norms as “collectively held expectations,” shared beliefs within a community. Though scholars can trace how norms may develop, their creation is a collective endeavor and thus depends not only on the activity of norm entrepreneurs but also on how their efforts are taken up by others. It is difficult to explain to policymakers the intersubjectivity of norms and their sociological basis in a way that is immediately useful.
Adding to this difficulty, intersubjectivity’s insight that developing norms “depends on others” creates logistical problems for policymakers. It makes their job harder. It’s easier to just “do it to it”—drop money on a problem, or issue a law or regulation and enforce it. When the success of their efforts depends on others, that’s a pain; it creates uncertainty both about schedules for moving forward and about ultimate success. This often isn’t a message policy folk want to hear.
The engagement:
Finnemore first traces her engagement with cybersecurity norms to a thinktank request. Much like she had with other areas, she translated the literature on norms for the cybersecurity arena. Her 2011 working paper charted patterns in the kinds of norms that are likely to succeed and the strategies that are likely to prevail in cyberspace based on this broader literature.[1]
Once you write something, Finnemore reflected, it begins to pop up in searches and suddenly you are an “expert.” Through this working paper and the “expertise” it signaled, she began to interact with a broader interdisciplinary community eager to cultivate cybersecurity norms. There she found policy entrepreneurs with strong governmental connections and, eventually, funding from a Minerva Grant through the Department of Defense. Working in this space she both published two important academic articles (with Duncan Hollis) and was pulled further into a series of interactions with various policymakers around cybersecurity.[2]
As mentioned above, the policy community Finnemore interacted with wanted governance around cybersecurity but saw norms as preferable to treaties or law for several reasons. The process of treaty making is slow and, in the United States, requires ratification by Congress. Legislation, too, depends on Congress, not noted for its technical expertise. In the technology space where change is rapid and ubiquitous, norms seemed a more nimble, flexible, and nuanced instrument of coordination.
Finnemore often found it difficult to explain intersubjectivity to policy staff. Ironically, she noted that it was sometimes easier to explain it to elected officials, who understand the need for persuasion. But staff, especially those with graduate training in political science or international relations during the 1980s and 1990s, often viewed the world through the realist and rationalist lenses that dominated the IR discourse at that time. These people are exceptionally intelligent, but they relied on terminology and frameworks rooted in rationalism, which sit uneasily with notions of intersubjectivity and thus make a poor platform for understanding how to create and maintain norms. Her challenge was how to convey knowledge of intersubjectivity in a way that was not patronizing and was useful to people who have social science backgrounds, but not ones steeped in a sociological understanding of norms.
Translating norms language into terms these policymakers more easily understood was one strategy Finnemore found useful. They had contacted her because they had decided their issue was one where norms were relevant. But they thought about cybersecurity as a technical space. Many of the important academic studies of norms concern moral issues like international human rights. Drawing lessons from human rights often did not make sense to people working on cybersecurity, who saw human rights as a very different space. She found it more useful to use language like “best practices” or “principles.”
In many situations the people she interacted with understood their problem and saw that norms could help coordinate people to solve it. They wanted to know how to create norms and persuade others to adopt them. But there is no tidy three-step process for norm creation and dissemination. To figure out how she could help, it was important to meet policymakers where they live and understand what they were trying to accomplish. Likening this approach to “therapy,” she often asked a lot of questions. Though she would frequently begin an engagement thinking that she had little to contribute, as policymakers answered her questions she developed a better sense of how her expertise could be useful for whatever problem her policy interlocutor was interested in. This, she thinks, is a key issue. Policymakers are often interested in general phenomena such as norms only insofar as they help solve a problem. But problems often involve many dimensions not related to norms at all. Figuring out more about the problem and the constraints surrounding it is critical to offering good advice.
Sometimes asking these questions led her to step away from the engagement. An example occurred with the Global Commission on the Stability of Cyberspace in the late 2010s.[3] This transnational group asked her for advice amidst concerns that the Group of Governmental Experts (GGE) process was stalling out in the United Nations (UN). Finnemore quickly concluded that she had little advice to offer this group. They were diplomatically experienced and already had strong views about the norms they wanted to promote and how to produce “yesable” propositions that people would sign onto. Finnemore opined that her disciplinary training may have made her less comfortable staking out particular normative positions in this kind of conversation than, for example, international lawyers.
Other times, the answers to her questions led Finnemore to see ways she could offer advice based on analogous instances. Over time, the US government’s experience trying to generate norms in many different areas has produced more cases, including both successes and failures, on which to draw. In general, the success or failure of policy depended not only on knowledge about norms but on the application of that knowledge to specific situations. While Finnemore could offer lessons on norms, she could offer less help with the specifics of a given situation.
Finnemore also noted that the nature of cybersecurity is complicated. People know there are things they do not like, such as ransomware. But ransomware is only one of many problems related to cybersecurity. Each problem has its own policy community and government connections. Even on issues that seem connected, such as freedom of information and privacy, policy communities often operate in their own silos and don’t speak to one another. And these problems are peripheral to the central issue of protecting the core of the internet itself. Though specific problems are clear, the definition of cybersecurity as an overall set of problems is still in process. This is further complicated by the fact that cybersecurity capabilities often lie outside the ownership of the US military or even the US government. The range of players and the scope of the problem, which touches every dimension of contemporary life, make cybersecurity hard to fit alongside other security problems.
What are the implications of this engagement?
Finnemore observed that the quality of the norms conversations has shifted in a positive direction. Over time, fewer people ask her for packaged solutions, and more policymakers have been appreciative of the answer “it depends.” She supposed that this could be a product of learning from experience and of policymakers’ orientation toward finding solutions that work. She said she would mostly give the practitioners she interacted with high marks. She also suggested that policymakers are more likely to learn from their mistakes than academics because they are more likely to pay professional costs when policies don’t work. Academics, on the other hand, can continue to get published even if they are wrong. Ultimately, the challenge in generating productive norms lies in how policymakers apply their knowledge of norms in a particular context. Thus far, Finnemore has not been approached by policymakers with normative goals she finds problematic or objectionable. She described herself as a resource from which either party could easily disengage. Not needing to produce policy “deliverables” allowed her to maintain healthy boundaries.
[1] Finnemore, Martha. 2011. “Cultivating International Cybernorms,” in America’s Cyberfuture: Security and Prosperity in the Information Age by Kristin M. Lord, Mike McConnell, Peter Schwartz, Richard Fontaine, Travis Sharp and Will Rogers. Center for New American Security. June.
[2] Finnemore, Martha, and Duncan B. Hollis. 2016. “Constructing Norms for Global Cybersecurity.” The American Journal of International Law 110, No. 3: 425-479; Finnemore, Martha, and Duncan B. Hollis. 2020. “Beyond Naming and Shaming: Accusations and International Law in Cybersecurity.” European Journal of International Law 31, No. 3: 969-1003.
[3] Global Commission on the Stability of Cyberspace, 7 February 2021. https://cyberstability.org/