I am a second year PhD candidate at the School of Law and the School of Computer Science at the University of Birmingham (UK), under Professor Karen Yeung and Dr Hyung Jin Chang. I hold an LLM from the University of Cambridge (Queens’ College), and a BSc in Politics, Psychology, Law, and Economics from the University of Amsterdam. I was previously a junior researcher and tutor at the University of Amsterdam. My research concerns the responsible governance of computer vision technologies. This involves a technologically grounded mapping of computer vision applications; a philosophically and politically grounded taxonomy of the societal and individual harms created by various computer vision technologies; and a critical evaluation of potential regulatory responses to those harms. My interests include AI governance, algorithmic accountability, and theories of law and justice.
I am a final year PhD student under Professor Ross Anderson in the Department of Computer Science and Technology at the University of Cambridge, where I spend my time analysing the security of computer systems to make them more robust. Formerly, I was a researcher at ETH Zurich under Professor Srdjan Capkun. I am interested in all aspects – technical and social – of privacy, provenance, and security engineering. I think designing systems that preserve these requires more than good code; it requires an understanding of economics, law, and human behaviour. I therefore try to take a broad view of the various forces tugging at computer systems in my research. My current research interests are in the domains of distributed systems (especially blockchain networks), Trusted Execution Environments, and covert communication channels.
Democracy on the Margins of the Market: A Critical Look Into the Privatisation of Cyber Norm Formation
When it comes to responsible behaviour in cyberspace, there are two questions we ought to ask ourselves: Who are the actors who operate in cyberspace? And who can legitimately set standards for “responsible behaviour” of those actors?
Regarding the first question, we argue that the traditional focus on state behaviour in international law and international relations does not sufficiently capture the complexities around authority and legitimacy in the cyber norm-making process. As private actors exercise immense power in cyberspace and in cyber norm formation, we must take the activities of corporations seriously. The international legal framework, which emphasises military uses of cyberspace, does not reckon with the ordinary operations of corporations and therefore creates a gap in legal protection.
Regarding the second question, we show that the political legitimacy of corporations in the exercise of defining normative standards for cyberspace cannot be assumed. Wherever it is assumed, it is likely that this is done on the basis of neoliberal ideology. We show throughout this paper that this neoliberal move operates through the economisation of authority and the construction of the moral corporation, illustrating these dynamics using the example of Microsoft’s “norm entrepreneurship.”
The neoliberal push and the ensuing decay of the state are problematic in cyberspace. This is specifically because the economic logic which motivates much of the activity in cyberspace – surveillance capitalism – presents unique threats to the foundations of our democracies. We therefore urge cyber norm scholarship to be aware of these dynamics and approach any corporate involvement in standard-setting with a critical mind. Moreover, we urge states to reclaim their position as legitimate legislators for the public good, and to unapologetically reject single-minded market logic wherever it threatens democracy, human rights, or the rule of law.