Related Bibliography [access v.1.2 8/04]
Related Research: Trusted Systems Project
The current public debate pits security and privacy against each other as rivals to be traded in a zero-sum game. That framing rests, on the one hand, on a general misunderstanding of (and apprehension about) technology and security and, on the other, on a mythology of privacy that conflates secrecy of data and absolute anonymity with the autonomy of the individual. Security and privacy are dual obligations of the liberal democratic state, and developing policy to provide both requires a better understanding of how current security requirements and technological developments intersect with (and challenge) certain privacy interests, including anonymity. To meet this dual obligation, policy makers need to better understand security strategies and their technical potentials and constraints, and to re-examine unchallenged conceptions of privacy and anonymity and their relationship to civil liberty.
Real-world procedural mechanisms that maintain or protect anonymity, in particular those premised on inefficiencies in information acquisition, management, and use (for example, the doctrine of "practical obscurity" and anonymity through data transience), are challenged by automated information processing -- particularly emergent data aggregation and data analysis technologies, as well as new identification, authentication, and collection technologies. In an information society in which the cost of data retention is less than the cost of selective deletion (and the cost of indiscriminate data collection is less than the cost of selective acquisition), the question is no longer simply whether data, including data relating to social interactions or transactions, will be collected. The question is under what circumstances such data can be accessed and used for certain purposes, including national security and law enforcement, while still maintaining core privacy values.

Preserving socially desirable anonymity is a particular subset of this problem and can be restated as a data attribution problem -- that is, under what circumstances (and by what process) will observed or analyzed data (including data relating to social interactions) be associated with a specific individual or identity, or a specific individual or identity be associated with available related data, and, in either case, to what consequence?
Resolution of this problem is particularly difficult in a threat environment that demands either a preemptive intelligence approach (based on probabilities and on disrupting acts before their commission) rather than a reactive law enforcement approach (based on evidence and conviction after the fact), as in counter-terrorism, or an instantaneous, real-time defense and identification of the attacker, as in cybersecurity.
Security strategies (whether for national, domestic or systems security) generally involve determining whether access control or accountability is the appropriate and most effective approach. Access control strategies generally employ a default state of "deny except with permission" (low cost of implementation, high cost to functionality or freedom) while accountability strategies employ a default state of "permit with accountability" (lower cost to functionality, potentially higher cost from security consequences). Absolute anonymity in data space -- a condition that does not exist in the real world -- would introduce a third default state, "permit without accountability" (high cost of implementation, potentially devastating cost to security).
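The three default states described above can be illustrated, purely as a sketch, by modeling each as a policy function (all names here are hypothetical and not drawn from any particular system):

```python
from dataclasses import dataclass, field

@dataclass
class AuditLog:
    """An attributable record of every access under an accountability regime."""
    entries: list = field(default_factory=list)

    def record(self, actor: str, resource: str) -> None:
        self.entries.append((actor, resource))

def access_control(actor: str, resource: str, permissions: set) -> bool:
    # Default state: "deny except with permission" -- nothing is
    # allowed unless an explicit grant exists in advance.
    return (actor, resource) in permissions

def accountability(actor: str, resource: str, log: AuditLog) -> bool:
    # Default state: "permit with accountability" -- access is allowed,
    # but every access leaves an attributable record for later process.
    log.record(actor, resource)
    return True

def absolute_anonymity(actor: str, resource: str) -> bool:
    # Default state: "permit without accountability" -- access is
    # allowed and no attributable record exists anywhere.
    return True
```

The sketch makes the trade-off concrete: the access-control function blocks by default, while the accountability function preserves functionality and defers enforcement to the audit trail.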
It is my premise that for policy purposes, access control strategies equate with totalitarianism, accountability strategies with democratic freedom, and absolute anonymity with anarchy. Thus, the ideological divide in the policy debate is whether one considers "anonymity" a line between freedom and totalitarianism (requiring absolute secrecy of data for its own sake) or between freedom and anarchy (based on accountability under constitutionally recognized due processes in which autonomy is protected through selective revelation of identity subject to defined constraints). If the latter, then functional social anonymity (and thus autonomy) can be preserved by using security and technology strategies (including anonymization of data and pseudonymization of identity) that employ selective revelation (selective data attribution) mechanisms subject to appropriate and familiar due process procedures. Such strategies could meet security needs while protecting socially-beneficial requirements for anonymity.
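One way to picture a selective-revelation mechanism of the kind described above -- not the author's specific design, only a minimal sketch under assumed names -- is pseudonymization with a re-identification mapping held by a separate authority that releases it only upon a due-process showing:

```python
import hmac
import hashlib

# Hypothetical key held by a separate escrow authority, not by analysts.
SECRET_KEY = b"held-by-a-separate-authority"

def pseudonymize(identity: str) -> str:
    # Replace the identity with a stable keyed pseudonym so records can
    # be linked and analyzed without revealing whom they concern.
    return hmac.new(SECRET_KEY, identity.encode(), hashlib.sha256).hexdigest()

class EscrowAuthority:
    """Holds the pseudonym-to-identity mapping; reveals an identity only
    when a due-process condition (modeled here as a boolean) is met."""

    def __init__(self) -> None:
        self._mapping: dict = {}

    def register(self, identity: str) -> str:
        p = pseudonymize(identity)
        self._mapping[p] = identity
        return p

    def reveal(self, pseudonym: str, warrant_approved: bool) -> str:
        if not warrant_approved:
            raise PermissionError("selective revelation requires due process")
        return self._mapping[pseudonym]
```

Analysis proceeds over pseudonyms (the "permit with accountability" default), while attribution of data to a named individual remains gated behind the escrow authority's process check.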
"Government Should Not Rush to Massive ID Surveillance System," CAS Executive Director said in a statement released at a conference in New York as part of the Global Information Society Project's Program on Law Enforcement and National Security in the Information Age, October 29, 2004. [more]
K. A. Taipale, "Technology, Security and Privacy: The Fear of Frankenstein, the Myth of Privacy and the Lessons of King Ludd," 7 Yale J. L. & Tech. 123; 9 Intl. J. Comm. L. & Pol'y 8 (Dec. 2004)
K. A. Taipale, "Data Mining and Domestic Security: Connecting the Dots to Make Sense of Data," 5 Colum. Sci. & Tech. L. Rev. 2 (Dec. 2003) [executive summary PDF]
K. A. Taipale, "Designing Technical Systems to Support Policy: Enterprise Architecture, Policy Appliances, and Civil Liberties," Chapter 9.4 in "Emergent Information Technologies and Enabling Policies for Counter Terrorism" (Robert Popp and John Yen, eds., IEEE Press, forthcoming 2005). [introduction available online] See also the Policy Appliance Reference Model.
K. A. Taipale, "Identity Resolution and Domestic Security: Who's Who in Whoville," New York: Center for Advanced Studies (Dec. 2003)
[related presentation material (12/03)]
K. A. Taipale, "Free Speech, Semiosis and Cyberspace," New York: Center for Advanced Studies, [download comment draft PDF] (v. 01/03).
American Association for the Advancement of Science [AAAS]
Woodrow Wilson International Center for Scholars [WWICS]