Danny Weitzner's Statement
Adding Information Accountability to Transparency: How to get over Scott McNealy with help from David Brin
by
Daniel Weitzner
Co-Director, MIT Decentralized Information Group
http://www.w3.org/People/Weitzner.html
[A longer version of this essay will appear in the Communications of the ACM, June 2008.]
David Brin was right: we must learn to live in a transparent society. What he taught us ten years ago is only more true today. His key insight was to challenge us to figure out how to deploy individual and community effort toward holding those who collect, and might misuse, personal information accountable for harmful actions. Two things have changed since Brin wrote in 1998. First, our capacity for data integration and analysis has exploded. Not only can we collect more and more information; we also have extraordinarily powerful tools to extract still more information from what we have collected. Second, we have done essentially nothing useful, technically or legally, to promote the kind of collective scrutiny Brin called for regarding the use and misuse of personal information. In this sense, we are slouching toward the world that Scott McNealy warned us of when he said: 'privacy, get over it.' I propose four simple steps, many inspired by Brin, to help us get over Scott McNealy's dystopia:
Step 1. Drop the fig leaf: admit just how broken our legal and technical privacy tools actually are.
Existing legal and technical mechanisms intended to protect our privacy, copyright, and other important values have been overwhelmed by the increasingly open information environment in which we live. These threats follow from the ease of information storage, transportation, aggregation, and analysis. We must therefore rethink our approach to protecting our rights to be sure that the technical laws spelled out by Gordon Moore and Robert Metcalfe don't permanently overwhelm the values enshrined in society's laws. For too long, our approach to information protection policy has been to seek ways to prevent information from "escaping" beyond appropriate boundaries, then wring our hands when it inevitably does. This hide-it-or-lose-it perspective dominates technical and public-policy approaches to fundamental social questions of online privacy, copyright, and surveillance. Yet it is increasingly inadequate for a connected world where information is easily copied and aggregated and where automated correlations and inferences across multiple databases routinely expose information even when it is not explicitly revealed.
Step 2. Learn the lessons of accountability from other areas of law and society.
As an alternative to secrecy, information accountability must become a primary means by which society addresses privacy and other questions of appropriate use. Information accountability means that information usage is transparent, so that it is possible to determine whether a given use is appropriate under a given set of rules, and that the system enables individuals and institutions to be held accountable for misuse. Transparency and accountability make bad acts visible to all concerned.
Visibility alone does not guarantee compliance. Then again, the vast majority of legal and social rules that form the fabric of our societies are not enforced perfectly or automatically, yet somehow most of us follow most of them most of the time. We do so because social systems built up over thousands of years encourage us, often making compliance easier than violation. For those rare cases where rules are broken, we are all aware that we may be held accountable through a process that looks back through the records of our actions and assesses them against the rules.
Protecting privacy is more challenging than ever due to the proliferation of personal information on the Web. Access controls and collection limits on a single instance of personal data are insufficient to protect privacy when the same information is publicly available elsewhere on the Web, or when private details can be inferred with high accuracy from other information that is itself public. Worse, many privacy protections (such as the lengthy privacy-policy statements posted online and those required in health care and financial services) are mere fig leaves over the increasing exposure of our social and commercial interactions. In the case of publicly available personal information, people often make the data available intentionally, not by accident. They may not intend for it to be used for every conceivable purpose but are willing for it to be public nonetheless. Even technological tools that help individuals make informed choices about which data-collection practices they are prepared to permit are no longer sufficient to protect privacy in the age of the Web.
We must replace the outmoded legal and technical mechanisms that seek, in vain, to limit the flow of personal information, and rely instead on information accountability as a privacy-protection mechanism. An information-accountability framework more closely mirrors the relationship between law and human behavior than do the various efforts to enforce policy compliance through access control over information. As an early illustration of information accountability at work today, consider credit bureaus and their large collections of personal information. When these databases came on the scene in the consumer financial markets of the 1960s, policy makers recognized the public imperative to protect individual privacy and assure data accuracy, while maintaining enough flexibility to allow analysis of consumer credit data based on the maximum amount of useful information possible. Under the Fair Credit Reporting Act (enacted 1970), privacy is protected not by limiting the collection of data but by placing strict rules on how it may be used. Analysis for the purpose of developing a credit score is essentially unconstrained, but the resulting information can be used only for credit, insurance, or employment purposes. The FCRA imposes strict penalties for breach of these use limitations. Data quality is protected by giving all consumers the right to see the data held about them (transparency). If a user of the data makes a decision adverse to the consumer (such as denial of a loan or rejection of an employment application), the decision must be justified with reference to the specific data in the credit report on which it was based (accountability). If the consumer discovers that the data is inaccurate, he or she may demand that it be corrected. As the FCRA illustrates, we achieve greater information accountability only by making better use of the information that is collected and by retaining the data necessary to hold data users responsible for policy compliance.
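To make the use-based approach concrete, here is a minimal sketch in Python of an FCRA-style rule: the data flows to its users, but every use is checked against a small set of permitted purposes and every attempt is logged for later review. The names (CreditReport, PERMITTED_PURPOSES, the audit log) are hypothetical illustrations, not an implementation of the Act.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Under the FCRA model sketched above, the data itself flows to its users;
# what the law restricts is the *purpose* for which it may be used.
PERMITTED_PURPOSES = {"credit", "insurance", "employment"}

@dataclass
class CreditReport:
    consumer_id: str
    data: dict
    audit_log: list = field(default_factory=list)  # retained to support accountability

    def use(self, requester: str, purpose: str) -> dict:
        """Release the report only for a permitted purpose, recording every attempt."""
        allowed = purpose in PERMITTED_PURPOSES
        # Every access attempt, allowed or not, leaves a record so that misuse
        # can be identified after the fact (transparency plus accountability).
        self.audit_log.append({
            "time": datetime.utcnow().isoformat(),
            "requester": requester,
            "purpose": purpose,
            "allowed": allowed,
        })
        if not allowed:
            raise PermissionError(
                f"'{purpose}' is not a permitted use of a credit report")
        return self.data

report = CreditReport("consumer-123", {"score": 710})
report.use("Acme Bank", "credit")                # permitted: a credit decision
try:
    report.use("Acme Marketing", "advertising")  # not a permitted purpose
except PermissionError as err:
    print(err)
print(report.audit_log)  # both uses remain visible for later review
```

Note that the design restricts nothing about collection or analysis; as in the FCRA itself, accountability comes from the use check and the retained record of who used what, for which purpose.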
Step 3. Build accountable systems.
What technical architecture might be required to support information accountability? Our goal in promoting accountable systems is to build into our information infrastructures the technology necessary to make acts of information usage more transparent, so that the individuals and institutions who misuse data can be held accountable for their acts. Systems supporting information accountability require three basic architectural features (a sketch follows the list):
- Policy-aware transaction logs. In a decentralized system, each endpoint must take responsibility for recording information-use events that may be relevant to current or future assessment of accountability under some set of policies.
- Policy-language framework. Assessing policy compliance over a set of transactions logged at a heterogeneous set of endpoints by diverse human actors requires a common framework for describing policy rules. Drawing on semantic Web techniques, larger and larger overlapping communities on the Web can develop shared policy vocabularies in a bottom-up fashion. Perfect global interoperability of these policies is unlikely but not a fatal flaw. Just as human societies learn to cope with overlapping and sometimes contradictory rules, so too will policy-aware systems be able to develop at least partial interoperability.
- Policy-reasoning tools. Accountable systems must be able to help users answer questions such as: Is this piece of data allowed to be used for a given purpose? Is a given chain of inferences permissible in a given context, given the provenance of the data and the applicable rules? One possible approach to designing accountable systems is to place a series of accountable appliances throughout the system that communicate using Web-based protocols.
(See L. Kagal, C. Hanson, and D. Weitzner, "Integrated Policy Explanations via Dependency Tracking," to appear in IEEE Policy 2008, http://dig.csail.mit.edu/2008/Papers/IEEE%20Policy/air-overview.pdf.)
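The following is a minimal Python sketch of how the three features above might fit together, under strong simplifying assumptions: policies are plain predicates rather than rules in a semantic Web language, and the "reasoning tool" is a single after-the-fact scan of the log. All names here (UseEvent, Policy, check_compliance, and the example hospital-data rule) are hypothetical illustrations, not part of any deployed system.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass(frozen=True)
class UseEvent:
    """One recorded information-use event at an endpoint (policy-aware transaction log)."""
    actor: str
    data_item: str
    purpose: str
    provenance: str  # where the data came from; needed to select the applicable rules

@dataclass(frozen=True)
class Policy:
    """A rule expressed in a shared vocabulary (policy-language framework):
    a name, a test for whether the rule applies, and a test for compliance."""
    name: str
    applies_to: Callable[[UseEvent], bool]
    permits: Callable[[UseEvent], bool]

def check_compliance(log: List[UseEvent],
                     policies: List[Policy]) -> List[Tuple[UseEvent, str]]:
    """A toy policy-reasoning tool: scan the log after the fact and report
    which events violated which applicable policies."""
    violations = []
    for event in log:
        for policy in policies:
            if policy.applies_to(event) and not policy.permits(event):
                violations.append((event, policy.name))
    return violations

# A hypothetical rule: data that came from a hospital record may be used
# only for treatment or billing.
health_rule = Policy(
    name="hospital-data-use-limitation",
    applies_to=lambda e: e.provenance == "hospital-record",
    permits=lambda e: e.purpose in {"treatment", "billing"},
)

log = [
    UseEvent("nurse-7", "record-42", "treatment", "hospital-record"),
    UseEvent("insurer-3", "record-42", "marketing", "hospital-record"),
]

for event, rule in check_compliance(log, [health_rule]):
    print(f"{event.actor} violated '{rule}' by using {event.data_item} for {event.purpose}")
```

The design choice to check compliance after the fact, rather than block access up front, is the essential move: nothing prevents the marketing use from being logged, but the record makes it visible and therefore actionable.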
Step 4. Find new projects for cryptographers.
The era of secrecy is over. Cryptographers have made tremendous contributions with tools that help protect anonymity in extreme circumstances, such as the need to communicate under repressive political regimes. It is now time to create a new set of technical tools that give people confidence that their personal information is actually used as intended. For this, we need legal and technical architectures that hold people and institutions accountable for the misuse of information.
For more information see http://dig.csail.mit.edu/TAMI