The Information Ethics Roundtable (IER) conference focused on “Privacy and the Challenge of Technology.” The conference was scheduled for a full day on April 27, 2012, and a half day on April 28, 2012. I attended the first day. The full conference program appears here: http://ier2012.wordpress.com/schedule/
The IER is composed equally of philosophers and librarians. The librarians, it seems, are more often faculty teaching in schools of library and information studies than practicing librarians. Nonetheless, CUNY’s own Tony Doyle (Hunter College) has been instrumental in establishing the group, hosted this year’s conference, and is both a working librarian and an ABD teaching philosopher.
The conference format consisted of multiple sessions, each with two paired speakers, a respondent, and a brief Q & A. The keynote speaker was Helen Nissenbaum (Media, Culture, and Communication, NYU), author of Privacy in Context: Technology, Policy, and the Integrity of Social Life (Stanford Law, 2010). Lucinda Zoe, Associate Provost and Assistant Vice President for Academic Affairs, Hunter College, and herself a former chief librarian at Hostos Community College, introduced Nissenbaum. Nissenbaum’s talk focused on “The Value of Privacy in Context.”
Nissenbaum began by noting that as recently as February 23, 2012, the White House, through the Department of Commerce, issued the “Consumer Privacy Bill of Rights” (http://www.commerce.gov/blog/2012/02/23/us-commerce-secretary-john-bryson-delivers-remarks-unveiling-%E2%80%9Cconsumer-privacy-bill-). The third of the seven principles outlined in the Bill is “Respect for Context,” and this is where Nissenbaum focused her talk. She discussed the various forms of technological mediation that threaten privacy: GPS, RFID, biometrics, cookies, web 2.0, etc. She suggested that the public sees the solution as transparency (notice) and choice, and that what bothers most people about their information being out there is not so much their own loss of control over it as the inappropriateness of how it may be used. She described what an appropriate flow of private information would look like and called this “contextual integrity.” Nissenbaum then elaborated on what she means by context: the structured spheres of social life. She then described informational norms (e.g., an interviewer cannot ask an interviewee’s religion, citizens must report their incomes to the IRS, etc.).
In discussing informational norms, Nissenbaum laid out who the actors are, what type of information is involved, and what the transmission principles are (how the information was passed on: consent, purchase, warrant, etc.). Contextual integrity is breached when actions or practices veer from informational norms, and technology disrupts norms of information flow. Why do informational norms matter? They sustain general moral and political values, prevent harm and risk, and limit unfair discrimination. Maintaining contextual integrity is not just about protecting the individual but also about protecting context-specific values (e.g., protecting the privacy of an individual when voting also protects democracy; protecting the privacy of health information encourages people to be more forthcoming with health care providers about their own health and therefore protects public health when others may be at risk).
Nissenbaum is aware of her critics: some say she should examine privacy more as an issue of control, some criticize her contexts as ill-defined, and others suggest that digital media breed novel contexts.
One interesting observation from the audience was that it may be people’s reactions that define whether or not something violates social informational norms. Nissenbaum countered that majority opinion may not always be the best-thought-out opinion. Another point made was that culture also defines norms (e.g., photos are routinely included in job applications in China). Additionally, someone noted that economic considerations often affect how much people want to control their own information. People may not care that Google is tracking them, but when they hear that Google is making money off of tracking them, then they want to halt the process.
In addition to Nissenbaum’s keynote, the Friday conference was a forum for four pairs of speakers.
Sarah Shik Lamdan, CUNY Law, who has also worked as a law librarian, spoke about “Protecting the FOIA Requestor: Privacy for Information Seekers.” Lamdan raised the interesting point that information about those who request information through the Freedom of Information Act (FOIA) is logged, and these logs become public record. One can, in fact, request to see these logs under FOIA, and interpretations of requesters’ actions and intentions might be made using this information. Lamdan contrasted this with the analogous situation of libraries keeping patron requests confidential. She recommended that library privacy ethics be incorporated into FOIA requester policies. A useful bit of knowledge I learned is that some information from the government can only be obtained through a FOIA request, but government agencies also support electronic reading rooms where a lot of information is readily available. This is one example: http://www.foia.cia.gov/ Additionally, I learned that only American citizens can make a FOIA request, but the online reading rooms can be accessed by anyone.
Paired with Lamdan was Adam Moore, iSchool and Philosophy, University of Washington, who spoke about “Privacy and Government Surveillance: WikiLeaks and the New Accountability.”
Moore said that the public is often encouraged to believe that in giving up privacy it gains security; additionally, we are told simply to trust the government. Moore noted that there are many examples of government corruption in the handling of information: for example, the case of the FBI trying to blackmail MLK Jr. Moore tore down the “If you have nothing to hide, why do you care who looks at your information?” argument: such surveillance creates a chilling effect on behavior, and the government itself may become a threat to security. Furthermore, do the security measures already in place even help prevent every possible attack? Moore proposed rules constraining security providers, including probable cause, judicial discretion, and accountability. Moore concluded that what WikiLeaks has accomplished has been a game changer. He said, “now, just like the rest of us, governments are information targets with little control over private information.”
During the Q & A following Lamdan’s and Moore’s talks, someone raised the need to consider non-government corporations’ access to information. Someone also noted that the public has access to public employees’ salaries: is the public’s right to know an invasion of privacy? One good suggestion was to publish salaries without names. There is an emerging view that just because information is publicly available does not mean it is not private.
The first of the next pair of speakers was John Buschman, Georgetown University Library, who discussed “Privacy vs. Anonymity: When is Anonymity an Unethical Power Move in the Educative Information Professions?” Buschman noted that debates rage in academic blogs where posters hurl criticism against named opponents, but do so anonymously. Library blogs are no exception; in fact, Library Journal supports such a practice with its anonymous column “The Annoyed Librarian.” Buschman warned that anonymity shields those making serious threats. Those who support anonymity argue that pseudonyms are part of a literary tradition and protect one’s job. Buschman countered that the dangers of speaking out are not what they once were, and that there are mechanisms in place that will protect one’s job if one speaks out. He went on to say that anonymity in the blogosphere is more about secrecy than privacy: it distorts healthy discourse, is more of a power move, and lessens accountability.
Michael Zimmer, School of Information Studies and the Center for Policy Research, University of Wisconsin-Milwaukee, spoke next on “Library Privacy in the ‘2.0’ Era: Avoiding a Faustian Bargain.” With heavy reference to Rory Litwin (http://libraryjuicepress.com/blog/?p=68), Zimmer examined the tension between two values libraries extol: providing access and protecting privacy. Librarians have become quite fond of Web 2.0 tools, but unfortunately these tools retain users’ personal information.
Marc Meola, Library, The College of New Jersey, provided a reasoned wrap-up. He suggested that professional anonymous blogging is somewhat oxymoronic: what is its purpose? It reduces credibility, and there is an inequality in attacking a named person without naming yourself. There may be a middle path with library 2.0: strip out personal data, or provide a forum for free-wheeling discussion. Unfortunately, librarians may not be able to educate users about anonymity, since it is not really clear what 2.0 services actually know about us.
The next speaker was James Stacey Taylor, Philosophy, The College of New Jersey, who discussed “Queen Christiana’s Hermaphroditism: Why the Dead Have No Right to Privacy.”
Taylor began by saying that wronging and harming are different things.
If you think that privacy continues after death, can you be harmed if information comes out about you after your death? Most philosophers say that the dead have a right to privacy, but there is little philosophical argument that the dead can be wronged. In one argument, Nelson Lande says the dead can be wronged: people deserve their good name and therefore have a right to it. Taylor argued that the problem with speaking of the dead as being harmed is backward causation: something happening now would be affecting someone who lived in the past. Others have argued that this is not backward causation. Taylor held, however, that we cannot claim conclusively that a person can be harmed after death, and therefore we cannot know that the dead have a right to privacy.
Christopher Sula, Information and Library Science, Pratt Institute, was the second speaker in this pairing. Sula spoke about “Adapting to Digital Environments: Evolutionary Ethics and the Challenge of Privacy.” Sula asked: Can evolutionary facts bear on ethical situations? How can a theory of our ancient origins inform contemporary times? Nearly 200,000 years of early human development suggest that anything that occurred in this period is more deeply ingrained in us than what has happened more recently. According to Robin Dunbar, early human socialization shows that people can handle being close to about 150 people; interestingly, most people on Facebook average about 150 friends. The significant difference, though, between a network of real friends and a friends network on Facebook is that on Facebook your small network of friends makes it look as if you are in a small network affording personal interactions, while under the hood Facebook is a massive network. Sula recommended that Facebook communicate how your data is used, provide opt-in options for data, and give users more ways to deal with the information they get from friends. Following a question from the audience, Sula was quick to say that he was not arguing for a return to small groups, with the negatives they entail.
One point raised following Taylor’s and Sula’s talks was that privacy is perhaps being construed too broadly; it was suggested that public figures have no right to expect their information to be kept from the public. Related to Taylor’s talk, another audience member drew the audience’s attention to the web site http://www.deadsoci.al.
The last two presentations on Friday opened with Brian Roux, Computer Science, University of New Orleans, and Law, Tulane University Law School, speaking about “Extended Cognition and the Privacy of Smart Devices,” a paper he wrote with Michael Falgoust, Philosophy, Tulane University. Roux said we should consider external devices (mobile electronics) extensions of the mind. As such, being deprived of these devices is experienced as akin to a personal violation. For example, during international border searches, devices are inspected and seized, which is encouraging alternate storage of data (e.g., in the cloud). The loss of a device leaves a person unable to communicate. Roux asked whether any interest is served by border seizure: information will get through anyway, privacy is violated, and corporate interests may be hurt. Further, Roux argued, it is not a good idea to say everyone should simply adopt circumvention to avoid having devices seized. The younger generation is bringing its own devices to work: how can employers regulate these devices and the data on them if they do not own the devices but do own the data on them? Another issue is social networking at work and employer forced-friending. One final issue considered is that in the normal parent/child relationship there should be a diminishing curve of parental interest in the child’s private life, but with social networks parents stay increasingly aware of their children’s private activities. Arguments can be made either way as to whether this is potentially harmful or helpful for the child.
Alan Rubel, School of Library and Information Studies and the Program in Legal Studies, University of Wisconsin-Madison, spoke next about “Privacy, Technology, and Varieties of Freedom.” He asked how privacy is related to intellectual freedom. He considered two examples of information seeking and retrieval in libraries where standard library procedures for protecting privacy cannot be kept in place: personalized research retrieval interfaces (e.g., an EBSCO account) and Kindle lending.
I’m hopeful someone who attended Saturday’s presentations will summarize the talks.