Ethics of Computing
MONT 113G, Spring 2012
Session 31
Privacy as a value
Privacy as an Individual Good
The "right to privacy" is legally complicated.
• Based on the 1st and 4th Amendments.
• Protects citizens from intrusion by governments.
• Corporations are treated like persons.
• How do we protect individuals' privacy from corporations?
• The Privacy Act of 1974 legislates some rights.
Privacy as a Value
Is privacy an intrinsic or instrumental value?
Fried (1968): Friendship, intimacy and trust cannot develop in
societies under constant surveillance. (E.g. Orwell's 1984)
Rachels (1975): Privacy is required for diversity of
relationships.
Control of information about ourselves is important for our
autonomy.
Need to release information about ourselves in some contexts,
but should have control over whether it flows to other places.
Privacy as Contextual Integrity
Nissenbaum (2004) argued that there are information norms
for every domain of life.
People have certain expectations about these norms in each domain:
1) What kinds of information are appropriate for this
context?
2) How will information be distributed in this context?
Examples of appropriate information:
Applying for a loan
Visiting a doctor
Examples of distribution norms:
Medical information
Credit reports
Development of Norms
Formal norms: Established by legislation or specific policies of an
organization.
Informal norms: Established and enforced by social expectations.
Changing norms:
New technology creates a "policy vacuum" for norms.
New technology allows collection and distribution of new forms of
information.
Organizations may use the technology without informing clients.
IT tools are often invisible to users and adopted without public
announcement.
This makes privacy difficult to protect.
Privacy as a Social Good
Privacy as an Individual Good doesn't influence policy.
Utilitarian arguments: Social goods trump individual goods.
Example?
Privacy as a Social Good provides more balance with other social
goods such as security.
How can we frame privacy as a social good?
Privacy and Democracy
Privacy is essential to Democracy because:
People under constant surveillance change their behavior.
Examples:
1. Bentham's Panopticon
2. IT Society. How are we watched?
Benefits of surveillance?
Drawbacks of surveillance?
Privacy is Essential to Democracy
In a democracy, citizens:
• must be free to exercise autonomy.
• must be able to think critically.
• must argue about issues and learn from argument.
If it's too risky to argue for something new or challenge authority,
democracy will not work.
Johnson's view:
Privacy, autonomy and democracy are so intertwined that you cannot
have one without the other.
Data Mining
Problems with data mining:
• The norms are often not subject to public discussion.
• The norms may be invisible to those being watched.
• Some people are singled out by the data mining program and
others are not.
• Information may be merged and mined for patterns people may
not have realized they were revealing.
• People are placed in categories based on data mining results.
What is wrong with this?
Fair Information Practices
How do we as a society protect privacy?
Adopt, either through legislation or self-regulation, fair information
practices.
1. No secret personal data record-keeping.
2. Individuals must be able to find out what information is recorded
and how it is used.
3. Individuals must be able to prevent information from being used
without their consent.
4. There must be a way to correct information.
5. Organizations must assure the reliability of the data.
Other Safeguards
Transparency:
Information practices should be made clear to users.
(Citizens Bank's privacy statement is an example of how not to
do this.)
Opt-in vs. Opt-out
It's better to give people a choice to opt in to a policy than to
implement the policy and force them to opt out.
Example: Facebook, Beacon