Chapter One: The Internet and Ethical Values

Laws vs. Software: Controlling Technology
• Attempting to control technology through law and regulation has often been futile.
• Correcting technology with other technology has been more effective.
• Ex.: laws suppressing pornography have been difficult to enforce, but software that filters out pornography has been more successful.

Larry Lessig's Framework
• Four constraints regulate our behavior in real space: laws, norms, the market, and code/architecture.
• Laws – rules imposed by the government and enforced by ex post (after-the-fact) sanctions.
  – The complicated IRS tax code is a set of laws that dictates how much we owe; if we break these laws, we are subject to fines and penalties.

Larry Lessig's Framework (continued)
• Social norms – expressions of the community; most communities have a well-defined sense of what counts as normal standards and behavior.
  – Cigar smokers are not welcome at most functions.
• The market – prices set for goods, services, or labor.
  – $3.95 for a coffee at the local coffee shop.
• Architecture – physical constraints on our behavior.
  – A room without windows imposes certain constraints because no one can see outside.

Real Life vs. Cyberspace
• Cyberspace is subject to the same four constraints:
  – Laws – provide copyright and patent protection.
  – Markets – advertisers gravitate toward more popular web sites.
  – Architecture – software code such as programs and protocols constrains and controls our activities. Ex.: web sites demanding usernames and passwords, and software deployed to filter spam and certain email.
  – Norms – Internet etiquette and social customs; flaming violates these norms.

Ethics
• Ethics concerns intrinsic human goods and the moral choices that realize those goods.
• There are basic human goods that contribute to human well-being or human flourishing.

James Moor
• Moor's list of core human goods (a "thin" account) includes:
  – Life
  – Happiness – pleasure and the absence of pain
  – Autonomy – the goods we need to complete our projects (ability, security, knowledge, freedom, opportunity, reason)

John Finnis
• Finnis's version of the human goods (a "thick" account) includes:
  – Life
  – Knowledge
  – Play (and skillful work)
  – Aesthetic experience
  – Sociability
  – Religion
  – Practical reasonableness (includes autonomy)
• Participation in these goods allows us to achieve genuine human flourishing.

Both Moor and Finnis Believe
• The ultimate good, the human flourishing of ourselves and others, should be our guidepost of value, serving as a basis for crafting laws, developing social institutions, and regulating the Internet.
• Golden Rule (Matthew 7:12) – "So whatever you wish that others would do to you, do also to them."
• Immanuel Kant: "Act so that you treat humanity always as an end and never as a means."

Blocking Software
• See the last paragraph on page 6.
• The author believes that those who write programs or create laws should rely on ethics as their guide.
• Code writers need to write code in a way that preserves basic moral values such as autonomy and privacy.
• Many feel that technology is just a tool and that it is up to us whether this powerful tool is used for good or ill.

Technological Realism
• Two extreme views:
  – Technology is neutral, and what happens is entirely up to us.
  – Technology locks us into an inescapable cage.
• Technological realism acknowledges that technology has reconfigured our political and social reality and does influence human behavior in particular ways.
Two Broad Ethical Frameworks
• Teleological – the rightness or wrongness of an action depends on whether the goal or desired end is achieved (look at the consequences; it may sometimes be acceptable to lie). Sometimes called consequentialism.
• Deontological – asks whether the action itself is right or wrong; we act out of obligation or duty (e.g., the prohibition against harming the innocent).

Utilitarianism
• Teleological.
• The most popular version of consequentialism.
• The right course of action is the one that promotes the general good.
• An action is good if it produces the greatest net benefits or the lowest net costs.
• See the example at the bottom of pages 11-13.

Contractarianism
• Deontological.
• Rights-based.
• Looks at moral issues from the viewpoint of the human rights that may be at stake.
  – Negative right – implies that one is free from external interference in one's affairs (e.g., the state cannot tap our phones).
  – Positive right – implies a requirement that the holder of the right be provided with whatever is needed to pursue his or her legitimate interests (e.g., rights to medical care and education).

Pluralism
• Deontological.
• Duty-based.
• Actions have moral worth only when they are done for the sake of duty.
  – Ex.: if everyone broke promises, there would be no such thing as a promise.
  – Consider this when looking at intellectual property.
  – Ask the question, "What if everybody did what you are doing?"
  – Respect other human beings.

Seven Moral Duties
1. Keep promises and tell the truth (fidelity)
2. Right the wrongs you have inflicted (reparation)
3. Distribute goods justly (justice)
4. Improve the lot of others with respect to virtue, intelligence, and happiness (beneficence)
5. Improve oneself with respect to virtue, intelligence, and happiness (self-improvement)
6. Exhibit gratitude when appropriate (gratitude)
7. Avoid injury to others (noninjury)

New Natural Law
• Good should be done and evil avoided.
• By itself, this principle is too general to settle concrete cases.

Flaws in Moral Theories
• No moral theory is without flaws or contradictions.
• The four frameworks often converge on the same solution, but they can also suggest different solutions.
• When they diverge, one must decide which framework will "trump" the others.

Principlism
• Popularized by Beauchamp and Childress.
• The principles are binding "at first glance" (prima facie); in a given case one principle may be given more weight than the others, but none has automatic priority.
• The four principles are autonomy, nonmaleficence, beneficence, and justice.

Autonomy
• Autonomy is a necessary condition of moral responsibility.
• Individuals shape their own destiny according to their notion of the best sort of life worth living.
• People who are deprived of their autonomy are not treated with the respect they deserve.

Nonmaleficence
• Above all else, do no harm.

Beneficence
• This is a positive duty.
• We should act in ways that advance the welfare of other people when we are able to do so.

Justice
• Similar cases should be treated in similar ways.
• Fair treatment.