Feelings, Values, Ethics, and Skills
as control, expansionism, flexibility, power, precision and speed, we reinforce the
very technologies that we may wish to reform. The technologies and skills we value
and the values we build into our technologies and reinforce through our identification with them have historical roots and social implications.
Can we expect students to merely adopt values on the basis of authority, peer pressure,
propaganda or immersion in capitalist economics? When it comes time to choose
from among a range of values in technology, or life in general, how can young
people choose their own course of action? Ought we model or teach certain values
regarding technology in the labs and workshops? Values clarification, explained in
the next chapter, is a technique that deals with the process of valuing and challenges
students to formulate and test their values against a range of issues. Character
values are addressed in Chapter VII. Dealing with values, whether directly or indirectly, requires that moral choices be made. Teaching with a values consciousness
requires that we understand moral reasoning and the processes of ethics.
Models of Moral Development
Students deal with values and technology at their own level of morality. Young
children are quite capable of making moral decisions based on their values. In the
late 1960s and early 1970s, Lawrence Kohlberg (1975) documented a stage theory
of moral development. After working with groups of children and teenage and adult
males, Kohlberg argued that people pass through stages of moral judgment. Kohlberg
noted that moral development was a process of growth or progress toward universal
principles of morality. However, Kohlberg was quick to note that moral growth was
not pinned to biological growth. Young people could advance toward high levels
of moral maturity while adults could be stalled in lower stages. Nonetheless, the
stages of moral development provide teachers with a road map for analyzing their
students’ judgments on ethical and moral issues. They provide teachers with an idea of
the judgments their students are capable of making. They also give us an understanding
of why some students or people have higher moral standards than others.
Children usually develop through the first two stages and settle into stages three and
four. A minority of adults pass into the fifth and sixth stages. The “lower” stages of
morality revolve around oneself; as morality gets “higher,” it expands to include other
individuals and “society.” According to Kohlberg (1975), however mistaken the claim
may be, each individual must pass through every stage and cannot skip stages. Students
progress through social interaction and exposure to individuals who exhibit the “higher”
level traits. Moral dilemmas provide people with opportunities to test their beliefs
against those of others and thereby learn which moral judgment system yields a
more acceptable result.
One criticism of Kohlberg’s theory is that the progression from lower to higher levels
represents the myth of development and progress in western society. The notion of
universal ethics is more culturally specific than Kohlberg suggested. With regard
to development, adults do not reach a plateau, but rather pick and choose levels
of ethics that depend on situations. Another criticism of the stage theory of moral
development comes from feminist psychologists, such as Carol Gilligan (1982).
Gilligan noted that Kohlberg’s subjects were boys, for the most part. She says that
the stages represent male development with an emphasis on the concepts of justice
and rights. Female development, she says, is more concerned with negotiation,
responsibility and caring. Women must learn to tend to their own interests as well
as to the interests of others. Gilligan suggests that women hesitate to judge because
they see the complexities of relationships. Rather than apply a universal system of
ethics to situations, women tend to look at the specifics of relationships and feelings involved in a moral dilemma. Her three stages of moral development progress
from selfishness, to social or conventional morality, and finally to a post-conventional
or principled morality of caring. Kohlberg emphasized the cognitive dimensions of
moral judgment, while Gilligan brought its emotional dimensions to the surface.
Kohlberg’s and Gilligan’s stage theories are roadmaps and not exact templates of
reality. They provide teachers with powerful tools for helping their students with
ethical and moral development. Teachers can have high expectations for their students and a clear notion of the moral abstractions that they can handle. Technology
teachers, with their responsibilities for design and technology, need to model moral
stances that are based on a range of stages in Kohlberg’s and Gilligan’s theories.
Table 2. Kohlberg’s and Gilligan’s moral development theories

Kohlberg: Punishment and obedience; Instrumental exchange (“You scratch my back, I’ll scratch yours”); Interpersonal conformity (good vs. bad); Law & order; Social contract; Universal principles.

Gilligan: Selfishness (personal survival: me against the world); Social or conventional morality; Post-conventional or principled morality.
Technology teachers ought to work with their students to demonstrate the range of
moral judgments necessary to use and regulate technology. Teachers need to work
with their students to understand the ethical and moral judgments that accompany
technical skills (Table 2).
As we explained in Chapter I, the affective domain represents a general model of
emotional expression and development. In many ways, the affective domain also
represents a model of moral expression and development. The affective domain
suggests that people express emotions in an increasingly sophisticated way. At low
levels, young children merely attend to stimuli and express low level emotional
responses, such as satisfaction and dissatisfaction. Young adolescents begin to form
and internalize values that they express in their actions. In the upper levels of the
affective domain, young adults are capable of organizing a range of different values and emotional responses into value systems. At the upper level of the affective
domain, adults are capable of characterizing a value system over periods of time.
At the upper levels, individuals are in touch with their own feelings and extremely
sensitive to others. Like Kohlberg’s and Gilligan’s models, the affective domain is a
roadmap. The affective domain is not tied directly to age. People do not biologically evolve or progress to the upper levels. Rather, many adults merely express
basic emotions without ever organizing their values into a system that characterizes
their behavior. The highest levels nevertheless point toward moral action. If life
were as simple as progressive development toward universal morality, there would
be no problems. There would be no need for ethics.
Technology and Ethics
The International Society for Technology in Education’s (ISTE) third “technology
foundation” standard refers directly to ethics: “Students understand the ethical,
cultural, and societal issues related to technology.” Rather than an ethical standard
that states how students ought to act, the standard casts ethics in terms of understanding. It is one thing to understand ethical issues and quite another to act ethically in dealings with technology. Moral action requires both emotion and reason.
Moral action means that we make reasoned choices on a justifiable basis. Ethics
guides moral action in choices of good and evil, right or wrong, and virtue and
vice. Moral actions are those deemed worthy of praise or blame, and they affect others
or oneself. Ethics is a branch of philosophy that attempts to inform moral action
by determining a general basis for making choices and judgments. Ethics guides us
in examining our choices and actions and the basis for making and judging these
choices and actions. Ethically sound actions and choices, or responsibility, require
guidance and education. We have to teach students to act ethically in practical and
political dealings with technology.
Many, if not most, of our most serious moral dilemmas today involve technologies
we have chosen to produce and deploy. Total war, terrorism, cloning, drugs, global
warming, ozone depletion and surveillance involve technology in complex ways.
Technology is involved in less popularized yet equally serious moral dilemmas,
such as mass consumption, television, and free market capitalism. Even the most
mundane decisions such as the food we choose to eat, the air we breathe and the
transportation systems we use involve technology and, therefore, require ethical
examination. Since the 1980s, the emergence of specialized fields in applied ethics, such as bioethics, environmental ethics, and computer ethics, suggests the proliferation of
novel moral dilemmas in technology. There are five general areas that implicate
technology in moral dilemmas (Pecorino & Maner, 1985):
1. Technology may aggravate certain moral problems (e.g., creating new avenues for infringements on rights).
2. Technology may transform familiar moral problems into analogous but unfamiliar ones (e.g., copyright problems were transformed by file sharing on the Web).
3. Technology may create new problems that are unique to realms of action (e.g., robots displacing workers in manufacturing).
4. Technology may relieve existing moral problems (e.g., a built-in breathalyzer in a vehicle ignition relieves the dilemma of drunk driving).
5. Technology may consolidate and aggregate a range of moral problems (e.g., genetic engineering allows us to prevent certain diseases, control behavior, identify criminals, etc.).
Our choices to create or use a particular technology are moral choices. Are we free
to choose among alternatives based on ethical analysis? Whether it is a particular
technology that destabilizes ecology and society and undermines traditional moralities, or the way that humans use these technologies, is a moot point in
ethics. Ethics means that we examine possibilities and generate a sound basis for
choices.
Morality means that we make decisions on sensitive issues and align ourselves
with certain causes. We make moral decisions based on five possible approaches
(Edgar, 1997):
• Base moral decisions on feelings and intuition (emotivism).
• Make moral decisions by avoidance or procrastination.
• Make moral decisions by passing the buck. Find a scapegoat to blame for the situation or decision. Go by the book by appealing to authority (e.g., boss, expertise, courts, law). Or follow the crowd and conform to the norm.
• Base moral decisions on caring, sympathy, or love.
• Base moral decisions on a rational criterion.
Of course, there is no magical formula for making moral decisions. Ethics does not
divine the right choice or the answer for safe moral action. In technology, we cannot
opt for the fifth approach (rational decision making) by simply acting in our own
best interest, regardless of other considerations. This is rational egoism. Nor can
we make decisions simply by generating a balance sheet of positive and negative
impacts to guide our decision. This is consequentialism.
Consequentialism means that consequences alone should be the basis for moral
decisions. It means that an action is morally good or right if the consequences of the
action are more favorable than unfavorable. Hence, ethical conduct is determined
solely by a cost-benefit analysis of an action’s, or a technology’s, consequences.
Simply tally up the good and bad consequences of an action or technology and assess whether the good outweigh the bad. This is the simplest form of technology
assessment (Chapter V). Consequentialism holds individuals responsible for their actions whether the consequences were intended or unintended. But it is also an “ends
justify the means” type of ethics and inadequate for technological decision making.
Consequentialism demands that we calculate potential consequences before acting,
yet we can end up being slaves to utility. Utilitarianism means that we judge an action
or technology based on a calculation of the “greatest good for the greatest number.”
We decide on an action or technology that will provide the greatest happiness or
pleasure for the greatest number. Simple utilitarian or majoritarian ethics, or what
is good for the greatest majority, are ineffective in making technological decisions.
Under majoritarian rule, it becomes difficult to sustain the rights of minorities and
the underprivileged in the world. Although there is nothing ethically wrong with
prudence as such, consequentialism tends to emphasize prudential over moral action. We calculate
our decisions and actions to avoid risk.
The other option in ethics is to act on the basis of duty and obligation toward principles and rules, higher spirituality, or an intuitive sense of what is good and right.
Deontological ethics emphasizes intentions over consequences. What is right or
wrong is based on our intentions since consequences are beyond our control. We
hold individuals responsible for their intentions, whereas consequentialism and utilitarianism tend to absolve individuals of responsibility for consequences. Our
conscience and good will ought to be our guides, says deontology.
An ethics that is based on the principle that we should always maximize the goods
we want or those goods we think are good for all, unless tempered with justice, will
be blind to an equitable distribution of these goods (Ferré, 1988). Privilege and duty
go hand in hand. Moral obligation means that we adopt the principles of three golden
rules: (1) Do not do unto others what you would not have done to you (Principle of
Non-maleficence). (2) Do unto others as you would have others do unto you (Principle
of Beneficence). (3) Weigh actions by what is fair (Principle of Justice). These are
summarized as “do no harm,” “try to create good,” and “be fair.”
Moral decisions cannot be made solely on the basis of scientific or technical reasoning. An
automatic default to authority undermines democracy and a basis for technology
studies. Ethics and emotions must play a vital role in technology studies. We have
to help our students understand consequentialism and utilitarianism, as well as feel
the weight of moral obligation. We actually have a moral obligation to our students
to help them take responsibility for their technological decisions. Philosophers of
technology, such as Carl Mitcham (1996), say that we have three choices in making
technological decisions. We can
1. Assume that the problems are so complex that they must be left to the experts, that is, to scientists, engineers, and ethics counselors;
2. Insist that these problems must be handled by the public, even though the public often lacks adequate technical knowledge or sufficient reflection on the ethical issues involved, because this is what our established values require; and
3. Strive to create an informed public that works with technical professionals and ethics counselors to reach informed decisions.
This last option is where technology studies comes in. Informed decision making
in technology requires that ethics be taught and explored with students at all grade
levels. Informed decision making means that we pay attention to our mission in
technology studies to resensitize students to their technological decisions and surroundings. Ethics speaks to the heart with reason, and there is nothing wrong with
that.
The controversial issues and values clarification methods explained in the next
chapter are essential to assist students in their ethical decision making. In general,
an ethical analysis of technological decisions ought to proceed as follows (Edgar,
1997, pp. 74-75):
1. Assess the relevant facts of the technologies of interest. Establish the facts of the technology as best as you can (e.g., here are the facts of the automated telephone dialer, or autodialer).
2. Identify the fundamental ethical principles of the technology and keep them clearly in mind. Consequentialist ethics will establish a cost-benefit analysis. Deontological ethics will establish prior principles and obligations (e.g., autodialers express rights to free speech, invade privacy, generate junk calls, etc.).
3. Identify which disputes over the technology are concerned with means to an end and which are over the end itself (e.g., disputes over the autodialer itself or over the access of private businesses to individuals).
4. Deliberate as is relevant and encourage students to make a decision and act.
As Kohlberg and Gilligan found, young children have no problem with ethical
reasoning, emotivism, and making moral decisions. Teachers may have to use
techniques that allow students to distance themselves from the issue, where the
process takes precedence. For older students, the processes of ethical reasoning
and emotivism ought to move students from dualities to commitments. Somewhat
like Kohlberg and Gilligan, William Perry (1970) and Jane Loevinger (1976) created two models to help teachers give direction to their students’ ethical reasoning
(Table 3). Kohlberg’s and Gilligan’s models deal with growth in longer frames of
time. Perry’s and Loevinger’s models deal with positions in the span of an issue.
These are models and goals to give direction to teachers.
Table 3. Perry’s (1970) and Loevinger’s (1976) ethical reasoning stages
Perry: Basic duality (issue is either right or wrong); Multiplicity (recognizes options); Relativism (tolerant of options and choices); Commitment (acts on commitment, accepts responsibility).

Loevinger: Impulsive (ignores rules and ethics); Self-interested (calculates immediate advantage); Conformist (obedient of rules and authority); Conscientious (self-critical and responsible); Autonomous (tolerant); Integrated (committed to justice).
Technological decision making has become increasingly complex and contingent on
ethical analysis. The infringements on our privacy and rights that we can and cannot tolerate in technology depend on our ability to make sound
ethical analyses. For example, the fundamental liberty to pursue a livelihood is
threatened by technologies of automation that governments support and regulate.
The tomato picking machine developed at the University of California in the 1960s
was responsible for the elimination of 32,000 tomato picking jobs. The question is
whether the infringements on the rights to a livelihood were justifiable, even if the
technology was profitable and promised a net benefit to society (consequentialism and utilitarianism). In technology studies, emotions, knowledge, and skills are
empowering. Perhaps there is no greater need than for students to learn to use their
skills in ethical ways.
Skill Acquisition
As we acknowledged, cognition, emotion, and action are inseparable. Ethics are
inseparable from technical skills. Indeed, do not underestimate the role of cognition,
emotion, and ethics in the process of skill acquisition. As we will explain in more
detail in Chapter VI, cognition, emotion, judgment and action are interdependent.
Many researchers and teachers continue to make the false assumption that emotion, ethics, and cognitive reasoning are simply applied to the development and
use of technical skills. They falsely assume that the relationship between emotion,
knowledge or judgment and technical skills is one of application. They assume a priority
of knowledge over technical skills. Our task here is to dispel this false assumption.
Emotion, knowledge, judgment, and technical skills develop together and are inseparable in experience and practice. In this chapter and the last, we described the
articulation and organization of knowledge, emotion and judgment. In this section,
we describe the acquisition, articulation, and organization of technical skills.
There are four general types of skills: cognitive, emotional, social and sensorimotor
skills. We described a range of cognitive skills in the last chapter. In Chapters V and
VII, we will discuss problem-solving and social skills. Some refer to emotional and
social skills as “soft skills.” Many theorists argue that the acquisition of “hard skills”
or technical skills was the essence of industrial education and educational technology. The balance of cognitive, emotional, social and technical skills is the essence
of technology studies (head, heart, hand, and feet). Today, technology educators
must be prepared to assist students with a wide range of skills.
As we indicated, while technical skills are inseparable from cognition and emotions,
they are typically characterized by fine motor skills. First and foremost, motor skills
are learned, and distinguished from innate capacities and abilities. Individuals may