Sociology 415 Course Packet - Department of Sociology

We see the ocean navigated and the solid land traversed by steam power, and intelligence
communicated by electricity. Truly this is almost a miraculous era. What is before us no one
can say, what is upon us no one can hardly realize.
Daniel Webster, 1847
Welcome to Sociology 415: The Dynamics of Social Change!
The purposes of Sociology 415 are to provide:
1. an understanding of linkages among science, technology, and society,
2. an understanding of public risk perceptions,
3. skills for communicating about risk to professional audiences and the public, and
4. skills in gaining adoption of complex, controversial technologies.
I hope you find the course to be intellectually stimulating, useful in your life, and an
enjoyable experience.
Organization of This Website
This website contains five sections: Home Page, Syllabus, Calendar, Reading Assignments,
and Class Assignments.
Reading Assignments
The web pages listed under Reading Assignments provide all the reading materials needed
for this course. No textbooks or other materials are required. Most of the links provided on
the web pages reference non-required reading.
Sociology 415 begins by presenting a sampler of readings that describe social issues related
to six complex and controversial agricultural technologies:
large-scale hog-confinement operations,
food irradiation,
genetic modification of food,
caging of chickens,
nanotechnology, and
stem cell research.
In reading about these technologies we will see they offer great promise for improving the
agricultural economy and the health and well-being of people worldwide. Yet, they raise
ethical issues and concerns about potential negative effects on the environment, the quality
of life in rural areas, and social and economic equity.
Class Assignments
Class assignments include four brief quizzes, four exams, and a computer simulation
exercise. The Calendar shows the due dates for these assignments.
A Note Regarding the Artwork Displayed at This Website
In his masterpiece, Modern Times, Charlie Chaplin portrays with endearing humor the
tribulations of the Tramp as he confronts the overly mechanized, dehumanized, and
irrational consequences of technological advancements in modern society. The Tramp copes
as best he can with rapidly moving assembly lines, automatic feeding devices, and giant
machines that devour people whole! In Sociology 415, we neither condemn nor praise new
technology, but seek to understand public opinions of it and the effects of these opinions on
the shaping of the technology and its adoption by society. As part of this task, we seek a
greater understanding of our own emotions as we, like the Tramp, grapple with the
broader consequences of technological advancement. The images displayed here pay
tribute to a great film and, I hope, will brighten your day.
Thank you to the Roy Export Company Establishment for their permission to use the
photographs displayed on this website.
Instructor
Dr. Stephen G. Sapp
Department of Sociology
Iowa State University
320 East Hall
Ames, IA 50011-1070
Ph: (515) 294-1403
Cell: (515) 451-1620
FAX: (515) 294-2303
[email protected]
Course Description
SOC 415 addresses theoretical and applied topics in the sociology of technology and risk
communication. It focuses primarily upon applied issues of technology transfer. It explores
techniques of and social issues related to risk assessment, risk management, risk
communication, public policy formation, and diffusion strategies.
This course is conducted in accordance with the Department of Sociology Code of Ethics.
Any student who needs an accommodation based upon a disability should contact Dr. Sapp
privately to discuss their specific needs. Also, please contact the Disability Resources Office
(Room 1076, Student Services Building, 515-294-6624, [email protected]) to
coordinate disability certification and accommodation.
Readings
Sociology 415 Web Site
All assigned readings are available at the course web site: http://www.soc.iastate.edu
/sapp/soc415.html.
Diffusion of Innovations, 5th Edition, by Everett Rogers, 2003. New York: Free Press.
Although not required for Sociology 415, this text is fundamental to all programs for
social change. Persons pursuing careers in social change professions should purchase
it and read it carefully.
Sociology 415 Course Packet
An Acrobat Reader version of the Course Packet is available at the Sociology 415 web
site: Course Packet
A paper copy of the Course Packet is available at the ISU Bookstore.
Related Books
Adams, John, Risk.
Douglas, Mary and Aaron Wildavsky, Risk and Culture.
FAO/WHO Report #70, The Application of Risk Communication to Food Standards and
Safety Matters.
Krimsky, Sheldon and Dominic Golding, Social Theories of Risk.
Lupton, Deborah, Risk.
Shrader-Frechette, Kristin S., Risk and Rationality.
Slovic, Paul, The Perception of Risk.
Webster, Andrew, Science, Technology, and Society.
Related Articles
Bell, Michael and Diane Mayerfeld, The Rationalization of Risk.
Bradbury, Judith A., The Policy Implications of Differing Concepts of Risk.
Freudenburg, William R., Risk and Recreancy.
Kahan, Dan M., Paul Slovic, Donald Braman, and John Gastil, Fear of Democracy: A
Cultural Evaluation of Sunstein on Risk.
Kahan, Dan M., and Paul Slovic, Cultural Evaluation of Risk: 'Values' or 'Blunders'?
Sapp, Stephen G. and Peter F. Korsching, The Social Fabric and Innovation Diffusion.
Sapp, Stephen G. et al., Consumer Trust in the U.S. Food System: An Examination of
the Recreancy Theorem.
Sapp, Stephen G. et al., Science Communication and the Rationality of Public Opinion
Formation.
Slovic, Paul, Trust, Emotions, Sex, Politics, and Science: Surveying the
Risk-Assessment Battlefield.
Sunstein, Cass R., Misfearing: A Reply.
Course Organization
Following the introduction, the course is organized into four units: 1) Science, Technology,
and Society, 2) Risk Assessment, 3) Risk Communication, and 4) Diffusion of Innovations.
Unit one addresses relationships among science, technology, and society, the philosophy of
science, the philosophy of technology, and social philosophy. The second unit presents
approaches to technology evaluation. The third unit covers risk communication, risk
management, linkages between public perceptions and technology policy, and the role of
the media in risk assessment. The final unit addresses strategies for gaining either the
adoption or rejection of complex and controversial agricultural technologies.
Assignments and Grading
1. The Calendar of Events summarizes the assignments for the course.
2. Evaluations include quizzes, exams, and a computer simulation exercise.
Quizzes evaluate understanding of basic concepts.
Exams evaluate integration and application of course materials.
The computer simulation exercise applies principles of the diffusion of
innovations approach to gaining adoption of an innovation within a hypothetical
village.
3. Class participation is an important component of this course. Four points will be
deducted from the total score for each unexcused absence.
4. The Class Assignments page provides detailed descriptions of the expectations for
each type of assignment.
5. The total value of all evaluations equals 250 points. The scoring procedure allows for
80 points from quizzes (4 quizzes at 20 points each), 120 points from exams (4
exams at 30 points each), and 50 points from the computer simulation exercise.
Grading is based upon a standard percentage scale: A = 90% or more, B = 80% - 89%, and so
forth, with some consideration of + and - grades. (A brief calculation sketch follows this list.)
6. Grades for all assignments will be posted on ISU Blackboard. All other materials
related to the course are located at this web site.
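For readers who want to see the arithmetic behind item 5, here is a minimal sketch of the point structure and grading scale described above. The C and D cutoffs extend the "and so forth" pattern, the absence deduction follows item 3, plus/minus grades are omitted, and the student in the example is hypothetical; this is an illustration, not an official grade calculator.

```python
# Minimal sketch of the Sociology 415 point structure (illustrative only).
# C and D cutoffs extend the "and so forth" pattern; plus/minus grades and
# instructor discretion are not modeled.

QUIZ_POINTS = 4 * 20        # four quizzes at 20 points each = 80
EXAM_POINTS = 4 * 30        # four exams at 30 points each = 120
SIMULATION_POINTS = 50      # computer simulation exercise
TOTAL_POINTS = QUIZ_POINTS + EXAM_POINTS + SIMULATION_POINTS   # 250

def letter_grade(points_earned, unexcused_absences=0):
    """Apply the 4-point absence deduction, then convert to a letter grade."""
    score = points_earned - 4 * unexcused_absences
    percent = 100 * score / TOTAL_POINTS
    if percent >= 90:
        return "A"
    if percent >= 80:
        return "B"
    if percent >= 70:
        return "C"
    if percent >= 60:
        return "D"
    return "F"

# Hypothetical student: 221 points earned, one unexcused absence -> 217/250 = 86.8% -> "B"
print(letter_grade(221, unexcused_absences=1))
```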
Calendar of Events

Unit One
August 24: Welcome to Sociology 415
August 26: The Social Problem
August 28: Science and the Public
August 31: Consumer Skepticism
September 2: Sampler Technologies
September 4: The Philosophy of Science
September 9: The Philosophy of Science
September 11: Philosophy of Technology
September 14: Social Philosophy (Quiz #1)
September 16: Social Philosophy
September 18: Social Philosophy
September 21: Review
September 23: Exam #1

Unit Two
September 25: Science, Technology, and Society
September 28: Technical Risk Assessment
September 30: Risk and Rationality
October 2: Economic Risk Assessment
October 5: Psychological Risk Assessment
October 7: Sociological Risk Assessment
October 9: Sociological Risk Assessment (Quiz #2)
October 12: The Sociology of Trust
October 14: Anthropological Risk Assessment
October 16: Exam #2

Unit Three
October 19: Globalization
October 21: Globalization
October 23: Globalization
October 26: Risk and Public Policy
October 28: Risk Communication (Quiz #3)
October 30: Risk Communication
November 2: Risk Communication
November 4: Risk Communication
November 6: The Media and Risk Communication
November 9: Exam #3

Unit Four
November 11: Diffusion of Innovations: Part 1
November 13: Diffusion of Innovations: Part 1
November 16: Diffusion of Innovations: Part 1
November 18: Diffusion of Innovations: Part 1 (Quiz #4)
November 20: Diffusion of Innovations: Part 1
November 30: Computer Simulation Exercise
December 2: Computer Simulation Exercise
December 4: Computer Simulation Exercise
December 7: Diffusion of Innovations: Part 2
December 9: Diffusion of Innovations: Part 2
December 11: Summary
December 15: Exam #4 (9:45 a.m. to 11:45 a.m.)
Description of the Quizzes
The quizzes will evaluate your mastery of key concepts discussed in Sociology 415. Each
quiz consists of 10 multiple-choice questions worth 2 points each (20 points total). The
course contains four quizzes, for a total value of 80 points.
Quiz #1: Monday, September 14th
Quiz #1 will cover materials related to Consumer Skepticism, the Social Problem of
societal decision making regarding complex and controversial technologies, the
Philosophy of Science, and the Philosophy of Technology.
Readings:
The Social Problem
Consumer Skepticism
Philosophy of Science
Philosophy of Technology
Key terms: problem of deduction, problem of induction, community of scholars,
scientific theory, consumer's dilemma, functional imperatives (i.e., adaptation,
goal-attainment, integration, latency), dynamic equilibrium, falsification, theoretical
knowledge, practical knowledge, productive knowledge.
Quiz #2: Friday, October 9th
Quiz #2 will cover materials related to Technical, Economic, Psychological, and
Sociological Risk Assessment, and Rationality.
Readings:
Technical Risk Assessment
Risk and Rationality
Economic Risk Assessment
Psychological Risk Assessment
Sociological Risk Assessment
Key terms: forms of technical risk assessment (i.e., actuarial,
toxicological/epidemiological, probabilistic), spurious and suppressor effects, formal
and substantive rationality, rational choice theory, reflexive modernization, risk
society, outrage factors, recreancy.
Quiz #3: Wednesday, October 28th
Quiz #3 will cover materials related to Globalization and Risk and Public Policy.
Readings:
Globalization (Web page)
Globalization (PowerPoint)
Risk and Public Policy
Key terms: comparative advantage, types of economies (i.e., primary, secondary,
tertiary), economic leakage, race to the bottom, dilemmas of risk policy (i.e.,
fact-value, standardization, contributors, de minimus, consent), strategies for
development (i.e., isolationist, social progress, countervailing benefits, consent,
reasonable-possibility).
Quiz #4: Wednesday, November 18th
Quiz #4 will cover materials related to the Diffusion of Innovations, Part 1.
Readings:
Diffusion of Innovations: Part 1
Key terms: adopter categories (i.e., innovators, early adopters, early majority, late
majority, laggards), innovation-decision process, symbolic adoption, opinion leaders,
diffusion effect, choice shift.
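As a rough illustration of the adopter categories named in the key terms above, the sketch below labels adopters by their order of adoption using Rogers' conventional cumulative percentages (innovators 2.5%, early adopters 13.5%, early majority 34%, late majority 34%, laggards 16%). The adoption times are hypothetical; the course readings remain the authoritative treatment.

```python
# Minimal sketch of Rogers' adopter categories, using his conventional
# cumulative percentages. The adoption times below are hypothetical.

from collections import Counter

CUTOFFS = [
    ("innovators",     0.025),   # first 2.5% of adopters
    ("early adopters", 0.160),   # next 13.5%
    ("early majority", 0.500),   # next 34%
    ("late majority",  0.840),   # next 34%
    ("laggards",       1.000),   # final 16%
]

def classify_adopters(adoption_times):
    """Label each adopter by how early it adopted relative to the whole population."""
    ordered = sorted(adoption_times)
    n = len(ordered)
    labels = {}
    for rank, time in enumerate(ordered, start=1):
        share = rank / n                      # cumulative share of adopters so far
        for name, upper in CUTOFFS:
            if share <= upper:
                labels[time] = name
                break
    return labels

# 40 hypothetical households adopting in weeks 1 through 40
print(Counter(classify_adopters(range(1, 41)).values()))
# Counter({'early majority': 14, 'late majority': 13, 'laggards': 7,
#          'early adopters': 5, 'innovators': 1})
```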
Description of the Exam
Exam #1 will contain two sections of short-answer questions. In the first section students
will be asked to answer two required questions worth 10 points each. In the second section
students will be asked to answer one of four questions worth 10 points each.
Reading Assignments
Required
Class Notes.
The Social Problem
Consumer Skepticism
Philosophy of Science
Philosophy of Technology
Social Philosophy
Recommended
Familiarity with consumer issues regarding the Sampler Technologies.
Discussion Questions
1. Be prepared to describe why skepticism is a rational, legitimate response by
consumers to hearing about a new and controversial technology.
2. Be prepared to describe the limitations of educating the public about new and
controversial technologies.
3. Be prepared to describe in detail the central tenets of the structure-function, critical,
and human agency approaches to understanding relationships among science,
technology, and society. You will be asked to describe the role of the sociologist in
improving the quality of public decision-making regarding complex and controversial
technologies.
4. Be prepared to describe the Classical Greek, Enlightenment, and Critical philosophies
of technology. For each one, be prepared to discuss how it influences public opinion
about complex and controversial technologies.
5. Be prepared to describe and list the advantages and disadvantages of the positivist,
hypothetico-deductive, and community-of-scholars approaches to conducting science.
What is the role of the community of scholars in the enterprise of science?
6. Be prepared to describe the "scientist's dilemma" and the "consumer's dilemma." Why
does negative information about a technology carry disproportionate weight in
influencing consumer opinions about it?
Help Session
The help session for this exam is scheduled for Tuesday, September 22, from 5:00 p.m. to
6:00 p.m. We will meet on the first floor of East Hall and find an open classroom to hold the
help session.
Dr. Sapp's Office Hours are MWF, 9:00 a.m. to 10:00 a.m., or by appointment. Students are
invited to come to the office at any time to discuss the class materials.
Description of the Exam
Exam #2 will contain two sections of short-answer questions. In the first section students
will be asked to answer two required questions worth 10 points each. In the second section
students will be asked to answer one of four questions worth 10 points each.
Reading Assignments
Required
Class Notes.
Science, Technology, and Society
Technical Risk Assessment
Economic Risk Assessment
Psychological Risk Assessment
Sociological Risk Assessment
Science and Public Policy
The Sociology of Trust
Recommended
Familiarity with consumer issues regarding the Sampler Technologies.
Discussion Questions
1. Be prepared to describe the key characteristics and strengths and limitations of each
approach to risk assessment.
2. It will be important to recognize the philosophical and applied implications of
differences between the approaches to risk assessment. In comparing the economic
approach with the probabilistic approach, for example, note that we shift our focus
from expected value to expected utility, thereby placing emphasis upon perceptions of
the usefulness of the technology. In a sense, we are shifting our focus from asking,
"How safe is the technology?" to asking "Is the technology safe enough?" (A brief
worked sketch of this shift follows these questions.)
3. Be prepared to describe the limitations to technical risk assessments and Adams'
suggestions for evaluating technical risk assessments.
4. Be prepared to describe what is meant by "outrage factors" in public discourse about
new technologies. Know which outrage factors are most important in influencing
public opinion.
5. Be prepared to describe Beck's argument that we live in a "risk society."
6. Be prepared to describe Bell and Mayerfeld's argument that risk assessment is a
battle over the language of risk.
7. Be prepared to describe the elements of Paul Slovic's argument that the system
destroys trust.
8. Be prepared to describe the policy debate between Daniel Kahan and Cass Sunstein.
Why does Sunstein believe that public policy should be guided by values, but not by
blunders? Why does Kahan argue for approaches that seek to reconcile diverse value
orientations? As part of your response, be prepared to describe bounded rationality.
9. Be prepared to describe the recreancy theorem (see: Sociology of Trust) as an
approach to risk communication. Be prepared to describe the sociological importance
of the recreancy theorem in relation to social-psychological approaches to
understanding public responses to complex and controversial innovations. As part of
your response, be prepared to describe formal rationality and substantive rationality
and the importance of these expressions for designing public policy related to complex
and controversial innovations.
10. Andrew Webster offers five policy recommendations for improving relationships
between science and society. Be prepared to describe these recommendations and
offer your opinion about which one would be most effective in influencing the "active
citizen" to become more involved in technology development and science policy.
Help Session
The help session for this exam is scheduled for Thursday, October 15, from 5:00 p.m. to
6:00 p.m. We will meet on the first floor of East Hall and find an open classroom to hold the
help session.
Dr. Sapp's Office Hours are MWF, 9:00 a.m. to 10:00 a.m., or by appointment. Students are
invited to come to the office at any time to discuss the class materials.
Description of the Exam
Exam #3 will contain two sections of short-answer questions. In the first section students
will be asked to answer two required questions worth 10 points each. In the second section
students will be asked to answer one of four questions worth 10 points each.
Reading Assignments
Required
Class Notes.
Globalization (Web page)
Globalization (PowerPoint)
Risk and Public Policy
Risk Communication
The Media and Risk Communication
Recommended
Familiarity with consumer issues regarding the Sampler Technologies.
Discussion Questions
1. Be prepared to describe the concepts of economic leakage and dependency as they
relate to primary, secondary, and tertiary economies.
2. Be prepared to describe the dilemmas faced by risk assessors in working with the
public to define technological risk (i.e., fact-value dilemma, etc.)
3. Shrader-Frechette identifies four inappropriate responses to public concerns about
complex, controversial technologies. Describe each one and the counterargument to
these responses offered by Shrader-Frechette.
4. Shrader-Frechette identifies five improper/unethical rationales used to disseminate
unsafe technologies to areas with little knowledge about the technology or power to
influence its use. Describe each of these rationales and why it is improper/unethical.
5. Be prepared to describe the history of efforts at effective risk communication.
6. Be prepared to describe the elements of effective risk communication.
7. Be prepared to describe the barriers to effective risk communication.
8. Be prepared to describe the Natural History Model, the Public Arena Model, and the
Hoopla Effect. Know appropriate ways the media can present risk information to the
public.
9. Be prepared to describe the concept of comparative advantage.
Help Session
The help session for this exam will be held on Sunday, November 8, from 5:00 p.m. to 6:00
p.m. We will meet on the first floor of East Hall and find an open classroom to hold the help
session.
Dr. Sapp's Office Hours are MWF, 9:00 a.m. to 10:00 a.m., or by appointment. Students are
invited to come to the office at any time to discuss the class materials.
Description of the Exam
Exam #4 will contain two sections of short-answer questions. In the first section students
will be asked to answer two required questions worth 10 points each. In the second section
students will be asked to answer one of four questions worth 10 points each.
Reading Assignments
Required
Class Notes.
Diffusion of Innovations: Part 1
Diffusion of Innovations: Part 2
Recommended
Familiarity with consumer issues regarding the Sampler Technologies.
Discussion Questions
1. Be prepared to discuss the meaning of the "diffusion effect." What is the sociological
meaning of this effect? How does it relate to the concept of normative expectations?
Why is this effect important for influencing the adoption of innovations? What are the
two central elements of the diffusion effect? [See: Diffusion of Innovations, Part 1.]
2. Be prepared to discuss the methods of identifying opinion leaders. Know an
advantage and disadvantage of each approach. [See: Diffusion of Innovations, Part
1.]
3. Rogers discusses strategies for reducing inequalities that occur from the adoption of
new technologies and presents three scenarios for reducing gaps between "ups" and
"downs" that sometimes are increased as the result of innovation adoption. Be
prepared to discuss each of these scenarios and Rogers' proposed solutions to them.
[See: Diffusion of Innovations, Part 2.]
4. Be prepared to discuss the key features of each stage of the innovation-decision
process. Know the diffusion strategy that is most appropriate for each of the first
three stages. [See: Diffusion of Innovations, Part 1.]
5. Be prepared to discuss: re-invention, the "strength-of-weak-ties," and innovation
characteristics. [See: Diffusion of Innovations, Part 1.]
6. Be prepared to discuss the contributions and criticisms of the diffusion of innovations
model. [See: Diffusion of Innovations, Part 2.]
Help Session
Monday, December 14th, 5:00 p.m. to 6:00 p.m. We will meet on the first floor of East Hall
and find an open classroom to hold the help session.
Dr. Sapp's Office Hours are MWF, 9:00 a.m. to 10:00 a.m., or by appointment. Students are
invited to come to the office at any time to discuss the class materials.
Copyright Information
The Diffusion Game is a refinement of the Change Agent Game, a paper and pencil game
developed by Everett M. Rogers (Copyright 1970 and 1972). It was modified and adapted to
the computer by Charles B. Weinberg with the assistance of Roberto Mendez and David
Rothschild at Stanford University. It was jointly copyrighted (1977 and 1981) by the
President and Fellows of Harvard College and the Board of Trustees of the Leland Stanford
Junior University (Christopher H. Lovelock, Harvard University; Charles B. Weinberg,
University of British Columbia).
This current version of the game (Copyright 2001) was written by Scot Hoffman and revised
by Paul Murphy, with the permission of Charles B. Weinberg. It is intended for use by
students taking the Dynamics of Social Change course at Iowa State University under the
direction of Stephen G. Sapp. Please do not copy the program or distribute it to others.
Introduction
Diffusion is the process by which a tangible or intangible item spreads through a society. An
area of particular interest to communication specialists, marketers, and sociologists is the
diffusion of innovations, where an innovation is defined as a product, process, behavior
pattern, idea, or entity that is new to a person or a society. People may be unwilling to
adopt an innovation for a variety of reasons, not least because it may involve changes in
present habits or beliefs.
Organizations seeking to promote change are sometimes referred to as change agencies
and those who work for them as change agents. The latter are professionals who try to
convince others to adopt innovations. Typically, change agents work by contacting
individuals or groups in person; however, they may also use forms of non-personal
communication. Examples of change agents are teachers, health workers, agricultural
extension agents, Peace Corps volunteers, sales people, and political precinct workers.
This game asks you to assume the role of a change agent and to concentrate on two of a
change agent's functions -- gathering information on the target population and, based on
that information, implementing diffusion strategies to promote an innovation.
How to Play the Game
Scenario
You are a change agent in a rural village. A map of this village, which consists of 100 farm
households, is provided. These households are divided into ten cliques. Each clique has a
different number of followers, headed by one opinion leader. The degree of reputational
influence accorded to each opinion leader varies. In some instances, this influence may
extend to villagers outside the opinion leader's immediate clique.
Although you know little about the village, your objective is to secure adoption of the
innovation among a specified percentage of village households within one year. Information
about the villagers' behavior takes time to obtain but should help you develop diffusion
strategies for promoting the innovation. You will therefore find yourself engaging in two
kinds of activities: (1) obtaining information about the villagers and (2) selecting
appropriate diffusion strategies to motivate villagers to adopt the innovation you are
advocating.
Each time you initiate a diffusion strategy, the cost is subtracted from the work days
available for completing your task and you are notified of the number of days that remain.
At the end of your visit to the village, you will be told how many adopters you have gained.
At any point in the game, you may also ask for a report on how many households have
adopted the innovation so far (i.e., "Feedback"). Your visit ends when you have used all
your work days.
The game's scoring system rewards players who choose wisely among the different diffusion
strategies. Additionally, the sequence in which you select diffusion strategies affects your
score.
Information Request
Each time you play the game, it is assumed you are new to the village and must learn some
basic information about its social structure and the people who live there. The Diffusion
Game assumes it takes 45 days to collect this information. Learning about social structure
and population characteristics will greatly enhance your ability to gain adoption of your
innovation. By the end of your first 45 days in the village, you will have learned the
reputational influence of each opinion leader and the key communication patterns among
the leaders of the 10 cliques. These communication patterns are summarized as "Links with
Other Opinion Leaders" in the information window. By observing the reputational influence
of each leader and the contact patterns among leaders, you should be able to identify the best
strategy to gain maximum adoption within the remaining 320 days of your visit.
Diffusion Strategies
There are several ways in which you can inform villagers about the innovation. Although
you can reasonably assume that each of the diffusion strategies is a feasible alternative (for
example, you and the villagers speak the same language; there is a local newspaper and a
radio station in the vicinity), some strategies may be more effective than others.
It is important to note that you cannot implement most diffusion strategies until you have
first obtained relevant information. For example, you cannot select a diffusion strategy of
talking about the innovation with a specific opinion leader in the village unless you have
already identified that leader.
Each diffusion strategy convinces a specific number of village households to adopt the
innovation. By periodically seeking feedback about your performance, you should be able to
assess the relative effectiveness of different strategies.
With the exception of conducting demonstrations at opinion leaders' farms, there is no limit
to the number of times you may select a specific strategy. However, a strategy may be
more or less effective (in terms of new adopters) the more it is used. The only way to
discover the effectiveness of a strategy after repeated use is by obtaining regular feedback.
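To make the day-budget and feedback bookkeeping concrete, here is a minimal sketch of the kind of loop the game describes. This is not the actual Diffusion Game program: the strategy names, day costs, and adopter counts are hypothetical placeholders, and the real game defines its own strategies, costs, and adoption dynamics.

```python
# Minimal sketch of the Diffusion Game's day-budget and feedback loop.
# NOT the actual game program: strategy names, costs, and adopter counts
# below are hypothetical placeholders.

TOTAL_WORK_DAYS = 365        # one year in the village
INFORMATION_GATHERING = 45   # days spent learning the village social structure

# Hypothetical strategies: days each costs and a rough number of new adopters per use.
STRATEGIES = {
    "radio_announcement":   {"cost": 10, "adopters": 3},
    "newspaper_article":    {"cost": 5,  "adopters": 2},
    "visit_opinion_leader": {"cost": 15, "adopters": 8},
    "farm_demonstration":   {"cost": 30, "adopters": 15},
}

def play(plan):
    """Run a sequence of strategies against the remaining day budget, reporting feedback."""
    days_left = TOTAL_WORK_DAYS - INFORMATION_GATHERING   # 320 days remain after information gathering
    households_adopted = 0
    for name in plan:
        strategy = STRATEGIES[name]
        if strategy["cost"] > days_left:
            break                                    # the visit ends when work days run out
        days_left -= strategy["cost"]
        households_adopted += strategy["adopters"]   # in the real game, effectiveness varies with use
        print(f"{name}: {days_left} days left, {households_adopted}/100 households have adopted")
    return households_adopted

play(["visit_opinion_leader", "farm_demonstration", "radio_announcement"])
```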
Suggestions for Playing the Game
Develop an overall diffusion strategy each time you play the game.
Remember to use what you know about diffusion, especially the relative importance
of different channels of communication at different stages in the innovation adoption
process.
Do not forget the value of feedback, even if it costs time. Feedback helps you to learn
the effectiveness of your strategy.
Instructions for Playing the Game
1. Click on the link provided below to open the program for Village 1.
2. At the prompt, click on "Open" to play the game.
Diffusion Game: Village One (Not active until Monday, December 1st).
A Real World Example
The Diffusion Game is fun to play and provides an excellent summary of the materials in
this section of Sociology 415. The game, however, is noticeably artificial. This link presents
an example of an Application of the Diffusion Game in Afghanistan by a former student who
took Sociology 515, a version of this course developed for the Masters in Professional
Agriculture program at ISU.
The policy of being too cautious is the greatest risk of all.
Jawaharlal Nehru
Sociology and Technology
On a subsequent web page of this course you will read about some sampler technologies
selected to guide our discussions. Each technology offers promise for greatly improving food
safety, the economy, animal welfare, and the health and well-being of people worldwide.
Each one also raises concerns about potential negative effects on the environment and on
our health and well-being.
Given the inevitable dilemmas that arise in evaluating relationships among science,
technology, and society, the public poses fundamental questions it expects sociologists to
address in their efforts to shape societal institutions in ways that promote
productivity, efficiency, and the equitable distribution of resources. Answering these questions
defines the research, teaching, and outreach agendas for sociologists.
Some key questions posed to sociologists are:
What are the costs and benefits of adopting or rejecting a particular technology?
In what ways is technology development affected by power relationships in society?
What are the best types of technologies for the economy, for the environment, for
families?
What are the correct ethical guidelines to follow in evaluating new technologies?
What types of decision-making processes can we expect the public to engage in as
they evaluate new technologies?
What can be done and what should be done to help citizens address the complex
issues involved in evaluating new technologies?
Can contentious public discourse erode public confidence in science and technology? If
so, what actions might facilitate thoughtful and respectful decision-making about new
technologies?
Which technologies likely will be accepted by the public and which ones likely will be
rejected?
What strategies can be used to influence the public to either adopt or reject a new
technology?
Possible Approaches to the Sociology of Technology
The examples listed above reflect the types of questions posed to sociologists. What, then,
should be the structure of this course in exploring them? Consider three not mutually
exclusive approaches that can be taken to organize this course.
Societal Structure and Functioning
One approach would be to focus our attention upon understanding how technologies can
significantly affect the structure and functioning of societal institutions.
What have been the effects of birth control procedures on the size of families, on the
strength of family ties, and on the meaning of family in American society?
How have petroleum-based production systems affected environmental quality?
Have environmentally-friendly production systems improved environmental quality
and what negative effects, if any, have they had on productivity and efficiency?
What have been the societal consequences of the development of cell phones,
computers, antibiotics, new construction materials, and on and on....
That is, are we better off today than we were yesterday? And can we improve the living
conditions for future generations? These types of issues motivate sociologists to investigate
relationships among science, technology, and society.
By the way, sociologists view technology more broadly than the average person does. To a
sociologist, nonmaterial innovations such as "feminism" and "global thinking," along with
material innovations, count as technology.
Environmentalism, for example, is a nonmaterial technology important to understanding the
structure and functioning of American society.
Distribution of Costs and Benefits
All new technologies have some negative consequences for everyone and bring about less
access to societal resources for some. That is, it is inevitable that technology is flawed and,
given that some persons always will have a vested interest in maintaining the status quo,
technology adoption always will create some "losers."
Can we anticipate negative consequences of new technologies?
Are negative consequences distributed, intentionally or unintentionally, in an
inequitable manner?
Do powerful segments of society manipulate technology development and
dissemination in such a manner as to exploit resources from the less powerful and
thereby unevenly distribute negative consequences to them?
What types of societal-level policies might be instituted so as to mitigate inequitable
distribution of negative consequences?
Thus, this approach to investigating linkages among science, technology, and society
focuses on how power relationships influence technology development and risk
management.
Human Agency
Sociological interest in human agency focuses on citizen involvement in technology policy.
The central questions regard the role of the active citizen in shaping technology policy and
obligations of societal institutions to solicit and respond to citizen input.
How do citizens (oftentimes, we will use the word consumers) influence the adoption
or rejection of new technologies?
How do consumers react to hearing information about new agricultural technologies?
Do citizens behave rationally in evaluating complex technologies?
What types of communication messages are most (and least) effective in conveying
complex information to consumers?
What are effective strategies for gaining adoption of complex and controversial
agricultural technologies?
What is the role of the social scientist in facilitating well-reasoned public decision
making about complex technologies?
Can controversial decision-making take place in a manner that respects the opinions
of others?
In summary, a focus on human agency involves understanding public responses and
facilitating well-reasoned and respectful discourse regarding technology. This type of inquiry
provides the scientist not only with an understanding of these issues but theoretical and
applied knowledge for acting as a change agent; that is, as someone who helps influence
the adoption or rejection of complex and controversial technologies.
Technology: No Place for Wimps!
Scott Adams: Dilbert
Sociology 415
In the previous section, we learned that the sociology of technology addresses issues of
technology development and dissemination within three areas: societal well-being, equitable
distribution of risks, and human agency. Because explorations into societal well-being and
equitable distribution of risks typically fall within the domain of professional sociologists, and
because this course is designed for students majoring in many disciplines, Sociology 415
focuses its attention primarily upon human agency. Sociology 415 covers risk evaluations,
consumer perceptions, technology communication, social issues of public policy formation,
and strategies for gaining either the adoption or rejection of agricultural technologies. That
is, the course addresses issues likely to have the most pragmatic applications to commercial
and public sector endeavors outside the domain of professional sociology.
Unit One: Science, Technology, and Society
A. Public Responses to Risk
Why are people sometimes skeptical of new technologies?
Is skepticism justified?
What is the "consumer's dilemma"?
We learn about the rational and emotional elements of risk perceptions. We learn
about the justifications for skepticism and the abuse of skepticism by those who
choose to fearmonger. We learn why change agents expect public skepticism and
their strategies for addressing skepticism.
B. Philosophies of Science, Technology, and Society
Philosophy of Science
What is science?
What are the relationships between science and society?
If we are going to learn how to significantly influence public decisions regarding new
technologies, we must have a good sense of the strengths and limitations of science.
We will need to become wiser about what science can accomplish and the many ways
in which it fails to achieve objective analysis of technology. That is, if we are going to
learn to gain adoption or rejection of technology, we need to know our product; we
need to know how scientific research and technology development take place.
The conclusion we inevitably will arrive at is that it is impossible for scientific
research, and therefore for technology development, to be unbiased and objective.
Knowing this inevitability will give us a sound philosophical perspective by which to
view technology and public opinions about technology.
Philosophy of Technology
What are the ways in which people think about technology?
In what ways do differing philosophies of technology affect technology development
and policy?
What are current paradigms (i.e., broad, philosophical perspectives) about
technology? Is more technology always a good thing or always a bad thing? Are
current paradigms leading Americans to make decisions about technology that later
will create significant problems for our society? A quotation I like to use to justify the
content of Sociology 415 is, "The choice of technology, whether for a rich or poor
country, is probably the most important decision to be made" (George McRobie,
Conservation Letter, October, 1976). Even if it is not the most important decision to
be made, certainly technology choice is critical to societal well-being. The paradigm
citizens use to evaluate technology--to decide what is good and bad technology--can
have a significant effect on the well-being of future generations.
Social Philosophy
What are the fundamental principles of sociology?
How can these principles be used to understand linkages among science, technology,
and society?
Sociology is guided by three paradigms: social structure, critical, and human agency.
These paradigms are represented above in the description of three approaches that
might be taken in organizing this course. The social structure paradigm focuses upon
societal structure and functioning, the critical paradigm attends to power relationships
and inequalities, and the human agency paradigm emphasizes the role of citizens in
shaping their society. This course focuses upon human agency. But it includes
sections that discuss social structure and critical evaluations and it incorporates the
perspectives of social structure and critical thinking into its presentation of risk
communication and the diffusion of innovations.
C. Relationships Among Science, Technology, and Society
How do science and technology affect the well-being of social systems?
How do social institutions and public policies affect science and technology?
We learn the importance of public attention to relationships among science,
technology, and society. We learn how economic, cultural, and political features of
society affect and are affected by science and technology. How do science and
technology development work in practice? How does science affect and how is it
affected by public perceptions and policies? What are some important take-home
messages for all citizens in a democracy regarding the enterprises of science and
technology development?
Unit Two: Risk Assessment
What are the approaches used to assess risk?
What are the strengths and weaknesses of each approach?
The materials presented thus far provide the philosophical basis to begin the applied
unit of Sociology 415. We start by learning approaches to technology risk assessment.
This information is critical because we need to know the source of risk assessments
and the strengths and weaknesses of each source. Knowing the strengths and
weaknesses of different approaches to risk assessment helps us understand public
responses and how to tailor risk communication messages to fit different types of risk
assessment. Importantly, as responsible change agents, we need to know the
limitations of different types of risk assessment. The key point of this section is that
technology should be evaluated simultaneously from multiple approaches wherein
each approach might yield different findings about the wisdom of adopting a
technology.
Unit Three: Risk Communication
A. Globalization
What is "Globalization"?
How does globalization affect the sociology of technology?
We define globalization as the rules for international trade in goods and services.
These rules provide the context for technology development and dissemination, which
can enable some societies to improve their well-being and exploit resources from
other societies.
B. Risk Communication: Theory and Applications
What is the best approach for communicating about technology and risk to the public?
What are effective procedures for reducing/creating public outrage about a new
technology?
This section describes guidelines for risk communication. Risk is conceptualized as
hazard + outrage, where hazard is assessment of technical risk and outrage is public
responses to hazard that reflect trust, perceived fairness, and other nontechnical
issues. Suggestions are offered about how to present technical risk to the public, how
to reduce outrage, and how to manage risk communication about complex and
controversial technology.
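As a toy illustration of the hazard + outrage formulation described above, consider the sketch below. The 0-to-10 scales and the example scores are invented for illustration and are not part of the course materials.

```python
# Toy illustration of risk conceptualized as hazard + outrage.
# The 0-to-10 scales and the example scores are invented for illustration.

def perceived_risk(hazard, outrage):
    """Combine a technical hazard score with a public outrage score (both on 0-10 scales)."""
    return hazard + outrage

# A technically minor hazard can still be perceived as very risky when outrage
# factors (unfamiliarity, involuntariness, distrust) run high, and vice versa.
print(perceived_risk(hazard=2, outrage=8))   # 10
print(perceived_risk(hazard=6, outrage=1))   # 7
```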
C. The Media and Risk Communication
How does the media affect public decisions about new technologies?
What should be the role of the media regarding public discourse about new
technologies?
Dr. Eric Abbott, from the Greenlee School of Journalism and Communications,
conducts research on the risk communication cycle, public views of technology, and
communication strategies for presenting high risk technology to the public. Dr. Abbott
uses the example of food safety to describe how the mass media views public concern
about technology and how the media and scientists can best present controversial
topics to the public.
D. Risk and Public Policy
What are some pragmatic and ethical approaches for a public to take in evaluating
risk and setting risk policy?
K.S. Shrader-Frechette, in Risk and Rationality, asks, "How does a society evaluate
and regulate risks associated with technology?" In answering this question, she
explores conflict between science and populist movements, the contrasting
philosophies of cultural relativists and naive positivists, false dichotomies between
"actual" and "perceived" risk, and problems with quantitative risk assessment.
Shrader-Frechette concludes her book by presenting workable risk management
principles.
Unit Four: Diffusion of Innovations
A. The Diffusion of Innovations
How can the change agent influence the adoption of new technologies?
What are the ethical obligations of the change agent?
If public opinions about technology cannot be swayed by risk communication alone,
then what are approaches to gaining adoption/rejection of technology? In this section,
we learn the processes that take place in public decision-making about technology
and risk. We learn the time sequence of events that occur leading to
adoption/rejection decisions. As part of this education, we learn how to influence the
adoption/rejection of technology.
The principal textbook, Diffusion of Innovations, Fifth Edition, written by Everett
Rogers, describes procedures for gaining adoption of technology. The same
procedures can be used to gain rejection of technology. In some cases, the sociologist
might believe that technology adoption is desirable for the well-being of a society
(e.g., adoption of condom use as protection from HIV infection) and in other cases
might strive for technology rejection (e.g., rejection of dangerous and illegal drug
use). Other times, the sociologist might not have sufficient evidence to claim that
either adoption or rejection necessarily will make society better (e.g., it would be
difficult to claim, from a scientific perspective, that American society would be better
or worse off if laws allowing abortion were repealed). We will learn strategies that can
be used to influence public opinion regarding technology decisions. The choice of
whether to seek adoption or rejection is up to you. What we will focus upon, in
addition to learning diffusion strategies, is learning the ethics of using diffusion
strategies.
B. The Diffusion Game
We will use the principles of diffusion to gain adoption of a hypothetical innovation
within a computer simulation exercise. The exercise will test your skills as a change
agent.
Course Summary
In Sociology 415 we will learn about relationships among science, technology, and society.
We will learn how philosophical paradigms affect societal choices about technology. We will
learn what types of messages are effective at what stages of the diffusion process in
influencing public opinion. We will learn how to gain adoption/rejection of complex and
controversial technologies.
Related Courses
These ISU courses provide instruction on topics covered in Sociology 415.
ECON 460: Agricultural, Food, and Trade Policy. Description and analysis of economic
problems of U.S. agriculture. Explanation and economic analysis of government policies and
programs to develop agriculture, conserve agricultural resources, address consumer food
concerns, stabilize farm prices, and raise farm incomes. The influence of macropolicy, world
economy, and international trade on U.S. agriculture.
JLMC 347: Science Communication. Reporting and writing about
science and technology topics for general audiences. Outlets for stories include print,
broadcast and online media. Story topics include reporting about basic, applied sciences and
social sciences, as well as ethical, political and policy issues related to science and
technology.
PHIL 343: Philosophy of Technology. Conditions under which technological innovations
contribute to human emancipation, relationship of technology and democracy, utility and
limits of technical rationality, and problems of ensuring that benefits of technological
advance are communally shared.
PHIL 480: Controversies in Science. Philosophical treatment of a branch of science that has
(or has had) significant social, political, religious and/or moral implications.
Ours is a world of nuclear giants and ethical infants. If we continue to develop our
technology without wisdom or prudence, our servant may prove to be our executioner.
Omar Bradley, General of the Army, 1950.
Introduction
The social problem we address in this course is, "How does society bring about the adoption
of beneficial innovations (or the rejection of harmful ones) as quickly as possible within an
arena of public discourse that respects the opinions of others?"
If the innovation is mainly a beneficial one, then society wants to adopt it as soon as
possible. If the innovation is a harmful one, then society wants to reject it as soon as
possible. For simplicity, we will orient this course to the adoption of mainly beneficial
innovations. All the same principles we will learn to bring about the adoption of a favorable
innovation (e.g., treating water to prevent water-borne illnesses) can be used as well to
bring about the rejection of a harmful innovation (e.g., smoking tobacco).
An innovation is an idea, practice, or object that is perceived as new. What might seem
familiar to some is new to others. Innovations can be material or nonmaterial. In practice,
these two types of innovations become intertwined because the adoption of material
innovations brings about changes in social relations. That is, culture responds to changes in
material conditions. Understanding relationships among culture, values, existing practices,
and political/social/economic relations is a necessary condition to understanding and
facilitating technology transfer.
Innovations need not be "high tech" in nature. In a developing country an innovation might
be boiling water to prevent disease. Or, an innovation might be the adoption of condom use
to help reduce the incidence of sexually transmitted diseases. An innovation might be a new
approach to teach calculus to high school students. It might be a new business plan for a
corporation. An innovation can be any type of material or nonmaterial idea, practice, or
object that is seen as new by potential adopters.
We might classify innovations as either low involvement or high involvement. By low
involvement innovations we mean ones that elicit little public controversy. A new type of
shaving cream that promises "less skin irritation," for example, is unlikely to create much
concern among opposition groups or raise much public outcry. A high involvement
innovation is one that does create public concern. It causes concern because it challenges
strongly held beliefs (e.g., stem cell research), sounds scary (e.g., food irradiation),
threatens one's environment (e.g., large-scale hog confinement operations), raises the
specter of unknown negative consequences (e.g., genetic modification of organisms), or
creates other concerns. Typically, by high involvement innovations we mean ones where an
organized group is actively opposing the change. Hence, the adoption of the innovation
must come about within the context of organized opposition.
To simplify our discussion we will assume that the innovation under consideration is safe,
wholesome, etc., and that, as change agents we are seeking adoption of this innovation.
Thus, here is our challenge: "How do we facilitate the adoption of a [favorable] innovation
as quickly as possible while encouraging public discourse that is respectful of the opinions of
others?"
The Consumer's Dilemma
To answer our question we need first to understand our audience: Consumers. Oftentimes,
the word "consumer," when used in the context of discussions about the adoption of new
technologies, particularly when the discussions are held by persons working in the life and
physical sciences, becomes synonymous with words like, "irrational," "uninformed," and
"unreasonable in their lack of trust in government institutions." Indeed, public responses to
new technologies can differ from those of a trained scientist. But to fully understand new
technologies as viewed by the public, and to facilitate rapid adoption of these technologies
(that for simplicity we will assume to be mainly beneficial), we need to gain a more accurate
and complete profile of the consumer.
To do so, please consider these illustrative points:
1. Do you own a cellular telephone? With no further instructions or plans (and presuming
it were legal to do so), could you build a working cellular telephone by purchasing the
needed parts and assembling them correctly? If not, then you are IGNORANT!
Being ignorant, or uninformed, is unavoidable. All of us are ignorant and
uninformed. That does not necessarily mean we are irrational and unreasonable,
just ignorant.
2. Do you have 100% trust in everything that your government tells you 100% of the
time? If not, then you DO NOT TRUST YOUR GOVERNMENT!
Not fully trusting your government does not mean you are unreasonable. A
social scientist would assert that not only would you be a fool to trust your
government completely, but you would be an irresponsible citizen to do so. A
democracy simply will not work if its citizens do not ask questions, challenge,
probe, and offer alternative proposals for action.
3. Suppose you are walking down a well-known path through the woods. You walk this
path often. On this day as you walk, close to your feet you hear a rustle in the
leaves. Do you take notice, move to the side, check it out? If so, you are SKEPTICAL!
Being skeptical does not mean you are irrational, it means you are doing what
comes naturally: checking out potential dangers.
You can see the point I am making with these silly examples. Being ignorant is
unavoidable. Being untrusting is one's responsibility. Being skeptical is a survival skill. Being
ignorant, untrusting, and skeptical are neither character flaws nor indications of an irrational
person. Thus, when first hearing about a complex, controversial technology, the reasonable,
rational person will be skeptical about adopting it.
One might argue that once the individual receives the scientific facts about a
high-involvement innovation they would be irrational to continue to be untrusting and
skeptical. After all, now they are no longer ignorant, but informed about the scientific facts.
Here's the rub. First, science is never perfect. It cannot be. So, hearing the scientific facts
will not necessarily reduce skepticism because people know that scientists sometimes make
mistakes. We will talk about this phenomenon more in later sections of this course. Second,
and key to understanding the difference between low- and high-involvement innovations,
for high-involvement innovations the public is also receiving information from scientists who
are concerned about potential negative consequences of the innovation. The public is being
"educated" from both sides! Without having the knowledge base of the scientist an
otherwise highly educated person, a reasonable person, a rational person will wonder, "Who
is right?" That is, knowing that sometimes scientists make mistakes and that sometimes
governmental regulatory agencies make mistakes, the consumer's dilemma is, "Whom do I
trust this time?"
Education of the Public
How do we overcome reasonable, rational, uninformed fear of controversial new
technologies? (I will no longer remind us that when we state we are seeking to "gain
adoption" we are assuming the technology is mainly beneficial. We will, however, spend a
lot of time in the course on understanding the decisions that determine whether we consider
a technology to be mainly beneficial).
The reasonable answer to this question is to educate the public about the new technology.
Tell them the scientific facts. Certainly, distributing scientific facts is an essential first step
to gaining adoption. But guess what? In the initial stages of gaining adoption, when
scientists are telling the public about the favorable qualities of the new technology, public
acceptance will drop dramatically!
Why? Because opponents of the technology also are distributing information and negative
information carries disproportionate weight in the initial stages of the diffusion of
innovations. Why so much weight to negative information? Why does the public listen more
to non-scientists? The public does not listen more to non-scientists, but during the initial
phases of the diffusion of innovations they listen and pay heed. To return to our silly
example, most likely the rustle in the leaves is being caused by something no more harmful
than a chipmunk. But it could be a rabid raccoon. Science sometimes makes mistakes!
One might respond to my scenario thus far by saying that those who distribute negative
information about new technologies are pseudo-scientists who twist scientific findings for
the purpose of fearmongering. Sometimes they are; there is money to be made in
fearmongering. Importantly, however, science messes up often enough that sometimes
persons who raise concerns have valid points. For example, no matter how much one might
dismiss the proclamations of Public Citizen (the organization founded by consumer activist
Ralph Nader), the fact remains that this and similar organizations have made valuable
contributions to improving the safety of all citizens.
So, if education will not work, then what does? That is what we will learn in this course. We
will need to cover a lot of material before we are ready to answer this question. I hope you
find the material informative and enjoyable.
Everybody wants to be second.
Joseph Borsa, MDS Nordion [referring to the adoption of irradiated food]
Introduction
Food irradiation is the post-harvest application of ionizing radiation to preserve food,
prevent migration of invasive insects, increase shelf life through delayed sprouting or
ripening, or rid meats of bacteria that can cause foodborne illness. Fresh fruits,
vegetables, and meats can be exposed to sufficient levels of gamma radiation to kill harmful
bacteria in the food or insects that live on the food. Sources of radiation include radioactive
substances, such as Cobalt 60, or electron beams generated with accelerators.
Approximately 60 countries allow foods to be irradiated, with an estimated 500,000 metric
tons of foods irradiated annually worldwide.
Given the complexity of the process, the seeming oxymoron of exposing food we eat to high
doses of radiation, and other concerns, consumers have expressed emotions ranging from
skepticism to outrage over this technology. More information about food irradiation is
provided on the Sociology web page regarding Sampler Technologies. At the outset of
Sociology 415 we will view a 15-minute video of a presentation by an expert
panel of four persons regarding food irradiation. Two persons on the panel (Ms. Ellen Haas
and Dr. Walter Bernstein) speak out against food irradiation and the remaining two
members of the panel (Dr. George Giddings and Dr. Edward Remmer) speak in favor of the
technology. The audience includes persons attending an episode of the popular television
production, The Donahue Show, starring Mr. Phil Donahue. This presentation occurred in
1985, when the topic of food irradiation was just being introduced to the American public.
In the video, one can see the reactions of consumers as they hear about food irradiation for
the first time. And one can see their reactions to a panel of experts strongly disagreeing
with one another about the science behind food irradiation, its safety, and its feasibility for
use in the U.S. food system. We will watch this video as a way of introducing ourselves to
the contentious, confusing, and complex arena of science risk communication. In this way,
we will be introduced to the central question of this course: How does one gain adoption of
complex and controversial innovations?
What, Me Worry?
Alfred E. Neuman, Mad Magazine
Introduction
One of the central purposes of this course is to learn how to gain adoption of technologies
considered to be mainly beneficial that the public initially rejects. These technologies might
be advanced (e.g., biotechnology) or not (e.g., boiling water in undeveloped areas to
prevent disease). They might be material (e.g., confined animal feeding operations) or
nonmaterial (e.g., a new school curriculum, or perhaps a new business plan). The central
feature they share is public skepticism about them that delays their adoption and thereby
hinders scientists' ability to improve the well-being of society, under the assumption that
the technology is mainly beneficial.
This web page addresses the roots of consumer skepticism about new technologies. It
discusses the legitimacy of skepticism as a positive feature of an informed public. At the
same time, it describes how unjustified skepticism engendered by the fearmongering of
some organizations and individuals can create negative consequences for society. It
explains the role of the change agent in understanding skepticism, respecting its legitimacy
from the perspective of the public, and gaining adoption of new technologies among
skeptical consumers.
Why are Consumers Skeptical?
To gain adoption of a mainly beneficial technology, one must realize that skepticism about it
likely will occur, even after the public is presented with the scientific facts about this
technology. In this presentation we will learn the root causes of skepticism. Before doing so,
we need to recognize that skepticism can hinder the adoption of mainly beneficial
technologies. Scientists who labor to improve our quality of life understandably become
frustrated when the public rejects new technologies based upon what scientists consider to
be unreasonable fears about them. Certainly, if it inhibits the adoption of mainly beneficial
technologies, unreasonable skepticism can hinder scientists' ability to improve the
well-being of society.
So, "Why are Consumers Skeptical?"
Typically, life and physical scientists answer this question by saying that the general public
is ignorant about the technology being considered and about science in general. Many
scientists believe that if the public only knew the scientific facts about a technology, or only
knew more about science, they would not harbor unreasonable skepticism about the
technology.
It is true that society most likely would be better off if the public knew more about science.
But these assumptions that greater education about science in general and about the
technology under consideration will necessarily increase acceptance of it are incorrect for
five reasons:
1. The presumption that knowing more about science will improve acceptance of new
technologies is incorrect because when people learn more about the actual practice of
science they also learn more about its limitations. Science can never be totally
objective, value-free, and unbiased. Scientific studies always have limitations. And the
practice of science always is guided by the questions being asked, and funding
agencies strongly influence which questions are asked. Therefore, science education is
not a good predictor of technology adoption. In fact, some of the strongest critics of a
particular technology typically are people with the most education about science and
the technology.
2. The presumption that knowing more about the technology under consideration will
improve acceptance of it is incorrect because when people learn more they also learn
more about the limitations of the technology and its potential negative consequences.
That is, organizations opposed to controversial technologies also are educating the
public about the technology. The public is hearing two sides of an issue. Because
consumers who might otherwise be highly educated often do not have the advanced
education within a particular scientific discipline to fully understand the arguments
made by proponents and opponents, they are uncertain about whom to trust.
3. Thinking that gaining more science and technology knowledge will improve
acceptance of a technology ignores the fact that persons might understand science
and the technology, but be opposed to it based upon moral or ethical reasons. One
might, for example, be highly educated in general and highly educated about genetic
modification, yet be opposed to biotechnologies because one feels that these
technologies create too many negative consequences for farmers.
4. Certainly, learning the scientific facts about a technology is a necessary element of
gaining adoption of it. But many years of research and practice show that learning the
facts is not the key element of gaining adoption. Adoption is a much more complex
issue than simply learning scientific facts. In later sections of this course we will learn
about the complexity of adoption decisions.
5. Thinking that learning the scientific facts about a technology will increase adoption of
it assumes that the public trusts the scientists who are proponents of the technology.
The public trusts scientists in general, but might not trust them immediately when
they learn about a new technology. They initially might be skeptical.
So, if ignorance cannot explain skepticism, then: "Why are consumers skeptical?"
Skepticism as a Rational Response
The key to understanding public skepticism is to recognize that sometimes it is well
founded.
For example:
Sometimes, scientists make mistakes...
Vioxx: A good medicine for relieving pain, but with more severe negative side effects
than originally realized.
Thalidomide: Developed to treat pregnant women with nausea, it causes severe birth
defects.
Hydroxycut: Developed as a weight loss product, it causes liver damage.
Sometimes, government management of technology is flawed...
Space Shuttle Challenger: Scientists' voices not sufficiently heeded.
Food Safety: The FDA and lobbyists.
Food Safety: The FDA and food inspections.
Sometimes, industry management of technology is flawed...
Ford Pinto: Organizational greed and technology.
Chicken Production: Arsenic and banned antibiotics in chicken production.
Sometimes, government and industry management of technology is flawed...
Love Canal: David vs. Goliath.
Sometimes, industry lies...
Tobacco: Decades of Deception.
OxyContin: A Recent Deception.
Sometimes, scientists lie...
Stem Cell Research: Faking results.
Stem Cell Research: Issuing misleading statements.
Medical Research: Fraud or Mistake?
Science Fraud: Cooking the books.
Science Fraud: Fraud at Iowa State University.
Science Fraud: More science, more fraud.
Discussion
When we recognize that consumer skepticism of new technologies sometimes is well
founded, we acknowledge that it is a rational, reasonable response by citizens. In fact,
skepticism is a survival trait with a firm foundation in human experience. Homo sapiens would not have
survived on this planet without exhibiting skepticism about possible dangers. And
skepticism can have a long memory. For example, farmers in less developed countries know
the many disadvantages and unintended negative consequences of adopting agricultural
practices associated with the green revolution. They therefore are hesitant to take at face
value the promises of technologies that sound similar, such as those associated with the
biotechnology revolution.
With these facts in mind, we can conclude that skepticism is not irrational; it is warranted.
Therefore, when change agents attempt to convince the public to adopt new technologies,
they should expect to observe public skepticism.
Because skepticism is warranted, the effective change agent will reframe the question from,
"Why are consumers skeptical?" to "How can we overcome skepticism about this
technology?" This reframing of the question gives legitimacy to consumer skepticism. It
switches the burden of adoption from the consumer to the change agent. That is, we no
longer ask, "Why are consumers so unreasonable?" We instead ask, "How can we overcome
legitimate skepticism about this new technology?" One objective of this course will be to
learn how to overcome public skepticism about new technologies, whether these
technologies be material or nonmaterial, advanced or simple.
Skepticism of Skepticism
As noted in the discussion above, public skepticism of new technologies is warranted
because science, industry, and regulation cannot always be trusted. At the same time,
people and organizations that raise concerns about technology sometimes cannot be
trusted.
Fearmongering and Technology...
Sometimes, organizations make claims about technologies that are not well supported by
scientific facts:
Top Ten Travesties: American Council on Science and Health.
Cyclamates: Skepticism about a good technology.
Alar: Much ado about nothing?
Alar: Fearmongering or not?
Fearmongering for Profit...
Sometimes, to make a profit, organizations will mislead the public about technology:
Erin Brockovich: Science, the Public, and the Courts.
Disseminating Misleading Information...
Sometimes, the opinions offered by non-scientific groups seem to be those offered by
professional scientific organizations:
The website for the American College of Pediatricians, for example, might look like it offers
the opinions of the American Academy of Pediatrics, which is the professional association of
pediatricians. It does not. Rather, it presents opinions representative of the socially
conservative, religiously fundamentalist position.
Discussion
Fearmongering can create problems when organizations are successful at convincing the
public to reject mainly beneficial technologies. When citizens are persuaded by junk science
or by fearmongering, then they unnecessarily punish industry or reject technologies that
can improve societal well-being. Therefore, just as citizens should be active in learning
about the limitations of new technologies, they should be active in learning about their
benefits.
Because individuals and organizations sometimes engage in unreasonable fearmongering,
the change agent might be inclined to dismiss consumer skepticism as an irresponsible
perception.
The effective change agent will not follow this path of blaming consumers for nonadoption
for two reasons.
1. As noted above, skepticism is warranted because sometimes it is well founded.
2. Dismissing a perception that the public considers as legitimate is not the most
effective approach to gaining consumer confidence in and subsequent adoption of a
new technology.
The Bottom Line
At the end of the day, consumer skepticism does not have as strong a negative effect on
technology adoption as is sometimes believed. Skepticism can be a significant barrier to
adoption. And it sometimes prevents or significantly delays the adoption of presumably
good innovations or brings about unnecessary punishment of an industry. But for most
technologies skepticism can be eased and adoption gained.
Too often, adoption of mainly beneficial technologies is unnecessarily delayed because
scientists pursue inadvisable approaches to risk communication. Techniques for gaining
adoption of presumably beneficial innovations can and have overcome skepticism for many
different types of technologies in many settings worldwide.
The effective change agent will expect skepticism, respect its legitimacy, and learn how to
alleviate it and gain adoption of the mainly beneficial technology.
Ethical Issues
One of the main objectives of this course is to learn techniques to overcome public
skepticism for the purpose of gaining adoption of presumably beneficial technologies. As
part of their work in gaining adoption, change agents should recognize that all technologies
are flawed in some respects. Also, the adoption of new technologies always brings about
negative consequences for some segments of the population. The change agent, therefore,
needs to understand as best as possible the potential negative consequences of technology
adoption and seek ways to mitigate them. We will learn about these ethical issues at various
points throughout this course.
Humanity is acquiring all the right technology for all the wrong reasons.
R. Buckminster Fuller
It is a characteristic of our times that we must keep ourselves informed about relationships
among science, technology, and society. I encourage you to provide the class with
information about technology-related issues of importance to you so we can discuss them
within the context of the course materials. I hope you will watch for media reports on issues
that might be of interest to the class. We can talk about a wide range of topics regarding
the sociology of technology and risk communication.
This sampler provides information about six technologies of importance to Iowans:
large-scale hog confinement operations, food irradiation, genetically modified foods,
Proposition 2, nanotechnology, and stem cells. Food irradiation, genetically modified foods,
and nanotechnology are examples of advanced technologies designed for food engineering.
Large-scale hog confinement operations represent an example of a technology cluster
designed for food production. Proposition 2 represents a nonmaterial technology, one
designed primarily for societal engineering regarding animal welfare. The use of stem cells
might improve treatments for a wide variety of health problems and physical disabilities.
Let's not limit our discussions to just these technologies, but let's begin with them. Most
likely, some members of the class will be very knowledgeable about one or more of these
technologies. Perhaps they will allow us to "pick their brains" about them in our class
discussions.
Large-Scale Hog Confinement Operations
The building of large-scale hog confinement operations in Iowa occurs in response to the
economies of scale needed to compete successfully in an industry that is experiencing rapid
vertical integration and increases in the size of production units. Concerns arise, however,
about potential negative effects on human health, reduced quality of life in rural areas,
decreased land values, animal welfare, and short- and long-term environmental damage to
water and air quality. These concerns have motivated public resistance to the operations as
well as calls for additional technologies to reduce undesirable odors and other
environmental problems.
Go to: Large-Scale Hog Confinement Operations to read about this issue.
Food Irradiation
Food irradiation--the exposure of food to high-energy gamma rays for the purpose of
post-harvest insect control, extension of shelf-life, and the killing of harmful bacteria in
meats and seafood--has been the focus of heated debate for over twenty-five years! This
controversial technology holds the promise of safer foods that remain fresh longer, but
raises health and environmental concerns.
Go to: Food Irradiation to read about this issue.
Genetically Modified Foods
Genetically modified foods are created by transferring genetic material from one organism
to another. Proponents say they will reduce dependence upon pesticides, improve the
environment, and reduce world hunger. Opponents raise concerns about safety,
environmental degradation, and furthering of income inequalities.
Go to: Genetically Modified Foods to read about this issue.
Proposition 2
In 2008, California citizens voted in favor of the Prevention of Farm Animal Cruelty Act
(Proposition 2). Proposition 2 prohibits the confinement of certain farm animals in a manner
that does not allow them to turn around freely, lie down, stand up, and fully extend their
limbs. Proponents argue that this innovation contributes to the humane treatment of
animals. Opponents say that Proposition 2 was a misguided attempt at animal welfare and
burdens farmers with unnecessary expenses.
Go to: Proposition 2 to read about this issue.
Nanotechnology
Nanotechnology refers to the scientific study of and engineering with particles at the
molecular and atomic scale, wherein "nano" refers to one billionth of a
meter. Nanotechnology involves three related areas of inquiry: 1) the study of how
the properties of elements change at very small scale, 2) the development of technologies
to improve health, the environment, and production efficiency for a wide array of
applications, and 3) the development of nanomachines ("microbots") that build and
reproduce nanotechnologies. Nanotechnology offers great promise for improving human
well-being. In the wrong hands, however, it might be the mechanism by which a small
rogue group could destroy all humanity.
Go to: Nanotechnology to read about this issue.
Stem Cells
The use of stem cells to stimulate tissue renewal might yield important treatments for a
wide variety of health problems and physical disabilities. This area of science, however,
raises issues in ethics, economics, and culture.
Go to: Stem Cells to read about this issue.
The scientist has no other method than doing his damnedest.
Percy W. Bridgman.
Introduction
If we in this course are to understand relationships among science, technology, and society
and learn to act effectively as change agents to gain adoption/rejection of agricultural
technologies, we need to know the strengths and limitations of science.
We need to know our product.
Scientific inquiry operates under certain rules. We will begin our understanding of science
by learning the rules of scientific investigations. Then, we will learn from philosophers of
science how the actual practice of science really works.
This page reviews fundamental principles of the philosophy of science. It describes science
in relation to other epistemologies, briefly reviews the history of science philosophy, and
shows that some rather "non-scientific" notions are an integral part of the actual practice of
science.
Compass
Key Questions
What is science?
What are the relationships among science and society?
Examples
Can we expect that the sampler technologies will be shown to be flawed?
What is the role of the active citizen in evaluating risks associated with new
technologies?
What is the role of the scientist in developing new technologies and presenting
information about them to the public?
Epistemologies
To understand sociology or any other science we need to understand the key principles of
scientific inquiry. Before I describe these principles I will define science and compare it with
four other epistemologies (ways of knowing about reality) using the typology (classification
scheme) presented by Walter Wallace in The Logic of Science in Sociology.
Religion requires one to have faith in the existence of certain absolute truths to know
reality.
The Mystical epistemology relies upon the opinions of gifted persons who have divine
insight into reality (e.g., prophets, clairvoyants).
The Authoritarian epistemology relies upon the opinions of persons in authority or
well-respected persons or entities to know reality.
The Logico-Deductive epistemology relies upon established procedures for collecting
observations that reflect reality, as much as possible, without bias or intervention by
the person(s) making the observations.
Science, like the Logico-Deductive epistemology, relies upon observations collected in
a manner that is as unbiased as possible. Science differs from the Logico-Deductive
epistemology in that it also requires the testing or development of theory, or an
explanation of why an event occurs that can be falsified by observation.
A scientific theory is a set of empirically falsifiable, abstract statements about reality. Simply
put, it is a story about how reality works that can be falsified by observation.
Science requires theory for three reasons:
1. Theory provides an explanation of why an event occurs. In contrast, empirical
generalizations merely summarize a specific set of observations. Fishbein and Ajzen's
theory of reasoned action, for example, is a set of abstract statements that can be, and
has been, applied successfully to understand and predict a very wide range of
behaviors (a schematic version of the theory is sketched just after this list).
2. Scientists use theory to help others in the community of scholars (persons trained and
certified as members of a scientific discipline) with their investigations. Limitations to
the theory of reasoned action discovered in a study of one behavior, for
example, might prove helpful in understanding or predicting another behavior.
3. By gaining support for theory (based upon analysis of quantitative data, qualitative
data, or some combination of these), scientists feel confident about applying theory to
improve the well-being of human, animal, and plant populations by building bridges,
growing food, raising healthy families, and so on.
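To make the idea of "abstract statements about reality" concrete, here is one commonly cited schematic form of Fishbein and Ajzen's theory of reasoned action. It is offered only as an illustration; the notation is mine, not that of the course readings, and the weights are estimated empirically for each behavior studied:

BI = w_1 \cdot A_B + w_2 \cdot SN, where A_B = \sum_i b_i e_i and SN = \sum_j n_j m_j

Here BI is behavioral intention, A_B is the attitude toward the behavior (beliefs b_i about its consequences weighted by evaluations e_i of those consequences), and SN is the subjective norm (normative beliefs n_j weighted by the motivation m_j to comply with them). Because the statements are abstract, the same structure can, in principle, be fitted to the adoption of food irradiation, a new school curriculum, or nearly any other behavior.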
Some Notes About Science
Falsification
The ability of theories to be falsified by observation is the critical component of science that
sets it apart from other forms of knowing. Actually, theories can be falsified only in principle
because if one can never verify the truth then one cannot falsify it either (see discussion
below on Evolution vs. Creationism). Still, it is the idea that theories can in principle be
falsified by observations that sets science apart from other forms of knowing.
Deduction and Induction
To say that science necessarily entails the use of theory is not to say that science must be
deductive (research designs that begin with an established theory). Quite to the contrary,
an essential element of scientific inquiry is inductive creation/reformulation of theory. Still,
it is the focus on theory, whether its testing through deductive procedures or its
development through inductive procedures, that defines science as a unique epistemology.
Evolution or Creationism?
To emphasize the importance of theory in science we can compare scientific and
religious explanations of the origin of the species. Judeo-Christian stories about
creation (i.e., Creationism, Intelligent Design), for example, which state that the
universe and everything in it were created by a supernatural--and therefore
unobservable--being, might in fact accurately depict creation, including the origin of
the human species. These stories, however, cannot be falsified because one cannot
disprove the existence of an absolute god or intelligent designer. Creationism and
intelligent design, therefore, are not and can never be scientific theories. The theory
of evolution, on the other hand, can be falsified by observation and thereby qualifies
as a scientific theory.
Of course, if one can never know Absolute Truth, one can never fully know that a
theory has been falsified! That is, true falsification is never achievable (see this
related article by Kate Becker)! Explanations of "what is science," therefore, become
complicated and compromised, as do attempts to distinguish science from
"non-science," including attempts to dismiss creationism and intelligent design from
the realm of science.
Still, it is possible to draw a line in the sand between science and
creationism/intelligent design because, in principle, one could not devise an
experiment to test the existence of God or an Intelligent Designer but one could
bring observations to bear on falsifying the theory of evolution. If you think the
qualifier "in principle" is too big a concession to make for the purpose of defining
science, then please recognize that if one attempts to make a logical argument for
the existence of God then one must make big concessions also (see this related
paper written by Wade A. Tisthammer).
In summary, epistemologically, one cannot argue that creationism or evolution is a
"better" or the "more correct" story. They simply are different types of stories.
Science, however, MUST be based upon stories that in principle can be falsified by
observation.
The advantage of science over other approaches to knowing is that observations can
be replicated by others using the same procedures that produced the original
observations. Replication gives one a sense of confidence that an observation (e.g.,
the tensile strength of steel under certain conditions of temperature and pressure)
did not occur by chance or miracle (e.g., leading one to have a certain amount of
confidence that the bridge will not collapse).
This related presentation provides a more detailed comparison of Science and
Intelligent Design.
What is Good Science?
When is a set of statements about reality considered to be a theory?
When is someone's work considered to be science?
What is good science?
To answer these questions, we must understand the rules of science. To understand the
rules of science, we will trace the path of how the rules were developed by examining the
philosophy of positivism and its various critiques by the philosophy of phenomenology.
Rules of Positivism
Positivism attempts to establish a set of rules for science that can verify the truthfulness of
statements about 'reality' in an objective, value-free, unbiased manner. The positivist
philosophy can be presented in various ways; the presentation below reduces positivism to
four rules:
1. Rule of operationalism: Record only that which is actually manifested in experience.
Rely only upon sense data. Rule out the metaphysical or theological bases for
verification. That is, only data directly observable by the senses are proper for
scientific inquiry.
2. Rule of nominalism: No generalized constructs or terms that cannot be reconstructed
by reference to sense data.
3. Rule of value-free knowledge: Scientific inquiry must be value-free and unbiased.
4. Rule of the unity of scientific method: The scientific method is universal and equally applicable to
all areas of inquiry. All sciences must obey Rules 1-3 above.
The Phenomenological Critique of Positivism
Phenomenology argues that the rules of positivism, although noble in intent, are impossible
to follow in practice. Blind adherence to the rules of positivism, argue phenomenologists,
ignores the true nature and obscures the real value of science.
1. Critique of the rule of operationalism: What constitutes sense data? How does one
obtain pure sense data that is not filtered through the personality, experience, and
preconceived ideas of the scientist? Given that humans are thinking beings, the rule
of operationalism becomes not only restrictive to the social sciences, which seek to
understand the thinking of individuals and collectivities, but in itself a contradiction in
that scientists are thinking beings who make observations about reality. Nothing is
observed directly by the senses; all observations are filtered through the experiences
and biases of humans who interpret the raw sense data gathered by their eyes, ears,
etc.
2. Critique of the rule of nominalism: Logical atomism, or the reduction of all
observations to their basic components of sense data, results in an attempt to reduce
all statements ad infinitum to some fundamental building block of reality. An
understanding of reality, however, always reflects abstractions drawn from sense
data. The sparrow, for example, might be reduced in description to the nature of its
atomic structure. But the "sparrow" is an abstraction of these building blocks. All the
description possible, from now until eternity, of the basic building blocks of the
sparrow never will equal "sparrow" until the observer calls this collection of building
blocks a sparrow, thereby creating the abstract concept: sparrow.
3. Critique of the omission of values: Once the rules of operationalism and nominalism
are shown to be logically impossible to follow, then it becomes evident that all
observations are influenced by the values and biases of the observer. That is,
observation is a human endeavor, one affected by values and bias.
4. Critique of the principle of one science: If one cannot establish a set of rules for
verification, then science becomes a human enterprise, subject to the dynamics of
other human enterprises. Because no science can adhere to the rules of positivism,
none are required to do so. But this rule--that all sciences must adhere to the
same set of guidelines--does hold true. All sciences--life, physical, and social--must
adhere to the same rules. It is just that these rules cannot be the rules of positivism
because the rules of positivism cannot be followed by any science.
The Hypothetico-Deductive Model
Another approach to verifying the truthfulness of statements about reality is to assess them
as logical conclusions of laws established a priori through the human experience. The
Hypothetico-Deductive (HD) model, in effect, admits that the rules of positivism are
impossible to follow--that objective, value-free, unbiased observations are impossible to
obtain. The HD approach is to establish a set of rules whereby objective, value-free,
unbiased conclusions can be drawn from admittedly biased observations.
In the HD model, the explanandum (event to be explained) is a conclusion drawn from
premises (explanans) that cover one or more universal laws.
The HD model takes the following form:
Law: Always, if A then B.
Observe: A.
Then: B.
For example:
Law: All men are mortal.
Observe: Socrates is a man.
Then: Socrates is mortal.
The Phenomenological Critique of the HD Model
The HD model allows for symmetry of explanation and prediction, but suffers from two
fundamental problems:
The Problem of Deduction
Deduction is the derivation of hypotheses from a given law (or axiom). The problem
of deduction is that the law used as the initial explanans might be an accidental law,
one that appears to be true, but is not. That is, prediction does not mean explanation.
For example, the geocentric solar system described by Ptolemy can predict with good
accuracy the movement of the planets. But the heliocentric solar system described by
Copernicus is the correct (as far as we know!) law of the movement of the planets.
Thus, if one begins with the wrong law, then one's conclusions will be incorrect.
I am not going to challenge the law of physical mortality used in the example given
above regarding Socrates. But, other laws that seemed to be inviolable have been
falsified. For example, when an apple departs from the apple tree and travels toward
the Earth, perhaps taking a detour upon the head of Sir Isaac Newton, it certainly
seems like the lighter mass object is being pulled by the heavier mass object (i.e.,
gravity). But, so I am told by physicists, what actually is happening, at least from the
perspective of the general theory of relativity, our latest invention to explain falling
phenomena, is that the Earth and the apple both are in free fall through a curved
space-time continuum. In other words, what we think is true today might not be true
tomorrow. [Note: Actually, the Newtonian theory of gravity works just fine most of the time.
Physicists still rely strongly upon it. It is only in strong gravitational fields, or when great
precision is required, that they must turn to the general theory of relativity for a more
accurate depiction of events.]
The Problem of Induction
Induction refers to the building of theory by summarizing a set of empirical
generalizations within the context of a philosophy about how "reality" works (I placed
reality in quotation marks to point out that I am not asserting that one absolute
reality necessarily exists). The problem with induction is that a set of observations
that support a law do not verify the law because there might be some other equally
logical explanation for the same set of observations.
To very briefly summarize an excellent volume of work by Karl Popper regarding the
problem of induction, scientific inquiry can falsify a theory but can never verify it.
The Community of Scholars Approach
The phenomenological critique of positivism refutes the principle of verification. The
community of scholars approach, therefore, is to relax the verification principle, but still rule
out metaphysical justification in favor of empirical falsification of statements about reality.
This approach entails a big concession--that truth cannot be verified--and therefore requires
establishing a criterion for deciding what constitutes sound science.
The community of scholars approach to evaluating science relies upon the consensus (or
intersubjective) opinion of the community of scholars (i.e., basically, all those persons who
hold a PhD degree in a particular scientific discipline) regarding the acceptability of
statements about reality.
What is good science?
According to the community of scholars approach, the answer to this question rests with the
opinions of the community of scholars. It is this community that decides when work meets
the criteria of good science. And it is this community that decides not the truthfulness of
statements, but their acceptance as the best statements possible until something better
comes along.
In practice, technical reports of scientific investigation are submitted by the author(s) for
review by the community of scholars (See for example, the review of MS#08-084).
Typically, the procedure is to submit a manuscript to a professional journal. The Editor of
the journal distributes copies of the manuscript to 2-4 reviewers, who are not told the
identity of the author(s). The reviewers evaluate the quality of the scientific investigation as
it is reported in the manuscript. If they think the manuscript is clearly written and reflects
acceptable scientific procedures, then they recommend it be published as an article in the
journal. Upon publication, the study is considered to be "acceptable science."
The Phenomenological Critique of the Community of Scholars Approach
This approach of peer reviewing manuscripts for publication sounds straightforward, right?
Not so fast, argues Thomas Kuhn, in The Structure of Scientific Revolutions. Kuhn points out
that the peer review process cannot be an objective one. It includes elements of other
epistemologies, such as religious beliefs, authoritarianism, and mysticism. If the findings of
an investigation challenge long-held beliefs, for example, they will be scrutinized more
vigorously. If they challenge positions held by leading persons in the community of scholars
or threaten strong economic benefits promised by a new technology, then they are looked
upon with greater skepticism. If the findings do not sit well with the religious, political, or
philosophical positions of the reviewers or the Editor of a professional journal, it will be
more difficult for these persons to find the manuscript acceptable.
Thus, the community of scholars, like any other human collectivity, is influenced by power
structures, economics, religion, politics, culture, and so on.
Summary
This page describes different types of epistemologies--ways of knowing. Science differs from
other epistemologies in that its stories about reality must in principle be capable of being
falsified by observation. Thus, science posits theories (stories about reality) that use
abstract concepts (so they can be applied to many situations or events) to describe reality.
These theories can be cast aside with sufficient evidence contradicting them.
Science is not necessarily a better epistemology than others; it is simply a different form of
knowing. The advantage of science is that, with sufficient training, anyone can conduct
scientific investigations. In principle, scientific findings are immune from special
characteristics of the observer. Because scientific findings can be replicated by anyone with
similar training and access to observations (e.g., equipment, funds, contacts with human or
animal subjects), people gain a sense of confidence in scientific findings. Also, the peer
review process helps to ensure that science is conducted with expertise and integrity.
But science is conducted by scientists, who exhibit individual traits and respond to the
expectations of the collective. That is, science is influenced by politics, economics, religion,
culture, and social relations. So, the enterprise of science includes elements of other
epistemologies, such as religion, authoritarianism, and mysticism. Scientists are committed to doing the
best they can to behave in an objective, unbiased, and value-free manner. But they know
that these goals cannot be reached. The physicist P. W. Bridgman said it well, "The
scientist has no other method than doing his damnedest."
Suggested Readings
These books and articles provide excellent summaries of the philosophy of science. They will
reference philosophers who have made important contributions to understanding scientific
inquiry.
1. Benton, Ted (1977) Philosophical Foundations of the Three Sociologies. London:
Routledge and Kegan Paul.
2. Carmines, Edward G. and Richard A. Zeller (1979) Reliability and Validity Assessment.
Beverly Hills, CA: Sage.
3. Fales, Evan (1982) "Must Sociology be Qualitative." Qualitative Sociology 5(2):
89-105.
4. Feyerabend, Paul (1975) Against Method. London, NLB.
5. Giddens, Anthony (1974) Positivism and Sociology. London: Heinemann.
6. Hamilton, Peter (1974) Knowledge and Social Structure. London: Routledge and
Kegan Paul.
7. Kuhn, Thomas S. (1962) The Structure of Scientific Revolutions. Chicago: University
of Chicago Press.
8. Smart, Barry (1976) Sociology, Phenomenology and Marxian Analysis. London:
Routledge and Kegan Paul.
9. Wallace, Walter L. (1971) The Logic of Science in Sociology. New York: Aldine.
Links Related to the Philosophy of Science
Lyle Zynda's lectures on the philosophy of science delve more deeply into some
topics than is necessary for this course, but they provide a good background on the key issues
affecting scientific inquiry.
Wade A. Tisthammer's paper on The Nature and Philosophy of Science addresses many of
the same topics covered on this page.
Patrick O'Driscoll provides comprehensive explanations of logical fallacies at Fallacy Files.
Now I am become Death, the destroyer of worlds.
J. Robert Oppenheimer (citing his translation of the Hindu Sanskrit text, the 'Bhagavad Gita,'
after witnessing the first atomic bomb detonation at Trinity Site, just west of Socorro, New
Mexico on July 16, 1945).
Introduction
Dr. Robert Hollinger, of ISU's Department of Philosophy, reviews key features of western
philosophies of technology from Aristotle to Habermas. Dr. Hollinger presents us with more
than just a history of philosophical perspectives on technology; he describes a way of
understanding different perspectives used to evaluate it today.
Compass
Key Questions
What are the ways in which people think about technology?
Examples
Is technology inherently good or bad or neither?
Do persons opposed to new agricultural technologies adequately understand
the technology and its potential benefits? Are persons opposed to new
agricultural technologies irrational in their thinking about the technology?
Are proponents of new agricultural technologies insensitive to the needs of the
public? Are proponents of new agricultural technologies driven primarily by
greed in their support of the new technology?
Do our regulatory agencies do a good job of ensuring the safety of new
agricultural technologies?
Philosophy of Technology: Classical, Enlightenment, Critical
Classical Greek Philosophy
Greek Typology of the Three Forms of Knowledge:
Theoretical Knowledge: Knowledge about the immutable laws of the cosmos (e.g.,
physics, math, astronomy). Knowledge about things that can be directly observed.
Practical Knowledge: Knowledge related to social life (e.g., politics, ethics, social
interaction). Wisdom gained from experience with living.
Productive Knowledge: Knowledge about how to do things (e.g., technology).
Knowledge and skills required to achieve goals.
Plato considered productive knowledge (i.e., technology) to be the direct (i.e., unbiased,
value-free) outcome of theoretical knowledge. Thus, technology was neither good nor bad,
but the natural outcome of deductive reasoning from the immutable laws of nature. This
viewpoint implies that persons without theoretical knowledge are unqualified to question
technology and its consequences.
Aristotle also viewed these three forms of knowledge as relatively distinct. Aristotle's
viewpoint, however, recognized that theoretical knowledge sometimes can be rather brutal
in practice.
Application in Context
How does the classical Greek philosophy affect perspectives on technology today?
Michael Fumento voices strong concerns about the qualifications and integrity of
consumer groups who question the efficacy of agricultural technologies. His
viewpoints on Bogus Biotech offer a contemporary example of the classical
philosophy of technology.
A technology related to the genetic modification of food is the "terminator" seed--a
seed that produces a plant with sterile seeds. Thus, a farmer would not be able to
save seed from year to year, but instead would have to buy new seed each year.
Many objections have been raised about the brutal consequences of this technology
for subsistence farmers in developing countries. The Union of Concerned Scientists
describes these objections and asks the U.S. Department of Agriculture to drop its
patents on such seeds.
Enlightenment Philosophy
Enlightenment refers to the rise of science as a respected form of knowledge acquisition
that can be used to solve practical problems. From the Enlightenment perspective,
knowledge is power and progress is good. The technological imperative (or technological fix)
is the view that all practical problems can be viewed as technical problems and all technical
problems can be informed by scientific theory.
Note that the development of sociology during the Enlightenment period represented a
dramatic break from the classical Greek philosophy that theoretical knowledge (science)
could not be applied to practical (social) problems.
Application in Context
How does the Enlightenment philosophy affect perspectives on technology today?
Is the human race better off today than it was 1,000 years ago? This question is a
matter of opinion, but it is certain that the scientific approach has produced a vast
knowledge base that has been used to make dramatic changes in our environment
and behavior. The presentation by the Council for Biotechnology Information
regarding the many potential benefits of the genetic modification of foods provides a
contemporary example of Enlightenment philosophy.
Critical philosophy
From the Enlightenment perspective, science provides a means to dominate nature through
an ongoing process of improving technology and solving social problems. From an
Enlightenment perspective, all problems, including social problems, are seen as
technological in nature (e.g., the solution to crime is more prisons and longer sentences for
convicted felons).
From the Critical perspective, Enlightenment philosophy contains an inherent flaw in
defining all problems from the point of view of the technological imperative. It states that
Enlightenment philosophy, if taken to the extreme, can result in politics, religion, and social
life being viewed as technically governed and therefore subject to technically-defined
solutions, which can effectively eliminate much of the power of people to govern
themselves. Critical philosophy views Enlightenment philosophy as not necessarily
malevolent by nature, but flawed because it leads to unworkable, unethical solutions.
Critical philosophy "looks behind" the development of technology to view the motivations
involved in producing the technology, the assumptions made about its safety and proper
use, and the ethics implied by noting who will benefit most from the technology. Critical
philosophy pays particular attention to how the power elite of a society influences technology
development and dissemination. Critical philosophy does not necessarily posit malevolent
intentions of the power elite in their influence over technology development and
dissemination; people sometimes are not fully aware of the consequences of their actions
and sometimes seemingly benign actions bring about negative consequences. Whether
intentionally malevolent or not, critical philosophy notes that power and resources, including
risks, are shared disproportionately. The less powerful will bear more than an equal share of
technological risks because technology is developed by and for the benefit of the powerful
elite. Marxian social philosophy, as one form of critical philosophy, would anticipate a
malevolent purpose of the power elite to control society and distribute risks inequitably.
From a Marxian point of view, the power elite always attempts to exploit resources from the
less powerful.
Application in Context
How does the critical philosophy affect perspectives on technology today?
The Center for Food Safety offers a good example of the use of critical philosophy to
examine the societal implications of genetically modified foods as well as other
controversial technologies.
Resolution of Enlightenment and Critical Philosophies
Is there some way to resolve differences between Enlightenment and Critical philosophies of
technology?
Philosophers such as Jürgen Habermas note that society must learn to:
1. keep abreast of technological advances, and
2. establish institutions to control the direction and use of technology in an ethical
manner.
This strategy seeks separability between the good and bad consequences of technology.
Critical to this strategy is that:
1. citizens must be active in learning about technology and influencing public policy, and
2. societal institutions must be responsive to citizen input.
The sociological imagination enables us to grasp history and biography and the relations
between the two within society.
C. Wright Mills
Introduction
We will use a sociological perspective to review the course materials. Although human
agency is the focus of the course, we will use the social structure and critical paradigms also
to understand the context of human agency and inform our discussion of technology
transfer.
Compass
Key Questions
What are the fundamental principles of sociology?
How can these principles be used to understand linkages among science,
technology, and society?
Examples
What are the effects of technology adoption on the structure and functioning
of society?
Are the negative consequences of new technologies distributed fairly among
powerful and less powerful segments of society?
How does human interaction affect public responses to new technologies?
The Sociological Perspective
The sociological perspective is that:
People behave differently in groups than they do as individuals.
Human interaction influences individual and collective decision making.
Normative expectations (i.e., societal-level "rules") affect behavior.
Normative expectations can be changed by negotiation of the rules through human
interaction.
This perspective is used by sociologists to frame their approaches to improving society.
Sociologists are charged with the tasks of:
Monitoring and suggesting changes to societal structure to improve its functioning.
Improving society by noting the presence of inequalities in the distribution of valued
resources and suggesting ways to reduce inequalities.
Facilitating social cohesion (i.e., sense of belonging) among the members of society.
To accomplish these tasks, sociologists rely upon three paradigms (i.e., broad philosophical
viewpoints; worldviews) to guide their research and outreach activities:
Social Structure (Structure-Functionalism)
Structure-functionalism relies upon an "organic" analogy of human society as being "like an
organism," a system of interdependent parts that function for the benefit of the whole.
Thus, just as a human body consists of parts that function as an interdependent system for
the survival of the organism, society consists of a system of interdependent institutions and
organizations that function for the survival of the society.
Relying upon the successes of biologists in understanding the human body, functionalists
took a similar approach to understanding human social systems. Social systems were
dissected into their "parts," or institutions (family, education, economy, polity, and religion),
and these parts were examined to find out how they worked and their importance for the
larger social system. The rationale was that if scientists could understand how institutions
worked, then their performance could be optimized to create an efficient and productive
society. This approach has proved to be very successful and is the predominant philosophy
guiding macro-level sociology today.
Structure-functionalism arose in part as a reaction to the limitations of utilitarian
philosophy, where people were viewed as strictly rational, calculating entrepreneurs in a
free, open, unregulated, and competitive marketplace. The tenet of functionalism, and the
fundamental building block of all sociology, is that people behave differently in groups than
they do as individuals. Groups have "lives of their own," so to speak. Or, as you might hear
from a sociologist, "the whole is greater than the sum of its parts." Just as the "invisible
hand of order" can guide economic relations, "social forces" can guide social relations, and
thus yield for society very positive outcomes (volunteerism, democracy, laws, moral and
ethical standards for behavior, family and educational systems, communities) and very
negative outcomes (discrimination, organized crime, moral decay, warfare, poverty).
The idea of the functionalists was to create a science of society that could examine the parts
of human social systems and make them work for the betterment of all. And it is the task of
sociologists to use scientific principles to help create the best form of society possible.
Listed below are the central tenets of the functionalist approach to understanding human
social systems. We will use these tenets throughout this course to gain a functionalist
perspective on technology issues facing America today.
1. Society as a system of interrelated parts functioning for the good of the whole.
Keep in mind that functionalism is always oriented toward what is good for the whole.
As we examine different philosophical foundations of sociology, we will note the
advantages and disadvantages of this perspective.
2. All social systems have four key functions: Adaptation, Goal-Attainment, Integration,
Latency.
These functional imperatives roughly correspond to the five institutions of human
societies (economy, polity, family, education, and religion). By understanding which
functional imperative is most closely related to current issues of America, we can
understand the importance of the issue and its likely impact on the well-being of
America.
3. Social action takes place within a social system of cultural norms and institutional
structures.
Implications of structure-function theory for the sociology of technology:
1. The Structure-Function paradigm focuses upon the functions and dysfunctions of
technology for the society as a whole.
2. There is an emphasis on equilibrium and stability of the social system.
3. Social action takes place within a social system of cultural norms and institutional
structures. That is, technology must be compatible with existing ideas and practices.
4. There is an emphasis on integrating technology within a complex system of
institutions and norms.
5. There is an emphasis on alleviating, as much as possible, the negative consequences
of new technologies within the context of advancing technological progress for
adaptation.
Application in Context
Hog lots in Iowa?
One might ask, "For the benefit of Iowa, should local communities be given control
over the siting of large-scale hog confinement operations?" If no significant harm to
Iowa can be documented by limiting local control and Iowa is seen to benefit from
the revenues of hog lots, then the structure-function paradigm suggests limiting
local control for the benefit of Iowa.
Critical Sociology (Marxian Analysis)
From the critical perspective, society is a system of competing parts in conflict for scarce
resources. All social systems are considered to have a small minority of power elites who
control most of the functions of society. All social action, including the development and
dissemination of technology, takes place within an arena of conflict and exploitation of
secondary segments of society by dominant segments of society. Thus, from the social
structure paradigm, new technologies arise in response to demand for improved efficiency,
productivity, and societal well-being. From the critical perspective, however, new
technologies are supplied by the power elite to further their class interests. That is,
technology is developed for and by the power elite. An essential element of this paradigm is
that the exploitation of the less powerful by the powerful is considered to be inherent in society and therefore inevitable in
the development and dissemination of technology.
The critical perspective relies heavily upon ideas set forth by Karl Marx in his critiques of
capitalist society. Marx relied upon the philosophical perspective of dialectical materialism to
guide his critique of capitalism. The dialectic, as used by Marx and Georg Hegel, has a
three-part structure: the thesis (i.e., status quo or central argument), the antithesis (i.e.,
an alternative to the status quo or the counter-condition of the central argument), and the
synthesis (i.e., the resolution of the conflict of the thesis and antithesis, usually considered
to be an "advancement" over the thesis, a "move forward" to something better). Although
Karl Marx's idea of a communist utopian society failed due to an inadequate understanding
of human motivation and organization, his identification of potential problems with human
social systems still is a crucial element of all the social sciences. His hypotheses that human
societies can experience sufficient organized and intentional exploitation by powerful elites
to lead to their collapse have received enough support that citizens should be aware of
these potential problems and maintain a constant vigil against their becoming too severe.
Listed below are the central tenets of the Marxian approach to understanding human social
systems. We will use these tenets throughout this course to gain a Marxian perspective on
technology issues facing America today.
1. Society as a system of competing parts in conflict for scarce resources.
From the perspective of Marxism, the fundamental processes of society are
competition and conflict, rather than cooperation for the good of the whole, which we
noted (with qualifications) was the emphasis in structure-functionalism.
2. All social systems have a small minority of powerful elites.
For Marx, these persons/organizations were those most closely linked with the means
of production: the owners of large industries.
3. Social action takes place within an arena of conflict and exploitation between
dominant and secondary segments of society.
With the Marxian approach, it is instructive to identify the dominant and secondary
segments that affect and will be affected by the outcome of social action regarding
current issues. Using Marxism, we anticipate that dominant segments will use their
power to exploit resources from secondary segments of society.
Marx's Dialectical Materialism
To understand Marxian social philosophy, it is instructive to review its underlying principle,
which is dialectical materialism. The dialectic consists of three parts: the thesis (the status
quo, or our current understanding of "reality"), the antithesis (a contradiction to the status
quo, or a recognized flaw in our current understanding of "reality"), and the synthesis (a
suggested alternative to the status quo, or an improved understanding of "reality"). In one
sense, the dialectic refers to inherent, inevitable conflict. Thus, citizens must inevitably
wrestle with society as it is, the recognized flaws in society, and suggested alternatives for
an improved society. In another sense, the dialectic is a method for achieving progress.
Thus, citizens can use the dialectical way of thinking to improve society by recognizing and
attempting to overcome its flaws.
Marx focused on material conditions (e.g., food, clothing, housing, access to health care and
education). For Marx, the dialectic represented inherent conflict between the means and
relations of production. Owners were forced to exploit labor to achieve the competitive edge
over their rivals in the capitalist economy, but in the process, destroyed the very source of
their profit: labor.
Thus, Marx used dialectical materialism to understand capitalist society and its flaws for the
purpose of suggesting an alternative that would create a better society.
Thesis: Means of production. The status quo was capitalist society, which required the
lowest possible labor costs.
Antithesis: Relations of production. Marx witnessed firsthand the horrific conditions of
manual labor in industrialized England in the mid-19th century.
Synthesis: Communism. To eliminate poverty and the misuse of power in capitalist
society, Marx proposed a society that would end the holding of private property--people would work for the common good and share in the fruits of their labor.
This solution is seriously flawed in several respects. First, it errs in focusing too strongly
upon the economic conditions of society. Certainly, economic conditions are important, but
they are not the only ones to affect divisions among people and subsequently the well-being
of society. Differences in religion, race, and gender, for example, also are sources of
inequalities and exploitation. Contemporary theories of conflict therefore have expanded
Marx's insights to incorporate a broader range of potential divisions among populations. One
might respond that these extensions of Marxism reinforce rather than contradict the theory.
And to some extent they do. The flaw in Marxism is that it ignores the fact that other
divisions among people sometimes matter more to whether people cooperate with one
another than economic divisions do. Therefore, the potential revolution predicted, and
advocated, by Marx based upon economic divisions is diffused to some extent by other
societal divisions. Second, Marx failed to recognize the power of democratic political
systems. No one is pretending that all persons in democracies have equal influence on
decisions. Democracies do, however, offer a path to change that does not require
revolutions against unmoving sources of power. Third, Marx did not and realistically in his
time could not anticipate the rise of the mass consumption, mass production society. That
is, Marxian theory does not account for the rise of economic power among workers as a
means to consume the goods and services they produce. Fourth, and most importantly,
Marx failed to recognize a basic human need for meritocracy: to be rewarded for extra effort
and productivity. A communist society fails to satisfy people's desires to advance
themselves through their efforts.
Incorrect Assumptions of Marx's dialectical materialism:
1. too much emphasis on economic relations.
2. social conflict is rarely bipolarized.
3. political interests are not strictly class (economic) based.
4. power rests on more than economic relations.
5. conflict does not always cause social change.
Correct Assumptions of the critical perspective:
1. inherent conflict between "haves" and "have nots" and focus on intentional
manipulation by the power elite to maintain unequal distribution of resources.
2. role of power in the distribution of resources.
3. conflict as a major source of change in social systems.
Marx's understanding of societies, the people who live in them, and the capitalist economy is
sufficiently flawed that his suggested solution to capitalism is itself inherently flawed.
Marxian social philosophy is valuable today, however, because it reminds us of the potential
exploitation of the less powerful by the more powerful and of the need for the less powerful
to be mindful of this potential. Here is an example of how we can apply this philosophy to
contemporary society. Mrs. LaVon Griffieon, in her essay, Food for Thought, notes that "We
are so trusting in our ignorance." I think this statement epitomizes contemporary
applications of the Marxian critique of society. Mrs. Griffieon has learned firsthand, as a
"farm wife living in Iowa," that the forces of multinational agribusiness organizations might
create a structure of agriculture that will be detrimental, rather than beneficial, to the
well-being of society--one in which ordinary farmers are exploited by overly powerful
multinational agribusiness firms.
To effectively apply Marxian theory to today's conditions, therefore, we should recognize
that:
1. the more powerful are in a position to exploit resources from the less powerful,
2. the less powerful, as a result of their lack of access to decision making, can become
alienated (i.e., separated) from society and therefore more likely to engage in less
productive or even deviant activities,
3. citizens of democracies, who have the opportunity to institute change, need to be
ever mindful of potential exploitation and take actions to protect equal opportunities
for all.
Thus,
1. Individual interests are distinct from, and opposed to, the general interest
represented by the State. Citizens do not consider themselves as participants in public
affairs, but view the state as an external necessity of which they have to take
account.
2. The state is the rule of reason in society, the incarnation of freedom. The citizen, as a
separate individual, has civil and economic, but not political interests.
3. Reconciliation of this conflict is based on the fact that people are creatures of reason.
If freedom is located in the selfish desires of the individuals, then social life
would appear possible only by setting up an external organization to limit this
freedom; government then appears as a necessary evil.
But if citizens realize that their true freedom consists in the acceptance of
principles, of laws which are their own, a synthesis of universal and particular
interests becomes possible.
This synthesis can be actualized only in and through political institutions,
whereby the State is distinguished from civil society.
Civil life then remains as an element of the State, but only as a subordinate
moment in it. Political interests transcend but do not replace individual
economic interests.
People have a universal side and so can accept universal laws without becoming
unfree.
Marx could not accept this abdication of power, and of citizen responsibility, to the State. He saw
the need for citizens to become more politically active, especially given the terrible
conditions of the working class and the inevitable (in his opinion) collapse of capitalism.
Implications of conflict theory for the sociology of technology:
1. Focus on biased estimations of risk.
2. Focus on unequal distribution of risk.
3. Focus on ethical need for a technology.
4. Focus on potential for changing social relationships from the introduction of a
technology.
5. Emphasis on preventing the negative consequences of new technologies.
Application in Context
Hog lots in Iowa?
From the critical perspective, the answer to the question posed above is that people
should "revolt," not in a violent way, but through their voting power, to establish
laws to enable local people to control their well-being in the face of powerful
corporations that care only for profit and not about the welfare of local citizens.
Human Agency (Symbolic Interactionism)
This paradigm focuses not upon societal institutions or power relationships within society,
but upon interactions among the members of the society. It addresses issues of how people
make the rules that determine which technologies will be adopted and which ones will be
rejected. In a democratic society, ultimately, it is the people who decide whether to adopt
new technology, assuming they have full knowledge and access to power through their
votes and other means of influence. The central question addressed from the perspective of
human agency is, "How do people evaluate technology?"
Where did society come from, anyway? Well, from us! From the perspective of symbolic
interactionism, society is in a constant state of re-creation through interaction and
negotiation of meanings. We created the rules we live by, and, importantly, we re-create
these rules every day through our interactions with one another. Mostly, societies are
conservative with respect to social change. But our redefining of 1) the symbolic meanings
we attach to things and events, 2) our sense of morality and ethics, and 3) what we choose
to value has important implications for the rules we create and the ways we choose to live
with one another.
Listed below is a very abbreviated outline of the central tenets of the symbolic interactionist
approach to understanding human social systems. We will use these tenets throughout this
course to gain a symbolic perspective on technology issues facing America today.
1. Reality is socially constructed through our interactions with one another. Morality,
ethics, and values are not given; we create them through our interactions with one
another.
2. Social action is influenced by persons' beliefs, attitudes, perceptions, and negotiations
of meanings.
3. The rules are up for grabs. If you do not like your society, work hard to change it!
Key concepts: definition of the situation, perception, social construction of reality, morality.
A critical element of human agency is the notion of socially constructed reality, or to be
more directed toward the content of this course, socially constructed risk assessments. The
essential features of socially constructed risk assessments are:
Persuasive arguments.
Social comparison.
Choice shift.
New technologies bring about uncertainty within an arena of ignorance. That is, most
persons do not have the educational background to understand, for example, the science of
biotechnology. We are neither uneducated nor stupid, but simply ignorant about much of
the world around us. Thus, we face the consumer's dilemma: We must make a decision
about whom to trust in the face of our ignorance. Active citizens begin by hearing out
persuasive arguments in favor of and in opposition to the new technology. The arguments
themselves, however, although necessary to gaining acceptance of the technology, are not
sufficient to do so. Why? Because experience tells the public that even very highly trusted
research and development organizations sometimes make mistakes and that, sometimes,
new technologies are developed just for the economic benefit of the powerful elite. So,
people turn to others for guidance. They socially compare their opinions with those of
others. In a sense, people seek safety in numbers. If consumers sense a consensus of
thought in favor of a new technology--an indication that it is socially acceptable--then their
choice shift moves toward adoption. Without a sense that they are making a wise decision,
however, their choice shift moves toward rejection of the technology.
Implications of symbolic interactionism for the sociology of technology:
1. Focus on socially constructed nature of risk.
2. Focus on cultural influences on risk construction.
3. Focus on changing definitions of appropriate technology. Consideration of ethics and
morals.
4. Emphasis on understanding the meaning of a technology for members of the society.
Application in Context
Hog lots in Iowa?
Should the siting of large-scale hog confinement operations be subject to local
control? From the perspective of human agency, the sociologist will focus on
understanding the decision made by investigating social comparison processes. If
sociologists choose to do so, they also can focus on influencing this decision by
applying principles of innovation diffusion. If sociologists, as change agents, choose
to influence, then they must utilize the social structure and critical paradigms to
decide what is best for society and how best to mitigate the inevitable negative
consequences of new technology development.
Summary
Sociology 415 addresses issues of social structure and exploitation of power in technology
development and dissemination, but its primary focus is upon the effects of human agency
on technology adoption and rejection. Thus, the process of socially constructed risk
assessments is a critical element of the strategies we will learn about later in discussing
techniques of technology transfer.
We live in a society exquisitely dependent upon science and technology, in which hardly
anyone knows anything about science and technology.
Carl Sagan
Introduction
Our first objective in Unit One is to explore relationships among science, technology, and
society. To provide some structure to meeting this objective, we will review the book,
Science, Technology, and Society, written by Andrew Webster. You are not required to read
Webster's book. This section and the one that follows it will outline the principles we need to
learn.
Compass
Key Questions
How do science and technology affect the well-being of social systems?
Examples
Based upon the materials shown in the example web pages for the sampler
technologies (or, if you wish, other web-based materials on these
technologies), do you think the sampler technologies are being "oversold" to
the public? Will the public loose confidence in these technologies when their
flaws are revealed?
Are proponents and opponents of the sampler technologies being fully honest
in their presentations? Should proponents and opponents be fully honest? Do
they need to be?
How do social institutions and public policies affect science and technology?
Is the American public sufficiently informed about science and technology to
make a valuable contribution to technology policy?
Should the public be concerned that institutions such as Iowa State University
have sold themselves out to commercial interests? Can you cite instances that
lead you to believe that research conducted at ISU is biased in favor of
commercial interests?
What can scientists do to give the public more confidence in the integrity of
their research?
Does American society have adequate control over the sampler technologies?
How can science and technology development be controlled? Should a society
even attempt to control advances in pure science?
Overview of Webster's Science, Technology, and Society
Andrew Webster examines how the economic, cultural, and political features of society
affect and are affected by science and technology. He points out differences in popular
images of science and the actual practice of science as it is conducted at research
institutions and in the private sector. Webster highlights the ways in which scientific facts
reflect "invention" as much as they do "discovery." He points out ways in which science and
technology can be exploited for societal goals, keeping in mind that the setting of societal
goals relies upon political and economic relationships among citizens. Webster ends his book
by offering some suggestions for controlling science and technology to maximize benefits to
the most persons possible.
Webster is writing from the perspective of a citizen living in a democratic society. This
course assumes the same. That is, it assumes that citizens have legal protections sufficient
to enable them to critique new technologies and provide input regarding technology policy.
Science in the Real World
Webster introduces us to the Discovery Dome, an exhibit he visited that emphasizes a
hands-on approach to understanding how technology works. Webster points out that this
emphasis might increase awareness and appreciation of technology and perhaps reduce
fears of it. But real understanding of technology, which can lead to a better understanding
of technological risks, requires also an understanding of how science works.
Typically, the scientific method is presented as asocial, apolitical, non-economic, expert,
progressive, and so forth. Such an approach furthers the image of science as being
objective, pure, beyond the realm of people and their failings, and devoted only to making
all of our lives better. The problem with presenting such an image, however, is that while it
seeks to increase confidence in science, particularly in comparison with other methods of
knowledge acquisition, it sets unattainable expectations that lead to diminished public
confidence when science and technology inevitably are revealed to be flawed.
No scientific research is perfect and all technology is flawed in some respects. The paradox
of science, therefore, is that attempts to present it as infallible inevitably erode confidence
in it. Understanding this paradox provides us with insights for developing strategies for
gaining adoption of complex and controversial technology, topics to be explored later in this
course. That is, change agents, persons seeking to gain adoption of an innovation, are faced
with the dilemma of presenting a technology as safe and beneficial without overselling it,
knowing that, inevitably, the technology is flawed and will bring undesirable consequences
to some segments of the population.
Active Citizens
Webster argues that science is socially constructed. By this he means that science is not an
objective, value-free pursuit of knowledge guided solely by theoretical propositions. Instead,
the enterprise of science--which questions get asked, which research gets funded, how
research is conducted, how findings are interpreted--is dependent upon negotiation and
debate among scientists and between scientists and the public. Scientists tend to pursue
questions of more immediate interest to the public, with greater potential for lucrative
patents, or that are more popular among funding agencies. In short, science is an
enterprise as much influenced by social, political, and economic vested interests as any
other human enterprise within a democratic society.
If Webster is correct in his assertion that science is negotiated, then citizens must be aware
of their influence on science and their responsibility to help guide science to produce the
kind of technology best suited to their society's well-being. If, indeed, science and
technology are socially constructed and reflect socioeconomic and political interests, then
science policy--the decision-making regarding what types of science and technology will be
funded by the public--becomes central to a society seeking to use the very powerful tools of
science to produce technology for the common good.
Application in Context
Have the potential benefits of genetic engineering been oversold to the public?
1. From your reading of the Sampler materials, and other information you know
about genetic engineering, do you think the public has been adequately
informed about both the benefits and potential problems associated with this
technology?
2. Do you think proponents and opponents of genetic engineering have been fully
honest with the public?
3. Do proponents and opponents have an obligation to be fully honest with the
public?
4. What actions should/can active citizens take to learn the facts about genetic
modification of food?
Science and Science Policy
By their very nature, the uncertainties of innovative technology make science policy
decisions difficult to make. Thus, because most efforts of science policy are directed toward
technology transfer, questions about how innovations are encouraged, measured, and
evaluated are a crucial element of science policy.
Science policy typically assumes that:
1. technology is independent of social (meaning cultural, economic, political) context,
2. scientists (experts) also are authorities on correct science policy,
3. technology can be objectively evaluated in any social context, and
4. science must be held accountable to the public.
Each of these assumptions has its shortcomings.
1. Because technology is embedded within a social context, it is influenced by social,
political, and economic interests and its transfer from one social system to another
can be problematic.
2. Expert opinion regarding the production of technology does not necessarily imply
expert opinion regarding the use and transfer of technology.
3. Evaluation of technology is exceedingly difficult, and depends upon a wide range of
indicators, including ones outside the domain of science (e.g., is legalized abortion
moral?).
Problems related to technology transfer and evaluation, therefore, make it difficult to
determine how and to what extent science and technology have met public needs.
Society and Science Policy
Webster takes note of an emerging emphasis on the commercialization of public sector
research and development. Public universities are being encouraged to enter into
cooperative agreements with the private sector to develop and transfer technology with
national and international commercial potential. The role of the state and commercial
interests in setting science policy has always been a concern of scientists, even though they
often benefit from national policy objectives and technology transfer to the commercial
sector. Webster discusses three concerns that have been voiced about this trend by
scientists conducting research in public universities:
1. To what extent will commercial interests manipulate the direction and focus of
scientific research?
2. Will the conditions of work and the relationships among scholars change with
increased emphasis upon meeting the needs of the commercial sector?
3. What impact will commercialization have on the free access to and exchange of
information, data, materials, and findings among scientists?
The Public and Science Policy
Controlling science is an exceptionally difficult task; who or what is to be controlled for what
purposes? To what extent should the public be involved in setting the directions and scope
of science? The public's involvement in setting national research priorities requires public
knowledge of not just the content of science, but the institution of science as well. Thus,
knowledge of content is a necessary, but not sufficient condition for deciding wise science
policy (and it is very difficult to educate the public about complex technologies).
The media plays a very important role in shaping public opinion. Traditionally, the media
has portrayed the institution of science as authoritative, objective, unbiased, and so forth.
But with increasing public concern over the risks associated with advanced technology, and
increased attention directed toward the shortcomings of technology, the media has taken a
more critical look at both the content and practice of science. In turn, the public has
become more skeptical of science and technology.
Pressure groups tend to focus on a single technology or scientific theory for the purpose of
challenging the value of science in building a good society. This type of challenge to science
typically takes the form of public debate and confrontations between the citizen groups and
representatives of the scientific community or business leaders with vested interests in a
specific technology.
Sometimes public pressure can have significant effects on the direction and outputs of
science. The movement toward the development and dissemination of appropriate
technology in developing nations, for example, has dramatically affected research and
outreach worldwide.
The alternative science movement attempts to institutionalize alternative approaches to
science to maintain an emphasis on critical evaluation of established research and
development organizations. Feminist and religious organizations, to name two examples,
attempt to redirect approaches taken by scientific institutions in recruiting scholars, setting
research priorities, and developing technology.
Webster concludes that the public, through various forms of advocacy groups, can exert
significant influences on scientific institutions and the content of science, including the
formulation of what is considered to be scientific facts.
Application in Context
Has Iowa "Sold Itself Out" to Corporate Farming?
1. The state of Iowa is making large investments in biotechnology. And the
Republican and Democratic candidates for Governor support increasing
investments in biotechnology. At the same time, the legislature cut funding
from the Leopold Center for Sustainable Agriculture by 86 percent. Has Iowa
"sold itself out" corporate farming?
2. Has Iowa State University "sold itself out" to large, corporate interests?
Policy Recommendations
Webster reminds us of the inherent connections among science, technology, and society. He
points out that science is a human enterprise and thus is influenced by social, political, and
economic interests. He encourages citizens living in a country ruled by democratic processes
to be active--to become aware of and involved in science and technology policy formation.
He urges us to recognize some general principles of science and technology development.
Science can be neither objective nor infallible. It is necessary, therefore, for active citizens
in a democratic society to take the responsibility for the ownership of science and
technology development.
In addition to these recommendations for citizens, Webster suggests five directions for the
social science research on science. Webster suggests that social scientists should pay
greater attention to:
1. the 'political economy' of the scientific laboratory,
2. the organization and culture of private sector research and development,
3. the impact of public interest groups on science and technology,
4. integrating other social sciences into the sociology of science, and
5. building linkages between the sociology of science and public policy makers who
influence the direction of science and technology.
So oft in theologic wars, the disputants, I ween, rail on in utter ignorance of what each
other mean and prate about an elephant not one of them has seen!
John Godfrey Saxe, The Blind Men and the Elephant.
Introduction
The materials presented thus far provide the philosophical basis to begin the applied unit of
Sociology 415. We start by learning seven approaches to risk assessment of technologies.
Knowing different approaches to risk assessment and the strengths and weaknesses of each
helps us understand public responses to technologies and tailor risk communication
messages to fit different types of technologies. Importantly, as responsible change agents,
we need to know for ourselves the limitations of different types of risk assessment. The key
point of this section is that technology should be evaluated simultaneously from multiple
approaches wherein each approach might yield different findings about the wisdom of
adopting a technology.
Social Theories of Risk, edited by Sheldon Krimsky and Dominic Golding, addresses how
individuals and institutions evaluate and communicate to the public about technology risks.
We review the chapter written by Ortwin Renn entitled, "Concepts of Risk: A Classification,"
to learn about approaches to evaluating technology.
Renn classifies different approaches to risk assessment by their answers to three essential
questions:
1. How can we specify or measure uncertainties?
2. What are the undesirable outcomes?
3. What is the underlying concept of reality?
The approaches to risk assessment derived from this classification scheme are organized
into four sections:
Technical Risk Assessment,
Economic Risk Assessment,
Psychological Risk Assessment,
Sociological Risk Assessment.
Three Approaches to Technical Risk Assessment
The Actuarial Approach
Characteristics
Risk is measured as expected value (i.e., arithmetic average) based upon previous
occurrences of undesirable events. Undesirable events are defined as physical harm to
humans or other ecosystems, wherein these events can be observed with sense data (e.g.,
excluding events such as subjective perceptions). The underlying concept of reality is
positivist: that undesirable events are easily recognized, agreed upon by all, and limited to
what is observed.
Assumptions
This approach assumes that sufficient data exists to make meaningful predictions about
future events and that the causal mechanism that underlies the occurrence of previous
undesirable events will remain stable over the prediction period.
Strengths and Limitations
The actuarial approach provides quantification of undesirable events and an indication of
what frequency to expect for future occurrences of these events. It assigns risk without
prejudice because it does not provide an explanation of why undesirable events occur.
Because it does not attempt to identify causal mechanism, however, it gives little guidance
on how to prevent or predict future occurrences of undesirable events. Also, because the
actuarial approach attempts to quantify hazard, it is subject to the types of observation and
measurement errors described in the next sections on critiques of risk assessment.
Examples of Use
The assignment of automobile insurance rates for different segments of the population
provides a good example of the actuarial approach to risk assessment. Younger drivers pay
higher insurance rates than do older drivers because historical evidence shows that younger
drivers have more accidents than do older drivers. This assignment of a higher insurance
rate to John, Jr. is not an expression of prejudice against him; it is simply an assertion that
persons in his age group are more likely to be involved in an automobile accident than
persons in the age group of John, Sr. John, Jr. might be an excellent driver, perhaps even a
better one than is John, Sr. But Jr. pays a higher rate for automobile insurance due to the
expected value for his age group.
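To make the expected-value logic of the actuarial approach concrete, here is a minimal sketch in Python. The age groups, accident counts, and claim cost are hypothetical figures invented for illustration, not real actuarial data.

# Hypothetical claims history: accidents observed per insured driver-year, by age group.
# All figures are illustrative assumptions, not real insurance data.
claims_history = {
    "drivers age 16-25": {"accidents": 1200, "driver_years": 10000},
    "drivers age 26-65": {"accidents": 400, "driver_years": 10000},
}

AVERAGE_CLAIM_COST = 8000  # assumed average cost of one claim, in dollars

for group, record in claims_history.items():
    # Expected value: the arithmetic average of past undesirable events per unit of exposure.
    expected_accidents = record["accidents"] / record["driver_years"]
    # Expected annual loss for a driver in this group, which drives the insurance rate.
    expected_loss = expected_accidents * AVERAGE_CLAIM_COST
    print(f"{group}: expected accidents per year = {expected_accidents:.3f}, "
          f"expected annual loss = ${expected_loss:,.0f}")

The calculation says nothing about why younger drivers have more accidents; it simply projects the historical average forward, which is both the strength and the limitation noted above.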
The Toxicological/Epidemiological Approach
Characteristics
The toxicological/epidemiological approach attempts to identify causal mechanisms in
occurrences of undesirable events. This focus on explaining why negative events occur
requires the application of scientific theory to analysis of previous events. Hence, this
approach represents a considerable advancement over the actuarial approach in its attempt
to explain the occurrence of undesirable events. As with the actuarial approach, undesirable
events are considered to be observable and reality is thought of as positivist in nature.
Assumptions
This approach assumes that the correct theoretical explanation has been applied to the data
on previous undesirable events. It assumes that events can be explained and that future
events will, under conditions specified by a theory, occur in accordance with theoretical
predictions.
Strengths and Limitations
The toxicological/epidemiological approach provides quantification of undesirable events, an
explanation of these events, and therefore a rationale for predicting the occurrence of
undesirable events in the future if theoretical conditions exist. But it depends upon correct
specification of theory. If the theory is free from misspecification, such as spurious or
suppressor relationships, then the risk assessment will be relatively correct, subject to the
errors of observation and measurement described below and in the critiques of risk assessment.
Examples of Use
Modeling events is a common practice in science and modeling undesirable events is one of
the keystones of risk assessment. In quantitative risk assessment, technical experts
attempt to derive expected frequencies of undesirable events based upon experience with
previous events and theoretical expectations of conditions occurring that would lead to an
undesirable event occurring in the future. Thus, a technical expert might conclude
theoretically that the use of Pesticide A will result in one additional case of cancer per ten
million persons, compared with what would occur if Pesticide A were not used.
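A small illustration of the arithmetic behind such a statement, sketched in Python; the exposed population size is an assumption chosen only for the example.

# Excess-risk arithmetic for the hypothetical Pesticide A conclusion above.
excess_lifetime_risk = 1 / 10_000_000   # one additional cancer case per ten million persons
exposed_population = 300_000_000        # assumed number of exposed persons (illustrative)

expected_additional_cases = excess_lifetime_risk * exposed_population
print(f"Expected additional cancer cases: {expected_additional_cases:.0f}")  # prints 30

Thirty additional cases spread across 300 million people is real in the aggregate yet invisible to any individual, which helps explain why such theoretically derived risks are difficult to communicate.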
The Probabilistic Approach
Characteristics
Probabilistic risk assessment is the use of modeling applied to technology systems rather
than to a single event. This approach relies upon the application of logic systems such as
fault-tree or event-tree analyses to arrive at a quantitative assessment of overall system
failure as some function of the probability of the failure for each of the components of the
system. Undesirable events are considered as observable and positivist in nature.
Assumptions
The approach assumes that theories for each individual risk assessment are correct and that
the procedure for combining the probabilities of individual failures to arrive at an overall
assessment of failure is correct.
Strengths and Limitations
The probabilistic approach is useful for quantifying the probability of system failure for
complex technologies. It is difficult to model, however, the probability of common mode
failure (the simultaneous breakdown of more than one system component) and of human-machine interactions. The approach is limited by all the possible errors of observation and
measurement outlined in the critiques of risk assessment. But note that such failures in
precision can occur for each component of the overall system and that each probability for
error is multiplied by the probability for error in subsequent components. Thus, the overall
assessment of failure can be highly inaccurate.
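A toy fault-tree calculation, sketched in Python, shows how component probabilities are combined and how estimation errors compound. The component failure probabilities are invented, and the sketch assumes independent components in series (any single failure brings the system down), which is exactly the assumption that common mode failures violate.

import math

# Hypothetical annual failure probabilities for four independent system components.
component_failure_probs = [0.01, 0.002, 0.03, 0.005]

# The system survives only if every component survives (series logic).
p_system_survives = math.prod(1 - p for p in component_failure_probs)
p_system_fails = 1 - p_system_survives
print(f"Estimated probability of system failure: {p_system_fails:.4f}")

# The same multiplication compounds estimation error: if each component estimate
# is 50 percent too low, the combined estimate shifts substantially.
p_fails_adjusted = 1 - math.prod(1 - 1.5 * p for p in component_failure_probs)
print(f"Failure probability if each estimate is 50% too low: {p_fails_adjusted:.4f}")

Real probabilistic risk assessments use fault-tree and event-tree structures far more elaborate than this, but the basic combination of probabilities, and its sensitivity to every input, is the same.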
Examples of Use
Many technologies consist of a system of individual technologies. In fact, the technologies
people tend to fear the most--nuclear power plants, petrochemical refineries, and food
safety nets--consist of many individual components that might fail, which would result in
total system failure. The risk management approach taken in such cases is to institute
backup systems in the event that one system fails and early warning systems to detect an
impending component or system failure.
Application in Context
What is the technical risk assessment of food irradiation?
Quantitative Risk Assessments
Food irradiation is perhaps the most studied food processing technology, with
over 50 years of research on its effects.
In theory, the dosages of gamma rays applied in food processing should allow
for virtually no chance of survival for microorganisms living in or on the food.
Studies that seem to show adverse health effects from eating irradiated food
have been discounted because of serious flaws in methodology.
The U.S. Food and Drug Administration has determined that irradiated food is
safe to eat.
Critical Thinking
Qualified epidemiologists have critiqued the studies used by the FDA in
approving food irradiation as having serious flaws in methodology. These
persons nevertheless support the technology.
Therefore, note that technical assessments are critiqued as being flawed by
both proponents and opponents of food irradiation.
Is it possible to conduct scientific inquiry that is not flawed? If not, then how
should the consumer interpret conflicting accounts of scientific credibility?
Critiques of Technical Risk Assessment
Ortwin Renn
Understanding that risk is multifaceted explains how societies can be in conflict over
technology adoption and why technology adoption sometimes can take a long time to
achieve. Renn makes these observations about approaches to risk assessment:
1. All approaches have benefits and drawbacks.
2. All approaches are necessary for a complete understanding of risk.
Renn offers these critiques of technical risk assessment:
People have different values and preferences that affect their perceptions of risk.
Human interaction with technological systems is difficult to model quantitatively.
Outside the domain of all forms of technical assessment, for example, is the
probability of technology failure due to human mismanagement or irresponsible
behavior.
The institutional structure designed to manage risk itself might be inadequately
designed or managed to do so.
Technical approaches imply risk management practices in proportion to quantitative
risk assessment. People, however, also desire risk management policies that include
objectives such as fairness, equity, and sense of morality/ethics.
John Adams
Adams (Risk, 1995) points out that risk is not easily measured, agreed upon by diverse
audiences, or managed. In asking, "Can we assess risk better?," he is not so much posing a
problem that has a one best solution as challenging us to become more involved in
understanding and evaluating relationships among science, technology, and society.
Adams notes that sometimes the public and scientific experts differ in their evaluations of
technology risk. This disagreement occurs, in part, because the public uses a wide variety of
criteria, including some nonscientific criteria, in its evaluations of risk. Adams distinguishes
between formal and informal approaches to risk evaluation, wherein formal approaches
emphasize technical assessments of health and safety hazards and informal approaches
address social, political, economic, and ethical issues. He observes that the typical response
of technical risk evaluators to nontechnical evaluations is a patronizing effort to further
educate the public about the real risks associated with a technology.
Adams rejects as a false dichotomy the notion that technical experts know actual risk and
the public harbors uninformed, misinformed, and even irrational perceptions of risk. He
asserts that individual and group risk-taking involve instead a balancing act between social,
political, economic, and ethical costs and benefits. He argues that adherence to the false
dichotomy has led to many misguided attempts to educate the public into thinking correctly
about a new technology. Given that such educational efforts are necessary but not sufficient
motivators of attitudinal and behavioral change (even when scientists do have a good
knowledge of actual hazards), scientific experts experience inevitable failures and
subsequent frustration in their attempts at risk communication.
Adams points out that the actual versus perceived dichotomy is false in two respects:
1. Technical risk assessments are neither entirely objective nor necessarily very precise.
Sometimes no data exists upon which to make a risk assessment. For new
technologies, this problem is common. For complex technological systems, the
problem increases geometrically. That is, there might be no data for a particular
component of the system and there might be no data for the combination of
various components with one another.
Inadequate data, improper recording of data, and data that are difficult to
disaggregate also can create problems in technical risk assessments.
When technical risk assessments are demanded and the data are inadequate
for such assessments, guesses must be made, which allows values and
opinions to enter into presumably objective indicators of risk.
Technical risk assessment is further hampered by accident migration (the
tendency for ignored areas of accident occurrence to experience increased
accidents) and regression towards the mean (the natural ebb and flow of
accidents associated with a certain range of events).
Cultural filtering determines which types of risk will be assessed and the
outcome of the risk assessment. Noise (measurement error in collecting data),
"near misses" (ambiguous data), and bias (misrepresentation of data) also
affect quantitative risk assessment.
Deriving cost/benefit analysis for a technology not yet in use can be especially
difficult. First, to assess expected utility, the user of the technology must be
fully informed of the risks associated with it. The educational requirements for a
complex technology, however, can be extensive. Second, users must be able to
incorporate subjective evaluations into their expectations of utility. As noted,
these evaluations depend upon the social construction of risk, which not only
includes many subjectively defined shared values but also requires some lag time to
fully develop.
2. Technical risk assessments exclude considerations of political, social, and ethical
goals.
Even to the extent that technical risk assessments accurately reflect hazards, because
they ostensibly exclude consideration of political, social, and ethical goals, they
provide only a limited appraisal of the value of a technology. A technology might
contain few hazards but engender much outrage. The abortion of a human fetus, for
example, is a fairly safe technology (for the mother) but raises strong emotional
feelings in American society. On a different note, some argue that risk assessments
intentionally include political and economic considerations. See: Risky Business: How
Scientific Are Science-Based Risk Assessments?
Michael Bell and Diane Mayerfeld
Bell and Mayerfeld (The Rationalization of Risk, 1999: full text article), like Renn and
Adams, note important limitations to technical approaches to risk assessment:
Quantitative risk estimates are precise, but often are not accurate because they rely
upon a whole series of assumptions, guesses, and extrapolations that limit their
accuracy.
Estimated risks often do not account for multiple hazards that occur in conjunction
with one another in complex technological systems. For example, we might estimate
the risk of pesticides A and B, but often we do not estimate the risk of pesticide A in
combination with pesticide B.
Numbers often carry disproportionate weight in technical assessments of risk.
Technical risk assessments often falsely homogenize populations. That is, the risk for
a child might be different than the risk for an adult.
Judith Bradbury
Bradbury (Science, Technology, and Human Values 14: 380-389, 1989) notes that all forms
of risk assessment have limitations. Therefore, risk communication strategies that rely too
closely upon a single risk assessment framework will not be as effective as they could or
should be.
Technical approaches define risk as the product of the probability and consequences of an
adverse event. Assessments of probability and consequences are made by technical experts
as part of quantitative risk assessments. From the perspective of the technical approach,
risk can be evaluated independently of political, economic, or social conditions. Thus, risk
resides primarily in the technology and its relationship to foreseeable consequences.
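Written as a calculation, the technical definition Bradbury describes is an expected loss summed over foreseeable adverse events. The following Python sketch uses hypothetical events and figures, chosen only to show the form of the computation.

# Risk = probability x consequences, summed over foreseeable adverse events.
# All events and numbers are hypothetical.
adverse_events = [
    {"name": "minor spill",          "annual_probability": 0.05,   "consequence_cost": 10_000},
    {"name": "worker injury",        "annual_probability": 0.01,   "consequence_cost": 250_000},
    {"name": "catastrophic failure", "annual_probability": 0.0001, "consequence_cost": 50_000_000},
]

technical_risk = sum(e["annual_probability"] * e["consequence_cost"] for e in adverse_events)
print(f"Expected annual loss: ${technical_risk:,.0f}")

# Note what the single number omits: who bears the loss, who chose how to measure it,
# and whether the outcome raises ethical concerns--the questions Bradbury raises below.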
The technical approach implies communication strategies that educate the public about
technical risk assessments. Technical assessments are considered to represent actual risks.
When these risks are deemed by regulatory agencies to be minimal, then, if possible, they
are not conveyed to the public to avoid unnecessary concern. When risk assessments
become public and consumer perceptions do not coincide with actual risk, then, from the
technical perspective, acceptance of a new technology can be unnecessarily delayed or
implementation can become more expensive than necessary. Thus, public rejection of the
logic of technical risk assessments is considered to be irrational. Risk communication
strategies thereby focus upon educating an ignorant and sometimes irrational public
about actual risk. Strategies seek to reduce outrage based upon inaccurate perceptions so
as to retain a focus on actual risk.
Bradbury asserts that this approach ignores the economic, political, and cultural dimensions
of risk assessment and management. The public is not necessarily ignorant of technical risk
assessments nor are they being irrational in expressing skepticism about a technology when
technical assessments show it to be a minimal risk. Rather, the public evaluates technology
on a broader set of criteria than is considered in technical risk assessments. Bradbury asks,
for example, "Who bears the burden of responsibility for defined risk?" and " Who decides
how risk will be evaluated?" She argues that because technical assessments ignore
economic, political, and cultural issues, risk communication strategies that focus upon
education about technical facts are necessary but insufficient to sway public opinion.
Risk Perceptions and Risk Management
Managing risk, then, is a key motivator of much professional practice. But the objective of
managing risk must be as free from misplaced concreteness as possible to avoid polemics.
Myths about nature, or cultural outlooks, affect all risk evaluations. These myths--also
called paradigms, ideologies, belief systems--are the set of assumptions about reality
formed through shared experience, supported by interactions with others, and routinely left
unquestioned. They are culturally constructed and maintained.
What happens when observations about reality do not correspond with our assumptions
about it? Certainly, we should avoid being too hasty to revise paradigms; they must have
enjoyed a great deal of support at some point in time to have gained their standing. Yet, to
cling too long to paradigms with many anomalies is to engage in the fallacy of misplaced
concreteness: to believe in the paradigm in spite of overwhelming evidence that refutes it.
But paradigms carry much emotional baggage. Revising them or exchanging them for
radically different ones requires not only much scientific debate, but much soul-searching as
well. Thus, evaluations of risk, if they place pressure on paradigms, which they sometimes
do, will instigate debates about paradigms that reflect cultural outlooks. Hence, debates
about high risk technology often entail emotionally charged debate that reflects cultural
outlook.
Can we manage risk better? Adams suggests keeping in mind the following observations on
the evaluation of risk by technical experts:
1. Remember, everyone else is seeking to manage risk, too.
2. They are all guessing; if they knew for certain, they would not be dealing with risk.
3. Their guesses are strongly influenced by their beliefs.
4. Their behavior is strongly influenced by their guesses and tends to reinforce their
beliefs.
5. It is the behavior of others, and the behavior of nature, that constitute the actual risk
environment.
6. It will never be possible to capture "objective risk."
In everything one thing is impossible: rationality.
Friedrich Nietzsche
Introduction
It seems like such a simple thing, does it not? Rationality. We all have it, all the time, in all
settings (well, mostly anyway). Or do we? Can we be irrational or perhaps nonrational? Or
perhaps can we be rational and irrational and nonrational all at the same time? Here, we
explore the concept of rationality in its various manifestations. Our purpose in doing so is to
better understand why people judge technologies as good or bad, safe or unsafe, and to
understand why two seemingly "rational" persons can view the same technology so
differently.
Given its importance for understanding human behavior, it is not surprising to learn that
much has been written about rationality. We limit our reading mostly to a single source, the
writings of the sociologist Max Weber. We will supplement Weber's writings with a brief
review of the literature regarding rational choice theories as they are conceived in the social
sciences (particularly in the disciplines of sociology and economics). The majority of the text
here is borrowed from a paper my co-authors and I published in the journal Science
Communication.
Rationality
Max Weber's (1968[1921]) treatises on rationality distinguish between zweckrational
motivations (i.e., ones aimed at attaining "rationally pursued and calculated ends" p. 24-26)
and wertrational motivations (i.e., ones guided by a "conscious belief in [a] value
[orientation] for its own sake" p. 24). To Weber, wertrational expressions have importance
to the actor "independently of [their] prospects of success" (Weber, 1968[1921]: 24).
Wertrational expressions thereby are neither irrational nor nonrational but instead are
"induced by immanent or transcendental (as opposed to instrumental) values and thus by
intrinsic rather than extrinsic motivation" (Zafirovski, 2005). In distinguishing between
zweckrational (i.e., formal) and wertrational (i.e., substantive) motivations Weber posed a
complementary ontological position to that offered in other social sciences, particularly to
that offered in economics as the foundation of rational choice theory (e.g., Boudon, 1981;
Coleman, 1990; Elster, 1989; Hechter and Kanazawa, 1997; Sen, 1977; Simon, 1982).
Rational choice theory posits that individuals in their decision making seek to maximize their
benefits and minimize their costs related to some form of goal-attainment (e.g., Sen,
1977). "Thin" forms of rational choice assume that economic benefits define the first-order
ends and motives for goal-attainment whereas "thick" forms of rational choice broaden the
definition of potential ends to include non-economic goals, such as prestige, power,
influence, and the like, including expressions of value orientations (Elster, 1989; Hechter
and Kanazawa, 1997). In either its thin or thick versions the essential element of rational
choice is that behavior in some manner is oriented toward achieving an extrinsically
identifiable goal, whether this goal be defined by the maximization of economic or
non-economic utility (Boudon, 1981; Sen, 1977).
As might be expected the teleological determination inherent within thick rational choice
theory, wherein it is assumed that the actor is pursuing "some goal," in addition to its
assumptions about full knowledge, intentionality, and transitivity have engendered debate
within the social sciences regarding its falsifiability and eventual pragmatic usefulness (e.g.,
Ackerman, 1997; Sen, 1977; Smelser, 1992). We pass on this debate to explore instead the
distinction drawn by Weber between rational behavior oriented toward extrinsically defined
goals, whether these goals be economic or non-economic ones (i.e., formal rationality), and
rational behavior reflecting intrinsic self-expressions of values that present merits to the
actor in and of themselves, ones that have importance to the actor independently of their
prospects of success (i.e., substantive rationality). Weber's typology has proved valuable to
social scientists in understanding human motivation and in informing approaches to
incorporating such motivations within public policy formation (Ritzer, 2010). In considering
sound public policy formation, for example, policy makers might wonder about the extent to
which public opinions offered only for the sake of expressing value-based orientations (i.e.,
an expression of substantive rationality) should be incorporated into policy deliberations.
Hence, from an applied perspective empirically validating this distinction might highlight the
significance of such expressions as a mechanism by which people support experts'
recommendations (e.g., Kahan, et al., 2006; Kahan and Slovic, 2006; Sunstein, 2005,
2006). And conceptually, such an evaluation would support Weber's typological distinction
between zweckrational and wertrational motivations.
Consider a situation wherein a person believes that they share the same values as another
person and then rates this person on their competency (i.e., their skills and ability to
accomplish a task) and their fiduciary responsibility (i.e., their willingness to do the right
thing in accomplishing a task). The hypothesized relationship between perceived shared
values and evaluations of fiduciary responsibility seems to conform with the description of
formal rationality. That is, it seems like a logical means-ends relationship to assume that a
perception of shared values with an agent would influence an actor to believe that the agent
will "do the right thing." On the other hand, the hypothesized relationship between
perceived shared values and evaluations of competence does not seem to meet the
requirements of formal rationality because a sense of shared values seems to provide
neither a necessary nor sufficient condition to infer competence. For example, Stephanie
might believe that Natalie shares her same values but lacks the competency to
successfully carry out her task-related responsibilities. Conversely, Stephanie might believe
that Natalie does not share her same values but recognizes that Natalie nevertheless is
highly competent at carrying out her task-related responsibilities. The specified relationship
between shared values and competency, therefore, might reflect some type of rationality,
but inasmuch as it does not represent a logical connection between means and ends, cannot
in itself represent formal rationality. Nor is it likely that such an expression would be deemed
irrational, at least within the scope of contemporary social science definitions of rationality
(e.g., Zafirovski, 2005).
The question becomes, then, does this seemingly nonrational expression of values represent
an extension of rational choice in that it is directed toward goal-attainment (i.e., thick
rational choice) or does it reflect an expression of values in themselves (i.e., substantive
rationality)? Thick forms of rational choice theory might consider this relationship as an
expression of formal rationality in that it reflects attributed competence for the prospects of
attaining an extrinsically identifiable goal. For example, although Stephanie might believe
that Natalie does not have the competence necessary to successfully complete a task she
nevertheless might attribute competence to her as a means of encouraging her to attain an
extrinsically identifiable goal (i.e., "I desire for Natalie to be successful"). Alternatively, this
expression might reflect substantive rationality in that it represents an assertion of values in
themselves without regard for the prospects of success. For example, although Stephanie
might believe that Natalie does not have the competence necessary to successfully
complete a task she nevertheless might attribute competence to her as a means of
expressing her value-orientation regardless of whether the task is successfully completed
(i.e., "I wish to express my support for Natalie's values").
The results of empirical inquiry regarding consumers' perceptions of the U.S. food system
found evidence that consumers' evaluations reflected elements of both substantive and
formal rationality. The significance of this finding can be illustrated by considering it within
the context of contemporary civic discourse regarding the extent to which public values
should be incorporated within social policy formation. We explore this line of inquiry
specifically with regard to consumer perspectives of U.S. agricultural production goals.
Much of contemporary debate about the extent to which the public should be involved in
technology policy formation centers upon issues in agricultural production (e.g., Sunstein,
2005, 2006), wherein U.S. consumers express their opinions about agricultural practices
through their adoption or rejection of new technologies (e.g., Israel and Hoban, 1992; Sapp
and Korsching, 2004) and their advocacy for legislative changes in agricultural policies
(e.g., Auld, 1990; Humane Society of the United States, 2011; Lovvorn and Perry, 2009;
Lulka, 2011; Sierra Club 2011). Within the past decade especially consumers have
successfully pursued legislative solutions, particularly with regard to addressing
environmental and animal welfare issues, typically lobbying for what experts consider to be
less rational technologies (e.g., Croney, 2010). As expected, however, citizens' pursuit of
legislative solutions raises questions about the extent to which social policy should be
determined by people whose opinions are relatively uninformed and sometimes reflect
ineffective use of decision-making heuristics (see especially reviews by Dietz and Stern
(2008) and Renn (2008)). Sunstein (2005), for example, in assessing public responses to
genetically modified food products, proposes that because people's opinions always will be
influenced by a lack of full knowledge and the use of heuristics that often mislead them
regarding actual risks (i.e., as determined by expert risk analysis), a deliberative society is
best served by having expert panels listen to citizens' value expressions but develop social
policies that conform with scientific findings. He thereby argues that expert panels should
be the principal agents in forming social policy, policy that is guided by values but not by
blunders.
Of course, ample evidence exists that the enterprise of science itself is guided by
politics, economics, religious beliefs, culture, and the like (e.g., Freudenburg, 1988;
Shrader-Frechette, 1991). And sound scientific evidence might be superseded by the
implementation of politically expedient social policies. For example, whereas civic leaders in
the European Union might recognize that restrictions placed upon imports of genetically
modified food arising from consumer-voiced safety concerns are unsupported by scientific
evidence, they might nevertheless impose such restrictions to appease concerned consumers
(e.g., Carter and Gruere, 2003). As a further potential limitation of the expert panel
approach, one wonders about the extent to which a skeptical public will accept the idea of
social policy being influenced for the most part by panels composed of persons whom they
mistrust (Croney, 2010). Fischhoff (1995), for example, notes that approaches that ignore
public opinion, present just the facts, interpret the facts for the public, or rely upon other
strategies that represent a one-way communication from risk experts to the public have in
the past failed both to adequately inform public opinion and to achieve public support.
Therefore, within a democratic society it might be unrealistic to presume that citizens will
delegate social policy decision-making to expert panels. Nevertheless, Sunstein (2005)
echoes the sentiments of others (e.g., Goklany, 2001; Graham, 2004; Hanekamp, 2006;
Powell, 2010) in advocating for a greater influence of science-based expert opinion when
designing social policies.
In contrast to this position, Kahan and colleagues (e.g., Cultural Cognition Project, 2011)
contend that greater attention should be paid to the value-based expressions of citizens.
They assert that the key determinants of trust in and support for social policies are the
value-orientations that citizens use to inform their policy preferences rather than the
heuristics they use to interpret risks. Relying upon the grid-group typology advanced by
Douglas and Wildavsky (1982), Kahan and colleagues propose that effective and acceptable
social policies can be achieved when they elicit a sense of shared values among
persons/groups with differing value-orientations. It should be noted that value-orientations
do not necessarily predict social policy preferences. For example, a person with an
egalitarian value-orientation and a global vision might support the agricultural production
goal to "grow enough food to help feed the world." A person with an egalitarian valueorientation and a vision limited to "America first," on the other hand, might oppose this
goal. Thus, as Sunstein (2006) notes, the cultural cognition approach might simply
represent one form of bounded rationality. Nevertheless, the findings of Kahan and
associates support the claim that policies that give people a sense of shared values can be
effective in gaining consensus support. Recent legislative action regarding egg production,
for example, represents in part a reconciliation among institutional representatives who
express differing value-orientations (e.g., Humane Society of the United States, 2011).
It should be recognized that some persons, even if they hold strong opinions, do not
necessarily wish an audience for them among policy makers. And to the extent that one
believes that experts share their values, then one might feel comfortable with policy being
guided by these experts. Yet it is not uncommon, especially within the arena of agricultural
production policy formation, for expert opinions to differ substantially from those of a public
that has only their value-orientations to offer as justification for policy alternatives. For
example, whereas experts in a recent policy debate strongly agreed that the current size of
hen cages was scientifically justified, U.S. consumers successfully voiced their value-oriented opinions that cages needed to be larger (Humane Society of the United States,
2011). In this sense, although public opinions might be relatively uninformed, they
nevertheless are important to policy formation and not necessarily irrational in content. To
the extent that citizens wish to be heard, therefore, the question for a democratic society
becomes: to what extent should policy makers heed the advice of relatively uninformed
citizens in lieu of the advice offered by experts?
Our findings can be interpreted to support either of the two perspectives described above.
To the extent that the findings here demonstrate an intrinsic importance to value expression
they support social policy formation that facilitates reconciliations among groups/people
with diverse value-orientations. This approach, however, might be frustrated if citizens are
expressing their values with no desire to attain an externally identifiable goal, as would be
the case when these expressions represent instances of substantive rationality. In this case,
pursuing such approaches will be ineffective to the extent that values are entrenched.
Alternatively, in indicating that citizens to some extent express their values with no
extrinsically identifiable rationale in mind, the findings here support social policy formation
that accentuates decisions made by expert panels. After all, if citizens express their values
with no more rationale than their desire to do so, then it seems prudent for a nation to
provide expert guidance to social policy formation. However, along these same lines the
findings indicate that expert panel approaches, ones that listen to but overrule citizens'
value-orientations, might be difficult to implement because people strongly desire value expression. Also, implementing such approaches might foment distrust in societal
institutions to the extent that they are perceived as being insensitive to citizens' values.
In conclusion, finding that consumers seemingly use both substantive and formal rationality
to evaluate the U.S. food system presents challenges to gauging the extent to which
consumer opinions should be incorporated into social policy formation. If indeed it is
unrealistic to implement expert panel approaches to social policy formation then the most
successful approach to achieving acceptable U.S. agricultural production policies, even if in
some cases this approach might be either inefficient or ineffective, will involve actions taken
by agricultural producers, agribusiness firms, and agriculture-related government
institutions to develop goals and pursue agricultural production practices that are based
upon mutual understandings and shared values with consumers. With these considerations
in mind, we believe our results indicate that developing both well-reasoned and publicly
acceptable technology-related policy will require greater rather than less emphasis upon
furthering effective science communication theory and practice.
References
Ackerman, F. 1997. "Consumed in Theory: Alternative Perspectives on the Economics
of Consumption." Journal of Economic Issues 31: 651-664.
Auld, M.E. 1990. "Food Risk Communication: Lessons from the Alar Controversy."
Health Education Research 5: 535-543.
Boudon, R. 1981. The Logic of Social Action. London: Routledge & Kegan Paul.
Carter, C.A. and G.P. Gruere. 2003. "International Approaches to the Labeling of
Genetically Modified Food." Choices 18: 1-4.
Coleman, J.S. 1990. Foundations of Social Theory. Cambridge: Harvard University
Press.
Croney, C.C. 2010. Should Animal Welfare Policy Be Influenced By Consumers'
Perceptions? White Paper, Animal Behavior and Bioethics, The Ohio State University.
Cultural Cognition Project. 2011. http://www.culturalcognition.net/ (accessed,
December, 2011).
Dietz, T. and P. C. Stern. 2008. Public Participation in Environmental Assessment and
Decision Making. Washington, DC: National Research Council.
Douglas, M.T. and A.B. Wildavsky. 1982. Risk and Culture: An Essay on the Selection
of Technical and Environmental Dangers. Berkeley, CA: University of California Press.
Elster, J. 1989. Nuts and Bolts for the Social Sciences. Cambridge, UK: Cambridge
University Press.
Fischhoff, B. 1995. "Risk Perception and Communication Unplugged: Twenty Years of
Progress." Risk Analysis: 15:137-145.
Freudenburg, W.R. 1988. "Perceived Risk, Real Risk: Social Science and the Art of
Probabilistic Risk Assessment." Science 242:44-49.
Gollier, C. 2004. The Economics of Risk and Time. Boston: The MIT Press.
Goklany, I.M. 2001. The Precautionary Principle: A Critical Appraisal of Environmental
Risk Assessment. Washington, DC: Cato Institute.
Graham, J.D. 2004. The Perils of the Precautionary Principle: Lessons from the
American and European Experience. Washington, DC: Heritage Foundation Report
#118.
Hanekamp, J.C. 2006. "The Precautionary Principle: A Critique in the Context of the
EU Food Supplements Directive." Environmental Liability 2: 43-51.
Hechter, M. and S. Kanazawa. 1997. "Sociological Rational Choice Theory." Annual
Review of Sociology 23: 191-214.
Humane Society of the United States. 2011. Farm Animal Protection: Web Page:
http://www.humanesociety.org/issues/ (accessed, December, 2011).
Israel, G.D. and T.J. Hoban. 1992. "Concern About Eating Genetically Engineered
Food." Southern Rural Sociology 9: 23-43.
Kahan, D.M., P. Slovic, D. Braman, and J. Gastil. 2006. "Fear of Democracy: A
Cultural Evaluation of Sunstein on Risk." Harvard Law Review 119: 1071-1109.
Kahan, D.M. and P. Slovic. 2006. "Cultural Evaluations of Risk: 'Values' or 'Blunders'?"
Harvard Law Review 119: 166-172.
Lovvorn, J.R. and N.V. Perry. 2009. "California Proposition 2: A Watershed Moment
for Animal Law." Journal of Animal Law 15: 149-169.
Luhmann, N. 1979. Trust and Power. New York: Wiley.
Lulka, D. 2011. "California's Proposition 2: Science, Ethics, and the Boundaries of
Authority in Agriculture." Agriculture, Food, and the Environment 33: 29-44.
Powell, R. 2010. "What's the Harm: An Evolutionary Theoretical Critique of the
Precautionary Principle." Kennedy Institute of Ethics Journal 20: 181-206.
Renn, O. 2008. Risk Governance: Coping with Uncertainty in a Complex World.
London, England: Earthscan.
Ritzer, G. 2010. Sociological Theory: Eighth Edition. New York, NY: McGraw-Hill.
Sapp, S.G. and P.F. Korsching. 2004. "The Social Fabric and Innovation Diffusion:
Symbolic Adoption of Food Irradiation." Rural Sociology 69:347-369.
Sen, A. 1977. "Rational Fools: A Critique of the Behavioral Foundations of Economic
Theory." Philosophy and Public Affairs 6: 317-344.
Shrader-Frechette, K.S. 1991. Risk and Rationality: Philosophical Foundations for
Populist Reforms. Berkeley, CA: University of California Press.
Sierra Club. 2011. Sierra Club News Releases: Web Page: http://action.sierraclub.org
/site/PageNavigator/E-Newsletters/Pressroom (accessed, December, 2011).
Simon, H. 1982. Models of Bounded Rationality. Cambridge, MA: MIT Press.
Slovic, P. 2000. The Perception of Risk. London, England: Earthscan.
Smelser, N. 1992. "The Rational Choice Perspective." Rationality and Society 4:
381-410.
Starr, C. 1969. "Social Benefit Versus Technological Risk." Science 165: 1232.
Sunstein, C.R. 2005. Laws of Fear: Beyond the Precautionary Principle. New York:
Cambridge University Press.
Sunstein, C.R. 2006. "Misfearing: A Reply." Harvard Law Review 119: 1110-1125.
Weber, M. 1968[1921]. Economy and Society, Volume 1. Edited by G. Roth and C.
Wittich. Berkeley, CA: University of California Press.
Zafirovski, M. 2005. "Is Sociology the Science of the Irrational? Conceptions of
Rationality in Sociological Theory." The American Sociologist 36: 85-110.
Be wary of the man who urges an action in which he himself incurs no risk.
Joaquin Setanti
Introduction
Technical approaches assume that risk is tied to the technology, that it is associated with
the probability of failure of the technology itself. In this sense, technical approaches ask,
"How safe is the technology?" Social science approaches view risk as perceptions held about
the technology. These approaches note that the human enterprises using technology are
intrinsically flawed. Knowing that failure will somehow occur, the question thus becomes, "Is
the technology safe enough?"
The first social science approach we discuss is the economic one. The economic approach
seeks to quantify the costs and benefits of a technology in monetary terms.
The Economic Approach
Characteristics
Risk is measured as expected utility and undesirable events are defined as instances where
the costs of a technology outweigh its benefits. Expected utility is the estimated value of the
technology, taking into account both the benefits and the costs of its use. From the economic
perspective, if estimated benefits outweigh the estimated costs, then the technology has a
favorable overall risk assessment. This approach marks a significant departure from
technical approaches in that the maximization of satisfaction rather than reduction of
physical harm is the desired outcome. In the move from technical to social science risk
assessment the central question shifts from, "Is the technology safe?" to "Is the technology
safe enough?"
This excerpt from Science for All Americans (F. James Rutherford and Andrew Ahlgren, 1990.
American Association for the Advancement of Science) describes the key elements of
cost-benefit analysis:
Rarely are technology-related issues simple and one-sided. Relevant technical facts
alone, even when known and available (which often they are not), usually do not
settle matters entirely in favor of one side or the other. The chances of reaching good
personal or collective decisions about technology depend on having information that
neither enthusiasts nor skeptics are always ready to volunteer. The long-term
interests of society are best served, therefore, by having processes for ensuring that
key questions concerning proposals to curtail or introduce technology are raised and
that as much relevant knowledge as possible is brought to bear on them. Considering
these questions does not ensure that the best decision will always be made, but the
failure to raise key questions will almost certainly result in poor decisions. The key
questions concerning any proposed new technology should include the following:
1. What are alternative ways to accomplish the same ends?
2. What advantages and disadvantages are there to the alternatives?
3. What trade-offs would be necessary between positive and negative side effects
of each?
4. Who are the main beneficiaries?
5. Who will receive few or no benefits?
6. Who will suffer as a result of the proposed new technology?
7. How long will the benefits last?
8. Will the technology have other applications? Whom will they benefit?
9. What will the proposed new technology cost to build and operate? How does
that compare to the cost of alternatives?
10. Will people other than the beneficiaries have to bear the costs?
11. Who should underwrite the development costs of a proposed new technology?
12. How will the costs change over time?
13. What will the social costs be?
14. What risks are associated with the proposed new technology?
15. What risks are associated with not using it?
16. Who will be in greatest danger?
17. What risk will the technology present to other species of life and to the
environment?
18. In the worst possible case, what trouble could it cause? Who would be held
responsible? How could the trouble be undone or limited?
19. What people, materials, tools, knowledge, and know-how will be needed to
build, install, and operate the proposed new technology? Are they available? If
not, how will they be obtained, and from where?
20. What energy sources will be needed for construction or manufacture, and also
for operation?
21. What resources will be needed to maintain, update, and repair the new
technology?
22. What will be done to dispose safely of the new technology's waste materials?
23. As it becomes obsolete or worn out, how will the technology be replaced?
24. What will become of the material of which it was made and the people whose
jobs depended on it?
Individual citizens may seldom be in a position to ask or demand answers for these
questions on a public level, but their knowledge of the relevance and importance of
answers increases the attention given to the questions by private enterprise, interest
groups, and public officials. Furthermore, individuals may ask the same questions with
regard to their own use of technology (e.g., their own use of efficient household
appliances, of substances that contribute to pollution, of foods and fabrics). The
cumulative effect of individual decisions can have as great an impact on the large
scale use of technology as pressure on public decisions can.
Not all such questions can be answered readily. Most technological decisions have to
be made on the basis of incomplete information, and political factors are likely to
have as much influence as technical ones, and sometimes more. But scientists,
mathematicians, and engineers have a special role in looking as far ahead and as far
afield as is practical to estimate benefits, side effects, and risks. They can also assist
by designing adequate detection devices and monitoring techniques, and by setting
up procedures for the collection and statistical analysis of relevant data.
Assumptions
The economic approach assumes that costs and benefits can be accurately estimated and
agreed upon by all. It assumes that costs and benefits can be measured in economic terms
using a common denominator (i.e., money). Also, it assumes that all potential costs and
benefits have been anticipated and integrated within the cost/benefit analysis. The
economic approach assumes a rational actor; that is, someone who acts entirely upon
estimated utilitarian costs and benefits and who has full knowledge of these costs and
benefits.
Strengths and Limitations
Evaluating risk as a function of costs and benefits provides a universally understood basis
for technology assessment. But realizing the potential usefulness of the economic approach
entails overcoming some important limitations:
1. Cost-benefit analysis relies upon estimated costs and benefits, which are subject to
the types of observation and measurement errors of technical risk assessments.
2. Costs and benefits do not necessarily accrue at the same time. Benefits might lag
behind costs or costs might lag behind benefits.
3. It is difficult to place issues of social welfare and perceptions of equity and fairness in
terms amenable to cost/benefit analysis.
4. The assumption of a rational actor rarely holds true in practice. People take into
account other issues besides utilitarian ones in making decisions about complex and
controversial technologies.
5. The reliance upon utilitarian contractual exchange poses potential ethical problems for
a society. Utilitarian contractual exchange would bring about a situation where the
poorest communities would bargain for the most risk (i.e., exchanging risk for cash),
meaning that the poorest members of society would bear the most cost for potentially
hazardous technologies.
6. It can be difficult to include in cost-benefit analysis all favorable and unfavorable
externalities (i.e., costs and benefits not immediately recognized in the initial
evaluation of a technology).
Examples of Use
Cost-benefit analysis provides a valuable tool for understanding what is exchanged for
technology risk. By weighing potential benefits against estimated risks, the public is
provided with the information needed for rational decision making regarding the level of risk
they are willing to tolerate in exchange for potential payoffs from a technology.
This brief paper published by the Health and Safety Executive provides a template for
cost-benefit analysis of safety measures: Cost Benefit Analysis Checklist.
Application in Context
What is the cost-benefit analysis of food irradiation?
This is the most recent information available on the web regarding retail sales of
irradiated food:
1. On May 16, 2000, Huisken Meats (now, Branding Iron), based in Minneapolis,
MN, began the first commercial market testing of irradiated food (i.e., frozen
beef patties). This initial rollout to 84 groceries quickly expanded to hundreds
of stores in Minnesota and other states.
2. On July 31, 2000, Hawaii Pride began shipping irradiated fruit products to the
mainland United States.
3. In May 1999, the SureBeam Corporation, a division of Titan Corporation, opened its
first irradiation facility. One year later, in cooperation with Huisken Meats, it was
shipping irradiated ground beef patties to thousands of groceries in 32 states.
By January of 2004, however, SureBeam was out of business due to poor
demand for irradiated food (see related article).
All technology should be assumed guilty until proven innocent.
David Brower
Introduction
Technical and economic approaches assume that risk assessments represent rational
responses to the objective facts of the technology. As noted earlier, these assumptions
cannot be met in practice. The approaches nevertheless assume that assessments are as
rational as can be achieved. They rely upon either expected values or utility. The
psychological approach notes that subjective, or emotional, elements affect risk
assessments. That is, some aspects of technology seem more threatening than others,
depending upon human responses to them.
The Psychological Approach
Characteristics
Risk is measured as subjective utility. Subjective utility takes into account not only
assessments of technical hazard (i.e., estimated physical harm), but also outrage (i.e.,
emotional reactions to estimates of technical hazard). The approach focuses on personal
preferences rather than "objectively defined" probabilities and attempts not to define hazard
but to explain why people are willing to accept some risks and not others. More detail on
outrage factors is outlined in a later section of Soc 415.
Assumptions
The psychological approach assumes that subjective utilities are adequately recognized by
the actor and that these assessments are rationally applied to intentions and behavior.
Hence, it assumes that behavior follows from perceptions, even if perceptions are not
necessarily based upon technical assessments of hazard but reflect instead emotional
reactions to estimated hazard.
Strengths and Limitations
This approach recognizes that emotions guide risk assessments as much as do rational
decisions about probability of harm and the balance of utilitarian costs and benefits. It
brings people and their emotions into the risk assessment process. It is difficult, however,
to translate emotional reactions into public risk policy. Should public policy be altered
because people fear a technology that technical experts deem to be low risk?
The theoretical model depicted in this diagram provides an example of the psychological
approach to understanding public responses to technology.
Examples of Use
Knowing public outrage related to a technology can help in the design of risk communication
strategies and message content. Psychological research has identified twelve key attributes
of technologies that affect emotional responses to them:
1. Voluntary/Coerced. Risks we take upon ourselves create less outrage than those
forced upon us.
2. Natural/Industrial. Natural risks are viewed with less emotional response than risks
created by human actions.
3. Familiar/Unfamiliar. Things familiar are considered less risky than the unfamiliar.
4. Memorable/Not Memorable. Linking technologies to highly memorable tragedies
makes them seem more risky.
5. Not Dreaded/Dreaded. Linking technologies to dreaded events (e.g., cancer) makes
them seem more risky.
6. Chronic/Catastrophic. Risks we face every day create less outrage than the
catastrophic event.
7. Knowable/Unknowable. People tend to fear the unknown. Opponents of a new
technology can always use this outrage factor to their advantage because, de facto,
using new technologies involves uncertainties.
8. Control/Not in Control. We feel safer when we have the ability to regulate the use of a
technology.
9. Fair/Unfair. People will become more outraged about a technology if they think they
must bear more costs or fewer benefits than do others.
10. Morally Irrelevant/Relevant. Linking the use of a technology with immoral motives
creates outrage. Linking it with moral standards lessens outrage.
11. Trustworthy/Untrustworthy. Trust in the experts who develop or endorse a new
technology might be the most important factor influencing outrage.
12. Responsive/Unresponsive. Outrage is reduced when persons/organizations responsible
for the development or regulation of a new technology seem responsive to public
concerns.
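To make the idea concrete, the Python sketch below scores a hypothetical technology on the twelve attributes (0 = low-outrage pole, 1 = high-outrage pole) and averages them into a crude outrage index. The attribute labels follow the list above, but the scores, weights, and function names are invented for illustration and do not come from any published instrument.

# Crude outrage index for a technology: each attribute is scored from 0.0
# (low-outrage pole, e.g., voluntary, natural, familiar) to 1.0
# (high-outrage pole, e.g., coerced, industrial, unfamiliar).
# The scores below are hypothetical, chosen only to illustrate the idea.

def outrage_index(scores):
    """Average of the attribute scores; higher suggests more public outrage."""
    return sum(scores.values()) / len(scores)

food_irradiation_scores = {
    "voluntary/coerced": 0.6,          # consumers feel little choice at retail
    "natural/industrial": 0.9,         # clearly an industrial process
    "familiar/unfamiliar": 0.8,        # unfamiliar to most shoppers
    "memorable/not memorable": 0.5,
    "not dreaded/dreaded": 0.7,        # associated with radiation
    "chronic/catastrophic": 0.3,
    "knowable/unknowable": 0.6,
    "control/not in control": 0.7,
    "fair/unfair": 0.4,
    "morally irrelevant/relevant": 0.3,
    "trustworthy/untrustworthy": 0.4,
    "responsive/unresponsive": 0.4,
}

print(f"Outrage index: {outrage_index(food_irradiation_scores):.2f}")  # 0.55

A change agent would focus communication efforts on the highest-scoring attributes, which is the point made in the paragraph that follows.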
Knowing the key determinants of public outrage related to a technology can help change
agents design risk communication strategies and message content. In the section on Risk
Communication we will learn many important guidelines for communicating to an outraged
public about complex and controversial technology.
Application in Context
What are psychological perspectives on food irradiation?
Opinion polls generally show that most consumers are concerned about food safety
and consider irradiation to be a safe process. Few opinion polls, however, inform
respondents of the concerns raised by its opponents. From the psychological
perspective, one would assume that hearing negative information would yield
unfavorable opinions of food irradiation. And the results of polls that include
statements from opponents show this shift toward negative evaluations of food
irradiation.
Example of the Psychological Approach
Paul Slovic. 1999. "Trust, Emotion, Sex, Politics, and Science: Surveying the Risk-Assessment
Battlefield." Risk Analysis 19(4). Full text article.
Slovic asserts that risk management has become increasingly politicized and contentious.
He expresses concerns that controversy and conflict might have become too pervasive. It
might be that the quality of society erodes with overly contentious public discourse about
technology policy.
The irony, he states, is that at the same time our nation has expended considerable
resources to make life safer, many persons have become more, not less, concerned about
risk. To understand this phenomenon, Slovic describes the nature of risk assessment and its
relationship to public perceptions. He distinguishes between hazard--technical assessment
of potential physical harm--and risk--socially constructed perceptions of risk. [Earlier, we
said that Risk = Hazard (technical assessment) + Outrage (emotional assessment). Here,
Slovic is saying that Risk is the socially constructed sum of hazard and public perceptions.
Thus the two perspectives are very similar.] Slovic states that assessments of danger, both
by technicians and the public, are influenced by political, economic, cultural, and other
social factors. Importantly, it is definitions of risk that affect risk policy--defining risk is an
exercise in power.
Thus, risk controversies are not about science versus misguided public perceptions of
science, wherein the unwashed public needs to be educated about "real" risks. Rather, risk
controversies are struggles over who will define risk. The public cannot be labeled as
irrational because their judgments about risk are influenced by emotion. The viewpoints of
scientists also are influenced by emotion, politics, economics, and so forth. Technology
policy discourse, therefore, is not about who is correct about assessment of danger, but
whose assumptions about political, social, and economic conditions win out in the risk
assessment battlefield. Thus, danger is real, but risk is socially constructed. Scientific
literacy and public education are important, therefore, but they are not central to risk
controversies.
Slovic raises concerns about how disparities between "real" and "perceived" risk might
engender public discourse that, itself, is a risk to the social fabric of society. Trust is a
critical factor in risk assessment and management. Social relationships of all kinds, notes
Slovic, are strongly influenced by trust. Unfortunately, trust is fragile. Slovic states that the
limitations of risk science, the importance and difficulty of maintaining trust, and the
complex, sociopolitical nature of risk point to the need for a new approach to risk
assessment--one that focuses upon introducing more public participation into both risk
assessments and risk decision-making to make the decision process more democratic,
improve the relevance and quality of technical analysis, and increase the legitimacy and
public acceptance of the resulting decisions.
Slovic argues that the system destroys trust. The pervasiveness of media attention to
technology and risk assessments destroys trust because most of what the media reports is
trust-destroying news. Also, powerful special interest groups find access to the media.
Slovic states that the young science of risk assessment cannot prevail against the level and
intensity of assaults against it.
Slovic thereby argues that whoever controls the definition of risk controls technology policy.
In contrast to others who note also the seemingly disproportionate effect of negative citizen
opinion upon risk assessment, Slovic states that more, not less, citizen involvement is
needed to adequately manage risk. Slovic appears uncomfortable with technology
policy formed through contentious debate between scientific experts and special interest
groups and therefore urges more widespread involvement in risk management by the
public.
That great, growling engine of change -- technology.
Alvin Toffler, Future Shock, 1970
The Sociological Approach
Technical, economic, and psychological approaches emphasize individual-level decision
making, whether in the quantification of the potential for technology failure, the rational
assessment of costs and benefits, or the emotional response to a technology. The
sociological approach emphasizes the socially constructed nature of risk.
Characteristics
It is recognized that risk perceptions reflect negotiated meanings through interaction with
others. Renn classifies sociological approaches to risk using two dimensions:
1. Individual versus structural, and
2. Objective versus constructivist.
Structural assessments emphasize the importance of societal definitions of risk rather than
the processes by which risk evaluations are formulated. Individual approaches focus on how
socially constructed risk is achieved through human interaction. Objective approaches are
positivist in considering risks as observable and real, whereas constructivist approaches
treat risks as social artifacts fabricated from social interaction.
Dr. Sapp prefers the taxonomy presented in the Course Description, wherein sociological
studies are classified according to their emphasis on:
1. Social structure and functioning,
2. Critical theory, or
3. Human agency.
Whatever the classification system used, the critical element of the sociological perspective
is that humans, through their interactions with one another, create expectations that
influence public decision making regarding complex and controversial technology.
Assumptions
The assumptions of the sociological approach are that humans behave differently in groups
than they would as individuals, that normative expectations are formed through human
interaction, and that these expectations influence risk evaluations.
Strengths and Limitations
The sociological approach can be used to understand and influence the social construction of
risk. By understanding fundamental properties of human collectivities (e.g., collectivities
have prestige hierarchies and normative expectations for behavior) one can gain an
understanding of the process of public decision making and exert some influence upon
public decisions. Much more description of the sociological approach and its strengths and
limitations is provided in the sections on the Diffusion of Innovations.
Examples of Use
Because sociological studies on risk are undertaken from three different paradigms,
examples of their use vary widely. In Soc 415, we will emphasize principles of human
agency to focus on understanding and influencing the behavior of rational actors in their risk
decisions. Social mobilization theory, as one example of the structure-function paradigm,
examines the circumstances under which individuals are motivated to actively promote or
oppose certain technologies. Also as examples of the structure-function paradigm,
organization theory investigates organizational change that occurs in response to the
adoption of new technologies, and systems theory examines how institutions affect and are
affected by technological adoption. The critical paradigm motivates studies on the
distribution of risk and the control of technology development and dissemination by the
powerful elite. The human agency paradigm investigates how interaction with others
influences consumers' opinions about complex and controversial technologies.
Application in Context
How do social factors affect opinions of food irradiation?
At the same time that Huisken Meats began their market testing of irradiated beef
patties in Minneapolis, MN, sociologists at Iowa State University began tracking
consumer opinions in a study of how human agency affects adoption of food
irradiation. Although much research has been conducted on human agency over the
past 35 years, few opportunities have arisen where researchers were able to track
opinions over time beginning at the introduction of a controversial technology.
As anticipated from theories of human agency, initial public skepticism toward
irradiated food shifted toward acceptance over time. This shift was influenced most
strongly by endorsements of respected people/organizations. Unit Three discusses
this "diffusion effect" in more detail.
Examples of the Sociological Approach
Contemporary philosophy focuses as much on the social construction of risk assessment,
management, and communication as on classifying technology as good, bad, or indifferent. The
central issues addressed relate to citizen involvement--or lack of involvement--in technology
policy making. Contemporary viewpoints acknowledge improvements in living conditions
brought about by advances in technology while noting that the manner in which risk is
defined and by whom strongly affects technology policy. This section reviews viewpoints
offered by Ulrich Beck, Michael Bell and Diane Mayerfeld, and William Freudenburg on
relationships among risk, power, and democracy.
The Risk Society
Ulrich Beck, in Risk Society: Towards a New Modernity, expands upon the solution offered
by Habermas to the critical philosophy of technology. Beck challenges our understandings of
modernity, science, and technology and, in so doing, helps us recognize the need for new
conceptions of these endeavors and our place in a society characterized not by relations of
production, but by relations of risk. That is, Beck thinks the focal point of science and
technology policies should be the effects of technology on the welfare of all citizens, not on
the benefits enjoyed by a few citizens.
The Introduction to Risk Society, written by Scott Lash and Brian Wynne, provides a good
review of Beck's viewpoints. This Introduction is summarized here.
Philosophers and social scientists long have sought to develop approaches for maximizing
the use of beneficial technology while avoiding its negative consequences. Beck asserts that
the dominant perspectives reflect scientism--the culture of science--which excludes
non-rational forms of discourse and argument.
Thus, arguments not endorsed by officially sponsored scientific or governmental agencies,
or those put forth by external agencies, such as consumer advocacy groups, are considered
non-rational if they challenge assumptions of the status quo. Public skepticism is treated as
non-rational and thus is not considered to be of sufficient importance to be taken seriously
except as a barrier to scientific and technological progress. In the politics of technology
evaluation even social scientific explanations of risk can be relegated to reflect merely the
inaccurate perceptions of a misinformed public. As stated by Lash and Wynne, "technical
experts are given pole position to define agendas and impose bounding premises a priori on
risk discourses."
Beck argues for a new paradigm of risk evaluation, one that recognizes the benefits of
technology development, but at the same time recognizes the many different and equally
legitimate ways that technology can be rationally evaluated. This reflexive modernization, in
contrast with traditional modernization, seeks to understand technology in practice--the
unintended, unavoidable, and undesirable consequences of technology adoption--and the
necessary and beneficial aspects of socially constructed risk assessments on technology
development and use.
General Principles of Reflexive Modernization
1. Physical risks always are created and effected in social systems, for example by
organizations and institutions that are supposed to manage and control the risky
activity.
2. The magnitude of the physical risks is therefore a direct function of the quality of
social relations and processes.
3. The primary risk, even for the most technically intensive activities, is therefore that of
the public's social dependency upon institutions and actors who might not have their
best interests in mind.
The Rationalization of Risk
Michael Bell and Diane Mayerfeld (full text article) express concerns about how the language
employed by experts to convey risk to the public can be used to manipulate rather than
inform. They argue that what is different about the worries of the present day is neither the
number of hazards we face nor the degree of uncertainty we feel about our lives, but rather
it is the language we use to think and talk about them. They note that the language of risk
can be used to explain uncertainty; but it also can be used to explain it away. Bell and
Mayerfeld suggest that the language of risk as it is being used today has some strikingly
undemocratic implications and strongly urge greater caution in its use by social scientists
and policy makers.
Bell and Mayerfeld disagree that our times are more risky than the times of our ancestors.
Their observations are that:
We live in a time of much risk; but so have others before us.
People always have sought for some sense of control over uncertainties.
What is changed is not the amount of risk, but control over the language of risk.
Historically, risk definition has fallen primarily to technicians with the expertise to
understand the technical aspects of material innovations. But it should be recognized that
evaluations of risk are subjective, not objective. Therefore control over the manner in which
risk is defined and assessed is critical to risk management and communication.
Quantitative risk estimates are precise, but often are not accurate because they rely
upon a whole series of assumptions, guesses, and extrapolations that limit their
accuracy.
Estimated risks often do not account for multiple hazards that occur in conjunction
with one another in complex technological systems. For example, we might estimate
the risk of pesticides A and B, but often we do not estimate the risk of pesticide A in
combination with pesticide B.
Numbers often carry disproportionate effect in technological assessments of risk.
Risk assessments often falsely homogenize populations. That is, the risk for a child
might be different than the risk for an adult.
Given the limitations of risk assessment, control over risk definitions and strategic
communication with the public become central to risk management:
Because people are aware of the limitations of quantitative risk assessment, they tend
to respond with skepticism to these assessments, even though they tend to trust
science and science-based organizations.
The field of risk communication arose in response to this form of "illogical" reasoning
by the public.
Most risk communication efforts begin with the premise that scientific experts know
actual risk and the skeptical public, out of ignorance or irrational fear or both,
misperceives actual risk.
The goal of risk communication, therefore, is to educate the public for the purpose of
removing their irrational fears. A central assumption of this approach is that experts
favor the technology being discussed and non-experts (i.e., the public) oppose the
technology.
"In its most extreme form, manipulative risk communication results in legal
maneuvering to withhold information from the public altogether."
"In short, risk communication is infected with a contempt for the public, which
perpetuates its undemocratic bias and also ensures the continued failure of risk
communication efforts."
If control over language strongly affects risk management, then advanced procedures must
be developed for interacting with the public about technology and risk:
"Risk is a far from neutral language. Rather than representing interest-free rationality,
nameless knowledge that applies to everyone, risk represents the deeply interested
knowledge of those who are able to command it."
People are becoming more aware of how power relationships influence risk.
"The reaction against risk represents democracy, not the hysteria of the ill-informed."
Risk assessments often falsely divide the population into those affected and those
unaffected. Humanist viewpoints consider all to be affected when some are affected.
If control over language is critical to risk assessment and management, then citizens need
to become aware of risk assessment procedures and risk communication techniques used
to convey information about technologies to them. The tenets of the critical philosophy alert
us to the need to become aware also of how power relationships can affect risk assessment
and communication.
Beck argues that technology advancement occurs so rapidly that our institutions
cannot keep up, leading to a "risk society." To Beck, new hazards have led to new
critiques of technology.
Bell and Mayerfeld believe that we have no more hazards or worry about hazards than
we had before. Instead, we have a growth of new language for debating about
hazards and greater public interest in discussing potential hazards related to
technology development.
"The real uncertainty at stake in the language of risk is the relationship between
power and democracy."
Recreancy and Societal Institutions
William Freudenburg (full text article) points out that the earliest discussions of risk were
framed almost exclusively in terms chosen by engineers. Within the technical community,
two explanations typically are given for public reactions to what the technicians deem to be
objectively defined risk. The first is that the public is ignorant and/or irrational. From this
definition of the situation, policy focuses on education of the ignorant and easily
manipulated public. The second explanation, associated more with economists, is that public
reactions represent economically rational, although understandably selfish, responses to
risk. From this view, policy focuses on providing adequate compensation for risks endured.
The problem with the first view is that technical definitions of objective risk are not always
precise (see Technical Risk Assessment) and always are influenced by social and cultural
factors. The problem with the economic approach is defining adequate compensation.
Events vary in the amount of outrage they create and it is difficult to assign monetary value
to risk and negative health outcomes.
Increasingly, the findings from empirical studies are proving paradoxical with respect to
the individualistic theoretical perspectives that have predominated in the past. Research
shows that differences in risk perceptions cannot be attributed to differences in information
levels, but are attributed more to differences in cultural outlooks and personal values.
Freudenburg notes that with increasing complexity of technological innovations and societal
division of labor, people find themselves in a position of not knowing much about highly
complex and potentially dangerous technologies. They therefore must rely upon their
judgments about whom to trust. Like Slovic, Freudenburg is aware that:
the public is not irrational in their skepticism about complex technologies, but rather
cautious in deciding whom to trust in their understandable state of ignorance about
these technologies,
the public and scientists rely upon social as well as technical criteria to evaluate risk,
claims that the public is irrational are in part responsible for increasingly contentious
debate about complex technologies,
some special interest groups profit from fear mongering within this atmosphere of
ignorance and fragile trust,
the media have a difficult job of presenting varying viewpoints on technical issues.
Freudenburg wants to look at societal not individualist explanations for this pervasive
problem because contentious public debate can:
delay implementation of valuable technologies,
hasten implementation of undesirable technologies, and
create public discourse that in itself might be harmful to the social fabric of society.
Freudenburg uses the term recreancy, which means institutional failure resulting from a
lack of competence, of fiduciary responsibility, or both, to refer to societal-level inadequacies in
risk assessment, management, and communication. Recreancy does not necessarily result
from "villainy," but instead comes about from inadequate societal-level definitions of risk,
procedures for evaluating risk, risk management practices, and poor risk communication
techniques. In other words, it is societal structure and functioning that is inadequate in
bringing about wise technology development and policy, not individuals' unfamiliarity with
technology or irrational thinking.
Freudenburg offers suggestions for improving societal-level capacity in risk assessment,
management, and communication:
1. Assess the level of recreancy in American society,
2. Become more aware of societal-level influences on risk assessment, management,
and communication,
3. Build institutional capacity to facilitate wise technology policymaking.
Summary
Beck thinks that because society has become more risky, citizens need to become more
involved in the process of risk assessment and management. Bell and Mayerfeld, on the
other hand, think that the world is no more risky than it has been before, but that control
over the language of risk, which strongly affects technology assessment and management,
has become more advanced and therefore in need of more careful scrutiny by the public.
Freudenburg focuses on organic social solidarity--the trust citizens place in societal
institutions to behave with competence (i.e., skills, expertise, experience) and fiduciary
responsibility (i.e., honesty, integrity). Each perspective highlights characteristics of society
that must be addressed in understanding the sociology of technology. Given the emphasis
of this course on human agency, we will direct most of our attention to techniques of
gaining adoption of technologies considered to be mainly beneficial. The approach we will
rely upon is "diffusion of innovations." This approach will be described in detail in the final
section of the course.
Application in Context
How does the language of risk affect your perceptions of technology?
The first four links presented on the Sampler web site regarding genetic engineering
present this technology in a favorable manner, while the last four links present
concerns about and objections to it. Skim through these materials again looking for
key terms, use of language, or the context in which arguments are presented to
investigate how the language of risk is used to sway opinion.
What are some key terms or phrases advocates use to make genetic modification of
food seem like a good idea?
What key terms or phrases do opponents use to make this technology seem like a
bad idea?
Nobody believes the official spokesman... but everybody trusts an unidentified source.
Ron Nessen
Introduction
Democratically governed nations require that citizens fulfill their twofold responsibility: to
challenge institutions so that they might adapt to ever-changing social and environmental
conditions, and to support institutions so that they might serve the needs of the people. Thus, to
some extent societal institutions at the same time must be trusted and not trusted. This
presentation focuses upon the determinants of trust in societal institutions.
Compass
Key Questions
What are the key determinants that affect public trust in societal institutions?
Examples
What are the key characteristics of societal institutions that make them
trustworthy?
To what extent do citizens use formal and substantive rationality when
evaluating societal institutions?
Are expressions of substantive rationality conducive to deliberative social
policy formation?
Recreancy, Rationality, Trust, and Public Policy
The attached paper describes the roles of citizens, the responsibilities of societal
institutions, and the key factors affecting public trust in societal institutions. It describes
formal and substantive rationality and the effects of these two forms of rational decision
making on the quality of social policy formation.
Consumer Trust in the U.S. Food System: Expressions of Formal and Substantive Rationality
Within the Recreancy Theorem.
It has been said that democracy is the worst form of government except all the others that
have been tried.
Sir Winston Churchill
Introduction
The anthropological approach to risk assessment focuses upon the extent to which
underlying value-orientations affect individuals' perceptions of risk. This approach, most
closely associated with the work of Mary Douglas and Aaron Wildavsky, posits that cultural
ways of life can be classified along two dimensions, termed "group" and "grid." A "high
group" dimension emphasizes the importance of collective sentiments. A "high grid"
dimension emphasizes the importance of a highly structured society. When considered as
complements to one another, these dimensions imply four value-orientations. A high-group,
high-grid orientation emphasizes strong collective control over society, with a focus on
hierarchy. A high-group, low-grid orientation emphasizes collective control through
individualized actions, with a focus on egalitarianism. A low-group, high-grid orientation
emphasizes the individual within a hierarchical structure, with a focus on fatalism. And a
low-group, low-grid orientation emphasizes the individual with little need for hierarchy, with
a focus on individualism.
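Restating the typology compactly (a sketch only, following the paragraph above rather than any coding scheme from Douglas and Wildavsky), the Python fragment below maps the two dimensions onto the four value-orientations. The function name and the boolean inputs are invented for the example; deciding what counts as "high" group or grid is left to the analyst.

# Map the grid-group typology onto four value-orientations, following the
# description above. high_group and high_grid are booleans the analyst
# assigns; survey-based measures would use scales and cutoffs instead.

def value_orientation(high_group, high_grid):
    if high_group and high_grid:
        return "hierarchy"        # strong collective control over society
    if high_group and not high_grid:
        return "egalitarianism"   # collective control through individualized actions
    if not high_group and high_grid:
        return "fatalism"         # the individual within a hierarchical structure
    return "individualism"        # the individual with little need for hierarchy

for group_high in (True, False):
    for grid_high in (True, False):
        label = f"group={'high' if group_high else 'low'}, grid={'high' if grid_high else 'low'}"
        print(f"{label}: {value_orientation(group_high, grid_high)}")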
This chart depicts the cultural cognition approach to defining value orientations.
The anthropological approach to risk assessment is to understand how individuals' value
orientations affect their perceptions of risk. An egalitarian, for example, might perceive little
risk in society recognizing same-sex marriages, whereas a person with a value-orientation
of hierarchy might be very concerned about society adopting such a policy.
The anthropological approach to social change is for change agents to attempt to bridge
value-orientations so as to capture consent for change from the majority of persons. For
example, one approach to reconciling differences of opinions among those who want greater
collective control over environmental pollutants and those who prefer an individualized
approach is to create a system of "carbon credits (Sarah Ploss: Introduction to Carbon
Credits) that can be bought and sold as commodities, thereby satisfying the value
orientation of the individualists, but collectively act to reduce carbon emissions.
No person can become an expert on all topics, but all citizens within a democracy can
influence social policy. And the public typically desires to exert much influence upon public
policy even when they have little understanding of complex and controversial innovations.
The question becomes, To what extent should ordinary citizens influence social policy
related to the use and regulation of advanced technologies about which they have little
understanding?
Compass
Key Questions
To what extent and in what manner should the opinions of the public be
incorporated within the formation of social policies designed to regulate
complex technologies?
Examples
Why should a technologically advanced society give much credence to the
opinions of a largely ignorant public?
How do social institutions and public policies affect science and technology?
Is the American public sufficiently informed about science and technology to
make a valuable contribution to technology policy?
To what extent should people's values be taken into consideration in forming
social policy related to advanced technologies?
Debate Regarding the Role of Citizen Input Into Technology Policy
Cass R. Sunstein, Director of the Office of Information and Regulatory Affairs under
President Obama, argues that advanced societies are too strongly influenced by "laws of
fear," that is, regulations that ignore sound scientific evidence in favor of misinformed and
misguided public opinions. Mr. Sunstein believes that public policy should be guided by
values, but not by the blunders of ordinary people. In contrast, Dan Kahan and colleagues
argue that social policy formation should pay greater attention to the value-orientations of
the public.
Please read the papers shown below and be prepared in class to discuss the extent to which
the public's value-orientations should be included when forming regulatory policies
regarding advanced technologies.
Kahan, Dan M., Paul Slovic, Donald Braman, and John Gastil, Fear of Democracy: A Cultural
Evaluation of Sunstein on Risk.
Sunstein, Cass R., Misfearing: A Reply.
Kahan, Dan M., and Paul Slovic, Cultural Evaluation of Risk: 'Values' or 'Blunders'?
For a list of all the ways technology has failed to improve the quality of life, please press
three.
Alice Kahn
Introduction
Link to PowerPoint presentation regarding Globalization.
Several of the readings in this section refer to globalization. In simple terms, globalization is
the development of rules by nations to govern international trade for the purpose of
increased efficiency in the production and distribution of goods and services. Rules for trade
are developed and enforced by organizations such as the World Trade Organization (WTO),
a governing body representing a consortium of 153 nations.
The proposed benefits of globalization are: 1) increases in economic productivity achieved
through efficient resource allocation, and 2) greater political stability achieved through
economic interdependence. Hence, it is assumed that if nations become economically
interdependent, they will be less likely to wage war upon one another or tolerate political
instability at home. According to some sources, it was the desire to achieve political stability
in post-WWII Europe that most influenced the development of the European Union, the first
large-scale cooperative economic arrangement among Western, industrialized nations.
Principle of Comparative Advantage
To understand the economic motivation for globalization, it is important to understand the
principle of comparative advantage. This principle, attributed to David Ricardo (1772-1823), posits that nations can be most productive through specialization in areas where
they have a ratio advantage, relative to other nations, in the production of a good or
service.
Consider this often used example. Two nations--England and Portugal--produce two
commodities--wheat and wine. The cost per unit in labor hours to produce wheat is 15
hours in England and 10 hours in Portugal. The cost per unit in labor hours to produce wine
is 30 hours in England and 15 hours in Portugal. Thus, Portugal has an absolute advantage
in labor hours to produce both wheat and wine.
But, Portugal has a relatively better ratio at producing wine and England has a relatively
better ratio at producing wheat. That is, the ratio of producing wheat to wine in Portugal is
2/3 whereas the ratio of producing wheat to wine in England is only 1/2. So, even though
Portugal has an absolute advantage at producing both wheat and wine, Portugal has a
comparative advantage in the production of wine and England has a comparative advantage
in the production of wheat.
How can these comparative advantages be used to improve the total production of wheat
and wine?
Suppose England has 270 total labor hours at its disposal and Portugal has 180 hours of
labor at its disposal (i.e., "labor hours" encompasses size of labor force and technical
capacity). Suppose, before trade, that England produces and consumes 8 units of wheat (at
15 hours/unit = 120 hours) and 5 units of wine (at 30 hours/unit = 150 hours) and Portugal
produces and consumes 9 units of wheat (at 10 hours/unit = 90 hours) and 6 units of wine
(at 15 hours/unit = 90 hours). The total production of wheat equals 17 units and the total
production of wine equals 11 units. Now, suppose England specializes in wheat production
and Portugal specializes in wine production. England can produce 18 units of wheat (at 15
hours/unit = 270 hours) and Portugal can produce 12 units of wine (at 15 hours/unit = 180
hours). Note that total production of wheat and of wine have each increased by one unit!
Through specialization and trade, England and Portugal, taking advantage of their
comparative advantage, can increase total production of both wheat and wine.
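
The arithmetic above can be verified with a short script. The following sketch (a minimal Python illustration; all numbers are taken from the example above, and the variable names are illustrative rather than drawn from any source) recomputes the cost ratios and the totals before and after specialization.

    # Comparative advantage worked example (England/Portugal, wheat/wine).
    # Numbers come from the example above; names are illustrative only.
    labor_cost = {                                    # labor hours per unit produced
        "England":  {"wheat": 15, "wine": 30},
        "Portugal": {"wheat": 10, "wine": 15},
    }
    labor_supply = {"England": 270, "Portugal": 180}  # total labor hours available

    # Cost ratios: England 15/30 = 0.50, Portugal 10/15 = 0.67, so wheat is
    # relatively cheaper in England and wine relatively cheaper in Portugal.
    for nation, cost in labor_cost.items():
        print(nation, "wheat-to-wine cost ratio:", round(cost["wheat"] / cost["wine"], 2))

    before = {"wheat": 8 + 9, "wine": 5 + 6}          # pre-trade output: 17 wheat, 11 wine
    after = {                                         # each nation fully specializes
        "wheat": labor_supply["England"] // labor_cost["England"]["wheat"],    # 18 units
        "wine":  labor_supply["Portugal"] // labor_cost["Portugal"]["wine"],   # 12 units
    }
    for good in ("wheat", "wine"):
        print(good, before[good], "units before trade,", after[good], "after specialization")

Running the sketch simply confirms the figures in the text: total output rises from 17 to 18 units of wheat and from 11 to 12 units of wine.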
This principle assumes:
1. no transport costs,
2. constant costs and no economies of scale,
3. just two countries producing these goods,
4. traded goods are interchangeable,
5. factors of production are perfectly mobile,
6. perfect knowledge, so that all buyers and sellers know where the cheapest goods can
   be found internationally, and
7. no tariffs or other trade barriers (i.e., perceived restrictions on trade).
Obviously, world trade is considerably more complicated than the exchange of two goods
between two countries. But taken to a global scale, in theory the principle of comparative
advantage should increase productivity for all. The restrictions implied by the first six
assumptions listed above presumably can be overcome with trade volume and relatively
standard practices of production, transportation, and monetary exchange. If transportation
costs are small relative to the volume of trade, for example, then they will not hinder the
benefits of globalization.
The key factor that makes this system work is assumption number 7: no tariffs or other
trade barriers. The objective of governing bodies such as the WTO, therefore, is to reduce
trade barriers as much as possible to optimize the free flow of goods and services
worldwide. Thus, nations, working in groups or as single units, usually informed and
pressured by multinational corporations, actively pursue the reduction of trade barriers within
the WTO.
Concerns About Globalization
Much has been written about the possible negative consequences of globalization. Some of
the key concerns are summarized below.
Economic Leakage
Economic leakage refers to the movement of profit margins from primary, to
secondary, to tertiary markets.
Primary markets are oriented mainly toward the production of raw commodities (e.g.,
food commodities, such as corn, wheat, soybeans; mined goods, such as raw ore and
minerals). Secondary markets focus mainly upon the further processing of raw
commodities (e.g., corn syrup, bread, soy-based oil products, steel, cut minerals).
Tertiary markets specialize in facilitating production and trade by providing financing,
access to markets, and access to information about markets (e.g., the Chicago
Mercantile Exchange, the NYSE, Citibank).
Typically, profit margin increases as goods move from primary to secondary to
tertiary markets. Thus, a nation whose economy focuses almost exclusively upon
primary commodity production will experience "economic leakage" of potential profits
to nations involved in secondary and tertiary markets because it is not involved in
these more lucrative ventures.
Perpetual Status
A concern expressed about the WTO and other organizations that govern international
trade is that nations involved in primary commodity production will find it very
difficult to develop secondary or tertiary markets. Suppose, for example, that Nation
A, which is involved mainly in primary commodity production, wants to build an
industry that can further develop its raw commodities. Such industrial development
requires much investment of capital. Thus, Nation A might want to provide short-run
government subsidies to help the new industry bear the burden of start-up costs and
operating losses until it becomes efficient enough to compete in world markets. Such
subsidies would be considered illegal under current and proposed WTO
rules. Nations not yet developed enough to enjoy the increased profit margins of
secondary and tertiary markets might never be able to do so under WTO regulation.
Environmental Degradation
Another concern expressed about globalization is that nations wishing to establish
laws to protect their environmental quality might not be able to do so under WTO
regulations. Consider a case in 1996 when Venezuela brought a claim against the
United States alleging that the U.S. Clean Air Act unfairly discriminated against
Venezuelan gasoline exports to the US because the act required foreign gasoline sold in the
US to be no more contaminated than the average level of contaminants in US gas.
The WTO ruled in favor of Venezuela. So, the US rewrote the Clean Air Act to allow
both domestic and foreign gas producers to produce more contaminated gas. In
another WTO ruling, Japan was forced to lift its import ban on certain fruits that might
bear dangerous insects, even though to get rid of those insects Japan needed to use
harmful pesticides.
Thus, in principle, any action taken by any country to protect its environment that
might be perceived as restricting free trade can be overturned by the WTO.
Research on the Effects of Globalization
The articles linked below summarize key research findings regarding relationships between
globalization and wages, income inequality, and social mobility of nation states. Research
indicates that Foreign Direct Investment (FDI) tends to increase wage levels and reduce
poverty in both developing and developed nations. For developing countries, FDI tends to
increase income inequalities in the short run but decrease income inequalities with greater
investment over time. For developed countries, FDI tends to decrease income inequalities.
Income inequalities among the richest and poorest nations seem to be decreasing.
Matthew Slaughter and Phillip Swagel: Does Globalization Lower Wages and Export
Jobs?
Gary Clyde Hufbauer: Globalization Facts and Consequences.
Almas Heshmati: The Relationship Between Income Inequality and Globalization.
Glenn Firebaugh and Brian Goesling: Accounting for the Recent Decline in Global
Income Inequality.
Robert Hunter Wade: Income Inequality and Globalization: Comment on Firebaugh
and Goesling.
It can be difficult to determine the extent to which changes in wages, income inequality, and
social mobility reflect globalization or technological advancements, which themselves result
in part from globalization. The research indicates that, overall, globalization seems to be
improving the economies of both the developed and developing nations.
Technology is a blessing wrapped in a curse; it can carry you far, but it can drop you hard,
and in a New York minute.
Paul Quinnett, Darwin's Bass, 1996.
Introduction
How does a society evaluate and regulate risks associated with technology? K.S. Shrader-Frechette, in her book, Risk and Rationality: Philosophical Foundations for Populist Reforms,
provides some workable principles for societies to use in their efforts at risk management.
Compass
Key Questions
What are some pragmatic and ethical approaches for the public to take in
evaluating risk and setting technology policy?
Examples
What procedures should be used to mediate dilemmas of risk assessment
regarding the sampler technologies?
Are opponents of the sampler technologies reasonable in their objections? How
much should public policy reflect objections of opposition groups?
Is the public adequately protected against unethical practices in the
development and dissemination of the sampler technologies?
What should be the role of the government in protecting consumers from the
sampler technologies?
Risk Perceptions and Public Policy
Shrader-Frechette seeks workable solutions for wise public policy formation. She describes
the dilemmas faced in risk assessment, controversies that arise in public debate about risk,
and unethical technology policies and then suggests guidelines for good technology policy
formation.
Five Dilemmas of Risk Assessment
Public policy is formulated within the context of risk perceptions, which, in turn, reflect the
public's opinions of the quality of risk assessment. Risk assessors must estimate as closely
as possible the potential hazards of new technology. Their assessments are evaluated by a
public that desires that social and ethical as well as technical criteria be used in making risk
assessments. Risk assessors and the public then face dilemmas in their attempts to balance
technical evaluations with the public's desires for nontechnical input into risk assessment.
1. The Fact-Value Dilemma: Risk evaluations cannot be both wholly factual/scientific and
wholly sanctioned via democratic processes because lay persons want ethical/moral
considerations to be taken into account.
2. The Standardization Dilemma: Risk assessors seek standardization of evaluation
   criteria to avoid the appearance of arbitrariness. But local groups want them to take
   into account their special conditions.
3. The Contributors Dilemma: Risk assessment can be costly to undertake. Thus, many
   potential hazards are not examined. In addition, aggregate effects of subthreshold
   levels of risk often go untested. Pesticides A and B might each be considered safe, for
   example, but are these pesticides safe when they both are applied to the same crop?
4. The De Minimis Dilemma: How safe is safe enough? Declaring a threshold level at
   which to define negligible risk is a difficult task when citizens hold different
   expectations of safety.
5. The Consent Dilemma: Persons most affected by risk are sometimes those persons
   least able to give consent. Persons who lack the economic means and political access
   to challenge public policy might also be the ones who bear most of the burden of
   potential hazards.
Of course, no perfect solution exists to resolve these dilemmas. Shrader-Frechette's interest
is in bringing these contradictions to light so the public can openly discuss competing
agendas as part of the policy making process.
Attacks on the Public's Aversion to Risk
When the public does not accept the risk definitions of technical experts, experts typically
blame the public for their reluctance to adopt the innovation. Shrader-Frechette reviews
some of the accusations that technical experts direct at a skeptical public, along with a
counterpoint to each accusation.
1. The public is anti-technology: The public engages in irrational witch hunts against new
   technologies. The public has inconsistent fears of technology. Opponents are
   motivated by sectarian, anti-technology sentiment and choose to be panic-stricken
   about imagined dangers from technology rather than real threats to the economy or
   education. They serve their own moral purposes by attacking new technologies.
Counterargument: It is not irrational to question the efficacy of new technologies that
have come under criticism. Imagine yourself walking a well-trusted path through the
woods, a path you have taken many times before. You hear a rustle in the leaves at
your feet. Do you investigate the source of the sound? Of course you do. Our species
would not have survived on Earth this long if we were not genetically hard-wired to be
skeptical. Thus, challenging the efficacy of new technologies is not irrational; it is very
rational. Furthermore, challenging new technology is the responsibility of the active
citizen; democracies require questioning to work well.
2. Opponents of new technology are remote from power and influence: They are
ineffectuals, distrustful of those in the center of power. They are anti-industry and
therefore opposed to new technologies developed by industry.
Counterargument: Risk choices are determined largely by philosophy and psychology,
not by place in the social structure. This accusation is inconsistent with the fact that
powerful persons also oppose various technologies. The moniker "big business" does
instill a sense of mistrust in the American public. But, in fact, most of the time the
public embraces new technologies.
3. Laypersons ignore the fact that society is getting safer: Critics of technology fail to
recognize that life is getting longer, not shorter; health is better, not worse.
Counterargument: The distribution of risk is important also, and so are moral/ethical
considerations regarding how advances in longevity and safety are achieved.
4. The public will never be satisfied with anything but 100% safety: The public is
unrealistic in their expectations of safety.
Counterargument: The public has the right, in fact, a responsibility, to challenge new
technology, to ask questions and seek understandable answers.
Shrader-Frechette notes that attributions of motives to laypersons oftentimes are
inappropriate and unfounded. She suggests that we must develop a new theory of
rationality that respects the responsibilities and viewpoints of active citizens.
Public Policy in Developing Countries
Shrader-Frechette points out that developed countries sometimes explain away the unethical
dissemination of known hazardous technology to developing nations with rationalizations
about their efforts at promoting progress. She describes five such rationalizations:
1. The Isolationist Strategy: Abide only by current laws in imposing risk. That is, if a
   known hazardous technology (e.g., the pesticide DDT) is not illegal in the developing
   country, then it is acceptable to sell this technology in that country. The problem with
   this strategy is that the developed nation knows that the laws of the underdeveloped
   nation are inadequate to reduce risk.
2. The Social Progress Strategy: Hazards are a necessary evil for social progress. But,
   asks Shrader-Frechette, progress for whom and at what risk to the local population?
3. The Countervailing-Benefits Strategy: Recipients of banned products are better off
than they would be without them (i.e., the benefits outweigh the costs). But, are
some costs preventable evils that never should be allowed?
4. The Consent Strategy: If persons in the host country agree to accept the risk, then it
is ethical to impose it upon them. But, can the people of host countries give informed
consent?
5. The Reasonable-Possibility Strategy: It is impossible to prevent use of banned
products. But, we regulate industry all the time and can regulate ourselves in host
countries.
Shrader-Frechette implores citizens in developed countries to extend to developing countries
the same ethics they impose upon themselves in setting technology policy.
Technology and Public Policy
What should be the role of government and industry in protecting the public from exposure
to unnecessary risks? This question lies at the heart of technology public policy formation.
Shrader-Frechette suggests guidelines that can be used to help formulate wise technology
public policy.
1. Minimizing harm is more important than providing good.
2. Because it has fewer financial resources, the public needs more protection than
   does industry.
3. Consumers need self-determination--the chance to reject new technologies.
4. Societies need to stress the importance of values and long-term economic gain vs.
short-term economic benefits.
Summary
Shrader-Frechette points out that technology policy making is not an unambiguous process.
Risk assessors and policymakers face dilemmas in their attempts to be objective and fair
and also take into account special needs and interests. Research and development
organizations sometimes become frustrated with a skeptical public. And it can sometimes be
too easy to rationalize unethical dissemination of hazardous technologies for the sake of
profits. Shrader-Frechette offers guidelines to help citizens form wise technology policy.
The moment a man talks to his fellows he begins to lie.
Hilaire Belloc
Introduction
Sociology 415 focuses upon human agency in the sociology of technology. We are interested
in understanding public responses to complex and controversial technologies, with the
intent of learning how to gain adoption of these technologies by an understandably and
justifiably skeptical public. We pursue this line of inquiry with the realization that all
technologies are flawed and their adoption will bring about negative consequences for some
segments of the population. We nevertheless seek adoption under the premise that the
technology is mainly beneficial for the public. We might recognize, for example, the flaws in
and negative consequences of adopting condom use for the prevention of sexually
transmitted diseases, but nevertheless advocate for such adoption in the belief that condom
use will mainly benefit the society.
With these considerations in mind, we turn our attention to learning techniques of risk
communication. Risk communication can involve two related messages: "Watch out!" and
"Don't worry." Risk communication, whether designed to mobilize citizens to take protective
measures (e.g., "better strengthen the levee") or to stop worrying (e.g., "irradiated food is safe
to eat"), involves similar elements. These are: 1) the dissemination of information, 2)
persuasion to take some action, and 3) the provision of assistance in taking the desired
action. Experience in seeking adoption of many different technologies, in settings
worldwide, for many years teaches us that these tasks are much more difficult to achieve
than one might initially believe.
We begin this topic by reviewing a history of risk communication efforts, followed by an
explanation of the limitations of each. Next, we learn techniques of risk communication
developed through experience and the application of social science theories derived from
the disciplines of psychology, economics, and sociology. We end this section by pointing out
the limitations of risk communication as only communication. In the final section of the
course, we learn that convincing the public to either "watch out" or "stop worrying" requires
more than communication alone; it requires approaches for manipulating the social
construction of risk perceptions.
Risk Communication: History
Baruch Fischhoff's 1995 review of twenty years of process in risk communication research
and practice (full-text article) revealed some effective and ineffective techniques for telling
the public about technology. His review is organized within eight developmental stages that
span the 20 year period from 1975 to 1995. He describes each stage and its strengths and
limitations. He concludes his review by offering suggestions for improvements that need to
be made in future efforts at risk communication.
First Developmental Stage: "All We Have to Do is Get the Numbers Right"
Fischhoff notes that communication often begins before a word is said; that is, the
viewpoint that nothing needs to be said is a form of communication in itself. This form
of noncommunication often represents the initial reaction of technical experts
regarding public input to risk assessments. One can understand, for example, the
perspective of risk experts who painstakingly master technology assessments and believe
that little communication is needed, or should be expected of them, toward a public that is
for the most part ignorant of the risk issues associated with
a technology. And certainly, the "paradox of democracy" precludes public discussion
ad infinitum regarding the adoption of new or maintenance of existing technologies.
Yet, because within democratic societies the public will have input to decision making,
it becomes requisite de facto for risk experts to convey their findings to the public.
Second Developmental Stage: "All We Have to Do is Tell Them the Numbers"
When requested to do so, risk managers present their findings to the public, often
with little interpretation or explanation of them. Although this approach to information
delivery seems forthright in its intent at objectivity, it can be viewed by the public as
an indication of distance or even arrogance by the risk managers. And subsequent
attempts by the public to provide interpretation can be hindered by lack of
information or expertise or politicized by subjective evaluations of the meaning and
usefulness of the raw data. This approach is further hampered by its premise that the
numbers are correct. As has been often noted (see: Technical Risk Assessment), risk
assessments can be limited in their applicability or outright flawed for many reasons,
including occasional acts of dishonesty by scientists or technology managers.
Therefore, just presenting facts, in addition to being limited by its appearance of
condescending distance, is flawed in its premise that the numbers provide a complete
and accurate assessment of risk.
Third Developmental Stage: "All We Have to Do is Explain What We Mean by the Numbers"
Once one begins to explain numbers, inevitably one begins to introduce subjective
evaluations of these numbers. And the public recognizes the subjective nature of
these explanations. This recognition sometimes leads to contentious public discourse
about interpretations. Typically, proponents and opponents of a technology will offer
their conflicting interpretations of the numbers, and in this exercise, the viewpoints of
opponents will influence public opinion more so than those of proponents because
negative information, initially, carries more weight (see: Consumer Skepticism and
The Social Problem). The ensuing dilemma for scientists (see: Science, Technology,
and Society) is deciding how much explanation about a technology to provide to the
public. These dilemmas and the consequent unfavorable public reactions to any
confessed limitations of the technology lead some risk communication experts to
declare that what is at stake is control over the language of risk (see the viewpoints
offered by Paul Slovic, below). This approach has its limitations as well. At this
developmental stage, it should be recognized that explanation of the numbers can
engender controversies that proponents likely will lose because negative information,
initially, carries disproportionate weight.
Fourth Developmental Stage: "All We Have to Do is Show Them That They've Accepted
Similar Risks in the Past"
It might seem intuitively appealing to compare a technology under consideration with
a technology previously considered as risky but now accepted as being of little risk.
This approach in effect says to the public, "See how silly you are to doubt now
when your doubts in the past have proven to be groundless." This approach to risk
communication, first, assumes that the risk assessments are correct, which
sometimes is not the case. Second, the condescending attitude it conveys is unlikely
to sway public opinion favorably. Third, technology comparisons are difficult to make
even when the public is willing to accept some risk, and often the public prefers to
bear no risk at all.
Fifth Developmental Stage: "All We Have to Do is Show Them That It's a Good Deal for
Them"
Explaining both costs and benefits can be a highly effective approach to helping the
public reach decisions about a technology. The public is in effect asked to join with
risk experts in evaluating the merits and limitations of a technology. This approach
suffers from pitfalls similar to those of the third developmental stage, wherein explanations of
costs and benefits themselves can become problematic within contentious public
discourse. Because no technology can claim 100% safety, the public must eventually
weigh benefits and risks, presuming these can be agreed upon.
Sixth Developmental Stage: "All We Have to Do is Treat Them Nicely"
Aretha Franklin had it right: it's about R.E.S.P.E.C.T. In addition to honest, balanced
messages, people want to be treated with respect, and from respect, together with a
sense that experts have sufficient expertise, comes trust. The public needs to feel that
their opinions, and even their emotions, are respected as legitimate.
Seventh Developmental Stage: "All We Have to Do is Make Them Partners"
Treating the public with respect is an essential element of the decency they deserve.
Yet this approach can seem patronizing if it is not accompanied by a true partnership
of ideas. In fact, even a less educated public can make valuable suggestions for
technology improvement. The "indigenous knowledge" of the public can enhance the
effectiveness of a technology. Simply asking for input from the public can significantly
improve the relationship with it. "Partnerships are essential to creating the human
relations needed to damp the social amplification of minor risks--as well as to
generate concern where it is warranted."
Summary
The quantity of social science research on risk communication has increased dramatically
over the past thirty years in response to a growing awareness among risk assessors, risk
managers, and consumers that the public should be better informed and more active in
technology development and policy making. Led primarily by psychologists, research has
explored key determinants of consumer understandings, misunderstandings, and outrage
concerning risks associated with a wide variety of new technologies.
Early investigations into consumer risk perceptions revealed that the public often holds very
different viewpoints about risk than do experts. Studies showed that expert and public
opinions often were inverted, where the public was least concerned about hazards that most
concerned scientific experts and most concerned about risks of least concern to scientists.
Early efforts at risk communication, therefore, focused on developing procedures to convey
"actual" risk to consumers who held uninformed and sometimes irrational "perceptions" of
risk.
After further investigations, scholars became more aware of the many limitations of
technical risk assessments and risk management practices. Scholars became more aware
also that value orientations can be legitimate criteria for establishing technology policy
(e.g., chemical warfare is rejected by civilized nations not because it is scientifically flawed
or inefficient but because people consider it to be ethically abhorrent).
Such awareness greatly changed the nature of risk communication research and application.
The recognition that technical assessments are biased and flawed, and that value-orientations are as important to risk assessment as technical assessments, altered
the risk communication paradigm from a focus upon "educating an irrational public" to one
of "exchanging information and opinions" among the many stakeholders in technology
policy making.
Guiding Principles of Risk Communication
This presentation reviews the Joint United Nations Food and Agriculture Organization/World
Health Organization (FAO/WHO) Expert Commission report on The Application of Risk
Communication to Food Standards and Safety Matters (1998). Two other books provide
information on risk communication similar to that found in the FAO/WHO report: Risk
Communication: A Handbook for Communicating Environmental, Safety, and Health Risks,
Second Edition, by Regina E. Lundgren and Andrea H. McMakin (Battelle Press, 1998), and
Responding to Community Outrage: Strategies for Effective Risk Communication, by Peter
M. Sandman (American Industrial Hygiene Association, 1993).
Peter M. Sandman[1], a leading consultant on risk communication, says that, "watch out!"
and "stop worrying" almost certainly were among the first phrases uttered in the early
development of language. These phrases embody the two essential goals of risk
communication:
1. to warn others of potential harm, and
2. to inform others that there is no need to be concerned about harm.
Sandman notes that risk communication, defined in this manner, essentially represents
one-way communication of knowledge to others. As such, this form of risk communication
reflects three assumptions:
1. that the source of the warning/reassurance knows more about the risk than the
audience,
2. that the source is primarily concerned about the best interests of the audience, and
3. that the warnings/reassurances are based upon actual information rather than just
values or preferences.
For many warnings (e.g., yelling out about a falling tree limb) and reassurances (e.g.,
telling others that a gas leak has been repaired), "watch out!" and "stop worrying" are
pragmatic forms of risk communication. Disseminating information about complex and
controversial technologies, however, presents challenges for which "watch out!" and "stop
worrying" often cannot create or reduce enough perceived risk to be effective approaches to
risk communication. The reason that "watch out!" and "stop worrying" are inadequate forms
of risk communication is that people realize that sources of warnings/reassurances
sometimes must rely upon flawed technical assessments, and that the political, economic,
and cultural context of new technology diffusion leads sources to embed value-judgments
into warnings/reassurances.
Because the public realizes that risk assessments of complex and controversial technologies
are flawed and influenced by political, economic, and cultural context, Sandman
recommends that risk communication regarding them should be multi-directional; it should
stimulate debate in addition to transferring knowledge. To Sandman, the criteria for
evaluating "effective risk communication" should be the openness of the decision-making
process and the extent to which value claims are distinguished from (admittedly flawed)
scientific claims.
Everett Rogers[2], author of The Diffusion of Innovations, makes a similar argument. He
states that diffusion of information about complex and controversial technologies must
avoid the pitfalls of the hypodermic-needle model, wherein the paradigm of risk
communication is "injection of knowledge about actual risks" into an uninformed public.
Instead, diffusion should consist of two-way communication between the public and
developers of new technologies.
The FAO/WHO report, The Application of Risk Communication to Food Standards and Safety
Matters, incorporates these suggestions into its definition of risk communication:
Risk communication is the exchange of information and opinions concerning
risk and risk-related factors among risk assessors, risk managers,
consumers, and other interested parties.
The goals of risk communication, according to the FAO/WHO report, are to:
1. Improve the effectiveness and efficiency of the risk analysis process,
2. Promote consistency and transparency in arriving at and implementing risk
management decisions,
3. Promote awareness and understanding of the specific issues of the risk analysis
process,
4. Strengthen the working relationships and mutual respect among risk assessment and
management participants,
5. Exchange information among interested parties to risk analysis and management, and
6. Foster public trust and confidence in risk analysis and management.
The FAO/WHO report, therefore, considers risk communication as an integral part of
technology development and analysis rather than as a one-way transfer of knowledge from
scientists to consumers. Multi-directional communication aimed at an inclusive process of
informed decision-making that respects the value-orientations of others has been adopted
as the most effective approach. Experience demonstrates that one-way communication
must rely upon flawed analysis and inevitably reflects the biased judgments and value
orientations of risk assessors, in addition to errors resulting from a lack of sufficient data
on failure probabilities.
Susan G. Hadden, in A Citizen's Right to Know: Risk Communication and Public Policy,
argues that citizens have a right to know the risks to which they have been exposed and
what policies are in place to regulate these risks and a right to participate in risk
assessment and management decisions. The essential element of risk communication,
therefore, as stated in the FAO/WHO report, is facilitation of the identification of risks and
informed weighing of decision alternatives by risk managers and the public. That is, proper
risk communication is interactive risk communication.
Elements of Effective Risk Communication
The principles described above note that transmission of scientific knowledge alone is
insufficient for proper risk communication. Scientific knowledge should not be considered as
flawless, value-free, and unbiased. Nor should scientific knowledge be considered the
only criterion for technology adoption. Technology policy, however, should be science based.
Transmission of scientific knowledge, therefore, is a necessary component of risk
communication.
Essential aspects of proper risk communication, as described in the FAO/WHO report,
include:
Knowing the audience. The audience should be analyzed to understand their
knowledge and opinions regarding the new technology. Listening to all interested
parties is a critical element of this task.
Involving scientific experts. Technology policy decisions should be science based.
Hence, scientific experts should be called upon to relate current knowledge about the
technology in a clear and concise manner.
Establishing expertise in communication. The FAO/WHO report states that successful
risk communication requires expertise in conveying information in a manner that can
be clearly understood by most citizens. This suggestion has created some controversy
among scientists who wonder why all the burden for information dissemination should
fall upon them. They wonder why citizens do not make more effort at understanding
science.
Being a credible source of information. Factors that influence source credibility include
perceived competence, trustworthiness, and sincere interest in the well-being of the
public. Consistent messages help establish credibility. It is the nature of science,
however, that new knowledge alters existing risk estimates. Thus, scientists, who are
obligated to report their findings, oftentimes face the dilemma of reporting new
knowledge that will erode public confidence in science.
Sharing responsibility. Scientists, regulatory agencies, and industry must share
responsibility for developing and managing effective and safe technologies.
Increasingly, these parties are pointing out that consumers also must bear
responsibility for becoming more informed and active in technology development and
policy making.
Differentiating between science and value judgment. Certainly, a fundamental goal of
science is to conduct value-free, unbiased research. Therefore, risk communication
should focus upon facts, not values. Unfortunately, this approach to risk
communication is impossible because it is impossible for any science to be free of bias
and value judgments. Scientists should, as much as possible, omit their value
judgments from risk communication and point out where judgment is most likely to
affect risk assessments.
Assuring transparency. As much as possible within legitimate requirements to assure
confidentiality, scientists should help the public understand the technology
development and risk assessment process.
Placing the risk in perspective. Risks and benefits and the probabilities of each should
be compared with one another. One must use caution, however, in comparing risks
because the choice of risks compared might reflect bias. Risk comparisons should not
be made unless the estimates are equally sound, directly comparable, and relevant to
the audience.
The FAO/WHO report lists the following elements that might be included as part of a risk
communication program:
The nature of risk
The characteristics of the hazard,
The estimated magnitude of the hazard,
The urgency of addressing the hazard,
Whether the hazard is becoming smaller or larger over time/space,
The probability of exposure to the hazard, and
Who is at greatest risk from the hazard.
The nature of the benefits
The estimated benefits associated with the technology, and
Who is most likely to be benefited by adoption of the technology.
The uncertainties in risk assessment
The methods used to assess the risk,
The weaknesses or inadequacies of the risk assessment,
The assumptions used in the risk assessment, and
The sensitivity of the risk/benefit estimates to changes in the assumptions.
Risk management options
The policy(ies) suggested to control the risk,
The action(s) individuals might take to control their exposure to the risk,
The estimated effectiveness of different management options,
The costs and benefits of different management options, and
The justification for selecting a particular risk management option.
Barriers to Effective Risk Communication
The shift in paradigms from "informing an irrational public" to "facilitating informed and
respectful discussion among interested stakeholders" greatly improved the quality of risk
communication research and practice. Even so, barriers to effective risk communication
remain.
The FAO/WHO report describes barriers to effective risk communication that occur due to
limitations of the risk assessment process and social processes of human interaction and
decision making:
Barriers within the risk analysis process
Lack of information. Lack of information always poses problems for risk
communication because, by nature, little information exists about how well new
technologies will perform in practice. New findings and new applications of
technologies can reveal flaws not previously known or anticipated.
Access to information. Lack of access to proprietary information held by private firms
limits the abilities of risk assessors to adequately evaluate new technologies.
Incomplete participation in the process. Lack of participation by appropriate experts
or stakeholders limits the abilities of risk assessors to evaluate hazards and of risk
communicators to effectively convey important information. Sometimes,
non-participation occurs because of the nature of the technology development,
assessment, and dissemination processes themselves. Lack of knowledge about the
development of a new technology, lack of resources to learn about it, and lack of
access to relevant information can influence some who should become involved to not
participate.
Barriers associated with human agency
Differences in perceptions. People from different segments of society or who hold
different value orientations view the same scientific facts differently. Concerns about
hazards and viewpoints about how best to manage risks vary by individual and
sub-populations. People differ in the extent to which they are exposed to and attend to
hazard analyses. The effectiveness of risk communication is enhanced when people
become aware of differences in perceptions and the reasons for these differences.
Differences in receptivity. Given similar perceptions of risk, people differ in their
concerns about it. Some persons might consider, for example, a 1/100 chance of
technology failure to be acceptable while others think of this estimate as too risky.
Lack of understanding of the scientific process. Most persons do not have a thorough
understanding of the scientific process, resulting not necessarily from a lack of
formal education or awareness of important societal issues, but from ignorance of
science. Even the most educated among us are ignorant in many ways. Risk
communication should attempt to use non-technical terms to overcome barriers
related to ignorance. Risk communication should focus as well on educating the public
about the process of science, wherein it is not uncommon that new findings alter
existing risk estimates and controversy among scientists is common rather than an
indication of poor science.
Source credibility. Trust in the sources of information about new technologies is
perhaps the most important factor influencing public opinions. Trust is associated with
perceptions of expertise, accuracy, and concern for the public welfare. Distrust arises
with suspicions of bias or conflicts of interest. Once lost, trust is difficult to regain.
Media effects. Most persons receive their information about new technologies from the
media. Because relatively few reporters have extensive backgrounds in the sciences,
they rely heavily upon scientists to present their information in a clear and concise
manner using non-technical language. Reporters are ethically bound to present
differing viewpoints rather than what a scientist might consider to be the "truth."
Scientists therefore oftentimes blame the media for public controversy they think
never would have occurred if the media had not presented the viewpoints of
opposition groups. Risk communicators need training in media skills and reporters
need more training in science.
Societal characteristics. Language barriers, cultural differences, illiteracy, geographic
barriers, discrimination, exploitation of power, and other characteristics of society
influence perceptions of risk, receptivity to risk messages, source credibility, and
opinions about risk. As much as possible, societal differences that might affect risk
perceptions and risk communication effectiveness need to be identified. The section
on Diffusion of Innovations, Part II in Sociology 415 describes procedures that can be
used to improve risk communication to disadvantaged audiences.
Strategies for Effective Risk Communication
The complexity of risk communication requires that communication programs be tailored to
each setting. It is possible, however, to describe general strategies that research and
experience have shown to be effective across a wide variety of settings.
The outline presented here summarizes the large body of risk communication research and
program experience to date. Strategies for implementing the suggestions offered in this
section are addressed in more detail in Diffusion of Innovations, Parts I and II. Techniques
for risk communication and public relations campaigns are covered in JLMC 424: Public
Relations Campaigns, offered by Iowa State University's Greenlee School of Journalism and
Communications.
General Considerations
According to the FAO/WHO report, a systematic approach to risk communication recognizes
the importance of gathering background information, thorough preparation, effective
dissemination of information, and program evaluation.
Background information
Understand the scientific knowledge about the technology,
Understand public perceptions by gathering information through surveys and other
social science methods,
Find out what information people need and want, and
Be sensitive to differences in perceptions, access to information, receptivity to
information, and social context.
Preparation
Avoid overly simplistic comparisons between familiar risks and new risks because they
might appear to be flippant and insincere.
Recognize and respond to the emotional aspects of risk perceptions. Sandman states
that Risk = Hazard + Outrage, wherein hazard is the technical assessment of risk and
outrage is the emotional response to hazard analysis. Hazard and outrage are equally
important determinants of public risk assessments.
Express the risk in several ways without avoiding the central issues of the new
technology.
Maintain an openness to and recognition of public responsibilities.
Build public awareness of the benefits of the new technology.
Communication
Accept and involve the public as a legitimate partner to technology policy making.
Share the public's concern rather than dismiss it as not being legitimate.
Be honest, frank, and open at all times.
Explain the overall risk assessment before presenting the more detailed statistics.
Coordinate and collaborate with other credible sources.
Meet the needs of the media.
Review and Evaluation
Evaluate the effectiveness of the risk communication program.
Emphasize ongoing actions to monitor, manage, and reduce risk exposure.
Risk Communication and Outrage
We noted that outrage--emotional responses to risk information--is as important as
hazard--technical evaluations of the probability of technology failure--in public risk
assessments. "Outrage factors," as they are described by Peter M. Sandman, are key
factors affecting emotional reactions to new technologies.
Outrage and Risk Perceptions
The FAO/WHO report outlines outrage factors that affect risk perceptions:
Unknown, unfamiliar, or rare events are more likely to create outrage.
Outrage increases when events are seen to be outside one's control.
Risks perceived to result from industry action(s) create more outrage than those
viewed as natural occurrences.
Risks that raise moral or ethical questions are more likely to create outrage.
An unresponsive decision-making process will create outrage.
The FAO/WHO report suggests these approaches to reducing outrage:
Make risks voluntary by giving the public input into the decision making process and
control over the regulation of risks.
Show that expert disagreement about risk simply represents a range of uncertainty,
not uncertainty about the quality of science used to estimate risk.
Acknowledge that uncertainty exists.
Treat all stakeholders with respect.
Always consider public concerns and complaints seriously.
Risk Communication Strategy and Use of Outrage Factors
Sometimes, persons/organizations attempt to create outrage as a means of warning about
risk. Various agencies of the U.S. government and some private organizations, for example,
try to create outrage about cigarette smoking to reduce its use. And sometimes
persons/organizations want to reduce outrage when they think the public is overly
concerned about a low-risk technology.
Earlier in Sociology 415, we learned from Bell and Mayerfeld that the "language of risk" can
be an effective tool in swaying public opinion. Thus, persons/organizations interested in
swaying public opinion learn to use language in a manner that creates/reduces outrage.
Proponents and opponents of the sampler technologies, for example, are familiar with
outrage factors and attempt to create/reduce public outrage regarding these technologies.
Research and experience have identified twelve key factors that tend to create/reduce
outrage regarding new technologies:
1. Voluntary/Coerced. Risks we take upon ourselves create less outrage than those
forced upon us.
2. Natural/Industrial. Natural risks are viewed with less emotional response than risks
created by human actions.
3. Familiar/Unfamiliar. Things familiar are considered less risky than the unfamiliar.
4. Memorable/Not Memorable. Linking technologies to highly memorable tragedies
makes them seem more risky.
5. Not Dreaded/Dreaded. Linking technologies to dreaded events (e.g., cancer) makes
   them seem more risky.
6. Chronic/Catastrophic. Risks we face every day create less outrage than a
   catastrophic event.
7. Knowable/Unknowable. People tend to fear the unknown. Opponents of a new
technology can always use this outrage factor to their advantage because, de facto,
using new technologies involves uncertainties.
8. Control/Not in Control. We feel safer when we have the ability to regulate the use of a
technology.
9. Fair/Unfair. People will become more outraged about a technology if they think they
must bear more costs or fewer benefits than do others.
10. Morally Irrelevant/Relevant. Linking the use of a technology with immoral motives
creates outrage. Linking it with moral standards lessens outrage.
11. Trustworthy/Untrustworthy. Trust in the experts who develop or endorse a new
technology might be the most important factor influencing outrage.
12. Responsive/Unresponsive. Outrage is reduced when persons/organizations responsible
for the development or regulation of a new technology seem responsive to public
concerns.
Thus, proponents of a technology attempt to convey to the public that the technology is well
known, under control, familiar, trustworthy, and so forth. Opponents want the technology to
appear uncertain, unresponsive, unfair, not trustworthy and so on.
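
Because the twelve factors function in practice as a framing checklist, a small, purely hypothetical sketch can show how proponents and opponents push the same technology toward opposite outrage profiles. The factor names below follow the list above; the scoring function and the idea of simply counting high-outrage dimensions are an invented simplification for illustration, not part of Sandman's framework or the FAO/WHO report.

    # Hypothetical illustration: the twelve outrage factors as a framing checklist.
    # Counting dimensions is a teaching simplification, not a measurement instrument.
    OUTRAGE_FACTORS = [
        "voluntary", "natural", "familiar", "not memorable", "not dreaded",
        "chronic", "knowable", "in control", "fair", "morally unproblematic",
        "trusted source", "responsive process",
    ]

    def high_outrage_count(framing):
        """Count dimensions a message leaves at the high-outrage pole (False or missing)."""
        return sum(1 for factor in OUTRAGE_FACTORS if not framing.get(factor, False))

    # A proponent frames the technology toward every low-outrage pole...
    proponent = {factor: True for factor in OUTRAGE_FACTORS}
    # ...while an opponent frames the same technology as coerced, unfamiliar, unfair, etc.
    opponent = {factor: False for factor in OUTRAGE_FACTORS}

    print("Proponent framing, high-outrage dimensions:", high_outrage_count(proponent))  # 0
    print("Opponent framing, high-outrage dimensions:", high_outrage_count(opponent))    # 12

The point of the sketch is only that outrage, unlike hazard, is largely a property of how a technology is framed, which is why the same innovation can appear safe in one campaign and alarming in another.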
Risk Communication: Non-Crisis Situations
Risk communication is not limited to crisis situations. Rather, risk communication is an
ongoing process of informing, listening to, and responding to the public. Responsive risk
communication programs help prevent crises and establish source credibility when crises
emerge.
The FAO/WHO report describes important steps to take in developing responsive risk
communication programs:
Background information
Anticipate potential hazards before they become significant.
Keep abreast of the target audience--their perceptions, knowledge, and motivations
to become involved in technology policy making.
Determine which communication channels are most effective for different types of risk
information.
Preparation
Provide ongoing information about the technology, including updates on risk
assessments.
Identify shared values and concerns among the target audience.
Make messages interesting and relevant by focusing upon people rather than
statistics.
Maintain good working relationships with the media.
Communication
Keep messages in the mass media and in public forums.
Sustain regular communication to enable citizens to become involved in ongoing
decision making.
Make certain that risk communication is multi-directional: listen to the public and
facilitate their involvement in decision making.
Review and Evaluation
Continue to evaluate the effectiveness of the risk communication program.
Test the clarity and understanding of messages.
Educate risk assessors and managers on the principles of risk communication.
Engender cooperation among the public, management, and regulatory agencies.
Risk Communication: Crisis Situations
By definition, a crisis is short-lived. The public's memory of how a crisis is handled,
however, can affect risk perceptions and outrage for a long time.
The suggestions offered by the FAO/WHO report can help mitigate the negative
consequences of a crisis situation.
Describe in an open and honest manner the extent of the crisis and measures being
taken to control it.
Inform the public about how to reduce their risk exposure.
Help the public identify the hazard and how to avoid it.
Describe how to prevent further exposure to the risk.
Provide complete, up-to-date, and accurate information about the crisis.
Keep messages simple. Too many facts can be overwhelming. Do not, however, omit
key facts in the hope that the public will not hear about them.
Choose and rely upon a media spokesperson. The public should know who is the
spokesperson. Make this person available to the media at all times. Hold regular
briefings with representatives of the public and regulatory agencies.
Peter M. Sandman makes some additional suggestions for handling a risk crisis:
Acknowledge prior misbehavior. The prerogative of deciding when you can put your
mistakes behind you belongs to your stakeholders, not to you. The more often and
apologetically you acknowledge the sins of the past, the more quickly others decide
it's time to move on.
Acknowledge current problems. Omissions, distortions, and "spin control" can damage
credibility nearly as much as outright lies. The only way to build credibility is to
acknowledge problems.
Share control and be accountable. The higher the outrage, the less willing people are
to leave the control in your hands. Look for ways to put the control elsewhere (or to
show that it is already elsewhere). Let others--regulators, neighbors, activists--keep
you honest and certify your good performance.
I disapprove of what you say, but I will defend to the death your right to say it.
Voltaire
Introduction
Dr. Eric Abbott, from ISU's Greenlee School of Journalism and Communications, conducts
research on the risk communication cycle, public views of technology, and communication
strategies for presenting controversial technologies to the public. Dr. Abbott uses the
example of food safety to describe how the mass media views public concern about
technology and how the media and scientists can best present controversial topics to the
public.
Compass
Key Questions
How does the media affect public decisions about new technologies?
What should be the role of the media regarding public discourse about new
technologies?
Examples
Have the media presented the sampler technologies in a fair-minded manner?
In what ways might media presentations about the sampler technologies be
improved?
Have either proponents or opponents of the sampler technologies received
more attention in the media than they deserve to receive?
The Media and Risk Communication
Dr. Abbott poses three questions he considers central to understanding the role of the
media in public evaluations of technology perceived as being high risk:
1. What determines how the media covers high risk technology?
2. What effect does media coverage have on public opinion?
3. What kind of communication strategies are most effective in increasing public
understanding of high risk technology?
How Does the Media Cover High Risk Technology?
The Natural History Model
Dr. Abbott points out that the mass media is a key source of information about risk issues.
Risk information presented to the public, however, does not occur randomly. The Natural
History Explanation presented by Anthony Downs posits that risk communication occurs in
four stages:
1. In the pre-problem stage the technology is available for public use, but the public is
largely unaware of it because of little media coverage.
2. The second stage is characterized by alarmed discovery as the media present the
problem to the public. Typically in this stage, experts (relying upon Enlightenment
philosophical approaches to technology and risk) argue that the problems can be
solved through more and better science.
3. In the third stage the public becomes aware of the costs of the technological fix
offered by experts.
4. As the topic becomes more complex, media coverage declines until, in the fourth
   stage, public interest in mass media coverage of the technology and its risks declines
   as well.
The Public Arena Model
This model posits that risk issues must compete with other newsworthy items for mass
media exposure. Risks associated with complex and controversial technologies might be
covered in the media, but their length of coverage and degree of exposure (e.g., placement
in a newspaper), depends upon other topics of the day. Coverage of highly controversial
technology might get pushed to the back page during a period when news with greater
mass appeal occurs at the same time. Or, technology that might otherwise not raise a great
amount of controversy might receive much media coverage during a slow news day.
The Hoopla Effect
Dr. Abbott has developed a perspective on risk communication that emphasizes how media
coverage can lead to a heightened perception of risk. When a risk issue comes to the
attention of the media and is presented to the public, individuals and organizations that
have a vested interest in the issue take the opportunity to provide the media with more
information, which then is presented to the public. This cycle of activity can create a
heightened sense of awareness about the risk issue that does not necessarily coincide with
reality. News about crime, for example, is easy and inexpensive for the media to present;
so, the media are inclined to present much information about crime. This over-coverage of
crime can give the public a sense that crime is a bigger risk than it actually is. The result is
the hoopla effect.
This graph shows the hoopla effect for newspaper articles on genetically modified foods
from 1997-2000:
[Graph omitted: number of newspaper articles on genetically modified foods, 1997-2000]
It should be kept in mind that the hoopla effect is a natural consequence of reporting about
controversial technology because of the polarization of opinion that occurs through social
interaction. That is, when people begin talking about a controversial topic, their feelings
about it become intensified.
Reporting about controversial technology places the journalist in a difficult position because
reporting of the controversy, itself, can stimulate a hoopla effect. Yet the journalist is
entitled, even obligated, to report about controversial technology. Journalists sometimes are
blamed unnecessarily for increasing controversy, when, in fact, they are doing the job we
expect of them: reporting controversy surrounding complex technology.
To the extent that the public is responsive to the hoopla effect, its evaluations of
technology can reflect the coverage of the risk as presented in the media. Presentation
by the media, and access to the media by vested interest groups, therefore, can affect
public evaluations of technology.
Research shows that negative media information carries disproportionate weight in
influencing initial public opinions of technology. As Dr. Abbott points out, negative
information is fast and effective while positive information is slow, difficult, and expensive to
convey to the public. Traditionally, it has been thought that mass media messages are most
effective at the awareness stage of a diffusion program and interpersonal channels are most
effective at the decision stage.
What are some appropriate ways that the media can present risk information to the public?
1. One approach is to define a risk vs. no-risk situation. One might define the risk of
smoking cigarettes, for example, with respect to the lowered risk of not smoking
cigarettes.
2. A second approach is to present a risk in comparison with a related risk. For example,
presenting the risks associated with chemical food preservatives as compared with the
risk of contracting botulism (a potentially fatal illness resulting from eating spoiled
food).
3. A third approach is to compare a risk with some benefit. One might compare the risks
associated with agricultural pesticide use, for example, with the benefits of lower food
prices. (A brief numeric sketch of these comparison framings follows this list.)
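To make the first two framings concrete, here is a minimal sketch in Python. The probabilities are entirely hypothetical numbers chosen for illustration; they are not figures from the course readings.

# Hypothetical annual probabilities of illness, used only to illustrate framing.
risk_with_exposure = 0.0008      # illness rate among persons exposed to the hazard
risk_without_exposure = 0.0002   # illness rate among persons not exposed

# Framing 1: risk vs. no-risk -- state both probabilities side by side.
print(f"Exposed: {risk_with_exposure:.2%} per year; "
      f"not exposed: {risk_without_exposure:.2%} per year")

# Framing 2: comparison with a related risk -- express one risk relative to another.
related_risk = 0.0004            # hypothetical risk of the alternative hazard
print(f"The exposure risk is {risk_with_exposure / related_risk:.1f} "
      f"times the related risk")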
What happens when public perceptions of risk differ from the viewpoint of scientists?
Previously, we pointed out the pitfalls of approaching risk decisions from the perspective
that the scientific view is "actual" risk and the public view is "perceived" (and therefore,
incorrect) risk. It will be important for us to keep this false dichotomy in mind as we review
the literature on risk communication. For now, we consider the strengths and limitations of
three approaches to media communications with the public about complex and controversial
technologies:
1. One strategy for communicating to the public about risk would be to cater to public
fears about technology. Dr. Abbott states that this approach is used often, but is not
conducive to good public evaluation of technology because it tends to make scientific
evaluations irrelevant to the process.
2. A second strategy is to define away the problem (e.g., "America has the safest food
supply in the world"). As pointed out by Andrew Webster, taking this approach most
likely will result in lowering public confidence in science because technologies
inevitably will fail.
3. A third strategy is to facilitate dialogue between scientists and the public. This
strategy emphasizes the importance of explaining difficult terms and processes in
everyday language. It also stresses the importance of legitimizing fears and
presenting the technology as an alternative to other risks. Unfortunately, sometimes
when proponents of a technology claim they want to facilitate dialogue, what they
really mean is that they want dialogue in the form of "educating the public."
Application in Context
How Effective are Different Approaches to Communicating with the Public?
Later in this section of Sociology 415 we will review a videotape of a television
program that features a debate between proponents and opponents of food
irradiation. We will learn more about how proponents and opponents attempt to
reduce or create outrage about this technology. As you view the videotape, look for
instances where proponents attempt to define away the problem or dismiss fears
and where opponents try to cater to fears.
What can the mass media do to better present technology to the public?
Because journalists (like the rest of us) typically are ignorant of how advanced technology
actually works and lack adequate space to educate the public about the technology, they
must be careful about how they frame a story. Also, the mass media needs to pay careful
attention to emphasizing key points so the public can follow a story about a
controversial technology.
Good ideas do not sell themselves.
Everett Rogers, Diffusion of Innovations, Fifth Edition
Introduction
Everett Rogers' Diffusion of Innovations is the definitive source for learning strategies aimed
at gaining adoption of complex and controversial technologies. The diffusion of innovations
approach relies upon well-established theories in sociology, psychology, and mass
communications to develop a concise and easily understood approach to consumer
acceptance of new technologies. Rogers reminds us that the dissemination of technology,
given its inevitable unanticipated, unintended, and undesirable consequences for some, and
sometimes for all, entails a strong commitment to ethical standards of professional practice.
The mistake made most often in attempts at technology transfer is to assume that
transmission of the scientific facts about the technology will be sufficient to gain adoption of
it. Because science is known to fail, because factors other than technical risk assessments
affect decisions to adopt, and because for complex and controversial technologies the public
demands attention to its values and assurances of competence before giving its trust to the
developers and managers of these technologies, technology transfer strategies must find
ways to address value-based concerns, instill trust in technical risk assessments, and ease
the transition to using the new technology.
As we learned in the two preceding sections, critical elements of technology transfer include
implementing good risk communication skills and working with the media to facilitate
reasonable presentation of arguments in favor of and opposition to the new technology. To
effectively gain adoption (or rejection; this is the last time I will say both), however, one
must also:
Influence the social comparison process that provides the required connection
between persuasive arguments and choice shift (i.e., understand the social system),
Understand the innovation-decision process (i.e., the time sequence of adoption
decisions),
Assist in easing the transition to the new technology, which includes changing
attitudes, behaviors, and infrastructure support for the new technology (i.e., reduce
transaction costs), and
Mitigate negative consequences associated with new technology adoption.
The case illustration of Los Molinas (pages 1-5), where the change agent (Nelida) was able
to achieve only 5% adoption of water boiling in her Peruvian village, demonstrates the
influence of customs, interpersonal networks, opinion leaders, and change-agent
characteristics on adoption decisions. In Diffusion of Innovations, Rogers teaches us that
knowledge acquisition, risk evaluation, value acceptance, social/economic/political
constraints, adaptation to specific situations, time, money, and the expertise of change
agents all influence the adoption of an innovation.
Suggestions for reading the Rogers textbook for Sociology 415:
The presentation of materials on the next two web pages follows a different order from that
presented in the Rogers textbook. The suggested ordering for reading the textbook is:
Diffusion of Innovations, Part I on WebCT:
1. Chapter 1: Elements of Diffusion.
2. Chapter 6: Attributes of Innovations.
3. Chapter 8: Diffusion Networks.
4. Chapter 9: The Change Agent.
5. Chapter 5: The Innovation-Decision Process.
6. Chapter 7: Innovativeness and Adopter Categories.
Diffusion of Innovations, Part II on WebCT:
1. Chapter 11: Consequences of Innovations.
2. Chapter 3: Contributions and Criticisms of Diffusion Research.
Diffusion of Innovations: Part I
This section describes the diffusion of innovations model, explains its importance for
understanding public responses to complex and controversial technologies, and provides an
approach to gaining adoption if one chooses to do so.
Compass
Key Questions
How is technology adoption influenced by social factors?
How can the change agent influence the adoption of new technologies?
Examples
How do characteristics of the sampler technologies affect their rate of
adoption?
Who are the opinion leaders for each of the sampler technologies?
What strategies might be effective at gaining adoption/rejection of the sampler
technologies?
Elements of Diffusion (Chapter 1)
Diffusion is a process whereby an (1) innovation is (2) communicated through certain
channels (3) over time (4) within social systems. The approaches to risk communication
reviewed thus far in Sociology 415 emphasize the importance of understanding
characteristics of complex innovations and developing effective risk communication
techniques for instilling trust and reducing outrage. The diffusion of innovations approach
posits further that risk communication strategies differ over time and within different social
systems.
An innovation is an idea, practice, or object that is perceived as new. What might seem
familiar to some is new to others. Innovations can be material or nonmaterial. The adoption
of material innovations brings about changes in social relations, which means that
nonmaterial issues arise in the adoption of material innovations. That is, culture changes
with changes in material conditions. Understanding relationships among culture, values,
existing practices, and political/social/economic relations is a necessary element of
technology transfer.
Chapter 1 provides an overview of the diffusion of innovations approach. Please read it
thoroughly before proceeding through this WebCT presentation.
Characteristics of Innovations (Chapter 6)
Innovations vary in the extent to which they offer easily observed costs and benefits
compared with existing ideas or practices. The key characteristics of an innovation are its:
1. Relative advantage: the degree to which the innovation is perceived as better than
the idea it supersedes. Relative advantage refers to the extent to which the
innovation is more productive, more efficient, less costly, or otherwise improves
upon existing practices.
It might seem like relative advantage alone should be enough to persuade persons to
adopt an innovation. Certainly relative advantage is a key indicator of adoption. But
sometimes relative advantage is a matter of debate (e.g., legalized abortion), not
immediately evident (e.g., sustainable agricultural practices), complex to understand
(e.g., food irradiation), circumvented by economic/business/political circumstances
(e.g., the popularity of the VHS over the Beta format for home-use videotapes),
considered morally abhorrent (e.g., chemical warfare), or moderated by difficulties
involved in the transition from the old to the new (e.g., switching from traditional
television to HDTV).
Don't better ideas eventually win out? Not always (ask users of Macintosh
computers). And sometimes good ideas like genetically modified food (accept, for the
sake of argument, the value judgment here) undergo delays and considerable costs to
developers due to initial public resistance that might have been avoided if change
agents had focused upon factors other than just relative advantage (e.g.,
biotechnology companies have had to spend much money repairing public relations
because they did not anticipate public resistance in Europe to genetically modified foods).
Thus, good ideas do not sell themselves because "good" can be relative, not
immediately evident, complex to understand, circumvented by the market, considered
to be morally abhorrent, or difficult to implement.
2. Compatibility: the degree to which the innovation is perceived as being consistent
with existing values, past experiences, and needs of potential adopters.
Compatibility is the trump card for all innovations, even those with high relative
advantage. An innovation must be considered socially acceptable to be implemented.
And some innovations require much time and discussion before they become socially
acceptable.
If the idea seems morally irreconcilable, then the innovation will not be adopted
(e.g., euthanasia for the terminally ill is having a hard time catching on with the
American public; human cloning might never be accepted).
If the innovation is very different, or sometimes even just a little bit different, from
current practices, then the innovation might not be adopted (e.g., news reports state
that the U.S. Treasury might have to give up on Sacagawea dollars because people
do not like to use them).
3. Complexity: the degree to which the innovation is perceived as difficult to understand
and use.
An innovation need not be particularly complex from the viewpoint of its developers.
Feminists, for example, often complain that the public simply doesn't "get it." It is the
perception of the end user that matters most for achieving public adoption of a new
technology.
Food irradiation is difficult to understand, which is part of the reason it has been
slow to be adopted by Americans.
Personal computers were difficult to learn about when they first were
introduced, which slowed their adoption despite their clear relative advantages.
No-till farming was complex to understand and also difficult at first to
implement because one had to make required adjustments to existing
machinery oneself before manufacturers saw sufficient demand to mass
produce no-till equipment.
4. Trialability: the degree to which the innovation may be experimented with on a
limited basis.
Innovations are easier to adopt if they can be tried out in part, on a temporary basis,
or easily dispensed with after trial.
Nuclear waste storage facilities have to be located and built correctly the first
time.
There is no going back from affirmative action, civil rights legislation, legalized
marriage for gay/lesbian couples, and so forth.
5. Observability: the degree to which the results of the innovation are visible to others.
The chances of adoption are greater if folks can easily observe relative advantages of
the new technology. In fact, after some adopt, observability can improve the diffusion
effect, a critical component of technology transfer we will learn about later in Part I.
The advantages of genetically modified foods are not easily observable, at least
not at present, for consumers. Therefore, challenges to gm foods carry greater
weight than if gm foods had highly visible benefits.
A no-tilled farm field had negative observability at first because "good" farmers
did not leave plant residue on their fields; they instead left the ground clean of
plant residue with deep furrows.
Diffusion Networks (Chapter 8)
Communication and the Diffusion Effect
Mass media presentations create awareness, disseminate hardware (information about the
innovation), software (information about how the innovation works), and innovation-evaluation (information about how well the innovation works) messages, and provide
feedback to potential adopters about those who have adopted. Because they create
awareness, mass communications place some pressure upon opinion leaders to make
decisions about the new technology, the importance of which will be explained later in Part
I.
Interpersonal communications between experts and the public, opinion leaders and the
public, and among friends and family are equally as essential as mass communications in
bringing about new technology adoption. Knowing the viewpoints of close referent others
(e.g., family and friends) and opinion leaders is a critical element of the social comparison
process leading to choice shift.
Diffusion takes place within the context of structures of social relationships based upon
power, norms, and public acceptability. Recognizing the influence of social comparison
processes on technology transfer is the first essential contribution of the diffusion of
innovations model beyond the risk communication techniques addressed in previous
sections of Sociology 415. To understand the role of social comparison processes, we begin
by defining the diffusion effect as the cumulative increasing degree of influence upon an
individual to adopt or reject an innovation, resulting from the activation of peer networks
about an innovation in a social system.
Technology adoption, as a form of human agency, depends strongly upon social comparison
processes that lead to choice shift. Social comparison processes gather inertia as more
persons shift their choice in the prevailing direction of others. Consider the introduction of a
complex technology. This innovation creates uncertainties about safety, environmental
quality, and so forth. So, people listen to persuasive arguments in favor of and in opposition
to the new technology. The public, being ignorant (not irrational) about the science of the
technology, then faces the consumer's dilemma of choosing whom to trust. The social
comparison process then becomes critical because people seek information beyond that
provided by proponents and opponents; that is, they seek some indication of whom to trust.
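The way this inertia builds can be illustrated with a small simulation. The sketch below, written in Python, is my own simple threshold model rather than a model presented by Rogers: each person adopts once the share of adopters in the system reaches that person's personal comfort threshold, so a few venturesome adopters pull progressively more hesitant persons along.

import random

random.seed(1)

n = 100
# Personal thresholds: the share of prior adopters a person needs to see
# before feeling comfortable adopting.
thresholds = [min(max(random.gauss(0.35, 0.20), 0.0), 1.0) for _ in range(n)]

# A handful of venturesome innovators adopt regardless of what others do.
adopted = [False] * n
for i in random.sample(range(n), 5):
    adopted[i] = True

step = 0
while True:
    share = sum(adopted) / n
    newly = [i for i in range(n) if not adopted[i] and thresholds[i] <= share]
    if not newly:
        break
    for i in newly:
        adopted[i] = True
    step += 1
    print(f"step {step}: {sum(adopted)} of {n} persons have adopted")

Each pass through the loop is another round of social comparison; under these assumptions the count of adopters typically rises slowly, accelerates, then levels off, tracing the familiar S-shaped diffusion curve.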
The important aspect of social systems to recognize is that social collectivities have prestige
hierarchies; the opinions of some persons/organizations carry more weight than those of
others during the social comparison process. Rogers refers to these more prestigious
persons/organizations as opinion leaders. Opinion leaders, as highly prestigious social
comparison others, have the ability to sway choice shift towards adoption or rejection. Thus,
it is the opinions of opinion leaders that strongly influence adoption or rejection.
Keep in mind that technology adoption always brings about culture change. Thus, an
adoption decision is, in the sociological sense, a change in normative expectations (i.e.,
rules for behavior). Adoption, therefore, is not always a simple process, wherein the new
technology is incorporated within the society with very little change to structure and culture.
Sometimes, structure and culture must change considerably to adopt and the public
requires assurances from opinion leaders to make such a change.
Recognizing the importance of the viewpoints of opinion leaders in influencing adoption
decisions provides the change agent with insight into how to bring about desired change,
which is to focus upon gaining adoption by opinion leaders with the knowledge that it will be
opinion leaders who will persuade others to adopt. We will return to the role of the change
agent later in this section.
Models of Mass Communication Flows:
As noted regarding relative advantage, transmission of scientific facts about a new
technology sometimes is insufficient to gain adoption. Rogers refers to the hypodermic
needle model as the attempt to gain adoption of a complex and controversial technology by
transmission of facts alone. He states that this model has had limited success. The two-step
flow model, on the other hand, which posits that interpretations of facts are mediated by
interactions with others, particularly in learning the viewpoints of opinion leaders, has been
shown to provide better explanation of adoption of complex technologies. The "two steps"
refer to mass media presentations of the viewpoints of proponents and opponents followed
by interactions with others and opinion leaders.
Change agent communication with others is aided by homophily--similarity in socioeconomic
characteristics--and hindered by heterophily--dissimilarity in socioeconomic characteristics.
The negative effects on interpersonal persuasion resulting from change agent heterophily
with potential adopters can be mitigated by understanding and operating within
communication networks (i.e., interconnected individuals linked by patterned flows of
communication). The structure of a communication network might be such that change
agents can gain access to heterophilous opinion leaders by relying upon the strength-of-weak-ties provided by interstitial persons. Imagine a communication structure consisting
of two cliques of relatively heterophilous persons, wherein each clique is strongly influenced
by one opinion leader. Imagine further that one person (typically, not a strong opinion
leader) from each clique has a "weak" tie (i.e., occasional meetings, conversations; perhaps
a common interest) with one another. The "strength" of this weak tie between these two
interstitial (i.e., bridging) persons is that the change agent can ask the interstitial person
with whom he/she is homophilous to provide an introduction to the heterophilous interstitial
person and thereby gain access to the heterophilous opinion leader.
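A minimal sketch of this two-clique structure, using Python and the networkx library (the person names are invented; this is my illustration, not software used in the course), shows that the single weak tie between the interstitial persons is the only bridge between the cliques and carries all communication between them:

import networkx as nx

G = nx.Graph()

# Two tightly knit cliques, each organized around its own opinion leader.
clique_a = ["leader_A", "a1", "a2", "a3"]
clique_b = ["leader_B", "b1", "b2", "b3"]
G.add_edges_from((u, v) for i, u in enumerate(clique_a) for v in clique_a[i + 1:])
G.add_edges_from((u, v) for i, u in enumerate(clique_b) for v in clique_b[i + 1:])

# One occasional contact between the interstitial persons a3 and b3 is the
# weak tie bridging the two cliques.
G.add_edge("a3", "b3")

# Every path between the cliques runs through that tie, so it has the highest
# edge betweenness and is the network's only bridge.
betweenness = nx.edge_betweenness_centrality(G)
print(max(betweenness, key=betweenness.get))   # the a3-b3 weak tie
print(list(nx.bridges(G)))                     # the a3-b3 tie is the only bridge

A change agent who is homophilous with a3 can ask a3 for an introduction to b3 and, through b3, eventually reach leader_B.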
Characteristics of Opinion Leaders:
A key aspect of understanding how the social system affects diffusion is that social systems
have prestige hierarchies: some persons/organizations are more influential than others. The
social comparison process is affected most by opinion leaders. To effectively gain adoption
of a new technology, the change agent should know how to identify opinion leaders in the
social system. Sometimes, this task is fairly straightforward in that highly influential
persons/organizations can be named by members of the social system in a social survey. To
identify opinion leaders regarding food safety, for example, one might conduct a nationwide
survey of adults asking them whom they most trust for food safety information.
In other cases, for example within a community, opinion leadership can be more difficult to
identify. This segment describes opinion leaders and a procedure for identifying them within
a community.
The defining characteristic of opinion leaders is that they are well respected in their social
system. Respect can be associated with higher socioeconomic status (i.e., education,
occupation, income), but does not require it. Opinion leaders, for whatever reason, sway
adoption decisions through their influence (i.e., informal persuasion), not power (i.e., effect
on behavior arising from the use or threat of using force).
Monomorphic opinion leaders affect decisions within a relatively narrow range of issues
(e.g., the American Medical Association is influential regarding health-related technology
choice); polymorphic opinion leaders influence decisions across several issue areas (e.g.,
the opinion of the magazine Consumer Reports is respected on many topics).
Five Approaches to Identifying Opinion Leaders
The five approaches listed below vary in their expense of implementation and
accuracy in locating opinion leaders. To illustrate these approaches, they are
presented within the context of locating opinion leaders in a community, say for the
purpose of gaining adoption of a municipal bond levy to fund additions and
improvements to the school system.
1. Positional: In this approach, persons in elected or appointed positions in the
community are assumed to be opinion leaders. Thus, the school superintendent,
city council persons, and the mayor would be assumed to be opinion leaders on
school-related issues. This approach is inexpensive--one could learn with a
telephone call to the local courthouse who occupies elected and appointed
positions. But the approach can be highly inaccurate because it assumes
opinion leadership based upon position, rather than respect.
2. Self-Designating: Here, the change agent asks selected individuals to identify
themselves as being influential on school-related issues. The approach has the
advantage of getting input on influence from community members, and
therefore is more accurate than the positional approach. It requires a bit more
expense in that the change agent typically will travel to the community to
interview persons for the needed information. A potential pitfall of the
self-designating approach is that persons might over- or under-estimate their
influence on others.
3. Reputational: The reputational approach relies upon the nominations of selected
individuals on, for example, "the ten most influential persons in this community
regarding school-related issues." Using the reputational approach generally
improves the accuracy of identifying opinion leaders because one is getting
information from more than one source about the influence of others in the
community. Typically, persons using the reputational approach will "snowball"
their nominations from key informants. Key informants are persons who have a
thorough knowledge of the community and how it works: newspaper editors,
bankers, real estate agents, school superintendents, and city council members
make good key informants (the newspaper editor likely will only provide names
to talk with, rather than more information, due to issues of confidentiality).
Persons nominated by these key informants are contacted and asked to name their
"10 most influential persons...," and so on, until the list of nominations is
"snowballed" into a comprehensive list of persons. Using informal "eyeballing" of
the nominations, or sometimes very sophisticated network analysis software,
the change agent selects from all nominations the "most often nominated"
persons as "reputational" opinion leaders. Remember to ask about opinion
leadership with respect to some specific area of skill (e.g., "school-related
issues") because opinion leaders in one issue area might not be opinion leaders
in another area.
4. Sociometric: As noted by Rogers, opinion leaders typically are located at the
center of communication networks. Sociometry is the mapping, usually using
sophisticated network analysis software, of contacts among a potential list of
opinion leaders (usually those identified by the reputational approach). This
mapping of contacts helps the change agent locate persons who are at the
center of communications about the issue area. A question asked of
reputational leaders to map contacts might be, "How often do you contact
[person X] about school-related issues in this community?"
One interesting use of sociometric analysis is the identification of cliques of
leaders. Personal histories or acquired characteristics such as skin color or
gender can underlie the formation of leadership cliques in a community.
Sociometric maps can help identify "natural" boundaries among cliques of
opinion leaders. Sociometric maps also can help identify interstitial persons,
who link leadership cliques. Interstitial persons might be somewhat marginal to
their respective cliques, but because they are connected with other cliques,
they can provide the change agent with access to cliques with which it might
otherwise be difficult to establish rapport. Interstitial persons might have
a "weak" tie to one another (i.e., they might not contact one another very
often). But the "strength" of this weak tie is that it gives the change agent access
to different cliques of opinion leaders (a small network-analysis sketch
illustrating the reputational and sociometric steps appears after this list).
5. Observation: There is no substitute for observing social action within the
community. Some opinion leaders are not located at the center of a
communication network, but prefer by their personality to be located a bit
outside the everyday communication pattern. Also, reputation can be
misleading. If the sociometric analysis is conducted using reputational leaders,
an important leader might have been left off of the map altogether.
Observation, because of costs related to lodging, food, and travel, is the most
expensive of the techniques described here, but it is also the most accurate.
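As a rough illustration of how the reputational and sociometric steps fit together, the sketch below uses Python with invented informants, nominations, and contact reports; none of these names or numbers come from the course materials. It tallies who is nominated most often and then who is contacted most often about school-related issues, a crude stand-in for locating the center of the communication network:

from collections import Counter

# Hypothetical snowballed nominations: each key informant names the persons
# he or she considers most influential on school-related issues.
nominations = {
    "editor":   ["banker", "superintendent", "council_member"],
    "banker":   ["superintendent", "realtor", "council_member"],
    "realtor":  ["banker", "superintendent"],
    "council_member": ["superintendent", "banker", "pta_chair"],
}

# Reputational step: the most often nominated persons are candidate opinion leaders.
reputational = Counter(name for named in nominations.values() for name in named)
print("most nominated:", reputational.most_common(3))

# Sociometric step: pairs answering "I contact [person X] about school issues."
contacts = [
    ("editor", "superintendent"), ("banker", "superintendent"),
    ("realtor", "banker"), ("council_member", "superintendent"),
    ("pta_chair", "council_member"), ("banker", "council_member"),
]
centrality = Counter(target for _, target in contacts)
print("most contacted:", centrality.most_common(3))

Real sociometric work relies upon dedicated network analysis software and much larger rosters, but the underlying bookkeeping of counting nominations and incoming contacts is the same.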
The Change Agent (Chapter 9)
The change agent influences clients' innovation decisions in a direction deemed desirable by
a change agency. Change agents act as linkers between the change agency and clients.
The change agent:
1. develops a perceived need for change,
2. establishes an information exchange relationship (credibility),
3. diagnoses problems,
4. creates intent to change in the client,
5. translates intent into action,
6. stabilizes adoption and prevents discontinuance, and
7. achieves a terminal relationship.
Change agent success depends upon:
1. change agent effort,
2. change agency vs. client orientation,
3. change agent empathy,
4. homophily and change agent contact,
5. change agent contact with lower status clients,
6. effective use of paraprofessional aides,
7. working with opinion leaders, and
8. the client's evaluative ability to judge the innovation for themselves (the change agent should educate as well as diffuse).
Good listening skills are essential to change agent success in working with opinion leaders.
These links provide information on how to learn good listening skills:
Reflective Listening: Listen and send it back.
Active Listening: Communication in Organizations.
Centralized and Decentralized Diffusion Systems
The classical diffusion approach assumes a centralized research and development
organization that makes most decisions about the innovation and its diffusion. The
advantages of the centralized approach to technology development and dissemination are:
1. a collectivity of technical experts devoted to improving the quality of the technology,
2. coordinated efforts at technology transfer, and
3. an ability, albeit limited, to gain adoption of innovations that are unpopular but
important for societal well-being (e.g., seat belt requirements, anti-smoking campaigns,
environmental protection laws, civil rights legislation).
The decentralized diffusion approach entails technology development and dissemination
from small firms, local entrepreneurs, and grass-roots organizations. The advantages of
decentralized innovation development and diffusion are:
1. advancement of needed changes in the social system (i.e., social movements
regarding civil rights, feminism, environmentalism),
2. encouragement of local initiative in small firms,
3. local control of technology development, and
4. motivation for self-reliance.
The Innovation Decision Process (Chapter 5)
The presentation thus far has focused primarily upon relationships between the social
system and innovation adoption. This segment describes the time sequence of events
leading to adoption. The innovation-decision process is a theoretical model of the stages of
decision making resulting in confirmed adoption of a new technology. The process is one
example of the axiom underlying social-psychological approaches to explaining attitude and
behavior change called the hierarchy-of-effects principle. This principle states that:
knowledge causes
an evaluation (or attitude) that leads to
a commitment to take action that results in
behavior change.
The theoretical development of this principle in the mid-1960's coincided with the
formulation of the innovation-decision process and other conceptual approaches to
explaining behavior change related to attitude change (in contrast to stimulus-response
approaches to behavior change).
Stages of the Innovation-Decision Process:
1. Knowledge. Most often, potential adopters become aware of the innovation through
mass media messages distributed by news outlets, trade journals, internet web sites,
and scientific publications. Because consumers engage in selective exposure to
preferred sources of information and selective perception of certain types of
information, change agents must carefully plan their presentations of hardware,
software, and innovation-evaluation information. Knowledge acquisition about
low-involvement innovations--new products with few perceived risks (i.e., consumer
goods)--raises uncertainties for the consumer. Is this a high quality product? Is it
being sold at a good price? Will it be a popular choice for others? Learning about high
involvement innovations--complex, controversial technologies--raises these same
uncertainties and many more. Am I being told the whole truth about this technology? Is it
safe for me and others? Will its adoption lead to inequities in the sharing of risk?
Knowledge diffusion can be a difficult period for proponents of a new technology.
Much more information must be disseminated than for low involvement innovations.
The information is more technical and, by nature, less certain because the technology
is new. Most importantly, perhaps, active opposition groups disseminate unfavorable
messages about the technology. It is critical for proponents to recognize that,
because negative information carries disproportionate weight, they usually are at a
disadvantage during the knowledge stage of diffusion. It is equally critical for
proponents to recognize that the consumer is not being irrational by not immediately
accepting the scientific viewpoint of a new technology, but instead is being justifiably
skeptical of a new technology that is being opposed by consumer advocacy
organizations.
2. Persuasion. For low involvement innovations much of the diffusion process rests upon
marketing principles of product, pricing, place, and promotion. Gaining adoption of
high involvement innovations also requires attention to these four p's, but demands
further that the social comparison process be influenced by opinion leaders supportive
of the technology because, unlike for low involvement innovations, consumers are
being exposed to messages that oppose high involvement innovations. Thus, gaining
adoption of a complex, controversial technology requires a good product, price, and
so forth, but it also requires respected opinion leaders who support it and counter
the opposition arguments. One has to sell a low involvement technology to a passive
audience; one has to sell a high involvement innovation to an audience that is
exposed to active opposition to it.
Whereas opponents typically have the advantage at the knowledge stage, proponents
usually gain the advantage at the persuasion stage. This shift occurs because
research and development organizations usually are university based or are otherwise
respected technology development firms. Thus, they enjoy the reputation of being
relatively correct in their risk assessments and trustworthy in their pursuit of
improving society. Respected opinion leaders, therefore, because they have close
contacts with centralized research and development organizations and because they
know that most often the technologies produced by these organizations will be based
upon sound scientific principles, support the new technology. Given that support from
opinion leaders is critical to gaining adoption of high involvement innovations in the
face of arguments by well-organized opposition groups, proponents typically regain
their lost initiative at the persuasion stage.
We might at this time begin a healthy debate about the characteristics of an ideal
society. We might discuss and debate about the concept of progress. We might argue
that scientists usually receive support from opinion leaders because they usually are
correct. We might also consider the interlocking nature of relationships among
powerful research and development organizations and opinion leaders and whether
these relationships further the common good. The good change agent, as we will
discuss in Part II of this section, will ask many questions about the inevitable negative
consequences of new technology adoption. For now, it is important to realize that
centralized research and development organizations and opinion leaders often are of
the same mind and therefore proponents usually have the advantage over opponents
at the persuasion stage.
3. Decision. The decision that the innovation is worthy of being adopted represents a
major advance for proponents of a high involvement technology. Proponents, with
support from opinion leaders, have overcome opposition arguments to convince
consumers to accept the technology. This act of symbolic adoption, however
important it is, does not assure behavioral adoption. Symbolic adoption by more and
more consumers does add inertia to the diffusion effect. As more persons adopt, there
is increasing pressure for non-adopters to adopt. This pressure to adopt comes about
because adoption of a new technology:
Oftentimes brings about changes in related technologies. Changes in computer
hardware and software capabilities, for example, often go hand-in-hand,
making it difficult to hold on to a personal computer and still be able to utilize
software that others have adopted.
Can be accompanied by changes in infrastructure support for older
technologies.
Can sometimes bring about changes in laws that favor the newer technology.
Can shift economic advantage to use of the newer technology.
Is accompanied by cultural changes that favor the newer technology.
That is, not adopting sometimes can bring about social, economic, and political
disadvantages as others adopt.
4. Implementation. Implementation refers to the initial trial period for the new
technology. The move from symbolic adoption to implementation is not necessarily an
easy one. Obstacles to implementation include:
Transaction costs: It might be expensive to make the move to the new
technology, even though it has long-term economic advantages.
Infrastructure support: Because the technology is new, technical support,
servicing, retail chains, and other aspects of market development might not be
sufficient to encourage implementation.
Personal decisions: The end user might recognize the relative advantages of the
new technology, but find themselves with a cash-flow problem, in the middle of
another transition, or at the end of their career and not willing to invest in
change that reaps only long-term benefits.
Implementation often entails re-invention, an alteration of the innovation by the
adopter. Adopters alter the new technology to fit their specific needs. Sometimes,
alterations are trivial in nature, reflecting more a narcissism of small differences
than a substantive change in the makeup or functioning of the innovation.
Such modifications might nevertheless be important for confirmation in that people
usually like to feel some sense of ownership over new technologies. The advantages
of re-invention include:
increased flexibility in applications of the innovation,
increased relative advantage for local use, and
increased sense of ownership over the new technology.
Re-invention can create problems for the adopter, however, and is not always
encouraged by research and development organizations. Disadvantages of
re-invention include:
improper application leading to less effectiveness of the innovation,
inability of the research and development organization to maintain quality
control over the technology in use,
legal problems if the change infringes upon the protection of a closely related
technology.
5. Confirmation. Confirmation involves seeking of reinforcement for the adoption
decision and integration of the new technology within the framework of existing
practices.
Because social comparison is critical to adopting high-involvement innovations,
reinforcement of the social acceptability of the innovation after implementation is an
important aspect of the innovation-decision process. Social psychologists working in
the 1950's recognized the importance of dissonance reduction for behavior change.
Once a difficult decision has been made, the adopter finds it psychologically satisfying
to accentuate the good reasons for making the decision to adopt and downplay the
good reasons for not adopting. Note for yourself how your thoughts about the good
qualities of that other automobile (an expensive item for most persons to purchase)
diminish after taking ownership of the automobile you selected to purchase. This
game we play to soothe our anxieties about difficult decisions becomes more important
the greater the stakes involved in the adoption decision. Adopters of complex,
controversial technologies, therefore, look for signals that their decision was the
correct one. Good change agents, therefore, will reinforce the decision and seek ways
to facilitate the transition to using the new technology (most likely, your automobile
dealer contacted you shortly after your purchase to confirm your decision and seek
your feedback on the product).
Discontinuance, or rejection of a technology, can occur at any time, including during
confirmation. Replacement discontinuance occurs when a better innovation is
introduced and adopted. Disenchantment discontinuance results when problems arise
with the design or usefulness of the innovation that were not anticipated. Highly
complex innovations can be discontinued when persons think they can master them
but find they cannot. Changes in policy or in economic, social, or environmental
conditions can lessen the effectiveness of the innovation. Nothing is certain but
change, right?
Innovativeness and Adopter Categories (Chapter 7)
Experience has taught diffusion scholars that adopters can be classified within five
categories: innovators, early adopters, early majority, late majority, and laggards. The
specific percentage of adopters in each category is not critical information; neither are the
differences in characteristics that separate any two of the categories. The importance of the
classification scheme is to highlight that the characteristics and needs of potential adopters
differ during the diffusion process. Of special importance is recognizing the roles played by
innovators and early adopters.
Persons can be innovators with respect to one new technology but laggards with respect to
another. People do, however, tend to exhibit socioeconomic and psychological qualities that
place them within certain adopter categories (a small sketch following this list illustrates
one way to assign categories from adoption dates):
1. Innovators (first 5 percent of adopters) tend to be venturesome, cosmopolite,
networked with other innovators, have available financial resources, understand
complex technical knowledge, and be able to cope with uncertainty. Change agents
should recognize that, for high-involvement innovations, innovators do not
significantly affect adoption decisions. Innovators, by definition, are too socially
marginal to gain the respect needed to be an opinion leader. Thus, while adoption by
innovators might encourage the change agent (as it did Nelida in Los Molinas), it
cannot be expected that innovators will generate much diffusion effect.
2. Early Adopters (next 10 percent of adopters) are respected and more local than
innovators. It is from this category that the change agent should expect to locate
opinion leaders. These persons are venturesome, but sufficiently skeptical to
distinguish good innovations from poor ones. Because opinion leaders have more
influence on the diffusion effect than persons in any other adopter category, it is
persons in this category that the change agent attempts to persuade to adopt.
3. Early Majority (next 35 percent) tend to interact frequently with peers, seldom hold
positions of opinion leadership but have strong interconnectedness within the
system's interpersonal networks, and tend to have a long period of deliberation before
making an adoption decision.
4. Late Majority (next 35 percent) tend to adopt from economic/social necessity due to
the diffusion effect. They usually are skeptical and cautious and have few extra
resources to risk on high-involvement innovations.
5. Laggards (final 15 percent) are the most localite, suspicious of change agents and
innovations, and have few resources to risk. It might sound as if the laggards are a
doltish lot. In fact, persons within this category might be highly innovative in their
symbolic adoption but slow to implement because they have few financial resources to
offset transition costs or little access to innovation-evaluation information. By
coincidence or design, laggards are the "smartest" ones when seemingly beneficial
innovations become unexpectedly costly or ineffective.
The inability of some to adopt when they would like to do so underscores the fact that
new technology adoption can further existing inequalities. That is, if the new
technology creates economic advantages, but requires resources to offset transaction
costs, then income inequalities can widen as a result of new technology adoption. The
innovativeness-needs paradox refers to the social problem wherein the individuals
who most need the benefits of an innovation generally are the last to adopt it.
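If dates of adoption are known, the category labels above can be assigned mechanically from each adopter's position in the cumulative order of adoption, using the percentage cutoffs given in this list (5, 15, 50, 85, and 100 percent). The sketch below is written in Python with made-up adoption years; it illustrates the bookkeeping only and is not a method prescribed by Rogers.

# Category cutoffs as cumulative percentages of all adopters.
cutoffs = [(5, "innovator"), (15, "early adopter"), (50, "early majority"),
           (85, "late majority"), (100, "laggard")]

# Hypothetical adoption years for 20 farms adopting a new practice.
years = [1991, 1992, 1993, 1993, 1994, 1994, 1994, 1995, 1995, 1995,
         1996, 1996, 1996, 1997, 1997, 1998, 1998, 1999, 2000, 2002]
adoption_year = {f"farm{i:02d}": year for i, year in enumerate(years, start=1)}

ordered = sorted(adoption_year, key=adoption_year.get)
for rank, unit in enumerate(ordered, start=1):
    percentile = 100 * rank / len(ordered)
    category = next(label for cut, label in cutoffs if percentile <= cut)
    print(f"{unit} adopted in {adoption_year[unit]} -> {category}")

With these 20 hypothetical adopters and cutoffs, one unit is classified as an innovator, two as early adopters, seven each as early and late majority, and three as laggards.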
Empirical Example
In "The Social Fabric and Innovation Diffusion: The Case of Food Irradiation," Stephen Sapp
and Peter Korsching describe how the diffusion approach can be used to understand
consumer opinions of food irradiation. They found that, while information about food
irradiation from its opponents can create negative opinions about it, over time these
opinions become more positive due mainly to the consumer's compliance with the
viewpoints of opinion leaders. Thus, their findings support the diffusion of innovations
approach in showing that opinion leader influence can have a significant effect on consumer
opinions.
Changing people's customs is an even more delicate responsibility than surgery.
Edward H. Spicer, Human Problems in Technological Change.
These things must be done delicately, or you hurt the spell.
The Wicked Witch of the West, The Wizard of Oz.
Introduction
Research and experience have shown that the diffusion of innovations approach is highly
effective in gaining adoption of many types of innovations across a wide variety of settings.
The question that remains to be addressed in Sociology 415 is, "Should sociologists become
involved as change agents?" Recall that we are thinking of the sociologist as employed by
the public with funding from their tax dollars. Recall also that all innovations inevitably bring
about negative consequences for some members of the population. So, should a publicly
funded employee seek change that will benefit some and create negative consequences for
others? Should the sociologist, as a publicly employed scientist, advocate for/against
adoption of genetically modified food, food irradiation, affirmative action, feminism,
environmentalism, use of contraceptives ... ?
This question is as old as the field of sociology and cannot be answered by policy. The
answer is left to the judgment of the sociologist. What Rogers asks us to keep in mind as we
consider our options is that sometimes seemingly benign actions aimed at societal
improvement undertaken with the best of intentions can create very negative consequences
(e.g., read "Steel Axes for Stone-Age Aborigines," pages 421-422 in Rogers, or for
examples closer to home, read about the failed Pruitt-Igoe Housing Project or the
controversial social program of forced school busing). In Chapters 3 and 11, Rogers
describes how social change can result in unintended, unanticipated, and undesirable
consequences and recommends approaches sociologists can take to avoid negative
consequences when they become involved in the delicate process of acting as change
agents.
Compass
Key Questions
What are the ethical obligations of the change agent?
How can negative consequences associated with technology adoption be
mitigated?
Examples
What negative consequences might be anticipated as a result of societal
adoption or rejection of the sampler technologies?
What policies might be adopted to mitigate negative consequences of societal
adoption or rejection of the sampler technologies?
Contributions and Criticisms of the Diffusion Approach
This final section of Sociology 415 reviews contributions and criticisms of the diffusion of
innovations approach and suggestions offered by Everett Rogers for mitigating negative
consequences associated with new technology adoption. The central point of Chapters 3 and
11 is that change agents, given their accountability to all citizens, have a responsibility to
address negative consequences.
Contributions of the Diffusion of Innovations Approach
Rogers first reviews contributions of the diffusion approach in helping sociologists and other
change agents gain adoption of new technologies.
1. The diffusion model is relevant to many disciplines and topics. The model enjoys
much popularity among a wide variety of academic disciplines, public agencies, and
private firms in providing insight into adoption decisions and strategies for gaining
adoption. The approach works, it works in many settings worldwide, and it has done
so for many years. The diffusion of innovations model serves as the foundation for
social change programs throughout the world.
2. The model has a strong applied focus. Although the model is supported by much sound
social science theory, one need not be a professional social scientist to develop practical
programs for social change. The methodology of diffusion is clear-cut and relatively
easy to implement.
Criticisms of the Diffusion of Innovations Approach
Because the sociologist is supposed to engineer society in a favorable manner, the
unintended, unanticipated, and undesirable consequences of technology adoption need to
be foreseen and mitigated as much as possible. No small task!
This section describes criticisms of the diffusion approach. Not all the criticisms reviewed by
Rogers are presented here. Some of them are relevant only to professional sociologists
conducting research on the model itself. Please read all the criticisms of the model
presented in Chapter 3; only the ones likely to be of interest to most persons are
listed below.
Overadoption
Overadoption is adoption when experts suggest rejection or less widespread adoption. This criticism is
a variation on the theme that one can have too much of a good thing. Too much housing
development in certain locations, for example, can be detrimental to environmental quality.
Too much use of antibiotics in medications, animal feed, and cleaning products frightens
microbiologists who express concerns about bacteria developing a strong resistance to
antibiotics, thereby becoming "supergerms" that will be difficult to defeat. Sometimes, good
innovations should not be adopted by persons who cannot afford them or cannot use them
wisely because of insufficient knowledge of how they work.
Thus, the delicate task of social change is fraught with many hidden dangers. One value
choice leads to another. Questions that come to mind are:
Which innovations should be diffused?
Who should have them?
Who should not have them?
Should limits be placed upon technology adoption?
Pro-Innovation Bias
The pro-innovation bias is the implication that the innovation should be adopted by all
members of the social system. New technologies offer wonderful promises for a better,
brighter tomorrow. Many technologies have lengthened the life span, eased burdens, and
provided much entertainment and pleasure. Technological failures, however, sometimes
bring about heartbreaking catastrophes. Given their responsibilities to all citizens to improve
society, sociologists are obligated to investigate potential negative consequences rather
than blindly accept the promises of new technologies.
The sociologist must be critical in evaluating new technologies, recognizing that some
technologies are produced for and by the power elite. Reduction of inequalities sometimes
requires the sociologist to note that adoption will increase inequalities or cause the less
powerful to bear a disproportionate share of the risk.
The Individual-Blame Bias
The individual-blame bias is a tendency to blame individuals for their non-adoption. Of
course, some persons are laggards simply because they do not like change, are slow to
understand new technologies, and so forth. The responsible change agent, however, must
look beyond such individualistic explanations to fully understand non-adoption because not
all laggards are ignorant, resistant to change, or otherwise personally predisposed to reject
new technologies.
One course of action for the change agent to pursue in understanding non-adoption is to
investigate how the characteristics of the innovation might influence some persons to be
laggards.
It might be, for example, that laggards fully understand the features of the innovation
but do not find it compatible with their values (e.g., Amish and Mennonites reject
many technologies based upon their religious beliefs).
Laggards might want to adopt an innovation but do not have the financial ability to do
so (e.g., safer automobiles tend to be more expensive to purchase).
It might be that laggards do not have a good opportunity to adopt because the
innovation is not easily available to them (e.g., a person might be anxiously awaiting
the opportunity to have a cable connection to the internet when it becomes available
in their geographic area).
The explanations provided above for non-adoption focus upon legitimate reasons why some
persons are laggards. The system-blame perspective, as a second explanation of
non-adoption, seeks to understand why many persons rather than just laggards do not
adopt. Why, for example, were many persons reluctant to wear seatbelts when driving their
automobiles? Blaming persons for being ignorant, lazy, and so forth when many are
aware of a new technology is not an adequate explanation for widespread
non-adoption. To understand why many persons do not adopt an innovation that seems
beneficial, sociologists must investigate system-level constraints to adoption. What might
seem like individual reluctance to adopt might be a symptom of cultural or structural
conditions in society that impede adoption.
It might be that issues of compatibility are widely felt within a society (e.g., many
persons reject abortion because they consider it to be immoral, or said another way,
they see abortion as harmful rather than as beneficial).
It might be that societal infrastructure does not encourage adoption (e.g., it has
taken time for policies to be developed to support individual use of ethanol as a fuel
additive).
It might be that societal infrastructure erodes the effects of adoption (e.g., Ralph
Nader, in Unsafe at Any Speed, pointed out that, even if drivers wore seatbelts,
automobile and highway construction technologies and policies significantly
contributed to motor vehicle injuries and deaths).
It might be that societal-level practices constrain adoption (e.g., a colleague noted
that practices in some countries encourage unsafe use of dangerous pesticides because
powerful interest groups are capable of influencing national policies).
No-till farming, for example, was difficult to diffuse because the prevailing culture defined
a "good farmer" as one who removed all crop residue from the field and cut deep furrows
with mole board plows after harvest. Redefining the "good farmer" was a critical part of
gaining adoption of no-till conservation practices. A second system-level explanation for
slow adoption is that new innovations might threaten the status of the power elite. If so,
then the power elite might create barriers to the development and dissemination of the
new technology.
Issues Related to Furthering Inequalities
The critical perspective of sociology asserts that the powerful elite will intentionally
encourage the development of technologies that maintain or further their class standing.
Such activity might or might not occur; it is difficult to prove technological conspiracies to
further inequalities. Proving intent by the powerful elite, however, is unnecessary when
investigating negative consequences of technology adoption. Whether by intent of the
powerful elite or not, new technology adoption can further inequalities between upper and
lower classes.
New technology adoption can further inequalities for several reasons:
1. People in upper classes typically have greater input to research and development
planning and decision making. Through their contacts and memberships on
committees and advisory councils they can suggest needs for technologies that seem
appropriate to them and consequently tend to benefit them more than they benefit
persons in lower class positions.
2. Upper class people are more likely to hear about new technologies earlier than are
persons in lower class positions. Earlier knowledge of emerging technologies gives
upper class persons an edge in planning for change. Upper class persons are more
likely to hear innovation-evaluation information earlier than their lower class
counterparts and therefore know in advance how well a new technology works.
3. By definition, upper class persons have greater economic resources that allow them to
take risks on new technologies that would be too great for persons with less money.
Thus, because upper class persons have more input to technology research and
development, learn earlier than other persons about new technologies, and are in a better
position to take economic risks, they are able to take advantage of beneficial technologies
earlier than persons in lower class positions.
These privileges can further inequalities because oftentimes the marginal benefits of a new
technology are highest at its early stages of implementation. Profit margins are higher early
on and decrease as competing firms produce similar technologies or as patents expire.
Competitive advantages of increased efficiency or productivity are higher early on and
decrease as others adopt similar practices. Persons in position to adopt early, therefore,
reap more benefits. Rogers refers to advantages accrued through early adoption as windfall
profits. If the elite adopt a beneficial technology early, then inequalities increase, whether
through intentional conspiracies or not.
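To make the windfall-profit argument concrete, the short sketch below (in Python) uses hypothetical numbers; the 30% starting margin, the 5% floor, and the linear decline are assumptions chosen only for illustration, not figures from Rogers. It shows how the margin available to an adopter can shrink as the share of adopters grows:

# Illustrative only: hypothetical figures showing how the marginal benefit
# of adopting an innovation can shrink as more competitors adopt it.

def margin_at(adoption_share, initial_margin=0.30, floor=0.05):
    """Profit margin as a function of the share of firms that have adopted.

    Assumes a simple linear erosion from initial_margin (first adopters)
    down to floor (market saturated); real erosion curves vary widely.
    """
    return floor + (initial_margin - floor) * (1.0 - adoption_share)

for share in (0.05, 0.25, 0.50, 0.75, 1.00):
    print(f"adoption share {share:4.0%} -> margin {margin_at(share):.0%}")

# Early adopters (5% adoption) enjoy roughly a 29% margin in this sketch,
# while the last adopters face margins near the 5% floor -- the windfall
# advantage of being positioned to adopt first.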
Exploitation of Weaker Social Systems
The Enlightenment perspective is that "the more advanced the technology the better"
because everyone benefits from increased productivity and efficiency. The critical
perspective, on the other hand, asserts that technology adoption might decrease rather
than increase quality of life. The critical perspective outlines four ways in which the more
powerful can exploit resources from the less powerful through diffusion of new technologies:
1. Economic Leakage refers to loss of potential income to another social system.
Economic leakage occurs, for example, when people living in one town conduct most
of their shopping in a neighboring town; money "leaks" out of their home town.
To better understand the process of economic leakage and how it can be a
mechanism of exploitation it is instructive to place the phenomenon within the context
of different levels of economic development. A primary economy is one that focuses
mainly upon raw commodity production (e.g., growing food crops, mining raw ore). A
secondary economy is capable of further manufacturing of raw commodities; it adds
value to raw commodities (e.g., processing food products, finishing metals and
minerals). A tertiary economy finances raw commodity production and manufacturing
and coordinates trade in raw and value-added goods. Typically, profit margins
increase in more advanced economies (a brief numeric sketch of this point follows this list).
Social systems with secondary economies are in a position to further economic
leakage from social systems with primary economies by providing them with
technologies that increase their capacity to produce raw commodities. The transfer of
production technology from the secondary to primary economy serves as an
investment that increases profits to the secondary economy because the primary
economy can produce more raw materials at a lower unit cost. Similarly, tertiary
economies are in a position to increase their profits by transferring technologies to
primary and secondary economies that increase their capacities to produce and
process raw commodities.
From the structure-function perspective, such activity increases the productivity and
efficiency of the global capitalist system and is therefore beneficial to all. From the
critical perspective, technology transfer for the purpose of increasing raw commodity
production and processing by a social system with a tertiary economy amounts to
exploitation of weaker social systems. From either perspective, the long-term
well-being of social systems with primary economies can be decreased if tertiary
economies transfer only raw commodity production technologies to them because less
developed social systems can find it difficult to increase their capabilities to develop
secondary or tertiary economies. They advance technologically in producing raw
commodities, but they remain less developed because they do not advance
technologically in adding value to raw commodities.
Americans sometimes wonder why the United States has liberal trade policies for
transferring production technologies to less developed countries. Understood from
the perspective of economic leakage, one realizes that, as a social system with a
tertiary economy, the U.S. benefits by providing technologies that increase the rate
of production of raw commodities because it will enjoy the higher profit margins of
financing greater commodity production and processing.
2. Dependency refers simply to relying upon other social systems to support production
in the host social system. If, for example, a less developed country adopts
technologies more advanced than its current capabilities, it becomes dependent
upon the core country that provided the technology to train persons to use the
technologies, repair them, and integrate them within production systems that rely
upon other technologies provided by core countries.
Another advantage to the United States of transferring production technologies to
less developed (i.e., host) countries is that they become dependent upon the U.S.
for training users of the technologies and providing parts and other support services
for technology maintenance.
3. Political influence of leaders of host countries facilitates transfer of technologies that
can further the economic position of core countries. Because technology transfer is an
important element of trade, trade policies can transcend differences in political and
ideological outlooks.
The United States has supported dictatorships and traded with communist nations
and nations with many human rights violations whose trade policies favor U.S.
economic development and technology transfer.
4. Too rapid social change sometimes can result from technology transfer. Indicators of
too rapid social change are feelings of anomie (loss of common values, sense of
mission, or feelings of belonging) and alienation (feelings of being unimportant,
disconnected from others) that can result in higher rates of crime, family disruption,
and loss of productivity. Rapid social change can create vulnerabilities in social
structure and functioning that allow core countries to exploit local resources or
influence technology and trade policies.
The web pages listed below discuss issues of rapid social change and negative
consequences that can result from too rapid adoption of core-country technologies.
The Consequences of Culture Change.
Globalization and Rapid Social Change.
Social Change After the September 11 Terrorist Attack on the United States.
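The numeric sketch referenced in the economic leakage discussion above is given below. The dollar figures and margins are invented solely for illustration (they are not drawn from the readings); the point is only that the share of value captured tends to grow at each stage of the commodity chain:

# Hypothetical value chain for a raw commodity; figures are illustrative only.
# Each tuple: (stage, revenue per unit, cost per unit).
stages = [
    ("primary (grow or mine the raw commodity)", 100, 90),   # thin margin
    ("secondary (process and add value)",        160, 130),  # wider margin
    ("tertiary (finance and coordinate trade)",   40, 20),   # widest margin
]

for name, revenue, cost in stages:
    profit = revenue - cost
    print(f"{name:45s} profit ${profit:3d}  margin {profit / revenue:.0%}")

# In this sketch the primary economy keeps a 10% margin, the secondary 19%,
# and the tertiary 50%; income "leaks" toward the more advanced economies
# that process, finance, and coordinate trade in the commodity.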
Rogers summarizes this classical approach to development in a chart that shows the
dominant paradigm of development and suggested alternatives to it.
Ethical Responsibilities of Change Agents
Everett Rogers suggests ways the change agent can act responsibly in diffusing information
about new technologies. The guiding principle is that, because the diffusion approach is
effective in gaining adoption of innovations, the change agent has a responsibility to explore
potential negative as well as positive consequences associated with technology adoption.
Principles of Ethical Conduct
Ethical conduct for a change agent involves asking questions that explore motives for
technology development and dissemination and potential negative consequences of
technology adoption.
Typical Diffusion Questions:
Rogers notes that change agents too often focus too strongly upon gaining
adoption. In so doing they ask questions that address only adoption:
1. How are technological innovations diffused in a social system?
2. What are the characteristics of innovators, etc.?
3. What is the role of opinion leaders in the interpersonal networks through which a new
idea diffuses in a social system?
More Appropriate Questions:
Rogers states that more appropriate questions explore the social context of technology
development and dissemination:
1. What criteria guide the choice of which innovations will be diffused?
2. Who decides which innovations should be developed?
3. Who controls the communication channels?
4. What is the nature of the society's social structure and what will be the impact of the
innovation on this structure?
Strategies for Reducing Inequalities
Rogers suggests strategies the change agent can use to reduce gap-widening consequences
of new technology adoption. He organizes these suggestions into the three major reasons
why socioeconomic gaps widen as a result of innovation adoption:
Strategies When Ups Have Greater Access to Information:
1. Provide redundant information to allow Downs the opportunity to hear and understand
information about innovations.
2. Tailor messages to Downs. Messages, for example, might rely more upon drawings
and pictures than text.
3. Tailor media to Downs. Rogers points out that in the United States, for example,
persons of lower socioeconomic class depend upon television more than print media
to learn about innovations.
4. Organize small group presentations to Downs to help them understand innovations.
5. Encourage more change agent contact among Downs. Rogers notes that, because
change agents typically are more homophilous (i.e., similar in social characteristics)
with Ups and heterophilous (dissimilar in social characteristics) with Downs, they tend
to focus too much of their attention on gaining adoption among Ups rather than
Downs. Rogers urges change agents to more actively seek contact with Downs.
Strategies When Ups Have Greater Access to Innovation-Evaluation Information:
1. Typically, change agents identify opinion leaders that influence Ups. Rogers suggests
that change agents identify opinion leaders among Downs to provide them with
innovation-evaluation information.
2. Use change agent aides among Downs. Rogers notes that experience has shown that
using paraprofessional aides recruited from among Downs helps Downs better
understand innovation-evaluation information and gain trust in change agents.
3. Organize groups among the Downs. Rogers reinforces the need for change agents to
have more contact with Downs, including organizing interest groups who might serve
as information providers for a wide variety of innovations.
Strategies When Ups Possess Greater Slack Resources:
1. Encourage core social systems to develop and disseminate technology appropriate to
the socioeconomic conditions of the host social system. Oftentimes, the technology of
most use to host social systems is much less advanced than the technology most
advocated by the core social system. Appropriate technology can be less expensive to
implement and lessen dependency upon core social systems.
2. Urge change agents to help Downs form cooperatives to provide them with
purchasing and selling power in relation to core social systems.
3. Encourage research and development organizations to include Downs in the planning
and dissemination of new technologies.
4. Advocate for core social systems to fund special agencies to work only with the
Downs.
5. Encourage change agents to place more emphasis on diffusion of decentralized
innovations. Decentralized innovations, in contrast with centralized innovations,
encourage greater sharing of power and information. They take more of a
problem-centered approach to technology development, are highly adaptable to local
conditions, come from experimentation by nonexperts in local settings, and give
power to local people to make adoption decisions.
Summary
Because all technologies bring about negative consequences for some, the delicate process
of technology transfer is, itself, a complex and controversial task. Change agents, knowing
the power of the diffusion approach in gaining adoption, must be aware of the
responsibilities that accompany the application of diffusion strategies. Change agents should
strive to reduce the gap-widening effects of new technology adoption.
The choice of technology, whether for a rich or poor country, is probably the most important
decision to be made.
George McRobie, Conservation Foundation Letter, October, 1976.
Science, Technology, and Society
1. Because technology is embedded within a social context, it is influenced by social,
political, and economic interests and its transfer from one social system to another
can be problematic.
2. Expert opinion regarding the production of technology does not necessarily imply
expert opinion regarding the use and transfer of technology.
3. Evaluation of technology is exceedingly difficult, and depends upon a wide range of
indicators, including ones outside the domain of science (e.g., is legalized abortion
moral?).
Social scientists should pay greater attention to:
1. the 'political economy' of the scientific laboratory,
2. the organization and culture of private sector research and development,
3. the impact of public interest groups on science and technology,
4. integrating other social sciences into the sociology of science, and
5. building linkages between the sociology of science and public policy makers who
influence the direction of science and technology.
The Philosophy of Technology
1. From the Classical perspective, technology is neither good nor bad, but a simple
derivation from the immutable laws of the universe.
2. From the Enlightenment perspective, science provides a means to dominate nature
through an ongoing process of improving technology and solving social problems.
3. From the Critical perspective, technology is created by and for the benefit of the
power elite who use technology to exploit resources from the less powerful.
4. Risk is not necessarily a neutral language. It might represent the deeply interested
knowledge of those who are able to command it.
5. "The real uncertainty at stake in the language of risk is the relationship between
power and democracy."
The Philosophy of Science
1. It is impossible for science to be objective, value-free, and unbiased.
2. The best approach to achieving as much objectivity as possible is to rely upon the
intersubjective opinion of the community of scholars.
3. The community of scholars, however, harbors inherent biases characteristic of all
human collectivities.
4. If science is inherently biased, then so is technology in its development, assessment,
dissemination, and management.
Social Philosophy
1. The social structure paradigm views society as if it were a living organism. Society is
conceptualized as having parts, or institutions (i.e., economy, religion, education,
family, politics), wherein each part performs an essential function and works in
harmony with the other institutions for the benefit of the whole society. This emphasis
on society as a whole means that technologies are evaluated for how they benefit the
overall efficiency and productivity of the society in meeting its needs for survival.
2. From the critical perspective, society is a system of competing parts in conflict for
scarce resources. All social systems are considered to have a small minority of power
elites who control most of the functions of society. All social action, including the
development and dissemination of technology, takes place within an arena of conflict
and exploitation of secondary segments of society by dominant segments of society.
3. The human agency paradigm focuses not upon societal institutions or power
relationships within society, but upon interactions among the members of the society.
It addresses issues of how people make the rules that determine which technologies
will be adopted and which ones will be rejected.
Risk Assessment
1. Technical approaches to risk assessment attempt to identify hazard, the probability of
technology failure.
2. Cost-benefit approaches estimate potential costs, including the costs of potential
technology failure, in relation to potential benefits of a technology (a brief worked
example follows this list).
3. Psychological approaches focus on how knowledge acquisition and emotions affect
public perceptions of technology risk.
4. Sociological approaches note that risks are socially constructed through people's
interactions with others. Perceived social acceptability plays an important role in
public risk assessments.
5. Cultural approaches point out that moral and ethical issues affect public perceptions
of a technology.
6. All approaches to risk assessment have benefits and drawbacks.
7. All approaches are necessary for a complete understanding of risk.
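The worked example referenced in item 2 is sketched below. The dollar figures and the simple expected-cost formula (certain costs plus the probability-weighted cost of failure) are assumptions chosen for illustration; they do not represent a method prescribed in the readings:

# Hypothetical cost-benefit sketch for a technology adoption decision.
# Expected cost = known implementation cost + probability of failure x cost of failure.
implementation_cost = 2_000_000   # dollars, known up front
failure_probability = 0.01        # estimated probability of a failure
failure_cost        = 50_000_000  # dollars if a failure occurs
expected_benefit    = 4_000_000   # dollars of expected benefit

expected_cost = implementation_cost + failure_probability * failure_cost
net_benefit = expected_benefit - expected_cost

print(f"expected cost: ${expected_cost:,.0f}")   # $2,500,000
print(f"net benefit:   ${net_benefit:,.0f}")     # $1,500,000

# The technology clears this cost-benefit test, but the calculation says nothing
# about who bears the failure cost or whether the risk is morally acceptable --
# the questions raised by the sociological and cultural approaches above.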
Critiques of Risk Assessment
Part I
Can we assess risk better? John Adams notes that sometimes the public and scientific
experts differ in their evaluations of technology risk. This disagreement occurs, in part,
because the public uses a wide variety of criteria, including some nonscientific criteria, in its
evaluations of risk. Adams suggests keeping in mind the following observations on the
evaluation of risk by technical experts:
1. Remember, everyone else is seeking to manage risk, too.
2. They are all guessing; if they knew for certain, they would not be dealing with risk.
3. Their guesses are strongly influenced by their beliefs.
4. Their behavior is strongly influenced by their guesses and tends to reinforce their
beliefs.
5. It is the behavior of others, and the behavior of nature, that constitute the actual risk
environment.
6. It will never be possible to capture "objective risk."
Part II
The approach taken to risk assessment influences what is assessed and the outcome of the
assessment. In this section, we discussed how approaches to risk assessment differ.
1. The technical approach implies communication strategies that educate the public
about technical risk assessments. When risk assessments become public and
consumer perceptions do not coincide with actual risk, then, from the technical
perspective, acceptance of a new technology can be unnecessarily delayed or
implementation can become more expensive than necessary. Thus, public rejection of
the logic of technical risk assessments is considered to be irrational. Risk
communication strategies focus upon educating an ignorant and sometimes
irrational public about actual risk. Strategies seek to reduce outrage based upon
inaccurate perceptions so as to retain a focus on actual risk.
2. The psychometric approach seeks to identify the cognitive, emotional, and social-demographic determinants of public perceptions of risk. Why do we respond to risks
in the way we do? Why do public perceptions of risk differ so greatly from those of
technical experts? The psychometric approach has discovered how outrage factors
affect public responses to risk and developed strategies for overcoming public
resistance to new technologies. Risk communication strategy seeks to reduce outrage
by appealing to the public's sense of voluntariness, control, fairness, and moral
responsibility in technology development and dissemination.
3. The social process approach begins with the premise that risk and technology are
social processes rather than physical entities that exist independently of the humans
who assess and experience them. Risk communication is viewed as interactive among
technicians, the public, and organizations that have a vested interest in gaining either
adoption or rejection of the technology. This approach thereby emphasizes the free
exchange of ideas and mediation of sometimes competing agendas regarding risk
assessment and management. The focus is more upon the quality of discourse than
upon the substance of the arguments themselves.
Risk and Public Policy
1. Public policy is formulated within the context of risk perceptions, which, in turn,
reflect the public's opinions of the quality of risk assessment.
2. Risk assessors and the public face dilemmas (i.e., fact-value, standardization,
contributor's, de minimis, and consent) in their attempts to balance technical
evaluations with the public's desires for nontechnical input into risk assessment.
3. Accusations delivered by technical experts and directed at a skeptical public assert
that the public is anti-technology, is remote from power and influence, and will never
be satisfied with anything but 100% safety.
4. Attributions of motives to laypersons oftentimes are inappropriate and unfounded.
5. Sometimes developed countries explain away unethical dissemination of known
hazardous technology to developing nations with rationalizations (i.e., isolationist,
social-progress, countervailing benefits, and consent) about their efforts at promoting
progress.
6. The government and industry should recognize that minimizing harm is more
important than providing good, protect the public, assist consumers in their
self-determination, and stress the importance of values and of long-term economic
gain over short-term economic benefits.
Risk and Public Discourse
1. Risk management has become increasingly politicized and contentious. Controversy
and conflict might have become too pervasive. It might be that the quality of society
erodes when public discourse about technology policy becomes overly contentious.
2. Risk controversies are not about science versus misguided public perceptions of
science, wherein the unwashed public needs to be educated about "real" risks.
Rather, risk controversies are struggles over who will define risk.
3. Disparities between "real" and "perceived" risk might engender public discourse that,
itself, is a risk to the social fabric of society.
4. The pervasiveness of media attention to technology and risk assessments destroys
trust because most of what the media reports is trust-destroying news.
5. The increasing complexity of technological innovations and societal division of labor
leaves citizens in a position of not knowing much about highly complex and potentially
dangerous technologies. They must rely upon their judgments about whom to trust.
6. The public is not irrational in its skepticism about complex technologies, but rather
cautious in deciding whom to trust in its understandable state of ignorance about
these technologies.
7. The public and scientists rely upon social as well as technical criteria to evaluate risk.
8. Claims that the public is irrational are in part responsible for increasingly contentious
debate about complex technologies.
9. Some special interest groups profit from fear mongering within this atmosphere of
ignorance and fragile trust.
10. The media have a difficult job of presenting varying viewpoints on technical issues.
11. The concept of recreancy refers to institutional failure resulting from a lack of
competence and/or fiduciary responsibility; it describes societal-level inadequacies in
risk assessment, management, and communication.
12. Improving societal-level capacity in risk assessment, management, and
communication requires social scientists to assess the level of recreancy in American
society, become more aware of societal-level influences on risk assessment,
management, and communication, and build institutional capacity to facilitate wise
technology policymaking.
The Media and Risk Management
1. The Natural History Explanation posits that risk communication occurs in four stages:
pre-problem, alarmed discovery, awareness of technological fixes, and loss of interest
in the topic.
2. The Public Arena Model posits that risk issues must compete with other newsworthy
items for mass media exposure. Risks associated with complex and controversial
technologies might be covered in the media, but their length of coverage and degree
of exposure depends upon other topics of the day.
3. The Hoopla Effect refers to heightened awareness of controversy due to media reports
of controversy.
4. Research shows that negative media information carries disproportionate weight in
influencing initial public opinions of technology.
Risk Communication
Risk communication is the exchange of information and opinions concerning risk and
risk-related factors among risk assessors, risk managers, consumers, and other interested
parties. The goals of risk communication are to:
1. Improve the effectiveness and efficiency of the risk analysis process,
2. Promote consistency and transparency in arriving at and implementing risk
management decisions,
3. Promote awareness and understanding of the specific issues of the risk analysis
process,
4. Strengthen the working relationships and mutual respect among risk assessment and
management participants,
5. Exchange information among interested parties to risk analysis and management, and
6. Foster public trust and confidence in risk analysis and management.
Essential aspects of proper risk communication include knowing the audience, involving
scientific experts, establishing expertise in communication, being a credible source of
information, sharing responsibility, differentiating between science and value judgment,
assuring transparency, and placing risk in perspective.
The following elements should be included as part of a risk communication program:
1. knowing the nature of risk,
2. knowing the nature of the benefits,
3. knowing the uncertainties in risk assessment,
4. pursuing risk management options,
5. recognizing barriers to risk assessment, and
6. recognizing barriers associated with human agency.
Strategies for effective risk communication include:
1. gathering background information,
2. preparing technical facts,
3. recognizing outrage conditions,
4. engaging in two-way communication, and
5. reviewing and evaluating previous communication.
Strategies for mitigating negative consequences of a risk crisis include:
1. Describe in an open and honest manner the extent of the crisis and measures being
taken to control it.
2. Inform the public about how to reduce their risk exposure.
3. Help the public identify the hazard and how to avoid it.
4. Describe how to prevent further exposure to the risk.
5. Provide complete, up-to-date, and accurate information about the crisis.
6. Keep messages simple.
7. Choose and rely upon a media spokesperson.
8. Acknowledge prior misbehavior.
9. Acknowledge current problems.
10. Share control and be accountable.
Diffusion of Innovations
1. The mistake made most often in attempts at technology transfer is to assume that
transmission of the scientific facts about the technology will be sufficient to gain
adoption of it.
2. To effectively gain adoption, one must influence the social comparison process,
understand the innovation-decision process, assist in easing the transition to the new
technology, and mitigate negative consequences associated with new technology
adoption.
3. Diffusion is a process whereby an innovation is communicated through certain
channels over time within social systems.
4. An innovation is an idea, practice, or object that is perceived as new. Innovations can
be material or nonmaterial.
5. Innovations vary in relative advantage, compatibility, complexity, trialability, and
observability.
6. Mass media presentations create awareness, disseminate hardware, software, and
innovation-evaluation messages, and provide feedback to potential adopters about
those who have adopted. Because they create awareness, mass communications
place some pressure upon opinion leaders to make decisions about a new technology.
7. Interpersonal communications between experts and the public, opinion leaders and
the public, and among friends and family are as essential as mass
communications in bringing about new technology adoption.
8. Diffusion takes place within the context of structures of social relationships based
upon power, norms, and public acceptability.
9. Technology adoption, as a form of human agency, depends strongly upon social
comparison processes that lead to choice shift.
10. Technology adoption always brings about changes in normative expectations.
11. The two-step flow model has been shown to provide a good explanation of adoption of
complex technologies.
12. Change agent communication with others is aided by homophily and hindered by
heterophily.
13. The defining characteristic of opinion leaders is that they are well respected in their social
system.
14. Techniques for identifying opinion leaders include the positional, self-designating,
reputational, sociometric, and observational methods.
15. Cliques of heterophilous opinion leaders are bridged by interstitial persons.
16. Change agent success depends upon effort, client orientation, empathy, contact with
opinion leaders, contact with lower status clients, and effective use of
paraprofessional aides.
17. The classical diffusion approach assumes a centralized research and development
organization that makes most decisions about the innovation and its diffusion. The
decentralized diffusion approach entails technology development and dissemination
from small firms, local entrepreneurs, and grass-roots organizations.
18. The innovation-decision process involves knowledge, persuasion, symbolic adoption,
implementation, and confirmation.
19. Re-invention can lead to increased flexibility in applications of the innovation,
increased relative advantage for local use, and increased sense of ownership over the
new technology. It can also bring about improper application leading to less
effectiveness of the innovation, inability of the research and development organization
to maintain quality control over the technology in use, and legal problems if the
change infringes upon the protection of a closely related technology.
20. Innovators are quick to adopt, but have little influence on others to adopt. Opinion
leaders are early adopters. The early and late majority follow the lead of opinion
leaders and are thus influenced by the diffusion effect. Laggards are slow to adopt or
never adopt (a brief sketch of these adopter categories appears after this list).
21. Laggards are not necessarily slow to symbolically adopt an innovation.
22. All adoption leads to unintended, unanticipated, and undesirable consequences for
some.
23. Because the sociologist is supposed to engineer society in a favorable manner, the
unintended, unanticipated, and undesirable consequences of technology adoption
need to be foreseen and mitigated as much as possible.
24. Overadoption is adoption of an innovation when experts suggest rejection or less adoption. This
criticism is a variation on the theme that one can have too much of a good thing.
25. The pro-innovation bias is the implication that the innovation should be adopted by all
members of the social system.
26. The individual-blame bias is a tendency to blame individuals for their non-adoption.
27. Whether by intent of the powerful elite or not, new technology adoption can further
inequalities between upper and lower classes due to economic leakage, political
influence, too rapid social change, and dependency upon the developers of new
technologies.
28. The change agent is ethically responsible for attempting to mitigate negative
consequences associated with new technology adoption.
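The sketch referenced in item 20 appears below. It uses Rogers' conventional adopter categories, which classify adopters by how far their time of adoption falls from the mean of an assumed normal distribution of adoption times; the category shares are the standard approximations and are included only as an illustration:

# Rogers' adopter categories, defined by standard-deviation cutoffs on the
# (assumed normal) distribution of time of adoption. Shares are approximate.
categories = [
    ("innovators",     "earlier than mean - 2 sd",   0.025),
    ("early adopters", "mean - 2 sd to mean - 1 sd", 0.135),
    ("early majority", "mean - 1 sd to mean",        0.340),
    ("late majority",  "mean to mean + 1 sd",        0.340),
    ("laggards",       "later than mean + 1 sd",     0.160),
]

def categorize(adoption_time, mean, sd):
    """Return the adopter category for a given time of adoption (illustrative)."""
    z = (adoption_time - mean) / sd
    if z < -2:
        return "innovators"
    if z < -1:
        return "early adopters"
    if z < 0:
        return "early majority"
    if z < 1:
        return "late majority"
    return "laggards"

for name, cutoff, share in categories:
    print(f"{name:15s} {cutoff:28s} ~{share:.1%} of adopters")

# Example: someone adopting two standard deviations before the mean adoption
# time falls on the innovator/early adopter boundary.
print(categorize(adoption_time=3.0, mean=5.0, sd=1.0))  # early adopters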