Summary: Decisions under Risk and Uncertainty
Course Learning Goals:
 Understand why trust, security, risk, and uncertainty are important concepts for engineers and the ‘success’ of technologies;
 Know how the humanities and social sciences use these concepts for the analysis of
technologies;
 Understand what typical risks and uncertainties are created by technologies;
 Understand when and how technologies can be used to solve problems of risk, trust and
uncertainty, and when not;
 Can anticipate (identify and describe) technology-related issues of trust, risk, etc.;
 Can distinguish between social scientific, normative (juridical, political and ethical), and
mathematical perspectives on these issues;
 Are motivated to examine technologies from a humanities and social science perspective.
Uncertainty: the absence of certainty; refers to a situation in which we know what might go wrong, but lack the knowledge to express the hazard as a risk.
Certainty: a state having to do with our knowledge and justification for believing something.
The Method of Doubt (Descartes): is used to clear away those contents of our minds that are uncertain, so that we are left with only certain knowledge.
 If there is any way a belief can be doubted, then its grounds are insufficient:
 The possibility of Illusion – senses (senses deceive us at a distance)
 The possibility of Dreaming (Dream Argument) – imagination
 The possibility of being deceived by an Evil Demon (Evil Genius) – reason
 Cogito ergo sum (I think, therefore I am)
 We can be certain of:
 Our own existence
 The contents of our own thoughts
 Mathematical truths
 Scientific claims, insofar as we base them on what we clearly and distinctly perceive.
Other sceptical possibilities:
 David Hume: the possibility that the future will not resemble the past (scientific regularities
will suddenly change).
 Nick Bostrom: the possibility we are living in a computer simulation.
Uncertain Danger: a bad future event that we cannot rule out with certainty.
Three cognitive responses to uncertain danger:
 Accept it (do not try to reduce uncertainty).
 Zen Approach
 Reduce uncertainty by relying on others (trust).
 Two typical problems that trust may solve: cheating and production of public goods
in communities/groups (free riding).

 Reduce uncertainty by studying the situation (risk assessment).
Social Amplification: a systematic over-inflation of a danger as more and more people circulate estimates of serious danger.
Risk: the product of probability and outcome (expected utility) and thus quantifiable (a known unknown, since the likelihood and the disvalue are known).
“Uncertainty must be taken in a sense radically distinct from the familiar notion of Risk, from which
it has never been properly separated. The term "risk," as loosely used in everyday speech and in economic discussion, really covers two things which, functionally at least, in their causal relations to the
phenomena of economic organization, are categorically different. ... The essential fact is that "risk"
means in some cases a quantity susceptible of measurement, while at other times it is something
distinctly not of this character; and there are far-reaching and crucial differences in the bearings of
the phenomenon depending on which of the two is really present and operating. ... It will appear that
a measurable uncertainty, or "risk" proper, as we shall use the term, is so far different from an unmeasurable one that it is not in effect an uncertainty at all. We ... accordingly restrict the term "uncertainty" to cases of the non-quantitative type.” (Frank Knight, Risk, Uncertainty and Profit, 1921)
Risk Assessment: the purpose of risk assessment is the generation of knowledge linking specific risk
agents with uncertain but possible consequences – the final product of risk assessment is an estimation of the risk in terms of a probability distribution of the modelled consequences.
 Most useful when there is an agreement that an outcome is bad (e.g. death).
 Disvalues can be quantified in various ways.
Risk Scenario: a causal story linking an event with an unwanted outcome.
 Hazard: a situation that poses a level of threat to life, health, property, or environment.
 Exposure: a quantified potential for loss that might occur as a result of some activity.
 Vulnerability: a weakness in an asset (property) or group of assets.
Plausibility of Risk Scenario is relative to best scientific theory available.
Risk Proper: known-unknown (estimable probability)
Uncertainty (uncertain danger): unknown-unknown
Logic: concerns the relation of consequence that holds between the premises and the conclusion of a valid argument.
 Argument: a set of sentences in which some of the sentences are intended to support others.
 Validity: the property of an argument such that if the premise(s) are true, then the
conclusion must be true.
 Soundness: the property of an argument such that it is valid and its premises are
true.
 Premise: a sentence intended to support another sentence.
 Conclusion: a sentence intended to be supported by other sentences.
 Sub-Conclusion: a sentence that is both a conclusion and a premise in a given argument.
Prescriptive bridge-premise: statement of what should be done.
Confusion about risks (three definitions):
 An unwanted event.
 The probability/frequency of an unwanted event.
 A measure for the damage done by unwanted events.
Threat: a potential event; when it turns into an actual event, it may cause an unwanted incident that harms an organization or system.
Quantifying Risk: risk = Threat Event Frequency (TEF) × Vulnerability (V) × Probable Loss Magnitude (PLM); a sketch follows the list of countermeasures below.
Counter Measures:
 Reduce Threat Event Frequency
 Reduce Vulnerability
 Reduce Probable Loss Magnitude
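A minimal Python sketch of this formula and the three countermeasures above; all factor values are made-up illustrations, not from the source:

```python
def risk(tef, vulnerability, plm):
    """Risk = Threat Event Frequency (TEF) x Vulnerability (V) x
    Probable Loss Magnitude (PLM)."""
    return tef * vulnerability * plm

# Hypothetical baseline: 20 threat events/year, 30% inflict damage,
# 50,000 expected loss per successful event -> 300,000 per year.
baseline = risk(tef=20, vulnerability=0.30, plm=50_000)

# Each countermeasure reduces exactly one factor:
reduced_v   = risk(tef=20, vulnerability=0.10, plm=50_000)  # harden the system
reduced_tef = risk(tef=5,  vulnerability=0.30, plm=50_000)  # deter/filter threats
reduced_plm = risk(tef=20, vulnerability=0.30, plm=10_000)  # limit damage

print(baseline, reduced_v, reduced_tef, reduced_plm)
# 300000.0 100000.0 75000.0 60000.0
```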
Malicious Threat: a potential cause of an unwanted incident which may result in jeopardizing a system or an organization; it connotes an initiating event that can cause harm to a system or induce it
to fail.
 Attackers adapt to the system architecture (weakest links).
 Attacks may consist of multiple steps.
 Countermeasures influence both system vulnerability and attacker strategy.
Attack Tree: represents a set of attacks; each attack is composed of a combination of leaf nodes (basic attack steps).
 Limitations: attacker model implicit in analysis; no relation between annotations (e.g. cost
and probability).
 Only provide the probability of success of an attack given that it is attempted.
Fault Tree: immediately provides the probability of failure of a system in a given time frame.
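A minimal sketch of how an attack tree's success probability can be computed, assuming independent leaf probabilities; the node encoding and the example numbers are hypothetical:

```python
# Nodes: ("leaf", p), ("and", [children]), ("or", [children]).
def p_success(node):
    kind, payload = node
    if kind == "leaf":
        return payload
    probs = [p_success(child) for child in payload]
    if kind == "and":                 # every child step must succeed
        result = 1.0
        for p in probs:
            result *= p
        return result
    fail_all = 1.0                    # "or": at least one child succeeds
    for p in probs:
        fail_all *= 1.0 - p
    return 1.0 - fail_all

# Hypothetical attack tree: (phish AND escalate) OR bribe an insider.
tree = ("or", [("and", [("leaf", 0.4), ("leaf", 0.5)]), ("leaf", 0.1)])
print(p_success(tree))  # 1 - (1 - 0.4*0.5) * (1 - 0.1) = 0.28
```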
Predict: complex attack scenarios spanning digital, physical and social engineering steps.
Prioritize: these scenarios via a planning tool that tells defenders where to expect the most serious
issues.
Prevent: attacks by calculating and comparing cost-effectiveness of countermeasures.
Confusion about vulnerability (two definitions):
 A particular weakness in a system.
 The likelihood that a threat event will inflict damage.
Item Response Theory: a paradigm for the design, analysis, and scoring of tests, questionnaires, and similar instruments measuring abilities, attitudes, or other variables; it provides a theory for estimating vulnerability.
Attacker Model: assumes that the attacker is interested in maximising damage and has full knowledge of the system's vulnerabilities and impacts, that attacker resources increase over time, and that threat capability is proportional to attacker resources.
 Attacker will try to maximise attacker-induced risk (expected damage done per unit of time).
 Attacker will attack at the time when, given his resources, the expected damage per unit of
time is maximal.
 Attacker will choose the attack with the highest expected damage.
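A minimal sketch of the resulting choice rule; the attacks and all numbers are hypothetical:

```python
# Hypothetical attacks: damage if successful, probability of success,
# and time needed (days). The attacker maximises expected damage per day.
attacks = {
    "phishing": (50_000, 0.30, 5),
    "ddos":     (20_000, 0.80, 2),
    "insider":  (200_000, 0.05, 30),
}

def damage_per_day(damage, p_success, days):
    return damage * p_success / days

chosen = max(attacks, key=lambda name: damage_per_day(*attacks[name]))
print(chosen)  # "ddos": 8000/day beats 3000/day (phishing) and ~333/day (insider)
```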
Precautionary Principle (Rio Declaration): where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation.
Precautionary Principle (Wingspread Statement): when an activity raises threats of harm to the environment or human health, precautionary measures should be taken even if some cause and effect relationships are not fully established scientifically.
Expected Utility Principle: a risk is acceptable just in case it comes along with the course of action
that maximizes expected utility overall, and reduction of the risk is not cost-effective.
 Criticisms:
 It gives no intrinsic weight to the voluntariness or involuntariness of risks.
 It does not tell us what to do in conditions of uncertainty.
 It implies that highly unequal distributions of risk are acceptable.
 It equates 1 certain death with a 0.1 chance of 10 deaths, and with a 1/1,000,000 chance of a million deaths (checked in the sketch below).
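The last criticism can be verified directly; a minimal sketch:

```python
# Three prospects that expected utility treats as exactly equivalent:
prospects = [
    (1.0, 1),                      # 1 death for certain
    (0.1, 10),                     # 10% chance of 10 deaths
    (1 / 1_000_000, 1_000_000),    # one-in-a-million chance of a million deaths
]
for probability, deaths in prospects:
    print(probability * deaths)    # expected deaths: 1.0 in every case
```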
Equipossibility Principle (Principle of Insufficient Reason): in cases of epistemic uncertainty, we
should use the best estimates of experts to calculate expected utility, with a margin of error that
weights all real possibilities equipossibly. In case no accurate estimation is possible, all real possibilities are to be judged equipossible to calculate expected utility.
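A minimal sketch of the fallback clause, with hypothetical outcome utilities:

```python
# With no accurate estimates, judge all real possibilities equipossible
# and compute expected utility with uniform weights.
outcomes = [-100, 0, 50]              # hypothetical utilities of the possibilities
weight = 1 / len(outcomes)            # equipossibility: 1/3 each
expected_utility = sum(weight * u for u in outcomes)
print(expected_utility)               # (-100 + 0 + 50) / 3 = -16.67
```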
Maximin Rule: the utility of a mixture of potential outcomes is equal to the lowest utility that is associated with any of these outcomes.
 Criticisms:
 The ethical question is simply transferred to a different stage in the decision-making
process: the stage at which we decide which possible scenarios to take seriously.
 The actual weight of some possibilities is intuitively too great, when using maximin.
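A minimal sketch contrasting maximin with expected utility maximization on a made-up utility matrix:

```python
# Rows: alternatives; columns: states of nature. All numbers are illustrative.
utilities = {
    "build plant": [10, -50],   # good payoff, catastrophic worst case
    "do nothing":  [0, 0],
}
probabilities = [0.9, 0.1]      # only available under risk, not uncertainty

def expected_utility(row):
    return sum(p * u for p, u in zip(probabilities, row))

def maximin_value(row):
    return min(row)             # judge each alternative by its worst outcome

print(max(utilities, key=lambda a: expected_utility(utilities[a])))  # build plant
print(max(utilities, key=lambda a: maximin_value(utilities[a])))     # do nothing
```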
Shrader-Frechette’s Argument:
1. We must distinguish between risk and uncertainty, and between principles of personal and
societal choice.
2. The equiprobability principle fails to match our moral intuitions about societal choice under
uncertainty.
3. Maximin better matches those intuitions.
4. Therefore, maximin is a better principle for societal choice under uncertainty.
5. Societal choice under uncertainty is the normal condition of risk policy.
6. Therefore, maximin is a better principle for risk policy.
Hansson’s alternative treatment of risk inequality: exposure of a person to a risk is acceptable if and
only if this exposure is part of an equitable social system of risk-taking that works to her advantage.
Threat of Technocracy (Rule by the Skilled): decisions are made by technological experts since they
have the best scientific information available.
 Candidate technological activities are researched and developed by industry and/or government.
 Scientific expertise is gathered to estimate risks associated with those activities.
 Risks are prioritized by their comparative features (likelihood, magnitude, importance of activity, cost to address).
 Actions and policy are set by management/government using a rational framework.
 The process and results are communicated to the public.
Democratic Principle of Risk Acceptability: a risk is acceptable just in case the majority of people
decide that it is acceptable in a democratic process.
Trust: the willingness to be vulnerable to another party (interpersonal trust).
 Web of Trust: I believe that User A owns key B because User C says so and I trust C.
 Hierarchy of Trust: I believe that User A owns key B because her CA says so, and I believe that her CA performs a strict admission policy to be sure of her identity.
 Trust is influenced by:
 Disposition towards trust (stable personality characteristic)
 Perceived trustworthiness (ability, benevolence and integrity of the interaction partner)
 Competence- and morality-based violations of trust.
Hackers: hobbyist, collector, data traveler, idealist, destroyer and spy.
Types of uncertainty:
Surprises: are unavoidable (positive and negative) and, together with ignorance, are unknown-unknowns.
Traditional notion of experiment: an experiment produces observations in controlled circumstances
to test a hypothesis.
 Degree of Control:
 Laboratory Experiment: almost completely controlled environment.
 Field Experiment: field conditions partly controlled and often randomized.
 Natural Experiment: interesting conditions generated by ‘nature’.
 For instance, social experiments.
 Experimental Methodology:
 Comparison: between different circumstances or different options.
 Reproducibility: similar cases produce similar results.
 Explanation: not just regression analysis.
Ethical principles for research involving humans:
 Respect for persons: informed consent
 Beneficence: no harm, maximize benefits, minimize risk
 Justice: just distribution of benefits and harms
It is hardly possible to reliably predict risks before the technology is actually employed in society.
The term risk can be seen as a specification of hazard (product of the probability of an undesirable
event and the impact of that event).
Ignorance: refers to the situation in which we do not even know what could go wrong, resulting in
unknown hazards.
Indeterminate situation: in this situation potential hazards cannot be expressed in risks because
their occurrence depends on the behavior of, for example, users and operators.
Risk estimates: are still surrounded by large uncertainties.
 We cannot test certain disaster scenarios in realistic circumstances.
 Large disasters are often a rarity.
Limited predictability of risks in the laboratory:
 Long-term cumulative and interaction effects.
 Recursive non-linear systems dynamics.
 Laboratory and field tests are often not representative of real-life circumstances.
 Some hazards may be entirely overlooked due to ignorance.
Social experiments: are different from standard experiments since they take place outside the lab and involve more and other human subjects than standard experiments, in particular users and bystanders; they are not always explicitly carried out or recognized as experiments, so that data gathering or monitoring is sometimes absent; and they are less controllable, which makes it more difficult to control experimental conditions and to contain hazards.
Informed consent: for (social) experiments it is only acceptable to use humans as experimental subjects if they have voluntarily agreed to engage in the experiment after being fully informed about all
potential hazards (and expected benefits).
Reasonable moral conditions for societal experiments:
 Absence of alternatives: to require that first all other methods for gaining knowledge about
the actual functioning of a new nanotechnology and its ethical consequences and hazards
have been tried out before a societal experiment is carried out.
 Controllability: responsible societal experiments also require that measures are taken to
contain the hazards of the experiment.
 Informed consent: an experiment using human subjects is only morally acceptable if the subject has voluntarily agreed to take part and the decision is based on sufficient knowledge of the expected benefits and potential risks.
 Proportionality of hazard and benefits (continuous (re)evaluation).
Uncertainties are caused by lack of knowledge, ignorance and complexity of a situation.
Mixture appraisal problem: given the moral appraisals that a moral theory makes of value-carriers with well-determined properties, what moral appraisals does (a generalized version of) this theory make of mixtures of such value-carriers?
Moral appraisal: covers a wide range of assignments of moral status, such as declarations that
something is forbidden, permitted, morally required etc.
Value-carriers: refer to all entities that can be assigned (moral) value, including in particular human
actions and the outcomes of human actions.
Mixture: a set of value-carriers such that it is not well-determined which of them will materialize.
Decision-making under risk: each action is associated with a set of possible outcomes, and to each
of these is assigned a probability, such that these probabilities add up to unity.
Decision-making under uncertainty: probabilities are either not known at all or only known with less
than full precision.
Utilitarianism: a theory in normative ethics holding that the proper course of action is the one that
maximizes utility, usually defined as maximizing happiness and reducing suffering.
 All moral appraisals are reducible to assignments of utility; utility of human actions is assumed to depend exclusively on their consequences.
Mixture appraisal problem for utilitarianism: given the utilities that a utilitarian theory assigns to
(potential) outcomes with well-determined properties, what utilities does (a generalization of) this
theory assign to mixtures of such outcomes?
 Actualism: the utility of a mixture of potential outcomes is equal to the utility of the outcome that actually materializes.
 Expected utility maximization: the rule that requires that one chooses an action with the
highest probability-weighted average of the values of the possible outcomes.
 Expected utility maximization does not allow for risk-averse or cautious decision-making.
 Disallows the influence of person-related moral reasons.
 Maximin rule: the utility of a mixture of potential outcomes is equal to the lowest utility that
is associated with any of these outcomes.
Deontology (duty ethics): is the normative ethical position that judges the morality of an action
based on the action's adherence to a rule or rules.
Mixture appraisal problem for deontological/rights-based moral theories: given the duties (rights)
that a deontological (right-based) moral theory assigns with respect to actions with well-determined
properties, what duties/rights does (a generalized version of) this theory assign with respect to mixtures of such actions?
 Probabilistic absolutism for deontological theories: if it is morally prohibited to perform a
certain action, then this prohibition extends to all mixtures in which this action has nonzero
probability.
 Probabilistic absolutism for rights-based theories: if someone has a moral right that a certain action not be performed, then this right extends to all mixtures in which this action has
non-zero probability.
Probability limit for deontological theories: each prohibition of an action is associated with a probability limit. The prohibition extends to a mixture that contains the action if and only if the action
has, in that mixture, a probability that is above the probability limit.
Probability limit for rights-based theories: each moral right that a certain action not be performed,
is associated with a probability limit. The right extends to a mixture that contains the action if and
only if the action has, in that mixture, a probability that is above the probability limit.
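A minimal sketch of the probability-limit rule for deontological theories; the actions, probabilities, and limits are hypothetical:

```python
# A mixture maps actions to their probabilities of materializing.
mixture = {"dump waste": 0.02, "store waste safely": 0.98}

# Hypothetical probability limits attached to prohibited actions.
limits = {"dump waste": 0.05}

def prohibition_extends(mixture, action, limits):
    """The prohibition of `action` extends to the mixture iff the action's
    probability in the mixture is above its probability limit."""
    return mixture.get(action, 0.0) > limits[action]

print(prohibition_extends(mixture, "dump waste", limits))  # False: 0.02 <= 0.05
```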
Contract theory (Socrates): is the view that persons’ moral and/or political obligations are dependent upon a contract or agreement among them to form the society in which they live.
 Actual consent: any mixture with non-zero probability of giving rise to negative value is allowed if and only if it is accepted by all those affected by this negative value.
 Hypothetical consent: any mixture with non-zero probability of giving rise to negative value is allowed if and only if it would be accepted in an ideal decision situation by all those affected by this negative value.
The exemption problem: it is a prima facie moral right not to be exposed to risk of negative impact,
such as damage to one’s health or one’s property, through the actions of others. What are the conditions under which this right is overridden, so that someone is allowed to expose other persons to
risk?
 Exposure of a person to a risk is acceptable if and only if this exposure is part of an equitable
social system of risk-taking that works to her advantage.
Annualized Loss Expectation (ALE): a common measure of the cost of risk.
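The source does not spell the formula out; a common convention (assumed here) is ALE = Single Loss Expectancy (SLE) × Annualized Rate of Occurrence (ARO):

```python
def annualized_loss_expectation(sle, aro):
    """ALE = cost of a single incident (SLE) x incidents per year (ARO)."""
    return sle * aro

# Hypothetical: a breach costing 250,000, expected once every five years.
print(annualized_loss_expectation(sle=250_000, aro=0.2))  # 50,000.0 per year
```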
Risk Management: liability transfer (by disclaimer and agreement), indemnification (pooling and
hedging), mitigation and retention.
Nowadays, information security technology focuses primarily on risk mitigation rather than on reducing the consequences.
 Risk analysis must be carried out in a more quantitative form.
Decision theory: the theory of decisions; it focuses on how we use our freedom and is concerned with goal-directed behavior in the presence of options.
Normative decision theory: theory about how decisions should be made in order to be rational.
 A decision theory is weakly falsified as a normative theory if a decision problem can be
found in which an agent can perform in contradiction with the theory without being irrational.
 A decision theory is strictly falsified as a normative theory if a decision problem can be
found in which an agent who performs in accordance with the theory cannot be a rational
agent.
Descriptive decision theory: theory about how decisions are actually made.
 A decision theory is falsified as a descriptive theory if a decision problem can be found in
which most human subjects perform in contradiction to the theory.
Decision processes:
Condorcet: first stage – discussion of the principles that will serve as the basis for decision in a general issue (personal); second stage – the question is clarified, and opinions approach and combine with each other into a small number of more general options; third stage – the actual choice between these alternatives.
Dewey: a felt difficulty – the definition of the character of that difficulty – suggestion of possible
solutions – evaluation of the suggestion – further observation and experimentation leading to acceptance or rejection of the suggestion.
Simon: intelligence – finding occasions for making a decision; design – finding possible courses of
action; choice – choosing among courses of action.
Brim et al.: identification of the problem; obtaining necessary information; production of possible solutions; evaluation of such solutions; selection of a strategy for performance; implementation of the decision.
Mintzberg et al.: non-sequential. Identification – two routes: decision recognition and diagnosis;
development – two routes: search (ready-made solutions) and design (new solutions); selection –
three routines: screen (for search only), evaluation-choice and authorization.
The symbol > is said to represent preference or strong preference, ≥ weak preference, and ≡ indifference.
Completeness: the relation ≥ is complete if and only if for any elements A and B of its domain, either A≥B or B≥A. A particular preference relation is defined over a domain, e.g. the set of musical pieces.
Transitivity: a (strict) preference relation > is transitive if and only if it holds for all elements A, B and C of its domain that if A>B and B>C, then A>C.
 Intransitive preferences are often inadequate to guide actions.
 In decision theory, it is commonly supposed that not only strict preference (>) but also weak
preference (≥) and indifference (≡) are transitive.
 A weak preference relation ≥ is transitive if and only if it holds for all elements A, B,
and C of its domain that if A≥B and B≥C, then A≥C.
 An indifference relation ≡ is transitive if and only if it holds for all elements A, B, and C of its domain that if A≡B and B≡C, then A≡C.
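A minimal sketch checking both properties for a weak preference relation given as a set of ordered pairs; the relation itself is made up:

```python
from itertools import product

domain = {"A", "B", "C"}
# (a, b) in weak_pref means "a is at least as good as b".
weak_pref = {("A", "B"), ("B", "C"), ("A", "C"),
             ("A", "A"), ("B", "B"), ("C", "C")}

def is_complete(rel, domain):
    return all((a, b) in rel or (b, a) in rel
               for a, b in product(domain, repeat=2))

def is_transitive(rel, domain):
    return all((a, c) in rel
               for a, b, c in product(domain, repeat=3)
               if (a, b) in rel and (b, c) in rel)

print(is_complete(weak_pref, domain))    # True
print(is_transitive(weak_pref, domain))  # True
```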
An alternative is (uniquely) best if and only if it is better than all other alternatives. If there is a
uniquely best alternative, choose it.
An alternative is (among the) best if and only if it is at least as good as all other alternatives. If there
are alternatives that are best, pick one of them.
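Continuing that sketch, a (uniquely) best alternative can be picked out mechanically; the relation is the same hypothetical one:

```python
domain = {"A", "B", "C"}
weak_pref = {("A", "B"), ("B", "C"), ("A", "C"),
             ("A", "A"), ("B", "B"), ("C", "C")}

def best_alternatives(rel, domain):
    """Alternatives that are at least as good as all other alternatives."""
    return {a for a in domain if all((a, b) in rel for b in domain)}

print(best_alternatives(weak_pref, domain))  # {'A'}: uniquely best, so choose it
```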
According to some moral theorists, all values can be reduced to one single entity, utility.
Open set of alternatives: new alternatives can be invented or discovered by the decision-maker.
Closed set of alternatives: no new alternatives can be added.
 Voluntary (decision-maker decided herself to close the alternatives) and involuntary closure.
Mutually exclusive: two alternatives cannot both be realized (the alternatives exclude each other).
States of nature: summary of the various unknown extraneous factors.
Outcomes: combined effect of a chosen alternative and the state of nature that obtains.
Decision matrix: alternatives are represented by the rows of the matrix, and the states of nature by
the columns.
Knight: reserve the term uncertainty for cases of the non-quantifiable type, and the term risk for the
quantifiable cases.
Certainty: deterministic knowledge
Risk: complete probabilistic knowledge
Uncertainty: partial probabilistic knowledge
Ignorance: no probabilistic knowledge
Standard representation of a decision: consists of a utility matrix and some information about to
which degree the various states of nature in that matrix are supposed to obtain.
Expected utility (probability-weighted utility theory): to each alternative is assigned a weighted average of its utility values under different states of nature, and the probabilities of these states are
used as weights.
 Objective and subjective utility
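A minimal sketch of this weighting on a made-up decision matrix:

```python
# Decision matrix: rows are alternatives, columns are states of nature.
probabilities = [0.5, 0.3, 0.2]       # probabilities of the states; sum to unity
utility_matrix = {
    "umbrella":    [2, 2, 2],
    "no umbrella": [5, 1, -3],
}

def expected_utility(row, probs):
    """Probability-weighted average of the utilities under each state."""
    return sum(p * u for p, u in zip(probs, row))

for alternative, row in utility_matrix.items():
    print(alternative, expected_utility(row, probabilities))
# umbrella 2.0
# no umbrella 2.2  (= 0.5*5 + 0.3*1 + 0.2*-3)
```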
The larger the group of decisions, the better catastrophic consequences can be leveled out.
 Practical limit: decisions have to be made in manageable pieces.
 Absolute limit: some extreme effects cannot be leveled out even in the hypothetical limiting
case in which all human decision-making aims at maximizing expected utility.