High Throughput Testing--The NRC Vision, The Challenge of
Elucidating Real Changes in Biological Systems, and the
Reality of Low Throughput Environmental Health Decision-Making
Dale Hattis
George Perkins Marsh Institute
Clark University
Outline
• The Older NRC Vision Based on High-Throughput Testing
and Safety-Type Risk Analysis
• One Goal of This Talk is to Illuminate Alternatives
• An Alternative Vision for Toxicology Based on
– Quantitative Modeling of Homeostatic Biological Systems, and
Problems in Their Early Life Setup and Late Life Breakdown
– Interactions of Toxicant Actions and “Normal Background”
Chronic Pathological Processes
– Variability/Uncertainty Analysis Transforming Current Risk
Analyses Used for Standard Setting
The Older NRC Vision Based on High
Throughput Testing Results
• Decades-long period for future development.
• Ensemble of high-throughput assays to represent a large
number (100+) of toxicity pathways.
• Well adapted to rapid screening of new chemicals and old
chemicals with no prior testing.
• Supports decisions on what concentrations of agents will
sufficiently perturb specific pathways to be of concern.
• Relate concentrations causing those perturbations to in
vivo concentrations using physiologically-based
pharmacokinetic modeling (a minimal kinetic sketch follows this list).
• No quantitative assessment of health risks or benefits of
exposure reductions for existing agents.
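The pharmacokinetic step above can be made concrete with a minimal sketch, assuming a one-compartment, clearance-limited steady state rather than a full PBPK model; the molecular weight, clearance, and assay concentration below are illustrative values, not figures from the talk:

```python
# Minimal "reverse dosimetry" sketch: translate an in vitro assay
# concentration into the daily oral dose rate that would sustain it
# at steady state, using a one-compartment model. All parameter
# values are illustrative assumptions.

def oral_equivalent_dose(c_active_uM, mw_g_per_mol, cl_L_per_h_per_kg):
    """Daily oral dose (mg/kg-day) giving steady-state plasma
    concentration c_active_uM, assuming complete absorption and
    clearance-limited kinetics: Css = dose_rate / CL."""
    c_mg_per_L = c_active_uM * mw_g_per_mol / 1000.0   # uM -> mg/L
    return c_mg_per_L * cl_L_per_h_per_kg * 24.0       # mg/kg-day

# Example: an assay AC50 of 10 uM for a chemical of MW 200 g/mol,
# with an assumed total clearance of 0.1 L/h/kg:
print(oral_equivalent_dose(10.0, 200.0, 0.1))  # ~4.8 mg/kg-day
```

A full PBPK treatment would replace the single clearance term with organ-level blood flows, partition coefficients, and metabolism, but the direction of the calculation is the same.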
Traditional Toxicological Model Leading To General
Expectations of Thresholds in Dose Response
Relationships for Toxicants
• Biological systems have layers on layers of
homeostatic processes (think of a thermostat)
• Any perturbation automatically gives rise to
offsetting processes that, up to a point, keep the
system functioning without long term damage.
• After any perturbation that does not lead to serious
effects, the system returns completely to the status
quo before the perturbation.
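A minimal sketch of the thermostat analogy above, assuming a proportional controller with a bounded effector (all numbers illustrative): perturbations are largely offset up to the effector's capacity, after which the residual displacement, and hence the potential for damage, grows with the size of the perturbation.

```python
# Toy homeostat: proportional negative feedback with a bounded
# effector. Perturbations within the effector's capacity are largely
# offset; beyond it, the controlled variable drifts from set point.
# All parameter values are illustrative.

def steady_state(perturbation, gain=10.0, max_effector=5.0):
    """Steady-state displacement of the controlled variable from its
    set point. Effector output = min(gain * displacement, max_effector);
    at steady state, displacement = perturbation - effector output."""
    displacement = perturbation / (1.0 + gain)   # unbounded solution
    if gain * displacement <= max_effector:
        return displacement                      # compensated regime
    return perturbation - max_effector           # capacity exhausted

for p in (1.0, 5.0, 20.0):
    print(p, steady_state(p))
# 1.0 and 5.0 leave small residuals (~0.09, ~0.45); 20.0 exceeds the
# effector's capacity and leaves a residual of 15.0.
```

The sharp change in behavior once the effector saturates is the system-level picture behind the expectation of individual thresholds.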
Caveats--There might not be no-adverse-effect thresholds for:
• Tasks that challenge the maximum capacities of
the organism to perform (e.g., 100 yard dash;
perhaps learning to read)
• Circumstances where some pathological process
has already used up all the reserve capacity of the
organism to respond to an additional challenge
without additional damage (e.g. infarction causes
heart muscle cell death that may be marginally
worsened by incremental exposure to carbon
monoxide)
Other Caveats
• The ground state of the system is not a stable equilibrium,
but a series of cyclic changes on different time scales (e.g.
cell cycle; diurnal, monthly).
• Sometimes continuous vs pulsatile patterns of change carry
important signaling information in biological systems (e.g.
growth hormone signaling for the sex-dependent pattern of
P450 expression)
– Therefore there must be resonance systems that respond
to cyclic fluctuations of the right period.
– (Think of the timing needed to push a child on a swing)
– Therefore the effective “dose” may need to be modified
by the periodicity to model dose response relationships.
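A toy illustration of the swing analogy above (illustrative parameters, not a biological model): identical brief pushes delivered at the system's natural period build a much larger oscillation than the same pushes at an unrelated period, which is why the effective "dose" may need to carry periodicity information.

```python
import math

# Damped oscillator driven by brief periodic pushes: the same total
# forcing produces a much larger response when the pulse period
# matches the natural period (pushing a child on a swing). All
# parameter values are illustrative.

def peak_response(pulse_period, natural_period=1.0, zeta=0.05,
                  t_end=60.0, dt=0.001):
    w0 = 2.0 * math.pi / natural_period
    x, v, peak = 0.0, 0.0, 0.0
    for i in range(int(t_end / dt)):
        t = i * dt
        # brief unit "push" at the start of each pulse period
        force = 1.0 if (t % pulse_period) < 0.05 else 0.0
        a = force - 2.0 * zeta * w0 * v - w0 * w0 * x
        v += a * dt           # semi-implicit Euler integration
        x += v * dt
        peak = max(peak, abs(x))
    return peak

print(peak_response(1.0))   # pushes at the natural period: large
print(peak_response(1.7))   # off-resonance pushes: much smaller
```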
Potential Paradigm Change for
Applications to Therapeutics
• Historical Paradigm--“The Magic Bullet”
– Find a molecule that will kill the nasty bacteria
– Find a spot in the brain that, if electrically stimulated, will control
Parkinson’s disease symptoms
• New Paradigm--Understand and exploit natural resonances
to enhance or damp system oscillations
– Potential for pulsatile systems for drug release
– Potential for sensor-based systems for drug release (e.g., smart
pumps that release insulin in response to real time measurements
of blood glucose)
– Potential for timed or sensor-based electrical stimulation of target
tissues (e.g., heart pacemakers)
Key Idea for Transforming Risk Assessment
for Traditional Toxic Effects
• Quantitatively characterize each uncertainty (including those
currently represented by “uncertainty factors”) by reducing it to
an observable variability among putatively analogous cases.
– Human interindividual variability--kinetic and dynamic
– Variation in sensitivity between humans and test species
– Adjustment for short- vs. longer periods of dosing and observation
– Adjustment for database deficiencies (e.g. missing
repro/developmental studies)
• This general approach is not without difficulty—need rules for
making the analogies (defining the reference groups to derive
uncertainty distributions for particular cases).
• However it does provide a way forward for health scientists to
learn to reason quantitatively from available evidence relevant
to specific uncertainties.
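A minimal sketch of what a distributional treatment can look like in practice, with assumed lognormal shapes and purely illustrative spreads (not the fitted values from the databases discussed in the following slides): sample each component adjustment from its distribution, multiply, and read an upper percentile of the product instead of multiplying point factors.

```python
import math, random

# Distributional alternative to point "uncertainty factors": sample
# each component adjustment from an assumed lognormal, multiply, and
# take the 95th percentile of the product. Medians and geometric
# standard deviations below are illustrative only.

random.seed(1)

def lognormal(median, gsd):
    """Draw from a lognormal specified by median and geometric SD."""
    return median * math.exp(random.gauss(0.0, math.log(gsd)))

def combined_factor_95th(n=100_000):
    products = []
    for _ in range(n):
        f = (lognormal(3.0, 2.0)    # animal-to-human projection
             * lognormal(2.0, 1.8)  # human interindividual variability
             * lognormal(2.0, 1.5)) # subchronic-to-chronic adjustment
        products.append(f)
    products.sort()
    return products[int(0.95 * n)]

print(combined_factor_95th())  # 95th-percentile combined adjustment
```

The point is structural: once each factor is a distribution anchored in reference data, the combined adjustment carries a stated confidence level rather than an unquantified margin.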
Examples of Data Bases Assembled/Analyzed
Type of Projection | Parameters | Original Authors
General Human Interindividual ("Straw Man" Proposal) | Pharmacokinetic and Pharmacodynamic Parameters | Hattis et al., 2002
Adult/Child | Classical Drug PK (T1/2, Clearance, Vd); 27-41 drugs; 366 data groups | Ginsberg et al., 2002; Hattis et al., 2003
Young Adult/Elderly | Classical Drug PK (AUC, T1/2, Clearance, Vd); 17-44 drugs; 215 data groups | Hattis and Russ, 2004 Report to EPA
Interspecies--Acute Toxicity | LD50; 10,160 Species-Pairs | Rhomberg and Wolff (1998)
Additional Data Bases Analyzed And/Or
Assembled
Type of Projection | Parameters | Original Authors
Interspecies Sensitivity--Multi-Dose and Carcinogenesis | Human Maximum Tolerated Dose and Putative Animal Equivalents for 61 Anti-Cancer Agents | Price et al., 2001; Hattis et al., 2002
Pharmacokinetics in Pregnancy--Parameters Derived from PBPK Model Fits or Direct Observations | Fetal Growth, Placental/Fetal Transfer, Maternal Tissue Growth, Partition Coefficients | Hattis, 2004 Report to EPA
Adult/Early Life Stage Carcinogenic Animal Bioassay Sensitivity for 9 Mutagenic Agents + Ionizing Radiation | Cancer Transformations per ppm, per dose/(body weight)^0.75, or per rem ionizing radiation | EPA (2005) cancer guidelines + Hattis et al. (2004, 2005)
The “Straw Man” Quantitative
Probabilistic Framework for Traditional
“Individual Threshold” Modes of Action
• It is ultimately hopeless to try to fairly and
accurately represent the compounding effects of
multiple sources of uncertainty as a series of point
estimate “uncertainty” factors.
• Distributional treatments are possible in part by
creating reference data sets to represent the prior
experience in evaluating each type of uncertainty.
Interpretation of Dose Response Information for Quantal
Effects in Terms of a Lognormal Distribution of
Individual Threshold Doses
[Figure: lognormal distribution of individual threshold doses. Vertical axis: number of people with thresholds at a given log(dose). Horizontal axis: log(threshold dose), or Z-score in units of standard deviations, with tick marks at Log(ED2.5) (Z = -2), Log(ED16) (Z = -1), Log(ED50) (Z = 0), Log(ED84) (Z = +1), and Log(ED97.5) (Z = +2). The shaded area represents the fraction of people with thresholds below a given dose, who therefore suffer the effect at that dose.]
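The figure's model can be stated in one line: if individual log10 threshold doses are normally distributed with median ED50 and standard deviation sigma, the fraction responding at dose d is Phi((log10 d - log10 ED50)/sigma). A minimal sketch with illustrative numbers (the ED50 and sigma below are not from the talk):

```python
import math

# Fraction of a population responding at dose d when individual
# threshold doses are lognormally distributed:
#   fraction = Phi((log10(d) - log10(ED50)) / sigma)
# ED50 and sigma below are illustrative.

def fraction_responding(dose, ed50, sigma_log10):
    z = (math.log10(dose) - math.log10(ed50)) / sigma_log10
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

ed50, sigma = 100.0, 0.5        # mg/kg-day; half a log10 unit spread
for d in (1.0, 10.0, 100.0):
    print(d, fraction_responding(d, ed50, sigma))
# 1.0   -> Z = -4  (~3e-5 responding)
# 10.0  -> Z = -2  (~0.023, matching the ED2.5 tick in the figure)
# 100.0 -> Z = 0   (0.5, the ED50)
```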
Analytical Approach for Putative
Individual Threshold-Type Effects
• Select Point of Departure (ED50), then define needed
distributional adjustments:
• LOAEL to ED50 or NOAEL to ED50
• Acute/chronic
• Animal to human
• Human variability, taking into account the organ/body
system affected and the severity of the response
• Incompleteness of the data base
Elements of the “Straw Man” Proposal
--Tentatively it is suggested that the RfD be the lower
(more restrictive) value of:
• (A) The daily dose rate that is expected (with 95% confidence) to
produce less than 1/100,000 excess incidence over background of a
minimally adverse response in a standard general population of
mixed ages and genders, or
• (B) The daily dose rate that is expected (with 95% confidence) to
produce less than a 1/1,000 excess incidence over background of a
minimally adverse response in a definable sensitive subpopulation.
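A minimal sketch of how criterion (A) might be operationalized, assuming illustrative lognormal uncertainty in the human ED50 and in the population spread (none of these values are from the talk): simulate the incidence at a candidate dose, take the 95th percentile across uncertainty draws, and search for the highest dose that keeps that percentile under 1/100,000.

```python
import math, random

# Sketch of "Straw Man" criterion (A): find the dose whose simulated
# excess incidence stays below 1e-5 with 95% confidence. The
# uncertainty distributions below are illustrative assumptions.

random.seed(2)

def phi(z):  # standard normal CDF
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def risk_95th(dose, n=10_000):
    """95th percentile (across uncertainty draws) of population
    incidence at the given dose."""
    risks = []
    for _ in range(n):
        ed50 = 100.0 * math.exp(random.gauss(0.0, math.log(3.0)))
        sigma = 0.5 * math.exp(random.gauss(0.0, math.log(1.5)))
        risks.append(phi((math.log10(dose) - math.log10(ed50)) / sigma))
    risks.sort()
    return risks[int(0.95 * n)]

# Bisect on log-dose: raise the dose while the 95th-percentile risk
# stays under 1e-5, lower it otherwise. (The boundary is slightly
# noisy because risk_95th is itself simulated.)
lo, hi = 1e-6, 100.0
for _ in range(30):
    mid = math.sqrt(lo * hi)
    lo, hi = (mid, hi) if risk_95th(mid) < 1e-5 else (lo, mid)
print(lo)  # candidate RfD (mg/kg-day) under these assumptions
```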
Factor Reduction Needed to Reach the “Straw Man” Criterion
of <1E-5 Risk with 95% Confidence--Results of Application of
the Straw Man Analysis to 18 Randomly-Selected RfDs from IRIS
[Figure: log-log scatter plot of the factor reduction needed (0.1 to 100) versus the overall uncertainty factor of the existing RfD (100 to 10,000), with separate symbols for continuous and quantal endpoints.]
Recent Results from Application of the “Straw Man” Approach
to “Value of Information” Testing of the IPCS Data-Derived
Uncertainty Factor Formulae
• Split of PD/PK variability should be closer to 5:2, rather than 3.1:3.1
as implied by the IPCS proposal (if one wishes the product to multiply
out to the traditional 10-fold uncertainty factor assigned for
interindividual variability; note that 3.16 × 3.16 ≈ 5 × 2 = 10).
• Approximately equal protectiveness would be achieved by substituting
the following values for the 10-fold interindividual variability factor,
for RfDs with the following characteristics:
Endpoint Type | Overall UF = 100 | Overall UF = 1000
Quantal | 17 | 7.4
Continuous | 43 | 19
More Details of Our Analysis are Available in:
• Hattis, D., Baird, S., and Goble, R. “A Straw Man Proposal
for a Quantitative Definition of the RfD,” Drug and
Chemical Toxicology, 25: 403-436, (2002).
• Hattis, D. and Lynch, M. K. “Empirically Observed
Distributions of Pharmacokinetic and Pharmacodynamic
Variability in Humans—Implications for the Derivation of
Single Point Component Uncertainty Factors Providing
Equivalent Protection as Existing RfDs.” In Toxicokinetics
in Risk Assessment, J. C. Lipscomb and E. V. Ohanian,
eds., Informa Healthcare USA, Inc., 2007, pp. 69-93.
• Detailed Data Bases and Distributional Analysis
Spreadsheets:
http://www2.clarku.edu/faculty/dhattis
Implications for Information Inputs to Risk
Management Decision-Making
• Increasingly, cases such as airborne particles, ozone, and lead are forcing the
recognition that even for non-cancer effects, some finite rates of adverse
effects will remain after implementation of reasonably feasible control
measures.
• Societal reverence for life and health means “doing the very best we can” with
available resources to reduce these effects.
• This means that responsible social decision-making requires estimates of how
many people are likely to get how much risk (for effects of specific degrees of
severity) with what degree of confidence--in cases where highly
resource-intensive protective measures are among the policy options.
• The traditional multiple-single-point uncertainty factor system cannot yield
estimates of health protection benefits that can be juxtaposed with the costs of
health protection measures.
Alternative Vision #2--Multiple Directions for
Improvement for Toxicology and Risk Assessment
• New Pharmacodynamic Taxonomies and
Approaches to Quantification
– Taxonomy based on what the agent is doing to the
organism
– Taxonomy based on what the organism is trying to
accomplish and how agents can help screw it up
• Quantitative Probabilistic Framework for
Traditional “Individual Threshold” Modes of
Action
Taxonomy Based on What Organisms Need to Accomplish to
Develop and Maintain Functioning, and What Can Go Wrong
• Establishment and Maintenance of Homeostatic Systems at
Different Scales of Distance, Time, Biological Functions,
Involving
– Sensors of Key Parameters to Be Controlled
– Criteria (E.g. “Set Points”) for Evaluating Desirability of Current
State
– Effector Systems That Act to Restore Desirable State With Graded
Responses to Departures Detected by the Sensors
• Some Examples of Perturbations:
– Hormonal “Imprinting” by Early-Life Exposure to Hormone
Agonists
– The “Tax” Theory of General Toxicant Influences on Fetal
Growth, and Possible Consequences
Key Challenges for Biology in the 21st Century
• How exactly are the set points set?
• How does the system determine how, and how vigorously
to respond to various degrees of departure from specific set
points?
• Could all this possibly be directly coded in the genome?
• Or, more interestingly, does the genome somehow bring
about a learning procedure where, during development, the
system “learns” what values of set points and
modes/degrees of response work best using some set of
internal scoring system?
• How exactly are the set points, etc., adjusted to meet the
challenges of different circumstances (“allostasis” states--see
Schulkin 2003, Rethinking Homeostasis)?
Relationship Between Weight at Birth
(in 500 Gram Increments) and Infant Mortality

[Figure: log-scale plot of infant mortality (1 to 1,000 per 1,000 births) in each birth weight range versus the upper ends of 500 g ranges of birth weights (0 to 5,000 g), with the conventional cutoff defining "low birth weight" marked.]
Relationships Between Reported Cigarettes/Day Smoked, Average
Birthweight, and Infant Mortality--U.S. National Center for
Health Statistics 1990 Data

[Figure: dual-axis plot versus reported number of cigarettes smoked per day (0 to 35): average birthweight (g), left axis, 3,100 to 3,400; infant mortality per 1,000 babies, right axis, 7 to 14.]
3400
Results of Regression Analysis of the Fraction of Control
Fetal Weight Response in Grouped Categories of TCA
Equivalents--Interpretation with a Linear Model

[Figure: fraction of control fetal weight (0.6 to 1.1) versus TCA equivalents (0 to 500 mg/kg); error bars represent ±1 standard error. Fitted line: y = 0.9925 - 7.40e-4·x, R^2 = 0.961.]
Results of Regression Analysis of the Fraction of Control
Fetal Weight Response in Grouped Categories of TCA
Equivalents--Interpretation with a Quadratic Model

[Figure: fraction of control fetal weight (0.6 to 1.1) versus TCA equivalents (0 to 500 mg/kg). Fitted curve: y = 1.0069 - 1.114e-3·x + 8.62e-7·x^2, R^2 = 0.983.]
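The two fitted equations can be compared directly. A short sketch evaluating the coefficients reported in the plots above (no new data; the dose points are chosen for illustration):

```python
# Compare the linear and quadratic fits reported above for fraction
# of control fetal weight vs. TCA-equivalent dose (mg/kg).
# Coefficients are taken from the two plots.

def linear(x):
    return 0.9925 - 7.40e-4 * x                      # R^2 = 0.961

def quadratic(x):
    return 1.0069 - 1.114e-3 * x + 8.62e-7 * x ** 2  # R^2 = 0.983

for dose in (0, 50, 100, 250, 500):
    print(dose, round(linear(dose), 3), round(quadratic(dose), 3))
# The two models agree closely within the observed range but differ
# in shape: the quadratic declines more steeply near zero dose and
# flattens toward high doses, while the linear fit implies the same
# decrement in fetal weight per unit dose everywhere.
```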
Plot of the Incidence of Type 2 Diabetes
in Relation to Log(Mean Birth Weight)
--Data of Forsen et al., 2000
[Figure: percent Type 2 diabetes (4 to 12) versus Log(mean birth weight, g) (3.2 to 3.7). Fitted line: y = 47.05 - 11.46·x, R^2 = 0.960.]
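As a worked check of the fitted line above, assuming base-10 logs (consistent with the plotted axis range of 3.2 to 3.7):

```python
import math

# Evaluate the regression reported above from the Forsen et al. data:
#   % Type 2 diabetes = 47.05 - 11.46 * log10(mean birth weight, g)
# Base-10 logarithms are assumed from the axis range; the two birth
# weights below are chosen for illustration.

def pct_type2_diabetes(birth_weight_g):
    return 47.05 - 11.46 * math.log10(birth_weight_g)

print(pct_type2_diabetes(2500.0))  # ~8.1% at 2,500 g
print(pct_type2_diabetes(3500.0))  # ~6.4% at 3,500 g
```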
Taxonomy Built from the Fundamental Ways that
Chemicals Act to Perturb Biological Systems
• For preclinical stages or subclinical levels of
effect, is the basic action reversible or irreversible,
given a subsequent period of no exposure?
– Reversible (enzyme inhibition; receptor activation or
inactivation)--traditional acute or chronic toxicity under the
traditional toxicology/homeostatic-system-overwhelming
framework (individual thresholds for response)
– Irreversible (DNA mutation; destruction of alveolar
septa; destruction of most neurons)
Subcategories for Nontraditional (Based on
Irreversible Changes) Modes of Action
• How many irreversible steps are needed to
produce clinically recognizable disease?
– (few--up to a dozen or so) molecular biological
diseases--mutations, cancer via mutagenic mechanisms
– (many--generally thousands+) chronic cumulative
diseases (emphysema and other chronic lung diseases
caused by cumulative loss of lung structures or scarring;
Parkinson’s and other chronic neurological diseases
produced by cumulative losses of neurons)
Special Features of Chronic Cumulative Disease Processes
• Clinical consequences depend on the number of irreversible
steps that have occurred in different people (often little
detectable change until a large number of steps have occurred).
• Effects occur as shifts in population distributions of function.
• Thresholds for the causation of individual damage steps must be
low enough that the disease progresses with time in the absence
of exposures that cause acute symptoms.
• Different kinds of biomarkers are needed for:
– Accumulated amount of damage/dysfunction (e.g. FEV1)
– Today’s addition to the cumulative store of damage (most
powerful for epidemiology based on associations with short-term
measurements of exposure), e.g.:
• excretion of breakdown products of lung structural proteins;
• blood or urine levels of tissue-specific proteins usually found only
inside specific types of cells, such as heart-specific creatine kinase
for measurement of heart cell loss due to infarctions
Toward Risk Assessment Models for Chronic
Cumulative Pathological Processes
• Describe the fundamental mechanism(s) that causes the accumulation of
the individual damage events (especially the quantitative significance of
various contributory factors). Key aid--biomarkers of the daily progress
of damage (e.g. key enzyme released from a dying neuron of the specific
type involved in the disease)
• Quantitatively elucidate the ways in which specific environmental agents
enhance the production of or prevent the repair of individual damage
events
• Describe the relationships between the numbers, types, and physical
distribution of individual damage events and the loss of biological
function or clinical illness. Key aid--biomarkers for the accumulation of
past damage, e.g. FEV1.
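A minimal sketch of this class of model, in which every rate, reserve value, and exposure coefficient is an illustrative assumption: irreversible damage events accumulate at a background rate plus an exposure-proportional increment, and clinical disease appears when accumulated damage exceeds an individual's reserve capacity.

```python
import math, random

# Toy chronic-cumulative-damage model: people accumulate irreversible
# damage events at a background rate plus an exposure-proportional
# increment; disease appears when lifetime damage exceeds a
# lognormally varying individual reserve. All values illustrative.

random.seed(3)

def lifetime_incidence(exposure, n_people=50_000, years=70,
                       background_rate=100.0, per_unit=5.0):
    rate = background_rate + per_unit * exposure   # events per year
    mean, sd = years * rate, math.sqrt(years * rate)
    cases = 0
    for _ in range(n_people):
        damage = random.gauss(mean, sd)            # ~Poisson total
        reserve = random.lognormvariate(math.log(10_000.0), 0.25)
        if damage > reserve:
            cases += 1
    return cases / n_people

for e in (0.0, 1.0, 5.0, 20.0):
    print(e, lifetime_incidence(e))
# Incidence rises smoothly with exposure even though each individual
# has a definite reserve: the effect appears as a shift in the
# population distribution, not as a sharp population threshold.
```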
Motivation to Move On
• Younger generation of analysts will ultimately not tolerate older
procedures that fail to provide a coherent way to use distributional
information that is clearly relevant to the factual and policy issues.
• Younger generation of analysts will have greater mathematical and
computational facility, particularly as biology becomes quantitative
“systems biology” with quantitative feedback modeling--increasingly
an applied, engineering-like discipline.
• Legal processes will ultimately demand use of the “best science” as
this becomes recognized in the technical community.
• Newer information/communication tools will foster increasing habits
and demands for democratic accountability; experts worldwide will
increasingly be required to expose the bases of their policy
recommendations--leaving less room for behind-the-scenes exercise
of “old boy” safety factor judgments.
Contributions of High-Throughput Testing in
Different Decision Contexts
• Preliminary evaluation of large numbers of new, and old but untested,
environmental agents--good promise to be helpful.
• Evaluation of contaminated sites (e.g. Superfund)--support for
decision-making based on complex mixtures is highly dubious.
• Assistance to epidemiological research to locate contributors to
human disease (e.g. asthma)--also doubtful.
• Assessment of relative risks of different industrial processes--fanciful,
because of the need to guess the relative weights to be assigned to
numerous dissimilar findings on short-term tests without established
quantitative connections to adverse health effects.
• High-profile choices of degrees of control/exposure reduction to be
mandated for specific agents (“slow throughput” decision-making)--
likely to muddy the waters by raising difficult mode-of-action
questions that can only be resolved by expensive and slow in vivo
testing (e.g. using “knockout” mice). This is in fact the most likely
near-term contribution.