ESCAPING NEWTONIAN MECHANICS: PHILOSOPHY AND METHODS FOR APPLIED REAL ESTATE RESEARCH

PART I PROBLEMS WITH THE DOMINANT QUANTITATIVE PARADIGM
Introduction
Limits of the positivist paradigm
Newton’s spirit in the social sciences
Two historical attempts to deal with complexity
Deductive theory in economics
Statistical methods to analyse stochastic processes
Why Newtonian social science fails
Experiments and cumulative knowledge
Econometricians’ “critiques”
Epistemological revision in the physical sciences and mathematics
Conclusion

PART II TOWARDS IMPROVED SOCIAL SCIENCE PARADIGMS OF INQUIRY
Introduction
Changing Metaphysical Assumptions
Classify degree of disorder in systems
Diversifying Paradigms
Historical perspective
Institutions and “institutional transactions”
Creative problem solving
Multi-disciplinary perspective
Complexification searches--in applications details matter
Improving econometric methods
Qualitative paradigm exploratory research
Data improvement and descriptive statistics
State of the art quantitative methods
Institutional context of social science research
A revised agenda for graduate students
Revising researchers’ roles and reward structures
Goal-seeking, feedback, error correction
Risk Management
Conclusion
Bibliography
Escaping Newtonian Mechanics: Philosophy and Methods for
Applied Real Estate Research
by Max Kummerow
Department of Property Studies
Curtin University
January, 1998
Abstract:
Part I Problems with the dominant quantitative paradigm
Historically the social sciences, including real estate economics, sought to mimic
successful physical science methods. But it is impossible to use experimental methods--the epistemological mainstay of physical science--in most social science research. This
makes it far more difficult to demonstrate causation or to establish theory. Moreover,
human designed systems are complex, simultaneous, non-linear, open, and evolving
processes. Institutions and mental states are causally important--both are complex and
changeable--and what’s more, created by design to implement culturally mediated
values. Positivist studies often ignore basic issues like institutions, reflexivity, time
variance of data generating processes, dynamics, complexity, and purposeful choice or
design, as well as statistical problems. These errors and omissions in the received
paradigm mean quantitative research results are likely to be inaccurate, spurious, or
irrelevant outside the sample. The math in models often poorly represents real
systems. Generalisations sought by such research are invalid. Studies’ aura of
scientific rigour is meretricious despite sophisticated number crunching’s effectiveness
as rhetoric.
Part II Towards Improved Paradigms of Inquiry
Recent work in the philosophy of science supports methodological pluralism.
Qualitative research paradigms, dynamic econometrics, institutional economics, system
dynamics simulations, and other approaches add breadth and depth to social science
models.
Epistemological assumptions, research methods, roles of academic
researchers, and uses of research all need diversification.
For a paradigm shift to come about in the social sciences, it would be sufficient merely to update our imitation of the physical sciences: the points mentioned here have already found their way into the philosophy of the physical sciences through the quantum revolution in physics, the computer revolution, and recent developments in mathematics and the biological sciences.
Keywords: Philosophy of science, research paradigm, experiment, econometrics,
history of science, real estate economics, applied research
Part I Problems with the Dominant Quantitative Paradigm
“Business school research is being heavily criticised. Many think
business school research hasn’t done the job. Much of it is considered
unreadable, too esoteric, or irrelevant by business constituencies it
aims to serve. Disciplines are too narrow, so they don’t match real
world problems. We produce papers which are sets of equations
suitable for talking to other academics, but not to business people.
Advances in some sciences come from universities, but innovations and
advances in business don’t come as often from business school
research.”
Ken Lusht, Penn State University
Limits of the positivist paradigm
Part I of this paper inventories difficulties of the dominant positivist empiricist
quantitative social science research paradigm. Part II proposes ways to make research
more empirical, credible, and useful. The fundamental problems that lead to frequent
modelling failures in the social sciences have to do with the nature of the data
generating processes themselves. The question can be posed as “What can be done to
better understand, predict, and control disorderly, open, human designed systems?”
In the 1996 talk quoted above, Lusht went on to say computing technology had a big
impact by making data analysis so easy. Oddly, this destroyed respect for empirical
work. Most people don’t believe the numbers they see published because they know
they come from data mining and other abuses. So there is a tendency to cling to
theory, which makes academics more abstract and less able to relate to practical
business problems. Published work tends to be little “bits and pieces” while practical
problems are broader and require integrating a wider scope of information and
disciplines. And things change so fast in the business world that data and research are
quickly out of date. I used to claim the most important thing the late University of Wisconsin real estate guru Jim Graaskamp taught his students was “details matter”--generalisations cannot be trusted because contexts vary, tastes change, etc. Lusht
speculated that future business research will adopt a more holistic “problem solving”
perspective.
Meta-analyses that inventory and analyse results of multiple studies on a topic can be
very disheartening if one expects the world to “hold still.” Model deficiencies come
under at least four interrelated headings:
1. Lack of stability of model specification and parameter estimates across samples.
2. Wide confidence limits for parameter estimates and forecasts.
3. Actual outcomes often outside forecast confidence limits.
4. Ideological content of models.
Social science models have not been able to escape ideological content by use of
positivist philosophy and quantitative methods. Many issues in economics and other
social sciences can be framed in alternative ways, leading to different models
supporting contradictory points of view. Apparently, we must "deconstruct" research,
as the post-modernists recommend, to identify biases.
Specification searches are ambiguous even within a sample: McAleer and Veall
(1989) provide an example of alternative plausible, but mutually contradictory models
“explaining” the same data.1 When studies disagree regarding elasticities, model
structure, etc., what can be concluded? Pesaran and Smith (1985) remark that “The
experience of the past 40 years suggests that few econometric results are robust.”2
Newton’s spirit in the social sciences
Classical Newtonian mechanics is still the unconscious template for most quantitative
work in the social sciences. Emulation of the physical sciences, and Newton in
particular, was obviously due to the enormous success of the physical sciences. Sir
Francis Bacon’s dream that experimental science could improve human life came true
beyond expectations. Life expectancy doubled, population and incomes increased tenfold, Man went to the moon, and so on. And Newton’s theory is so powerful, exactly
explaining everything, everywhere, at all times.
In his comprehensive History and Philosophy of the Social Sciences, Gordon points
out, quoting the investigators themselves, that early social scientists consciously
imitated Newton, but adopted different and contradictory “simple universal”
explanations of human motivation.
Table 1 Emulation of Physical Sciences by Social Scientists
(Quotes and page numbers all from Gordon, 1991)
“The metaphysical foundation of Hobbes’s political theory was given its classic formulation by Rene Descartes: the conception of reality as consisting of mechanistically linked phenomena.” p. 72

“The Physiocratic model was built on the idea that social phenomena are governed, as are physical phenomena, by laws of nature that are independent of human will and intention. The task of the economist... is to discover the natural laws governing economic phenomena...” p. 91

“Adam Smith... conceived of the realm of human behaviour as governed by laws akin to laws of nature, not laws made by sovereigns or legislators.” Smith’s universals were, of course, self-interest and the invisible hand. p. 117

Malthus says “the constant pressure of population against subsistence...(is)...one of the general laws of animated nature. The constancy of the laws of nature and of effects and causes, is the foundation of all human knowledge.” p. 187

Bentham decided “the principle of utility could serve in social science the role that gravity plays in Newton’s model of the physical universe.” p. 251

“Comte conceived the idea that scientific methods could be applied to social problems yielding results as certain as those of physics, chemistry, and mathematics, thereby eliminating the differences of opinion that are such potent sources of political instability and social conflict.” p. 287 This new science was first described as ‘social physics’ but later “he compounded Latin and Greek roots into a new term, ‘sociology.’” p. 291
1 McAleer and Veall discuss different model specifications predicting the effects of capital punishment: “How Fragile are Fragile Inferences? A Re-evaluation of the Deterrent Effect of Capital Punishment” (1989), Review of Economics and Statistics, Vol. 71: 99-106.
2 H. Pesaran and R. Smith (1985) “Keynes on Econometrics” in T. Lawson and H. Pesaran (eds.) Economics: Methodological Issues. London: Croom Helm: 140.
John Stuart Mill, Principles of Political Economy (1848): “The laws and conditions of the production of wealth partake of the character of physical truths...” p. 205

Marx and Engels addressed themselves to the task of discovering the laws of history, based on the principles of historical materialism. p. 317

Emile Durkheim used the analogy of society to a living organism, with his general law being social solidarity created by the division of labour. p. 443
May Brodbeck wrote in 1968 that the belief that social and physical sciences should be the same is “the unexamined premise of the vast majority of present-day social scientists.”
(Gordon, 1991:637) However, there were sceptics in every period: Friedrich Hayek
coined the phrase “scientism” to deride the view that social and physical sciences are
identical. The Austrian school and others used Weber’s idea of verstehen to draw an
important distinction. Verstehen (understanding, awareness, consciousness) means
subjects can attach subjective meanings to events, and act according to these ever
changing cognitive or emotional states. Peter Winch argued that the study of social
phenomena must be philosophical rather than scientific, because humans are “rule
following” creatures which requires a different notion of causation. JM Keynes in a
comment on Tinbergen’s models maintained that economics cannot escape moral
content. Karl Popper, in his utterly convincing Poverty of Historicism, arguing largely
against Marxist historical materialism, makes a powerful case that no future outcome is
inevitable.3 But if this is so, how can social science theories explain and predict?
Gordon concludes, “The determinants of human behaviour, we now realise, are very complex.” (1991:133) Nevertheless, he notes, the idea that
“economic processes are governed by general ‘laws’, analogous to those that control
natural phenomena...has dominated the methodological stance of the discipline down
to the present day.” (Gordon, 1991:546)
Gordon observes that, “Nomological propositions are possible only with respect to
phenomena that have some reasonable degree of uniformity. Some social phenomena
may be so diverse that no nomological proposition can be made... no matter how
advanced the social sciences become. In addition, even those social phenomena that
can be covered by nomological propositions seldom have a degree of uniformity and
precision comparable to those of the natural sciences.” (1991:51)
Nevertheless, the Newtonian idea that human behaviours are examples of what Hempel called “covering laws”4 survives in every journal article that reports “findings,” models,
or coefficients without qualifying them by pointing out limitations of their application
in time, space, culture, institutions, or historical situation. Dependency upon complex
“auxiliary” or “side” conditions being held constant (ceteris paribus) in a world where
we know such conditions do change considerably undermines the usefulness of such
research results. Without such caveats, the unwary reader is invited to confuse a result
valid only in Poughkeepsie in 1982 with the universal laws of Newtonian physics. This
undermines the rationale for the generalisation seeking exercise.
3 Lenin, by the way, also rejected Marx’s “inevitable laws of history,” and Trotsky, fleeing from Stalin’s assassins, concluded that not only is historical progress not certain, society can even retrogress into barbarism.
4 Hempel recommended the use of generalisations even in the humanities, i.e. historical studies, saying every historical account should be presented as an example of a “covering law.” Most historians rejected Hempel’s view, preferring to see historical events as unique.
It is understandable that researchers would prefer to discover universal laws. There
appears to be more glory in a universally valid “finding” which might contribute to
confirmation of “theory,” survive for the ages, and inform people everywhere than in a
case study valid only in Poughkeepsie in 1982. As we leave the lunch room, a research
institute head jokes “Well, back to pushing back the frontiers.” Lately some of the
pushing has come, unfortunately, from barbarians cutting the research funding.
Above all, Newtonian work is publishable and since academic promotion depends upon
publication, we pigeons peck madly at the computer keys to get the rewards, whether
or not the activity is meaningful.5 Research that does not claim to lead to
generalisations (or better yet, theory) is dismissed as “merely descriptive” or a “case
study,” hence not publishable in a respectable academic journal. Applied research,
consulting, or case studies get less respect than “basic” research that claims
generalisable “results.”6 Students are still told to test hypotheses--aiming for inductive
proof of generalisations--rather than to solve specific problems. Students are told to
seek time invariant Newtonian physics-like causal relationships, regardless of the
fictitious nature of such relationships in human designed systems.
Despite the failure to find a “Newton” of social science, and inability to use physical
scientists’ main epistemological weapon (experimentation, discussed below) the major
features of natural science’s paradigm of inquiry remain in place in academia as the
“proper” way to do social science. These include:
1) Remaining “wertfrei” or value free--the idea of separating positive and normative
propositions to enhance “objectivity” and therefore credibility. This implies scientists
should play the role of passive, disinterested observers rather than active participants
and allows even partisan ideological statements a claim to being “objective truth.”
2) The goal of establishing theory. This assumes existence of time invariant
underlying order to be discovered and verified by testing against data--empirical
observation to establish general conclusions by induction.
3) Analysing reality into small bits and testing the bits. Presumably, understanding the
whole comes by recombining the bits.7
These propositions are all questionable in studies of human behaviour:
1) In the “human sciences” we always care about outcomes--they are our outcomes--so it is impossible to separate values from science. All human action, even doing social science itself, implies purposes. Physical science research is also not value free. Values and choices are merely left unstated and implicit.
2) Many outcomes of interest are historically unique realisations of disorderly
stochastic processes. The scales fell from St. Paul’s eyes on the road to Damascus
and history was forever and irreversibly changed as a result. Enlightenment may be a
5 Our institutional reward structure makes such work meaningful to us, if not to anybody else.
6 I grew up hearing this view from my father, who considered himself a basic researcher. He felt it was easier to get support from industry for applied work, but that more abstract basic research generally had greater long term payoffs.
7 “Reductionism” means phenomena observed at one “nomological” level are understood by research on another level. Behaviour of symptoms of disease is explained via reference to bio-chemical processes, for example. In the social sciences, explaining social outcomes solely in terms of individual behaviour fails to account for the importance of context and ignores “emergent” properties of institutions and groups.
better human ideal than predictability? Socrates said “If I am wiser, it is because I
know that I do not know.”
3) Human behaviour is diverse, complex, and context dependent. The bits, as Ken
Lusht remarked, often do not add up to anything.
Two historical attempts to deal with complexity
Setting off in the spirit of Newton to explain and control a deterministic world, the
social sciences encountered enormous difficulties. Two methodological compromises
were adopted which enabled social science to proceed, while retaining an implicitly
physical science, Newtonian epistemology.
Deductive theory in economics
J.S. Mill, the great 19th century economist and moralist, argued cogently for the
necessity of deductive methods in economics.
“Since it is in vain to hope that truth can be arrived at, either in Political
Economy, or in any other department of the social science, while we
look at the facts of the concrete, clothed in all the complexity with
which nature has surrounded them, and endeavour to elicit a general
law by a process of induction from a comparison of details, there
remains no other method than the a priori one, or that of “abstract
speculation.”8
Mill admits: “Not that any political economist was ever so absurd as to suppose that mankind are really thus constituted, but because this is the mode in which science must necessarily proceed.” (Reprinted in Hausman, 1984:53) Mill also notes, however, that to apply theory to a specific case, all the circumstances of that case must be considered: “Anyone who wants to offer practical advice must consider these disturbing causes, else his abstractions remain useless.” (Reprinted in Hausman, 1984:60)
The price paid for this methodological shortcut is that the results of economic theory
are already implicit in its assumptions--there is no proven connection to the real
world.9
All science contains an element of ideology or prior belief. It must, or the world would make no sense at all and one wouldn’t know where to start or what data to collect. But
economics is unique in its degree of reliance on pre-scientific deductive logic. Later in
his essay, Mill speaks of the virtues of dialogue between theorist and the “practical
man.” (Hausman, 1984: 64)
The idea of “simplification in order to identify the crucial issues” was echoed by John
Neville Keynes at the end of the 19th century: “Economic science deals with
phenomena that are more complex and less uniform than those with which the natural
sciences are concerned...its conclusions lack both the certainty and the universality that
8 From Mill, “The Scope and Method of Political Economy,” p. 59 of Daniel M. Hausman, ed. (1984) The Philosophy of Economics. Cambridge: Cambridge University Press.
9 For example, if instead of assuming that humans are insatiable and self-interested one assumes they are content when they have a sufficiency and altruistic, one would end up with a completely different economic theory. These alternative assumptions have, in fact, been suggested by feminist economists.
pertains to physical laws.” (Hausman, 1984:72) Nevertheless, Keynes argued against the German Historical School economists who, he says, want to treat of “ought” by saying “the demands of justice and morality must be satisfied.” The Historical School argued that abstract economic man is a mistake and broader, more complex study of human nature is needed. “Great stress is laid upon appealing to specific observation of the actual economic world”...(the Germans were an) “inductive and statistical school.” (Hausman, 1984:81) J.N. Keynes, however, was a methodological pluralist: “Appropriate method may be either abstract or realistic, deductive or inductive, mathematical or statistical, hypothetical or historical,” depending upon the issue at hand. (Hausman, 1984:83) Defence of deductive method was forcefully re-stated by
Lionel Robbins in the 1930’s.
It is not surprising, by the way, that my argument in this paper echoes the German
Historical School perspective. James Graaskamp, my teacher at the University of Wisconsin, was trained by Richard Ratcliff, a student of Richard Ely. Ely, a founder of
the American Economic Association, studied in Germany in the 1870s. Ely advocated
changing institutional rules when robber baron capitalism supported by the laissez faire
ideology of classical economics led to monopoly, exploitation, financial crisis, and
social unrest at the end of the 19th century.10 Case study exercises in Graaskamp’s
classes demonstrated that case specific details, institutions, and moral issues matter,
while generally applicable theory is scarce.
Milton Friedman's famous 1953 paper “The Methodology of Positive Economics” is
perhaps the best known mid-20th century defence of deductive method in economics.11
Herbert Simon used the phrase “principle of unreality” to denigrate Friedman’s idea
that truth of premises is irrelevant to theory validity and called for a more empirical
economics: “Let us make the observations necessary to discover and test true
propositions, call them x prime and y prime to replace the false x and y. Then let us
construct a new market theory on these firmer foundations.”12 Darnell and Evans,
Hausman, and others dismiss Friedman’s argument as “naive instrumentalism.”
(Hausman, 1984: 43) In retrospect, Friedman’s “positive economics” looks more
ideological and less positive.
Thorstein Veblen attacked deductive method, arguing that neo-classical economists “have yet to contribute anything at all appreciable to a theory of genesis, growth, sequence, change, process, or the like in economic life.” The (neo-classical) theory is drawn in terms of teleology (rational man’s economic goals), and so cannot say anything on “the causes of change.”13 Veblen stressed changing cultural institutions as
causal variables in economic behaviour and called such behaviour “habits” as opposed
to unchangeable theory. (Hausman, 1984: 179)
10 Ely deserves more credit as an economic system designer than either Adam Smith or Marx. Ely mapped out the humanised and regulated market economy model which proved far more successful than either laissez faire capitalism or communism during the 20th century. Virtually all of the policies Ely recommended (legalising labour unions, public education, child labour laws, regulation of natural monopolies, anti-trust regulation, old age pensions) had been implemented by the mid-20th century in developed countries.
11 Milton Friedman “The Methodology of Positive Economics” reprinted in Hausman, op. cit. pp. 210-244.
12 Herbert Simon “Testability and Approximation” in Hausman, op. cit. p. 247.
13 Thorstein Veblen “The Limitations of Marginal Utility” in Hausman, op. cit. p. 174.
E.F. Schumacher commented on another weakness of economics based on deduction from simplified abstract axioms: “It is inherent in the methodology of economics to ignore man’s dependence on the natural world,” leading to “deception of self and others.” The problem of production, regarded as solved, is not; the natural world is ignored by economic theory.14
Ian Kerr argues, in the context of value theory, that setting aside interpersonal utility
comparisons and considerations of the determinants of value, while merely observing
prices, taking preferences as given, carries a heavy price. “Any predictive power the
model might have is likely to be confined to the period of the time series that
‘empirically’ validated the model.” Kerr says values should be reintroduced into
economics because they “motivate economic activity and resultant prices.” 15
Statistical methods to analyse stochastic processes
Use of statistics opened up complex, stochastic processes to theory development and
testing. Adding an error term allowed exceptions to become random variation, so
exceptions that had been assumed away in deductive method could become properties
of distributions of outcomes. But this raises problems of theory testing. Contrary to Popper’s falsificationism, a counterexample cannot disprove a theory about a stochastic process, nor can a particular case in which a theory is borne out confirm that a relationship will always hold. Selection of the critical value in a significance test is ambiguous; there may be no clear rule or reason for adopting .05 as the rejection level. And with a sufficiently large sample size, any small difference becomes “statistically significant.” Theory confirmation depends upon arbitrary decisions about type I errors and sample sizes.
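The sample size point is easy to demonstrate. The following sketch (in Python; the effect size, sample sizes, and seed are invented for illustration) compares two populations whose means differ by a practically meaningless 0.01 standard deviations: the t statistic is unremarkable at n = 100 but comfortably “significant” by n = 1,000,000.

import numpy as np

rng = np.random.default_rng(42)

def t_statistic(a, b):
    # Two-sample t statistic, equal-variance form.
    na, nb = len(a), len(b)
    pooled = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled * (1 / na + 1 / nb))

# A practically negligible true difference: 0.01 standard deviations.
for n in (100, 10_000, 1_000_000):
    a = rng.normal(0.00, 1.0, n)
    b = rng.normal(0.01, 1.0, n)
    print(f"n = {n:>9,}   t = {t_statistic(a, b):6.2f}")
# Typical output: |t| sits near 0-1 for the smaller samples but exceeds
# the 1.96 cutoff by n = 1,000,000 -- the same negligible difference
# becomes "statistically significant" on sample size alone.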
An insidious result of the adoption of statistical methods is the implicit assumption that
we live in a frequentist universe. This assumes large numbers of repetitions of
outcomes. But there is no repetition of coin tosses in historical processes. Once you
lose money in a real estate venture, no one is standing by to give you back the money
for another roll of the frequentist dice. Historical events come about only once. As
Renshaw points out, even low probability events may occur and have irreversible
consequences. Therefore it is important to take action under uncertainty to control
risks and seek desired goals--to create a desired state of affairs. What saves action
under uncertainty from going too far wrong is feedback correction mechanisms
informed by values. Mistakes are corrected when they are detected. Frequentism
implies passivity because the odds are given by the time invariant process (infinite
repetitions of events) and if this deal is bad luck, the next one may be better.
Frequentism is a slightly less powerful version of the oriental notion that fate controls all, so passivity is the best strategy.
14
E.F. Schumacher (1974)Small is Beautiful. London Sphere Books, Ltd. p. 38-40.
Ian A. Kerr (1995) “Value, Price, and Economic Welfare” unpublished draft, Curtin University of
Technology, p. 14.
15
8
Why Newtonian social science fails
Experiments and cumulative knowledge
The Greeks invented analytical thought, solving problems by breaking them down into simple components. Each scientist specialises in a “bit” of the problem and their collective work provides comprehensive understanding--knowledge and theory cumulate. “If I have seen further, it is by standing on the shoulders of giants,” remarked Newton.
This method works most powerfully in laboratory experiments under controlled
conditions. Experimental method features:
1) Random assignment of subjects.
2) Control of extraneous variables.
3) Systematic manipulation of experimental variables. This allows the scientist to paint
a convincing picture of causation as magnitude of response can be shown to track
level of treatment.
4) More complex experimental designs using combinations of variables allow measurement of interaction effects.16 (e.g. a 2x2x2 design: eight combinations of three variables, each administered at two levels)
5) Replication allows the experimenter to determine levels of residual or random error and thereby create a reference distribution to distinguish treatment effects from random variation or noise.17 (A toy simulation of points 3-5 follows below.)
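Points 3-5 can be illustrated with a toy simulation (my sketch; the response function and its coefficients are invented, and a real experiment would also randomise subjects to the cells): replication averages away noise, and the standard factorial contrasts recover the main effects and the interaction.

import numpy as np

rng = np.random.default_rng(1)

# Invented response surface: two factors at two levels (0/1) plus noise.
# The a*b term is the interaction only a factorial design can estimate.
def response(a, b):
    return 10 + 2 * a + 3 * b + 4 * a * b + rng.normal(0, 1)

replicates = 50
cell = {(a, b): np.mean([response(a, b) for _ in range(replicates)])
        for a in (0, 1) for b in (0, 1)}

# Factorial contrasts computed from the four cell means:
main_a = (cell[1, 0] + cell[1, 1] - cell[0, 0] - cell[0, 1]) / 2
main_b = (cell[0, 1] + cell[1, 1] - cell[0, 0] - cell[1, 0]) / 2
interact = cell[1, 1] - cell[1, 0] - cell[0, 1] + cell[0, 0]

print(f"effect of A ~ {main_a:.2f} (true 2 + 4/2 = 4)")
print(f"effect of B ~ {main_b:.2f} (true 3 + 4/2 = 5)")
print(f"interaction ~ {interact:.2f} (true 4)")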
Without experimental control we have merely “correlation”--one effect occurring with
another. As we all learned in our first statistics course and then have tried to ignore
ever since, “correlation does not imply causation.” Absent experimental control, the
variable of interest may be confounded with another variable that is the real cause. It
follows that all econometric models may be spurious.
There are several reasons why experiments on social science issues may not be
possible. First, ethical barriers often preclude experimentation on human subjects.
Second, in many cases it is politically or practically impossible to gain adequate power
or funds to experiment. People may not cooperate. Third, experiments in social
science are often impossible because the events in question are historical, one time only
events.
Fourth, it is not clear that the same result will occur in a controlled experiment with
humans as would occur in a complex social situation. Some housing economists felt
the U.S. Experimental Housing Allowance Program experiment, surely one of the
most expensive and carefully designed social policy experiments ever attempted,
probably had misleading results simply because participants knew it was a short term
experiment and therefore acted differently than they would have if the program had
16 My first job as a teenager was in an histology laboratory working on an experiment studying interaction effects of two carcinogens. Box, Hunter and Hunter, Statistics for Experimenters, is an excellent introduction to analysis of variance in experimental studies.
17 For an excellent account of analysis of variance under various experimental designs see Box, Hunter, and Hunter, Statistics for Experimenters.
been “real” and permanent.18 The famous “Hawthorne effect” is another example
where the mere fact of being studied changed worker behaviour. And, of course,
human subjects can choose to lie or fake the results.
C. Wright Mills, a sociologist, denigrated quantitative methods as “abstract empiricism,” claiming the price of “rigour” in pseudo-quantitative social science research is irrelevance. (Mills, 1959) Mills’ point was that by abstracting behaviour so much from real life, the investigator puts his subjects into so artificial a situation that behaviour cannot be reliably generalised. A.N. Whitehead warned of the
“fallacy of misplaced concreteness” when disciplinary organisation of knowledge
requires a high level of abstraction. Application of results to the real world may be
misleading when the model only represents a small piece of the system.
Complexity of human behaviour militates against meaningful comparisons. Because my experience, intellect, and emotional makeup differ from yours, many confounding variables confuse results even when it appears that a controlled experiment has been carried out. What is being measured is not as clear as in a physical science experiment. The researcher and subject may disagree on what is causing behaviours, and subjects’ reactions to a stimulus may differ due to different cognitive interpretations of the situation.
Without experimental control groups, the social scientist can never know for sure
which variables caused an effect--no variable ever can be observed to operate in
isolation. It is always possible that an effect will be overwhelmed by the effects of
other variables or that an effect attributed to one variable is really due to another. It
isn’t surprising that economists often find coefficients with signs opposite to expected
signs. Assuming the theory that led to the expectation is ok, this must be due to
omitted variable bias. But if “wrong” signs can be biased, so can “right” signs.
While sometimes causation can be worked out clearly in non-experimental settings and
sometimes confounding variables creep into experiments, nevertheless, there is a huge
degree of difference in terms of practical ability to sort out cause and effect, magnitude
of response, variability of response, and interaction effects between variables.
Without experiments, we lose the ability to cumulate theory bit by bit by testing
variables one at a time.
Econometricians’ “critiques”
This section reports lines of research that provide mathematical insights into the positivist empiricist paradigm’s difficulties in modelling human behaviour. Obviously, if quantitative models’ predictive power is poor, the math is wrong--that is, the systems studied are poorly represented by the models.
Simulation modeling under uncertainty with irreversible outcomes
Renshaw simulated risks of species extinction through normal random variation when
populations have fallen to low numbers. Renshaw gives an example where starting
with a population of 3 individuals and probability of a birth = 1 and death =.5 for each
individual during each time period, in 20 simulations of only six time periods, the
ending population ranged from extinct to 225. (Renshaw, 1991:3) With such diversity
18 Anthony Downs and Katharine Bradbury (1981) Do Housing Allowances Work? Washington: Brookings Institution.
of outcomes from even a glaringly simple process, how can we assume that the data
we observe (a single realisation of a complex process) suffice to specify and estimate a
convincing model?
“Apparently trivial non-linear models....give rise to a surprisingly rich
diversity of mathematical behaviour ranging from stable equilibrium
points, to stable oscillation between several points, through to a
completely chaotic regime...even aperiodic fluctuations. In biology we
are often asked to infer the nature of population development from a
single data set, yet different realizations of the same process can vary
enormously. Where fantasy takes over is in the belief that the
mathematically ...fine structure of deterministic chaotic solutions might
be observed...Any superimposed environmental noise, almost no matter
how small, destroys it.” 19
Renshaw concludes “Models..should therefore not be looked upon as providing a
literal representation of reality, but should be treated instead as a means to obtaining
greater understanding of the behavioural characteristics of the ecological processes
which they attempt to mimic.” (Renshaw, 1991:33)
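Renshaw’s dispersion result is easy to reproduce. The sketch below (in Python) simulates a continuous-time linear birth-death process via the Gillespie algorithm; reading his “probability of a birth = 1 and death = .5” as per-capita rates is my assumption, and the aim is the spread of outcomes, not an exact match to his numbers.

import numpy as np

rng = np.random.default_rng(3)

def birth_death(pop=3, t_end=6.0, birth=1.0, death=0.5):
    # One realisation of a continuous-time linear birth-death process
    # (Gillespie algorithm); per-capita rates are an assumed reading of
    # Renshaw's setup.
    t = 0.0
    while pop > 0:
        t += rng.exponential(1.0 / (pop * (birth + death)))
        if t > t_end:
            break
        if rng.random() < birth / (birth + death):
            pop += 1
        else:
            pop -= 1
    return pop

print(sorted(birth_death() for _ in range(20)))
# Twenty realisations of the same process typically range from 0
# (extinct) to several hundred -- dispersion of the kind Renshaw reports.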
Much instability is due to delays in responses within the system.
“Delays are likely to cause oscillations....(in ways similar to the delays that) mean that it is possible to alternatively freeze and scald oneself (in a shower). This can result in dampened oscillations, or non-dampened (expanding cycles), or two point or other cycles, or chaotic oscillations....(with fast growth)...we are in for surprises...chaos means the system has gone out of control. There is no way to predict its long-term behaviour even though the process is completely determined at its initial value.” (Renshaw, 1991:88,105)
The issue with endangered species is often “Is some human activity causing population
decline or is it merely a natural fluctuation?” Renshaw answers, “We sometimes can’t
tell for sure.” The same uncertainty is inevitable with respect to environmental
questions affecting possible human extinction such as upsetting the earth’s temperature
or incident radiation regimes through atmospheric changes.
“If one takes the view that simple mathematical models can reflect only basic
qualitative phenomena of complicated biological structures, such as chaos or limit
cycles, then use of quantitative fitting techniques in non-linear systems may well be
overambitious.” (Renshaw, 1991:113) Nevertheless, Renshaw considers the parameter
estimates to contain “valuable information.” It is just that you wouldn’t be surprised if
you were wrong.
Renshaw sounds like a German Historical School economist:
“The fascination of natural communities of plants and animals lies in their endless variety. Not only do no two places share identical histories, climates, or topography, but also climate and other environmental factors are constantly fluctuating. Such systems will therefore not exhibit the crisp determinacy which characterizes so much of the physical sciences.” And “Understanding will not generally be enhanced by employing “clever” mathematical techniques to develop large and complicated probability solutions. These are often totally opaque as far as revealing any insight into the underlying stochastic processes concerned.” (Renshaw, 1991:223,382)
19 Eric Renshaw (1991) Modelling Biological Populations in Space and Time. Cambridge: Cambridge University Press, p. 4.
Surely these comments would be equally applicable to attempts to model real estate
markets.
Leamer’s “Let’s Take the Con out of Econometrics”
Leamer begins his provocatively titled 1983 article by pointing out the consequences of
non-randomisation--the fact that without experimental control groups, one cannot
isolate the effects of variables. His solution is essentially Baysian “One must decide
independent of the data how good the non-experiment is.” 20 He recommends use of
“prior information that effectively constrains the ranges of the parameters.” Because of
the necessity for these restrictions, Leamer calls the idea of scientific objectivity “utter
nonsense” Many are troubled by the Baysian approach which seems to give too large a
role to subjective input from the investigator, replacing one source of uncertainty with
another.
Leamer quotes Coase’s jest that if you torture the data long enough it will confess.
“The econometric art as it is practiced involves fitting many, perhaps thousands of
statistical models. One or several that the researcher finds pleasing are selected for
reporting purposes.” (Leamer, 1983:36) Kennedy agrees: “Journals, through their
editorial policies, engage in some selection, which in turn stimulates extensive model
searching and prescreening by prospective authors. Since this process is well known
to professional readers, the reported results are widely regarded to overstate the
precision of the estimates, and probably distort them as well. As a consequence,
statistical analyses are either greatly discounted or completely ignored.” (Kennedy,
1992:84)
Charemza and Deadman provide a rule of thumb: if there are c candidate variables and k are included in the model, with significance level nominally supposed to be α, then the probability of rejecting a true null is about (c/k)α. If p=.05 and there are 10 candidate variables and one chosen, the true p≅.50 rather than .05.21 Formally speaking, models found by such data mining are not empirically validated--the null of no effect would not have been rejected had probabilities been correctly calculated.
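A small Monte Carlo sketch (my illustration; sample size, candidate count, and seed are arbitrary) makes the Charemza and Deadman arithmetic concrete: regress pure noise on each of ten irrelevant candidates, report only the best, and the nominal 5% test rejects a true null roughly 40% of the time--of the order of the (c/k)α = .50 rule of thumb rather than .05.

import numpy as np

rng = np.random.default_rng(7)

def abs_t(x, y):
    # |t| for the slope of a simple OLS regression of y on x.
    xc, yc = x - x.mean(), y - y.mean()
    b = (xc @ yc) / (xc @ xc)
    resid = yc - b * xc
    se = np.sqrt(resid @ resid / (len(y) - 2) / (xc @ xc))
    return abs(b / se)

n, c, trials, crit = 50, 10, 2000, 2.01   # crit ~ two-sided 5% t, 48 df
hits = 0
for _ in range(trials):
    y = rng.normal(size=n)                 # pure noise: every null is true
    X = rng.normal(size=(n, c))            # ten irrelevant candidates
    # "Specification search": report only the best-fitting candidate.
    if max(abs_t(X[:, j], y) for j in range(c)) > crit:
        hits += 1

print(f"nominal alpha .05, actual rejection rate ~ {hits / trials:.2f}")
# Typically about .40, in the neighbourhood of the quoted rule of thumb.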
Moreover, one can, at a given p value, make results “significant” by increasing sample size enough to detect small differences. Statistical significance is commonly confused with practical significance.
One can distinguish between a context of discovery and of validation. Data should be
mined for the best model, but the statistics (hence the model) are unreliable and this
should be admitted. Presumably the statistics from testing the same model with fresh
data would be interpretable. But this apparent solution boils down to an inescapable
dilemma. To get new information from data one must test more than one model, but
20 Edward Leamer (1983) “Let’s Take the Con Out of Econometrics” American Economic Review V.73 #1: 33.
21 Charemza and Deadman, quoted in Lovell, 1992: 22.
doing so spoils inferences. Once found, a model could, in a frequentist world, be
confirmed by fresh data, but unfortunately, it is impossible to repeat most economic
outcomes (time series data) in the world we actually experience.
The view taken in this paper is “Why should we expect coefficients to remain stable in
changing processes?” Even Leamer’s “fragility of inferences” testing implicitly retains
the Platonic/Newtonian metaphysical assumption of time/space invariant processes.
The real problem is harder still. In addition to finding a model consistent with data, we
have to also model process change over time.
Spurious Correlations and Confounded Variables
Granger and Newbold (1974) found that 75% of the time, random walks generated by
a computer showed spurious relationships.22 Hendry found higher R² when inflation is
explained by cumulative rainfall than when inflation is explained by money supply.
(Hendry, 1993) Time series containing trends will be correlated with other trended
variables, regardless of causation. Random walks include stochastic trends because
they are “long memory” or “unit root” processes. McAleer warns “High R² does not
prove a model is adequate--everything can be wrong despite high R².”23
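The Granger and Newbold result can be replicated in a few lines (a sketch; the sample size and critical value are my choices): regress one computer-generated random walk on another, independent, one and a naive t test declares the relationship “significant” far more often than the nominal 5% of the time.

import numpy as np

rng = np.random.default_rng(2)

def abs_t(x, y):
    # |t| for the slope of a simple OLS regression of y on x.
    xc, yc = x - x.mean(), y - y.mean()
    b = (xc @ yc) / (xc @ xc)
    resid = yc - b * xc
    se = np.sqrt(resid @ resid / (len(y) - 2) / (xc @ xc))
    return abs(b / se)

n, trials = 100, 1000
spurious = sum(
    abs_t(np.cumsum(rng.normal(size=n)),    # two independent random
          np.cumsum(rng.normal(size=n)))    # walks: no causal link at all
    > 1.98                                  # naive two-sided 5% t test
    for _ in range(trials))
print(f"'significant' in {spurious / trials:.0%} of regressions")
# Typically three quarters or more, echoing Granger and Newbold:
# trending series correlate regardless of causation.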
In time series data, time may be confounded with both the dependent variable and some irrelevant or nonsense predictor to produce a spurious model. But in cross-sectional data, all that is required for a spurious relationship is a similar x--y--z, x--z confusion. The claim that x caused z is false; x really was associated with y, which caused z, so the x--z relationship is spurious. In an experiment, effects of y would have been eliminated through controlled conditions and random assignment, and the x--z causal relationship, if any, revealed.
The problem of confounded variables is especially severe in economics because many
variables are correlated. Real prices may be pro-cyclical, as is demand. Although
theory tells us that sales should drop when prices increase, office market regressions of
demand on price (real rents), for example, often show the “wrong” sign. It follows
that even in cases where the sign is “right” but there is no experimental control to
isolate a single variable’s effects, we can’t be sure whether or not a coefficient is
distorted by intervening variables.
Hendry’s General to Specific Strategy
Hendry proposes to solve the data mining problem by starting with a “general” model
including more variables and lags and functional forms than expected in the final
model. Then variables are eliminated until a more parsimonious form is found. The
claim is that omitted variable bias is thereby minimised and coefficients and t statistics
remain meaningful, since the deleted variables had no effect.
Hendry’s approach strikes me as a systematic form of data mining, rather than a
solution. Shouldn’t an honest statistician adjust t scores of remaining variables
downwards each time a variable proves insignificant and is removed from the
equation? Would not this in many cases destroy the ability to detect effects? And,
even so-called “general” models are highly restricted. It isn’t possible to include all
22 This account of Granger and Newbold’s 1974 paper is from Griffiths, Judge, and Hill (1993), pp. 696-697.
23 Lecture on time series methods, 1996.
functional forms, lags, interaction terms, or candidate variables given limited degrees
of freedom. How can one unambiguously “test down” to the smaller model, given that bias is possible even in the most general specification? And, with the small data sets
available in real estate research, when one runs a general model all of the coefficients
may be insignificant.
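For concreteness, here is a deliberately naive sketch of the mechanics under discussion (my stylisation, not Hendry’s actual procedure, which also demands diagnostic testing at every stage). Note that the survivors’ t statistics are reported as if no search had taken place--exactly the objection raised above.

import numpy as np

def testing_down(X, y, names, t_crit=2.0):
    # Naive "general to specific": re-estimate OLS, drop the regressor
    # with the smallest |t|, repeat until every survivor passes t_crit.
    keep = list(range(X.shape[1]))
    while keep:
        Xk = X[:, keep]
        XtX_inv = np.linalg.inv(Xk.T @ Xk)
        b = XtX_inv @ Xk.T @ y
        resid = y - Xk @ b
        s2 = resid @ resid / (len(y) - len(keep))
        t = np.abs(b) / np.sqrt(s2 * np.diag(XtX_inv))
        if t.min() >= t_crit:
            break
        keep.pop(int(t.argmin()))
    return [names[j] for j in keep]

rng = np.random.default_rng(5)
X = rng.normal(size=(80, 6))
y = 1.5 * X[:, 0] - 1.0 * X[:, 3] + rng.normal(size=80)   # invented DGP
print(testing_down(X, y, list("abcdef")))   # typically ['a', 'd']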
Rational Expectations
Financial pages commonly feature headlines like “Profits rose, but sharemarkets fell
because the market had expected better numbers.” Causal variables may have opposite
effects to those predicted by theory, if expectations are out of line with results or
expectations change. By divorcing current outcomes from current causal variables,
rational expectations makes model dynamics exceedingly difficult to identify. Data
does not usually include the expectations of decisionmakers. Using office markets as
an example, how can one detect whether rents now are based on market conditions
now or market conditions anticipated three years from now?24 Worse, the lag
structure relating expectations to fundamentals may change. Perhaps in the past office
markets reacted to current conditions, but they’ve learned their lesson and are now
moved by information on fundamentals two years ahead. Qualitative paradigm
research (talking to people) seems a necessary precursor to data analysis.
Complexity of Simplicity
Simplicity and parsimony are among the major desiderata of theories. Keuzenkamp and McAleer (K&M) begin their discussion of simplicity by quoting the 14th-century Occam’s razor: “Entities are not to be multiplied beyond necessity. It is vain to do by many what can be done by fewer (the law of parsimony).” (Keuzenkamp and McAleer, 1995)
Dharmapala and McAleer note that while a simple model may be useful, it clearly cannot be complete. The question is whether leaving details out matters or not. Karl Popper’s key question “Is it falsifiable via hypothesis tests?” is only an arbitrary way of looking at things. A model can be both true (some degree of fit with data), while it must also be untrue (e.g. incomplete). Concerns include model elegance, fit to sample
data, ability to pass tests of its statistical properties, parsimony, and usefulness. A
particular representation might be incomplete, but useful, or more complete and
irrelevant, for example, due to lack of data. (Dharmapala and McAleer, 1995)
K&M point out that simplicity and parsimony involve several dimensions including
number of parameters, degree and order of variables, computational simplicity, and
testing simplicity. The best model by one criterion may not be the best model with
respect to others. And, “As the number of independent constituents of a system
together with the laws of necessary connection become more numerous, inductive
arguments become less applicable.” (K&M, 1995:279) The Duhem-Quine thesis says
that negative test results do not invalidate a theory where causation is complex--some
part of the theory might be adjusted to save the rest.
24 Pat Hendershott has proposed that with the institutional constraint of long term leases, current office rents should be struck at a level equal to the average of real rent expectations over the ten year lease term. Testing all lags and leads, an empirical way of identifying dynamics, runs the risk of finding results due to chance that could not be generalised outside the sample--the data mining problem.
Nothing guarantees a DGP (a data generating process--a metaphysical concept) will be stable. Although constant
parameters may be desirable...this is not the case because human nature or society is
not stable. Models with time varying parameters may be helpful in the social sciences.
“The frequentist interpretation of probability, which presumes the existence of a stable
universe....does not apply to the social domain. This does not make inference
impossible, but one should explicitly acknowledge the cognitive limitations of the
theory of inference that is used.” (K&M, 1995)
K&M say that in applications one is forced to test subsets of systems or models, use
successive approximations and (contrary to Hendry) work from simple results to more
complex understandings. “Subjective judgements matter and so does the context of
inference.” The purpose of modelling matters. The question of whether the aspects of
nature of current interest are simple or complex certainly has something to do with
whether simplicity is desirable in theories. And, what might be safely ignored in a
parsimonious short term model might enter longer term models.
Epistemological revision in the physical sciences and mathematics
While the social sciences were adopting quantitative methods imitating 18th century
physical sciences, the physical sciences themselves moved beyond a Newtonian
deterministic worldview. Heisenberg’s famous Uncertainty Principle is interpreted by
Murray Gell-Mann as introducing randomness originating at the subatomic level into
all processes.25 Russell Hanson pointed out in the 1950s that categories and concepts
lead to theory-laden data, so there is no objective information, even in physics.
information implies choices and conceptions, which implies preferences and tells
something about the observer as well as the observed.
Mathematician John Casti summarises some of the major sources of uncertainty:
a) Catastrophe theory “corresponds to those parameter values where the fixed point governing the system’s behaviour shifts from being a stable attractor to an unstable one. This is how a small change in something can lead to a discontinuous shift,” like a sharemarket crash.26
b) Deterministic chaos. For some simple mathematical expressions, changes in initial conditions smaller than measurement errors may result in large differences in outcomes. (A sketch follows this list.)
c) Turing machines and the halting theorem. Gödel and Turing showed that following a set of unambiguous rules doesn’t always lead to a clear outcome. They proved there are problems without solutions.
d) Irreducibility means that the whole may be different from the sum of the parts. Emergent properties arise as system complexity increases.
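Point (b) can be demonstrated with the textbook logistic map (a standard example, not Casti’s specific one): two trajectories whose starting values differ by 10^-10, far below any conceivable measurement error, become unrelated within about forty iterations.

# Logistic map x(t+1) = r*x(t)*(1 - x(t)) in its chaotic regime, r = 4.
r = 4.0
x1, x2 = 0.3, 0.3 + 1e-10   # initial difference far below measurement error
for t in range(1, 61):
    x1 = r * x1 * (1 - x1)
    x2 = r * x2 * (1 - x2)
    if t % 10 == 0:
        print(f"t = {t:2d}   x = {x1:.6f}   x' = {x2:.6f}")
# By about t = 40 the two trajectories bear no relation to one another,
# although the rule generating them is perfectly deterministic.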
Forrester and others, beginning with control theory developed for electrical
engineering applications, developed a system dynamics (SD) methodology emphasising
physical process time lags and feedback corrections. Forrester claimed SD simulation
models could do better than statistical approaches in modelling complex, non-linear
25 Gell-Mann won the Nobel prize for proposing “quarks” and founded the Santa Fe Institute for the study of the mathematics of complexity.
26 John Casti (1994) Complexification. New York: Harper-Collins Publishers, p. 53.
systems.27 But the famous Club of Rome World Model, which predicted collapse of
human population within 100 years of its 1970 publication date, was roundly criticised
as flawed in that conclusions were implicit in arbitrarily chosen model structure and
assumptions.28 Some SD modellers have retreated to “systems thinking” whereby
models are used to learn about systems without committing the reification fallacy--confusing the model with reality. Ecological modellers use SD ideas like time lags and
positive and negative feedback control, as well as observed population “limiting
factors,” carrying capacity, and threshold effects. Logistic equations (to reflect limits)
and cycles reflecting time lags and overshooting, are common in such models.
Ecological modellers have confronted the same difficulties as social scientists--defeat
of models by complexity--as indicated by Renshaw’s comments.
Cornish et al. cite examples of unintended and unforeseen chains of causation in
complex systems.29 In complex systems, no one can foresee all these consequent
outcomes beforehand, including system dynamics modellers. Clearly the simple linear
models used in applied econometric work in real estate are very far from correctly
representing the mathematics of the systems of interest. The rapid increase in interaction terms as variables are added, and the rapid increase in possible functional forms, overwhelm the data. The complexity of systems leads to complexity in
mathematics, which makes specification and estimation impossible given that we live in
an historical, one time only world of limited and costly data.
Reasons why the metaphysical assumption of time invariant Newtonian order in data
generating processes cannot be maintained in social science modelling include:
Openness to exogenous shocks.
Systems analysts define a “closed” system as one with no transactions across the
system boundary. Social processes are open to exogenous influences. Events such as
wars, droughts, elections, market crashes, mineral discoveries, technological
innovations, etc. may influence events. These exogenous influences can change not
just parameter values, but the data generating process itself. New variables are
introduced, formerly important variables may become insignificant. Nobel Prize-winning
biologist Konrad Lorenz wrote “unforetellability is...an inalienable aspect of everything
living” because living systems are open.30
Indeterminacy.
Many issues of interest cannot be included in models because they haven't been
decided as yet. They aren't merely unknown; they can't be known because they have
not yet been determined. People change their minds, take decisions, find new
information, and so on in ways that make it impossible, in principle to know what will
happen in advance of the event. Greenspan himself does not know what he will do
27
Forrester sounds very Newtonian in his early claims. His mathematics is deterministic, although it
is non-linear and dynamic rather than linear and static like most economic models.
28
The World modellers revisited the topic in 1990 (Beyond the Limits), repeating their warnings, but
softening the conclusions by admitting that everything is conditional upon human activities and other
sources of uncertainty. Their original model projected implications of current trends such as
population growth rates and technology, but those trends can (with some time lags and difficulty) be
changed.
29
Edward Cornish, et al. The Study of the Future (1977) Washington: World Future Society.
30
Lorenz, Konrad (1987)The Waning of Humaneness, Unwin Paperbacks, London, p. 242
16
next week, so a model must recognise indeterminacy to accurately represent the
system.
Cognition and learned behaviour.
Human behaviour is "plastic" rather than fixed by genetics. A wolf, clearly, is meant
to make a living pulling down deer by hunting in packs. It is not so clear what humans
are meant to do for a living in an artificial world of their own manufacture. This
means we cannot take for granted that what is will remain, nor that what is should be.
People can decide to do things differently. We learn from experience. These are not
properties of atomic particles or chemical reactions. Molecules do not have epiphanies
on the road to Damascus that cause them to change behaviours. Gordon expresses the
importance of values and normative behaviour in human systems nicely:
“When a positive proposition fails to be supported by empirical
evidence, the proposition is called into question; but when a normative
proposition is at odds with the state of the world, the state of the world
is called into question. Put somewhat differently, when a person’s
positive beliefs do not agree with the facts, he is rationally obliged to
change his beliefs, but when the facts do not agree with a person’s
normative beliefs he is morally obliged to change the facts if he can.
The member of the Flat Earth society should change his geographical
theory; the thief should change his conduct.” (1991:56)
Since much of the social world is unacceptable by common moral standards (millions
of starving children, wars, ecological destruction, etc.) a purely positivist science is
irresponsible.
Irrationality, free will.
Although much of Freudian psychology is discredited or supplanted by more
convincing theories, clearly some notion of unconscious mind is supported by evidence
such as dreams, sudden insights, involuntary reactions such as stress responses, and so
on. Conscious thought is only a portion of what the nervous system accomplishes. If
unconscious thought influences our actions, how can we put these processes in a
model?
Western religious traditions, instead of claiming everything we do maximises our
utility, speak of "sin" and "moral choice" and say that to err is human and inevitable.31
Clearly a good share of behaviour is irrational in some sense of the word: gambling, smoking cigarettes, and destroying the ozone layer all have negative long-term utilities.
During what time period do we maximise utility? Do future generations count? Does
one maximise utility for oneself, for one's family, for one's country, for humanity, for
humanity including future generations, for humanity and the other species on earth, for
God? Whichever notion of rational behaviour is chosen, some people will choose to
"cheat" or will make less than optimal choices.32 If irrational choices are within the
31
The Catholic Church expects sin to be so regular that confession of sins is part of weekly Church
practise. The doctrine of “original sin,” Adam and Eve’s transgression, is a foundation of the
Christian conception of mankind.
32
Socrates thought the only reason for evil was ignorance, but Bart Simpson’s pet elephant, Stampy,
pushed the other elephants around just because he’s a jerk.
17
range of possible human behaviour, predictive modeling based on rational responses to
market fundamentals is problematic.
A rational market would not build excessive amounts of office space, but bribery,
corruption, egotism, or other unethical or irrational motives may contradict the
predictions of models based on assumptions of rational behaviour--or on any
assumptions about human behaviour. Free will subverts positive modeling by allowing
choices that may contradict predictions.
It is sensible, therefore, to think about forecasting in terms of “possible futures” rather
than “the future.”
Cornish et al. note that "possible futures are diverse, from
extinction to great prosperity and comfort.” The future world is plastic. “Human
beings are not moving toward a predetermined future world, but instead are active
participants in the creation of the future world.” (Cornish, 1977:91) Desired future
states can be causal variables if they prompt us to change our behaviour.
Rational expectations and strategic behaviour
Keuzenkamp and McAleer (1995) mention game theory, where "indeterminate decision
problems, due to non-computability" lead to situations where "Although the outcomes
of decisions need not be completely chaotic (namely random without following any
probability law), they may well be."

What you do depends on what I do, which depends upon what you do. The project
will be worth x (justifying building) if project y (a competitor) does not proceed, but
x - z (NPV < 0, meaning "do not build") if y goes ahead. y's payoffs are similar. What
is the project's value? The answer is literally indeterminate; "Circular reference" is the
error message from an Excel spreadsheet. Two issues, at least, reduce the robustness of
game theory predictions even in cases where the game's rules seem to lead to
equilibrium solutions: "In practice it is unlikely that the modeller will have specified
payoff functions that are exactly correct" and "relaxing the assumption that all payoffs
are known to other players." (Fudenberg and Tirole, 1992:480) In such situations, the
idea of a DGP or data generating process is a convenient metaphysical fiction.
Bounded rationality implies that some decisions are made on arbitrary grounds,
yielding arbitrary consequences.
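To make the indeterminacy concrete, the following minimal sketch (in Python, with purely hypothetical payoff numbers) encodes the two-developer entry game described above and enumerates its pure-strategy Nash equilibria:

```python
# A minimal sketch (hypothetical payoff numbers) of the entry game described
# above. If only one developer builds, the builder earns x; if both build,
# oversupply leaves each with x - z < 0; if neither builds, both earn 0.
# Enumerating pure-strategy Nash equilibria exposes the indeterminacy.

from itertools import product

x, z = 10.0, 15.0  # hypothetical: x - z < 0, so joint building is unprofitable

# payoffs[(a, b)] = (payoff to developer 1, payoff to developer 2)
payoffs = {
    ("build", "build"): (x - z, x - z),
    ("build", "wait"):  (x, 0.0),
    ("wait", "build"):  (0.0, x),
    ("wait", "wait"):   (0.0, 0.0),
}

def is_nash(a, b):
    """True if neither developer gains by deviating unilaterally."""
    p1, p2 = payoffs[(a, b)]
    best1 = max(payoffs[(alt, b)][0] for alt in ("build", "wait"))
    best2 = max(payoffs[(a, alt)][1] for alt in ("build", "wait"))
    return p1 >= best1 and p2 >= best2

equilibria = [cell for cell in product(("build", "wait"), repeat=2) if is_nash(*cell)]
print(equilibria)  # [('build', 'wait'), ('wait', 'build')]
```

Both (build, wait) and (wait, build) survive as equilibria; nothing in the game's structure selects between them, which is precisely the sense in which the project's value is indeterminate before the players move.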
Reflexivity
If we had a model that could predict, and this model indicated current mis-pricing of
assets, markets would immediately adjust, thereby invalidating the model. Certain
kinds of predictions will “work” only if people don’t know about them. The expanding
edifice of knowledge collapses of its own weight. More control means less certainty of
outcomes. World financial markets seem to be demonstrating the irrationality of
reflexive systems at the moment. Socially irrational bubbles and busts are more likely
than equilibrium outcomes as models' predictive power increases, unless institutions
impose structure.
Creativity, technology change, evolution.
In applied social sciences, the systems of interest are constantly being changed and
reinvented. One of the major areas of indeterminacy involves inventions, discoveries
or new technology. These innovations are by definition unknown and unknowable in
advance or they wouldn’t be new discoveries. One cannot logically predict new
technology. If one knew what undiscovered technology would be, it would not be
undiscovered. One may make educated guesses, but one cannot know for certain
whether new inventions will make it easier to bring workers to offices (e.g.
transportation innovations) or communications technology will make it cheaper not to
bring them to offices (e.g. home office innovations).33
Creativity is a broader term encompassing any sort of innovation, not just science and
technology.34 With respect to office markets, for example, innovations might include a
new location for office space (a decision to build in an area near the center city which
previously had no offices), a new building design (an energy-efficient tower), or an
aesthetic concept (an architect who decides to build a blue building). We know that
aesthetics can have a major impact on economic outcomes--people may pay higher
rents for space in an attractive building. Whether a creative solution will a) be found,
b) be successful, or c) be a flop is often uncertain in advance of implementation. From
a modeling perspective, we have a wild card that we cannot quantify.
Historical uniqueness.
In nature we find some processes that remain quite stable over long periods of time,
perhaps even for all of time, and others where, as Heraclitus put it, "You can't put your
foot in the same river twice," because the river will have changed. Heraclitus (opposite
to Plato) regarded change as the essential metaphysical assumption.35
The German Historical School economists of the 19th century recommended gathering
comprehensive data about particular situations as an aid to understanding and policy
intervention. Oddly, very few published studies of real estate acknowledge the obvious
fact that real estate is an historical process.
Reductionism and emergent properties
Analysis always takes place at some level of generality. In physical sciences one may
have a cosmic theory, a theory of things our size, or a theory of subatomic particles--
each of which might describe the same phenomena. Which level is chosen makes
considerable difference. In economics, the question is often "who counts." Often
economists write as if social costs and benefits are the relevant criterion of efficiency.
But individually optimal outcomes do not necessarily coincide with social optima. Agents may
earn fees in office markets, for example, and thereby have incentives to promote
projects even when investors are likely to lose money. Institutional mechanisms to
prevent harm through individual misconduct are continually tested by scams, schemes,
abuses, corruption, and institutional favouritism of particular interests above social
interests. Therefore, it is impossible to talk about either individual or social rationality
without discussing institutional constraints.
Conclusion
The saying, “If all you have is a hammer, everything looks like a nail” applies to the
use of econometric methods without enough attention to qualitative issues, data
quality, system behaviour (lags and feedback control), and institutional context. This
does not denigrate the hammer--econometric methods are popular precisely because
they are so flexible, powerful, and appealing. But no matter how useful one tool is,
using a variety of methods will often build a better solution. One by-product of
integrating econometrics with other research paradigms in applications would be, I
think, more credibility and wider use of econometrics. Borrowing a diverse toolkit and
combining methods may be among the best candidates for a real estate research
paradigm. Part II assembles some ideas for paradigm diversification--many of which
echo the case study, Historical School approach.

33 Popper's Poverty of Historicism made this point, as well as the points above about free will and
reflexivity.

34 James R. Evans (1991) Creative Thinking. Cincinnati, OH: South-Western Publishing, is an
introduction to some ideas about fostering creativity.

35 Cited by Gordon, 1991:154.
Part II Towards Improved Social Science Paradigms of Inquiry
Introduction
Thomas Kuhn pointed out that for a paradigm revolution to occur, it is not enough
that the current paradigm have logical problems or difficulties encompassing empirical
observations. There also has to be some more successful paradigm as an alternative.
It seems to me that we are in a period of methodological pluralism--there are several
paradigms that may offer better opportunities. Combining methods may decrease the
depth of the analysis in each area in comparison to narrowly focussed studies. But
synergy developed through a dialogue between different approaches justifies the
difficulties of eclectic research methods.
An analogy that appeals to me is that in building a house the usefulness of each tool is
enhanced by the application of other tools. A useable house could not be made with
just one tool. In the same way, it seems to me, an eclectic research methods toolkit
extends our understanding and ability to solve a practical problem.
This section begins with metaphysics--beliefs about processes studied in social science
research. Assumptions of simplicity and time invariance are replaced with assumptions
of process change and complexity. Then there are suggestions to “do econometrics
properly” in this sort of world. A major theme is to broaden inquiry so as to take
account of important issues omitted from positivist research. Rather than assuming a
simple model adequately represents a system, investigators may undertake a
multi-disciplinary "complexification search" to identify all the issues relevant in a given
situation. Paradigms of inquiry should include issues arising both earlier in the research
process--creativity, values--and later--the social roles of researchers and
implementation.
Changing Metaphysical Assumptions
Historian of religion Karen Armstrong remarks, “Plato was convinced that the divine
world was static and changeless. The Greeks saw movement and change as signs of
inferior reality: something that had true identity remained always the same,
characterized by permanence and immutability.”36 Plato must have hated the political
changes leading to the death of Socrates. Clearly, Newtonian deterministic science is
Platonist and consistent with Deism, which reconciled science with religion. Scientists
studied God's unchanging laws, as Malthus stated. Bryan Appleyard wrote, "The
platonic roots of science point to a deeper truth--that science itself is a form of
mysticism." (1992:63)
Bertolt Brecht felt differently about stability and order than Plato, perhaps because
Brecht grew up under a class-bound military dictatorship. For Brecht, change
represented a chance at improving one's lot. Brecht wrote "Wenn das bleibt, was ist,
seid ihr verloren": "If that which is remains, you are lost. Your friend is change, your
companion in the fight is ambiguity." Timelessness, for Plato, meant preserving
Socrates' truths despite Socrates' death, a triumph for eternal values. Order, for
Brecht, meant merely continued oppression and stagnation; his society required change
for human values to flower.

36 Karen Armstrong, A History of God, p. 35.
Assuming either order or disorder in data generating processes requires a metaphysical
assumption about events not observable (i.e., outside the sample data). The current
paradigm implicitly assumes time-invariant order (breaks or time-varying parameters
are sometimes explicitly modelled, but the assumption is that unless we say it changes,
the structure of the process is time invariant). That is the basic assumption required to fit
one model to diverse outcomes (a sample). A Brecht-like metaphysical assumption
that processes are disorderly until proven otherwise would be more empirically
defensible when the subject matter is designed human systems. This would improve
econometrics by directing attention to study of process change and would help explain
out of sample behaviour inconsistent with models.
Classify degree of disorder in systems
Processes differ greatly in their degree of orderliness. Consider four different types of
data generating processes, classified by their degree of order:
A. Completely determined processes
For Newton's second law of motion, F = MA (force equals mass times acceleration),
error is conceptually zero due to a deterministic mathematical relationship between
variables. As instrumentation improves, additional decimal places are added to the
variables' precision, but the functional form remains intact as established "theory".
B. Well behaved stochastic processes
With orderly stochastic processes the error distribution is known and stable. This is
the world assumed by most applied economics, including real estate research,
represented as:
y = bx + e, where the errors e are iid N(0,1)
C. Unstable stochastic processes
Consider an evolving, non-linear, changeable world, with model structure and
parameter estimates changing over time. Now we have something like:
y + u = b_k X_k + c_j Z_j + d XX + e ZZ + f XZ + e_x + e_z + e_r
Simultaneity means this model is a system of equations with each variable a function of
all other variables. The term u is an error of measurement or conceptualisation on the
left hand side (perhaps y is a poor proxy). In a changing world, values and preferences
evolve, so both sides of the equation are moving targets.

On the right hand side, some variables (X) are known, but some (Z) are unknown
(omitted), and the functional forms are probably not all linear. Interactions of various
types are represented by the XX, ZZ, and XZ terms, although possible non-linear
interaction terms are omitted. Some variables' effects will be latent, meaning there is no
variation in the sample, but over a longer term, variation could occur and affect the
dependent variable.
Errors arise from several sources--random error, errors in measurement of X
variables, errors associated with unknown omitted variables, or structural instability.
Errors and thus confidence intervals are uncertain.
Explicitly acknowledging the various sources of error and uncertainty, rather than
lumping them all together in an error term and assuming the errors follow a certain
distribution, allows us to discuss reasons for model forecast failures, and therefore may
help us forecast better.
As sample size increases, diversity of data increases, and new sources of error enter the
model. If there are 20 years of data, one cannot assume that processes are structurally
stable enough so that the estimates obtained from the full data set are more useful than
those from a more uniform (recent?) subset of data.
D. Structureless processes
We can imagine processes with no structure and unknown errors, where no available
information can reduce uncertainty. Modelling or forecasting such processes is
impossible, so one must shift to risk management or goal-seeking/feedback-correction
modes of analysis.
Most academic research in real estate assumes a well behaved stochastic world like
“B.” Models mislead rather than inform when a simple model is accepted as
adequately representing a complex, changing, unknowable process. Poor results come
from pretending to know what is unknowable.
The degree of disorder in the process of interest is an empirical question. The
empirically observable world of real estate markets resembles "C," with uncomfortably
large and inescapable residual error in any model, and considerable uncertainty
regarding model structure, parameter estimates, and error distributions. There is
structure (albeit evolving structure) present, making modelling a potentially useful
activity. But this modelling would be far more credible if it began with more realistic
assumptions about what models can and cannot deliver relative to such ill-behaved
processes. For one thing, if the DGP is a “C” type process, it seems sensible to
respecify and re-estimate models afresh for each case, that is, to adopt case study
methods rather than seek generalisations. Rather than regarding parameter instability
as necessarily resulting from a “bad” model, we should be open to the idea that
instability may be a correct representation of a changing system.
A number of implications follow from changing metaphysical assumptions from order
to disorder:
Diversifying Paradigms
German Historical School case studies, institutional economics, and other approaches
should be added to the toolkit and published in the journals.
Historical perspective.
Historical events will never be repeated. Process change argues for case study
methods. Rather than reporting coefficients and their standard errors as if some
universally applicable discovery had been made, we should specify when and where
results apply and how limited generalisation may be. Structural stability tests, testing
"break" dummies, and rolling regressions indicating the range of coefficient variation in
different sub-samples should be standard procedure. The best sample may often not be
"all the data available" but rather the most uniform subset of data to model the process
of interest (i.e. current system behaviour as opposed to past behaviour). Exploring
dynamics should be more prominent in specification searches. "Tuning" forecasts by
ad hoc adjustment of estimates to reflect non-sample information may improve models.
Incorporating non-statistical, one-off descriptive data, a la the German Historical
School, or proposing institutional remedies to certain sources of uncertainty (i.e.
insure, hedge, or allocate risks through contracts) may be useful.
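As an illustration of why such checks matter, here is a minimal sketch (simulated data; all numbers hypothetical) of a rolling regression applied to a process with a structural break. The full-sample coefficient averages two regimes and describes neither, while the rolling estimates expose the instability:

```python
# A minimal sketch of the rolling-regression check suggested above, using
# simulated data (all numbers hypothetical) in place of a real market series.
# The data generating process drifts: the slope b changes mid-sample, and the
# rolling estimates reveal the instability that a full-sample fit would hide.

import numpy as np

rng = np.random.default_rng(0)
n, window = 120, 40
x = rng.normal(size=n)
b = np.where(np.arange(n) < 60, 1.0, 2.5)   # structural break at t = 60
y = b * x + rng.normal(scale=0.5, size=n)

def ols_slope(xw, yw):
    """Slope of y on x (with intercept) for one window of data."""
    X = np.column_stack([np.ones_like(xw), xw])
    beta, *_ = np.linalg.lstsq(X, yw, rcond=None)
    return beta[1]

full_sample = ols_slope(x, y)
rolling = [ols_slope(x[t:t + window], y[t:t + window])
           for t in range(n - window + 1)]

print(f"full-sample slope: {full_sample:.2f}")                        # a misleading average
print(f"rolling slopes: {min(rolling):.2f} to {max(rolling):.2f}")   # drift exposed
```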
New "side conditions"37 or possible explanatory variables that might "matter" come to
the fore in different periods. Definition of variables may change over time. It isn’t
clear that "office space" means the same thing in the 1990's smart building as it did in
the 1970's paper-based office. This year the world is worried about Japanese banks,
next year it may worry about something else. All this argues for historical case studies.
Models that move in the direction of historical perspective would include more
interaction terms. Several property valuers have pointed out to the author that the
value of a particular property feature "depends"--for example, the value of a swimming
pool depends on climate. Many processes exhibit threshold effects. If case studies
entail a loss because we cannot safely generalise to the next unique situation, there is
also a considerable gain in information made possible by the historical perspective.
We can look at the rich and diverse issues affecting each particular decision, because,
once generality is given up as a lost cause, we are no longer confined to variables that
can be generalised across cases and more attention can be given to the interactions in
particular situations. The relevant information set can expand to include the mayor’s
personality, the easement through the alleyway, the quality of the design, and the likely
results of the next election, etc., with considerable improvement of the model’s
precision. The dominant paradigm rejects most of this as merely descriptive.
Institutions and “institutional transactions.”
Historical School political economists “rejected the abstract modeling of classical
economics and promoted the detailed study of economic history in a severely
descriptive mode, without utilizing the theoretical concepts and propositions that had
come to dominate English political economy.” (Gordon, 1991:466) Bromley, a 1990’s
Institutional economist, points out that so called positive economics takes place in an
institutional framework crucial to its results. For example, efficiency measures begin
with prices, whereas prices change with institutionally determined preferences and
income distributions. (Bromley, 1989)
Real estate takes place in an institutional context crucial to determining outcomes.
Planning controls, financial regulation, public infrastructure, tax treatment of real
estate, and legal frameworks are important institutional factors affecting real estate.
Property itself is an evolving institution. Yet institutions and institutional change (what
Bromley calls “institutional transactions”) are omitted from most positive economics
research, as if prices adjusted in a vacuum. Once we accept the idea of process
change, institutions become one of the major sources of process variation over time. It
is the purpose of institutions to change DGPs. Bromley summarises the institutionalist
point of view as "Rules matter." The system dynamics paradigm also recommends
consideration of possible system design and policy changes (Coyle, 1996).
37 Sometimes called "auxiliary conditions" or "necessary conditions."
Creative problem solving
Richard Feynman, the Nobel prize-winning physicist, pointed out that the Popperian
account of hypothesis testing failed to account for the creative process necessary to
expand knowledge into the unknown. Peter Medawar (1996), a Nobel prizewinner in
medicine, makes a similar point about the importance of innovation in science.
Innovation is also part of every real estate project--in design, finance, marketing, etc.
This is already a major shift from the positivist “dispassionately observe the data” way
of looking at the world. Real estate analysts, on the contrary, are expected to come up
with ideas that might be made feasible and then evaluate the likely consequences of
their invention.
Multi-disciplinary perspective
The holist looks for patterns and concepts “complex and rich in content, and their
content is close to ordinary experience with its emotive and subjective emphasis."38
Graaskamp’s insight that a comprehensive approach is needed would be obvious to
anyone whose money is at stake in a property investment. It is the total package in a
real estate project--design, finance, etc.--that creates success. There is synergism
between good performance in one subset of the problem--say, for example, site
selection--and success in other areas--e.g. attracting finance. Conversely, failure of any
subsystem could bring down the entire project.
Specialisation puts us in blinders when it comes to practical applications, where results
depend on a wide array of issues across many disciplines. The narrowness of
"finance" or "economics" perspectives in evaluating real estate projects may explain as
well as anything why so many real estate projects fail. But real estate is about unique
projects that are active investments in imperfect markets, not passive cash cows. In
real estate, risks are actively managed.
Holistic or multi-disciplinary approaches encompass at least three different aspects:
1. Marshalling diverse specialised expertise as required. A developer might hire
consultants on soils, hydrology, market research, interest rate futures, macro-economic
forecasting, and so on, to improve information necessary to creating, understanding,
and projecting project outcomes, both to improve decisions and to improve the project
itself.
2. Taking a generalist’s overview to integrate these diverse kinds of information into
an overall picture of the project to inform the decisions encountered in the
development process.
3. Due to complexity, the developer will "satisfice," in Herbert Simon's language,
rather than optimise (see the sketch below).39 Graaskamp's definition of project
feasibility includes the phrase "fit to a context of constraints."
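A minimal sketch of satisficing as a decision rule (the site data are invented for illustration): the search stops at the first alternative that meets minimum criteria, so the "best" option may never even be evaluated.

```python
# A minimal sketch (criteria and figures hypothetical) of Simon's "satisficing":
# accept the first alternative that meets minimum criteria, rather than
# searching every option for the optimum.

def satisfice(alternatives, meets_minimums):
    """Return the first acceptable alternative, not the best one."""
    for alt in alternatives:
        if meets_minimums(alt):
            return alt
    return None

sites = [
    {"name": "Site A", "npv": -50_000, "access": "good"},
    {"name": "Site B", "npv": 120_000, "access": "good"},   # accepted: first to pass
    {"name": "Site C", "npv": 400_000, "access": "good"},   # never evaluated
]

choice = satisfice(sites, lambda s: s["npv"] > 0 and s["access"] == "good")
print(choice["name"])  # "Site B" -- a satisfactory, not optimal, choice
```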
38 Diesing, cited in Reason and Rowan, 1981:186.

39 "Satisficing," a term coined by Herbert Simon in the 1960's, means accepting the first alternative
encountered that meets minimum criteria. The important insight here is that information costs and
time constraints make it impossible to optimise in most business settings.
For virtually any empirical “result” published in the academic literature, one could
imagine a set of conditions where the result would not hold. Only by putting
specialised studies in broader context can their full usefulness be obtained. Mellor
(1996) discusses this in terms of necessary and sufficient conditions. Gordon uses the
term “auxiliary conditions.” One cannot assume ceteris paribus. This makes it
necessary to shift to a case study approach to incorporate wider information into
modelling. The price is that one must give up generalisations.
Complexification searches--in applications details matter
If multi-disciplinary details have material impact on real world historical outcomes,
then rather than a model simplification search to find a general model, investigators (in
the German Historical School or Ely “look and see” method traditions) should perform
“complexification searches” to find all of the issues that will matter in a specific case.
Williams, a political scientist mindful that administrative details destroyed basically
well-conceived Great Society programs, called this "the implementation perspective."
Improving econometric methods
Much of the solution for quantitative modelling problems is to improve econometric
methods. Doing so may require combining statistical information with priors from
other methods including preliminary qualitative research.
Qualitative paradigm exploratory research
Tukey, Velleman, and others espouse freeing the investigator to be creative40 by
allowing the data to suggest new hypotheses, as well as to test old ones. An
exploratory approach goes back to the stage before data collection to examine the
process itself with few preconceptions. Key issues are identification of concepts and
variables and understanding of processes and relationships. From the infinite number
of possible issues to consider, the exploratory process attempts to create a provisional
model by sorting out the variables that matter and defining them in meaningful and
measurable ways. Early work should use a qualitative research paradigm (see below)
to assemble a wide range of what Forrester refers to as verbal, numerical, and mental
data.
Too often in social science research, a data set is acquired, or even primary data
gathered, by investigators with only a superficial knowledge of the relevant system.
The obvious hazard is that the system might operate quite differently than understood
by the investigator. Everything is context dependent, so we need to understand the
context. Bayesian priors should be acquired through what qualitative paradigm writers
call "prolonged engagement."
We see few published exploratory studies. Because emphasis is on statistical
innovations which give a leg up for publication, systems studied often receive less
attention from students than the software manual for the econometrics package. But
clearly if the hypotheses are not relevant--say the model excludes key variables
operating in the real system--the research misses the mark.41 In addition to traditional
literature research, exploratory studies for applied research may include discussions
with knowledgeable persons in the market, interviews with "experts" in relevant fields,
historical study, familiarisation with relevant institutional structures (examination of
laws, regulations, contracts, etc.), perhaps discussions with consumers and producers
in the market, wider reading of secondary materials, and general background studies.
As Richard Ely put it, "Look and see," until the system is at least partly understood
before proposing a model.

40 Paul Velleman, in the manual for his software package "Data Desk," advocates an interactive
relationship with data, citing Tukey.

41 An economics professor, apparently thinking of the importance of bread in ancient Rome, used
wheat as an example of a commodity with inelastic demand. Any farmer could have told him about
least-cost linear programs for chicken and hog rations. A small wheat price move can create a big
demand change, but not always--it depends on circumstances.
Data improvement and descriptive statistics
More attention should be devoted to identifying and defining relevant concepts and
variables and to collecting data. One often finds that information collected by sources
such as the census bureau is not precisely the data needed for applied research. Data
checking and cleaning are very important as are assessment of measurement errors.
How many of those who use census data in macro modelling, for example, have
explored sources of error in the data and obtained estimates of measurement errors?
Gujarati, for example, says that in the presence of measurement errors in the
explanatory variables, "OLS estimators are not only biased but also inconsistent, that
is, they remain biased even if the sample size N increases to infinity."42 It is often
possible to measure better by redefining variables to improve "construct validity",
gathering more data, spending more money to refine estimates of explanatory variable
values, or otherwise improving measurement.

42 Damodar N. Gujarati (1988) Basic Econometrics, Second Edition. New York: McGraw-Hill
Publishing Company, p. 418.
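The point is easily demonstrated by simulation. The following sketch (hypothetical parameters) reproduces the inconsistency Gujarati describes: with noise in the explanatory variable, the estimated slope converges to the wrong value no matter how large the sample grows:

```python
# A minimal simulation sketch (all parameters hypothetical) of the
# errors-in-variables problem: when the explanatory variable is measured with
# noise, the OLS slope is biased toward zero (attenuation), and the bias does
# not shrink as the sample grows.

import numpy as np

rng = np.random.default_rng(1)
true_b = 2.0

for n in (100, 10_000, 1_000_000):
    x_true = rng.normal(size=n)                  # the variable we mean to measure
    y = true_b * x_true + rng.normal(size=n)     # true relationship
    x_obs = x_true + rng.normal(size=n)          # observed with measurement error
    # OLS slope of y on x_obs = cov(x_obs, y) / var(x_obs)
    b_hat = np.cov(x_obs, y)[0, 1] / np.var(x_obs, ddof=1)
    print(n, round(b_hat, 3))   # stays near 1.0, not 2.0, however large n gets
```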
Data improvement is an area where cooperation between industry sources of data and
academics who have time and expertise to more thoroughly analyse data will prove
fruitful. Industry/academic collaboration is also relevant to exploratory research and
wider multi-disciplinary perspectives.
Industry/researcher cooperation is also
important in other stages of the research process, particularly in implementation of
research results.
Once data are acquired, more descriptive and graphical preliminary statistical work
will help reveal possible relationships in the system. Exploration of lag structures is
particularly important given adjustment over time and expectations. In presenting
results of studies, exploratory steps should be described in some detail to justify
definitions of relevant concepts, choice of variables, etc.
State of the art quantitative methods
Most published studies fail to use the best possible quantitative practice. Ong and
McAleer found most econometric work reports too few diagnostic statistics and details
about specification search procedures to be credible. A menu of issues not often
adequately treated would include: specification search steps, candidate variables,
exogeneity, checking lag structures and expectations (dynamics), interaction terms,
functional forms, stationarity testing, cointegration tests, diagnostic testing of
assumptions, checking for influential points and outliers, checking structural stability,
and use of appropriate estimation methods. Meta-analyses are also important to find
out the range of results found by different researchers. Readers will be aware of the
vast econometrics literature discussing these issues.
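By way of illustration, here is a minimal sketch of a few items from this menu, using simulated data and the statsmodels library (variable names are hypothetical). The point is that diagnostics, not just coefficients, belong in reported results:

```python
# A minimal sketch (simulated data, hypothetical variable names) of a few
# items from the menu above: fit a model, then report diagnostics for
# heteroskedasticity, residual autocorrelation, and influential points,
# rather than the coefficient table alone.

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(2)
n = 200
rent = rng.normal(size=n)                 # stand-in explanatory variable
vacancy = 5 - 1.5 * rent + rng.normal(scale=1 + 0.5 * np.abs(rent), size=n)

X = sm.add_constant(rent)
fit = sm.OLS(vacancy, X).fit()

lm_stat, lm_pval, _, _ = het_breuschpagan(fit.resid, X)
print("Breusch-Pagan p-value:", round(lm_pval, 4))            # heteroskedasticity check
print("Durbin-Watson:", round(durbin_watson(fit.resid), 2))   # autocorrelation check
print("max Cook's distance:", round(fit.get_influence().cooks_distance[0].max(), 3))
```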
Institutional context of social science research
A revised agenda for graduate students
The received “how to do research properly” instructions given to generations of
quantitative paradigm graduate students were, roughly speaking:
1. Choose a topic--generally something your adviser is interested in and a topic
previously addressed in the literature.
2. Survey the literature, meaning previous quantitative paradigm published work
closely related to the topic.
3. Formulate testable (falsifiable) hypotheses--simple causal relationships.
4. Obtain data--often secondary sources, maybe a survey
5. Specification search and estimation
6. Diagnostic testing
7. Write up the dissertation following an outline including introduction, literature
review, methods, results, and conclusions.
Key words: narrow focus, theory, data, estimation, hypothesis testing, objective (i.e.
value-free, disinterested)
An Historical School recipe would be instead:
1. Identify a practical problem
2. Negotiate your position relative to the problem and relationships with other change
agents and stakeholders. Deconstruct the purposes and role of the research in light
of personal values and who will implement its findings.
3. Wallow in the problem qualitatively until some idea of relevant variables and
processes is obtained. This means talking to people with diverse perspectives on
the problem. Probably over 50% of total research time should be spent in this
“exploratory research” phase. The investigator should end up with informed
opinions (Bayesian priors) or hypotheses about "how the system works,"
understanding of some of its complexities, reasons for the problem, potential
remedies, institutional issues, the “politics” of the situation, dynamics, sources of
data, etc. Qualitative research paradigm relevant jargon: prolonged engagement,
triangulation, confirmation, “thick data”, interpretivist approach, naturalistic
inquiry, systems.
4. Literature review--broader than the academic literature on a narrow topic. May
seek to find new methods or paradigms with relevance to the problem or ways to
combine diverse methods.
5. Data gathering--probably obtain primary data through surveys, interviews, etc.
Diverse, inter-disciplinary perspectives and data.
6. Model building--choosing from a wide menu the method appropriate to the task at
   hand: simulation, econometrics, system dynamics, perhaps qualitative description.
   An important criterion of model success is whether it can be understood and applied
   in ways that might make a difference to solving the problem.
7. Model testing--system design and policy experiments, problem-solving modelling
   efforts.
8. Propose a problem solution--seek to identify a practical course of action that can be
   implemented.

Key words: problem, qualitative research (to inform quantitative research),
specification search, system dynamics, system re-design, problem solving,
comprehensive, multi-disciplinary, combining paradigms, moral choice
Revising researchers’ roles and reward structures
Reason and Rowan43 provide several examples of “new paradigm” social research
shifting investigators’ roles from abstract observer to value driven participant in social
processes. Their “new paradigms” can be distinguished from conventional academic
research by a changed relationship between researcher and subjects, that is, a change in
the social role played by the researcher in relationship to the study.
Action research--the researcher is an advocate for some change so that information
gathering and what Bromley (1989) calls “institutional transactions” go hand in hand.
Holistic research--the researcher takes pains to integrate everything relevant to an
outcome into the model.
Participative research--the researcher sees herself as a participant in the process
studied.
Collaborative research--the research is designed and executed in collaboration with
subjects--as if an anthropologist went to a tribe and asked them to help him explore
changing childrearing practices.
Dialogue research--research as argument, counterargument, taking positions and
reacting to them to explore new ways of seeing a situation.
Endogenous research--the subjects themselves perform the research, perhaps with the
social scientist as an employee to carry out the work.
Polyocular research--consciously attempting to examine a problem from different
perspectives or paradigm standpoints.
Research cycles/Learning cycle/Cognitive cycle--based on theories of experiential
learning wherein action alternates with reflection, generalisation, and planning for new
action. This cycle allows corrections or learning based on experience.
“Truth” is taken as something to be actively explored and created through deeper
understanding and moral choice, rather than passively measured. In an analogy with
Heisenberg's Uncertainty Principle of physics, social research "may alter the reality it
seeks to explain...subjects' new understanding may lead them to change their
behaviour...(and moreover may) produce irreversible changes in the people
involved."44 The researcher is clearly part of a social process, and the subject of
research is an actor who can also think, react, and choose. Research becomes "a tool
for self-direction" where either the researcher or subjects of research can advocate a
"new definition of the situation."45

43 Reason and Rowan, psychologists, edited a 1981 collection entitled Human Inquiry: New
Paradigms for Research in the Human Sciences.
David Sims comments that “I do not see research on social processes as being directed
to the making of ‘discoveries’ or to making general statements...(but rather to
producing) a way of understanding a situation that can be applied to other
situations...without any presumption that this collection of tentative notions will be
neat, reducible to a few easily communicated propositions which are internally
consistent...(it is)...a kind of clarification or consciousness raising or creative process."46
Qualitative researchers Lincoln, Guba, Denzin, and others characterise statistical data
as “thin data” when it gives little insight into complex motives. They recommend
“prolonged engagement” with a topic from many perspectives and considerable
attention to subjective meanings, cognitive processes, and human purposes. They see
every study as a case study and look for “transferability” and “credibility” rather than
the generalisations or statistical significance of statistical work. Perhaps the biggest
difference between the quantitative and qualitative approaches is that the positivists
strive for detached, passive observation, while the qualitative researchers intervene to
solve problems in light of some beliefs or values.
My personal view is that qualitative and quantitative methods are complementary and
that methodological mutual respect is as valuable as racial or religious tolerance. Not
only are diverse methods valuable in themselves, combining methods may lead to
greater understanding.
The reward structures faced by researchers are an extremely important issue and a key
to diversification of methods. So long as journals publish only econometric positivist
paradigm papers and promotions depend upon publications, it is obvious what
academic researchers will do. A first step would be for journals to publish more
diverse work, or for new journals to provide a wider marketplace for non-positivist
methods. Researchers should get more credit for applied problem solving work such
as consulting or working to improve data even if the product is a consulting report,
useful software, or institutional innovation rather than a publication. Graaskamp’s
career suggests that the most useful research may be that which changes industry
practice, rather than research leading to academic journal publications.
Goal-seeking, feedback, error correction
Evolutionary biologists point out that the odds against present life forms are enormous,
until one realises that the process is not random. Life forms are always being tested as
to their ability to survive, and hence errors in design are weeded out and corrected, and
improvements are continuous. Human societies also exhibit error correction and
goal-seeking behaviour through negative feedback loop control systems such as
elections, monetary policy, bankruptcies, or asset allocation strategies. Institutional
changes, such as deregulation or regulation, are also examples of feedback corrections
to send us towards desired outcomes.

44 L. Dave Brown, "Participative Research in a Factory," in Reason and Rowan, eds., 1981:314.

45 Peter Reason, "An exploration of the dialectics of two-person relationships," in Reason and
Rowan, eds., 1981:331.

46 David Sims, "From ethogeny to endogeny: how participants in research projects can end up doing
research on their own awareness," in Reason and Rowan, eds., 1981:381-383.
Each of us carries a moral responsibility for our actions. The Achilles heel of science
may be its refusal to take responsibility, a monumental blind spot in an intellectual
paradigm having so much impact on our lives. John Kenneth Galbraith says neoclassical economics is the “influential and invaluable ally of those whose exercise of
power depends on an acquiescent public.” Emancipation of the state from economic
interest, Galbraith argues, “would be aided by emancipation of economic belief.”47
Joseph Schumpeter pointed out that "the advocate of some interest may yet do honest
analytic work...bluntly, advocacy does not imply lying." Schumpeter added, "Scientific
performance does not require us to divest ourselves of our value judgements." Gunnar
Myrdal agrees, saying "There is no way of studying social reality other than from the
viewpoint of human ideals. A 'disinterested social science' has never existed and, for
logical reasons, cannot exist...our very concepts are value laden."48
Stigler, however, thinks the opposite point of view is so obvious that one need not
even present an argument on the subject: “it does not seem necessary to retread
familiar ground to show that economics as a positive science is ethically neutral.”49
John Maynard Keynes takes Myrdal’s side of the argument: “Economics is essentially
a moral science and not a natural science. That is to say, it employs introspection and
judgements of value.”50
Risk Management
One of the implications of admitting that processes are too disorderly to model is to
suggest a risk management program. Another implication for property is clearly that
the wisdom of an investment, and its likely outcome, depend upon the investor's
characteristics as well as the property’s performance. An investor with a diversified
portfolio would evaluate a risky investment differently than a non-diversified investor,
and so on.
Graaskamp listed a number of means whereby risks can be shifted by contracts,
hedging, insurance, etc. Investigators may choose to examine the consequences of
uncertainty through sensitivity analyses or Monte Carlo methods to simulate
distributions of possible outcomes.
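A minimal Monte Carlo sketch of this approach (all distributions and figures hypothetical): rather than reporting a single point-estimate net present value, simulate the uncertain inputs and report the distribution of possible outcomes, including the probability of loss.

```python
# A minimal Monte Carlo sketch (all distributions and figures hypothetical):
# simulate uncertain inputs for a property investment and report the spread
# of possible NPV outcomes instead of a single point estimate.

import numpy as np

rng = np.random.default_rng(3)
trials, years, discount = 10_000, 10, 0.10
outlay = 1_000_000.0

# uncertain inputs: starting net income, rent growth, terminal cap rate
income0 = rng.normal(100_000, 10_000, size=trials)
growth = rng.normal(0.02, 0.02, size=trials)
cap_rate = rng.uniform(0.08, 0.12, size=trials)

t = np.arange(1, years + 1)
cash = income0[:, None] * (1 + growth[:, None]) ** t    # income each year
resale = cash[:, -1] / cap_rate                         # terminal value
pv = (cash / (1 + discount) ** t).sum(axis=1)
npv = pv + resale / (1 + discount) ** years - outlay

print("median NPV:", round(float(np.median(npv))))
print("5th-95th percentile:", np.percentile(npv, [5, 95]).round())
print("probability of loss:", (npv < 0).mean())
```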
47 Quoted in Bowles, Samuel, Richard Edwards, and William G. Shepherd (1989) Unconventional
Wisdom. Boston: Houghton Mifflin Company, pp. viii-ix.

48 Gunnar Myrdal, quoted by Karl Klapholz in "Value Judgements and Economics," in Hausman, op.
cit., p. 277.

49 George Stigler, quoted by Karl Klapholz in "Value Judgements and Economics," in Hausman, op.
cit., p. 276.

50 John Maynard Keynes, "Letter to Roy Harrod (1938)," reprinted in Hausman, op. cit., p. 301.
Conclusion
A post-Newtonian social science research paradigm would envision muddling through
a complex, indeterminate, historical process on a case by case basis, not finding general
theories of a mechanistic, deterministic, time invariant world. Newtonian science
began with goals of understanding and control. A new paradigm begins by conceding
the limits of understanding and control, the reality of environmental constraints. In this
ambiguity, it must start by examining values, goals, and objectives--outcomes result
from creative processes involving moral choices and purposeful actions, not inevitable
states of nature we can only passively observe. Because we cannot understand
completely, we must be willing to settle for pragmatic marginal improvements, accept
risks, and correct inevitable mistakes as they occur or change systems as old solutions
lead to new problems. This includes tinkering with institutions or system design in
response to perceived problems as they arise. Goals include risk management and
survival.51 The ideology is one of humility, accommodation, harmony, and moral
responsibility rather than self-defeating illusions like conquest of nature and control of
other humans.52 Diverse research methods and social roles for researchers should be
encouraged, because we do not know which approaches will pay off, and because of
the interesting possibilities for synergy between approaches. Diversifying the reward
structures of university researchers is essential so that they can find rewards for
research bridging the gap between abstract academic work and applications.
When 17th century telescopes forced abandonment of orderly Ptolemaic cosmology,
Church and secular authorities feared the consequences of this breakdown of perceived
orderliness in physical (and by implication, social) relationships. But it turned out that
conceding the empirical fact that disorder exists led to much increased understanding
and control. Yet even the idea of "control" as a goal for social science is questionable:
Callaway says it is possible that if the goal is human freedom and autonomy, "control
spoils the results."53 Margins of safety in the human enterprise are increased by
conceding the limits of our ability to understand, predict, and control complex
processes. In applied real estate research, "The primary interest is not in generalizing
to other settings, but rather in applying knowledge to improve actors' effectiveness in
the situation under study."54

51 One of Graaskamp's most brilliant insights was his observation that survival (in business, cash
solvency) is a more fundamental goal than wealth maximisation.

52 There are no utopias. Richard Ely's pragmatic mixed economy has proven in the 20th century to be
a more successful economic system than the purer, more ideological (but less implementation-detail
oriented) visions of Adam Smith or Karl Marx.

53 Helen Callaway, "Women's Perspectives: Research as Re-vision," pp. 457-471 in Reason and
Rowan, eds., op. cit., p. 469.

54 William R. Torbert, "Empirical, Behavioural, Theoretical, and Attention Skills Necessary for
Collaborative Inquiry," pp. 437-446 in Reason and Rowan, eds., op. cit., p. 442.
Bibliography
Appleyard, B. (1992). Understanding the Present: Science and the Soul of Modern Man.
London, Pan Macmillan, Ltd.
Barrett, W. (1979). The Illusion of Technique. New York, Anchor Press/Doubleday.
Blaug, M. (1980). The Methodology of Economics. Cambridge, Cambridge University Press.
Borgman, A. (1984). Technology and the Character of Contemporary Life. Chicago,
University of Chicago Press.
Box, G. E. P. (1989). When Murphy Speaks- Listen. Wisconsin., Center for Quality and
Productivity Improvement.
Box, G. E. P., W. G. Hunter, et al. (1978). Statistics for Experimenters. New York, John
Wiley and Sons.
Boyd, R., P. Gasper, et al. (1991). The Philosophy of Science. Cambridge, Massachusetts, MIT
Press.
Brodbeck, M. (1968). Readings in the Philosophy of Social Sciences. New York, Macmillan.
Bromley, D. (1989). Economic Interests and Institutions: The Conceptual Foundations of
Public Policy. Oxford, Basil Blackwell Inc.
Caldwell, B. J. (1982). Beyond Positivism: Economic Methodology in the Twentieth Century.
London, George Allen & Unwin.
Casti, J. L. (1993). Complexification. New York, Harper Collins.
Cornish, E., et al. (1977). The Study of the Future. Washington, World Future Society.
Daly, H. E. and J. B. Cobb, Jr. (1989). For the Common Good. Boston, Beacon Press.
Denzin, N. K. and Y. S. Lincoln (1994). Handbook of Qualitative Research. London, Sage
Publications.
Dey, I. (1993). Qualitative Data Analysis. London, Routledge.
Diaz, J. (1993). “Science, Engineering, and the Discipline of Real Estate.” Journal of Real
Estate Literature 1: 183-195.
Ely, R. T., R. H. Hess, et al. (1971). The Foundations of National Prosperity: Studies in the
Conservation of Permanent National Resources, The Macmillan Company.
Epley, D. R. (1996). “The Current Body of Knowledge Paradigms Used in Real Estate
Education and Issues in Need of Further Research." Journal of Real Estate Research (Special
Ten Year Anniversary Issue): 229-236.
Friedman, M. (1953). The Methodology of Positive Economics. Essays in Positive Economics.
Chicago, University of Chicago Press.
Fudenberg, D. and J. Tirole (1992). Game Theory. Cambridge, MA, MIT Press.
Gordon, S. (1991). The History and Philosophy of Social Science. London, Routledge.
Graaskamp, J. A. (1988). Feasibility Analysis Lecture Transcriptions. Madison, Graaskamp
Archives.
Griffiths, W., R. C. Hill, et al. (1993). Learning and Practicing Econometrics. New York, John
Wiley and Sons.
Hall, N. (1991). Exploring Chaos. New York, W.W. Norton and Company.
Handy, C. (1995). Beyond Certainty. London, Random House.
Hausman, D. M. (1984). The Philosophy of Economics. Cambridge, Cambridge University
Press.
Hausman, D. M. (1989). “Economic Methodology in a Nutshell.” Journal of Economic
Perspectives 3(2): 115-127.
Hempel, C. (1965). Aspects of Scientific Explanation. Englewood Cliffs, NJ, Prentice-Hall.
Horgan, J. (1993). "Profile: Paul Karl Feyerabend." Scientific American, May: 16-17.
Kerr, I. (1996). The Linkages between Real Value and Economic Welfare. Annual European
Conference on the History of Economics, Lisbon.
Keynes, J. M. (1939). “Professor Tinbergen's Method.” Economic Journal 49: 558-568.
Keuzenkamp, H. A. and M. McAleer (1995). "Simplicity, Scientific Inference and
Econometric Modelling.” Economic Journal 105: 1-21.
Kuhn, T. S. (1970). The Structure of Scientific Revolutions. Chicago, University of Chicago
Press.
Lakatos, I. and A. Musgrave, eds. (1970). Criticism and the Growth of Knowledge. London,
Cambridge University Press.
Lincoln, Y. S. and E. G. Guba (1985). Naturalistic Inquiry. Beverly Hills, CA, Sage
Publications, Inc.
Lorenz, K. (1983). The Waning of Humaneness. London, Unwin Paperbacks.
Makridakis, S. and S. C. Wheelwright (1989). Forecasting Methods for Management. New
York, John Wiley and Sons.
McAleer, M. (1994). “Sherlock Holmes and the Search for Truth: A Diagnostic Tale.” Journal
of Economic Surveys 8(4): 317-370.
McAleer, M. (1996). Dialogue on specification searches.
McKibben, B. (1993). The Age of Missing Information. New York, Penguin Books.
Meadows, D. H., D. L. Meadows, et al. (1992). Beyond the Limits. Post Mills, VT, Chelsea
Green Publishing Company.
Medawar, P. (1996). The Strange Case of the Spotted Mice. New York, Oxford University
Press.
Mellor, H. (1995). Chance, Sufficiency and Necessity. London, Cambridge University Press.
Mills, C. W. (1959). The Sociological Imagination. New York, Oxford University Press.
Morrison, F. (1991). The Art of Modeling Dynamic Systems. New York, Wiley Interscience.
Olson, M. (1971). "The Logic of Collective Action: Public Goods and the Theory of Groups."
Harvard Economic Studies CXXIV.
Peltonen, M. (1996). The Cambridge Companion to Bacon. Cambridge, Cambridge University
Press.
Phillips, E. M. and D. S. Pugh (1994). How to Get a PhD. Buckingham, Open University
Press.
Pindyck, R. S. and D. L. Rubinfeld (1991). Econometric Models and Economic Forecasts.
New York, McGraw-Hill, Inc.
Plato (1975). Phaedo. Oxford, Oxford University Press.
Popper, K. (1959). The Logic of Scientific Discovery. London, Hutchinson.
Popper, K. R. (1982). The Open Universe: An Argument Against Determinism. London,
Routledge Press.
Reason, P. and J. Rowan (1981). Human Inquiry: New Paradigms for Research in the Human
Sciences. Chichester, John Wiley.
Renshaw, E. (1991). Modelling Biological Populations in Space and Time. Cambridge,
Cambridge University Press.
Rosenberg, A. (1988). Philosophy of Social Science. Boulder, CO, Westview Press.
Senge, P. M. (1990). The Fifth Discipline. Sydney, Random House Australia.
Simon, H. A. (1969). The Sciences of the Artificial. Cambridge, MASS., MIT Press.
Tilman, R. (1993). A Veblen Treasury. Armonk, NY, M.E. Sharpe.
Tuchman, B. (1984). The March of Folly. London, Sphere Books Ltd.
Weinberg, G. (1975). An Introduction to General Systems Thinking. New York, John Wiley
and Sons.
Weiss, M. A. (1989). Richard T. Ely and the Contribution of Economic Research to Home
Ownership and Housing Policy. Cambridge, MIT Center for Real Estate Development.
Williams, W. and R. F. Elmore (1976). Social Program Implementation. New York, Academic
Press.
Winch, P. (1958). The Idea of a Social Science. London, Routledge.
Ziman, J. (1976). The Force of Knowledge. London, Cambridge University Press.
Foreword

This paper began 35 years ago on the side of an unexcavated pyramid at Tikal, a Mayan
ruin in Guatemala. A huge yellow and black spider spinning on the ruins of the great
abandoned city spoke, saying, "One day I may spin on the ruins of your civilisation."
Clearly the Mayan intellectuals were brilliant--their 352 year calendar allowed them to
predict celestial events far into the future with great precision. Stargazing was even
useful--it helped time the planting of crops. But as they stood on the pyramids gazing at the
beauty and order of the stars, the city's water supply and soil were failing, and a brutal
military dictatorship cut out people's hearts in human sacrifices. So this paper starts
with questions like: “Sure, econometrics is fun, but what does it have to do with the
water supply?” And “Is neo-classical positivist economics like Mayan astronomy,
lovely, wonderful, fascinating, but unable to solve society’s fundamental economic
problems?”
If you think that economists understand economic processes, humanity's institutions
are functioning well, and the future is bright, this paper is not for you; in fact it may
strike you as nonsense. You may be more inclined to find the paper useful if you are
troubled by a billion people living in poverty, increasing gaps between rich and poor,
and failures of institutions like the media and governments to address these problems.
This paper is based on concern that science is not only failing to solve problems of
degradation of planetary life support systems--population growth, soil loss, air and
water pollution, species extinctions, toxics, climate change, depletion of non-renewable
resources, risk of biological or nuclear warfare--science has contributed to creating
most of these problems. Like the Medieval Church in the face of the Black Plague, the
dominant intellectual paradigm of our age does not even allow us to ask the questions
relevant to solving the major problems facing humanity.