Hourya Benis Sinaceur
[email protected]
Notions related to Tarski’s A decision method for elementary algebra and geometry (to
be improved and completed with regard to the content as well as to the formulation)
I.
Mathematical notions
Algebraically closed field: a field K is algebraically closed iff every irreducible polynomial
with coefficients in K has degree 1.
Analytic variety: a subset of the vector space Cn which is locally defined as the set of points
(z1, z2, …, zn) at which a finite number of analytic functions all vanish.
Arithmetically definable: elementarily definable, i.e. definable in the language of first-order
logic. “A set A of real numbers is called arithmetically definable if there is a formula φ in the
[first-order] system [of real algebra] containing one free variable and such that A consists of
just those numbers which satisfy φ”.
A set of real numbers is arithmetically definable iff it is the set-theoretical sum of a finite
number of intervals with algebraic end-points (DM, note 13, p. 53).
Completeness: a first-order consistent theory T is complete iff every sentence of the language
of T is either provable or refutable, or iff every arithmetically definable property which holds
in one model M of T also holds in any other model M’ of T. One can also say that any two
models of T are elementarily equivalent (DM, note 15, p. 5’ and supplementary note 7, p. 62).
The latter definition of completeness makes it possible to formulate Tarski’s transfer principle,
which permits one to generalize to any real closed field every theorem of the elementary
theory of the ordered field of real numbers.
Consistency: a first-order theory T is consistent iff T has a model, or
iff for any first-order sentence φ, if φ is a theorem of T, then the negation of φ is
not a theorem of T.
Constructible number: “a number which can be obtained from the number 1 by means of
the rational operations, together with the operation of extracting square roots” (DM, note 21,
p. 57). This is the meaning of Hilbert’s constructible numbers in The Foundations of geometry
(first ed. Teubner, 1899).
Continuity axiom schema: expresses the fact that (DM, note 9, p. 49)
- every rational integral function (i.e. polynomial function) which is positive at one
point and negative at another, vanishes at some point in between, or
- every set of real numbers which is bounded above has a least upper bound, or
- every positive number has a square root and every polynomial of odd degree has a
zero.
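The first clause of the schema also underlies a concrete procedure: bisection narrows a sign-changing interval down to a zero. A minimal sketch (the polynomial below is an arbitrary example, not one from DM):

```python
# The first form of the continuity schema in action: a polynomial that is
# positive at one point and negative at another vanishes in between, and
# bisection locates such a zero to any desired accuracy.

def poly(x):
    # p(x) = x^3 - 2x - 5: negative at x = 2, positive at x = 3
    return x**3 - 2*x - 5

def bisect(f, a, b, eps=1e-12):
    """Assumes f(a) and f(b) have opposite signs; narrows [a, b] around a zero."""
    assert f(a) * f(b) < 0
    while b - a > eps:
        c = (a + b) / 2
        if f(a) * f(c) <= 0:
            b = c          # sign change is in [a, c]
        else:
            a = c          # sign change is in [c, b]
    return (a + b) / 2

root = bisect(poly, 2.0, 3.0)
```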
Decision machine: computer
Decision method: “By a decision method for a class K of sentences (or other expressions) is
meant a method by means of which, given any sentence φ, one can always decide in a finite
number of steps whether φ is in K” (p. 1).
“When we say that there is a decision method for a certain theory, we mean that there is a
decision method for the class of true [my underlining] sentences of the theory” (p. 1).
“When dealing with theories presented as formal axiomatized systems, one often uses the
term ‘decision method’ … by referring it to all theorems of the theory” (Note 1, p. 37).
Compare with the definition given in Tarski, Mostowski, R.M. Robinson, Undecidable
theories, North-Holland, 1953, p. 3 : “By a decision procedure for a given formalized theory
T we understand a method which permits us to decide in each particular case whether a given
sentence formulated in the symbolism of T can be proved by means of the devices available in
T (or, more generally, can be recognized as valid in T).”
Formally, there exists a decision method for the class A of expressions iff the set of numbers
correlated by some one-to-one mapping with the expressions of A is general recursive.
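As a toy illustration of these definitions (the sentence format below is an assumption of mine, not Tarski's): let K be the class of true sentences of the shape “m + n = k” for decimal numerals m, n, k. The procedure below decides membership in K in finitely many steps, for any input string whatsoever.

```python
# A toy decision method for the class K of true sentences "m + n = k".
# The sentence format is an illustrative assumption, not from the text.

import re

def decide(sentence: str) -> bool:
    """Return True iff `sentence` belongs to K, in finitely many steps."""
    match = re.fullmatch(r"(\d+) \+ (\d+) = (\d+)", sentence)
    if match is None:
        return False   # not even a well-formed sentence of the class
    m, n, k = (int(g) for g in match.groups())
    return m + n == k
```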
Decision problem: “By a decision problem for a class K we mean the problem of finding a
decision method for K.”
Elementary sentence or property: a sentence or property which is formulated in the language
of first-order logic.
Elementary algebra: “By elementary algebra [of real numbers] we understand that part of
the general theory of real numbers in which one uses exclusively variables representing real
numbers, constants denoting individual numbers, like ‘0’ and ‘1’, symbols denoting
elementary operations on and elementary relations between real numbers, like ‘+’, ‘.’, ‘-‘, ‘>’
and ‘=’ and expressions of elementary logic such as ‘and’, ‘or’, ‘not’, ‘for some x’, and ‘for
all x’. Among formulas of elementary algebra we find algebraic equations and inequalities;
and by combining equations and inequalities by means of the logical expressions listed above,
we obtain arbitrary sentences of elementary algebra” (DM, p. 2).
“In elementary algebra we do not use variables standing for arbitrary sets or sequences of real
numbers, for arbitrary functions of real numbers, and the like”. ‘Elementary’ refers to “this
abstention from the use of set-theoretical notions”.
Elementary geometry: “by a sentence of elementary geometry we understand one which can
be translated into a sentence of elementary algebra by fixing a coordinate system” (DM, p. 3).
First-order logic: it comprises the sentential calculus plus the predicate calculus involving only
variables for individuals. Namely, a language L of first-order logic consists of:
1. logical symbols:
- Variables for individuals v1, v2, … (the set of variables is countable)
- Sentential connectives such as ∧ (conjunction) and ¬ (negation)
- Quantifiers: ∀ (for all) and ∃ (there exists)
- Punctuation symbols: ( , )
2. mathematical symbols:
- Relation symbols R of n arguments, n ≥ 1
- Function symbols f of n variables, n ≥ 1
- Individual constant symbols such as c1, c2, … (the set of constants may be finite or
countable)
One usually writes, for instance, L = {>, =, +, ·, -, 0, 1} for the language of elementary
algebra stated by Tarski.

First-order formal system (first-order axiomatized system) for a mathematical theory:
We just have to add to the language of first-order logic mathematical axioms and rules of
inference written in this language. Among mathematical axioms we find in Tarski’s system of
elementary algebra the axioms which characterize the set of real numbers as a commutative
ordered field plus the continuity-schema (DM, note 9, p. 49-50, see also Tarski 1967, § 1).
Among rules of inference we can have the modus ponens (or rule of detachment) and the
substitution rule.
Gödel arithmetization: consists in correlating, in a definite manner, natural numbers with the
symbols, sequences of symbols, sequences of sequences of symbols, etc., of a formal system.
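A minimal sketch of such a correlation, in the style of Gödel's prime-power coding (the symbol table is an illustrative assumption; unique factorization guarantees that distinct sequences receive distinct numbers):

```python
# Sketch of Gödel numbering: each symbol gets a fixed code, and a sequence
# of symbols s1 s2 ... sn is coded as 2^c(s1) * 3^c(s2) * 5^c(s3) * ...
# The symbol table below is an illustrative assumption, not Gödel's own.

CODE = {'0': 1, 'S': 3, '=': 5, '+': 7, '(': 9, ')': 11}

def primes(n):
    """First n primes, by trial division (fine for short formulas)."""
    found = []
    candidate = 2
    while len(found) < n:
        if all(candidate % p for p in found):
            found.append(candidate)
        candidate += 1
    return found

def godel_number(formula: str) -> int:
    """Code a string of symbols as a product of prime powers."""
    g = 1
    for p, symbol in zip(primes(len(formula)), formula):
        g *= p ** CODE[symbol]
    return g
```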
Gödel’s completeness theorem (1930): every formula of first-order logic is either refutable
(i.e. its negation is provable) or satisfiable.
Gödel’s incompleteness theorems (1931):
Theorem 1: in any consistent formal system S in which every sentence of elementary
arithmetic is expressible, there is an undecidable (i.e. neither provable nor refutable) sentence.
Theorem 2: the consistency of a formal system S in which elementary arithmetic is
expressible cannot be proved in S, provided S is consistent.
Lower predicate calculus (DM, note 9, p. 49): first-order logic (in which variables stand for
individuals, and never for sets or sequences of individuals).
Quantifier elimination: an elementary theory T (elementary algebra or geometry) admits
quantifier elimination iff, given any formula φ of T, we can find a formula ψ of T which is
logically equivalent to φ, and contains no quantifiers and no free variables besides those
already occurring in φ. In particular, a procedure of quantifier elimination enables us to find
for any sentence an equivalent sentence without quantifiers.
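A standard worked example (not taken from DM) shows what elimination produces: the existential sentence stating that a monic quadratic has a real root is equivalent to a quantifier-free condition on the free variables b and c already occurring in it:

```latex
\exists x\,(x^{2} + b\,x + c = 0) \;\Longleftrightarrow\; b^{2} - 4c \ge 0
```

Applying such a step repeatedly, innermost quantifier first, reduces any sentence to a variable-free Boolean combination of polynomial equations and inequalities, whose truth can then be checked directly.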
Recursive:
A function f is general recursive if we have a finite procedure for calculating the
values of the function. We can also have the following definition:
One takes as arithmetical language the language L with the relations = and > and the
operations + and ·, and adds a constant cn for every natural number n ∈ N. A formula of this
language is a Σ-formula if it contains no negation symbol, no implication symbol, and no
universal quantifier. Then a function f : Nn → N is recursive iff there is a formula
F(x1, x2, …, xn, xn+1) of L with n+1 free variables such that:
f(a1, a2, …, an) = an+1 iff F(a1, a2, …, an, an+1) is satisfied in the standard arithmetic model.
Recursively enumerable: a recursively enumerable subset of the set of natural numbers is the
set of positive values of some polynomial with integer coefficients. Or: a set is recursively
enumerable iff it is definable by a Σ-formula of the language described above.
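As a finite illustration of the first characterization (the polynomial p(x, y) = x² + y² is an illustrative choice; the full enumeration would range over all integer arguments):

```python
# The positive values of a polynomial with integer coefficients form a
# recursively enumerable set.  For p(x, y) = x^2 + y^2 those values are
# exactly the non-zero sums of two squares; we compute a finite stage of
# the enumeration.

def positive_values(poly, bound):
    """Positive values p(x, y) for all integers |x|, |y| <= bound."""
    values = set()
    for x in range(-bound, bound + 1):
        for y in range(-bound, bound + 1):
            v = poly(x, y)
            if v > 0:
                values.add(v)
    return values

sums_of_two_squares = positive_values(lambda x, y: x * x + y * y, 10)
```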
Real closed field: an ordered field K is real closed iff
- every positive element of K has a square root in K, and
- every polynomial of odd degree, with coefficients in K, has at least one root in K.
Semi-algebraic sets: (note 13, p. 52-53) “finite sums of finite products of elementary
algebraic domains”. Tarski defines an elementary algebraic domain as the set of points of Rn
the coordinates of which satisfy a given algebraic equation or inequality.
Equivalently: the arithmetically definable sets of sequences of real numbers.
Sturm’s theorem: a division procedure which is formally similar to Euclid’s algorithm and
permits one to calculate the number of real roots of a given polynomial on a given interval
(see DM, note 12, p. 50-52).
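A compact sketch of how the procedure can be carried out, with polynomials represented as coefficient lists (highest degree first). It assumes a square-free polynomial that does not vanish at the end-points, and counts the distinct real roots in (a, b]:

```python
# Sturm's algorithm: build the sequence p, p', then successive negated
# remainders of Euclidean division; the drop in sign variations between
# the end-points counts the real roots in (a, b].

def poly_eval(p, x):
    """Horner evaluation of p at x."""
    result = 0.0
    for c in p:
        result = result * x + c
    return result

def poly_deriv(p):
    n = len(p) - 1
    return [c * (n - i) for i, c in enumerate(p[:-1])]

def poly_rem(a, b):
    """Remainder of the Euclidean division of a by b."""
    a = a[:]
    while len(a) >= len(b):
        factor = a[0] / b[0]
        for i in range(len(b)):
            a[i] -= factor * b[i]
        a.pop(0)                      # leading coefficient is now zero
    return a

def sturm_chain(p):
    chain = [p, poly_deriv(p)]
    while True:
        rem = [-c for c in poly_rem(chain[-2], chain[-1])]
        while rem and abs(rem[0]) <= 1e-9:
            rem.pop(0)                # strip cancelled leading terms
        if not rem:
            return chain
        chain.append(rem)

def sign_variations(chain, x):
    values = [poly_eval(q, x) for q in chain]
    values = [v for v in values if abs(v) > 1e-9]
    return sum(1 for u, v in zip(values, values[1:]) if u * v < 0)

def count_real_roots(p, a, b):
    """Sturm's theorem: V(a) - V(b) distinct real roots in (a, b]."""
    chain = sturm_chain(p)
    return sign_variations(chain, a) - sign_variations(chain, b)
```

For example, x³ - 3x (coefficients [1, 0, -3, 0]) has the three roots -√3, 0, √3, all in (-2, 2].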
II.
Mathematical themes
Abstract algebra: study of algebraic structures such as the structure of commutative groups,
the structure of ordered fields, the structure of nœtherian rings, etc.
Algorithms: an algorithm is a calculation procedure. One knows Euclid’s algorithm (4th
century B.C.), which is a division procedure: given two integers a and b, b > 0, there exists
one and only one pair of integers q and r such that a = bq + r, with 0 ≤ r < b. Iterating this
division yields the greatest common divisor of a and b; a and b are relatively prime iff this
greatest common divisor is 1. In the same way we can take two polynomials A and B ≠ 0,
with coefficients in a field K. There exist exactly two polynomials Q and R with coefficients
in K such that A = BQ + R, d°(R) < d°(B); A and B are relatively prime iff their greatest
common divisor is a non-zero constant.
We know also Sturm’s algorithm (1829), which uses a modified form of Euclid’s algorithm
for polynomials to determine the exact number of real roots of a given polynomial in a given
real interval [α, β].
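The integer half of this description amounts to a few lines of code (a routine illustration, not from the text):

```python
# Euclid's algorithm: iterated division with remainder.  The last
# non-zero remainder is the greatest common divisor; a and b are
# relatively prime exactly when it is 1.

def euclid_gcd(a, b):
    while b != 0:
        a, b = b, a % b
    return abs(a)
```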
Formally, an algorithm is a finite sequence of predetermined operations which are performed
according to a finite number of prescriptions and which succeed in solving systematically, by
calculation, any problem of a given type.
From the logical point of view an algorithm is a decision method.
One speaks of an “algorithmic trend” in mathematics. This trend corresponds to the aim of
obtaining numerical results for the solution of a given problem, and not being content with a
general proof stating that the solution exists. At the end of the 19th century Paul Gordan
(1837-1912) and Leopold Kronecker (1823-1891) championed the algorithmic point of view in
opposition to the axiomatic and set-theoretical point of view advocated by Richard Dedekind
(1831-1916) and David Hilbert (1862-1943), among others.
Complexity: the complexity of a procedure is the time a computer needs to carry out the
procedure. This time can be a function of various parameters (the number of variables in a
formula, the number of alternations of quantifiers, etc.). One distinguishes between
polynomial complexity and exponential complexity. Elementary algebra and geometry have
an exponential complexity, which means that Tarski’s decision method is not practically
effective.
Decision problem: the decision problem for the whole of mathematics was posed in 1928 by
Hilbert, who considered it as the main task of a new branch of mathematical logic to which he
gave the name of “metamathematics”. The solution of this general problem proved to be
negative. Indeed, Gödel’s first incompleteness theorem (1931) and Church’s and Rosser’s
results (1936) showed the undecidability of any system which encompasses the elementary
arithmetic of natural numbers.
Model theory: the study of the mutual relations between sets of sentences of a given
first-order formal language and the classes of models of those sets.
“Knowing the formal structure of axiom systems, what can we say about the mathematical
properties of the models of the systems; conversely, given a class of models having certain
mathematical properties, what can we say about the formal structure of postulate systems by
means of which we can define this class of models? As an example of results so far obtained I
may mention a theorem of G. Birkhoff (Proceedings of the Cambridge Philosophical Society
31, 1935, 433-454), in which he gives a full mathematical characterization of those classes of
algebras which can be defined by systems of algebraic identities.” (Tarski, Collected Papers,
IV, Birkhäuser, p. 714)
Real algebra: in the works of the American Algebra School (Edward V. Huntington, L.E.
Dickson, E.H. Moore) of the beginning of the 20th century this expression meant the abstract
algebraic structure of real numbers considered as an ordered field satisfying the least upper
bound property (Huntington, A set of postulates for real algebra, comprising postulates for a
one dimensional continuum and for the theory of groups, Transactions AMS 6, 1905, 17-41).
By ‘real algebra’ Artin and Schreier (Algebraische Konstruktion reeller Körper, 1926)
understood the theorems which hold in any real closed field.
Real root counting: classical numerical analysis was essentially devoted to devising
procedures for separating the complex roots from the real roots of a polynomial of degree n
and for locating the real roots on the real line. Several devices permitted one to find a natural
number k such that the number m of real roots is ≤ k. Sturm’s algorithm permits one to
calculate m exactly.
Semi-algebraic geometry: the study of subsets of Rn which are defined by polynomial
equations or inequalities.
Undecidability: a theory T is undecidable if there exists no decision method for T. When we
have or search for a decision method, we just need an intuitive understanding of what a
decision method is. For proving an undecidability result one needs a formal definition of
‘decision method’. The formal definition was given by Church (1936) through the notion of
general recursive function and the thesis according to which calculability and recursiveness
are equivalent.
By Gödel numbering we can map the set of sentences of a (countable) language L into the set
of natural numbers N. The image of a theory T written in L is a subset M of N. T is
undecidable iff the characteristic function of M is not general recursive.
Tarski distinguished between undecidability and essential undecidability (1953). A theory T is
essentially undecidable if 1) it is undecidable and 2) every consistent extension of T, which
has the same constants as T, is undecidable. Example : Peano’s arithmetic is essentially
undecidable (Rosser, 1936).
The undecidability of a theory T1 can also be obtained by an indirect method. One can try to
show that either (i) T1 can be obtained from an undecidable theory T2 by deleting finitely
many axioms from the axiom system of T2, or else that (ii) some essentially undecidable
theory T2 is interpretable in T1. By applying (i) and taking a fragment of Peano’s arithmetic
(suitably modified) for T2, Church proved (1936) that first-order logic is undecidable.
III.
Historical notions
Metamathematics: Hilbert used this term for the first time in his 1922 paper :
Neubegründung der Mathematik. Erste Mitteilung, in Hilbert’s Gesammelte Abhandlungen, t.
III, 1935, Berlin, 157-177. For Hilbert ‘metamathematics’ and ‘proof theory’ were
synonymous. Hilbert’s program focused metamathematics on the syntactic study of
mathematical proof in the formal frame of axiom systems and on the search for “finitary”
proofs of the consistency of the first-order theory of natural numbers and the first-order
theory of real numbers. Roughly, ‘finitary’ meant that no actual or completed infinity was to
be used; only potentially infinite collections were allowed.
Tarski widened the scope of metamathematics, which no longer coincided with proof theory
and the search for finitary consistency proofs. First, Tarski introduced formal semantics as a
domain of metamathematics. The metamathematical study of mathematical theories uses
both the syntactical study of axiom systems for those theories and the semantic study of the
models of these axiom systems (a model is defined in terms of the semantic notions of truth
and satisfiability). Secondly, in his practice, Tarski did not hesitate to use non-finitary and
impredicative methods, and he admitted first-order logic with infinitely long expressions.
IV.
Historiographical Themes
Abstraction: characteristic way of mathematical thinking, much developed since the 19th
century. This way is very well illustrated by Dedekind’s foundation of natural numbers (Was
sind und was sollen die Zahlen?, Vieweg, 1888, first English translation, The Open Court
Publishing Company, 1901). Dedekind set out an axiom system (commonly known as the
Dedekind-Peano system) which characterizes not only the set N of natural numbers, but any
“simply infinite” (i.e. countably infinite) set, totally ordered by a linear order. It is worthwhile
quoting
Dedekind’s passage (Definition 73): If in the consideration of such a set, “we entirely neglect
the special character of the elements; simply retaining their distinguishability and taking into
account only the relations to one another in which they are placed by the order-setting
transformation , then are these elements called natural numbers or ordinal numbers or
simply numbers, and the base-element 1 is called the base-number of the number-series N.
With reference to this freeing the elements from every other content (abstraction) we are
justified in calling numbers a free creation of the human mind.”
Arithmetization: One gathers under this name many different endeavours which took place
in the 19th century. The general aim was to reduce geometric, analytic or even algebraic
procedures to arithmetical calculations. One part of the effort was devoted to showing how to
express analytic phenomena such as the continuity of functions or the convergence of series
without using the notion of infinitesimal. We can mention the achievements of B. Bolzano
(1781-1848), which were remarkable but generally unknown (they were rediscovered at
the beginning of the 20th century), and the extensive work of A.L. Cauchy (1789-1857), who
had a direct and huge influence on the French Analysis School. The different definitions of
the set of real numbers by K. Weierstrass, C. Méray, G. Cantor, R. Dedekind (1831-1916) are
related to this effort of eliminating infinitesimals. Another, more general, aspect was
represented by mathematicians like P. Lejeune Dirichlet (1805-1859), R. Dedekind, or L.
Kronecker (1823-1891), who advocated explicitly the requirement that every theorem of
algebra and higher analysis can be expressed as a theorem about natural numbers.
Axiomatization: an axiomatized theory is a theory presented in the following way: one has
singled out some statements, assumed to be true and called axioms, from which one can derive
(prove) every true sentence of the theory. Euclid’s Elements (4th century B.C.) are the most
ancient example of an axiomatic presentation of arithmetic and geometry. Modern axiomatics
flourished from the 19th century onward. Hilbert’s Foundations of geometry (1899) is
considered as the exemplary realization of the modern style. The main difference from
ancient axiomatics consists in the fact that the primitive geometrical elements (point, line,
plane, etc.) have no intrinsic meaning; they are defined by sets of properties stated by axioms,
and an axiom system can be satisfied (or interpreted) by entities of different nature. For
instance, the ordered field of real numbers and the set of points of the Cartesian plane satisfy
the same axioms. Another important characteristic of Hilbert’s conception is the introduction
of meta-questions about an axiom system: are the axioms independent of each other (i.e., is
the system minimal)? Is the system consistent? Is the system maximal consistent (i.e.
complete)? Etc.
In The Foundations of geometry, Hilbert did not use a formal language with logical symbols
and predefined logical rules of inference. The development of symbolic logic, notably
through the works of G. Boole (1815-1864), G. Frege (1848-1925), E. Schröder (1841-1902),
G. Peano (1858-1932), A.N. Whitehead (1861-1947), B. Russell (1872-1970), and Hilbert
(from 1900 onward), transformed the notion of axiomatic system so that it involved the
explicit consideration of a (first-order) formal language. In particular, for proving his
incompleteness theorems, K. Gödel (1906-1978) took as his base the system called ‘type
theory’, set out by Russell and Whitehead for the whole of mathematics in Principia
mathematica (1910-1913). Similarly, A. Tarski took as his point of departure a suitably
modified version of this system, mainly in the works of his Polish period and especially in the
1931 paper on definable sets of real numbers.
Effectivity: The contrast between effective methods, i.e. calculation procedures, and abstract
methods, e.g. the indirect proof of the irrationality of √2, is ancient; but in the 19th century
this contrast was strongly stressed by some mathematicians. J. Fourier (1768-1830) insisted
repeatedly on “effective” methods for solving numerical equations, and he contrasted
them with the a priori and general methods of J.L. Lagrange. By an effective method Fourier
understood “a limited number of operations”, “whose nature is determined in advance”, and
which are ordered such that the result of the last operation is the value of a root of the given
equation. Fourier stressed that the procedure always proceeds in the same way and is
uniformly applied to different equations. It must be easy and rapid. According to Fourier’s
description, an effective method is nothing else than an algorithm, although Fourier did not
use this term. Surprisingly, Sturm, who studied with Fourier, did not explicitly champion
effective methods, even though he found an algorithm much more efficient than Fourier’s
for calculating the number of real roots of a polynomial.
Later, Leopold Kronecker (1823-1891), partially inspired by Sturm’s algorithm, advocated
effective methods and the reduction of mathematical procedures to arithmetical operations on
natural numbers, against the Cantorian actual infinite and the Hilbertian general existence
proofs, e.g. the first proof of the finite basis theorem (1888). According to Kronecker, a
procedure is effective if it permits one to obtain a numerical value by applying a finite number
of times a finite number of operations determined in advance. Kronecker’s criticism
influenced Hilbert. On the one hand, Hilbert found an effective (constructive) proof of the
finite basis theorem. On the other hand, and more generally, he conceived of a program for
justifying by finite methods the use of Cantorian set theory. One essential aim of this finitist
program was the search for constructive consistency proofs for different mathematical
theories, first and foremost for elementary arithmetic (a problem solved negatively by Gödel).
Profiting from the legacy of the algebra of logic (developed by Boole and Schröder, among
others), Hilbert also posed the decision problem in terms of effective logical operations,
which would proceed just like numerical calculations on integers. Tarski’s decision method
for the elementary theory of real numbers, which is a suitable generalization of Sturm’s
algorithm, is an effective method. That means that this method can in principle be performed
by a machine in a finite interval of time. However, it has been proved that this method has an
exponential complexity, so that it is, practically, not effective.
Institutions: The Rand Corporation played a central role in the 1948 version of Tarski’s paper,
the first version of which had the title “The completeness of elementary algebra and
geometry”. J.C.C. McKinsey, “who was at that time working with the Rand Corporation, was
entrusted with the task of preparing the work for publication… As was expected, [the new
monograph] reflected the specific interests which the Rand Corporation found in the results…
[it brought] to the fore the possibility of constructing an actual decision machine. Other, more
theoretical aspects of the problems discussed were treated less thoroughly, and only in notes”
(DM, 1951 Preface).
Logic and mathematics: It is not possible to give a comprehensive account of the whole
history of the relations between logic and mathematics. I just want to sketch a few typical
trends, which constituted the development of mathematical logic from the 19th century
onwards.
- The algebra of logic consists in applying the symbolic and formal method of algebra
to logic, with the aim of reducing reasoning to calculating numerical values of
algebraic equations. In this way were created the sentential calculus and the calculus
of classes, which are in some sense two realizations of Leibniz’ calculus ratiocinator.
One main source of this trend is A. De Morgan’s treatise Formal Logic: or, The
Calculus of Inference, Necessary and Probable (1847) and Boole’s works: The
Mathematical Analysis of Logic (1847), The Calculus of Logic (1848), and An
Investigation of the Laws of Thought, on which are founded the Mathematical
Theories of Logic and Probabilities (1854). According to Boole, his 1854 work
presented “the mathematics of the human mind”. Subsequent developments of the
algebra of logic were made by C.S. Peirce (1835-1914), E. Schröder (1841-1902),
whose Vorlesungen über die Algebra der Logik (3 vol., 1890/91/95) had a deep
influence in the German-speaking countries, L. Löwenheim (1878-1957), and
T. Skolem (1887-1963).
- Quantification theory and the logical analysis of mathematical inference were
conceived of by G. Frege in his revolutionary work Begriffsschrift, eine der
arithmetischen nachgebildete Formelsprache des reinen Denkens (1879). According
to Frege, the formal language of quantification theory (first- and second-order) makes
mathematical reasoning logically rigorous and, in particular, free from any
psychological ingredient. The Begriffsschrift introduced what could be described as a
lingua characteristica (in Leibniz’ sense) for logic; in other words, Frege introduced
the modern notion of a logical formal language and distinguished between first-order
and higher-order languages. The third part of the Begriffsschrift and, in a more
accessible way, Die Grundlagen der Arithmetik (1884) proposed a logical theory of
the successor function and of the principle of complete induction (inference from n to
n+1). Frege thought that the whole of mathematics was reducible to arithmetic, which
was in turn reducible to purely logical relations and inferences. This is the logicist
reduction of mathematics to logic, a view formally developed in Die Grundgesetze der
Arithmetik (1893) and supported also by Bertrand Russell in The Principles of
Mathematics (1903) and in Principia Mathematica (co-authored with Whitehead).
Technical results of the logicist view became common patrimony, while the alleged
epistemological priority of logic was submitted to harsh criticism by mathematicians
like H. Poincaré (1854-1912), L.E.J. Brouwer (1881-1966), and many others. For his
part, Hilbert gave
a system of quantification theory which was independent of any epistemological or
ontological commitment (Hilbert and Ackermann, 1928).
- Hilbert’s proof theory considered logic as calculus (in the sense of the algebra of
logic) as well as language (in the sense of Frege’s theory of inference and
quantification).
The paradoxes of set theory urged Hilbert to find a consistency proof for
arithmetic, a term by which he understood the theory of natural numbers as well as
the theory of real numbers. To realize this aim, Hilbert undertook a simultaneous
formal foundation of arithmetic and logic. He defined proofs as finite sequences (or
arrays) of formulas of a certain formal language and conceived of a
“metamathematical” study of those finite objects, in order to master infinite sets and
procedures by this indirect way. Hilbert’s central idea was to reduce the consistency
proof of a theory to a sequence of numerical equations, the last of which is either
1 = 1 (in case the discussed theory is consistent) or 1 = 0 (in case the theory is
inconsistent). Thus, he stressed the analogy between finite sequences of logical
symbols (formal proofs) and elementary calculations on natural numbers or, in other
words, the analogy between proof and calculation. Hilbert thereby reversed the
logicist point of view: instead of reducing arithmetic to logic, his proof theory could
be seen as a reduction of logic to arithmetic. To counter Poincaré’s and Brouwer’s
criticism of the actual infinite, Hilbert searched for a consistency proof by “finitary”
means (Hilbert’s finitistic program). It is generally admitted that ‘finitary’ or
‘finitistic’ meant a sub-domain of elementary arithmetic, which is called primitive
recursive arithmetic (PRA). Gödel’s second incompleteness theorem (1931) showed
the failure of Hilbert’s aim of proving the consistency of an elementary system of
arithmetic by means available within the system, provided the system is consistent.
- Theory of recursive functions: recursiveness is the precise notion for calculability or
effectivity. The very first root of the notion is § 126 of Dedekind’s Was sind und
was sollen die Zahlen? Then we can mention the achievements of Hilbert (Über das
Unendliche, 1926), Herbrand (1929), Gödel (1931, 1934), and Church and Kleene
(1936). Church’s thesis states that λ-definability and general recursiveness are two
equivalent exact characterizations of all the arithmetic functions that are effectively
calculable (i.e. for which there are algorithms). For his part, Turing (1912-1954), who
only learned of Church’s 1936 work when his own paper was ready for publication,
gave an analysis of effective calculability in terms of the possibility of computation
by a “Turing machine” (1937). Turing’s thesis is that every algorithm can be
mechanically performed by a Turing machine. Turing proved that Turing
computability is equivalent to general recursiveness. There are many mathematical
applications of recursive functions. One of the most famous is the negative solution
of Hilbert’s tenth problem on Diophantine equations.
Formal semantics as deductive science: Widening the scope of Hilbert’s
metamathematics, Alfred Tarski constituted a new metamathematical branch, defining
formally the semantic fundamental concepts which were previously used in an
intuitive way: satisfaction, truth, model, consequence. He stressed the distinction
between formal language and metalanguage, so as to resolve semantic paradoxes like
the liar. He developed also the systematic study of the mutual relations between
formal languages (syntax) and the classes of their models (semantics), laying the base
for a new mathematical discipline, which was named ‘Model theory’ in the 1950s.
Tarski was very eager to establish close connections between model theory and the
classical mathematical disciplines, such as Euclidean geometry or the theory of real
numbers. He was able to show the fruitfulness of combining model-theoretic methods
with algebraic or geometric ones. Thus, by applying the metamathematical concept of
definability to real numbers, he defined (1931) a quite new concept, which later
became the concept of a semi-algebraic set.
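In the one-variable case, a first-order definable set of reals is a finite union of intervals with algebraic end-points (DM, note 13). The sketch below is my own illustration of this equivalence for one concrete formula; the names and the chosen formula are invented for the example.

```python
# Illustrative sketch: the set defined by the first-order formula  x**2 < 2
# equals the single interval (-sqrt(2), sqrt(2)), whose endpoints are
# algebraic (roots of t**2 - 2).
import math

def defined_by_formula(x):
    """Membership test via the defining first-order formula."""
    return x * x < 2

# The same set, given as a finite union of intervals with algebraic endpoints.
INTERVALS = [(-math.sqrt(2), math.sqrt(2))]

def in_union(x, intervals=INTERVALS):
    return any(a < x < b for a, b in intervals)

# The two descriptions agree on sample points:
for x in (-2.0, -1.0, 0.0, 1.4, 1.5):
    assert defined_by_formula(x) == in_union(x)
```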
Although he borrowed and transformed many technical elements, and some views, from
each of the three main epistemological standpoints (logicism, formalism and
intuitionism), Tarski explicitly endorsed the philosophy of none of them. Moreover, he
repeatedly claimed that he could develop his mathematical and logical investigations
without reference to any particular philosophical view concerning the foundations of
mathematics. He wanted his results to be entirely independent of any philosophical
view, and even of his personal leanings, for he believed that scientific precision is
inversely proportional to philosophical interest, even though he himself had a strong
interest in philosophical issues.
This insistence on philosophical neutrality marked a shift in the history of modern
mathematical logic. Tarski was the first logician to assume an explicit separation
between logical work as such, on the one hand, and, on the other hand, assumptions or
beliefs about the actual or legitimate ways of doing that work and about the nature of
the mathematical and logical entities linked with those ways.
Mathematical interdisciplinarity:
Modern Algebra: This is the title of B.L. van der Waerden’s textbook, which, from the first
edition, published by Springer in 1930, to the 8th reprint in 1971, remained a major reference.
This textbook, which replaced Serret’s Cours d’algèbre supérieure (Paris, Bachelier, 3rd ed.
1866) and Weber’s Lehrbuch der Algebra (Braunschweig, Vieweg, 1894-95), represented the
Göttingen-Hamburg school of abstract algebra (1910-1933). The leaders of this school were
Emmy Noether (1882-1935) and Emil Artin (1898-1962). They both agreed in recognizing in
Steinitz’s Algebraische Theorie der Körper (1910) the very beginning of a new way of dealing
with general mathematical concepts or structures. They both paid great tribute to Dedekind’s
and Hilbert’s ground-breaking works. According to Hermann Weyl, this new way changed the
face of the mathematical world.
Much of the material of van der Waerden’s textbook came from Artin’s and Noether’s
lectures on Galois theory, group theory, field theory, real closed fields, ideal theory,
elimination theory, hypercomplex numbers, etc. Van der Waerden attended those lectures
and made some important contributions himself.
Set-theoretic mathematics:
Structures: The notion of structure emerged in the 19th century. A structure is a set whose
elements are subject to at least one operation defined by properties stated in axioms.
Thus a group is a set of elements {a, b, c, d, …} together with an operation · satisfying the
following axioms:
- abc ((a b) c = a (b c))
- e (a e = e a = a)

- aa' ( a a' a' a  e )
Two different sets have the same structure if their operations, which may also be different,
satisfy the same axioms. For instance, consider the set R of real numbers with addition, and
the set R* of all real numbers except zero with multiplication. (R, +) and (R*, ×) both
satisfy the three axioms above (associativity, neutral element, and inverse) plus a fourth
axiom, ∀a ∀b (a · b = b · a), which states the commutativity of + and ×. (R, +) and (R*, ×)
are both commutative groups.
Let us consider the following identity involving the exponential function: e^(x+y) = e^x × e^y.
It establishes an isomorphism between (R, +) and the multiplicative group of positive real
numbers, so that to any property of the former structure corresponds an analogous property of
the latter. Such a possibility of translating properties from one structure to another is a
source of new results; great progress has been achieved in mathematics through such
translations.
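The translation can be seen numerically; this is my own illustration of the identity above:

```python
# exp translates addition into multiplication: e**(x+y) = e**x * e**y.
import math

x, y = 1.3, -0.7
assert abs(math.exp(x + y) - math.exp(x) * math.exp(y)) < 1e-12

# Translating a property: the neutral element 0 of (R, +) is sent to
# exp(0) = 1, the neutral element of multiplication.
assert math.exp(0.0) == 1.0
```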
The Warsaw School of Logic at the beginning of the 20th century: The Warsaw School of
Logic was a joint product of philosophers (Twardowski and his disciples) and mathematicians
(Sierpinski, Kuratowski). The first specialized lectures on mathematical logic, given by J.
Lukasiewicz, started in 1907/08. In 1915, Lesniewski held one of the two chairs at Warsaw
University, the other being given to Lukasiewicz. In cooperation with mathematicians,
Lukasiewicz and Lesniewski trained a group of excellent logicians. The first of these was
Alfred Tarski, who rapidly won the greatest international renown. Among the members of the
group, let us mention A. Lindenbaum, M. Presburger, B. Sobocinski, and M. Wajsberg.
Lukasiewicz and Lesniewski postulated the autonomy of logic with regard to philosophical
speculation, an attitude that Tarski also vigorously supported. However, the “interpretative”
attitude towards logic and the interest in semantics may be attributed to the influence of the
philosophical trend which originated in the works of Bolzano, Brentano, Meinong, and Husserl.
In the late 1920s Tarski led a seminar in which important results on the syntactic
completeness or the decidability of certain partial theories were established. In particular,
the method of eliminating quantifiers was developed in a “general and systematic way” in
Tarski’s university lectures for the years 1926-1928 (DM, notes 4 and 11).
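A toy instance of quantifier elimination over the reals, far simpler than Tarski’s general procedure and given here only as my own illustration: the quantified sentence “there exists x with a·x² + b·x + c = 0” is equivalent to a quantifier-free condition on the coefficients.

```python
# Quantifier elimination, toy case: the sentence
#     exists x (a*x**2 + b*x + c == 0)
# over the reals is equivalent to the quantifier-free condition below.

def has_real_root(a, b, c):
    """Quantifier-free equivalent of: exists x, a*x**2 + b*x + c = 0."""
    if a != 0:
        return b * b - 4 * a * c >= 0   # discriminant condition
    if b != 0:
        return True                     # linear case: x = -c/b always works
    return c == 0                       # degenerate case: 0 = 0 or c = 0 fails

# Spot-checks against explicit witnesses where one exists:
assert has_real_root(1, 0, -2)      # x = sqrt(2)
assert not has_real_root(1, 0, 1)   # x**2 + 1 = 0 has no real root
assert has_real_root(0, 2, 5)       # x = -5/2
assert not has_real_root(0, 0, 3)   # 3 = 0 is false
```

Eliminating the quantifier turns a question about all real numbers into a finite test on the coefficients, which is the essence of Tarski’s decision method.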
For more, see J. Wolenski, Logic and Philosophy in the Lvov-Warsaw School, Kluwer, 1989.