ExtenDD brings together two areas of intensive research that, surprisingly, have so far rarely come together: complex terms and proof theory. By complex terms we understand expressions that purport to denote objects, like proper names, but that, unlike proper names, exhibit structure that contributes to their meaning. The most important examples of such expressions are definite descriptions. Proof theory is the mathematical study of proofs themselves. ExtenDD applies the tools of proof theory to proofs that appeal to propositions containing complex terms.

A definite description is an expression of the form ‘the F’: the author of ‘On Denoting’, the author of Principia Mathematica, the present King of France. A definite description aims to denote an object by virtue of a property that only it has: ‘the F’ aims to denote the sole F. The first example denotes Russell, as he was the sole author of ‘On Denoting’. Sometimes a definite description fails, because nothing or more than one thing has the property F, as in the second and third examples: Principia Mathematica had two authors, Whitehead and Russell, and France is presently a republic. If a unique object has the property F, the definite description ‘the F’ is called proper; otherwise it is improper.

Pride of place for the first formalisation of a theory of definite descriptions as part of a mathematical investigation belongs to Frege’s Grundgesetze der Arithmetik. But it was Russell who brought definite descriptions to prominence. Since his article ‘On Denoting’, in which Russell put forward his celebrated theory of definite descriptions, these expressions have occupied a central place in philosophical research, and many deep and detailed studies have been carried out in philosophical logic, epistemology, metaphysics and the philosophy of language. Russell was prompted by reflections on improper definite descriptions: what do sentences like ‘The present King of France is bald’ mean?
‘The present King of France’ denotes no one, so the truth or falsity of ‘The present King of France is bald’ is not determined by whether the present King of France is bald or not. As there is no present King of France, ‘The present King of France is bald’ should not be true, for otherwise we would be stating that a certain person has a certain property.
Russell accepted the principle of bivalence, that every sentence is either true or false; hence ‘The present King of France is bald’ should be false. But then it would appear that ‘The present King of France is not bald’ should be true, and the same problem arises once more: if so, we are stating that a certain person fails to have a certain property. Russell concluded that sentences of the form ‘The F is G’ do not have the subject-predicate form they appear to have. According to Russell, no meaning is assigned to the expression ‘the F’ standing alone, but only in the context of complete sentences in which it occurs. ‘The F is G’ means ‘There is exactly one F and it is G’. Thus the complex term ‘the F’ disappears upon logical analysis. There are then two negations of ‘The F is G’: the external negation ‘It is not the case that the F is G’, meaning ‘It is not the case that there is exactly one F and it is G’, and the internal negation ‘The F is not G’, meaning ‘There is exactly one F and it is not G’. The former is true, the latter false. Russell’s problem is solved.
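In the notation of first-order logic, Russell’s analysis and its two negations take the following standard textbook form (added here for illustration, not part of the original text):

```latex
% Russell's analysis of `The F is G':
% there is exactly one F, and it is G.
\exists x\,(Fx \land \forall y\,(Fy \rightarrow y = x) \land Gx)

% External negation: `It is not the case that the F is G'
% (true when `the F' is improper).
\neg\exists x\,(Fx \land \forall y\,(Fy \rightarrow y = x) \land Gx)

% Internal negation: `The F is not G'
% (false when `the F' is improper).
\exists x\,(Fx \land \forall y\,(Fy \rightarrow y = x) \land \neg Gx)
```

On this analysis the apparent singular term ‘the F’ contributes only quantifiers and predicates, which is why it disappears upon logical analysis.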
Russell’s theory of definite descriptions had enormous influence on the development of analytic philosophy in the twentieth century, as it was applied, by Russell himself and those influenced by him, to epistemology, metaphysics and philosophy of language. Scores of textbooks still use Russell’s theory as their official theory of definite descriptions. Despite its brilliance, Russell’s account has drawbacks. It appears to assign wrong truth values to sentences. Diogenes Laertius reports that Thales, having discovered his theorem, offered a sacrifice of oxen to the god at Delphi. According to Russell’s theory, what Diogenes reports is false, as it contains an improper definite description. That sounds wrong, if Thales did indeed offer a sacrifice at Delphi. Some philosophers, like Strawson, argued that ‘The present King of France is bald’ should be neither true nor false, as, due to the improper definite description, it is somehow defective. The second half of the twentieth century saw the development of new approaches to definite descriptions. In Frege’s and Russell’s classical logic, it is assumed that every singular term denotes an object. In free logic, originating in the work of Hintikka and Lambert, this assumption is given up. Negative free logic takes a Russellian line according to which atomic formulas containing non-denoting terms are false. Neutral free logics pick up Strawson’s view that such formulas are neither true nor false. Positive free logic permits them to be true. There is now a plethora of theories of definite descriptions, but very rarely have they been investigated from a proof-theoretic perspective.
Proof theory was founded by Hilbert at the beginning of the twentieth century as the mathematical study of proofs. Initially it focused on axiomatisations of mathematical theories and proving their consistency. It was soon extended to the study of proofs outside mathematics, in particular to formal logic, which in turn was applied to the analysis of ordinary language and arguments in general. Gödel’s limitative results showed that the greatest ambition of Hilbert’s programme, to prove the consistency of arithmetic by finitary means, could not be achieved as envisaged. Gentzen’s groundbreaking work led to a substantial redefinition of the aims and tools of proof theory. It now occupies centre stage in modern formal logic. His sequent calculus and systems of natural deduction represent the actual proofs carried out by logicians and mathematicians much better than Hilbert’s axiomatic systems. These tools permit the deepest study of proofs and their properties. Following Gentzen’s work, a consensus developed over what constitutes a good proof system and the form rules of inference should take. The rules for each logical constant, expressions such as ‘not’, ‘all’ and ‘if-then’, exhibit a similar form, a remarkable systematicity, as Gentzen noted. Each constant is governed by two kinds of rules for its use in proofs, and these rules govern only that constant. This form permits a precise study of where and how the logical constants are used, and whether this use is essential or may be eschewed, and it is the basis for cut elimination in proofs in sequent calculus and normalisation of proofs in natural deduction. These results relate to a common phenomenon of mathematics: proofs are combined to establish new results, as lemmas are used in proofs of theorems from which corollaries are inferred in turn. Sometimes a theorem or corollary is simpler than the lemmas or theorems from which it is deduced.
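As an illustration of this systematicity, the two kinds of rules governing conjunction in Gentzen’s sequent calculus LK can be displayed as follows (a standard presentation, added for illustration):

```latex
% Right rule: introduces \land on the right of the sequent arrow.
\frac{\Gamma \Rightarrow \Delta, A \qquad \Gamma \Rightarrow \Delta, B}
     {\Gamma \Rightarrow \Delta, A \land B}\;(R\land)

% Left rules: introduce \land on the left of the sequent arrow.
\frac{\Gamma, A \Rightarrow \Delta}
     {\Gamma, A \land B \Rightarrow \Delta}\;(L\land_1)
\qquad
\frac{\Gamma, B \Rightarrow \Delta}
     {\Gamma, A \land B \Rightarrow \Delta}\;(L\land_2)
```

The rules mention no logical constant other than conjunction, and every constant of the language is governed by a pair of rules of just this kind.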
Gentzen’s Hauptsatz (cut elimination theorem) establishes that there is also a direct proof that proceeds from simple to complex ‘without detours’, as Gentzen says. As the correctness of proofs is defined in terms of such direct proofs, Gentzen’s result establishes that combining proofs in the said way is guaranteed to give a correct proof. In addition to these technical achievements, Gentzen’s ideas also led to the development of the more philosophically oriented proof-theoretic semantics, where the meanings of the logical constants are defined in terms of their rules of inference.
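The combination of proofs is captured by the cut rule, which the Hauptsatz shows to be eliminable (standard formulation, added for illustration):

```latex
% Cut: a formula A established by one proof is used
% as an assumption in another; A is the `detour'.
\frac{\Gamma \Rightarrow \Delta, A \qquad A, \Pi \Rightarrow \Lambda}
     {\Gamma, \Pi \Rightarrow \Delta, \Lambda}\;(\mathit{Cut})
```

Here A plays the role of a lemma: proved in the left premise and appealed to in the right one. Cut elimination shows that the conclusion can always be reached without this detour through A.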
ExtenDD will combine a paradigm of philosophy, the theory of definite descriptions, with the paradigm of proof theory, Gentzen’s calculi. Despite the long history of research into definite descriptions, the methods developed by Gentzen have rarely been applied in research on definite descriptions. Little effort has so far been put into the adequate treatment of definite descriptions in this framework. The same holds for other complex singular terms such as set abstracts and number operators. ExtenDD fills this important gap in research. Applying the methods of proof theory to definite descriptions is profitable to both sides. Competing theories of definite descriptions, and of complex terms in general, their advantages and shortcomings, are shown in a new light. The behaviour of complex terms needs subtle syntactical analysis and requires enriching the toolkit of proof theory. ExtenDD deals with both challenges: it develops formal theories of definite descriptions and modifies the machinery of proof theory to cover new areas of application. ExtenDD will also formulate new theories of definite descriptions and other terms with an eye on proof-theoretic desiderata. ExtenDD aims to have a significant impact on proof theory, the philosophy of logic and language, linguistics, computer science and automated deduction.