
September 3, 2020

Making Life Hard For First-Order Logic

Posted by David Corfield

I mentioned in a previous post Sundholm’s rendition in dependent type theory of the Donkey sentence:

Every farmer who owns a donkey beats it.

For those who find this unnatural, I offered

Anyone who owns a gun should register it.

The idea then is that sentences of the form

Every $A$ that $R$s a $B$, $S$s it,

are rendered in type theory as

$$\prod_{z: \sum_{x: A} \sum_{y: B} R(x, y)} S(p(z), p(q(z))),$$ where $p$ and $q$ are the projections to first and second components. We see ‘it’ corresponds to $p(q(z))$.
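For concreteness, here is what this rendering might look like in a proof assistant. A minimal Lean 4 sketch, where the names Farmer, Donkey, Owns and Beats are my own hypothetical stand-ins for $A$, $B$, $R$ and $S$:

```lean
-- Hypothetical base types and relations standing in for A, B, R, S.
variable (Farmer Donkey : Type)
variable (Owns Beats : Farmer → Donkey → Prop)

-- "Every farmer who owns a donkey beats it": a dependent product over
-- the dependent sum of (farmer, donkey, evidence of owning) triples.
-- z.1 plays the role of p(z), and z.2.1 that of p(q(z)), i.e. 'it'.
def donkeySentence : Prop :=
  ∀ z : (Σ' (x : Farmer) (y : Donkey), Owns x y), Beats z.1 z.2.1
```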

This got me wondering if we could make life even harder for the advocate of first-order logic – let’s give them the typed version to be generous – by constructing a natural language sentence which it would be even more awkward to formalise.

One thing to note is that the type above has been formed from a dependency structure of the kind:

$$x: A,\; y: B,\; w: R(x, y) \vdash s: S(x, y).$$

Now I’ve been writing up on the nLab some of what Mike told us about 3-theories back here. Note in particular his comments on the limited dependencies which may be expressed in the ‘first-order 3-theory’:

one layer of dependency: there is one layer of types which cannot depend on anything, and then there is a second layer of types (“propositions” in the logical reading) which can depend on types in the first layer, but not on each other.
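Schematically, the allowed dependency shape looks like this in a Lean 4 fragment (names mine):

```lean
-- First-order 3-theory shape: a first layer of types with no
-- dependencies, and a second layer of propositions which may depend
-- on the first layer but not on one another.
variable (A B : Type)         -- first layer: bare types
variable (R : A → B → Prop)   -- second layer: depends only on A and B
```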

So to stretch this 3-theory we should look for a maximally dependent structure. Take something of the form

$$x: A,\; y: B(x),\; z: C(x, y) \vdash d: D(x, y, z).$$

Ideally the types will all be sets.

By the time quantification has been applied, the left-hand side swept up into an iterated dependent sum and a dependent product formed, the proposition should be bristling with awkward anaphoric pronouns.

The best I could come up with was:

Whenever someone’s child speaks politely to them, they should praise them for it.

A gendered version with, say, men and daughters would help with the they/them clashes.

Whenever a man’s daughter speaks politely to him, he should praise her for it.
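A hedged Lean 4 sketch of this last sentence, realising the maximally dependent structure above (all the names are my own invention):

```lean
-- The context x : A, y : B(x), z : C(x, y) ⊢ D(x, y, z), instantiated
-- with hypothetical names.
variable (Man : Type)                                        -- A
variable (Daughter : Man → Type)                             -- B(x)
variable (SpeaksPolitelyTo : (x : Man) → Daughter x → Type)  -- C(x, y)
variable (PraisesFor :
  (x : Man) → (y : Daughter x) → SpeaksPolitelyTo x y → Prop)  -- D(x, y, z)

-- "Whenever a man's daughter speaks politely to him, he should praise
-- her for it": 'he' is w.1, 'her' is w.2.1, and 'it' is w.2.2.
def praiseSentence : Prop :=
  ∀ w : (Σ' (x : Man) (y : Daughter x), SpeaksPolitelyTo x y),
    PraisesFor w.1 w.2.1 w.2.2
```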

Any better candidates?

Posted at September 3, 2020 1:52 PM UTC


16 Comments & 0 Trackbacks

Re: Making Life Hard For First-Order Logic

It seems the first appearance of a donkey sentence was in 1328!

“Omne homo habens asinum videt illum” (“Every man who has a donkey sees it”). Walter Burley, De puritate artis logicae tractatus longior.

Only six and a half centuries later we had a proper formalism for it.

Posted by: David Corfield on September 4, 2020 1:30 PM | Permalink | Reply to this

Re: Making Life Hard For First-Order Logic

I realise that this post may not have been intended to be taken too seriously, but it is important to note that dependent type theory itself does not solve the fundamental problem here, which is to systematically represent natural language uses of anaphoric pronouns in syntax in a way which leads to correct semantics. It is one thing to take individual examples in natural language, and show that dependent type theory (or discourse representation theory, or whatever) allows one to express them. It is a completely different thing to be able to do this systematically, i.e. to have some formal/computational way to take the natural language fragment and represent it syntactically. I believe that I have raised this point before on the nForum :-).

In these slides, the suggestion is to extend ‘ordinary’ dependent type theory with some additional rules to try to achieve this. My purpose in raising this is not to discuss whether this particular suggestion is promising, just to emphasise that it is felt to be necessary.

Posted by: Richard Williamson on September 8, 2020 9:15 AM | Permalink | Reply to this

Re: Making Life Hard For First-Order Logic

I’ll leave a computational approach to the computational linguists. Good luck to them with that. There’s plenty here to interest philosophers of language just in terms of representation of some select sentences by DTT.

Of those computational linguists still active who are using DTT for natural language, my sympathies as regards typing are with Luo’s stance over that of Bekki and colleagues: for Luo each common noun denotes a type, whereas for Bekki these nouns are formed by predication on some master type $Entity$.

However, the latter suggest that there is an interesting middle way:

Note that although we do not take common nouns to be types, it is possible to refine type $entity$ by introducing more fine-grained types such as ones representing animate/inanimate objects, physical/abstract objects, events/states, and so on. (Tanaka et al. 2015, p. 3)

  • Tanaka, R., Mineshima, K. and Bekki, D., 2015, ‘Factivity and Presupposition in Dependent Type Semantics’, conference talk at TYTLES: TYpe Theory and LExical Semantics, Barcelona, (pdf)

I find it very hard to think we should be starting from a master type of entities – the difference between an object and an event seems to me so fundamental.

Posted by: David Corfield on September 11, 2020 6:32 PM | Permalink | Reply to this

Re: Making Life Hard For First-Order Logic

I would think that this is a matter of relevance for philosophy too. To that end…

There’s plenty here to interest philosophers of language just in terms of representation of some select sentences by DTT.

…Could you list precisely some philosophical questions that interest you in this regard?

Posted by: Richard Williamson on September 12, 2020 12:07 PM | Permalink | Reply to this

Re: Making Life Hard For First-Order Logic

  • Is there a formalism which makes sense of what philosophers call category mistakes?

  • Is Jason Stanley right when he says “All truth-conditional effects of extra-linguistic context can be traced to logical form”?

  • What should we make of Quine’s ideas on ontological commitment when we replace first-order logic by dependent type theory?

And other issues I raise in recent posts.

Posted by: David Corfield on September 13, 2020 2:04 PM | Permalink | Reply to this

Re: Making Life Hard For First-Order Logic

Thank you! My feeling, which may be a failing on my part, is that I don’t really see where any significant philosophy is entering.

Is there a formalism which makes sense of what philosophers call category mistakes?

I would say that the answer is clearly no. A formalism would imply (which was exactly the point I was making) a systematic way to define and identify a category mistake, which will clearly be impossible in general: there are infinitely many ways for a set of individuals to understand something meaningfully that to others would appear meaningless, and these would defy any fixed formalism.

In particular, I don’t think that use of some notion of syntax which admits a notion of context really changes anything fundamentally. This goes back to what I was suggesting in the other recent thread: it seems to me that all that is happening is a shift of emphasis from one aspect of the problem to another.

Is Jason Stanley right when he says “All truth-conditional effects of extra-linguistic context can be traced to logical form”?

To me, that any philosopher could make any absolute statement whatsoever and believe it to be true is preposterous! But disregarding that, looking at his work and what you have written about it, the point seems to be essentially that one should take into account context. Then, yes, dependent type theory has a syntactic notion of context. But if one cuts to the heart of the matter, is the fact that some more or less rational/logical use of context is typically observed in natural language really a deep or new philosophical observation? Does the fact that dependent type theory has context in the syntax add much philosophically?

What should we make of Quine’s ideas on ontological commitment when we replace first-order logic by dependent type theory?

This kind of thing strikes me mostly as so much hot air. It is evident that no notion of existence whatsoever can be rationally made sense of, which rather undermines the whole notion of an ontology. I find it difficult to take sentences like ‘What demands are imposed on the world by the truth of the following sentence?’ seriously: it is evident that this kind of intellectualism cannot impose anything in any absolute sense. But if we play the game and forget these objections, I don’t see that picking individual examples in natural language can shed much light on this kind of thing: surely if there is any kind of ‘ontological commitment’ involved, we must have something systematic? Futile though it may be, Quine does at least seem to attempt to give a hard and fast general rule for when something has ‘ontological commitment’.

To end, I would like to write something more positive. One philosophical matter that I do think has some depth to it is something that was suggested to me by some very early lecture notes of Martin-Löf on constructive mathematics. Right at the beginning of these, he suggests (at least to my reading) that the constructive meaning of the number one comes from the very act of drawing a mark on a piece of paper; two marks for the number two; four marks with a line through them for the number five, perhaps, as people used to do in the old days; etc. What Martin-Löf suggests (to my reading again) is that one gives constructive mathematics in general its meaning in this way. I.e. the meaning comes from the fact that we really can write down on a piece of paper a series of marks that define what we are speaking of. This is quite radical: it is in one way what one might call a purely syntactic notion of meaning, but goes beyond ‘syntactic’ in the conventional sense, in that it is not referring to rules, but to the more primitive act of actually writing down a series of marks on paper. Interesting philosophical questions then arise as to the relationship between this kind of meaning and what one might call ‘interpretative meaning’, namely how mathematicians, individually and socially, assemble myths to give these marks a more ‘human’ meaning. Something along these lines is an example where I think dependent type theory can be relevant philosophically, namely in the way it breaks down the difference between logic and syntax.

Posted by: Richard Williamson on September 15, 2020 1:39 AM | Permalink | Reply to this

Re: Making Life Hard For First-Order Logic

And philosophers get criticised for telling other disciplines how to go about their business!

Posted by: David Corfield on September 15, 2020 10:55 PM | Permalink | Reply to this

Re: Making Life Hard For First-Order Logic

It is possible to have an interest in philosophy, and indeed to be a philosopher (whatever the term means!) for that matter, without holding an academic position in philosophy!

The original point under discussion here was the importance of individual examples for questions in philosophy of language; I offered my thoughts on that in the context of the questions you raised.

Posted by: Richard Williamson on September 16, 2020 1:01 PM | Permalink | Reply to this

Re: Making Life Hard For First-Order Logic

Earlier drafts of my article Narrative and the Rationality of Mathematical Practice were titled ‘How mathematicians fail to be fully rational’, but I came to think that this sounded rather preachy. I imagine “preposterous”, “hot air” and “futile” won’t be received too well by those who devote their research lives to such topics.

But let’s focus on the issue in what you write that concerns me. I’m proposing that insofar as first-order logic achieves anything in analytic philosophy, dependent type theory can do at least as well and generally does better. Similarly for modal logic and modal type theory.

Now there are those, such as Peter Hacker, the Oxford philosopher, who take the supposed achievements of first-order logic in philosophy to be completely illusory. Such a charge needs exploring, but let’s for the moment overlook this, and consider your claim that for a formalism to do well it had better do so in a systematic way rather than through individual examples. If my attempts to show the benefits of DTT over FOL are by means of individual cases, then they count for little.

I think I don’t know what you mean by ‘systematic’. Above you write

It is a completely different thing to be able to do this systematically, i.e. to have some formal/computational way to take the natural language fragment and represent it syntactically.

Since we are not even close to being able to do this with the very restricted language that appears in a mathematics text, I consider this to be a near impossible demand.

I’m following Ranta and Martin-Löf:

So we gradually developed a view that satisfied both of us, that formal grammar begins with what is well understood formally, and then tries to see how this formal structure is manifested in natural language, instead of starting with natural language in all its unlimitedness and trying to force it into some given formalism.

Then thinking through examples, especially tricky ones, is very illuminating. E.g., once you’ve come to see how in English we represent events and their duration, distinguishing between activities and accomplishments by ‘for’ and ‘in’, you can begin to figure out how we parse sentences which push the boundaries:

It took me two days to learn to play the Minute Waltz in sixty seconds for more than an hour.

Parsing this is a process of construction. How can one wield type and term specifications to render this a meaningful sentence?

(Hint: by constructing an activity out of the iteration of an accomplishment, then turning that activity into a further accomplishment.)

To my mind there’s something reasonably systematic about the representation of event structures in English, which carries over to other European languages I know. I’m encouraged that Luo and Bekki are coming at DTT for natural language from different linguistic backgrounds.

I want to say that difficult cases are instructive. It is quite reasonable for rival camps to challenge each other with them. To the Davidsonian analysis of event structure, Peter Hacker proposed the example of

He wisely apologised.

Are we really to say that by this we mean

There is an event, $e$, the agent of $e$ is male, $e$ happened in the past, $e$ is an apologising, the manner of $e$ is wise, … ?

At the very least we need to consider how ‘wisely’ applies to the kind of act that is an apology. Evidently there is a context for such a statement that contains the identity of the apologiser, the act for which the apologising is required, the people who were offended by the act, some idea of how they might seek redress, and so on. A story of a certain shape needs to be supplied. Ranta’s ideas on narratives as DTT contexts seem to me appropriate. It seems to me that $Apologise(x, y, z)$ has slots for variables of certain types, even when not articulated, so that you can’t even form the sentence ‘He apologised’ without already intending recipients and an action.

Then, just as we need to specify a predicate such as ‘small’ in relation to a type, so that small for an elephant and small for a mouse are different, we need to specify an adverb such as ‘wisely’ relative to a type of action.
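As a toy illustration of such type-relative predicates and adverbs, here is a speculative Lean 4 fragment; every name in it is hypothetical, my own, not from the literature:

```lean
-- 'small' is specified relative to a type, so Small Elephant and
-- Small Mouse are different predicates.
variable (Small : (A : Type) → A → Prop)
variable (Elephant Mouse : Type)

def smallElephant (e : Elephant) : Prop := Small Elephant e

-- Likewise 'wisely' is specified relative to a type of act. An apology
-- already carries typed slots for apologiser and recipient:
variable (Person : Type)
variable (Apology : Person → Person → Type)
variable (Wisely : (Act : Type) → Act → Prop)

-- "He wisely apologised (to her)": the adverb applies to an act typed
-- as an apology from x to y.
def wiseApology (x y : Person) : Prop :=
  ∃ a : Apology x y, Wisely (Apology x y) a
```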

Now maybe we should follow Hacker and give up on the idea of there being any formal structure to extract in such cases, but I think they’re worth mulling over. Something reasonably systematic may emerge. To date DTT has received a negligible fraction of the attention afforded to FOL.

Posted by: David Corfield on September 18, 2020 11:44 AM | Permalink | Reply to this

Re: Making Life Hard For First-Order Logic

I’ll respond in two separate posts. Firstly, I’d like to respond a little to the following.

Earlier drafts of my article Narrative and the Rationality of Mathematical Practice were titled ‘How mathematicians fail to be fully rational’, but I came to think that this sounded rather preachy. I imagine “preposterous”, “hot air” and “futile” won’t be received too well by those who devote their research lives to such topics.

It is important I think to separate the person from their ideas in these matters. If one is within a certain community, this may be more delicate; but if one is on the outside, one is free from such constraints, and can treat the ideas purely on their merits. Frank discourse has its place; if the proponents of the ideas feel that the epithets I applied are unjust, they should be able to mount a reasoned defence. For my part, I feel able to defend those epithets.

Take the following for example.

Is Jason Stanley right when he says “All truth-conditional effects of extra-linguistic context can be traced to logical form”?

To me, that any philosopher could make any absolute statement whatsoever and believe it to be true is preposterous!

It is a fundamental tenet of my ‘philosophy’, such as it is, that rationality can never reach outside itself; that human beings can never understand rationally anything in any absolute sense, neither their own existence nor anything else. My taste is for this reason much more on the side of the great philosophical works of ‘ancient’ China and India, which take an essentially non-rational approach to expressing wisdom about the world, than on the side of much of Western philosophy, for much of the latter is concerned with building up rational edifices that cannot possibly hope to say anything about that which they really wish to say something about. This is not to say that such works are useless, just that the use they have comes for instance from sociological, political, etc., influences, not from reaching any truth philosophically.

I consider it a first sign of wisdom in a person that they have come to appreciate the limitations of rationality. One would hope that a philosopher is somewhat wise; hence I consider it ‘preposterous’ that a philosopher speaks in absolutes in this way.

Similarly with ‘hot air’. Quine or whoever need not take offence at that, since I would consider most of Kant’s work for example to be hot air too. Again, this is not to say that Kant was not a great figure in the history of Western thought. Just that his rational edifice building has no hope whatsoever of expressing any kind of absolute truth in the way he wishes it to do: thus it is ‘futile’, ‘hot air’, etc. In the present age, some may enjoy intellectual quibbling of this ‘hot air’ nature, but it is not to my taste. One may also note that in the present age, the great works of Eastern thinking, as well as great works of ‘anti-philosophical’ Western thinkers such as Nietzsche and Wittgenstein, are readily available in ways that they were not to Kant, whose milieu in this sense was much narrower, so there is less of an excuse.

As far as modern Western philosophy goes, I appreciate philosophers who orient themselves around the great philosophical questions. Thus, yes, we can certainly have a philosophical discussion about the most minute matter, but I’d like to know what the philosopher stands for; I’d like to see them put forward some tenets, and orient the discussion around those. One might make a comparison to music: a Brünnhilde might be able to sing the end of Götterdämmerung in a studio with crystal clear sound, note perfect, and with all the technical flourishes one can imagine applied; but I’d still rather listen to a crackly recording of Kirsten Flagstad and Furtwängler.

Posted by: Richard Williamson on September 20, 2020 10:15 PM | Permalink | Reply to this

Re: Making Life Hard For First-Order Logic

I’ll now respond to the rest of the post. Firstly, I don’t think anybody could dispute the following.

I want to say that difficult cases are instructive.

Certainly they are instructive, just like counter-examples in mathematics. What I am suggesting is that we would be arriving at something with much more depth to it if we did not just ‘resolve’ examples in an ad hoc way, but rather put forward some systematic alternative, and checked whether it can handle the examples.

Regarding ‘systematic’…

I think I don’t know what you mean by ‘systematic’. Above you write

It is a completely different thing to be able to do this systematically, i.e. to have some formal/computational way to take the natural language fragment and represent it syntactically.

Since we are not even close to being able to do this with the very restricted language that appears in a mathematics text, I consider this to be a near impossible demand.

I will try to explain what I am getting at. I would summarise the original post as follows: take some sentence with use of anaphoric pronouns; note that first order logic does not deal with these satisfactorily; find some alternative using dependent type theory that takes into account context. But it is a many-decades-old observation that anaphoric pronouns are problematic for the Montague semantics, whose syntactics is both modal and higher order, never mind first order. That one approach to getting around this is to bring in context is equally old, and approaches which bring in context (discourse representation theory for instance) have been worked out which are rigorous and formal enough to be used in computational linguistics.

Thus I think it is reasonable, if one is to expect to convince anybody that dependent type theory is superior to, or even the equal of, discourse representation theory or whatever, to expect some general proposal, from general to particular (i.e. analytic!), to be put forward: this is how we should think of anaphoric pronouns in general in dependent type theory. Then we can test how good that proposal is against the tricky examples which motivated it.

Thus the following is fine as far as it goes…

So we gradually developed a view that satisfied both of us, that formal grammar begins with what is well understood formally, and then tries to see how this formal structure is manifested in natural language, instead of starting with natural language in all its unlimitedness and trying to force it into some given formalism.

…but then it doesn’t seem right to claim that you are illustrating the limitations of first order logic (or, better, the Montague semantics) compared to dependent type theory. For the Montague semantics and discourse representation theory are putting themselves on the front line to be shot at: they are making a rigorous attempt to capture linguistic meaning, which is good enough to be able to be used in computational linguistics today. If one does not have a consistent/formal/systematic way to apply whatever one’s tool of choice is (first-order logic or whatever) to natural language to clarify its ‘meaning’, I can’t really see how one can hope to do any rigorous philosophical analysis, or hope to approach any kind of philosophical insights, for one does not have anything in common from one analysis to the next.

One might try to mend a shoe with a knife or with a hammer; a random person might succeed in any given case with enough unskilled hacking or walloping, but it is a stretch to deduce that ‘one can mend a shoe with a knife’ (or hammer) in such cases. If, on the other hand, a cobbler has a set of principles for how to proceed in such cases with a knife, and we see from experience that the cobbler can apply those principles to fix diverse given cases, then ‘one can mend a shoe with a knife’ in such cases is a much stronger deduction.

Posted by: Richard Williamson on September 20, 2020 11:20 PM | Permalink | Reply to this

Re: Making Life Hard For First-Order Logic

I don’t know that these examples are going to look much harder, since people who are used to thinking in first-order logic are also used to automatically collapsing all kinds of dependency.

Posted by: Mike Shulman on September 12, 2020 4:33 PM | Permalink | Reply to this

Re: Making Life Hard For First-Order Logic

I should take a look at Jacobs’ Translating dependent type theory into higher order logic.

Might we say that what you find funny with higher-order logic is forced on those who want the expressivity of dependent type theory but are desperately keeping to the first-order 3-theory?

Posted by: David Corfield on September 13, 2020 2:09 PM | Permalink | Reply to this

Re: Making Life Hard For First-Order Logic

Jacobs claims

the facility for compact expression of complex facts is what makes DTT so attractive.

Dependent types give

…the logic increased expressive power. Our favourite example is the result which says that every injective endofunction on a finite set is surjective. Using such a ‘dependent logic’, it can be expressed simply as

$$n :: Nat,\; f :: Nat[n] \to Nat[n] \mid injective(f) \vdash surjective(f).$$

I guess the idea is that under the proposed translation this judgement becomes particularly messy.
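For comparison, a hedged Lean 4 rendering of Jacobs’ judgement, with Lean’s `Fin n` standing in for $Nat[n]$, and assuming Mathlib’s finiteness lemma is available under the name used below:

```lean
import Mathlib

-- Every injective endofunction on a finite set is surjective. The type
-- Fin n → Fin n depends on the earlier context entry n : Nat, which is
-- exactly the compact expression that dependent types afford.
example (n : Nat) (f : Fin n → Fin n)
    (hinj : Function.Injective f) : Function.Surjective f :=
  Finite.injective_iff_surjective.mp hinj
```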

Posted by: David Corfield on September 14, 2020 2:02 PM | Permalink | Reply to this

Re: Making Life Hard For First-Order Logic

Yeah, I guess we might say that.

Posted by: Mike Shulman on September 14, 2020 9:41 PM | Permalink | Reply to this

Re: Making Life Hard For First-Order Logic

As a classical model theorist trying to learn about type theory, I find the hostility towards first-order logic in this post more than a little off-putting.

Posted by: James Hanson on October 18, 2023 11:10 PM | Permalink | Reply to this
