The formal approach to meaning:
Formal semantics and its recent developments

[This paper is in Journal of Foreign Languages (Shanghai), 119:1 (January 1999), 2-20.]

Barbara Abbott
Michigan State University
June 1998
abbottb@pilot.msu.edu

Like Spanish moss on a live oak tree, the scientific study of meaning in language has expanded in the last 100 years, and continues to expand steadily. In this essay I want to chart some central themes in that expansion, including their histories and their important figures. Our attention will be directed toward what is called 'formal semantics', which is the adaptation to natural language of analytical techniques from logic.[1] The first, background, section of the paper will survey the changing attitudes of linguists toward semantics into the last third of the century. The second and third sections will examine current formal approaches to meaning. In the final section I will summarize some of the common assumptions of the approaches examined in the middle sections of the paper, sketch a few alternatives, and make some daring predictions.

'Meaning' is a broad term that can encompass any aspect of the potential for cognitive or emotive impact of speech on interlocutors. However, in linguistic semantics these days the cognitive aspects are the center of focus. On the other hand, the traditional distinction between semantics, as the study of the relation between linguistic expressions and what they are used to talk about (people and things, events and situations, etc.), and pragmatics, as the study of anything involving the use of language, has become less certain and is in fact lost in several different current approaches.

1. Background

1.1 The Bloomfieldian era. Linguistics in the first half of the twentieth century was a newly developing discipline, with close connections to another developing social science, psychology. In the United States (and elsewhere) dominant figures in psychology were striving to implement the principles of British empiricist philosophy, and especially logical positivism, which stressed attention to objective observable data in formulating scientific theories. Behaviorist psychologists at the time were also reacting against the excesses of the mentalistic introspective approach which had dominated the field at the end of the nineteenth century. Leonard Bloomfield, who was the most influential figure in linguistics in the United States in the first half of the twentieth century, was strongly influenced by behaviorism. The beginning of the chapter of his classic text Language which is titled 'Meaning' reveals this influence:

We have defined the meaning of a linguistic form as the situation in which the speaker utters it and the response which it calls forth in the hearer. ... In order to give a scientifically accurate definition of meaning for every form of a language, we should have to have a scientifically accurate knowledge of everything in the speakers' world. The actual extent of human knowledge is very small compared to this. ...

The statement of meanings is therefore the weak point in language-study, and will remain so until human knowledge advances very far beyond its present state. (Bloomfield 1933, 139-140)

Bloomfield's 'stimulus-response' model of meaning was as impractical as it was suited to his theoretical orientation. As Bar-Hillel described it, Bloomfield 'deplored the mentalistic mud into which the study of meanings had fallen, and tried to reconstruct [the field of linguistics] on a purely formal-structural basis' (Bar-Hillel 1954, 234-235).

Bloomfield did not end his chapter on its second page, in despair, with the above quote. He did find a way to talk, however briefly and informally, about the arbitrariness of meaning, polysemy and homonymy, semantic features, narrowing and broadening of word meaning, connotations of style and slang, and taboo words, though not always using these terms. (It is significant that Bloomfield had nothing at all to say about sentence meaning.) However the constraints of the crude behaviorist view of meaning he shared with other linguists of the time did prove to be a strong barrier to the development of linguistic semantics, a barrier which continued into the Chomskyan era.

1.2 The Chomskyan revolution. In 1957 a little book named Syntactic Structures was published by an obscure Dutch press, but was reviewed glowingly and at great length (33 pages, to be exact) by Robert B. Lees in Language -- the journal of the Linguistic Society of America. Noam Chomsky's revolution in linguistics had begun. Probably Chomsky's most important contribution, from the perspective of the future development of linguistic semantics, was the institution of the generative conception of grammar, on which the goal of the grammarian was not to simply catalog elements from a corpus, or fixed quantity, of observed utterances, but rather to construct a completely explicit formula that would generate, or characterize exactly, all, and most importantly only, the infinitude of sentences of the language. Besides the notable consequence of putting syntax at the center of linguistics, where formerly it had stood quietly at the back door, this change in goals would eventually help to draw the attention of semanticists toward the problem of describing explicitly how the meanings of sentences are derived from the meanings of the words that make them up. However that development would have to wait for a few years, since the scientific study of semantics was still in the vexed state it had been in in Bloomfield's day. The primary issue about meaning at the time was whether or not intuitions about meaning should play any role in determining grammatical (= syntactic, morphological, or phonological) analysis. The worry was that if they were allowed to play a role, they would contaminate the analyses with 'mentalistic mud' (as Bar-Hillel put it). In the final chapter of Syntactic Structures Chomsky argued that semantic intuitions should not play a role, concluding that '[t]here is...little evidence that "intuition about meaning" is at all useful in the actual investigation of linguistic form' (Chomsky 1957, 94).

In the decades following, linguists found themselves unable to resist looking at meaning. Already in the review article mentioned above, Lees had speculated about whether 'it could be shown that all or most of what is "meant" by a sentence is contained in the kernel sentences from which it is derived' (Lees 1957, 394). A couple of developments in syntactic analysis[2] made possible the main tenet of the school of thought called generative semantics: that the deep structure of a sentence constitutes a representation of its meaning. Following the work of linguists such as Fillmore, Postal, McCawley, Ross, and G. and R. Lakoff, deep structures took on some of the aspects of a representation in first order predicate logic, though in tree form. Negation, quantifiers, and adverbs were analyzed as sentence operators ('higher predicates'), in order to represent ambiguities such as those in (1)-(3):

(1) Every woman loves one man.

    a. There is one man that every woman loves.
    b. Every woman loves some man or other.

(2) Everyone didn't enjoy the play.

    a. At least one person did not enjoy the play.
    b. No one enjoyed the play.

(3) Mary isn't working in the garden because it's sunny.

    a. Because it's sunny, Mary isn't working in the garden.
    b. It is not because it's sunny that Mary is working in the garden (but for some other reason).

Thus (1), for example, would be assigned two deep structures, roughly as in (4).

(4) a. [[one man]y [[every woman]x [x loves y]]]
    b. [[every woman]x [[one man]y [x loves y]]]
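The two readings schematized in (4) can be checked mechanically against a toy model. The following Python sketch is purely illustrative (the model, with two women, two men, and a loving relation, is invented here and is not drawn from the generative semantics literature):

```python
# Toy model: two women, two men, and a "loves" relation.
women = {"w1", "w2"}
men = {"m1", "m2"}
loves = {("w1", "m1"), ("w2", "m2")}  # each woman loves a different man

# Reading (4a): there is one particular man whom every woman loves.
reading_a = any(all((w, m) in loves for w in women) for m in men)

# Reading (4b): every woman loves some man or other.
reading_b = all(any((w, m) in loves for m in men) for w in women)

print(reading_a)  # False in this model
print(reading_b)  # True in this model
```

Since the two readings come apart in this model, the ambiguity is genuinely truth-conditional, which is what motivated assigning (1) two distinct deep structures.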

However generative semanticists at the time did not worry about the task of assigning explicit truth conditions to deep structures. Rather the devising and justification of particular deep structures was seen as the end of the job of semantics.

Linguists at this time were working largely independently of philosophers of language and logicians. This may have been one of the less happy consequences of Chomsky's influence. In the article cited above, Bar-Hillel suggested that linguists pay attention to developments in logic and try to incorporate a formal account of semantics into their grammar, but Chomsky's rather sharp reply the following year asserted that 'the relevance of logical syntax and semantics' to the study of natural languages 'is very dubious' (Chomsky 1955, 36). This reaction of Chomsky's was explicitly based on the not uncommon idea that natural languages and formal languages are fundamentally different from each other, but it was also very much in tune with his larger project of overthrowing empiricist philosophy of language and mind in favor of a return to Cartesian rationalism, as well as his personal style of publicly expressed arrogance and disdain for the work of others. Chomsky's first crops of linguistics Ph.D.'s began to appear in the mid 1960's, and thereafter increasing numbers of American linguists were taught by linguists who had been taught by Chomsky himself. These students, and their students, tended to inherit the idea that little of substantial value had been said about language in the centuries immediately prior to Chomsky. In the late 1960's and early 1970's several developments altered this picture.

1.3. Montague and formal semantics. The just mentioned assumption that natural languages like English and Chinese are fundamentally different from the formal languages devised by logicians was a cornerstone of a twentieth century dispute within British empiricist philosophy between formalists, who held that natural languages were too riddled with vagueness and ambiguity to be useful philosophical tools, and ordinary language philosophers, who held that natural languages not only could be excellent tools if used carefully, but also were rich repositories of the wisdom of generations of speakers. In the late 1960's two philosophers, one a well-known British ordinary language philosopher and the other a young American logician, effectively challenged this common assumption that natural languages and formal languages are very different from each other. H. Paul Grice, in his William James lecture series delivered at Harvard University in 1967, presented a systematic account of what he argued were only apparent divergences between a number of logical expressions and their natural language counterparts. At roughly the same time[3] Richard Montague, in a series of papers with titles like 'English as a formal language' and 'Universal grammar', was making good on the following bold statement: 'There is in my opinion no important theoretical difference between natural languages and the artificial languages of logicians; indeed, I consider it possible to comprehend the syntax and semantics of both kinds of languages within a single natural and mathematically precise theory' (Montague 1970b, 222).[4]

Montague's papers are highly formal and condensed, very difficult for ordinary humans (even logicians!) to read with comprehension. Fortunately it happened that a young linguist named Barbara Partee, an exceptionally intelligent and clear-thinking as well as personable individual who was in Chomsky's first (1965) class of Linguistics Ph.D.'s from MIT, took a job at UCLA, met Montague there, and developed an interest in his approach to natural language and its contrast with Chomskyan transformational grammar. In lectures in the early 1970's, especially following Montague's untimely death early in 1971, Partee presented his work in such a way as to make it both comprehensible and appealing. 1974 was the fiftieth anniversary of the Linguistic Society of America and that summer a special Golden Anniversary Linguistic Institute was held at the University of Massachusetts at Amherst, where Partee was now on the faculty. Partee's class on Montague Grammar was one of the highlights of this stellar Institute, and was attended by many prominent linguists. Her 100 page article 'Montague Grammar and Transformational Grammar', which contained a kind of 'do it yourself' kit for learning formal semantics, appeared in Linguistic Inquiry in 1975, and served as a kind of introductory text until the excellent volume by Dowty, Wall & Peters appeared in 1981.

Linguists were not entirely ignorant of relevant work in logic and philosophy of language at this time, but there was not the kind of interaction that there is today. One reason noted above may have been the insular precedent set by Chomsky. Another may have been the personalities of the generative semanticists, who would have been expected to be the linguists most interested in developments in logic and philosophy of language. In August of 1969 the philosophers Donald Davidson and Gilbert Harman organized a small colloquium of logicians and linguists in an effort to promote more fruitful interactions, but Quine (one of the participants) remarked in his condensed autobiography that '[t]he colloquium was a fiasco at bridge building' (Quine 1986, 38), and suggested that the personalities involved were the cause. However the volume that resulted from this small conference, Davidson & Harman 1972, contained many classic articles (including contributions from both Partee and Montague) which were widely read by linguists as well as philosophers, and ultimately the work of Montague and Partee along with linguistically inclined philosophers like David Lewis and Robert Stalnaker had and continue to have a tremendous impact on the field.

Probably the most important byproduct of this interaction was that linguists became very aware of the fact that simply to represent meaning is not to give an analysis of it. This point was made most effectively by David Lewis, who criticized Katz and Postal's system of semantic representation in terms of semantic features which they called 'markers' (Katz & Postal 1964). Lewis pointed out that Katz and Postal were merely giving rules for translating English into an artificial language that might be called 'Markerese', and he said:

we can know the Markerese translation of an English sentence without knowing the first thing about the meaning of the English sentence: namely, the conditions under which it would be true. Semantics with no treatment of truth conditions is not semantics. (Lewis 1972, 169)

By studying ordinary predicate logic as well as Montague's more specialized work, linguists became familiar with truth conditional model-theoretic semantics, in which interpretations for expressions, including truth conditions for sentences, are assigned relative to a model.

From the early 1970's to the present time, linguists and philosophers have worked closely and fruitfully together, attending and presenting papers at each other's conferences and publishing in each other's journals, and work in semantics, and especially formal semantics, has flourished in the United States. The journal Linguistics and Philosophy, which describes itself as focusing 'on issues related to structure and meaning in natural language as addressed in the philosophy of language, linguistic semantics, syntax and related disciplines', published its first issue in 1977 and is now in its 21st volume. Other journals devoted to semantics have also begun to appear -- Journal of Semantics (which started in 1984), Natural Language Semantics (1993) -- as well as the prestigious conference series Semantics and Linguistic Theory (SALT), which publishes the proceedings of its annual meetings and is now (1998) in its eighth year. A number of linguists and philosophers have joint academic appointments in Linguistics and Philosophy (among them Richmond Thomason, Barbara Partee, and myself) and the linguistics program at MIT is housed in the Department of Linguistics and Philosophy, though it should be noted that Chomsky's resistance to formal semantics has continued unabated.

2. Current formal semantics: quantification.

We will begin our investigation of the current scene in American linguistic semantics with a closer look at Montague's work, including some of the problems he was able to formalize solutions to. In this work the interpretation of noun phrases takes center stage, and that will continue when we look at other analyses of quantification in natural language. Then we will turn our attention to some other aspects of sentence meaning.

2.1. Montague Grammar. The papers of Montague's cited above deal with 'fragments' of English. Montague's aim was not to construct a grammar for the whole language, but rather to give a complete (and completely explicit) syntax and semantics for an infinite subpart of the language which contained some constructions which pose interesting challenges for the semantician. Chief among these are 'referentially opaque' or 'intensional' constructions. 'Referential opacity' is the term coined by Quine 1953 for the failure of substitution of coreferential expressions (expressions which refer to the same thing) to preserve truth in certain contexts. (See also Quine 1956 for an excellent introduction to this problem.) One major group of referentially opaque contexts consists of sentences about propositional attitudes, or people's psychological attitudes (such as belief, desire, hope, fear, knowledge) towards situations or states of affairs. (5a) below can be true and (5c) false, despite the fact that the truth of (5b) means that the NPs Jocasta and Oedipus's mother are coreferential.

(5) a. Oedipus wanted to marry Jocasta.
    b. Jocasta was Oedipus's mother.
    c. Oedipus wanted to marry his mother.

Frege 1892 had argued that associated with expressions is a sense (Sinn) as well as a reference or denotation (Bedeutung), and that in referentially opaque contexts expressions denote their sense instead of their customary reference. Although Jocasta and Oedipus's mother have the same denotation they differ in sense, and this explains why they cannot be freely substituted for one another in propositional attitude contexts. Montague's semantics formalized Frege's solution using the notion 'intension', which is a formal analysis of the Fregean concept of sense developed by Carnap, Kripke, Montague and others. Intensions are functions from possible worlds, or possible states of affairs, to denotations or referents (the latter also known as extensions).[5]
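The idea of intensions as functions from possible worlds to extensions can be made concrete with a small sketch. In the following Python illustration (the two-world model and the alternative referent are my invention for expository purposes, not part of the Oedipus story as Frege or Montague discussed it), the two expressions agree in extension at the actual world but differ in intension:

```python
# Hypothetical two-world model: an intension maps each world to a referent.
worlds = ["actual", "counterfactual"]

def jocasta(world):
    # A name, treated here as picking out the same individual in every world.
    return "jocasta"

def oedipus_mother(world):
    # A description: its referent depends on the facts of the world.
    return "jocasta" if world == "actual" else "someone_else"

# Same extension at the actual world...
print(jocasta("actual") == oedipus_mother("actual"))         # True
# ...but distinct intensions, since the referents diverge at some world:
print(all(jocasta(w) == oedipus_mother(w) for w in worlds))  # False
```

Because an opaque context is sensitive to the whole function rather than to its value at the actual world, substituting one expression for the other is not guaranteed to preserve truth there.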

'Montague Grammar' came to denote the style of grammar presented in Montague 1973, which has three components: a syntax generating a fragment of English (which in Montague 1973 included sentences about propositional attitudes), a complete syntax and model-theoretic semantics for a tensed intensional logic, and a set of translation rules mapping parsed English expressions to expressions of the intensional logic. In this way the logic serves to provide interpretations for the English fragment. The intensional logic was included in this paper for perspicuity; in 'English as a formal language', Montague interpreted an English fragment directly.

2.2. Generalized quantifiers. Chomsky has impressed linguists with the importance of accounting for what he calls the 'creative' aspect of human language -- the fact that we are able to produce and comprehend an unlimited number of novel utterances, sentences that we have never heard before. Compositionality is the semantic property of linguistic expressions that we assume is an essential part of the explanation for this miraculous ability. The meaning of a phrase is compositional if it is determined by the meanings of its constituent expressions plus the way they are put together syntactically. Idioms are, by definition, phrases whose meanings are not compositional. If all of language were idiomatic in this sense, then language would have no creative aspect in Chomsky's sense.

The formal languages of logic are strongly compositional, which means that expressions of a given syntactic category all receive the same type of interpretation and contribute in the same way to the interpretation of larger expressions of which they form a part. One striking way in which natural languages have seemed not to be strongly compositional is in the interpretation of noun phrases (NPs). NPs that are proper names, like Mary, have the same distributional properties as quantified NPs like every student, and sentences like those in (6) share their syntactic structure:

(6) a. Mary talks.
    b. Every student talks.

However, while it is natural to say of (6a) that it is true just in case the individual denoted by the name Mary belongs to the set of entities that talk, a parallel analysis is not possible for (6b), and traditionally sentences like (6a) have received very different translations into first order predicate logic from sentences like (6b), as seen in (7).

(7) a. Talks(m)
    b. " x [Student(x) -> Talks(x)]

Probably the most impressive and far-reaching innovation in Montague's semantics came about because of his need to solve this problem, and that was the introduction of the generalized quantifier analysis of NPs. A generalized quantifier (GQ) is (an expression denoting) a set of subsets of some domain; on this view the traditional existential quantifier would be interpreted as the set of all non-empty subsets of the domain of discourse. Taking NPs to denote GQs, Mary would be interpreted as the set containing all and only those sets which have the person Mary as a member, and Every student would denote the set containing all and only supersets of the set of students. In this way the subject NPs of (6a) and (6b) can both be seen as taking the predicate as an argument, and each sentence is true if and only if the set of entities that talk belongs to the GQ denoted by the subject NP.[6]
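As a minimal extensional illustration of the GQ analysis (the finite model is invented here, and Montague's own system is intensional and far richer), both kinds of subject NP can be coded as functions taking the predicate's denotation, a set, as argument:

```python
# Toy model
students = {"mary", "sue"}
talkers = {"mary", "sue", "john"}

def gq_mary(pred):
    """GQ for the name 'Mary': true of exactly the sets containing Mary."""
    return "mary" in pred

def gq_every_student(pred):
    """GQ for 'every student': true of exactly the supersets of the student set."""
    return students <= pred

# Both subject NPs apply uniformly to the predicate set:
print(gq_mary(talkers))           # (6a) Mary talks.          -> True
print(gq_every_student(talkers))  # (6b) Every student talks. -> True
```

The point of the uniformity is that (6a) and (6b) now receive parallel compositional analyses: in each case the sentence is true iff the predicate set belongs to the GQ denoted by the subject.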

Besides enabling Montague to solve the strong compositionality problem, the GQ analysis of NPs allows a number of other improvements. Some of these were observed and made use of by Montague; for example the generalization of conjunction and disjunction, which in ordinary predicate logic are strictly sentence operators, to apply also to NPs (as well as verb phrases). Following Montague's death other linguists and philosophers, beginning with the seminal work of Barwise & Cooper 1981, explored other avenues opened by the GQ approach and many papers and books have appeared detailing the results (see e.g. Gärdenfors 1987 and Bach et al. 1995, and the works cited there). A major thrust of this work is a change of focus from the NP to the determiner, which under the GQ approach can be easily treated categorematically (unlike the traditional logic syncategorematic analysis of quantifier expressions) and analyzed as denoting a function from sets (common noun denotations) to sets of sets (GQs) or, equivalently, as expressing a relation between sets -- the set denoted by the common noun it combines with and the set denoted by the predicate. This shift in focus is very much in tune with current trends in syntax, where increasingly function morphemes have taken center stage, so that clauses are now taken to be complements of complementizers and so part of a CP category, and nominal phrases are complements of determiners and so part of a DP category.

A number of formal semantic properties of quantified NPs came to light under the GQ approach. One of the most widely cited concerns 'entailingness' or monotonicity. In general an operator is upward entailing if a sentence containing it entails a sentence where the operator's argument is replaced with superset of its original argument, and downward entailing if the entailment goes in the other direction. Viewing quantificational determiners as functions from sets to GQs, and GQs as functions from sets to truth values, we have two operators to consider -- the determiner and the GQ. The determiner every is downward entailing, as shown by the fact that (8a) entails (8b):

(8) a. Every dog barks.
    b. Every spotted dog barks.

On the other hand the GQ Every dog is upward entailing, as seen by the fact that (9b) entails (9a):

(9) a. Every dog barks.
    b. Every dog barks loudly.

Both some and some dog are upward entailing, while no and no dog are both downward entailing, as the reader may confirm by substituting them for every in the examples in (8) and (9).
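The reader's substitutions can also be carried out mechanically. The following Python sketch (an illustration of the relation-between-sets view of determiners, using an invented three-element domain) verifies the monotonicity properties by brute force over all subsets:

```python
from itertools import chain, combinations

# Determiners as relations between the noun set A and the predicate set B.
def every(A, B): return A <= B
def some(A, B): return bool(A & B)
def no(A, B): return not (A & B)

domain = frozenset({"a", "b", "c"})
subsets = [frozenset(s) for s in chain.from_iterable(
    combinations(domain, n) for n in range(len(domain) + 1))]

def downward_in_first(det):
    """Downward entailing in the first argument: shrinking A preserves truth."""
    return all(det(A2, B)
               for A in subsets for B in subsets if det(A, B)
               for A2 in subsets if A2 <= A)

def upward_in_second(det):
    """Upward entailing in the second argument: growing B preserves truth."""
    return all(det(A, B2)
               for A in subsets for B in subsets if det(A, B)
               for B2 in subsets if B <= B2)

print(downward_in_first(every))  # True:  (8a) entails (8b)
print(upward_in_second(every))   # True:  (9b) entails (9a)
print(downward_in_first(some))   # False: 'some' is upward entailing here
```

Checks like these generalize to any determiner definable as a relation between sets, which is part of what made the GQ framework such a productive research tool.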

Negative polarity items are expressions like any and ever that are limited in their occurrence; they occur naturally in negative sentences, and in some other environments, but an exact statement of the constraint has proved elusive. An appealing hypothesis is that they occur exactly in downward entailing environments (see Ladusaw 1983), and this is confirmed by the examples in (10) - (12).[7]

(10) a. *Some dog who ever smiles barks.
        b. *Some dog barks at anyone.

(11) a. No dog who ever smiles barks.
        b. No dog barks at anyone.

(12) a. Every dog who ever smiles barks.
        b. *Every dog barks at anyone.

Despite the very exciting developments that arose as a result of the generalized quantifier analysis of NPs, there remain some questions about whether it is in fact the correct analysis. One challenge which we will look at in the next subsection has come, in part, from one of Partee's own students, and in connection with a new approach which considerably blurs the traditional distinction between semantics, as the study of words-world relations abstracting away from contexts of utterance, and pragmatics, as the study of the effects of context on interpretation.

2.3. Discourse representation and file change semantics. In the early 1980s a rather different approach to natural language quantification was proposed independently by Irene Heim, at the time one of Barbara Partee's students at the University of Massachusetts at Amherst, and by Hans Kamp, a philosopher and logician. (See Heim 1982, 1983, and Kamp 1984.) One problem which Kamp and Heim were concerned with was providing an adequate analysis of what are called 'donkey sentences', as in (13):[8]

(13) Every farmer who owns a donkey beats it.

Such sentences actually present two problems. The first concerns the interpretation of the pronoun it. If we represent a donkey in traditional predicate logic, using the existential quantifier, then the it (represented by the final occurrence of y in (14)) will be outside the scope of that quantifier and will not be appropriately bound:

(14) " x [[Farmer(x) & $ y [Donkey(y) & Own(x,y)]] -> Beat(x,y)]

If, on the other hand, we use a universal quantifier for a donkey as in (15),

(15) " x " y[[Farmer(x) & Donkey(y) & Own(x,y)] -> Beat(x,y)]

we get a correct representation of the meaning of (13) but we have to explain how a donkey should sometimes be represented with a universal quantifier, but not other times, e.g. (16):

(16) Mary owns a donkey.

Discourse Representation Semantics (DRS) (Kamp) and File Change Semantics (FCS) (Heim) both solved this problem with an approach to semantics which views the meaning of a sentence in terms of the impact an utterance of it has on the discourse of which it is a part, in other words in terms of its context change potential. This is an approach that had earlier been urged by Stalnaker, in connection with the problem of presuppositions and presupposition projection. (See Stalnaker 1974, 1978.) Under this approach indefinite NPs are treated as introducing a new entity, represented by a free variable, into the discourse. (Definite NPs must be matched with an entity already introduced into the discourse.) When they occur in simple sentences like (16) they receive an existential interpretation in view of the semantic rule for interpreting the entire discourse -- roughly, the discourse is true if there is a sequence of individuals meeting all the conditions that have been mentioned. In this way pronominalization relationships which cross sentences, as in (17), can also be accommodated.

(17) Mary owns a donkey. It always brays when it wants to be fed.

The pronouns in (17) are beyond the capability of traditional formal semantics, which follows traditional logic in providing interpretations sentence by sentence.

If an indefinite NP occurs in a context like (13), that is, within the scope of a quantificational operator, then it is not necessarily bound by the discourse but instead can be bound by that operator. Let's look more closely to see how this happens. In the discourse oriented view of semantics, quantification breaks down into a three part, or tripartite, structure. The first element of the structure is the quantificational operator, the second element includes any restriction on the range of quantification, and the third element (often called the 'scope') is the actual assertion associated with the quantifier. If indefinite NPs fall within the restrictive portion of a quantificational structure, they inherit binding by whatever quantificational operator is involved. So (13) receives a representation as in (18), where Q stands for 'Quantificational operator', R stands for 'Restriction', and S stands for 'Scope':

(18) Q[every: x, y] R[Farmer(x), Donkey(y), Owns(x,y)] S[Beats(x,y)]
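A rough computational rendering of the tripartite structure (18) may be helpful. In this Python sketch (the toy model is invented here), (13) is true just in case every pair satisfying the Restriction also satisfies the Scope, so the indefinite inherits universal force:

```python
from itertools import product

# Toy model for (13): Every farmer who owns a donkey beats it.
domain = {"f1", "f2", "d1", "d2"}
farmers = {"f1", "f2"}
donkeys = {"d1", "d2"}
owns = {("f1", "d1"), ("f2", "d1"), ("f2", "d2")}
beats = {("f1", "d1"), ("f2", "d1"), ("f2", "d2")}

def restriction(x, y):
    # R[Farmer(x), Donkey(y), Owns(x,y)]
    return x in farmers and y in donkeys and (x, y) in owns

def scope(x, y):
    # S[Beats(x,y)]
    return (x, y) in beats

# Q[every: x, y]: every pair meeting the Restriction meets the Scope.
sentence_true = all(scope(x, y)
                    for x, y in product(domain, repeat=2)
                    if restriction(x, y))
print(sentence_true)  # True in this model
```

Note that no separate quantifier is ever attached to a donkey itself; its apparent universal force comes entirely from the operator whose restriction it sits in.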

One of the attractive features of this approach is that it can also handle examples of adverbial and adjectival quantification which were pointed out by David Lewis (see Lewis 1975). Notice that (19) below means the same thing as (13), and would also be represented by (18), but this time the universal quantification is expressed by the adverb, and there are two indefinite NPs -- a farmer and a donkey -- to fall within its scope.

(19) Invariably, if a farmer owns a donkey he beats it.

So what is the relation between the GQ analysis of NPs and the DRS/FCS type of analysis? Barbara Partee, ever the unifier, argued that both are correct, but possibly for different kinds of NPs and different kinds of contexts. In Partee 1986 she argued that indefinite NPs in fact need three different types of representations, depending on the context in which they occur. Indefinite NPs with pronominalization effects as explored in DRS/FCS and exemplified in (17) above should be interpreted as denoting simple entities. Indefinites that function as predicate nominals, as in (20), should be analyzed as denoting sets of things, here, the set of students.

(20) Mary is a student.

And the indefinite NP in (21) needs to be regarded as denoting a GQ, since it is conjoined with a quantificational NP:

(21) One student and all the teachers appeared at the rally.

This is not the end of the story, however; see Bach et al. 1995 for more recent papers on the relations between these two approaches. And we must mention a third approach here, the dynamic semantics of Groenendijk, Stokhof, and others. (See Groenendijk & Stokhof 1991, Groenendijk, Stokhof & Veltman 1996.) Expressing a concern about the lack of attention to compositionality in the DRS/FCS approaches, Groenendijk and Stokhof have explored a modification of traditional predicate logic which will be able to interpret donkey sentences and cross-sentence anaphora. It is possible to equate the interpretation of a sentence in traditional predicate logic with the set of assignments of values to variables which will satisfy it. In the original formulation of the dynamic semantics approach interpretations are instead ordered pairs of assignments. Successive sentences in a discourse carry over information from previous assignments, so that examples like (17) receive the proper interpretation. In conditional sentences, which donkey sentences are formally, the same property holds between antecedent and consequent, so that in a logical form like (14) the rightmost occurrence of the y variable will be bound by the existential quantifier. This basic approach is modified and elaborated in Groenendijk et al. 1996.
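The flavor of the dynamic approach can be suggested with a small sketch. In the following Python illustration (a severe simplification of Groenendijk and Stokhof's system, with an invented model), a sentence meaning maps an input assignment to a set of output assignments, and sequencing threads outputs into inputs so that cross-sentence anaphora as in (17) goes through:

```python
# Toy model
donkeys = {"eeyore"}
owns = {("mary", "eeyore")}
brays = {"eeyore"}

def s1(g):
    # "Mary owns a donkey": the indefinite extends the input assignment g
    # with a value for the new variable y, one output per verifying choice.
    return [dict(g, y=d) for d in donkeys if ("mary", d) in owns]

def s2(g):
    # "It brays": a test on the incoming assignment; it passes g through
    # unchanged if the condition holds, and otherwise yields no outputs.
    return [g] if g.get("y") in brays else []

# Sequencing: feed each output assignment of s1 into s2, so the pronoun
# in the second sentence can pick up the variable introduced in the first.
outputs = [h for g in s1({}) for h in s2(g)]
print(bool(outputs))  # True: the discourse (17) is accepted
```

On this view the meaning of a sentence just is this input-output behavior, which is why the approach is called dynamic.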

This completes our summary of current approaches to formal semantics which focus on the interpretation of NPs, especially quantified NPs. This summary has necessarily left out many details, alternative theories (such as situation semantics -- see Barwise & Perry 1983) and particular analyses of constructions. For more information, see the many excellent papers in Lappin 1996. We turn now to look at some other aspects of sentence interpretation.

3. Aspects of eventualities.

As noted above, by far the most attention in formal semantics has been paid to the interpretation of NPs. However philosophers and linguists have also been drawn to consider other aspects of sentence interpretation and now we will look at some of these. We will begin with a problem noticed by Donald Davidson, and that will lead us to consider the nature of different kinds of eventualities as well as some more complexities of NP interpretation.

3.1. Davidson's 'event' semantics. Davidson (1967) considered the fanciful example in (22):

(22) Jones buttered the toast with a knife in the bathroom at midnight.

In traditional predicate logic, clauses are represented as a predicate plus its arguments -- one corresponding to each NP of the corresponding English sentence. There are 5 NPs in (22) (Jones, the toast, a knife, the bathroom, midnight); hence to represent this sentence in traditional logic we would need a 5-place predicate, something like Butter-with-in-at, to go with five arguments corresponding to these five NPs. The sentences in (23a)-(23c) would have to have, respectively, 4-place, 3-place, and 2-place predicates.

(23) a. Jones buttered the toast with a knife in the bathroom.
        b. Jones buttered the toast with a knife.
        c. Jones buttered the toast.

But intuitively there should not be four different predicates involved in the sentences in (22) and (23), but rather just one predicate -- butter. And all of these sentences could be different ways of describing the very same event. To put the problem in more formal terms (the way Davidson described it), (22) entails each of the sentences in (23) (and they each entail the ones below them), but these semantic relations could not be captured in traditional predicate logic.

What Davidson proposed by way of a solution was to recognize events as a kind of entity -- that is, to add events to the other things (people, dogs, chairs, etc.) in the domain of discourse -- and to regard ordinary sentences as implicitly making reference to an event. Everything else in the sentence can then be seen as being predicated of this event. So (22) would introduce an event which is a 'Jones buttering the toast' type of event, and this very event has other properties -- it occurred with a knife and in the bathroom, etc. The logical form of (22), according to Davidson, is something like (24),[9] where e is a special variable over events:

(24) ∃e [Butter(Jones, the toast, e) & With(a knife, e) & In(the bathroom, e) & At(midnight, e)]

The logical form for (23c) would be (25).

(25) ∃e [Butter(Jones, the toast, e)]

(25) is entailed by (24) as well as by the Davidsonian logical forms for (23a) and (23b).
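On Davidson's analysis the entailments follow by simple conjunction elimination. In the toy sketch below (my own illustration, not from the paper), each logical form is treated as a set of conjuncts about the event variable, and entailment amounts to the subset relation.

```python
# Illustrative sketch: if Davidsonian logical forms are existentially
# closed conjunctions of predications of one event variable e, then
# (24) entails (25) by mere conjunct-dropping, i.e. the conjunct set
# of (25) is a subset of that of (24).

lf_22 = {("Butter", "Jones", "the toast", "e"),
         ("With", "a knife", "e"),
         ("In", "the bathroom", "e"),
         ("At", "midnight", "e")}                  # (24)
lf_23a = {("Butter", "Jones", "the toast", "e"),
          ("With", "a knife", "e"),
          ("In", "the bathroom", "e")}
lf_23b = {("Butter", "Jones", "the toast", "e"),
          ("With", "a knife", "e")}
lf_23c = {("Butter", "Jones", "the toast", "e")}   # (25)

def entails(lf1, lf2):
    # Dropping conjuncts from an existentially closed conjunction
    # preserves truth, so lf1 entails lf2 iff lf2's conjuncts are
    # a subset of lf1's.
    return lf2 <= lf1

print(entails(lf_22, lf_23c))   # True: (24) entails (25)
print(entails(lf_23c, lf_22))   # False: not vice versa
```

This is exactly the pattern traditional predicate logic could not capture with four unrelated predicates of different arities.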

It took a while after their introduction for linguists to become aware of Davidson's proposals, but more recently they have received a great deal of attention. However, if we take the term 'event' seriously, it is not clear that all sentences should be seen as making implicit reference to an event. Not all sentences describe events. The sentences in (26) would all be called 'stative' -- they describe relatively unchanging states of affairs rather than happenings.

(26) a. Joyce knows Arabic.
        b. Four divided by two equals two.
        c. Dogs make good companions.

Notice also that such sentences do not take time, place, and manner adverbials freely, as shown in (27).

(27) a. ?Joyce knows Arabic at midnight.
        b. ?Four divided by two equals two in the bathroom.
        c. ?Dogs make good companions with a knife.

Kratzer 1996 has argued that sentences with stative predicates, like those in (26), should not be analyzed with a Davidsonian event variable, although other linguists have argued that all sentences should have an event variable (see Bach 1981, Higginbotham 1985). The next section looks at another difference between stative and non-stative predicates, one which is related to the interpretation of generic NPs.

3.2. Generic NPs. Sentences like those in (28) present an interesting puzzle:

(28) a. Dogs are excellent companions.
        b. Dogs are barking outside my window.

Though the same word -- dogs -- functions as subject in both, it seems to refer to two different things. (28a) is a statement about dogs in general, perhaps all dogs, while (28b) talks about some specific dogs, perhaps only two or three. Greg Carlson (another of Barbara Partee's students!) had a crucial insight in proposing a solution to this puzzle. He shifted attention from the subject to the predicate and saw that the apparent distinction in NP interpretation correlated well with a difference in whether the verb phrase expressed a permanent property, or instead a more temporary property, of the subject. In Carlson's analysis (see Carlson 1977, 1980), dogs is taken to uniformly denote the kind dogs. Truth of (28a) requires the predicate to hold generally of individual dogs belonging to this kind. The predicate of (28b) on the other hand introduces (existential) quantification over temporal stages of individual dogs -- a concept inspired by Quine 1960.

Although particular aspects of this analysis have been disputed (see Carlson & Pelletier 1995 for some current views of generics), Carlson's distinction between individual level and stage level predication has proved to have far reaching significance. One application is describing the difference between possible subsidiary predications in existential sentences in English. (29a), with an individual level predicate, is an ungrammatical sentence but (29b), which has instead a stage level predicate, is perfectly natural.

(29) a. *There are dogs excellent companions.
        b. There are dogs barking outside my window.

Carlson's individual level predicates are all stative predicates, and the stage level predicates seem to be nonstative, but that leaves many unanswered questions. Are there just two types of eventualities? If not, what other kinds are there, and how are the different categories defined? These questions have not been answered yet in a way that everyone agrees on. We will look at some proposals in the next section.

3.3. Types of eventualities. The German word Aktionsarten (singular Aktionsart) is now commonly used in the study of different types of eventualities, to distinguish aspect in this sense from the aspectual markers found on verbs in inflecting languages. Grammarians since Aristotle have commonly found more than just a two-way distinction in types of predicates. Aristotle himself pointed to a three-way distinction among states like knowing Arabic, which are relatively unchanging, processes like running, in which there is activity of some kind going on, and actions like building a house, which have a natural culmination or termination point. The latter are now commonly referred to as telic eventualities.

Zeno Vendler, one of the earliest philosophers to pay serious attention to the kinds of linguistic evidence that motivates linguists, divided Aristotle's telic eventualities into two subcategories -- accomplishments like building a house, which are volitional and take some time to bring about, and what he called achievements like noticing a mistake or dying, which are nonvolitional and are referred to as though they were instantaneous. (See Vendler 1967.) Some of the grammatical distinctions in these four categories are illustrated in the following examples, where know Arabic represents stative predicates, push a cart represents processes, build a house represents accomplishments, and notice a mistake represents achievements.

(30) a. *Mary is knowing Arabic/noticing a mistake.
        b. Mary is pushing a cart/building a house.

(31) a. Mary knew Arabic/pushed a cart for a year.
        b. *Mary built a house/noticed a mistake for a year.

(32) a. *Mary knew Arabic/pushed a cart within a day.
        b. Mary built a house/noticed a mistake within a day.
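The pattern in (30)-(32) can be summarized as a small feature table; the encoding below is my own shorthand for the diagnostics, not a proposal from the literature.

```python
# A sketch (my own encoding) of the three diagnostics illustrated in
# (30)-(32): does a predicate of each Vendler class accept the
# progressive, a 'for'-adverbial, and a 'within'-adverbial?

TESTS = {
    #                  progressive  for-PP  within-PP    example
    "state":          (False,       True,   False),    # know Arabic
    "process":        (True,        True,   False),    # push a cart
    "accomplishment": (True,        False,  True),     # build a house
    "achievement":    (False,       False,  True),     # notice a mistake
}

def telic(cls):
    # Telic classes are just those compatible with 'within a day'.
    return TESTS[cls][2]

print([c for c in TESTS if telic(c)])  # ['accomplishment', 'achievement']
```

Encoded this way, Aristotle's telic category is recovered from the 'within'-adverbial test alone, while the progressive separates Vendler's two telic subclasses.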

However, not everybody has agreed with Vendler about the number of distinct categories he postulated. In the formal semantics for Aktionsarten presented in Parsons 1990, there are just two operators: Cul ('culminate') to represent Vendler's accomplishments and achievements, and Hold, for sentences representing either states or processes. Bach 1986, on the other hand, subdivided eventualities into six different subcategories, including two different types of states (dynamic and static), in addition to processes and several kinds of telic eventualities. There are other complications too; Verkuyl has stressed the importance of the effect different types of NP arguments can have on the aspect of a sentence. (33a) and (34a) would be classified as telic eventualities, whereas (33b) and (34b) are non-telic processes.

(33) a. Mary painted a picture (*for a year).
        b. Mary painted pictures (for a year).

(34) a. A guest arrived (*for an hour).
        b. Guests arrived (for an hour).

Verkuyl 1993 proposes a formal semantics in which NPs as well as verbs are taken into account, and eventuality types are determined compositionally for the sentence as a whole.
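A drastically simplified sketch of the compositional idea (my own reduction of Verkuyl's proposal, with invented word lists): the sentence describes a telic eventuality only if the verb is dynamic and the relevant argument NP -- the object, or the sole argument of a verb like arrive -- denotes a specified, quantized amount.

```python
# Hedged toy version of compositionally determined aspect: telicity
# is a property of the verb-plus-NP combination, not of the verb
# alone. Word classifications below are invented for illustration.

DYNAMIC_VERBS = {"paint", "arrive", "build"}
QUANTIZED_NPS = {"a picture", "a guest", "a house"}   # specified quantity
BARE_PLURALS = {"pictures", "guests"}                 # unbounded quantity

def telic(verb, np):
    # Telic iff a dynamic verb combines with a quantized argument NP.
    return verb in DYNAMIC_VERBS and np in QUANTIZED_NPS

print(telic("paint", "a picture"))   # True:  (33a), '*for a year'
print(telic("paint", "pictures"))    # False: (33b), a non-telic process
print(telic("arrive", "a guest"))    # True:  (34a)
print(telic("arrive", "guests"))     # False: (34b)
```

The same verb thus yields a telic or non-telic sentence depending on its argument, which is the core observation behind (33) and (34).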

4. Summary, conclusions and prognostications.

4.1. Commonalities. All of the formal analyses described and summarized here have shared some common assumptions about the goals of semantics. One is that any proposed analysis of the semantic interpretation for a language, or a portion thereof, must be given in rigorous and explicit terms. Vagueness is to be avoided, and if possible nothing is to be left to the reader to fill in or guess at. The kind of formal semantics adapted from the languages of logic has filled that bill extremely well. This explains the frequent use of special symbols in formal semantics. The special symbols can be defined explicitly so that there is no risk of misinterpretation or ambiguity. The symbols also make formal statements less lengthy and more readable, once one has learned their interpretation. Although the heavy use of special symbols initially presents a somewhat formidable barrier to formal semantics, the gain in clarity ultimately makes climbing over this barrier well worthwhile.

Another common assumption was referred to above in the contradictory-sounding statement from David Lewis: 'Semantics without truth conditions is not semantics' (Lewis 1972, 169). Truth conditional semantics takes the core of meaning for a sentence to be given by some kind of explicit statement about what it would take for the sentence to be true. There are many arguments for this assumption. One is that it makes clear how language is related to the things in the outside world that it is used to talk about. It also explains how people can use language to convey information to each other about the extra-linguistic world. And finally there is the fundamental fact that if someone knows what a sentence means, then she knows what the world would have to be like for the sentence to be true -- i.e. the truth conditions of the sentence. Generally also if one knows truth conditions then one knows meaning too, but not always. Necessarily true sentences like the truths of mathematics all have the same truth conditions -- they are true under any circumstances, or in every possible world. Nevertheless these sentences don't all mean the same thing. Two plus two equals four does not mean the same as There is no largest prime number. So there is more to meaning than truth conditions, but formal semanticists agree that giving truth conditions is an essential core of describing meaning.

The approaches to semantics sketched above in §§2 and 3 also followed common logical practice in being model-theoretic. Model-theoretic semantics is a generalization of the truth conditional approach according to which truth conditions are given relative to a model. The semantics for a given language will specify what a model for the language must consist of -- what kinds of things it must have and how the language is to be related to them. Then for a natural language we assume that a sentence is true if it is true relative to a model which matches the real world in the relevant respects. (See Kalish 1967 for discussion and historical notes.)
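As a rough illustration (with an invented mini-model), a model can be represented as a domain plus an interpretation function, with truth for a simple subject-predicate sentence defined relative to it.

```python
# Minimal sketch of model-theoretic interpretation: a model pairs a
# domain of entities with an interpretation function F, and truth
# conditions are stated relative to the model. The model below is a
# toy example of my own devising.

model = {
    "domain": {"jones", "the_toast"},
    "interp": {                       # the interpretation function F
        "Jones": "jones",             # a name denotes an entity
        "snores": {"jones"},          # a predicate denotes a set
    },
}

def true_in(model, subject, predicate):
    # 'NP VP' is true in M iff F(NP) is a member of F(VP).
    F = model["interp"]
    return F[subject] in F[predicate]

print(true_in(model, "Jones", "snores"))   # True in this model
```

For a natural language the further assumption is, as stated above, that a sentence is true simpliciter if it is true relative to a model matching the real world in the relevant respects.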

4.2. Alternatives, formal & informal. Not all truth conditional semantics is model-theoretic. Donald Davidson has proposed a different style of semantics for natural language, also based on modern logic, which takes the task of semantics for a language to be devising a system that will generate specific statements of truth conditions called 'T-sentences'. T-sentences were introduced by Tarski (see Tarski 1944), but the 'T' stands for 'truth', not for 'Tarski'! A T-sentence for the English sentence Snow is white is given in (35).

(35) Snow is white is true if and only if snow is white.

(35) looks fairly vacuous, but part of that vacuous look is due to the fact that the object language -- the language whose semantics we are describing -- is the same as the metalanguage -- the language in which we are doing the semantics. When the object language is different from the metalanguage, the T-sentence looks more significant:

(36) Xuě shì bái de is true if and only if snow is white.

Tarski proposed, as a minimal condition on the adequacy of the semantic rules for a language, that they should allow the derivation (that is, the proof, in the logical sense) of the correct T-sentences for all the sentences of the object language. Larson & Segal 1995 have undertaken the task of working out the formal details of the T-sentence approach for a large fragment of English which includes generalized quantifiers, referentially opaque sentences, tense and aspect features, and many other interesting and challenging constructions. Their work is presented as a textbook for graduate students, but it is of great interest to professional linguists and philosophers of language as well.
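In the spirit of this program, the derivation of T-sentences for a toy two-word fragment can be sketched as follows; the lexical 'axioms' here are invented stand-ins for the formal machinery of a real T-sentence theory.

```python
# Hedged sketch: axioms give the reference of names and the
# satisfaction conditions of predicates, and the T-sentence for a
# whole 'NP VP' sentence is derived from them. Entries are invented;
# tone marks are omitted from the pinyin for simplicity.

REF = {"Snow": "snow", "Xue": "snow"}                      # name axioms
SAT = {"is white": "is white", "shi bai de": "is white"}   # predicate axioms

def t_sentence(name, pred):
    s = f"{name} {pred}"                   # the object-language sentence
    return f"'{s}' is true if and only if {REF[name]} {SAT[pred]}."

print(t_sentence("Snow", "is white"))
print(t_sentence("Xue", "shi bai de"))
```

The first call yields the homophonic case (35), where object language and metalanguage coincide; the second yields the more obviously informative cross-language case (36).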

There are few alternatives to the approaches falling under the heading of formal semantics, and none that offer the same comprehensiveness. Probably the best known is the approach of Jackendoff -- see Jackendoff 1990, 1997. Jackendoff's specialty is lexical semantics, about which formal semanticists have had the least to say, and his work, which offers many insights into the nature of word meaning, deserves careful attention. Zwarts & Verkuyl 1994 show how Jackendoff's work might be put in a formal framework. Fauconnier 1994, 1997 has put forward an approach invoking what he calls 'mental spaces', which are similar in some respects to possible worlds or situations but are intended to be (representations of) human thought. This work focuses in particular on unusual cases of reference such as those illustrated in (37).

(37) a. Bill [meaning Bill's car] is parked on the next street.
        b. If I were you I would treat myself/me with a little more respect.

Fauconnier's work tends to be less carefully worked out than Jackendoff's, and neither approach reaches the level of explicit comprehensiveness of the formal theories we have been looking at.

4.3. Future prospects. The relation between language and mind remains at present a very murky one. Chomsky's mentalistic revolution in linguistics put the study of language, at least theoretically and according to Chomsky, under the umbrella of psychology. However, in practice developments in linguistics and findings of psycholinguists have not always fit together comfortably, and I regret to have to report that in such cases linguists have sometimes seemed to turn their backs on the psycholinguists. Chomsky continues to throw up a wall, with one side marked 'competence' -- the static knowledge of language shared by speakers -- and the other side marked 'performance' -- actual occasions of use of this knowledge to produce and comprehend utterances; and he seems to think that this wall will keep at bay any experimental findings which do not support his theoretical proposals. On the other hand, there are some meager indications that contrary evidence can eventually penetrate. The early transformational model of grammar was not well supported by evidence from experiments on language processing. Although this evidence seemed to be ignored for many years, Chomsky has gradually replaced the transformational model with another, and it is possible that the psycholinguistic evidence played a role in this replacement.

Another issue is the relation of semantics to the rest of the grammar, on the one hand, and to the rest of cognition on the other. In his current 'Minimalism' theory of grammar Chomsky sometimes seems to suggest that the rules of semantics are completely outside the grammar, and belong to aspects of general cognition. On other occasions, though, Chomsky uses examples of word meaning to argue for the highly specialized nature of linguistic competence and for its innateness. (See Chomsky 1995a,b.) This ambivalence is matched by other long standing controversies -- the controversy over whether word meanings are specialized and distinguished from general knowledge about the world (the 'dictionary' view) or whether they are holistic and global, and encompass everything related to extensions (the 'encyclopedia' view), as well as the many disputes about whether certain aspects of sentence meaning (in a broad sense of 'meaning') belong to semantics or pragmatics -- aspects such as presupposition, conversational implicature, and illocutionary force.

I predict that these issues will be resolved within the next fifty years, and that findings from the rest of the new field of cognitive science -- especially the subfields of psycholinguistics and language processing, neurolinguistics, and computational linguistics -- will be helpful in this resolution. I also believe that the evidence will ultimately indicate that the framework for semantic interpretation will share the unique nature that the rest of language seems to have, and that there will be a distinction between the linguistic lexicon and the encyclopedia of world knowledge. I base this projection in part on the fact that the principles of meaning, whether at the word level or at the sentence level, seem to be as elusive and inaccessible to conscious reflection as the principles of syntactic structure or phonological interpretation, and in part on my belief that Grice and Montague were right, that the logical approach to language, in which semantic and syntactic representations mirror each other, is justified for natural as well as formalized languages.[10]
 
 

ENDNOTES

[1] According to Carnap

A theory, a rule, a definition, or the like is to be called formal when no reference is made in it either to the meaning of the symbols (for example, the words) or to the sense of the expressions (e.g. the sentences), but simply and solely to the kinds and order of the symbols from which the expressions are constructed. (Carnap 1937, 1.)

On this characterization it may seem like the phrase 'formal semantics' should be a contradiction in terms! At the time Carnap and others believed that relations of meaning among sentences, such as entailment and contradiction, could and should be given an account in purely formal, that is syntactic, terms. However with the development by Tarski and others of rigorous methods of semantic interpretation, on the one hand, and the proof by Gödel of the nonequivalence of syntactic and semantic notions of logical consequence, on the other, 'logic' has come to encompass both syntax and semantics, and 'formal' has come to mean something like 'rigorous and explicit; modeled on methods in logic'. In some ways the term is a counterpart to Chomsky's term 'generative'.  [return]

[2] One of these developments was the discovery by Fillmore of the cyclic principle of transformational rule application (Fillmore 1963), which allowed the abandonment of generalized transformations and the incorporation of recursive rules in the phrase structure component. The other was the development by Katz and Postal of arguments for deep structure triggers of otherwise meaning changing transformations -- most notably the negation and question rules (Katz & Postal 1964).   [return]

[3] The earliest of these papers were published in 1970, but reference notes make clear that the ideas were already beginning to be presented in lectures as early as January and February of 1966 (cf. Montague 1970a, 188).   [return]

[4] The reader may have guessed from this quotation that the phrase 'universal grammar' had quite a different meaning for Montague than it has for Chomsky. While for Chomsky 'universal grammar' denotes the innate human language faculty, for Montague that phrase denoted the mathematical study of syntax and semantics. Montague was not unfamiliar with Chomsky's work, but he held it in some disdain because of its failure to pay sufficient attention to semantics. Cf. e.g. Montague 1973, 247.   [return]

[5] Frege's solution to the problem of referential opacity, as formalized by Montague and others, has not proved to be completely successful. One difficulty is posed by the fact that proper names do not seem to have a sense, the way descriptive expressions like Oedipus's mother do. (Kripke 1972 argued this at length and quite convincingly.) Nevertheless proper names cannot be substituted for each other in referentially opaque contexts, as shown by the fact that (ia) can be true and (ic) false, despite the truth of (ib):

(i) a. Ruth knows that Mark Twain wrote Tom Sawyer.
    b. Mark Twain was Samuel Clemens.
    c. Ruth knows that Samuel Clemens wrote Tom Sawyer.

There is a huge literature on this topic, which stretches back at least to the middle ages and continues to the present day. See Linsky 1971 for some of the standard 'classical' references on this topic, including Quine 1953 and Quine 1956, and Anderson & Owens 1990 and Künne, Newen & Anduschus 1997 for some recent papers.   [return]

[6] Although generalized quantifiers had been discovered prior to Montague's work (see Barwise & Cooper 1981, 159), he was apparently unaware of this and did not use the term 'generalized quantifier' in his own papers. Also, because of the intensionality in his approach, rather than interpreting NPs as sets of sets he actually interpreted them as properties of properties, but I am ignoring that complication for this presentation.   [return]

[7] Despite the appealing nature of Ladusaw's hypothesis about negative polarity items there are problems with this explanation. See Israel 1996 for a review of much of the literature on polarity, and an alternative hypothesis.   [return]

[8] Geach 1962 was the first to draw the attention of modern philosophers and linguists to the problems presented by such sentences, though he cited a medieval literature on the subject.   [return]

[9] Of course the real logical form for (22) would have the NPs unpacked in familiar quantificational ways, which have been omitted here for clarity's sake.   [return]

[10] I would like to thank Aldo Antonelli, Jianguo Chen, Yen-Hwei Lin, Dick Stanley, and Luding Tong for help in connection with this paper.   [return]
 

REFERENCES

Anderson, C. Anthony & Joseph Owens, eds. 1990. Propositional Attitudes: The Role of Content in Logic, Language, and Mind. Stanford, CA: CSLI.

Bach, Emmon. 1981. On time, tense, and aspect: An essay in English metaphysics. In Peter Cole, ed., Radical Pragmatics, New York: Academic Press, 63-81.

Bach, Emmon. 1986. The algebra of events. Linguistics and Philosophy 9:1, 5-16.

Bach, Emmon, Eloise Jelinek, Angelika Kratzer & Barbara Partee, eds. 1995. Quantification in Natural Languages. Dordrecht: Kluwer.

Bar-Hillel, Yehoshua. 1954. Logical syntax and semantics. Language 30:2, 230-237.

Barwise, Jon & Robin Cooper. 1981. Generalized quantifiers in natural language. Linguistics and Philosophy 4:2, 159-220.

Barwise, Jon & John Perry. 1983. Situations and Attitudes. Cambridge, MA: MIT Press.

Bloomfield, Leonard. 1933. Language. New York: Henry Holt.

Carlson, Greg N. 1977. A unified analysis of the English bare plural. Linguistics and Philosophy 1:413-456.

Carlson, Greg N. 1980. Reference to Kinds in English. New York: Garland Press.

Carlson, Greg N. & Francis Jeffry Pelletier, eds. 1995. The Generic Book. Chicago: University of Chicago Press.

Carnap, Rudolf. 1937. The Logical Syntax of Language. New York: Harcourt Brace.

Chomsky, Noam. 1955. Logical syntax and semantics: Their linguistic relevance, Language 31:1, 36-45.

Chomsky, Noam. 1957. Syntactic Structures. The Hague: Mouton.

Chomsky, Noam. 1995a. Language and nature. Mind 104, 1-61.

Chomsky, Noam. 1995b. The Minimalist Program. Cambridge, MA: MIT Press.

Davidson, Donald. 1967. The logical form of action sentences. In Nicholas Rescher, ed., The Logic of Decision and Action, Pittsburgh, PA: The University of Pittsburgh Press, 81-95.

Davidson, Donald & Gilbert Harman, eds. 1972. Semantics of Natural Language. Dordrecht: D. Reidel.

Dowty, David R., Robert E. Wall & Stanley Peters, eds. 1981. Introduction to Montague Semantics. Dordrecht: D. Reidel.

Fauconnier, Gilles. 1994. Mental Spaces: Aspects of Meaning Construction in Natural Language. Cambridge: Cambridge University Press.

Fauconnier, Gilles. 1997. Mappings in Thought and Language. Cambridge: Cambridge University Press.

Fillmore, Charles J. 1963. The position of embedding transformations in a grammar. Word 19, 208-231.

Frege, Gottlob. 1892. Über Sinn und Bedeutung. Zeitschrift für Philosophie und Philosophische Kritik 100, 25-50. Reprinted as 'On sense and reference' in Peter Geach & Max Black, eds., Translations from the Philosophical Writings of Gottlob Frege, 1970, Oxford: Basil Blackwell, 56-78.

Gärdenfors, Peter, ed. 1987. Generalized Quantifiers: Linguistic and Logical Approaches. Dordrecht: D. Reidel.

Geach, Peter Thomas. 1962. Reference and Generality. Ithaca, NY: Cornell University Press.

Groenendijk, Jeroen & Martin Stokhof. 1991. Dynamic predicate logic. Linguistics and Philosophy 14:1, 39-100.

Groenendijk, Jeroen, Martin Stokhof & Frank Veltman. 1996. Coreference and modality. In Shalom Lappin, ed., 179-214.

Heim, Irene. 1982. The Semantics of Definite and Indefinite Noun Phrases. Amherst, MA: University of Massachusetts doctoral dissertation.

Heim, Irene. 1983. File change semantics and the familiarity theory of definiteness. In Rainer Bäuerle, Christoph Schwarze, & Arnim von Stechow, eds., Meaning, Use, and Interpretation of Language. Berlin: de Gruyter, 164-189.

Higginbotham, James. 1985. On semantics. Linguistic Inquiry 16, 547-593.

Israel, Michael. 1996. Polarity sensitivity as lexical semantics. Linguistics and Philosophy 19:6, 619-666.

Jackendoff, Ray. 1990. Semantic Structures. Cambridge, MA: MIT Press.

Jackendoff, Ray. 1997. The Architecture of the Language Faculty. Cambridge, MA: MIT Press.

Kalish, Donald. 1967. Semantics. In Paul Edwards, ed., The Encyclopedia of Philosophy, volume 7, 348-358.

Kamp, Hans. 1984. A theory of truth and semantic representation. In Jeroen Groenendijk, Theo M.V. Janssen, & Martin Stokhof, eds., Truth, Interpretation and Information. Dordrecht: Foris Publications, 1-42.

Katz, Jerrold J. & Paul M. Postal. 1964. An Integrated Theory of Linguistic Descriptions. Cambridge, MA: MIT Press.

Kratzer, Angelika. 1996. Stage-level and individual-level predicates. In Carlson & Pelletier, eds., 125-175.

Kripke, Saul A. 1972. Naming and necessity. In Davidson & Harman, 253-355. Reissued separately in 1980 by Harvard University Press.

Künne, Wolfgang, Albert Newen & Martin Anduschus. 1997. Direct Reference, Indexicality, and Propositional Attitudes. Stanford, CA: CSLI.

Ladusaw, William A. 1983. Logical form and conditions on grammaticality. Linguistics and Philosophy 6:3, 373-392.

Lappin, Shalom, ed. 1996. The Handbook of Contemporary Semantic Theory. Oxford: Blackwell.

Larson, Richard & Gabriel Segal. 1995. Knowledge of Meaning: An Introduction to Semantic Theory. Cambridge, MA: MIT Press.

Lees, Robert B. 1957. Review of Syntactic Structures, by Noam Chomsky. Language 33:3, 375-408.

Lewis, David. 1972. General semantics. In Davidson & Harman, eds., 169-218. Reprinted in Partee 1976, 1-50.

Lewis, David. 1975. Adverbs of quantification. In Edward L. Keenan, ed., Formal Semantics of Natural Language, Cambridge: Cambridge University Press, 3-15.

Linsky, Leonard, ed. 1971. Reference and Modality. Oxford: Oxford University Press.

Montague, Richard. 1970a. English as a formal language. In Bruno Visentini et al., Linguaggi nella Società e nella Tecnica, Milan: Edizioni di Comunità, 189-224. Reprinted in Montague 1974, 188-221.

Montague, Richard. 1970b. Universal grammar. Theoria 36, 373-398. Reprinted in Montague 1974, 222-246.

Montague, Richard. 1973. The proper treatment of quantification in ordinary English. In J. Hintikka, J. Moravcsik, & P. Suppes, eds., Approaches to Natural Language: Proceedings of the 1970 Stanford Workshop on Grammar and Semantics, Dordrecht: D. Reidel, 221-242. Reprinted in Montague 1974, 247-270.

Montague, Richard. 1974. Formal Philosophy: Selected Papers of Richard Montague. Edited and with an introduction by Richmond H. Thomason. New Haven: Yale University Press.

Parsons, Terence. 1990. Events in the semantics of English. Cambridge, MA: MIT Press.

Partee, Barbara. 1975. Montague Grammar and Transformational Grammar. Linguistic Inquiry 6:2, 203-300.

Partee, Barbara, ed. 1976. Montague Grammar. New York: Academic Press.

Partee, Barbara. 1986. Noun phrase interpretation and type-shifting principles. In Jeroen Groenendijk, Dick de Jongh, & Martin Stokhof, eds., Studies in Discourse Representation Theory and the Theory of Generalized Quantifiers, Dordrecht: Foris Publications, 115-144.

Quine, W.V. 1953. Reference and modality. In From a Logical Point of View, Cambridge MA: Harvard University Press, 139-159.

Quine, W.V. 1956. Quantifiers and propositional attitudes. Journal of Philosophy 53. Reprinted in W.V. Quine, The Ways of Paradox, 1966, New York: Random House, 183-194.

Quine, W.V. 1960. Word and Object. Cambridge, MA: MIT Press.

Quine, W. V. 1986. Autobiography of W.V. Quine. In Lewis Edwin Hahn & Paul Arthur Schilpp, eds., The Philosophy of W.V. Quine, La Salle, IL: Open Court, 3-48.

Stalnaker, Robert C. 1974. Pragmatic Presupposition. In Milton K. Munitz & Peter K. Unger, eds., Semantics and Philosophy, New York: New York University Press, 197-213.

Stalnaker, Robert C. 1978. Assertion. In Peter Cole, ed., Syntax and Semantics, Volume 9: Pragmatics, New York: Academic Press, 315-322.

Tarski, Alfred. 1944. The semantic conception of truth. Philosophy and Phenomenological Research 4, 341-375.

Vendler, Zeno. 1967. Linguistics in Philosophy. Ithaca, NY: Cornell University Press.

Verkuyl, Henk J. 1993. A Theory of Aspectuality. Cambridge: Cambridge University Press.

Zwarts, Joost & Henk Verkuyl. 1994. An algebra of conceptual structure: An investigation into Jackendoff's conceptual semantics. Linguistics and Philosophy 17, 1-28.