Meanings as Instructions for how to Build Concepts
Paul M. Pietroski
University of Maryland, Dept. of Linguistics, Dept. of Philosophy
http://www.terpconnect.umd.edu/~pietro

In our last episode…
Humans acquire words, concepts, and grammars.
What are words, concepts, and grammars? How are they related?
How are they related to whatever makes humans distinctive?
Did a relatively small change in our ancestors lead to both the "linguistic metamorphosis" that human infants undergo and significant cognitive differences between us and other primates?
Maybe… we're cognitively special because we're linguistically special, and we're linguistically special because we acquire words.
(After all, kids are really good at acquiring words.)

Two Pictures of Lexicalization
[Diagram: two pictures. In the first, a perceptible signal is paired with a concept of adicity n (the initial concept), yielding a word of adicity n that simply labels that concept, plus further lexical information. In the second, the initial concept of adicity n is used to introduce a concept of adicity k, and the word that gets paired with the signal has adicity k.]

Puzzles for the Idea that Words Simply Label Concepts
• Apparent mismatches between how words combine (grammatical form) and how concepts combine (logical form):
  KICK(x1, x2)           The baby kicked
  RIDE(x1, x2)           Can you give me a ride?
  BETWEEN(x1, x2, x3)    I am between him and her
  BIGGER(x1, x2)         That is bigger than that
  FATHER(…?...)          Fathers father
  MORTAL(…?...)          Socrates is mortal / A mortal wound is fatal

Lexicalization as Monadic-Concept-Abstraction
KICK(x1, x2) ==> KICK(event)
[Diagram: the concept of adicity n (before) is used to introduce a monadic concept; the word paired with the perceptible signal has adicity -1 and labels the concept of adicity -1.]

Language Acquisition Device in its Initial State
[Diagram: articulation and perception of signals; experience and growth take the device from its initial state to a mature state that pairs PHONs with SEMs.]

Language Acquisition Device in a Mature State (an I-Language)
[Diagram: GRAMMAR and LEXICON; SEMs interface with introduced concepts, which derive from initial concepts and other acquired concepts.]
What kinds of concepts do SEMs interface with?
Idea (to be explained and defended)
• In acquiring words, we use available concepts to introduce new ones:
  'ride' + RIDE(x1, x2) ==> RIDE(_) + 'ride' + RIDE(x1, x2)
• Words are then used to fetch the introduced concepts:
  when you hear the word 'ride'… fetch the concept RIDE(_)
• The new concepts can be systematically conjoined:
  'ride fast'          RIDE(_) & FAST(_)
  'ride horses'        RIDE(_) & [THEME(_, _) & HORSES(_)]
  'ride horses fast'   RIDE(_) & [THEME(_, _) & HORSES(_)] & FAST(_)
  'ride fast horses'   RIDE(_) & [THEME(_, _) & FAST(_) & HORSES(_)]

Yeah, yeah. But…
• how could infants use (largely innate) nonmonadic concepts to introduce monadic concepts?
• is there evidence that they do so?
• what about polysemy?
• what kind of quantifier is that red thing in RIDE(_) & [THEME(_, _) & HORSES(_)]?
• and what kind of conjunction does that blue ampersand indicate?

A Possible Mind
KICK(x1, x2)                     a prelexical concept
KICK(x1, x2) ≡df for some _, KICK(_, x1, x2)
KICK(_, x1, x2) ≡df AGENT(_, x1) & KICK(_) & PATIENT(_, x2)
  (AGENT and PATIENT are generic "action" concepts)
CAESAR, PF:'Caesar'              mental labels for a person and a sound
Called(CAESAR, PF:'Caesar')      a thought about what the person is called
CAESARED(_) ≡df Called(_, PF:'Caesar')
So:
  KICK(_) is introduced via: KICK(_, x1, x2) ≡df AGENT(_, x1) & KICK(_) & PATIENT(_, x2)
  KICK(_, x1, x2) is introduced via: KICK(x1, x2) ≡df for some _, KICK(_, x1, x2)

A Possible Mind
'kick' fetches KICK(_)
'Caesar' fetches CAESARED(_)
'that Caesar' directs construction of the complex concept
  CONTEXTUALLY-INDICATED(_) & CAESARED(_)
'kick that Caesar' directs construction of the complex concept
  KICK(_) & [PATIENT(_, _) & CONTEXTUALLY-INDICATED(_) & CAESARED(_)]
The bracket binds the variable of each monadic concept in its scope, and PATIENT(_, _) links its second variable to the local monadic concepts.
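The definitions in "A Possible Mind" can be given a toy computational reading. The sketch below is my own illustration (events as Python dicts, function names my own, nothing here comes from the talk itself): the dyadic KICK(x1, x2) is recovered from the introduced monadic KICK(_) plus the generic thematic concepts by existentially quantifying over events.

```python
# Toy model of the introduced concepts (my own illustration; the event
# representation and all function names are assumptions, not from the talk).
# Events are dicts; AGENT, PATIENT, and KICK(_) are predicates over them.

def AGENT(e, x):
    # generic "action" concept: x is the agent of event e
    return e.get("agent") == x

def PATIENT(e, x):
    # generic "action" concept: x is the patient of event e
    return e.get("patient") == x

def KICK_monadic(e):
    # the introduced monadic concept KICK(_)
    return e.get("type") == "kick"

def KICK_triadic(e, x1, x2):
    # KICK(_, x1, x2) =df AGENT(_, x1) & KICK(_) & PATIENT(_, x2)
    return AGENT(e, x1) and KICK_monadic(e) and PATIENT(e, x2)

def KICK_dyadic(x1, x2, domain):
    # KICK(x1, x2) =df for some _, KICK(_, x1, x2)
    return any(KICK_triadic(e, x1, x2) for e in domain)

events = [{"type": "kick", "agent": "Brutus", "patient": "Caesar"}]
print(KICK_dyadic("Brutus", "Caesar", events))  # True
print(KICK_dyadic("Caesar", "Brutus", events))  # False
```

The point of the sketch is only that the definitional chain runs in both directions: given the monadic concept and the thematic concepts, the original adicity-2 concept is recoverable, so nothing is lost by lexicalizing the monadic one.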
A Relevant Empirical Consideration
Not even English provides good evidence for lexical nouns that simply label singular (saturating) concepts like CAESAR, TYLER, or BURGE:
  Every Tyler I saw at the party was a philosopher
  Every philosopher I saw was a Tyler
  There were three Tylers at the party
  That Tyler stayed late, and so did this one
  Philosophers have wheels, and Tylers have stripes
  The Tylers are coming to dinner
  At noon, we saw Tyler Burge
  At noon, we saw Professor Burge
  At noon, we saw Professor Tyler Burge

A Relevant Empirical Consideration
Not even English provides good evidence for lexical verbs that simply label polyadic (unsaturated) concepts like KICK(x1, x2) or EAT(x1, x2):
  The baby kicked
  The ball was kicked
  I kicked the dog a bone
  I get no kick from Champagne, but I get a kick out of you
  I ate very well last night. We dined at a nice restaurant.
  The fish was selected, cooked, and then eaten.
  Compare: I fueled the car. I fueled up.

Two Roles for Words on this View
(1) In lexicalization… acquiring a (spoken) word is a process of pairing a sound with a concept (the concept lexicalized), storing that sound/concept pair in memory, and then using that concept to introduce a concept that can be combined with others via certain (limited) composition operations:
  sound-of-'kick'/KICK(x1, x2) ==> sound-of-'kick'/KICK(x1, x2)/KICK(_)
At least for "open class" lexical items (nouns, verbs, adjectives/adverbs), the introduced concepts are monadic and conjoinable with others.
(2) In subsequent comprehension… a word is an instruction to fetch an introduced concept from the relevant address in memory.

Caveat: Polysemy
1st approximation: 'book' fetches BOOK(_)
2nd approximation: 'book' fetches one of -abstractBOOK(_), +abstractBOOK(_)

A Possible Course of Lexicalization
sound-of-'book'/-abstractBOOK (adicity of initial concept not obvious)
sound-of-'book'/-abstractBOOK/-abstractBOOK(_)
sound-of-'book'/-abstractBOOK/-abstractBOOK(_) |
+abstractBOOK/+abstractBOOK(_)

Caveat: Polysemy
1st approximation: 'book' fetches BOOK(_)
2nd approximation: 'book' fetches one of -abstractBOOK(_), +abstractBOOK(_)
But we also have to think about… 'coloring book', 'blank book', 'book a cruise', 'book a criminal', …
3rd approximation: 'book' fetches one of -abstractBOOK1(_), +abstractBOOK1(_), -abstractBOOK2(_), +abstractBOOK2(_), …

Polysemy via Austin/Chomsky
SEM('hexagonal') = fetch@'hexagonal'
SEM('republic') = fetch@'republic'
SEM('hexagonal republic') = CONJOIN[fetch@'hexagonal', fetch@'republic']
  ==> HEXAGONAL(_) & REPUBLIC(_)
Her country is hexagonal/mountainous/nearby
Her country is a republic/politically stable/wealthy
Two or more Introduced-CONCEPTS may reside at the 'country' bin.

A Slightly More Interesting Example
Two or more I(ntroduced)-CONCEPTS may reside at 'country':
fetch@'country' ==> TERRA-COUNTRY(_) or POLIS-COUNTRY(_)
CONJOIN[fetch@'country', fetch@'hexagonal'] ==>
  TERRA-COUNTRY(_) & HEXAGONAL(_) or POLIS-COUNTRY(_) & HEXAGONAL(_)
CONJOIN[fetch@'country', fetch@'republic'] ==>
  TERRA-COUNTRY(_) & REPUBLIC(_) or POLIS-COUNTRY(_) & REPUBLIC(_)

Caveat: Subcategorization
Not saying that a verb meaning is merely an instruction to fetch a (tense-friendly) monadic concept of things that can have participants. Distinguish:
Semantic Composition Adicity Number (SCAN):
  (instructions to fetch) singular concepts   +1 singular   <e>
  (instructions to fetch) monadic concepts    -1 monadic    <e, t>
  (instructions to fetch) dyadic concepts     -2 dyadic     <e, <e, t>>
  …
Property of Smallest Sentential Entourage (POSSE):
  zero (indexable) terms, one term, two terms, …

Caveats
POSSE facts may reflect, among other things (e.g., statistical experience), the adicities of concepts lexicalized; the verb 'put' may have a (lexically represented) POSSE of three in part because the concept lexicalized is PUT(x, y, z).
Though note: speakers of English still say 'I put the cup ON THE table', not 'I put the cup the table'.
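The fetch-and-conjoin treatment of polysemy can be made concrete with a small sketch. This is my own illustration (the names LEXICON, fetch, and conjoin are assumptions, not the talk's notation): a lexical address, or "bin", holds a list of introduced monadic concepts, fetching returns them all, and CONJOIN yields one conjunctive concept per way of resolving each fetch.

```python
from itertools import product

# Toy lexicon (my own sketch; all names are assumptions): a lexical
# address may hold several introduced monadic concepts, represented
# here simply by their printed names.
LEXICON = {
    "country":   ["TERRA-COUNTRY(_)", "POLIS-COUNTRY(_)"],
    "hexagonal": ["HEXAGONAL(_)"],
    "republic":  ["REPUBLIC(_)"],
}

def fetch(address):
    # fetch@address: every concept residing at that lexical address
    return LEXICON[address]

def conjoin(*concept_sets):
    # CONJOIN: one conjunctive concept per way of resolving each fetch
    return [" & ".join(combo) for combo in product(*concept_sets)]

print(conjoin(fetch("country"), fetch("hexagonal")))
# ['TERRA-COUNTRY(_) & HEXAGONAL(_)', 'POLIS-COUNTRY(_) & HEXAGONAL(_)']
```

On this encoding, meaning constancy falls out naturally: the instruction CONJOIN[fetch@'country', fetch@'hexagonal'] is one and the same instruction whether its execution resolves 'country' to TERRA-COUNTRY(_) or POLIS-COUNTRY(_).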
So there is no reason to conclude that 'put' simply labels PUT(x, y, z).

Two Kinds of Facts to Accommodate
Flexibilities:
  Brutus kicked Caesar
  Caesar was kicked
  The baby kicked
  I get a kick out of you
  Brutus kicked Caesar the ball
Inflexibilities:
  Brutus put the ball on the table
  *Brutus put the ball
  *Brutus put on the table

Two Pictures of Lexicalization
[Diagram, revisited: on the first picture, the word has adicity (SCAN) n and labels the concept of adicity n, so further "flexibility" facts (as for 'kick') must be accommodated. On the second, the concept of adicity n is used to introduce a concept of adicity -1, which the word labels, so further "posse" facts (as for 'put') must be accommodated.]

"Negative" Facts to Accommodate
Striking absence of certain (open-class) lexical meanings that would be permitted if I-Languages permit nonmonadic semantic types.

*Brutus sald a car Caesar a dollar
  sald: SOLD(x, w, z, y), i.e., x sold y to z (in exchange) for w
  [sald [a car]]                        SOLD(x, w, z, a car)
  [[sald [a car]] Caesar]               SOLD(x, w, Caesar, a car)
  [[[sald [a car]] Caesar] a dollar]    SOLD(x, $, Caesar, a car)
_________________________________________________
*Brutus tweens Caesar Antony
  tweens: BETWEEN(x, z, y)
  [tweens Caesar]              BETWEEN(x, z, Caesar)
  [[tweens Caesar] Antony]     BETWEEN(x, Antony, Caesar)

"Negative" Facts to Accommodate
*Alexander jimmed the lock a knife
  jimmed: JIMMIED(x, z, y)
  [jimmed [the lock]]              JIMMIED(x, z, the lock)
  [[jimmed [the lock]] [a knife]]  JIMMIED(x, a knife, the lock)
_________________________________________________
*Brutus froms Rome
  froms: COMES-FROM(x, y)
  [froms Rome]    COMES-FROM(x, Rome)

"Negative" Facts to Accommodate
*Brutus talls Caesar
  talls: IS-TALLER-THAN(x, y)
  [talls Caesar]    IS-TALLER-THAN(x, Caesar)
_________________________________________________
*Julius Caesar
  Julius: JULIUS    Caesar: CAESAR

Empirical Point…
There is little to no evidence of any lexical items ever fetching supradyadic concepts:
  Brutus gave Caesar the ball
  Brutus kicked Caesar the ball
  Brutus gave/kicked the ball to Caesar
Various (e.g., Larson-style) analyses
of ditransitive constructions, without ditransitive verbs.

But…
If the basic mode of semantic composition is conjunction of monadic concepts, then we can start to explain the absence of lexical meanings like…
  SOLD(x, w, z, y)
  BETWEEN(x, z, y)
  JIMMIED(x, z, y)
  COMES-FROM(x, y)
  IS-TALLER-THAN(x, y)
  TYLER

Language Acquisition Device in a Mature State (an I-Language)
[Diagram, repeated: GRAMMAR and LEXICON; SEMs interface with introduced concepts, which derive from initial concepts and other acquired concepts.]
What kinds of concepts do SEMs interface with?

Idea
• In acquiring words, we use available concepts to introduce new ones:
  'ride' + RIDE(x1, x2) ==> RIDE(_) + 'ride' + RIDE(x1, x2)
• Words are then used to fetch the introduced concepts:
  when you hear the word 'ride'… fetch the concept RIDE(_)
• The new concepts can be systematically conjoined:
  'ride fast'          RIDE(_) & FAST(_)
  'ride horses'        RIDE(_) & [THEME(_, _) & HORSES(_)]
  'ride horses fast'   RIDE(_) & [THEME(_, _) & HORSES(_)] & FAST(_)
  'ride fast horses'   RIDE(_) & [THEME(_, _) & FAST(_) & HORSES(_)]

Meanings as Instructions for how to Build (Conjunctive) Concepts
The meaning (SEM) of [rideV fastA]V is the following instruction:
  CONJOIN[execute:SEM('ride'), execute:SEM('fast')]
  = CONJOIN[fetch@'ride', fetch@'fast']
Executing this instruction yields a concept like RIDE(_) & FAST(_).
But the meaning (SEM) of [rideV horsesN]V is NOT the following instruction:
  CONJOIN[execute:SEM('ride'), execute:SEM('horses')]
  = CONJOIN[fetch@'ride', fetch@'horses']
Executing this instruction would yield a concept like RIDE(_) & HORSES(_).

Meanings as Instructions for how to Build (Conjunctive) Concepts
The meaning (SEM) of [rideV fastA]V is the following instruction:
  CONJOIN[fetch@'ride', fetch@'fast']
Executing this instruction yields a concept like RIDE(_) & FAST(_).
The meaning (SEM) of [rideV horsesN]V is the following instruction:
  CONJOIN[fetch@'ride', DirectObject:SEM('horses')]
  = CONJOIN[fetch@'ride', Thematize-execute:SEM('horses')]
Executing this instruction would yield a concept like
  RIDE(_) & [THEME(_, _) & HORSES(_)]
or, more fully, RIDE(_) & [THEME(_, _) & HORSE(_) & PLURAL(_)].

Meanings as Instructions for how to Build (Conjunctive) Concepts
The meaning of [[rideV horsesN]V fastA]V is the following instruction:
  CONJOIN[execute:SEM([rideV horsesN]V), execute:SEM(fastA)]
Executing this instruction yields a concept like
  RIDE(_) & [THEME(_, _) & HORSES(_)] & FAST(_)
The meaning of [rideV [fastA horsesN]N]V is the following instruction:
  CONJOIN[fetch@'ride', DirectObject:SEM([fastA horsesN]N)]
Executing this instruction yields a concept like
  RIDE(_) & [THEME(_, _) & FAST(_) & HORSES(_)]

Meanings as Instructions for how to Build (Conjunctive) Concepts
On this view, meanings are neither extensions nor concepts.
Familiar difficulties for the idea that lexical meanings are concepts:
  polysemy            1 meaning, 1 cluster of concepts (in 1 mind)
  intersubjectivity   1 meaning, 2 concepts (in 2 minds)
  jabber(wocky)       1 meaning, 0 concepts (in 1 mind)
But a single instruction to fetch a concept from a certain address can be associated with more (or less) than one concept: meaning constancy, at least for purposes of meaning composition.
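The instruction-executing picture sketched across the last few slides can itself be modeled as a tiny interpreter. This is my own encoding (the tuple format and the "direct-object" flag are assumptions, not the talk's notation): a SEM is either a fetch instruction or a CONJOIN of two sub-instructions, and direct objects get thematized with THEME(_, _) before conjoining.

```python
# Sketch of SEMs as executable instructions (my own encoding; the tuple
# format and the "direct-object" flag are assumptions). An instruction
# is ("fetch", word) or ("conjoin", left, right, mode); mode
# "direct-object" thematizes the right conjunct with THEME(_, _).

CONCEPTS = {"ride": "RIDE(_)", "fast": "FAST(_)", "horses": "HORSES(_)"}

def execute(instr):
    if instr[0] == "fetch":
        return CONCEPTS[instr[1]]          # fetch@word
    _, left, right, mode = instr
    r = execute(right)
    if mode == "direct-object":            # Thematize: bracket with THEME
        r = f"[THEME(_, _) & {r}]"
    return f"{execute(left)} & {r}"

# [[ride horses] fast]: conjoin the thematized object first, then 'fast'
ride_horses = ("conjoin", ("fetch", "ride"), ("fetch", "horses"), "direct-object")
ride_horses_fast = ("conjoin", ride_horses, ("fetch", "fast"), "plain")
print(execute(ride_horses_fast))
# RIDE(_) & [THEME(_, _) & HORSES(_)] & FAST(_)

# [ride [fast horses]]: 'fast' conjoins inside the thematized object
ride_fast_horses = ("conjoin", ("fetch", "ride"),
                    ("conjoin", ("fetch", "fast"), ("fetch", "horses"), "plain"),
                    "direct-object")
print(execute(ride_fast_horses))
# RIDE(_) & [THEME(_, _) & FAST(_) & HORSES(_)]
```

The two outputs reproduce the contrast between 'ride horses fast' and 'ride fast horses': the same three fetched concepts, but different bracketings of the instructions yield different placements of FAST(_) relative to the THEME bracket.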