
... (y-axis) is called dominance: in this dimension the dominance relationships between the nodes are represented by angled, solid dependency edges. This type of tree representation is most closely associated with the work of Hays (1964). Tree representations in other dependency theories may look quite ...
Dependency in Linguistic Description
... The Deep-Syntactic Structure of a sentence is a tree whose nodes are labeled with the full lexemes of the sentence—such that there is a one-to-one correspondence between DSyntnodes and full lexemes; the arcs of this tree, called branches, are labeled with names of abstract universal Deep-Syntactic R ...
Morphology and cross dependencies in the synthesis of
... singular can be pronominalized as gli if the previous occurrence of the token is masculine, as le if feminine (see Table 1). However, if there is a [dir-object] synthesized as the Ppv =: lo, the pronouns gli or le amalgamate with this Ppv and both become glie-: Diedi il libro a Maria --> Le diedi i ...
Performance Grammar: a Declarative Definition
... is a two–stage process. First, an unordered hierarchical structure (‘mobile’) is assembled out of lexical building blocks. The key operation at work here is feature unification, which also delimits the positional options of the syntactic constituents. During the second stage, the branches of the mob ...
Chapter 12
... between words and phrases. For example the verb want can be followed by an infinitive, as in I want to fly to Detroit, or a noun phrase, as in I want a flight to Detroit. But the verb find cannot be followed by an infinitive (*I found to fly to Dallas). These are called facts about the subcategoriza ...
The adaptation of a machine-learned sentence realization system to
... French specific information was necessary. The most common context of insertion is with être (“to be”), and a feature specific to that environment was added to the set of extracted features. For determining the syntactic label of a constituent, more information again is needed in French, because of ...
S(A)
... • a branch of linguistics that studies the rules that govern the formation of sentences. • a branch of linguistics that studies the rules governing the ways different constituents are combined to form sentences in a language, or the study of the interrelationships between elements in sente ...
1 Robert Frank, Phrase structure composition and syntactic
... and explains the ungrammaticality as due to the fact that elementary trees for nominals are not able to provide a recursive structure that permits adjoining. Frank concludes this chapter with a detailed analysis of copular sentences, as, for example, The assassination of the king was the cause of th ...
Syntax
... vs. *She ate an apple and so did I a pear): only an entire verbal constituent can be replaced with do. Certain groups of words form close units, constituents, e.g. (1) nominal constituent (NP = noun phrase), (2) verbal constituent (VP = verb phrase) ...
2. Natural Language Processing (NLP)
... Natural language processing systems take strings of words (sentences) as their input and produce structured representations capturing the meaning of those strings as their output. The nature of this output depends heavily on the task at hand. A natural language understanding system serving as an int ...
C14-1101 - ACL Anthology
... certain heavily understudied and even largely unnoticed linguistic phenomena that deserve scientific study independently of whether their neglect causes serious errors in today’s NLP applications or not. However, on the other hand, taking these phenomena into account is definitely useful for applica ...
Morphology and a More ‘Morphological’
... exists between form and content. Words clearly have a phonological organization, into features, segments, syllables, and larger prosodic constituents such as the foot, but the central question in this area has always been that of other sorts of complexity within words. Do the forms of words, that is ...
Reconstruction the Lexical Domain with a Single Generative
... (17) “Paradigmatic” includes the notion that (a) inflection fills out feature space such that, for example, every noun will have all the case forms it needs to participate fully in the syntax and (b) inflection is typically syncretic such that a single form spreads to fill several cells in paradigm ...
Paraphrasing of Synonyms for a Fine
... Paraphrasing is used in many areas of Natural Language Processing – ontology linking, question answering, summarization, machine translation, etc. Paraphrasing between synonyms seems a relatively simple task, but in practice an automatic paraphrasing of synonyms might produce ungrammatical or unnatu ...
PowerPoint
... A constituent is a group of words which function as a unit. If you can replace part of the sentence with another constituent (the smallest constituent being a single word), this tells us that the replaced section of the sentence is a constituent. This isn’t foolproof, but it usually works if you try ...
Generating Text with Hidden Meaning
... Each chunk can attach to the left, to the right, or via an intermediary, something I call ‘upward attachment’. This specification resembles Combinatory Categorial Grammar (Steedman, 2000), in that a noun can attach to the left to the verb of which it is the direct object, an adjective can attach t ...
LECTURE 5 CONTENTS 1. Lexical Functional Grammar (LFG
... Functional info comprises information about the function of the different parts of a phrase as well as a small set of axioms. For instance, a phrasal constituent may function as the subject of the verb and another as its object. At the axiomatic level, no predicate is allowed to ...
VERB
... syntactic pattern and which does not involve any immediate or intrinsic modification of its surface manifestation” (Harris & Campbell 1995:61; ...
Syntactic Analysis
... Auxiliaries are lexical items like "will", "might", "do", "may" which often precede another verb. For example, "John will eat pizza". These auxiliaries often carry information about tense, aspect (indicating whether an action is ongoing or completed) or some modality (indicating the possibility of ...
Experiments for Dependency Parsing of Greek
... and coordinating conjunctions. Coordinating conjunctions and apposition markers head participating tokens in relevant constructions. Table 1 contains some of the most common dependency relations used in the treebank, while Figure 1 presents a sentence fragment that contains a non-projective arc conn ...
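The non-projective arc mentioned in this excerpt can be detected mechanically: a dependency tree is projective iff no two arcs cross when drawn above the sentence (with arcs from the artificial root drawn from position 0). A minimal sketch in Python; the head-vector encoding and the two example trees are hypothetical, not from the Greek treebank itself:

```python
def is_projective(heads):
    """heads[i] is the head of word i+1 (1-based word indices; head 0 = artificial root)."""
    # Normalize every dependency to a (left, right) span, including root attachments.
    arcs = [(min(h, d), max(h, d)) for d, h in enumerate(heads, start=1)]
    for i, (a1, b1) in enumerate(arcs):
        for a2, b2 in arcs[i + 1:]:
            # Two arcs cross iff exactly one endpoint of one lies strictly inside the other.
            if a1 < a2 < b1 < b2 or a2 < a1 < b2 < b1:
                return False
    return True

# Nested arcs: projective.
assert is_projective([2, 0, 2, 3]) == True
# Arcs (1,3) and (2,4) interleave: non-projective.
assert is_projective([3, 4, 0, 3]) == False
```

The quadratic pairwise check is the simplest formulation; treebank tools usually use the equivalent "every word between head and dependent must be a descendant of the head" test.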
15_chapter 5
... interpretation. For example, it is not necessary to read the head modifier for a tree that does not show it directly, for head complement relations. 2. The dependency tree contains one node per word. Because the parser’s job is only to connect existing nodes, and not to postulate new ones, the task o ...
Introducing probabilistic information in Constraint Grammar
... while 'become' is ambiguous not in terms of PoS, but between two different verbal readings (infinitive and participle), as are 'published' and 'convinced' (participle vs. past tense) A Constraint Grammar rule handles such ambiguity through explicit contextual constraints, defining, as it were, what ...
Unifying Semantic Relations Across Syntactic Levels
... should be applied to all expressions. We also want to use this representation to analyze the change of semantic relations when the utterances are changed by deletion, as discussed in Section 2. If a verb is deleted, we want to see how the case relations change, and what they correspond to in the new ...
Dependency grammar
Dependency grammar (DG) is a class of modern syntactic theories that are all based on the dependency relation (as opposed to the constituency relation) and that can be traced back primarily to the work of Lucien Tesnière. Dependency is the notion that linguistic units, e.g. words, are connected to each other by directed links. The (finite) verb is taken to be the structural center of clause structure. All other syntactic units (words) are either directly or indirectly connected to the verb in terms of the directed links, which are called dependencies. DGs are distinct from phrase structure grammars (constituency grammars), since DGs lack phrasal nodes, although they acknowledge phrases. Structure is determined by the relation between a word (a head) and its dependents. Dependency structures are flatter than constituency structures in part because they lack a finite verb phrase constituent, and they are thus well suited for the analysis of languages with free word order, such as Czech, Turkish, and Warlpiri.
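The head-dependent links described above map naturally onto a flat data structure: one head index per word, with the finite verb attached to an artificial root. A minimal sketch in Python, using a hypothetical three-word sentence (the encoding is illustrative, not a standard from any particular treebank):

```python
# Sentence: "Students read books", with an artificial root at index 0.
words = ["<root>", "Students", "read", "books"]
heads = [None, 2, 0, 2]  # heads[i] = index of the head of word i

def dependents(h):
    """Return the words that depend directly on the word at index h."""
    return [words[i] for i, head in enumerate(heads) if head == h]

# The finite verb "read" is the structural center: everything hangs off it.
assert dependents(2) == ["Students", "books"]  # subject and object
assert dependents(0) == ["read"]               # the root dominates only the verb
```

Note there is no node for a verb phrase "read books": the flatness comes directly from the lack of phrasal nodes, exactly as the paragraph states.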