Linguistic Structure as a Relational Network
Sydney Lamb, Rice University ([email protected])
National Taiwan University, 9 November 2010

Topics
• Aims of Neurocognitive Linguistics
• The origins of relational networks
• Relational networks as purely relational
• Narrow relational network notation
• Narrow relational networks and neural networks
• Levels of precision in description
• Appreciating variability in language

Aims of Neurocognitive Linguistics ("NCL")
• NCL aims to understand the linguistic system of a language user as a dynamic system:
  - It operates: speaking, comprehending, learning, etc.
  - It changes as it operates
  - It has a locus: the brain
• NCL seeks to learn:
  - How information is represented in the linguistic system
  - How the system operates in speaking and understanding
  - How the linguistic system is connected to other knowledge
  - How the system is learned
  - How the system is implemented in the brain

The linguistic system of a language user: Two viewing platforms
• Cognitive level: the cognitive system of the language user, without considering its physical basis
  - The cognitive (linguistic) system
  - Field of study: "cognitive linguistics"
• Neurocognitive level: the physical basis
  - Neurological structures
  - Field of study: "neurocognitive linguistics"

"Cognitive Linguistics"
• First occurrence of the term in print: "[The] branch of linguistic inquiry which aims at characterizing the speaker's internal information system that makes it possible for him to speak his language and to understand sentences received from others." (Lamb 1971)

Operational Plausibility
• To understand how language operates, we need the linguistic information represented in such a way that it can be directly used for speaking and understanding
• A "competence model" that is not competence to perform is unrealistic: competence is competence to perform
• The information in a person's mind is "knowing how", not "knowing that"
• Information in operational form: able to operate without manipulation from some added "performance" system

Topics (next: The origins of relational networks)

Relational network notation
• Thinking in cognitive linguistics was facilitated by relational network notation
• Developed under the influence of the notation used by M.A.K. Halliday for systemic networks

Precursors
• In the 1960s the linguistic system was viewed (by Hockett, Gleason, me, and others) as containing items (of unspecified nature) together with their interrelationships
  - Cf. Hockett's "Linguistic units and their relations" (Language, 1966)
• Early primitive notations showed units with connecting lines to related units

The next step: Nodes
• The next step was to introduce nodes to go along with such connecting lines
• This allowed the formation of networks: systems consisting of nodes and their interconnecting lines
• Halliday's notation used different nodes for paradigmatic ('or') and syntagmatic ('and') relationships
  - Just what I was looking for
[Figure: the downward OR – DIFFICULT realized as either "hard" or "difficult"; the downward AND – a followed by b]

The ordered AND
• We need to distinguish simultaneous from sequential
• For sequential, the 'ordered AND'
• Its two (or more) lines connect to different points at the bottom of the triangle (in the case of the 'downward and'), to represent sequential activation leading to sequential occurrence of items
[Figure: downward (ordered) AND – Vt followed by Nom]

Upward and Downward
• Expression (phonetic or graphic) is at the bottom; therefore, downward is toward expression
• Upward is toward meaning (or other function) – more abstract
[Figure: network with meaning at the top and expression at the bottom]

Neurological interpretation of up/down
• At the bottom are the interfaces to the world outside the brain:
  - Sense organs on the input side
  - Muscles on the output side
• 'Up' is more abstract

Topics (next: Relational networks as purely relational)

Morpheme as item and its phonemic representation
[Figure: boy – symbols? objects? – b-o-y]

Relationship of boy to its phonemes
• As a morpheme, boy is just one unit
• It is realized as three phonemes, in sequence: b, o, y

The nature of this "morphemic unit"
[Figure: the node for boy (the object we are considering) connected upward to BOY and Noun, and downward to b, o, y]

The morpheme as purely relational
• We can remove the symbol with no loss of information; therefore, it is a connection, not an object
• Another way of looking at it: [Figure: the same relationships drawn with, and then without, the label boy]

A closer look at the segments
• The phonological segments also are just locations in the network – not objects
[Figure: the segments b (as in Bob), o (as in toy), y connected downward to phonological features]

Relationships of boy
• The label "boy" is just a label – not part of the structure

Objection I
• If there are no symbols, how does the system distinguish this morpheme from others?
• Answer: other morphemes necessarily have different connections
• Another node with the same connections would be another (redundant) representation of the same morpheme

Objection II
• If there are no symbols, how does the system know which morpheme it is?
• Answer: if there were symbols, what would read them? Miniature eyes inside the brain?

Relations all the way
• Perhaps all of linguistic structure is relational
• It's not relationships among linguistic items; it is relations to other relations to other relations, all the way to the top – at one end – and to the bottom – at the other
• In that case the linguistic system is a network of interconnected nodes

Objects in the mind?
• When the relationships are fully identified, the objects as such disappear, since they have no existence apart from those relationships
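The claim that a morpheme is a connection rather than an object can be made concrete with a small data-structure sketch. This is purely illustrative and not Lamb's own formalism: node identity lives entirely in the connectivity, and the label "boy" sits in a separate table used only for display.

```python
# Minimal sketch: nodes have no internal content, only connections.
# Labels live in a separate table used only for printing; deleting
# that table loses no structural information.

class Node:
    def __init__(self):
        self.up = []    # connections toward meaning/function
        self.down = []  # connections toward expression

def connect(upper, lower):
    upper.down.append(lower)
    lower.up.append(upper)

# Build the fragment for the morpheme realized as b, o, y.
segments = [Node(), Node(), Node()]      # the three phonological segments
morpheme = Node()                        # "boy" as a morpheme: just a node
concept, noun = Node(), Node()           # BOY, Noun

for seg in segments:
    connect(morpheme, seg)
connect(concept, morpheme)
connect(noun, morpheme)

# External annotation only, not part of the network.
labels = {concept: "BOY", noun: "Noun", morpheme: "boy",
          segments[0]: "b", segments[1]: "o", segments[2]: "y"}

# The morpheme is fully identified by what it connects to.
print([labels[n] for n in morpheme.up])    # ['BOY', 'Noun']
print([labels[n] for n in morpheme.down])  # ['b', 'o', 'y']
```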
Quotation
• "The postulation of objects as something different from the terms of relationships is a superfluous axiom and consequently a metaphysical hypothesis from which linguistic science will have to be freed." – Louis Hjelmslev (1943/61)

Syntax is also purely relational
• Example: the Actor-Goal construction
[Figure: semantic function DO-SMTHG (material process, type 2) and syntactic function CLAUSE, with variable expression, realized by Vt and Nom]

Syntax: Linked constructions
[Figure: TOPIC-COMMENT and DO-SMTHG (material process, type 2) linked through CL, realized by Nom, Vt, Nom]

Add another type of process
[Figure: THING-DESCR and BE-SMTHG added alongside DO-TO-SMTHG, with realizations Vt, be, Adj, Nom, Loc]

More of the English Clause
[Figure: CL with Subj and Pred; FINITE, to, <V>-ing, Predicator; BE-SMTHG, DO-TO-SMTHG; Conc, Past, Mod; Vi, be, Vt]

The downward ordered OR
• For the 'or' relation we don't have sequence, since only one of the two (or more) lines is activated
• But an ordering feature for this node is useful to indicate precedence, so we have precedence ordering
• One line is for the marked condition: if conditions allow its activation to be realized, it will be chosen in preference to the other line
• The other line is the default
[Figure: the downward ordered OR – the unmarked choice (a.k.a. default) is the line that goes right through; the marked choice is off to the side, on either side]

Optionality
• Sometimes the unmarked choice is nothing
• In other words, the marked choice is an optional constituent

Conclusion: relationships all the way to... what is at the bottom?
• Introductory view: it is phonetics
• In the system of the speaker, we have relational network structure all the way down to the points at which muscles of the speech-producing mechanism are activated
  - At that interface we leave the purely relational system and send activation to a different kind of physical system
• For the hearer, the bottom is the cochlea, which receives activation from the sound waves of the speech hitting the ear

What is at the top?
• Is there a place up there somewhere that constitutes an interface between a purely relational system and some different kind of structure?
  - This question wasn't actually asked at first
  - It was clear that as long as we are in language we are in a purely relational system, and that is what mattered
• Somehow at the top there must be meaning

What are meanings?
[Figure: in the mind, the node DOG-C connected to perceptual properties of dogs; in the world outside, all those dogs out there and their properties]

How high is up?
• Downward is toward expression; upward is toward meaning/function
• Does it keep going up forever? No: as it keeps going it arches over, through perception
• Conceptual structure is at the top
[Figure: the great cognitive arch – the "top"]

Relational networks: Cognitive systems that operate
• Language users are able to use their languages
• Such operation takes the form of activation of lines and nodes
• The nodes can be defined on the basis of how they treat incoming activation

Nodes are defined in terms of activation: The downward ordered AND
• Downward activation from k goes to a and later to b
• Upward activation from a and later from b goes to k
[Figure: node k over lines a and b]
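As an operational sketch only (the class and method names are invented, and Lamb's narrow notation models the timing with a 'wait' element discussed later), the downward ordered AND can be simulated as a unit that emits its lower lines in sequence when activated from above, and fires upward only after its lower lines have been activated in the right order.

```python
# Hedged sketch of the downward ordered AND node k over lines a and b:
# downward activation yields a then b; upward recognition requires
# activation from a and then from b, in that order.

class OrderedAnd:
    def __init__(self, lower):
        self.lower = list(lower)   # ordered lower connections, e.g. ['a', 'b']
        self._seen = 0             # how much of the sequence has arrived

    def activate_downward(self):
        """Activation from above: send activation down each line in order."""
        return list(self.lower)    # ['a', 'b'] means a first, then b

    def activate_upward(self, line):
        """Activation from below: fire upward only when the whole
        ordered sequence has arrived."""
        if line == self.lower[self._seen]:
            self._seen += 1
        else:
            self._seen = 0         # wrong order: start over
        if self._seen == len(self.lower):
            self._seen = 0
            return True            # activation is passed up to k
        return False

k = OrderedAnd(['a', 'b'])
print(k.activate_downward())   # ['a', 'b']
print(k.activate_upward('a'))  # False (waiting for b)
print(k.activate_upward('b'))  # True  (k is activated)
```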
Nodes are defined in terms of activation: The downward unordered OR
• The OR condition is not achieved locally, at the node itself; it is just a node and has no intelligence
• Usually there will be activation coming down from either p or q, but not from both
[Figure: downward unordered OR, with lines labeled k, p, q, a, b]

Nodes are defined in terms of activation: The OR
• Upward activation from either a or b goes to k
• Downward activation from k goes to a and b

The Ordered AND: Upward Activation
[Figure: activation moving upward from below]

The Ordered AND: Downward Activation
[Figure: activation coming downward from above]

Upward activation through the OR
• The OR operates as either-or for activation going from the plural side to the singular side
• For activation going from the singular side to the plural side it acts locally as both-and, but in the context of other nodes the end result is usually either-or
• Usually the context allows only one interpretation (BILL1 vs. BILL2), as in "I'll send you a bill for it"
• But if the context allows both to get through, we have a pun: a duck goes into a pub, orders a drink, and says, "Put it on my bill"

Shadow meanings: Zhong Guo
[Figure: the expression zhong guo connected upward to both CHINA and MIDDLE KINGDOM]

The ordered OR: How does it work?
• The ordered (marked) line is taken if possible; otherwise the default
• Node-internal structure (not shown in the abstract notation) is required to control this operation
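The precedence behavior of the downward ordered OR can be sketched in a few lines. This is only an illustration: the boolean flag below stands in for the node-internal structure that actually decides whether the marked line can be realized.

```python
def ordered_or(marked, default, marked_is_possible):
    """Downward ordered OR: take the marked line when conditions allow
    it to be realized; otherwise take the default (unmarked) line."""
    return marked if marked_is_possible else default

# Using the figure's labels: a is the marked choice, b the default.
print(ordered_or("a", "b", marked_is_possible=True))   # a
print(ordered_or("a", "b", marked_is_possible=False))  # b

# Optionality: when the unmarked choice is nothing, the marked choice
# behaves as an optional constituent.
print(repr(ordered_or("a", "", marked_is_possible=False)))  # ''
```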
Topics (next: Narrow relational network notation)

Toward greater precision
• The nodes evidently have internal structures; otherwise, how can we account for their behavior?
• We can analyze them and figure out what internal structure would make them behave as they do

The ordered AND: How does it know?
• How does the AND node "know" how long to wait before sending activation down the second line?
• It must have internal structure to govern this function
• We use the narrow notation to model that internal structure

Internal structure – narrow network notation
• As each line is bidirectional, it can be analyzed into a pair of one-way lines
• Likewise, the simple nodes can be analyzed as pairs of one-way nodes

Abstract and narrow notation
• Abstract notation is also known as compact notation
• A diagram in abstract notation is like a map drawn to a large scale
• Narrow notation shows greater detail and greater precision
• Narrow notation ought to be closer to the actual neural structures
• www.ruf.rice.edu/~lngbrain/shipman

Narrow relational network notation
• Developed later
• Used for representing network structures in greater detail: the internal structures of the lines and nodes of the abstract notation
• The original notation can be called the 'abstract' notation or the 'compact' notation

Narrow and abstract network notation
• Narrow notation: closer to neurological structure; nodes represent cortical columns; links represent neural fibers (or bundles of fibers); uni-directional
• Abstract notation: nodes show the type of relationship (OR, AND); easier for representing linguistic relationships; bidirectional; not as close to neurological structure
[Figure: the "eat apple" fragment drawn in both notations]

More on the two network notations
• The lines and nodes of the abstract notation are abbreviations – hence the designation 'abstract'
• Compare the representation of a divided highway on a highway map:
  - In a more compact notation it is shown as a single line
  - In a narrow notation it is shown as two parallel lines of opposite direction

Two different network notations
[Figure: a bidirectional line ab in abstract notation analyzed into separate upward and downward one-way lines in narrow notation]

Downward and upward nodes: internal structure
[Figure: internal structure of the downward and upward AND (threshold 2) and OR (threshold 1); the downward AND in the upward direction contains a 'wait' element]

AND vs. OR
• In one direction their internal structures are the same
• In the other direction the difference is a difference in threshold: a high or low threshold for the high or low degree of activation required to cross
[Figure: thresholds in narrow notation – OR and AND nodes with threshold values 1, 2, 3, 4]

The beauty of the threshold
1. You no longer need a basic distinction between AND and OR
2. You can have intermediate degrees, between AND and OR
3. The AND/OR distinction was a simplification anyway – it doesn't always work!
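A toy threshold unit makes the point concrete: with the same machinery, a high threshold gives AND-like behavior, a low threshold gives OR-like behavior, and intermediate values fall in between. The function and the numbers below are illustrative only; they are not the narrow notation itself.

```python
# A single threshold unit subsumes AND and OR: a high threshold needs
# activation on all incoming lines, a low threshold needs only one,
# and intermediate values give in-between behavior.

def threshold_node(incoming, threshold):
    """incoming: activation levels (0.0-1.0) on the node's input lines."""
    return sum(incoming) >= threshold

inputs = [1.0, 0.0, 1.0]              # two of three lines active

print(threshold_node(inputs, 3.0))    # False - behaves like an AND
print(threshold_node(inputs, 1.0))    # True  - behaves like an OR
print(threshold_node(inputs, 2.0))    # True  - intermediate: "enough" inputs
```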
The 'wait' element
• In the downward AND, downward direction, the 'wait' element keeps the activation alive: activation continues to B after A has been activated
[Figure: structure of the 'wait' element; see www.ruf.rice.edu/~lngbrain/neel]

Node types in narrow notation
• Junction
• Branching
• Blocking (T)

Two types of connection
• Excitatory
• Inhibitory

Types of inhibitory connection
• Type 1: connects to a node
• Type 2: connects to a line
  - Used for blocking default realization
  - For example, from the node for second there is a blocking connection to the line leading to two
[Figure: TWO and ORDINAL connected to the lines for two, -th, and second, with a blocking connection]

Additional details of structure can be shown in narrow notation
• Varying degrees of connection strength
• Variation in threshold strength
• Contrast

Topics (next: Narrow relational networks and neural networks)

The node of narrow RN notation vis-à-vis neural structures
• It is very unlikely that a node is represented by a single neuron; far more likely, a bundle of neurons
• At this point we turn to neuroscience: Vernon Mountcastle, Perceptual Neuroscience (1998) – cortical columns
• The cortical column: a column consists of 70-100 neurons stacked on top of one another
• All neurons within a column act together: when a column is activated, all of its neurons are activated

The node as a cortical column
• The properties of the cortical column are approximately those described by Vernon Mountcastle:
• "[T]he effective unit of operation…is not the single neuron and its axon, but bundles or groups of cells and their axons with similar functional properties and anatomical connections." – Vernon Mountcastle, Perceptual Neuroscience (1998), p. 192

Three views of the gray matter
[Figure: different stains show different features; the Nissl stain shows cell bodies of pyramidal neurons]

The cerebral cortex
• Grey matter: columns of neurons
• White matter: inter-column connections
[Figure: layers of the cortex – from top to bottom, about 3 mm]

The white matter
• Provides long-distance connections between cortical columns
• Consists of axons of pyramidal neurons; the cell bodies of those neurons are in the gray matter
• Each such axon is surrounded by a myelin sheath, which provides insulation and enhances conduction of nerve impulses
• The white matter is white because that is the color of myelin

Dimensionality of the cortex
• Two dimensions: the array of nodes
• The third dimension:
  - The length (depth) of each column, through the six cortical layers
  - The cortico-cortical connections (white matter)

Topological essence of cortical structure
• Two dimensions for the array of the columns
• Viewed this way, the cortex is an array – a two-dimensional structure – of interconnected columns

The (mini)column
• Width is about (or just larger than) the diameter of a single pyramidal cell: about 30-50 µm
• Extends through the six cortical layers: three to six mm in length
• The entire thickness of the cortex is accounted for by the columns
• Roughly cylindrical in shape
• If expanded by a factor of 100, the dimensions would correspond to a tube with a diameter of 1/8 inch and a length of one foot

Cortical column structure
• Minicolumn: 30-50 microns in diameter
• Recurrent axon collaterals of pyramidal neurons activate other neurons in the same column
• Inhibitory neurons can inhibit neurons of neighboring columns (function: contrast)
• Excitatory connections can activate neighboring columns; in this case we get a bundle of contiguous columns acting as a unit

Narrow RN notation viewed as a set of hypotheses
• Question: are relational networks related in any way to neural networks?
• A way to find out: narrow RN notation can be viewed as a set of hypotheses about brain structure and function
• Each property of narrow RN notation can be tested for neurological plausibility

Some properties of narrow RN notation (with their neurological counterparts)
• Lines have direction (they are one-way) – nerve fibers carry activation in just one direction
• Lines tend to come in pairs of opposite direction ("upward" and "downward") – cortico-cortical connections are generally reciprocal
• Connections are either excitatory or inhibitory – cortical connections are either excitatory or inhibitory (from different types of neurons, with two different neurotransmitters)

More properties as hypotheses
• Nodes have differing thresholds of activation – neurons have different thresholds of activation
• Inhibitory connections are of two kinds, Type 1 and Type 2 – inhibitory connections in the cortex are of two kinds (Type 2: "axo-axonal")
• Additional properties (too technical for this presentation) – all are verified

Topics (next: Levels of precision in description)

Levels of precision in network notation: How are they related?
• They operate at different levels of precision
• Compare chemistry and physics: chemistry for molecules, physics for atoms
• Both are valuable for their purposes
Levels of precision (e.g.)
• Systemic networks (Halliday)
• Abstract relational network notation
• Narrow relational network notation

Three levels of precision
[Figure: the same relation drawn as a systemic network, in abstract relational network notation, and in narrow notation (downward)]

Different levels of investigation: Living beings
• Systems biology
• Cellular biology
• Molecular biology
• Chemistry
• Physics

Levels of precision
• Advantages of description at a level of greater precision:
  - Greater precision
  - Shows relationships to other areas
• Disadvantages of description at a level of greater precision:
  - More difficult to accomplish; therefore, can't cover as much ground
  - More difficult for the consumer to grasp: too many trees, not enough forest

Levels of precision (the fuller hierarchy)
• Informal functional descriptions
• Semi-formal functional descriptions
• Systemic networks
• Abstract relational network notation
• Narrow relational network notation
• Cortical columns and neural fibers
• Neurons, axons, dendrites, neurotransmitters
• Intraneural structures and processes: pre-/post-synaptic terminals, microtubules, ion channels, etc.

Topics (next: Appreciating variability in language)

Precision vis-à-vis variability
• Description at a level of greater precision encourages observation of variability
• At the level of the forest, we are aware of the trees, but we tend to overlook the differences among them
• At the level of the trees we clearly see the differences among them
• But describing the forest at the level of detail used in describing trees would be very cumbersome
• At the level of the trees we tend to overlook the differences among the leaves
• At the level of the leaves we tend to overlook the differences among their component cells

Linguistic examples
• At the cognitive level we clearly see that every person's linguistic system is different from that of everyone else
• We also see variation within a single person's system from day to day
• At the level of narrow notation we can treat:
  - Variation in connection strengths
  - Variation in threshold strength
  - Variation in levels of activation
• We are thus able to explain prototypicality phenomena, learning, etc.

Radial categories and prototypicality
• Different connections have different strengths (weights)
• More important properties have greater strengths
• Example: CUP – important (but not necessary!) properties:
  - Short (as compared with a glass)
  - Ceramic
  - Having a handle
• Cups with these properties are more prototypical

The properties of a category have different weights
[Figure: the cardinal node for CUP, with threshold T, connected to the property nodes MADE OF GLASS, SHORT, CERAMIC, HAS HANDLE; the properties are represented by nodes which are connected to lower-level nodes]

Nodes have activation thresholds
• The node will be activated by any of many different combinations of properties
• The key word is enough: it takes enough activation from enough properties to satisfy the threshold
• The node will be activated to different degrees by different combinations of properties
• When strongly activated, it transmits stronger activation to its downstream nodes
• Prototypical exemplars provide stronger and more rapid activation
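A small numeric sketch of the CUP example may help here. All weights and the threshold value are invented for illustration; the slides specify only that more important properties have stronger connections and that the threshold can be satisfied to varying degrees. The negative weight anticipates the inhibitory connection shown for MADE OF GLASS in the next figure.

```python
# Sketch of the cardinal node for CUP: each property contributes
# activation scaled by its connection weight; the ratio of the summed
# activation to the threshold indicates how strongly the node responds.
# All numbers are illustrative, not taken from the slides.

WEIGHTS = {"SHORT": 0.8, "CERAMIC": 0.7, "HAS HANDLE": 0.9,
           "MADE OF GLASS": -0.5}     # negative: an inhibitory connection
THRESHOLD = 1.5

def cup_activation(properties):
    total = sum(WEIGHTS.get(p, 0.0) for p in properties)
    return total / THRESHOLD          # values above 1.0 satisfy the threshold

prototypical = {"SHORT", "CERAMIC", "HAS HANDLE"}
borderline = {"SHORT", "MADE OF GLASS"}

print(round(cup_activation(prototypical), 2))  # 1.6 - strong activation
print(round(cup_activation(borderline), 2))    # 0.2 - threshold not satisfied
```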
Activation threshold (can be satisfied to varying degrees)
[Figure: the CUP node with threshold T and property nodes MADE OF GLASS, SHORT, CERAMIC, HAS HANDLE; stronger connections carry more activation]

Explaining prototypicality
• Cardinal category nodes get more activation from the prototypical exemplars:
  - More heavily weighted property nodes (e.g., FLYING is strongly connected to BIRD)
  - Property nodes more strongly activated
• Peripheral items (e.g. EMU) provide only weak activation, weakly satisfying the threshold (emus can't fly)
• Borderline items may or may not produce enough activation to satisfy the threshold

Activation of different sets of properties produces greater or lesser satisfaction of the activation threshold of the cardinal node
[Figure: the CUP node with property nodes MADE OF GLASS (inhibitory connection), SHORT, CERAMIC, HAS HANDLE; more important properties have stronger connections, indicated by the thickness of the lines]

Explaining prototypicality: Summary
• Variation in strength of connections
• Many connecting properties of varying strength
• Varying degrees of activation
• Prototypical members receive stronger activation from more associated properties
• BIRD is strongly connected to the property FLYING
  - Emus and ostriches don't fly, but they have some properties connected with BIRD
  - Sparrows and robins do fly, and as commonly occurring birds they have been experienced often, leading to entrenchment – stronger connections

Variation over time in connection strength
• Connections get stronger with use: every time the linguistic system is used, it changes
• Can be indicated roughly by the thickness of connecting lines in diagrams, or by little numbers written next to lines

Variation in threshold strength
• Thresholds are not fixed: they vary as a result of use – learning
• Nor are they integral
• What we really have are threshold functions, such that:
  - A weak amount of incoming activation produces no response
  - A larger degree of activation results in weak outgoing activation
  - A still higher degree of activation yields strong outgoing activation
  - An S-shaped ("sigmoid") function
• N.B. All of these properties are found in neural structures
[Figure: threshold function – outgoing activation plotted against incoming activation]
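The S-shaped threshold function just described can be written out directly; the steepness and midpoint below are invented parameters, used only to reproduce the verbal description (no response, then weak output, then strong output).

```python
import math

# Sigmoid threshold function: weak input -> essentially no output,
# moderate input -> weak output, strong input -> strong output.
# Steepness and midpoint are illustrative parameters.

def threshold_function(incoming, midpoint=1.0, steepness=6.0):
    return 1.0 / (1.0 + math.exp(-steepness * (incoming - midpoint)))

for x in (0.2, 0.8, 1.0, 1.5):
    print(x, round(threshold_function(x), 3))
# 0.2 0.008   weak incoming activation: no real response
# 0.8 0.231   nearer the threshold: weak outgoing activation
# 1.0 0.5     at the midpoint
# 1.5 0.953   strong incoming activation: strong outgoing activation
```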
Thank you for your attention!

References
Hockett, Charles F. 1966. Linguistic units and their relations. Language.
Lamb, Sydney M. 1971. The crooked path of progress in cognitive linguistics. Georgetown Roundtable.
Lamb, Sydney M. 1999. Pathways of the Brain: The Neurocognitive Basis of Language. John Benjamins.
Lamb, Sydney M. 2004a. Language as a network of relationships. In Jonathan Webster (ed.), Language and Reality (Selected Writings of Sydney Lamb). London: Continuum.
Lamb, Sydney M. 2004b. Learning syntax: a neurocognitive approach. In Jonathan Webster (ed.), Language and Reality (Selected Writings of Sydney Lamb). London: Continuum.
Mountcastle, Vernon W. 1998. Perceptual Neuroscience: The Cerebral Cortex. Cambridge: Harvard University Press.

For further information: www.rice.edu/langbrain, [email protected]