So you like linguistics and you’ve stumbled across LanguidSlog? Welcome to the party. Here’s a little overview of what you’ve missed so far. But don’t worry: you can get up to speed quickly and click through to any of the posts. Keep following and get commenting – whatever your interest or level of knowledge in linguistics.
LS1 introduces the blog and the blogger. Linguistics is disappointing. The reason could be false assumptions about mental processing. LanguidSlog will avoid that pitfall and deliver a truly explanatory theory – in 57 weeks. It’s 57 years (and counting) since Chomsky’s Syntactic Structures.
LS2 asks why syntacticians place so much emphasis on sentence structure and what the connection is between sentence structure and meaning.
LS3 says that, if structure participates in sentence processing, it must somehow exist in the mental architecture. It defines what the structure for John kissed Lucy must contain. The idea of a junction is introduced; this is similar to a ‘dependency’ in some other grammars.
LS4 discusses the content of a sentence structure held in the mental architecture. The need to make copies of previously-stored concepts makes the idea of structure rather implausible.
LS5 therefore looks more closely at ‘mental architecture’. Theoretical linguists all seem to assume that the human brain is like a stored-program computer. That cannot be so unless the brain has addressable storage – and it does not. Thus the explanatory adequacy of any existing theory is doubtful.
LS6 questions one of the sacred cows of mainstream generative grammar – the idea of ‘movement’. Structure cannot participate in sentence processing. We must work out how phonological input is transformed directly into meaning comprehended by the hearer.
LS7 starts to develop the alternative to existing phrase-structure and dependency grammars. The first idea is that the lexicon holds three-concept propositions defining words and rules. Crucially, these accommodate the many-to-many relationship between phonological strings and meanings.
LS8 shows how the meaning of an entire sentence is given by the rules for the two-word junctions it contains. The rules deliver meaning to cognition. These propositions are linked only by simultaneity, not by ‘structure’. In many situations, meaning cannot be delivered immediately: a proposition may need to be completed by combining it with a proposition from a junction later in the sentence.
LS9 is an excursion into the world of IT. It claims that the three-concept proposition can be used to store any data. There is a strong inference that this simple model evolved as the means by which higher mammals store knowledge. A postscript proposes that LanguidSlog’s method be called NG or Network Grammar.
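To make the storage claim concrete, here is a minimal sketch of knowledge held as a flat collection of three-concept propositions. It assumes only that a proposition is an ordered triple of concepts; the names (Proposition, KnowledgeStore, about) and the sample relations are illustrative inventions, not terminology from the blog.

```python
# Sketch: knowledge as a flat set of three-concept propositions.
# All class/method names and sample relations here are illustrative
# assumptions, not LanguidSlog's own terminology.

from typing import NamedTuple

class Proposition(NamedTuple):
    a: str  # first concept
    b: str  # second concept (often naming a relationship)
    c: str  # third concept

class KnowledgeStore:
    """Holds knowledge as an unordered set of three-concept propositions."""
    def __init__(self):
        self.props = set()

    def add(self, a, b, c):
        self.props.add(Proposition(a, b, c))

    def about(self, concept):
        """Return every proposition that mentions the given concept."""
        return {p for p in self.props if concept in p}

store = KnowledgeStore()
store.add("dog", "is-a", "mammal")   # hierarchical (taxonomic) link
store.add("kiss", "agent", "John")   # role in an event
store.add("kiss", "patient", "Lucy")

print(len(store.about("kiss")))  # 2
```

The point of the sketch is that nothing beyond the triples themselves is needed: hierarchy, event roles, and arbitrary relations are all expressed in the same uniform three-slot form.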
LS10 illustrates the three-concept model. There can be hierarchical relationships between concepts – as in a taxonomy. But more generally, relationships form a network, with progression from one concept fanning out through ever more numerous, ever simpler concepts.
LS11 asserts that a sentence is processed in one left-to-right pass. The sequence of junctions is illustrated using Nero is giving Olivia to Poppaea. What happens with ungrammatical input is also discussed.
LS12 details what happens with the rules invoked by the junctions in Nero is giving Olivia to Poppaea. Important points are how, at Nero is…, the rules cover the possibility that the sentence continues …mad; and likewise how, at Nero is giving Olivia…, they cover a continuation such as …a puppy.
LS13 introduces a tabular format to show the junction-by-junction processing of sentences, including the points at which meaning is delivered to cognition.
And now to LS14…