This unscheduled post contains my responses to several points in a comment on LS7 by HK. I’ve also included responses to points about anaphor binding and agreement in an earlier comment on LS6 by HK.
Answering some of the points needed a wider range of formatting options than I’ve been able to get for comments from the WordPress software. (If anyone knows better, please let me know.) Therefore this is a separate post.
I’ve separated HK’s points and put them in italics, and answered each one in regular type.
Thanks, HK. We need more stuff like yours.
Consider, for example, the c-command condition on reflexive binding. It is satisfied in:
HimSELF John admires most.
This is easily accounted for if there is a trace of movement (or similar). Without it, I’m not sure what you’d have to say about facts like these.
The junction John__admires uses PJ / CJ / MJ for John and PA / CA / MA for admires. The rule CJ / R1 / CA then delivers the proposition ADMIRE / EXPERIENCER / JOHN.
The junctions admires__himself and himself__admires both use PA / CA / MA for admires and PH / CH / MH for himself; but while the canonical sequence uses the rule CA / R2 / CH, the fronted sequence uses rules CH / R3 / CA and CH / R4 / CA.
Rules CA / R2 / CH and CH / R3 / CA both create ADMIRE / SOURCE / (null) which, needing a concept to be complete, grabs JOHN (already linked to ADMIRE) to deliver ADMIRE / SOURCE / JOHN.
Rule CH / R4 / CA delivers ADMIRE / TOPICALISED / SOURCE.
Some of this goes beyond what is discussed in LS7 and 8. But it’s straightforward and requires no separate, ghost-in-the-machine process.
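To make the walkthrough concrete, here is a toy sketch in Python. Everything in it – the dictionary ‘lexicon’, the category labels, the handling of the reflexive – is an illustrative assumption of mine, not the machinery the proposal actually envisages.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Word:
    P: str  # phonological form
    C: str  # category
    M: str  # meaning concept

@dataclass
class Proposition:            # an M / R / M proposition delivered to cognition
    predicate: str
    relation: str
    argument: Optional[str]   # None models the (null) slot

# C / R / C rules, keyed on the categories of the two words in surface order;
# each entry names the relation and says which word of the pair is the predicate.
RULES = {
    ("CJ", "CA"): ("EXPERIENCER", 1),  # John__admires     (R1)
    ("CA", "CH"): ("SOURCE", 0),       # admires__himself  (R2)
    ("CH", "CA"): ("SOURCE", 1),       # himself__admires  (R3)
    # R4 (himself__admires -> ADMIRE / TOPICALISED / SOURCE) is left out here.
}

REFLEXIVES = {"HIMSELF"}  # contribute no concept of their own

def join(first: Word, second: Word, delivered: list) -> None:
    """Apply a C / R / C rule to a junction and deliver an M / R / M proposition."""
    rule = RULES.get((first.C, second.C))
    if rule is None:
        return
    relation, pred_index = rule
    pair = (first, second)
    predicate = pair[pred_index].M
    arg_word = pair[1 - pred_index]
    if arg_word.M in REFLEXIVES:
        # ADMIRE / SOURCE / (null): the incomplete proposition grabs a concept
        # already linked to the same predicate -- here JOHN.
        argument = next((p.argument for p in delivered
                         if p.predicate == predicate and p.argument), None)
    else:
        argument = arg_word.M
    delivered.append(Proposition(predicate, relation, argument))

# The fronted order 'HimSELF John admires' ends up with the same propositions:
john    = Word("John",    "CJ", "JOHN")
admires = Word("admires", "CA", "ADMIRE")
himself = Word("himself", "CH", "HIMSELF")

props: list = []
join(john, admires, props)     # ADMIRE / EXPERIENCER / JOHN
join(himself, admires, props)  # ADMIRE / SOURCE / JOHN
```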
Other examples show that the trace of this kind of movement is active for agreement:
MARY John says is/*are coming to the party.
Possible junctions are prioritised. Processing is one-pass, left-to-right. For the nth word in a sentence, Pn / Cn / Mn, the sequence in which possibilities are tried is in principle: Cn-1 with Cn, then Cn-2 with Cn, and so on back to C1 with Cn (with either word of the pair as the first element of the C / R / C rule).
But not all of these will exist as rules in the ‘lexicon’. And those that are in the lexicon may not be available, because a word can occur in a sentence only once as a dependent. Subject to that, the backwards scan stops when a valid rule for Cn is found. (No ghost-in-the-machine is required if a gradient of decaying activation over the preceding words gives the right sequence.)
In the given sentence, Mary and John might turn out to be coordinated but no junction is recognised without and. John__says prevents Mary__says and John__is; but it allows Mary__is. Mary__are is not in the lexicon. Etc.
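For concreteness, a toy version of that backwards scan might look like the following; the rule set, the category labels (CNsg, CVsg) and the ‘used once as a dependent’ bookkeeping are my own simplifications.

```python
from collections import namedtuple

W = namedtuple("W", "P C")  # just the phonological form and the category

RULES = {
    ("CNsg", "CVsg"),  # singular noun__singular verb: John__says, Mary__is
    # ("CNsg", "CVpl") is absent, so a junction Mary__are is simply not in the lexicon
}

def scan_for_junction(words, n, used_as_dependent):
    """For word n, try the preceding words from nearest to farthest (the
    decaying-activation gradient) and stop at the first pairing that the
    lexicon licenses and whose other word is still free to be a dependent."""
    for i in range(n - 1, -1, -1):
        if i in used_as_dependent:
            continue                          # a word is a dependent only once
        if (words[i].C, words[n].C) in RULES:
            used_as_dependent.add(i)
            return f"{words[i].P}__{words[n].P}"
    return None

# MARY John says is coming to the party
sentence = [W("Mary", "CNsg"), W("John", "CNsg"), W("says", "CVsg"), W("is", "CVsg")]
used = set()
print(scan_for_junction(sentence, 2, used))  # John__says
print(scan_for_junction(sentence, 3, used))  # Mary__is, because John is already taken
```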
So, the lexicon contains rules that define licit joins of categories. It would help my understanding greatly if these rules were spelled out a bit. Give some examples. What kind of Rs are there?
C / R / C rules are a bit abstract and so I make no attempt to spell them out. However there will be plenty of examples of the M / R / M propositions that result from incoming phonological words: see stuff in small caps. I don’t venture much beyond predicate-argument relations. Using labels from theta theory gives us some common ground and is not too misleading.
However I’m undecided on how much semantics is brought to the delivered proposition by the relation and how much by the predicate. I tend to think of the relation as syntactic – see my definition of R in LS7. Trivially, that links back to the diagrams in earlier pieces. But my instinct is to have the smallest number of relations for word__word junctions – to allow those Rs to be plausibly innate.
Rs are concepts and could be anything. A wider range of Rs might occur in propositions derived from prosody, gesture etc. These might be learned rather than innate.
And if Cs are just categories rather than words, then how can the rules capture basic subcategorization patterns? A key fact about language is that some relations are encoded between items that are not adjacent (e.g. John gave Mary a book, or LUCY John kissed).
This follows naturally from words and rules as in LS7. For example:
Anything that has C2 can attach as AGENT to GIVE, LEND, SELL etc.; any C3 as THEME; any C4 as GOAL.
Verbs give, lend and sell are listed as syntactically identical in Levin (1993). But arguably give is different because it allows an inanimate AGENT:
(i) Bouillabaisse gave John salmonella
(ii) * Bouillabaisse sold John salmonella
This distinction can be treated as syntactic by assuming different Cs for the AGENT rules of GIVE and of LEND and SELL. Attributing the unacceptability of (ii) vaguely to ‘semantic processing’ would be less satisfying.
So, while the C for BOUILLABAISSE is not allowed by the rule for SELL/AGENT, it is for SELL/THEME:
(iii) Bouillabaisse sold in Marseille…
Therefore the reduced relative form of (iii) can never give a ‘garden path’. Interestingly, my own analysis of the British National Corpus showed that the reduced form is predominant for relative clauses, while garden paths are vanishingly rare.
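A toy version of how the rules alone could carry this, assuming invented category labels (Cj for the C of JOHN-type nouns, Cb for the C of BOUILLABAISSE) and a table layout of my own:

```python
# Allowed categories per verb/relation pair; the labels and layout are invented.
RULES = {
    ("GIVE", "AGENT"): {"Cj", "Cb"},  # give tolerates an inanimate AGENT: (i)
    ("LEND", "AGENT"): {"Cj"},
    ("SELL", "AGENT"): {"Cj"},        # hence (ii) is out
    ("SELL", "THEME"): {"Cj", "Cb"},  # but (iii) 'Bouillabaisse sold...' is fine
}

def licensed(verb, relation, c):
    """Is there a rule letting a word of category c attach to verb via relation?"""
    return c in RULES.get((verb, relation), set())

print(licensed("GIVE", "AGENT", "Cb"))  # True   -- (i)
print(licensed("SELL", "AGENT", "Cb"))  # False  -- (ii)
print(licensed("SELL", "THEME", "Cb"))  # True   -- (iii)
```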
(iv) Nero gave Olivia to Poppaea
(v) Nero restored Olivia to Poppaea
(vi) * Nero refused Olivia to Poppaea
(vii) Nero gave Poppaea Olivia
(viii) * Nero restored Poppaea Olivia
(ix) Nero refused Poppaea Olivia
The verb restore accommodates one object as THEME. A second object cannot attach anywhere, and the incomplete proposition it leaves signals ‘ungrammaticality’. There are rules allowing restored__to and to__Poppaea, which together deliver RESTORE / GOAL / POPPAEA.
The verb refuse is ditransitive like give, except that the first object is always GOAL, not temporarily shared between GOAL and THEME (as shown in LS8). Also there is no rule allowing refused__to.
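Here is a toy sketch of how (v), (vi), (viii) and (ix) could fall out of which rules the lexicon contains. The slot lists are my simplification; give, whose first object is temporarily shared between GOAL and THEME (LS8), is left out of the toy.

```python
BARE_OBJECT_SLOTS = {
    "RESTORE": ["THEME"],           # one bare object only
    "REFUSE":  ["GOAL", "THEME"],   # ditransitive; first bare object is always GOAL
}
ALLOWS_TO = {"RESTORE"}             # restored__to exists; refused__to does not

def parse(verb, objects, to_phrase=None):
    """Attach each bare object to the verb's next free slot. An object with no
    slot, or a to-phrase with no restored__to-style rule, leaves an incomplete
    analysis, which is what signals 'ungrammaticality'."""
    slots = list(BARE_OBJECT_SLOTS[verb])
    props = []
    for obj in objects:
        if not slots:
            return f"* {obj} cannot attach anywhere"
        props.append(f"{verb} / {slots.pop(0)} / {obj}")
    if to_phrase is not None:
        if verb not in ALLOWS_TO:
            return f"* no rule for {verb.lower()}d__to"
        props.append(f"{verb} / GOAL / {to_phrase}")
    return props

print(parse("RESTORE", ["OLIVIA"], to_phrase="POPPAEA"))  # (v)    ok
print(parse("REFUSE",  ["OLIVIA"], to_phrase="POPPAEA"))  # (vi)   * no refused__to
print(parse("RESTORE", ["POPPAEA", "OLIVIA"]))            # (viii) * OLIVIA stranded
print(parse("REFUSE",  ["POPPAEA", "OLIVIA"]))            # (ix)   GOAL then THEME
```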
Furthermore, (as a counterpart) items may be adjacent in a string and not in an appropriate selection relation, where in your proposal perhaps they ought to be? (For example, the verb and adjective in “John kissed beautiful girls”).
This is covered above under Agreement. Nowhere does LanguidSlog say that paired words must be adjacent. LS8 discusses a sentence with give, showing non-adjacent pairings.
The picture looks nice, but the proposal advanced here remains very unclear. If I’ve understood it correctly, R is a semantic relation (say ‘patient’), not a syntactic relation.
P, M, C and R concepts and P / C / M and C / R / C propositions are logically fundamental to language. (Indeed concepts and propositions must be fundamental to all mental processes and to the storing of ‘knowledge’, innate and acquired.) They are implemented directly in the ‘bits and bytes’ of the neurophysiological hardware. Therefore it’s not really appropriate to characterise R as ‘syntactic’ or as ‘semantic’.
But it would not be too misleading to say ‘an R is syntactic in the C / R / C rule and semantic in the M / R / M proposition delivered to cognition’.
For an explanation of my tactics, see Rules above.
‘No junction can include another junction’ implies that the proposal will fail to take account of constituency.
The proposal accounts for empirical data that are accounted for elsewhere by constituency. LS2 to 6 (plus my initial response to HK’s comment on LS6) show why those other accounts describe but fail to explain.
Of course the amount of empirical data addressed using the proposal to date is a tiny fraction of what has been addressed elsewhere. LS7 is only to discourage instant dismissal of LS2 to 6 because ‘there is no alternative’. And the whole of LanguidSlog will not answer every mystery of language revealed by the many thousands of man-years that have been expended on generative grammar.
Dependency grammars typically have the generative capacity of context-free phrase structure grammars. Grammars that fit your proposed computational arrangement probably have no more generative capacity than a regular grammar (which would be far too weak for natural language). But I may not have understood the proposal very well.
The generative capacity of the proposal is constrained by the M / P / C / R / C / P / M sub-assembly and by the backwards scan (see Agreement above). There is no algebra to express that and no plan to formulate one. But my strong impression is that the limits of the proposal and of acceptable English sentences are closely aligned, although long-distance junctions have not been tackled yet.
I don’t understand ‘Grammars that fit…regular grammar’ and wonder whether this repeats the misunderstanding about Adjacency (above). Ironically, parsing a regular language would need a stored-program computer.
Your ‘every possible sentence of an idiolect is pre-stored’ flies in the face of the well-known observation that we can understand sentences we have never heard before, as long as we know the words that are contained in them.
‘Pre-stored’ doesn’t mean the sentence has already been heard or voiced. It means that the mental network has a great many paths through it – infinitely many because of recursion. For once I’m not disputing the orthodoxy, but simply making a point about real-time computation.