
LS46. Ambiguity (1)

A correspondent has challenged NG for being apparently deterministic (see LS42): delivering a single meaning for a sentence, in one left-to-right pass, with no memory buffer.  The correspondent provides some examples, first seen in LS42, that are transiently ambiguous:

(150) Visiting relatives are a nuisance

(151) Visiting relatives is a nuisance

and other examples that are permanently ambiguous:

(169) Visiting relatives can be a nuisance

(170) John greeted the guest with a hoarse voice

Memory buffer

LanguidSlog has never made any claim about this, one way or the other, but in effect NG does have buffering.  One facility is that a word may retain activation after it is initially processed.  An incoming word may form a junction with another word any number of places to its left.

Another facility is where a junction creates incomplete propositions that are completed and delivered only when a later junction is processed.  A simple example is Nero gave Olivia… followed by either …Poppaea or …to Poppaea.
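For readers who prefer to see a mechanism in code, here is a toy sketch of the two facilities just described – lingering activation, and a proposition held back until a later junction completes it.  It is emphatically not NG itself: the class, the role labels and the decay factor are all invented for illustration.

# Toy sketch only; names and numbers are invented, not taken from NG.
DECAY = 0.5                      # assumed fading factor per incoming word

class Proposition:
    def __init__(self, parent, relation, child):
        self.parent, self.relation, self.child = parent, relation, child
    def __repr__(self):
        return f"{self.parent} / {self.relation} / {self.child}"

def process(tokens):
    activation = {}              # facility 1: earlier words retain activation
    pending = []                 # facility 2: propositions awaiting completion
    delivered = []
    for tok in tokens:
        # every word already seen fades a little but remains reachable, so the
        # incoming word can form a junction any number of places to its left
        for w in activation:
            activation[w] *= DECAY
        activation[tok] = 1.0
        if tok == "Olivia":
            # 'gave Olivia': OLIVIA's role cannot be decided yet
            pending.append(Proposition("GIVE", None, "OLIVIA"))
        elif pending and tok in ("Poppaea", "to"):
            # the later junction completes the buffered proposition:
            # bare 'Poppaea' makes OLIVIA the recipient, 'to Poppaea' the theme
            p = pending.pop()
            p.relation = "THEME" if tok == "to" else "RECIPIENT"
            delivered.append(p)
    return delivered

print(process("Nero gave Olivia Poppaea".split()))     # [GIVE / RECIPIENT / OLIVIA]
print(process("Nero gave Olivia to Poppaea".split()))  # [GIVE / THEME / OLIVIA]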

Sentence processing

This is a term favoured by psycholinguists.  It can encompass semantics and pragmatics as well as syntax.  (For now, let’s ignore the upstream processes.  NG simply assumes phonological input can be chopped up and, via some sort of content addressability, deliver a sequence of ‘phonological words’ – the P nodes in the diagrams.)

Syntacticians tend to talk more narrowly about ‘parsing’.  LanguidSlog only uses the term to mean analysing a sentence.  An NG analysis is just that: analysis.  It is definitely not synthesis – no building of trees, labelling of nodes and suchlike.

The following uses phrases like ‘syntax delivers…’ to mean that part of a sentence’s cognitive effect attributable to the processing implied by what students learn in syntax classes.

Syntax does not deliver the whole cognitive effect.  There must be some more processing downstream.  Routinely this processing relates new propositions delivered by syntax to the hearer’s existing cognitive state.  That state reflects the discourse so far, mutual knowledge of speaker and hearer, and general knowledge.  Further propositions are generated forming the gist of the sentence.  It is the gist that is remembered, not the words.

Downstream processing

Downstream processing may be complicated.  Problems it must handle include toddlers and bad non-native speakers (speech that is ungrammatical or fragmentary or uses wrong words), incomplete utterances, ums and ahs, ambiguity, pronoun resolution, fresh metaphors, garden paths.  Most of the problem-solving is unconscious but there is a spectrum of difficulty that extends into the conscious.  Bever’s The horse raced past the barn fell certainly gets the hearer’s attention.

Syntax is entirely automatic and fast.  Pronoun resolution is also automatic as a result of lingering activation on recently used nouns.  But more generally the downstream process must be more or less effortful even though it’s mostly unconscious.  A speaker relies on the hearer’s downstream processing because delivering that part of the cognitive effect in syntax would be disproportionately effortful for the speaker (as well as giving the hearer more syntax to do).  Most speakers develop an instinct for what’s efficient and what’s socially appropriate.
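That remark about pronoun resolution can be made concrete with another toy sketch – again not NG, and the gender lookup is just a stand-in for whatever compatibility check is really involved: the most recently activated compatible noun wins.

# Toy sketch: a pronoun resolves to the most activated compatible noun.
DECAY = 0.5

def resolve_pronouns(tokens, noun_gender):
    activation = {}                           # lingering activation on nouns
    resolutions = {}
    pronoun_gender = {"he": "m", "she": "f", "it": "n"}
    for i, tok in enumerate(tokens):
        for n in activation:
            activation[n] *= DECAY            # activation fades word by word
        if tok in noun_gender:
            activation[tok] = 1.0             # a fresh noun gets full activation
        elif tok in pronoun_gender:
            wanted = pronoun_gender[tok]
            candidates = [n for n in activation if noun_gender[n] == wanted]
            if candidates:
                # the most activated (roughly: most recent) compatible noun wins
                resolutions[i] = max(candidates, key=activation.get)
    return resolutions

sentence = "Olivia told Nero that she was leaving".split()
print(resolve_pronouns(sentence, {"Olivia": "f", "Nero": "m"}))   # {4: 'Olivia'}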

Single meaning

Cognitive effect may be delivered by syntax or by downstream processes.  Are the two strictly separate?  NG assumes there is a grey area where some syntax is supported by some downstream processes.

But is there feedback from downstream processing into syntax?  NG assumes there is not.

Naturally-occurring sentences can usually be disambiguated by their context.  (169) could be preceded by We’re at Mother-in-law’s or by Mother-in-law is at ours; (170) by One guest had a hoarse voice or by John had laryngitis.

Is it conceivable that the syntactic process is primed by context?  For example in (170), could an earlier One guest had a hoarse voice really determine the attachment of the PP?  Surely not.

Looking at problematic sentences in isolation proves little.  It’s possible that syntax delivers a single meaning and then, by a near-conscious process, the ambiguity is noticed.  Neither (169) nor (170) gives anything like the jolt of ungrammaticality.

For what it’s worth, I think my own ‘syntax’ would deliver visiting as the subject in (169) and attachment of the PP to greeted in (170).

So yes, LanguidSlog does claim a single parse is delivered, even for (169) or (170).  A key point is that NG delivers a bundle of simple propositions.  It doesn’t build a tree in order to deliver meaning.  If the actual bundle is missing one proposition most of the meaning is still there.  Does a whole PSG tree die if a branch is cut off?

NG could deliver disjunction but it’s preferable not to.  Even with two meanings from (169) or (170), downstream disambiguation would still be required.

Visiting relatives can be a nuisance

There are two possible junctions between Visiting and relatives.  NG creates alternative propositions – one with VISIT as parent, one with RELATIVE – and splits the activation between them.

For (151) the lexicon allows visiting__is but not relatives__is.  The resulting (null) / INST / VISIT causes consolidation on VISIT / QUALIF / RELATIVE.

In this configuration, full activation is brought to VISIT / QUALIF / RELATIVE by relatives and to (null) / INST / VISIT by visiting.  RELATIVE / QUALIF / VISIT dies.

For (150) the lexicon allows relatives__are but not visiting__are.  The process therefore brings full activation to RELATIVE / QUALIF / VISIT and VISIT / QUALIF / RELATIVE dies.
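In code, the consolidation step for (150) and (151) might look something like the sketch below.  The lexicon table and the equal 0.5 split are assumptions for illustration, not NG internals.

# Sketch: two rival propositions split the activation; the verb's permitted
# junctions consolidate one reading and the other dies.  Lexicon is invented.
ALLOWED = {
    ("visiting", "is"): True,   ("relatives", "is"): False,
    ("relatives", "are"): True, ("visiting", "are"): False,
}

def analyse_visiting_relatives(verb):
    rivals = {
        "VISIT / QUALIF / RELATIVE": 0.5,    # gerund reading, VISIT as parent
        "RELATIVE / QUALIF / VISIT": 0.5,    # verbal-adjective reading, RELATIVE as parent
    }
    visiting_ok = ALLOWED.get(("visiting", verb), True)
    relatives_ok = ALLOWED.get(("relatives", verb), True)
    if visiting_ok and not relatives_ok:
        return {"VISIT / QUALIF / RELATIVE": 1.0}    # winner takes all
    if relatives_ok and not visiting_ok:
        return {"RELATIVE / QUALIF / VISIT": 1.0}
    return rivals                                    # verb does not disambiguate – see (169)

print(analyse_visiting_relatives("is"))    # gerund reading consolidated, as in (151)
print(analyse_visiting_relatives("are"))   # verbal-adjective reading consolidated, as in (150)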

To discuss (169) we’ll ignore modality and assume the verb is a simple copula, can-be, that is number-general.  NG says that only one reading is delivered but the lexicon allows both visiting__can-be and relatives__can-be.  Which of these takes effect depends on likelihood, determined by previous usage.

Of course visiting is most often a participle and relatives is most often a noun.  Individual words don’t give a useful measure.  A strength of NG is that it can determine the likelihood of junction types.

In this case it’s (gerund)__(noun modifier) versus (verbal adjective)__(noun).  If the former wins, VISIT / QUALIF / RELATIVE is more strongly activated initially and then takes all at can-be.  If the latter wins, RELATIVE / QUALIF / VISIT takes all.

Finally the (be form)__nuisance junction creates NUISANCE / INST / (null) and so NUISANCE replaces the null in the diagrams (see LS12 for the nulls-in-complementary-positions principle).
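The likelihood step for (169) can be sketched in the same spirit.  The usage counts below are made up; the point is only that the split follows the frequency of junction types, and can-be, which accepts either junction, leaves the better-activated reading to take all.

# Sketch: junction-type frequencies (invented numbers) fix the initial split;
# since can-be permits both junctions, the larger share simply takes all.
JUNCTION_TYPE_COUNTS = {
    ("gerund", "noun modifier"): 40,     # visiting__relatives with VISIT as parent
    ("verbal adjective", "noun"): 60,    # visiting__relatives with RELATIVE as parent
}

def initial_split(counts):
    total = sum(counts.values())
    return {jtype: n / total for jtype, n in counts.items()}

def single_reading(counts):
    split = initial_split(counts)
    return max(split, key=split.get)     # the more activated reading wins

print(initial_split(JUNCTION_TYPE_COUNTS))
print(single_reading(JUNCTION_TYPE_COUNTS))   # ('verbal adjective', 'noun') with these counts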

John greeted the guest with a hoarse voice 

One possibility for (170) is that disambiguation is done entirely downstream.  NG does not need to identify a junction between the PP and anything else.  The problem with this idea is the implication that adjuncts can attach randomly, which empirically is not the case.

A better idea is that NG attaches the adjunct to the first word to its left that is not semantically incongruous.  For example, for John greeted the new day with a hoarse voice, the junction is not day__voice but greeted__voice.
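As a sketch (the congruity table is a crude stand-in, not something NG specifies), the leftward scan might look like this:

# Sketch: scan leftwards from the PP and attach it to the first head that is
# not semantically incongruous with the PP's noun.  The table is invented.
CONGRUOUS_WITH_VOICE = {"John": True, "greeted": True, "day": False}

def attach_pp(heads, pp_noun, congruous):
    # heads: candidate attachment sites to the left of the PP, in sentence order
    for head in reversed(heads):               # nearest candidate first
        if congruous.get(head, False):
            return (head, pp_noun)             # e.g. the junction greeted__voice
    return None                                # leave it to downstream processing

print(attach_pp(["John", "greeted", "day"], "voice", CONGRUOUS_WITH_VOICE))
# ('greeted', 'voice') – day__voice is skipped as incongruous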

It’s interesting that ambiguity is mostly unnoticed.  Consider:

(171) I can find a million people who believe in evolution before June

The sentence is three-way ambiguous because before June can attach to evolution or believe or find.  But belief-change rarely, and evolutionary-change never, occurs on a timescale of months.

Of course, NG easily excludes incongruity.  However, to allow unambiguous sentences like (172), it must use likelihood rather than categorical exclusion.

(172) Believe before June but be sceptical afterwards
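Concretely, the scan sketched above only needs its yes/no test replaced by a graded score (the numbers are invented): a low-likelihood junction such as believe__before-June can then still win when nothing better competes, which is what (172) requires.

# Sketch: grade the candidates instead of excluding any outright.
LIKELIHOOD_WITH_BEFORE_JUNE = {"find": 0.6, "believe": 0.3, "evolution": 0.01}

def attach_pp_graded(heads, pp, likelihood):
    # a fuller version might also discount by distance; a plain maximum is
    # enough to make the point
    best = max(heads, key=lambda h: likelihood.get(h, 0.0))
    return (best, pp)

print(attach_pp_graded(["find", "believe", "evolution"], "before June",
                       LIKELIHOOD_WITH_BEFORE_JUNE))    # ('find', 'before June'), as in (171)
print(attach_pp_graded(["believe"], "before June",
                       LIKELIHOOD_WITH_BEFORE_JUNE))    # ('believe', 'before June'), as in (172)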

Next time…

…more on ambiguity.

Mr Nice-Guy
