Brain Science Podcast discussion

Introduce Yourself > Question on language evolution


message 1: by [deleted user] (new)

I started listening to these podcasts about a week ago and am chewing through them in no particular order, so apologies if this has been covered already. In the 'Evolution of Language' podcast, Ginger suggests that it looks like language is an 'emergent' ability. I recently read 'The Language Instinct' (admittedly the only reading I've done on the subject) and Pinker argues very convincingly that language is an evolved trait.

My questions, specifically, are:

Aren't we physically adapted to be capable of speech? I.e., what use is the human voicebox without language? And are we not the only mammal that can't close off the airway while eating, with the payoff of speech?

And,

Do babies go through the babbling phase if they do not hear spoken language?


message 2: by Diane (new)

Diane | 9 comments Apparently babies who are born deaf will still babble. I heard that recently in one of these lectures. http://www.youtube.com/playlist?p=848...

(Sorry, I don't remember which one. They are long and their content is not indexed. Each one is overflowing with great information, however, so I hope this link helps. Language is Topic 23 on the list.)

Diane


message 3: by [deleted user] (new)

Thanks, Diane, I'll check those out. That would seem to support the idea that language is something we're adapted to have.


message 4: by Virginia (new)

Virginia MD (gingercampbell) | 321 comments Mod
Paulw wrote: "I started listening to these podcasts about a week ago and am chewing through them in no particular order, so apologies if this has been covered already. In the 'Evolution of Language' podcast, Gin..."

I also read Pinker's book first. His views in The Language Instinct are more along the lines of Chomsky's than the account presented in The First Word, but it is certainly true that there are evolutionary changes in our airways that are essential to spoken language. I think you would enjoy reading The First Word: The Search for the Origins of Language by Christine Kenneally.

When BSP 30 first aired, I got quite a bit of negative feedback about my critical view of Chomsky's model. Since then I have had a chance to talk to quite a few scientists and linguists, and it is my impression that while his work with grammar is highly respected, his conclusions about the brain have not held up to empirical research. I stand by my criticism of his lack of attention to the growing evidence that language is acquired, even though the capability to acquire language certainly did evolve.

No doubt the debate over the relative roles of evolution and learning (brain plasticity) will continue, as well as the debates about whether primates can learn language.


message 5: by [deleted user] (new)

Hi Ginger. Thanks for the response. I'll add The First Word to my list. I'm currently reading 'How the Mind Works', again by Pinker - in which he seems to be banging the same drum, that the mind is modular in design - but I've been hugely sidetracked by your great podcasts. I'll be making a donation in the next few days.

Paul.


message 6: by Virginia (new)

Virginia MD (gingercampbell) | 321 comments Mod
Paulw wrote: "Hi Ginger. Thanks for the response. I'll add The First Word to my list. I'm currently reading 'How the Mind Works', again by Pinker - in which he seems to be banging the same drum, that the mind is..."

Pinker is a very good writer. My favorite is The Blank Slate: The Modern Denial of Human Nature, but I have to admit I didn't make it through his most recent book, The Stuff of Thought: Language as a Window into Human Nature.


message 7: by John (last edited Jul 14, 2011 05:07AM) (new)

John Brown | 52 comments I have just got the bugs out of my software, so I can move my gaze a bit closer to the horizon again.

To me, Chomsky’s ideas concentrate much too narrowly on grammar, whilst grammar’s interactions with real-world knowledge, in the form of ontologies, are probably far more important. A lot of real-world knowledge can be represented simply in the form of statistically frequent collocations, such as “barn door”, “horse race”, “race past”, versus infrequent ones like “race (a) horse”.
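To make this concrete, here is a minimal sketch in Python of the collocation idea: count adjacent word pairs (bigrams) over a corpus and treat the frequent pairs as stored real-world associations. The tiny corpus is made up purely to show the mechanics.

```python
from collections import Counter

# A made-up toy corpus; in practice you would count over a large corpus.
corpus = (
    "the horse raced past the barn door . "
    "the jockey will race a horse past the barn . "
    "we watched the horse race from the barn door ."
).split()

# Count adjacent word pairs (bigrams) as a crude model of collocations.
bigrams = Counter(zip(corpus, corpus[1:]))

print(bigrams[("barn", "door")])   # 2 -- a frequent collocation
print(bigrams[("horse", "race")])  # 1
print(bigrams[("barn", "fell")])   # 0 -- essentially never seen
```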

We seem to store simple associations between words as much as their class or causal relationships, and when I introspect I seem to have a little cartoon associated with each one. Lots of Artificial Intelligence workers, typified by Charniak with his 1993 book “Statistical Language Learning”, abandoned the idea of grammatical parsing and replaced it with statistical analysis of language. You can write computer programs that mimic a lot of human language behaviour using just these techniques.

Research in psycholinguistics has shown many of Chomsky’s ideas to be invalid, particularly that of a transformational grammar. Psycholinguistics is more interested in what “garden path sentences” tell us about human language processing:

1) “The horse raced past the barn fell.”
2) “The horse raced past the barn door.”
3) “The hippo raced past the barn door.”
4) “The horse raced past the bedroom door.”

In understanding (1), “horse race” is far more frequent than “race (a) horse”, but “barn fell” is terribly infrequent. In (2), our expectations based on frequency do not let us down. To understand (3), we need a class-based hierarchy, which we seem to consult with lightning-fast speed. In (4) I have to look very closely at a couple of my internal cartoons to see that they are inconsistent.
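As a toy illustration of that frequency argument, here is a sketch that scores each adjacent word pair in a sentence against (entirely hypothetical) corpus counts and reports the unseen ones. It correctly flags “barn fell” in (1), but it also flags (3), which is exactly why I think a class-based hierarchy is needed on top of raw frequency.

```python
from collections import Counter

# Hypothetical bigram counts standing in for real corpus statistics.
counts = Counter({
    ("the", "horse"): 200, ("horse", "raced"): 50, ("raced", "past"): 40,
    ("past", "the"): 300, ("the", "barn"): 80, ("barn", "door"): 60,
    ("the", "hippo"): 2,
})

def surprising_pairs(sentence):
    """Return the adjacent word pairs never seen in the counts."""
    words = sentence.lower().rstrip(".").split()
    return [pair for pair in zip(words, words[1:]) if counts[pair] == 0]

print(surprising_pairs("The horse raced past the barn fell."))  # [('barn', 'fell')]
print(surprising_pairs("The horse raced past the barn door."))  # []
print(surprising_pairs("The hippo raced past the barn door."))  # [('hippo', 'raced')]
```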

Psycholinguistics argues for rules of thumb in parsing, like the rule of minimal attachment, but I find the statistical view based on collocations, like those mentioned earlier, more convincing. But notice that in deciphering (1) I am relying upon a transformational idea, restricted locally to a short phrase, as in
“race (a) horse” -> “horse raced”. (Chomsky’s passivisation rule, from memory)
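To show what I mean by applying the transformation only locally, here is a toy sketch (the rule table and the reading strings are my own invention, purely for illustration): when a past-tense form can also be read as a passive participle, the short phrase gets a second, reduced-relative reading.

```python
# Toy table of past forms that can also be read as passive participles.
AMBIGUOUS_PAST = {"raced", "walked"}

def readings(noun, verb):
    """Possible readings of a '<noun> <verb>' phrase like 'horse raced'."""
    result = [f"the {noun} {verb} ... (main-verb reading)"]
    if verb in AMBIGUOUS_PAST:
        # Local transformation: "<verb> (a) <noun>" <-> "<noun> <verb>",
        # yielding the reduced relative "the <noun> (that was) <verb> ...".
        result.append(f"the {noun} (that was) {verb} ... (reduced relative)")
    return result

print(readings("horse", "raced"))
```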

But doesn’t it take a long time to go through this process? As always with neural systems, there seem to be back-ups and alternative approaches all the way through. To draw an analogy, an ontology with prototypes and statistical associations makes language understanding a form of “embedded cognition”.

John Sowa’s 2000 book on Knowledge Representation was an early influence on me, and it probably represents the classical logical view of ontologies. His web page at
http://www.jfsowa.com/
is still live but does not appear to have been updated much lately.

More psychological ideas involving classification by analogy were put forward by Fahlman in his doctoral thesis at MIT in the 1970s. Lakoff has written extensively in this area in “Women, Fire, and Dangerous Things”, which is very readable, and on how apparently abstract mathematical concepts are really based on analogy with the human body and its movement, in “Where Mathematics comes from …”. If you have done first-year college maths, physics or engineering, this is an absolutely fascinating read.

I enjoyed Gregory Murphy’s “The Big Book of Concepts”, with its central idea of the typical prototype of a class, and I think that book speaks to (3) above.

Nowadays I spend hours with Princeton’s WordNet, which you can use over the Web or download and install, from
http://wordnet.princeton.edu/wordnet/...

But the view of the world that this ontology contains is surprisingly limited. They have the following hypernym (class) links:

cow -> cattle -> bovine -> bovid
cow -> placental mammal -> mammal -> vertebrate
milk -> dairy product -> foodstuff
dairy farm -> farm -> workplace

But nowhere is there any indication that we keep cows mainly for their milk, which is something that every two-year-old learns.
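If you want to poke at these chains yourself, NLTK offers a Python interface to the Princeton database. A minimal sketch (assuming nltk is installed and the WordNet data has been downloaded; synset names can differ between WordNet versions):

```python
import nltk
nltk.download("wordnet", quiet=True)  # fetch the data on first use
from nltk.corpus import wordnet as wn

# First noun sense of "cow"; there are others, so inspect wn.synsets("cow").
cow = wn.synsets("cow", pos=wn.NOUN)[0]

# Print every hypernym (class) chain from the root down to "cow".
for path in cow.hypernym_paths():
    print(" -> ".join(s.name() for s in path))
```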

Jackendoff’s “Foundations of Language” seems to me to mark the start of linguists addressing the concepts mentioned above: they have all sorts of implications, for example about when morphological transformation rules (see Pinker, where he discusses idioms rather than collocations) are applied in analysing a sentence.

I want to write computer programs to mimic these processes, and I am beginning to have some success.

