Monday, August 28, 2023

8a. Pinker, S. & Bloom, P. (1990). Natural language and natural selection

Pinker, S. & Bloom, P. (1990). Natural language and natural selection. Behavioral and Brain Sciences, 13(4): 707-784.

Many people have argued that the evolution of the human language faculty cannot be explained by Darwinian natural selection. Chomsky and Gould have suggested that language may have evolved as the by-product of selection for other abilities or as a consequence of as-yet unknown laws of growth and form. Others have argued that a biological specialization for grammar is incompatible with every tenet of Darwinian theory -- that it shows no genetic variation, could not exist in any intermediate forms, confers no selective advantage, and would require more evolutionary time and genomic space than is available. We examine these arguments and show that they depend on inaccurate assumptions about biology or language or both. Evolutionary theory offers clear criteria for when a trait should be attributed to natural selection: complex design for some function, and the absence of alternative processes capable of explaining such complexity. Human language meets these criteria: grammar is a complex mechanism tailored to the transmission of propositional structures through a serial interface. Autonomous and arbitrary grammatical phenomena have been offered as counterexamples to the position that language is an adaptation, but this reasoning is unsound: communication protocols depend on arbitrary conventions that are adaptive as long as they are shared. Consequently, language acquisition in the child should systematically differ from language evolution in the species, and attempts to analogize them are misleading. Reviewing other arguments and data, we conclude that there is every reason to believe that a specialization for grammar evolved by a conventional neo-Darwinian process.

Tomasello, M., & Call, J. (2018). Thirty years of great ape gestures. Animal Cognition, 1-9.

Graham, K. E., Hobaiter, C., Ounsley, J., Furuichi, T., & Byrne, R. W. (2018). Bonobo and chimpanzee gestures overlap extensively in meaning. PLoS Biology.





134 comments:

  1. I've got some general questions regarding the midterm:

    1. Since we will write our midterm in an essay format, can we use first person pronouns?

    2. If we want to use specific examples, do we have to cite the articles?

    Replies
    1. Hi Tina,

      It was a terrific idea for you to post this in the skywritings! I’ll copy my reply there too.

      1/2: Write in any style you like, Tina, and definitely no need for citations! I know which readings I’ve assigned, and what was said in the lectures and replies!

      3: What I don’t know is how well you have understood it, and can put it together. Remember to make it kid-sibly enough so I can tell whether you really understand the words you are using, especially the ones I bold-faced.

      4. And don’t mirror my words back word-for-word: prove you understand them by explaining in a way kid-sib could understand.

      Don’t give your essay a title. Just head it with the questions:

      Describe the similarities and the differences between cognition and computation.
      How does Turing propose solving the “Easy Problem” of cognitive science?
      Why does cognition have a Symbol Grounding Problem but computation does not?
      How could Neural Nets help a T3 robot learn categories and ground Words?

      Your comments on whether and how using ChatGPT helped should be separate; it’s optional and not part of the answer to the midterm.

      If you’ve been following, this should be easy, fun, and satisfying.

      K-S

  2. Pinker & Bloom argue for the causal mechanism of natural selection on a universal grammar in humans. Chomsky, Gould and Piattelli-Palmarini say that language is a byproduct of another natural selection mechanism rather than selected for specifically. Gould suggests that the human brain is an all-purpose general computer and language is just one of its many possible functions. I don't see how this equates to a byproduct of natural selection. If the brain can develop many possible functions, likely many more than we can conceive existed, wouldn't language just be one of the selectively advantageous functions that it kept, whereas the other functions fizzled out due to their weakness for reproduction and survival? I think this lines up with Haldane's finding "…that tiny selective advantages are sufficient for evolutionary change." If the original form of language conferred even a 0.1% selective advantage, that could have been enough to snowball into evolved language.
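That last arithmetic point can be sketched numerically. The sketch below is illustrative only: the 1% starting frequency, the generation counts, and the use of the standard one-locus selection recursion are my assumptions, not figures from Pinker & Bloom or from Haldane's actual calculations.

```python
def allele_frequency(p0, s, generations):
    """Frequency of an allele with relative fitness 1 + s after a number
    of generations, under the standard one-locus selection recursion:
        p' = p * (1 + s) / (1 + p * s)
    (the denominator is the population's mean fitness)."""
    p = p0
    for _ in range(generations):
        p = p * (1 + s) / (1 + p * s)
    return p

# A 0.1% selective advantage (s = 0.001), starting at 1% of the population:
print(round(allele_frequency(0.01, 0.001, 5_000), 2))   # -> 0.6
print(round(allele_frequency(0.01, 0.001, 15_000), 2))  # -> 1.0
```

Even s = 0.001 carries the trait from 1% of the population to a majority within a few thousand generations, which is short on evolutionary timescales; that is the sense in which a tiny advantage can "snowball."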

    Replies
    1. Kaitlin, the first question to ask yourself is: What is language: phonology? vocabulary? grammar?

      Pinker & Bloom (P&B) [Bloom was supposed to talk at McGill last Friday, on another topic, but his talk was postponed] argue that there is no problem about the evolution of language. And for phonology and vocabulary and Ordinary Grammar (OG), they are perfectly right. Vocabulary is learned, not evolved (lazy evolution) and phonology is partly learned and partly "prepared" by Baldwinian evolution (what's that?) And OG is also learned, not evolved.

      The only problem with the evolution of language is "Universal Grammar" (UG), and P&B hardly mention it -- and certainly not as a problem for evolution.

      What is UG? How does it differ from OG? And why is it a problem for evolution? See 8b and the replies that will be unfolding in the next few days.

      Try those questions on ChatGPT, but you won't get the full answer...

    2. Stephen & Miriam, yes, that is exactly why explaining the evolution of UG (not the evolution of all the other aspects of language that P&B did discuss) is indeed a problem.

      Have a look at "Nine Easy Pieces on Universal Grammar (UG) with GPT-4".

    3. I read the "Nine Easy Pieces on Universal Grammar with GPT-4" and found that it helped clarify why UG must be innate. Because there is no possibility of sampling both members and non-members to learn how to categorize (the poverty of the stimulus, POS), and because children nevertheless do not make UG errors, it seems to be innate.

      Reading the article also made me think about other evolved capacities we have that are innate, for example facial recognition (innate), and how such an innate capacity is essential for other, potentially learned, abilities like emotion recognition (which sparks a bit more debate about whether or not it is innate). It is interesting that while children don't tend to make UG errors, there is large variance in how late into childhood (or adulthood) they make OG errors, and I wonder whether any research into this would shed light on UG. I imagine it wouldn't, as many biopsychosocial factors would play a role, not to mention that OG is learned in four different ways (imitation, instruction, feedback and statistical learning), all of which would vary slightly from person to person, making it hard to study. I am at a bit of a loss as to how it could be studied.

    4. Baldwinian evolution is when an animal learns to do something advantageous and the disposition to learn it then evolves: those most motivated to learn benefited from the advantage, and that disposition became genetically incorporated. One example of Baldwinian evolution is imprinting in baby ducks. Baby ducks learn to follow the first thing they see move (they have the disposition to learn), and those that followed the correct thing (their mother) and not the incorrect thing (a predator) survived. This allowed the ducks with that learning disposition to pass on their genes.

      UG is the innate capacity to learn language. OG is learned language. UG cannot be learned while OG has to be learned. UG is universal to all humans, OG is specific for the natural language learned from the environment.

  3. I find that the following quote succinctly explains Pinker and Bloom's stance in this week's paper: "Putting a dome on top of four arches gives you a spandrel, but it does not give you a mosaic depicting an evangelist and a man pouring water out of a pitcher. That would really be a miracle. To get the actual mosaic you need a designer. The designer corresponds to natural selection". I'm not sure which "side" I want to take; theories I learned in my linguistics classes (such as X-bar theory) show that there are regularities across languages and that they all boil down to some basic tendencies. We haven't accounted for all the languages in the world, but we have the foundational building blocks. However, I do not believe that it is that simple (I have no proof for this statement; it's simply an unfounded belief). To be honest, I prefer to focus on understanding language and its role and impact on us today rather than to puzzle over how it came to be, as we can only speculate about the "origins" of language (although I agree that the latter would help accomplish the former).

    Replies
    1. Aashiha, I agree that the spandrel metaphor was lame, even though it came from a well-informed and imaginative thinker (Stephen Jay Gould). Judging from P&B's (influential) paper, you would have thought it was about "language". But it was actually about UG, which P&B hardly discuss: what is UG?

      Yes, language evolved a long time ago, and could not leave fossils, so reconstructing that is a challenge: Did it start vocally or as gesture? How and why did propositionality start? Tough questions, but not intractable. UG alone is a far bigger challenge. It's not the "Hard Problem," but it's a tough (and interesting) one. Why?

    2. Hi! From my understanding, the main issue with UG is that it emerged suddenly in the history of human evolution. It is also unclear how such a complex system could have evolved when there are no known intermediate forms of "language" between the communication systems of other animals (some animals use their voices to communicate, like cats and dolphins) and human language, with its infinite generativity and specific complex rules.

      Also, I am considering whether we could treat UG as a type of cognition, just as we do categorization. UG would then be something partly inherent that also needs life-long acquisition.

    3. Honestly, I'm also confused by the authors' use of the word “spandrel” throughout the piece and am really having a hard time understanding what they are trying to say with it; what exactly is it a metaphor for? For example, in this sentence: “Likewise, one need not consider the possibility that some organ that arose as an adaptation to some other task, or a spandrel defined by other body parts…,” what does “a spandrel defined by other body parts” mean in this context?

    4. (I deleted my previous message because it probably contained too many weasel words.)
      Jinyu, my understanding is that in evolutionary biology, 'spandrel' denotes a trait that emerges as an incidental **byproduct**, not through direct adaptive selection. Take the human chin: some biologists suggest it's a structural result of skull development, not a feature directly favored by evolution. The concept of a spandrel becomes less convincing for traits that are complex and highly functional, like the eye, because their intricacies and utilities typically indicate a history of direct selection pressures.

      Regarding Universal Grammar (UG), it might be simpler than its perceived complexity. Perhaps, just as learning algorithms apply a few fundamental principles to a wide range of tasks, from language generation to image creation, UG could have been learned through cognitive patterns our ancestors already had. The intriguing idea that language abilities were selected as a marker of overall cognitive capacity through sexual selection aligns with this view. It suggests that the capacity for language, rather than being a complex adaptation on its own, could be an emergent property of cognitive competencies that were advantageous in social and reproductive contexts in the EEA (environment of evolutionary adaptedness).

    5. As mentioned, the difficulty with evidence for an innately evolved language is that language itself does not leave fossils. However, our ability to produce language can leave fossils in our physical body structure, like the jaw and throat bone structures that may have evolved as we learned to pronounce different phonemes and rely less on gestural communication. The difference between this evolved, innate UG and learned OG is interesting when looking at the particular types of language use each allows. For example, with UG alone children can convey meaning and basic needs, but with learned ordinary grammar we can use language in unique ways, such as writing poetry and essays; here OG brings an aesthetic aspect to communication: it can be used to change the flow and sound of sentences to evoke an emotional response, or in certain cases to communicate more effectively with emphasis for persuasion. Why this happens may be related to the Hard Problem of how and why we feel: language is something we have learned to use to make us feel or respond in particular ways and to infer what others feel.

      (I am just noticing my skywriting did not upload or got removed possibly because of my internet connection so I am reposting it)

    6. Evelyn, yes, grammatical capacity, whether learned or innate, is part of cognitive capacity, hence it has to be reverse-engineered and explained by cogsci.

      Jinju, as discussed in class, a spandrel is a byproduct of putting many circles in a big circle: a side-effect of geometry. The term comes from Gould & Lewontin's critique of "adaptationism" -- "The Spandrels of San Marco and the Panglossian Paradigm: A Critique of the Adaptationist Programme" -- the tendency to seek an evolutionary explanation for all inborn traits. Some traits may instead be side-effects of other genetic traits. Gould suggested that UG, too, might be a "spandrel" in this sense. The trouble is that with San Marco's spandrels it is obvious what they are a side-effect of: geometry (and in the case of the supersized female hyena, testosterone). But no one has found a main genetic effect of which UG could be a side-effect.

      Thomas, that's right.

      Rebecah, a bit too imaginative, I'm afraid. You can't have a language if you don't have UG. There's no way to have one without the other. Yes, language can evoke and convey affect (feeling), but that has nothing particular to do with OG or UG, and it does not bear on the HP one way or the other.

    7. I found this paper quite hard to follow, as I am still unsure what Pinker and Bloom's specific stances are beyond their thinking that the evolution of language can be attributed to natural selection. In subsection 5.2.2, they say "while one might justifiably argue that an entire system of grammar must evolve in a gradual continuous sequence, that does not mean that every aspect of every rule must evolve in a gradual continuous sequence." Is this them drawing a distinction between UG and OG?

    8. Jocelyn, P&B go over the easy parts of the evolution of language and skip over the hard part (UG). Why is it hard?

    9. The thing that makes UG hard is that it is not explainable by evolution, which is "lazy" - we have to remember that evolution does not produce "optimally designed," beautiful systems. Individuals with traits that help them survive and reproduce live on, passing those traits to their offspring. UG is not something that helps with reproduction or survival - as others in the thread have said, "why is it necessary for us to have certain built-in traits instead of just a built-in system that helps us learn these traits (Baldwinian evolution)?"

  4. I found the comparison in this paper between language and vision helpful for thinking about the logical arguments behind how language arose. If the evolution of the eye is meant to be an example of exaptation (and the idea that the eye only developed in a series of intermediate steps, each paired with an actually advantageous development), this is all well and good. I can believe the idea that the development of vision was necessarily a byproduct of other evolutionary drives, as having a half-developed eye is no more beneficial than having no eyes at all. Having partially developed language, though, would be advantageous, and being able to convey meaning and instruction to others poorly is better than being unable to do so at all. That said, from an evolutionary perspective there are different parts of language that are differently useful. If the goal is survival and reproduction, it seems more important to be able to convey the content words (e.g., bear) than to ensure proper grammatical structure. This is not to say the latter isn't important at all, but it certainly seems more crucial for survival to know that there is a bear somewhere than to know in precise detail what it is doing.

    Replies
    1. Madeleine, very good reflections. Step-by-step evolution makes sense for modality (gestural, oral) and for phonology. Propositionality might have been one (especially important) step. Some of the steps may have been Baldwinian (which ones, and what does that mean?). Vocabulary and OG are learned, not evolved (except maybe a Baldwinian motivation for learning them). But UG is another story...

    2. The paper describes Baldwinian evolution as 'The process whereby environmentally-induced responses set up selection pressures for such responses to become innate, triggering conventional Darwinian evolution that superficially mimics a Lamarckian sequence.' Or, in kid-sibly words: acquired behaviour or culturally invented traits that are important for adapting to the environment can affect our genetic makeup through natural selection. This suggests that learnt behaviour can eventually become innate.

    3. Andrae, a bit more kid-sibly would be: if learning something is especially beneficial, individuals with a genetic tendency to learn it more quickly, easily or readily will have an adaptive advantage, and their genes will spread more, eventually to the whole population.

    4. This reminds me of the part of the paper where they talk about the unavoidable tradeoffs of utility within language. Although I agree, could we argue that, in the case Madeleine mentioned, as our living circumstances evolved beyond the survival needs of hunter-gatherers, the need to convey content words no longer superseded that of proper grammatical structure (at least not to the degree that it used to)? For example, the ability to convey socially relevant abstract information may have become increasingly advantageous, whereas linguistic demands previously pertained to simpler needs, like pointing out the presence of a predator (e.g., a bear).

  5. In the reading, the discussion of evolutionary theory was intriguing to me, especially the comparison of gradualism vs. punctuated equilibrium, which highlighted two sides of evolutionary theory. The theory of punctuated equilibrium posits that evolutionary change occurs in bursts corresponding to speciation events, as opposed to the gradual change over generations proposed by Darwin. The paper also talks about the importance of tiny changes for understanding complex adaptations, suggesting that sudden large changes play only a minor role in evolution.

    Replies
    1. Selin, yes, but all-or-none puzzles, like UG (and maybe the HP?) appear too. And gradualism is relative too, sometimes faster, sometimes slower. (That of course depends on the length of the species' generational cycle; elephants vs. fruitflies).

    2. I was also interested in the discussion between punctuated equilibrium and gradualism. Pinker and Bloom assert that punctuated equilibrium is incorrect and that gradualism is the proper approach, and assert that grammar evolved incrementally with each increment of grammar being increasingly useful compared to the past. I wonder about the implications of adopting either approach. If we assert that grammar evolved in a sudden burst under a specifically strong environmental context, would that provide more information on the purpose of the grammar and help us in reverse engineering it more than if we look at it from a gradualistic perspective?

    3. Omar, no: neither spandrels nor "punctuated equilibria" explain the origin or adaptive value of UG, so Pinker is again ignoring, or oblivious to, the puzzle that the innateness of UG poses for evolutionary explanation.

  6. I find the argument that language did not develop as a means of mental representation surprisingly compelling against the idea that language is purely adaptationist. The complex structures and endless possible ambiguities that language can produce show that language couldn't have developed purely as a representation of thoughts. The idea that phonological rules and pragmatic devices like illocutionary force must have a listener on the other end further counts against a purely mental origin of language.

    Replies
    1. Megan, but have you forgotten that both "mental" and "representation" are weasel-words? Explain. And what does that have to do with adaptationism? (What's that?) And what do ambiguities have to do with it?

      Both thinking (cognition) and communicating are things that organisms can do, but the communicating is observable whereas the thinking is not. And both the capacity to think and DO evolved, because of adaptive advantages they conferred.

    2. Adaptationism views human characteristics and faculties as having evolved in response to an environment, for a particular purpose. An example is given from a Boston Globe article, which argued that the reason human mothers have two breasts is likely that it is not uncommon for mothers to give birth to twins, and having two breasts would allow a mother to feed them simultaneously. Contrary to an adaptationist viewpoint, this reading points out that a variety of mechanisms have been at play throughout our evolution - genetic drift, laws of growth and form - which may explain human features. Our body structure of bilateral symmetry, which features two hands, eyes, legs, etc., appears a more likely explanation of why female humans have two breasts than the Boston Globe's (what about triplets?).

      Megan, I was confused about how the argument that language did not develop as a means of mental representation was compelling against adaptationism. As I understood it, in the reading it was argued that the way we use language and the structure our grammar tends to take on suggests that it is unlikely “mentalese” was what language was ‘designed’ for. Rather, language allows us to come up with a shared symbolic system so that we can communicate about things that are not physically present with other people. Having the word ‘tree’ allows me to discuss trees even when a physical tree is not present to point to. If anything, this seems to be adaptively advantageous as it allows communication between human beings to share knowledge and ideas.

    3. Zoe, good points. But the evolutionary advantages of language are a lot greater than just being able to talk about out-of-sight objects! For example, being able to convey a category's features by verbal instruction rather than everyone having to learn them the hard way for themselves.

      There is a subtle point related to "mentalese" however, and it has to do with "thinkable thoughts". It's discussed in other replies in these threads: what is it?

    4. It is posited that the evolution and emergence of UG is necessary for expressible thought, and that any thoughts that are UG-noncompliant are not thoughts in the same way (since 'thought' is a rather broad term, we could call them non-thinkable thoughts). This is not empirically provable and is just a 'thought' experiment, if you will, but it raises an interesting possibility about the nature of thought itself: we may have to reframe how we conceive of it to get to the bottom of how it evolved and whether it is inextricably tied to UG and language. This strays away from cognitive science and more into philosophy.

  7. This paper presents arguments supporting the causal mechanism of natural selection in explaining the evolution of language, and lists some basic building blocks of our grammar with their semantic/pragmatic functions. One point that does not seem to be emphasized though is the fact that some of our language abilities are innate (as in universal grammar), while other abilities are learned. This presents a significant question in the evolution of language: what is the function of having these innate rules built in, instead of just learning them all with the help of an innate learning structure?

    Replies
    1. I agree with Jessica. Indeed, the paper does not study universal grammar. When tackling the question of what language is, this seems non-optimal. When asking what makes a human, who has language, different from another being that does not have language, it is crucial. Still, the text gives a good summary of the evolution of language as influenced by natural selection.

    2. Jessica & Garance, yes, that's the question to ask.

    3. In response to Jessica's comment, a possible reason why some innate mechanisms are already built in (and this is purely speculative) could be cognitive economy. We know that the brain rewires to find more efficient pathways (this is learning), freeing up resources for other adaptive things to learn. It would make sense that having a built-in "language-learning module" would save our brains a lot of time and energy when learning to communicate with others. What do you guys think?

  8. Melika Yadmelat October 31, 2023 at 12:37 PM posted:
    "This reading explores how human language has evolved by natural selection. Indeed, the authors conclude that language presents signs of complex design for the communication of propositional structures, and thus must be the source of the process of natural selection. Unlike OG that has to be learned, UG is innate and within all children at birth. This makes me wonder, if we were to reverse-engineer UG, we would gain an explanation regarding its mechanism and function but I do not think that we would know why UG was required to produce propositional structures. What would be needed to figure this out?"

  9. Melika, yes, that's the right question to ask about UG.

  10. The authors of this text seem to be arguing for the idea that language evolved through natural selection, just like some of our other biological features, because language was advantageous for humans. They also introduce the concept of universal grammar (UG). While reading, I tried to understand the difference between universal grammar (UG) and ordinary grammar (OG). The difference is that OG can be learned. This is supported by the fact that children make mistakes when they speak as they are learning and, when they are corrected with feedback, they learn from it. UG, by contrast, is universal: it is not something that is learned but rather something that we are born with. According to the reading, children follow UG rules without making mistakes and without ever having received feedback. But what is not explained in the reading is how we got to have this innate UG. So, the main question I would like to ask is: how and why did UG evolve? And what is the evolutionary function of UG? (How are these innate grammar rules giving us an evolutionary advantage?)

  11. The authors explain the suboptimal communicative power of some grammatical structures by saying that at a certain point, the pressure to conform to the grammar of others becomes greater than the pressure to maximize objective communicative power, so grammar calcifies in a particular form. This is because one is more easily understood by others when they use the same inefficient grammars as the people around them than when they use more developed grammar. But we must have arrived at these genetically codified grammatical structures incrementally, through evolutionary means. The authors say that Universal Grammar evolved under the pressure to communicate your own thoughts and understand those of others ever more effectively, in what they describe as a linguistic arms race. In this arms race, it would have been more beneficial to introduce a new grammatical structure that broadens your communicative powers, even at the risk of not being well understood, than to just conform to the grammar of everyone else. My question is, at what point in the evolution of Universal Grammar does the primary adaptive concern switch from being the pressure to communicate to the pressure to conform, and why?

    Replies
    1. Aya, I think this is just P&B, speculating. What they say about conformity and convention in grammar applies much more to OG than to UG. Do they even make the distinction between OG and UG?

  12. Pinker & Bloom give thorough counterarguments to many of the arguments against Darwinian natural selection as the main causal mechanism behind the development of natural language capacity in humans. One point they brought up is the complexity (weasel-word?) and fragility of our grammatical systems, wherein a slight change to a grammatical principle can lead to "dramatic effects on language as a whole". This fact has been used by some as an argument for the unlikelihood of universal grammar evolving gradually through natural selection, and the authors respond by claiming that UG could have been achieved by the gradual improvement of incomplete, though still functional, grammars. They claim essentially that formally incomplete grammars, in an interplay with non-linguistic cognitive systems, could still have been used to generate and comprehend sentences, and that the UG that was eventually arrived at was evolutionarily selected as the most advantageous form of innate grammar. In so positing, they attempt to do away with the notion that UG is an "all-or-nothing" phenomenon, showing that its existence is perhaps less surprising than some would think.

    Replies
    1. Adam, there's no way to wave away the special perplexities of the origin and adaptive benefits of UG just by saying it happened -- somehow -- gradually! Consider just the (simple) subject/predicate proposition. If that's already there once a language (and its speakers) can express propositions at all, what is the gradual "complexification" to UG, and how and when did our genomes get there?

      P&B are right that there is no evolutionary challenge in the evolution of language, but they have nothing whatsoever to say about the evolution of UG. (What is language?)

    2. Language can be seen as a system of communication using symbols (spoken words, written characters, gestures) with rules for combining these symbols to convey meaning. According to P&B, the ability to use language could have provided significant adaptive advantages, such as an improved ability to share information, coordinate actions, or engage in social bonding and group cohesion.

      However, explaining the evolution of language in general does not directly address the specific evolution of UG - this leads to the discussion about OG. UG involves not just the ability to use language but a specific set of innate grammatical structures and rules. The idea that evolution is "lazy" aligns with the principle that natural selection often favors solutions that are 'good enough' rather than 'optimal.' This is a key point in understanding why the evolution of something as intricate as UG poses a challenge for evolutionary theory.

      Moreover, the complexity of human language goes beyond mere communication; it involves the ability to express abstract concepts, hypothetical scenarios, emotions, and so on.

      Delete
  13. Two points made by Pinker and Bloom to counter arguments against the natural evolution of language particularly struck me:
    - The idea that linguistic innovations do not necessarily begin with genetic change, according to the Baldwin effect, which posits that behaviors or traits acquired through individual learning and adaptation can eventually become genetically assimilated in a population over time. This is referred to as a phenotype-first type of evolution (according to Wikipedia).
    - The idea of evolutionary acceleration, initiated by a cognitive arms race between the members of a group, explains how language could have evolved “so quickly”. Language abilities would have evolved quickly, driven by a competitive process between individuals. Pinker mentions the case of the cognitive arms race between a cheater and the other members of the group, or conflicts over reproductive success in which language abilities can make the difference (in convincing others, for example).

    ReplyDelete
    Replies
    1. I'm also inspired by these arguments, but I'm confused. Aren't these both arguments FOR a natural Darwinian evolution of language (and not counterarguments)?

      Delete
    2. Joann, think of Baldwinian evolution like this: Something is learned. It proves to be very useful. That means it becomes adaptive to learn it. Then natural selection favors genes that make that learning faster or easier, and that increases the motivation to do the learning (e.g., imprinting, learning to walk, learning to talk -- and of course learning capacity itself). How is this related to the laziness of evolution?

      An organism's genotype is all its genes. Its phenotype is the outcome of the expression of the organism's genes in bodily and behavioral traits and their interaction with the organism's environment.

      Csenge, yes, P&B's points here are for, not against, the explainability of the origin of language by evolution through natural selection (both Darwinian & "Baldwinian" [which is of course also Darwinian]). (What P&B ignore or underplay is the problem of explaining the evolution of UG itself, which is far from "easy," like the other components of language, innate and learned. Why?)

      Delete
    3. I think the subsection 5.2.1 Nonshared Innovations encapsulates Baldwinian evolution in the context of UG quite well: “When some individuals are making important distinctions that can be decoded with cognitive effort, it could set up pressure for the evolution of neural mechanisms that would make this decoding process increasingly automatic, unconscious, and undistracted by irrelevant aspects of world knowledge. These are some of the hallmarks of an innate grammatical “module” (Fodor, 1983)”.

      However, it does not address how UG came to be and its implications on the evolution of language. If UG is innate, does that make it immune to natural selection? If not, what is the distinction between aspects of language that belong to UG versus those that pertain to OG?

      Delete

  14. I like the passage 5.3.2. It shows the link between language, technology, and social interactions in early human societies, highlighting how language has been essential for cooperative survival and the sharing of vital knowledge. It stresses the significance of accumulated generational knowledge and the universal human trait of teaching, underscoring language's role in both individual and collective learning. It also mentions how advanced communication tools and recursive syntax allowed for the effective transmission of information crucial for tasks like navigation, resource identification, and risk evaluation.

    ReplyDelete
    Replies
    1. Julide, yes, language's revolutionary power comes from the capacity to acquire and transmit categories indirectly, through propositions. (So, everybody, please forget the simplistic notion, still found in ChatGPT3.5, that it's all just the capacity to think or talk about things that are out of sight.)

      Delete
    2. I also enjoyed this section of the reading! It made me think about the recent advancements from OpenAI. A few days ago it was announced that there would now be “GPTs”, which are essentially tailored versions of ChatGPT that can be customized to do all sorts of things. However, what is unique is that it doesn't require any coding knowledge at all, as it allows a person to program it simply by using language. This means that almost anyone can create their own customized ChatGPT. I thought this was interesting because we are now seeing advancements where ordinary people are given the opportunity to create something that they most likely wouldn't have been equipped to create before. I think this really demonstrates the power of language and how its use is changing and evolving over time, and I am curious what everyone thinks about this advancement.

      Delete
    3. This is a good point! This passage helped clarify for me the importance of prepositions. While it is quite intuitive to understand the importance of content words, this section does a nice job of conveying the importance of the linking words as well. I can imagine a progression of language where, once a community establishes its content words and a form of rudimentary communication using language, that community would become much more successful, safe, and coordinated with the addition of prepositions to capture ideas that would be challenging to convey otherwise, but nevertheless important for survival.

      Delete
  15. Pinker & Bloom argue that children first develop a simple, logical, expressive system of communication. For example, this can be done through the use of short phrases such as “want food”, following innate Universal Grammar rules. This method of communication is simple and straightforward, and sufficient to have one’s basic needs met (i.e., express hunger to be fed). However, as children develop, they slowly abide by an “adult code” which is more subtle and arbitrary. Children acquire a larger vocabulary and learn to use “Ordinary Grammar” rules learned over time by interacting with the environment. They gradually stop communicating using simple straightforward terms. Instead of “want food”, children may use “I’m hungry” or “My tummy is rumbling”. Despite the use of longer phrases and more complex vocabulary, Pinker & Bloom argue that it is possible to communicate concisely and precisely, so long as the listener is familiar with the vocabulary being used.

    ReplyDelete
    Replies
    1. Hi Anais, I think that's spot on! This argument of Pinker & Bloom could be related to the "Show to Tell" proposition. Evolutionarily it would be beneficial for young kids to first ground knowledge about the environment through showing or pointing to referents. From there they can utilise Universal Grammar to interact and communicate their needs. However, without a vocabulary there would be very limited messages to convey, hence the primary mode of expression evolved into telling. These adaptations make sense from an evolutionary perspective, except that the emergence of these functions/abilities still cannot be explained, including how UG came about. But to me, that's an exploratory question equivalent to how and why we have cognition in the first place.

      Delete
    2. Anaïs, if it's not clear why UG was needed for adult conversation, it's even less clear why it was needed for baby-talk, which just expresses the needs and requests that other species express through nonverbal behavior. Nor is it clear why OG is not enough (along with the propositions and predicates of logic).

      Kristie, not so fast: What was that transition from show to tell? How is it related to grounding? And how did UG get into the picture?

      Delete
    3. The paper by Pinker and Bloom suggests that the complexity and nuances of adult language, including Universal Grammar (UG), are not superfluous but evolved as advantageous adaptations. UG's role in simple 'baby-talk' is foundational, giving a structured base from which language complexity develops. The transition from 'show' to 'tell' reflects an evolutionary advantage in moving from non-verbal to more complex verbal communication, leveraging UG's framework. UG, therefore, is integral to both basic and sophisticated communication, evolving not just for advanced discourse but as a core mechanism facilitating language development and adaptability.

      Delete
  16. The assertion from the authors that the mechanisms of language have been evolved in order to support “the communication of propositional structures” is extremely interesting. More specifically, they explore the reasons that this explanation of language capabilities is not a “just-so” story, which I think is useful in further understanding the concepts. Similarities are drawn between language mechanisms and the discovery of bat sonar and how because of the so-called complexity of the ability we are able to determine that the specific function was designed for that purpose. While I don’t necessarily disagree, it does seem like a leap in logic to assume that just because something is complex (in whatever capacity it is being used to describe something) that in turn the system must have been designed for that function. Additionally, while this supports the idea of UG where some language structure is innate, it doesn’t comment on the fact that there must be some learned language mechanisms that allow for OG.

    ReplyDelete
    Replies
    1. Jenny,
      What is a proposition?
      How is propositionality like bat sonar?
      "Complexity" is a weasel-word.
      What do you mean by "design"?
      What is evolution?
      What is UG?
      Do P&B explain how UG evolved, or even why it's not as easy to explain as other components of language (including OG)?

      Delete
  17. What I found particularly interesting about this paper is how advances in AI have cast doubt on some of the things they state as facts. Pinker and Bloom make the point that language is too complex to be learned from examples, and that this is why children must have some innate constraints. This argument holds up less well in the days of LLMs like ChatGPT. If language could be learned from examples, as this technology seems to do, then why do we have these innate constraints? Chomsky has very effectively shown that we don't learn language simply by example, but why have UG if it is indeed not actually needed to learn language?

    ReplyDelete
    Replies
    1. Marie, does GPT really learn language?

      Or does GPT just learn to manipulate words from its Big Gulp and from our user queries in a way that makes sense to us users?

      Is that what kids learning language are doing?

      What GPT can do is surprising and interesting and not fully explained -- but has it really learned language, rather than ways of manipulating (our) words?

      Does GPT understand or mean what it says any more than Searle does in the Chinese Room?

      Delete
    2. No, as we have discussed extensively in this course, GPT does not ground language. I don't think children learn language at all the way GPT does, but many of the objections raised in the paper do not concern grounding at all. It seems like at least part of the objection is about syntax rather than semantics: that the structure of language is too complex to be learned by example alone. This is the argument I was responding to. I completely agree that GPT has not learned language and that we shouldn't draw conclusions about how humans learn language from it.

      Delete
    3. Marie, I agree with you! ChatGPT absolutely does not learn language and does not even have UG. It can’t ground symbols because, as we saw, that is only possible for someone or something with sensorimotor capacities, and since ChatGPT is T2, it has none. It just manipulates words from its Big Gulp, and that would not be possible for kids to do, or for anyone at all. I’m guessing that to learn language you would need to fully understand the meanings of the words, and as we saw with Searle’s CRA, that is not possible for it.

      Delete
    4. Marie, I was looking for someone to bring this up! I was thinking about the distinction between T2 and T3 robots, and how we could ever possibly reach T5, in the context of UG. P&B propose that language is an adaptation shaped by evolution, with specific linguistic abilities encoded in the human genome. I believe that ChatGPT operates on an OG model alone, though I see how that perspective can be flawed, since the way ChatGPT manipulates language ultimately comes from our own skillset of language acquisition and manipulation, which is shaped by both UG and OG. But I digress. I want to address GPT's flaw, raised in Prof. Harnad's question "does GPT really learn language?", by answering it with another question: how do kids learn language? There are two important aspects of word learning in children: their own assumptions about language, and social context (learning from caregivers and peers). I won't focus much on social context because that is primarily related to OG. Kids bring many assumptions to learning language, and these are not taught; they are innate language tools that children instinctively rely on as a result of UG. As language development progresses, we tend to abandon those assumptions and rely more on social context. To go on a tangent: think simply about learning a second language. The sensitive period for language acquisition lasts from birth until before puberty, owing to maturational changes in the brain whereby language areas become less plastic. This is a crucial period in which an individual exposed to adequate linguistic stimuli can acquire a first language to the point of full native competence. After this period, languages are learned with great difficulty, and native-like competence is rare.
I think part of WHY this is the case is our inability, as adults, to directly access UG, based on some of the research I've read; but there are many differing opinions and little consensus on that topic yet, so I don't want to mislead you in any way, and I encourage you to explore the concept and form your own opinion! Anyway, back to UG and ChatGPT: I disagree with the position that UG is not required to learn language. The question should be about the mechanisms by which we acquired UG through natural selection, which I don't believe is explored in this reading, rather than about whether UG actually aids language acquisition. If we can 'reverse engineer' that, maybe we can teach ChatGPT to learn words without relying solely on human input, perhaps skipping T3 altogether and moving on to become T4!

      Delete
  18. In this paper Pinker and Bloom argue that language evolved through natural selection, which they support by demonstrating that language “shows signs of complex design” and that the only explanation for this design is the process of natural selection. An aspect of the reading that I found most interesting was their discussion of language diversity, and language design (Section 3.3). The reading highlights that though there seems to be a vast amount of language diversity, these variations are merely surface level. Pinker & Bloom argue that language diversity, and their corresponding variations “correspond to differences in the extent to which the same specific set of mental devices is put to use, but not to the differences in the kinds of devices that are put to use.” Using this argument they elaborate on how the variation across languages may be due to constraints unique to a particular environment, which are learned by some underlying and universal learning mechanisms. An aspect of the reading which I struggled with a bit was the discussion on why language design is not a just-so story—could anyone explain the language design as a just-so story argument? Thank you!

    ReplyDelete
    Replies
    1. Shona, what is "complex" and what is "design"?

      And, while we're at it, language seems to have lots of components: phonology (learned, with some Baldwinian "preparation"), vocabulary (learned, with Baldwinian motivation and speed), OG (learned, with some Baldwinian preparation), and UG (unlearnable, hence innate, hence evolved, but how and why?)

      Language variation is in phonology, vocabulary, and OG. But why would the capacity to learn language take such a complex (sic) form as UG?

      The only difference between a "scientific" (i.e., empirical) hypothesis and a Just-So Story is the amount of observable evidence supporting it. (That's another one of the faces of underdetermination.) (What's that? What's approximation? What's uncertainty? What's information?)

      Delete
    2. Adrienne, good points.

      Language certainly evolved for communication rather than "self-expression". All of its features can be explained by evolution or learning -- except UG: Why?

      Delete
    3. When P&B talk about "complexity" they are referring to "any system composed of many interacting parts where the details of the parts' structure and arrangement suggest design to fulfill some function." They use the example of the eye to demonstrate that the structure and interaction of the many parts of the eyeball suggest that it evolved through selective pressures to achieve a function, as natural selection is the only physical process capable of producing low probability arrangements of matter.

      UG cannot be explained by learning because we innately possess some linguistic knowledge from birth, allowing us to generate more sentences than we would be able to if we were just learning from the stimuli around us. UG further cannot easily be explained by evolution (it must have evolved, but the explanation is not easy) because there is no clear account of how UG would have evolved gradually (what selection pressures would entail that UG evolved in the way that it did?) and why the evolution of UG would have conferred reproductive advantage.

      Delete
  19. In cognitive science classes, we learn about language in terms of modular and interconnected structures in the brain that are involved in specific aspects of language processing, such as understanding words (Wernicke’s area) or controlling the motor movements of the face to speak (Broca’s area). These brain structures likely evolved by means of natural selection (much research shows that our phylogenetic counterparts, like macaques and chimps, have similar cortical areas based on their neuroanatomical organization). But where I become confused is when we use ‘grammar’ and ‘language’ as interchangeable concepts. Did the brain structures evolve through natural selection, or the grammatical rules?

    ReplyDelete
    Replies
    1. Further, are the modules described by Chomsky meant to represent the brain structures? Even so, why are these modules described as having an innate “grammar” system, rather than addressing the real hard problem that we don’t understand HOW they process information so that we may communicate with language.

      Delete
    2. Hi Kristi,

      From what I understand, the article "Natural Language and Natural Selection" argues that the complex structure of the language faculty, including grammar, is a design imposed on neural circuitry as a response to evolutionary pressures. Meaning that brain structures involved in specific aspects of language processing likely evolved by means of natural selection, but this does not mean that grammatical rules themselves evolved through natural selection. Rather, I think that the capacity for grammar is a product of the evolution of the language faculty as a whole. I could have misunderstood so don't hesitate to respond!

      Delete
    3. Kristi, "modules" is an empty weasel-word. Yes, neuroanatomy evolved, and it is correlated with function, across species, but almost none of that has yet been reverse-engineered. Correlation and homology is not explanation. Vocabulary and OG are memes, so they do not evolve genetically. Phonology is partly evolved and partly learned. But UG, which is unlearnable (why?) would have to have evolved. The problem is: how? and why? Read the other replies for why these evolutionary questions about UG are hard.

      Delete
    4. Lili, you're missing the distinction between OG and UG (both are "grammar").

      Delete
  20. I am trying to understand the common ground between Chomsky and Gould. From what I understand, their theories are quite different: Chomsky's theory emphasizes the role of innate structures in language acquisition, while Gould's theory emphasizes the role of cultural and environmental factors. Is what they agree on just that language did not evolve through natural selection? Or is there more to it?

    ReplyDelete
    Replies
    1. Emma, Gould is just suggesting that not all genetic traits have a direct evolutionary/adaptive explanation; sometimes they are side-effects of adaptations that evolved earlier, providing earlier adaptive benefits. That is what the "spandrel" metaphor was about (see other replies). But it does not work for UG: why not?

      Delete
  21. In section 4.2. Constraints on Possible Forms, authors Pinker and Bloom refute Chomsky’s argument for considering “physical law” alternatives to natural selection. I understand, and agree with, most of their arguments against language being a necessary physical consequence of human brain size or intricate neural connectivity (weasel word, I know). However, I do not understand why the following sentence is necessarily inconsistent with Chomsky’s suggestions:
    “Neural Network modelling efforts have suggested that complex computational abilities require either extrinsically imposed design or numerous richly structured inputs during learning or both”
    While I understand what the above phrase entails, I’m failing to see the link between what neural nets have shown and refuting Chomsky’s argument. Can anyone explain?

    ReplyDelete
    Replies
    1. Paniz, you are right. That sentence does not refute Chomsky's evidence that UG is unlearnable, hence innate. But "physical law" is not an explanation of the origin of UG either.

      Delete
  22. I found Pinker and Bloom’s discussion on the mind as a learning device to be the most interesting, specifically the claim that there is “no psychologically realistic multipurpose learning program that can acquire language as a special case”. I believe this to be fundamentally incorrect, as Landauer and Dumais with their Latent Semantic Analysis model have gone to great lengths to show that language and grammar can be learned from the contextual statistics of language use alone. They propose that no innate knowledge is required for language learning mechanisms and that all human language acquisition is based on a process of induction called abductive reasoning. The LSA model, which relies on dimensionality reduction over word co-occurrence patterns and the similarity structures that result, performs incredibly well on TOEFL and can capture the language learning rate of children to a comparable degree. Most interesting is the fact that LSA is a precursor to, and basis of, the mechanisms underlying ChatGPT, meaning that the incredibly impressive language abilities of ChatGPT can and do arise from a multipurpose learning device that learns from experience alone. Pinker and Bloom argue that a system capable of learning language without an innate language acquisition device or a universal grammar “does not now exist in artificial intelligence and it is unlikely to exist in biological intelligence”, but evidently it does in the form of ChatGPT. I’ve always found Chomsky’s UG to be this sort of intangible and elusive and hand-wavy mechanism by which language is acquired, especially given that no one (including Pinker and Bloom in this paper) has provided a concrete evolutionary explanation for how it arose (I am a firm believer that a biological feature cannot magically arise unless it is naturally selected for). It would make much more intuitive sense if language were learned from experience alone, as every other domain of human knowledge is learned.
It seems far less likely that humans are born with the capacity for language than that we learn it through experience, which is shown to be possible through the mechanisms underlying LSA and ChatGPT, contrary to the Pinker and Bloom claim that I started this skywriting with.
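For readers unfamiliar with how LSA actually works, here is a minimal sketch of its core mechanism: factor a term-document count matrix by singular value decomposition, keep only the largest latent dimensions, and measure word similarity in that reduced space. The toy corpus, the choice of k, and all names here are my own illustrative assumptions, not Landauer and Dumais's data:

```python
import numpy as np

# Toy term-document count matrix: rows = terms, columns = documents.
# (Illustrative data only; real LSA uses large corpora and weighted counts.)
terms = ["cat", "dog", "pet", "car", "road"]
X = np.array([
    [2, 1, 0],   # "cat"  counts in docs 1-3
    [1, 2, 0],   # "dog"
    [1, 1, 0],   # "pet"
    [0, 0, 2],   # "car"
    [0, 0, 1],   # "road"
], dtype=float)

# Core of LSA: singular value decomposition, truncated to k dimensions.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
term_vecs = U[:, :k] * s[:k]     # each row: a term in latent space

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Terms that occur in similar contexts end up close in latent space,
# even without ever co-occurring in the same sentence:
sim_cat_dog = cosine(term_vecs[0], term_vecs[1])  # high
sim_cat_car = cosine(term_vecs[0], term_vecs[3])  # near zero
```

Whether such similarity structure amounts to *learning language* (rather than learning the statistics of word use) is exactly what the replies below dispute.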

    ReplyDelete
    Replies
    1. Stevan, well said! I wonder how the field is revisiting its beliefs based on the advent of AI that can do many things humans can with general-purpose algorithms that merely attempt to reduce uncertainty (form world models that generate accurate predictions).

      Moreover, a repeated attempt to separate human learning from the learning of ChatGPT-like systems is the claim that the latter require a "Big Gulp" to generate grammatical text. However, could this Big Gulp be connected to the "big gulp" of human infancy, where babies don't speak for roughly the first year of life, despite receiving many billions of bits (some approximate it at 10^14 bits) of sensory data (visual, auditory, etc.)? It seems plausible that this kind of low-quality, highly regular data is required by general-purpose learning algorithms to improve their ability to notice patterns quickly and be sample-efficient, and is the very thing that allows children to learn language after a few examples, once they have received this big gulp. Though this cannot be tested, a human child without access to this immense amount of low-quality sensory data would probably lose the ability to learn language at all.

      Delete
    2. Stevan V., you are quite right that Latent Semantic Analysis (LSA) is an early precursor of the statistical part of LLMs like ChatGPT. But now ask yourself whether ChatGPT (like us) has actually learned language. See replies to Valentina and Marie and Paniz. And remember the CRA.

      Yes, life would be simpler if UG had been either unnecessary or learnable (no UG-evolution problem), but it's not. It would be even easier if all organisms had been Darwinian Zombots (no HP!), but we're not.

      Delete
    3. Thomas, don't get carried away! Please explain to kid-sib, slowly and clearly, exactly what ChatGPT's "Big Gulp" is (and isn't). Then explain to kid-sib how to cram all that into the head of a 1-year-old child (rather than what actually goes into its head in that first year; and don't forget what millions of years of evolutionary "preparation" had already crammed that little head with). Then re-read the three replies I suggested to Stevan V. above.

      And, last, reflect on the fact that even if evolutionary time were at your disposal, the "universal learning algorithms" you are imagining might have to be approached the slow, old-fashioned way, just as they were by us (and our ancestors, right back to the primal worm).

      (But ["Stevan (H) Says"] that, in reality, these proud and hopeful "universal learning algorithms" we think we finally have now are actually just universal "curve fitters" that you can use if you have unlimited time (and resources) at your disposal. So far they have only been applied to a few toy-scale problems (like ChatGPT's manipulation of the human-supplied "Big Gulp"). Cranking that up to Turing scale or even Darwin scale may begin to show that the time universal curve-fitting is actually up against is exponential...)

      Delete
  23. Reading Pinker and Bloom's paper, I appreciate the depth of discussion. However, I find it hard to disregard the impact of culture--in the sense of cultural technology--and of pedagogy when they state, "natural language belongs more to the study of human biology than human culture." Technology allows us to achieve 'adaptations' that would otherwise take generations, and particularly recently, the internet has raised kids' IQs at a faster rate than previous generations. I don't know if I misread a bit, but I think that pedagogy and technology help capitalize on the current potential of a species' language skills.

    But I can understand the relevance of biology, as it studies 'how' language is done--the mechanisms in the brain that form language and the physical mechanisms that produce it. Thus, I also think that this paper demonstrates how significant symbol grounding really is. Based on the paper, language was also advantageous because of being able to label things and then categorize them; this knowledge can then be passed on to increase survival. Moreover, to pass on these skills, people still had to bring children around and interact with the environment to achieve grounding. This behavior reminds me of the T3 question on the midterm.

    ReplyDelete
  24. Sorry to post again, but I had questions or doubts about the evidence Pinker and Bloom raised.

    In 5.3, although I agree that gossip is a human trait, they claim that, in all cultures, there is a focus on the ability to persuade, argue, and frame offers; however, I feel there is a lack of evidence for this claim. Wouldn't cultures come to capitalize on this ability, so that there would be measurable changes worldwide over time? Like increases in brain size for it? Changes in language or grammar to frame deals?

    I just find it hard to wrap my head around the idea that, in a micro-social sense, everybody is constantly reflecting on making deals, gathering gossip to capitalize on how to frame their offers, and then thinking about how to frame a deal, given that humans are still prone to short-term calculations. This ability just seems like a long-term trait that would not have been incredibly relevant 150,000 years ago, as even now humans live in a 'vegetative' state most of the time and seek short-term gains. Maybe I have misread, and these deals are more like implicit acceptances to cooperate in a society. Though I can understand a bit of where the authors are coming from, as English is a mercantile-focused language; a lot of our vocabulary focuses on benefits and trade. Yet I don't think this grounding is widespread in other languages.

    ReplyDelete
    Replies
    1. I see that you're thinking of this gossip/cheating as a long-term trait, with which I disagree. Take for example "I gave your son my fruit so you should give me your meat". This person could be lying for a short-term gain. The better he is at lying, the less likely he is to survive in the long term, as his group might not like him so much. Conversely, the better the other person is at detecting this lie, the more likely he is to survive, as he will not be cheated out of his (vegan) meat so often. This is the arms race that would have been naturally present in all hunter-gatherer societies, promoting an increase in the complexity of language. Does this make sense? (I'm not confident myself, so let me know if I'm wrong.)

      Delete
    2. Daniel, the biological capacity for language evolved; what we did with language (education, science, technology, art) was all cultural. These are the benefits of indirect verbal grounding (definition, description, instruction) over direct sensorimotor grounding. Cogsci has to explain the nature of language capacity and language learning capacity, as well as how it evolved biologically.

      The "gossip" speculations are speculations, and speculations about memes, not genes.

      Csenge, Yes, cheating and lying are effective in the short-term, and eventually punished in the ancestral village setting, but not on today's anonymous internet, with lies propagated by likes (real and fake).

      But an interesting point to ponder is: which came first, the truth or the lie? Could language have evolved at all if, at least initially, long enough for it to take hold and spread, the default assumption had not been that propositions are True? ("Stevan Says" that this might even be behind the power of suggestion, rumor, and hypnotic susceptibility.)

      Delete
  25. The parts that engaged me most are those objecting to the idea that arbitrary grammatical phenomena are a good counterexample to the position that language is an evolutionary adaptation. Individual symbols may be arbitrary, and grammar rules may seem arbitrary too, but the way we combine the symbols to standardize communication is adaptive (as long as it is shared), and that is what matters. "The nature of language makes arbitrariness of grammar itself part of the adaptive solution of effective communication in principle". Liberman and Mattingly's example of parity settings in electronic communication protocols was a convincing illustration of how sharing a standard matters more than whichever particular setting either party adopts ("there is every reason to set the computer and the printer to the same parity, because if you don't, they cannot communicate"). The evidence that we go from arbitrary private language to complex learning mechanisms that acquire a language highly similar in almost every detail to those of other speakers in the community (which we need in order to be social, collaborate, and develop kin relationships) shows the adaptive value of this arbitrariness.
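    The parity analogy can be made concrete with a small sketch (my own illustration, not from the paper): the choice between even and odd parity is entirely arbitrary, yet communication succeeds only when both ends share the same convention.

```python
# Minimal sketch of the parity-bit convention: the convention itself
# (even vs. odd) is arbitrary, but it must be shared to work.

def add_parity_bit(bits, even=True):
    """Append a parity bit so the total count of 1s is even (or odd)."""
    ones = sum(bits)
    parity = ones % 2 if even else (ones + 1) % 2
    return bits + [parity]

def check_parity(frame, even=True):
    """Return True if the received frame satisfies the shared convention."""
    target = 0 if even else 1
    return sum(frame) % 2 == target

data = [1, 0, 1, 1, 0, 0, 1]
frame = add_parity_bit(data, even=True)

# Same convention on both ends: the frame is accepted.
assert check_parity(frame, even=True)

# Mismatched conventions: the very same frame is rejected,
# even though neither convention is intrinsically "better".
assert not check_parity(frame, even=False)
```

    Neither convention has any advantage over the other, which is exactly the point about arbitrary grammatical conventions: the adaptive value lies entirely in the sharing.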

    This echoed a point in 8b that stuck with me “our symbol manipulations are governed by a further constraint, over and above the syntax that determines whether or not they are well-formed: that further constraint is the “shape” of the word’s meanings”. So, when we use natural language (words) to communicate, we are not only bound by the rules of syntax but also by the way we shape and convey the meanings of the words we use. The choice of words and how they are combined plays a crucial role in expressing our thoughts and ideas accurately and effectively.

    ReplyDelete
    Replies
    1. Mamoune, you (and P&B) are not making the OG/UG distinction. OG is learned, and learnable. What is innate is UG. (Why?)

      Language as an adaptation for communicating (category) information from speaker to hearer preceded its further use for private thinking. But learning to categorize preceded language. Content-words can be directly grounded through learned, internal sensorimotor feature detection. They become language with propositions (which need grammatical rules: OG and UG). OG is learned. UG is innate. (Why?)

      Language (i.e., propositions) makes it possible to ground further content-words in their referent categories indirectly, by combining already grounded words (feature names) into propositions that define or describe the referents of further content-words.

      To either mean or understand a subject/predicate definition ("X means A + B + C") requires that both the speaker and the hearer already understand the predicate (A + B + C), which means they already know the referents of A, B and C. (Can you use that to explain, kid-sibly, what is meant by the "shape" of each word's referent?) (Words have referents; propositions have meanings.)

      Delete
  26. The way I see language evolving is as a set of functions that different aspects of language fulfill. This text presented dozens of the categories linguists find within language, and discussed them in terms of Darwinian evolution. Here's my attempt at a chronological overview (I imagine all these changes being made primarily through Baldwinian evolution): Starting with chimpanzee-like intelligence, (1) hunter-gatherers greatly benefited from gesturing and understanding gestures, being able to communicate tasks, specifically regarding hunting and gathering. (2) Somehow we made the big leap from pantomime (purely gestural) to proto-communication (arbitrary), as being able to refer to many things and to combine things outside our view helped us survive and stay socially cohesive. (3) Hunter-gatherer lifestyles promoted the genetics of memory and the ability to enforce social contracts, which requires if-then statements, promoting linguistic expression (propositions) rather than just semantic distinction; (4) this opened the door to cheating by lying, starting a "cognitive arms race" in which cheating and cheating-detection abilities promoted each other in parallel. (5) From here I imagine it getting more and more "complex", as we eventually discuss politics, economics and cognition.

    ReplyDelete
    Replies
    1. Csenge, The hunter/gatherer scenario is popular, but is that a likely place for pantomime to turn into propositionality? I think the family (where both food and categories are being exchanged) would be more likely. And the radical transition is from pantomime to subject/predicate propositions rather than if/then conditionals! But, yes, once you've got the capacity to make true propositions, that includes the capacity to make false ones. And the idea's obviously caught on...

      Delete
  27. In Natural Language and Natural Selection (1990), Pinker and Bloom argue that language has evolved through natural selection. I believe the way in which they modify Gould's original computer analogy significantly enhances the clarity of their points regarding language acquisition. According to the paper, Gould states that many of the brain's capabilities arose as a byproduct of its rapid growth in size. He likens this to a computer: it could be used for one specific purpose at first, but over time it could also go beyond this and execute a variety of other functions.
    However, Pinker and Bloom refute this analogy in their paper: to perform other tasks, the computer would require new instructions. Instead, they think of natural selection as being the “programmer” (Pinker & Bloom, 1990). In the context of human language learning, this would mean that children are learning naturally from their surroundings, rather than being taught specific rules in a direct manner.

    ReplyDelete
    Replies
    1. Michelle, metaphors like P&B's "natural evolution as programmer" are very useful for reverse-engineering language capacity. Have a look at the other replies in the 8a and 8b threads.

      Delete
  28. What struck me in the Tomasello and Call article is the display of intention through great apes' nonverbal communication. Even though the mapping between a referent and its signal is interchangeable, those apes seem to be trying to psychologically affect the recipient of the signals, as demonstrated by the fact that they can display many different signs to make the other "understand" what they are trying to indicate. They will also make sure the recipient of the signal saw it, as in the case of attention-getter signals, for example. This might be a sign of prosocial behavior, and shows that they are able to separate what they know from what others may have noticed in a specific situation.

    ReplyDelete
    Replies
    1. Adrian, it shows how much (and more) you can communicate nonverbally -- and it deepens the mystery of why, with all that capacity, they did not evolve language.

      Delete
  29. The question about how grammar changes in time made me think of consonant and vowel assimilation in phonology, such that phonemes are reduced to ones easier to pronounce. For instance, there was the Great Vowel Shift in English. Thus, the current way language is evolving might be through making the production of words more efficient.

    ReplyDelete
    Replies
    1. Miriam, language change across time is not genetic; it's just cultural. (Can you see how it tells more about the iconic-to-arbitrary transition in pantomime than about the Darwinian evolution of language?)

      Delete
  30. We are still far from having any evolutionary explanation for how UG evolved, which the authors avoided throughout the reading. I was thinking about the TT and reverse-engineering our "doing capacity" while I was reading the paper: our learned capacity for language confers evolutionary advantages in terms of communicating category learning, which can be linked to the "why" portion of the easy problem (and partly the "how" as well). This particularly captures OG, but I am struggling to understand how UG can help us with either "how" or "why". If we are trying to reverse-engineer our cognitive capacities through a TT, feature detectors for category learning sound plausible for OG, but I can't wrap my head around how UG can be part of the reverse-engineering process. If something is learned or "learnable", it intuitively feels suitable for reverse-engineering, but I'm struggling to understand how we can concretely reverse-engineer something strictly innate, like UG. If we have UG, a TT candidate should have UG as well; what are the specific mechanisms of UG that are not learnable but can still be part of a TT?

    ReplyDelete
    Replies
    1. Can, the problem is not with implanting UG to design a T3: The problem is how it got implanted in us!

      Delete
  31. From what I understood, the process of gradual "complexification" to Universal Grammar (UG) is not fully understood. It seems that there must have been genetic variation among individuals in their grammatical competence, but that's about it. Each step in the evolution of UG would have been small enough to be produced by a random mutation or recombination, and each intermediate grammar would have been useful to its possessor. The details of grammatical competence that we attribute to selection would have conferred a reproductive advantage on its speakers, leading to its fixation in the ancestral population. The acquisition of UG in our genomes is thought to have occurred over a long period of evolutionary time, but there is nothing to support when exactly our genomes acquired it. So, would this mean that the "evolution" of UG is considered to be a hard problem?

    ReplyDelete
    Replies
    1. Hi Marie-Elise,
      I wanted to add to your comment and attempt to answer the last question you posed. The gradual development towards UG remains somewhat elusive, leaving gaps in our understanding, and for our case, I believe it is worsened by the fact that Pinker and Bloom avoid UG in the article, making it hard for us to pinpoint its place in evolution (assuming it has one). As you mentioned, there is nothing to support when our genomes would have acquired UG, but I don't think that qualifies the evolution of UG as a "hard problem". Instead, I believe it just leaves a gap in our narrative understanding of why we have UG in the first place (qualifying it more as an "easy problem"). The reason is that linguists can purposely create and study UG errors as tangible things (something we can't do when studying feeling/the other-minds problem). Although this may help them with how UG works, we lack the "why" that evolutionary psychology is usually so good at giving us. I believe for these reasons it can't be considered a hard problem.

      Delete
  32. ‘A prominent position outlined by Chomsky, Fodor, Lenneberg and Liberman is that the mind is composed of autonomous computational modules and that the acquisition and representation of language is the product of several such specialized modules’ (pg. 3). Could we say that UG is one or a complex of these modules? These modules, like UG, are innate because they’re in our brains from the beginning. What we need to figure out is how we got UG, and these modules, in the first place.

    ReplyDelete
    Replies
    1. I find the question of how such a UG module evolved to be very interesting as well. I think it's possible that the external world acted as a "sufficiently rich stimulus" or an "externally imposed design" over the course of evolutionary history. The external world clearly contains certain consistent patterns. Pinker lays out the benefits of being able to communicate about patterns in the external world, such as being able to teach your kids to avoid eating a type of mushroom that will consistently kill you, without having to show them directly (this he calls stimulus-free learning). Since the external world contains regularities, and since it is a "sufficiently rich stimulus", those regularities may have imprinted on our neurobiology, leaving behind the cross-cultural regularities of universal grammar over the course of evolutionary time...

      Delete
  33. Pinker tries to argue that language is directly adapted for by selective forces in evolution. I found the criticism that Universal Grammar could not have evolved because grammar displays no variation to be very interesting. Pinker responds that the only grammars that actually exist are personal, and do indeed display variation. Universal Grammar is a generalization and exists only in the way that a perfect anatomical model of an eye in a textbook exists. Eyes, and personal grammar still undergo variation, and varying grammar can still be comprehensible to listeners, like “I caveat that”. Given that personal grammars display variation, and can be comprehensible, Pinker argues that exhibiting more advanced grammatical capabilities could confer tangible evolutionary benefits, whether through achieving positions of power, or more effective pedagogy.

    Pinker also notes in 3.4.1 that the development of complex grammar is fundamentally constrained by a trade-off between speaker and hearer, or between ease of production and ease of comprehension. Given the speaker-hearer trade-off, and the evolutionary advantage of more sophisticated grammar that Pinker also argues for, I wonder what the biological limits of grammatical complexity are. Is there a degree of information density conveyed by a hypothetical language that will never occur because the listener cannot develop the cognitive abilities to parse the grammar? At what point do limits on our cognitive capacities limit the complexity of our language?

    ReplyDelete
    Replies
    1. Hi Daniel, I thought you brought up a very thought-provoking argument. With regard to the biological constraints on grammatical complexity, the trade-off between expressive efficiency and cognitive load is indeed critical. There may be a ceiling on complexity in any given language system, determined by the listener's cognitive processing capacity. If a language were to evolve beyond this cognitive threshold, it would risk communicative failure and in turn negate its actual purpose. So even though our brains are very adaptable, the evolutionary advantage of language lies in its balance between complexity and comprehensibility. That balance ensures that information is conveyed effectively within the listener's and speaker's cognitive limitations, and would, in my opinion, prevent the emergence of a language too complex to serve its primary function of communication.

      Delete
  34. In class we discussed, if I remember correctly, that we don't make UG mistakes in our mother tongue, but it is still possible to make UG mistakes in other languages. Therefore I am wondering: have tests been conducted on ChatGPT to see whether GPT makes UG mistakes in languages other than English? I know GPT misinterprets words that have multiple meanings in another language. For example, "crane" stands for both the bird and the tool, and when asked to translate it into Chinese, GPT misreads the context and translates it into the wrong sense.

    ReplyDelete
    Replies
    1. And if GPT also makes UG mistakes in languages other than English, that would be interesting, since it raises the question: did we implement UG into ChatGPT?

      Delete
    2. Hi Tina, that's a really interesting thought. I looked into this and actually found an article published by Professor Harnad, where he interacts with ChatGPT about UG. From my understanding, ChatGPT does not have any innate knowledge of language or any built-in universal grammar. Any errors in the system's grammar are due to errors during training and the data used in training. However, Professor Harnad adds that the reason ChatGPT does not make UG errors is that it was initially trained in English, and that its database comes from people who do have innate UG. Here is the link to the full interaction, as I find it gives a detailed answer to your question: https://generic.wordpress.soton.ac.uk/skywritings/2023/07/30/minimal-grounding-sets-universal-grammar-and-chatgpt/

      Delete
  35. Children are "stuck with having to learn the particular kind of language the species eventually converged upon and the particular variety the community has chosen." In the case of "hitted" and "cutted", could we reasonably say that collective agreement on what is right and wrong wins over what, in this case, has greater expressive power? This reminds me of previous discussions of how language could have evolved from a gestural form: maybe a previous symbol (in this case, a gesture) for a certain word (e.g., "triangle") could have borne closer resemblance to its referent, but has since evolved into the arbitrary symbol (in this case, a word) that we have collectively agreed upon.

    ReplyDelete
    Replies
    1. Hi Jocelyn! I think you're correct in your assertion about "hitted" and "cutted": our collective agreement on right and wrong grammar beats out the increased expressive power of these words. The authors say on page 32 that the "requirement of conformity to the adult code, as subtle and arbitrary as it is, wins over other desiderata" (Pinker and Bloom). I also found this section very interesting, as it seems odd that we abandon more useful terms in favor of prescriptive grammar rules. The authors suggest that this is linked to the evolution of a language acquisition device that builds on the language heard around children, rather than on usefulness. On this understanding, kids' gradual conformity to adult language is a result of this innate device. I don't quite understand how this device could have developed from gestural language, but I think you raise a very interesting idea!

      Delete
  36. This comment has been removed by the author.

    ReplyDelete
  37. Section 5.3.3 on grammatical complexity and social interactions mentions the enduring role of language in human culture: the important notion that early humans, like Homo habilis (Isaac 1983), depended on language for cooperative survival. The ability to resolve problems by communicating and to convey socially relevant abstract information becomes paramount in such contexts, emphasizing the evolutionary importance of linguistic skills. It challenges the view that language is a mere cultural construct, suggesting it has deep roots in human evolution. This distinction further sets humans apart from other species, as our linguistic abilities enable us to engage in discussions, arguments, and effective communication through propositions that express the information we want to convey.

    ReplyDelete
  38. I also found UG confusing at first, but Professor Harnad's example in one of the skywriting replies clarified the concept; "John was easy to please Mary" violates UG, whereas "It was easy for John to please Mary" does not. Children do not make UG mistakes but may make ordinary grammar errors (like saying "breaked" instead of "broke"). UG is not just an innate capacity to learn grammar but also the ability to follow specific grammatical rules that appear universal across all languages. These rules are deduced in syntax studies, but they are not taught to children through active instruction, or even through exposure to sentences that deviate from these rules (which comes back to the poverty of stimulus argument). The interaction between ChatGPT and Professor Harnad further elucidates some of these UG rules, you can find it at this link: https://generic.wordpress.soton.ac.uk/skywritings/2023/11/04/nine-easy-pieces-on-universal-grammar-ug-with-gpt-4/.
    Additionally, I think some examples from the reading, such as "Who did John see Mary with?" but not "Who did John see Mary and?" illustrate the difference between what follows UG rules and what doesn’t.

    ReplyDelete
  39. This paper does not distinguish between OG and UG. It mentions a theory that language has such a complex design that, like the heart, it must have been designed by natural selection; alternatively, language is a side effect of the evolution of other things, such as brain size. From the perspective of particular languages and OG, either or neither may be true (likely neither, as language is learned). I want to focus on UG. To me it seems plausible that UG evolved through natural selection. UG allows infants to learn any OG, and language, more quickly. The sooner an infant can learn language, the sooner it can learn categories. This is essential for survival, as categories allow one to scavenge for food using instruction and supervised learning, which are faster and more beneficial from a Baldwinian standpoint than the unsupervised learning that would be used without language.

    ReplyDelete
  40. The lack of obvious evolutionary advantage of UG is indeed a problem, although this reading, and the second one for this week, gave me a thought on a possible advantage.
    Both Pinker and Harnad talk about the massive advantages that second-hand learning gives, allowing for the circumvention of dangerous trial and error; think of the "thief" group on mushroom island. I wonder whether UG being, well, Universal couldn't have facilitated the acquisition of knowledge by "standardizing the file format": by giving thought a common structure, early hominids could better understand others' propositions. If groups existed that had proto-language capacity but individual differences in "mental propositions", they might be able to understand each other (just as we can sort of understand UG violations), but they would surely be out-competed by a UG group that can parse others' propositions with no ambiguity.

    ReplyDelete
  41. This sentence in the article impressed me deeply: “A major problem among even the more responsible attempts to speculate about the origins of language has been that they ignore the wealth of specific knowledge about the structure of grammar discovered during the past 30 years. As a result language competence has been equated with cognitive development, leading to confusions between the evolution of language and the evolution of thought, or has been expediently equated with activities that leave tangible remnants, such as tool manufacture, art, and conquest.” (50)

    This article surprised me. It explores the relationship between changes in language and the evolution of our human cognitive abilities.

    Initially, I had not really thought about the relationship between the evolutionary history of language and the evolutionary history of human thinking.

    The development of human thinking depends on problem-solving, and also on language learning and exploration, on the logic that structures the thinking process, and on the formation of memory.

    Although language is part of the development of human cognitive capacity, the two developments cannot be equated, because language has many factors of its own, such as grammar, exercised through reading, listening, speaking and writing; these factors make up the developmental history of language.

    The development of cognitive capacity helps us learn language, but the two still have independent developmental histories.

    Therefore, language learning is only a small part of the development of cognitive capacity, and there are many uncertain factors in the composition of language, which must be studied through its own history.

    ReplyDelete
  42. The article "Thirty years of great ape gestures" discusses great ape communication, which is a unique system: it is intentional and sensitive to the attentional state of the recipient. Great ape gestures are different from other mammal displays and from human gestures. They are used flexibly for various functions, including play, and are learned through social interactions. The study of great ape communication is important for understanding the origins and evolution of human linguistic communication, as it is likely that human linguistic communication evolved from intentionally and flexibly used gestural communication. Studying great ape communication therefore provides insights into the cognitive processes underlying communication and the evolutionary path that led to human language.

    ReplyDelete
  43. Considering Pinker and Bloom's theory that language is a result of evolutionary processes, it becomes apparent that the acquisition of language is interwoven with the evolution of cognitive abilities yet follows its unique developmental path. Language is a very important aspect of our cognitive growth, but it represents only a segment of the spectrum of our cognitive capabilities. The connectedness of language with cultural and historical narratives seems to add complexity to its evolution, possibly suggesting that language's development is as much a cultural phenomenon as it is a biological one. Given this intricate relationship, how might the variability and uncertainties of language reflect the adaptive strategies of our species in response to environmental and social changes?

    ReplyDelete
  44. The article "Natural Language and Natural Selection" by Steven Pinker and Paul Bloom explores the evolution of human language and the arguments surrounding its biological specialization. The authors argue that language is a product of natural selection and that it is a specialized cognitive ability unique to humans. They also discuss the role of syntax and grammar in language, and how it is encoded in the brain. The article provides a comprehensive overview of the debate surrounding the biological basis of language and the various theories proposed to explain its evolution. I am wondering whether the authors' claims could be supported by the kind of artificial life simulation mentioned in reading 8.a. Have any of those been attempted?

    ReplyDelete
  45. I realized that I didn't write anything for this week, and since everyone has commented on OG, UG, the incomprehension surrounding the spandrel example, etc., I wanted to add some links that could be made between music and language, as Darwin describes music as this undocumented protolanguage that has preceded all other forms of language.
    Similarly to language (as seen in this text), there is also a debate surrounding the innate/acquired aspects of music. Some have argued that musical genes exist, citing absolute pitch and child prodigies. Others claim that experience rather than genetics shapes these two phenomena. An interesting study has recently suggested that all babies may be born with absolute pitch, which is then kept or lost throughout development. I found this relevant to compare with language, since absolute pitch seems to be a kind of Universal Grammar (an innate capacity) that is then shaped and modified by exposure to everyday life events and to music (a form of OG).
    Therefore, music can be similar to language in the sense that babies are born with UG that is then shaped by life and by OG (if my understanding of UG and OG is correct).

    ReplyDelete
    Replies
    1. Hi Juliette, I think you make a super interesting connection between music and language. Your comparison between absolute pitch and Universal Grammar is fascinating, and I tend to agree with your overall points about how both seem to have innate forms that require other factors to shape and maintain them. Although it is yet to be determined whether specific genes are involved in either absolute pitch or Universal Grammar, your point about exposure to music and OG as precursors that activate AP or UG seems close to the concept of epigenetics. Epigenetics refers to how your behaviors and environment can alter the expression of your genes: it does not change the content of your DNA itself, but can alter whether certain genes are activated. So for UG, it could be that OG triggers the "gene" that contains UG to activate, and thus allows for our language development. I am obviously unsure of all this, but I thought it was an interesting connection to make.

      Delete
  46. This paper focused on the different theories behind the role of evolutionary adaptation in language. Gould and Chomsky believe language to be a spandrel, a byproduct carried along by other adaptive evolutionary changes rather than a direct adaptation. Language is in fact a complex system that follows a specific syntactic structure concerning grounded symbols and the relationships between them. To understand language requires the necessary brain mechanisms and physical hardware for hearing, understanding, and producing language, as well as the sensorimotor modalities to ground it. In my opinion, Chomsky can claim it is simply a system of complex organs, but those organs were still a product of natural selection, with specific adaptive benefits that evolved gradually. The paper also distinguishes two theories about the manner of evolutionary change. The first, punctuated equilibrium, is described as a "burst" of changes followed by periods of relative stasis (no significant changes). Alternatively, the authors advocate gradualism, characterized by sequences of small effects that accumulate into complex evolutionary changes.

    ReplyDelete
  47. The article, and the skywritings and replies above supporting certain of its points, lead me to gather that innate universal grammar serves as a mechanism facilitating the acquisition of rule-based language. The learning of language is unique in that it seems to be partially innate, but cannot be learned in a fully supervised or unsupervised environment. Universal grammar must already be known (innate), since no UG errors are made and UG cannot be learned through teaching, but other components of language must be taught through positive and negative reinforcement and through observation. This article did not dive deep into how OG and UG evolved, but rather focused on how they work together in language. A question that did arise from this is whether there are currently experiments or conversations surrounding the idea of specific genes or brain mechanisms associated with UG, since it is argued to be innate.

    ReplyDelete
  48. Chomsky considered that language usage transcends mere utility in communication, instead providing a platform for individuals to express thoughts. I find myself contemplating the validity of this assumption. Under Darwinian natural selection, the trajectory of human evolution has been largely shaped by the imperative to reproduce and survive, a decidedly utilitarian objective. It appears then that all human behavior, including linguistic development, could be traced back to this fundamental purpose.

    From my perspective, language likely emerged as a direct response to this utilitarian goal. Consider an ancient scenario wherein individuals encounter danger. While some might directly experience the peril (through sensorimotor experience), without the means of language they cannot convey this sensorimotor categorization to others. Language serves as the bridge, allowing them to categorize the danger and communicate it, thereby facilitating avoidance of the threat and enhancing the group's survival chances; this then develops into much richer categorization, including more abstract terms, such as expressions of feelings.
    Consequently, while Chomsky's view treats language as a higher intellectual function, essentially a gift allowing for the sharing of thoughts and knowledge, I propose that this capacity may have originated from a primal need for species propagation. The question then becomes: could our advanced linguistic capabilities have evolved from these original utilitarian roots, gradually developing the richness and complexity of categorization we observe today?

    ReplyDelete
    Replies
    1. Hi Jiajun, your perspective seems to be the widely accepted/assumed origin of language. However, I don't see how your thought experiment can be an argument for it, because many species have non-linguistic forms of communicating danger (e.g., vervet alarm calls), so why didn't they eventually develop language as well? In the Tomasello and Call article, they even showed that other great apes seem to communicate to satisfy individualistic needs rather than cooperative needs as we humans do, suggesting a fundamental difference between communication in our species and in our closest relatives. Furthermore, we ourselves seem to retain similar, more primitive, often reflexive forms of communication, and these are behaviourally and neurologically distinct from language.

      Delete
  49. The reading starts by arguing for a view of evolution as an interaction of selective and non-selective mechanisms, then applies it to the evolution of language, arguing overall that language arose as an adaptation. It emphasizes the difference between language acquisition and language evolution: while language evolved from an (advantageous) need to communicate, language acquisition in children corresponds to adherence to pre-existing structures. Although we might think this pre-existing structure refers to UG, the examples the authors give of language acquisition in children (“breaked”, “comed”) seem to refer to OG. I wonder what the role of UG vs. OG is on this account, and how each emerged.

    ReplyDelete
  50. Thirty years of great ape gestures examines the evolution of gesture as a communicative means. Since we derive from a common ancestor, studying gesture in great apes is thought to help us understand the evolution of language in humans, which may also have evolved from gesture. The main difference identified between great ape and human communication is that human communication is social and cooperative, aiming at sharing knowledge and information rather than at individualistic imperatives tied to immediate goals. So, while great ape and human gesture differ fundamentally in purpose, both have two components: intention-movements and attention-getters (in early human evolution: pantomime and pointing). Therefore, not only is great ape gesture interesting to study in its own right, but it could also enlighten the study of linguistic evolution in humans.

    ReplyDelete
  51. For great apes to get from intention-movements and attention-getters to human linguistic communication, the environment would have to support a cooperative lifestyle: there must be shared intentionality between members in terms of skills and motivations. Given that we shared a common ancestor, I was wondering under what circumstances the emergence of communicative motives (the development of linguistic communication) would have occurred. Shared intentionality, social structure, and the use of tools and manipulation would have played crucial roles in that, but I wonder what kind of shift in climate or available resources (ecological pressure) would have prompted the need for enhanced communication, since no other animal on this terrain has language.

    ReplyDelete
  52. I'm trying to think of other examples that might map onto UG and OG in order to explain them. Universal grammar seems to be the thing all humans have that helps us learn OG, and which children are able to use despite the poverty of the stimulus. In my search I have landed on math as a possible example, although we do learn math. UG might be compared to what feels most basic about math: that it is made of numbers and symbols, and that some symbols are operations that alter the numbers (setting aside physics symbols like lambda), while OG would be like learning specific conventions and other strategies for how to prove or solve things.

    ReplyDelete
  53. P&B talk about how language is useful for expressing and sharing category learning: through language we can communicate our knowledge about things in the world to others, such that they can do the right sorts of things with those things merely from our telling them, rather than having to learn it experientially themselves. Agreeing to call something that kills you "deadly" is trivial; it could just as well be called "cat". This is just a matter of vocabulary learning, which falls under OG. However, P&B do not touch on UG, which by its definition cannot really be something that came about through evolution, regardless of its complexity. UG prods more at the non-trivial question of what language is.

    ReplyDelete
  54. This paper suggests that the evolution of natural language is due to natural selection and describes grammar as a complex mechanism that children are able to grasp without formal instruction. The authors argue that language is rooted in human biology rather than culture, which I find interesting because I had always thought that languages evolve according to our need to express more complicated things over time and to communicate effectively.

    ReplyDelete
  55. I’m a fan of this reading—enjoyed it a bunch. Pinker and Bloom argue that language is a product of Darwinian natural selection. They assert that language’s complexity, especially grammatical structure, evolved for communication, challenging views that it’s a byproduct of other developments. This links language intricacies directly to human evolutionary processes. I’m curious about how UG evolved, but I suppose that to be the big question. Likewise, why might some processes be hardwired and others not?

    ReplyDelete
  56. The optional reading is about gestural communication in great apes. I find it interesting that what differentiates human and ape communication is its purpose: ours is to cooperate ‘in the context of joint goals and ideas’ while theirs is just to fulfill individualistic goals, and that what keeps their showing from evolving into telling is the lack of motivation to do so.

    ReplyDelete
  57. The text explains that grammatical categories align closely with fundamental distinctions in the real world, such as things, events, states, and qualities. Grammar itself comprises many smaller categories governing how words combine into sentences and phrases. Grammatical rules define key lexical categories like nouns, verbs, adjectives, and prepositions, which function as symbols, and they establish distributional patterns, such as verbs taking direct objects while nouns do not, with subject nouns typically appearing at the beginning of sentences. The overall idea is that lexical categories serve as the foundational elements of grammar, reflecting cognitive distinctions in how we perceive and categorize the world.

    ReplyDelete
  58. It has always amazed me that humans are the only creatures with complex grammatical language, so I was very excited for this reading. I always found the online memes about animals being able to speak but just not wanting to pay taxes hilarious and extremely unlikely, but a part of me has asked "what if". After this reading, my two cents on the topic: I think language is a product of evolution through natural selection. Without being able to communicate, ask questions, and understand complex information, it is very hard to survive on your own. I think the paper makes some very sound arguments. In an attempt to find non-verbal abilities in humans that correspond to grammar, I thought of facial expressions, pointing at things, the strength of sounds (loudness and quietness), and eye movement as possible predecessors of grammatical structure, as they are also mentioned in the paper. The correlation between thought complexity and language complexity should intuitively be a causal relationship. Since language is external and displayed through social interactions, complex thought must have come first. Now, however, that dynamic may have reversed: from the moment we are born, the more we can speak a language, the more thoughts we can form, so it could very well be the opposite as well. It is certainly very interesting.

    ReplyDelete
  59. My understanding is that acquisition of a language requires grounding, and this could simply be indirect grounding, which can be achieved through computation, i.e., the manipulation of symbols. ChatGPT computes over words, or symbols, by indirect grounding only, since it lacks the sensorimotor systems for accessing the sensory input needed for direct grounding.
    Humans can not only use language but also understand it, and understanding requires direct grounding. Humans start learning language through direct grounding, connecting words with their referents in reality; later, indirect grounding becomes possible once more and more categories have been built.
    But how can we be sure that we understand language? There is a feeling that we understand the language, that we understand the meaning corresponding to the symbols. This is the other-minds problem, and it perhaps could not be resolved even for a robot with sensorimotor systems that is weakly equivalent to a human.

    ReplyDelete
  60. Pinker says that a series of steps is needed for universal grammar to evolve in the Darwinian way: language must have gone from nothing at all to something. Universal grammar theory suggests that there exists a grammatical structure that is innate and shared among all humans, but little evidence has been found to support the existence of UG, and evolutionary theory doesn't really provide an explanation for it. When language was in its most ancient form, as it started to evolve from drawings on walls, did grammar already exist? If so, then the drawings must also have had grammar. If not, then how did grammar suddenly come to exist when language was created? I wonder whether it is possible that there was no universal grammar at the start, but only different OGs, and that the UG we have today is the result of selective adaptation of the fittest OGs; but I guess this is just a Just-So story.

    ReplyDelete
  61. Pinker and Bloom claim that human language is overqualified for the kinds of things ancestral humans or chimpanzees would need to use it for (p. 43). I would argue that this is not a very strong point. While human communication might indeed be more complex, it is not clear to me why the ways of communicating used by other animals are not also overqualified. They might not be as complex as human communication, but they might still be more complex than needed for the tasks they have to perform. Therefore, I do not think that the fact that human communication is more complex than other animals' is sufficient grounds to suggest that it is a byproduct of some other adaptation.

    ReplyDelete

PSYC 538 Syllabus

Categorization, Communication and Consciousness 2023 Time : 8:30 am to 11:30 am Place :  Arts W-120  Instructor : Stevan Harnad Office : Zoo...