
Re: “AtoZ” PBS-Nova Episode and Children of the Code

RECOMMENDATION: Watch it: PBS Nova: A to Z: The First Alphabet (released 9/23/20)

This beautifully done @PBS @Nova documentary, A to Z: The First Alphabet, outlines how writing developed from pictographs to rebuses to hieroglyphs to the alphabet. It parallels and improves on aspects of our video "The Alphabet's Big Bang" (the second segment of "A Brief History of the Code – Part 1: So Let It Be Written," the first of the three chapters, comprising 21 video segments, that explore the "code" in the "Children of the Code" video series).

Hopefully the next episode, A to Z: How Writing Changed the World (airing 9/30), will examine how this c-o-d-e, the greatest invention in the history of history, became the technology that both made possible and constitutes our civilization, religion, science, politics, law, money, education, marriage, and virtually everything else you can think of. That said, and as important as that part of the story is, the Children of the Code project had a different purpose for outlining the history of the invention of writing.

Just recently (9/9/20), Forbes published an article called "Low Literacy Levels Among U.S. Adults Could Be Costing The Economy $2.2 Trillion A Year". The piece estimates that 130 million adults have reading skills below the 6th grade level. If that number surprises you, it shouldn't. According to national data (NAEP), most U.S. children read below grade-level proficiency from the day they enter school to the day they leave it. It has been that way for decades (thus the 130 million adults). These children and adults are living lives harmed by their lack of reading ability. Harmed by the mind-shaming / self-disesteeming effects of their chronic struggles with learning to read. Harmed by the extra difficulty that accompanies learning everything else in school and life that depends on reading. They are "Children of the Code" in another, darker, and more important sense.

These two seemingly distinct stories are really aspects of one story. How the alphabet came to be used to write and read English is the backstory for understanding the difficulties, pain, and shame experienced by the tens of millions of children and adults who struggle with reading. In short, understanding the code's history is critical to understanding why so many people struggle to learn to read it, which is in turn critical to understanding the collective costs and societal effects of low literacy on the economy, the healthcare system, democracy, and the intelligence of our population. What connects these facets is that, for the vast majority of the tens of millions of children who struggle with reading, the cause of their struggle is the state of this "c-o-d-e" (1).

T-h-i-s code is not the code of Moses or Plato. Back in those days, as Plato said (Republic), "In learning to read… we were satisfied when we knew the letters of the alphabet". The code was phonetic, and because it was, reading was simply blending the sound of one letter with the sound of the next. What has happened to the code since then, how the Roman alphabet came to be used to represent the sounds of the English language (The First Millennium Bug), is the story of how the code arrived at the unnaturally and complexly ambiguous state it is in today.

Explore these and other critical components of the story of the "Children of the Code" on our website, Facebook page, LinkedIn page, or YouTube channel.

@DOXdocs @doxproductionsltd

1 – Some might argue that dyslexia is independent of the code. I am not saying there isn't a neurobiologically innate form of dyslexia that makes learning to read difficult. Rather, over 60% of all U.S. kids are below proficient, yet only about 6% have innate dyslexia (see Shaywitz Dyslexia Numbers). And even for those 6%, the irregularities in the code make reading significantly more difficult:

Dyslexic children of primary school age learning consistent orthographies seem to have less serious difficulties than their English-speaking counterparts. English children with dyslexia generally have persistent deficits in word reading accuracy, and even more severe deficits in nonword reading, with error rates often ranging from 50% to 70% (see Rack et al., 1992). Their counterparts in languages such as French (Sprenger-Charolles, Siegel, Bechennec, & Serniclaes, 2003), German (Landerl, Wimmer, & Frith, 1997), Dutch (de Jong & van der Leij, in press), and Greek (Porpodas, 1999) typically attain much higher scores, with error rates in the order of 6% (Dutch) to 25% (French). Importantly, even these relatively low error rates are typically significantly higher than those of age-matched controls, and sometimes also higher than reading-ability-matched controls. Hence, children with dyslexia who learn relatively transparent orthographies do experience problems with word and nonword reading accuracy, but these appear to be less severe than those of their English-speaking peers. From: The Nature and Causes of Dyslexia in Different Languages
