Everyone we interviewed agreed: a significant component of the challenge of learning to read (English) is recognizing unfamiliar words fast enough to keep comprehension primed and flowing.
What most challenges the brain, and causes the processing delays that “stutter” the flow of reading, is the time it takes to recognize unfamiliar words. More specifically, it is the time it takes to work through the confusing relationship between the letters and sounds within unfamiliar words.
This aversion to scientifically examining the role of code confusion in reading difficulties took many forms. Some of the most powerful leaders in reading science and policy saw the code through the mental models of an already literate adult (rather than a pre-literate child’s mind):
In a moment that perhaps best exemplifies this paradigm inertia, we asked Dr. Reid Lyon (referred to at the time as the “Czar of Reading” in America): “So what we are saying, in effect, is that the majority of our children, to some degree, are having their lives all but fated by how well they learn to interface with an archaic technology.” Watch his response:
In effect, Dr. Lyon was saying: we can’t change the code, so there is no reason to think about the role of code confusion in reading difficulties (except in terms of training teachers). The code and its confusions are an immutable fixture in thinking about reading science and policy. Therefore the problem is not the code but the lousy job teachers are doing teaching it.
We mean no personal criticism here. Dr. Lyon is someone we greatly admire for his championing of children. His response illustrates how the learning of even the best scientists can be disabled by their paradigms. As one of the nation’s most influential scientist/policy makers, the way he conceptualized the challenge of learning to read directly influenced government policy and academic research, and indirectly influenced virtually every school in the country.
Perhaps the most stunning example of the paradigm inertia that pervades reading science is revealed in this exchange with Dr. Keith Stanovich, one of the most influential cognitive scientists in the field of reading.
Again, we mean no personal criticism here. Dr. Stanovich is a brilliant and caring scientist who contributed vastly to the cognitive science of reading. Yet here we have one of the most highly respected scientists in the field of reading saying, in effect: we’ve been badly burned by the “reading wars” and by how the irregularity in the code was used to justify whole language. We don’t want to go anywhere near those confusions and risk opening up those wars again. In other words, a scientist is saying: we don’t want to be scientific and look deeper into the underlying issues that are causing so many of our children to have life-maligning difficulty learning to read, because it might endanger our phonics-training agenda and reopen those painful wars.
These leaders lived through the reading wars and felt passionately that they were fighting for the future of generations of children, so it’s easy to be sympathetic to them. Yet the moment we set aside an instructional agenda, as we did with the Children of the Code Project, and instead focus on understanding the challenge itself (the reason so many kids, nearly two-thirds as of 2019, do not read at the proficient level), it becomes clear that it is not the teachers’ fault. It’s the code.
Watch a few seconds of the above video again. It’s set to start at the right place. Stop watching when Dr. Merzenich stops talking (“It is a technological artifact”). Dr. Michael Merzenich is a National Academy of Sciences neuroscientist known for his pioneering work in neuroplasticity and for its first major application, the cochlear implant. A thought leader in the neuroscience of sound processing, after helping people hear he turned his attention to helping people read. His focus was studying the underlying auditory processing required for reading to work: sustaining the co-entrainment of the many mental sub-processes necessary for reading. He recognized that timing is everything, and that any slowdown in the auditory processing that the reading process conscripts and “plays” will cause stutters, if not complete breakdowns, in the continuity of reading. Our point was that the temporally precarious effects of fuzzy auditory discrimination (sound pattern recognition) and processing that he describes also apply to the temporally precarious effects of fuzzy code discrimination (disambiguating letter-sound-spelling relationships) and processing. Further, while the auditory issue exists in the organic world of naturally varying sounds, the ambiguities in the code are completely artificial. He agrees.
Because changing the code – changing the alphabet or spelling – has such intolerable consequences, our conceptions of ‘teaching reading’ have been constrained to accepting the confusion as immutable and, consequently, to paradigms of reading instruction organized around training the brains of readers to deal with it. Phonics and whole language methods are both attempts to compensate for (work around), rather than directly address, the confusing correspondence between letters and sounds. (see also Alphaphon analogy)
See also: Mental Models
See also: JCPS – Bellarmine
Stay tuned… Part 4 of the series will demonstrate why some of the most powerful people and organizations involved in fields related to reading are actively resistant to this way of thinking: it threatens their reputations and income streams (researchers, gurus, institutions, publishers, and non-profits focused on literacy, dyslexia, LD…).