QUICK TOURS

Abstract: a brief essential statement for the impatient or rigorous

Article: a 3 1/2 page outline of our reasoning and proposal

Analogy: why we overlook the role of technology

Story: how the idea developed

ABSTRACT: 

 

The greater the number of ambiguous letter-sounds (and letter-sound combinations) coexistent in a word, the greater the number of iterations of ambiguity reduction required before the word can be virtually heard or spoken. The greater the number of ambiguity-reducing iterations (disambiguations) involved, the longer the reader's attention must stretch to process them. The longer the span of attention required, the greater the vulnerability to miscues in decoding, which cause drop-outs from the decoding-stream flow rate necessary to sustain the flow of reading. The single most significant underlying cause of this chain - ambiguity-overwhelm > stutter > drop-out - is the archaic "technology" we read with: the 'code' itself.

 

A 1000-year-old lack of leadership in managing the relationship between the Latin alphabet and the English spoken language has resulted in a deeply entrenched, convoluted and highly ambiguous ‘code’.  Every attempt to change the alphabet or reform spelling - to render their relationship more simply phonetic - has failed. Phonics and phonemic awareness pedagogies are both attempts to compensate for, not directly address, the ambiguities created by the idiomatic correspondence of these two systems (the code).

 

With modern font technology it is relatively easy to add another dimension of functionality to the concept of a character or letter. Specifically, it is possible to print (paper or screen) letters with shape, size, intensity and spacing variations that, while retaining unambiguous letter recognition features, convey additional information or cues about how the letter sounds in the particular word in which it is encountered.  For a detailed description of the Cues click here.

We are proposing that a small number of alphabet-general letterface variations, acting as phonetic cues, can dramatically reduce the disambiguation overhead involved in learning to read. Our intent is to catalyze if we can, and drive if we must, the development of a new learning-to-read system based on this concept, subsequently integrating it with the best of what remains relevant from phonemic awareness, phonics and whole-language pedagogies and practices.

 


ARTICLE


A New Way of Thinking About Learning to Read
By David Boulton

 

69% of 4th graders read below the proficiency level
60% remain below it in the 12th grade
        National Assessment of Educational Progress 1998 Reading Report Card  

 

42 million adult Americans can't read; 50 million can recognize so few printed words they are limited to a 4th or 5th grade reading level. According to Literacy Volunteers of America, 237 billion dollars a year in unrealized earnings is forfeited by persons who lack basic reading skills.
       
The National Right to Read Foundation    

                                                                                                                       

There is no natural, biological-evolutionary precedent for reading. Spoken language - yes - the ability to discriminate among sounds and associate distinct sounds with distinct meanings has been evolving for millions of years. But nothing about our natural evolutionary development has prepared us to read - to focus our eyes into small static spaces and translate and assemble strings of visual symbols into virtually heard sequences that simulate the sounds of spoken words. Human beings invented reading (and writing), and, it should be added, those who did were far less familiar with how our brains work and how children develop than we are today.

 

Reading is a technology skill that requires the use of two archaic technologies or systems (the 3,000+ year-old alphabet and the 1,000+ year-old system of English spelling) that were developed by adults for adults and were never designed (or since in any way optimized) for use by young developing minds. Moreover, they were never designed to work together; like the proverbial square peg and round hole, we have been 'force fitting' them for over a thousand years. The fact is that most people who struggle to read are suffering from a kind of interface incompatibility with our reading technologies - and that is the fault of the technologies, not of them!

 

HISTORICAL ROOTS

 

Ancient Greek and Latin were almost completely phonetically written…
          Teaching Reading - a History

 

Just as in learning to read, I said, we were satisfied when we knew the letters of the alphabet…
           Plato, the Republic 
       
                                             

The major cause of today's reading problems began taking root nearly a thousand years ago, as the Latin alphabet and the English language collided. The Latin alphabet was nearly phonetic; it had one letter for each sound spoken in the Latin language. But in trying to represent English, the Latin alphabet came up short by over a dozen letters. There were simply more sounds spoken in English than there were letters to represent them in the Latin alphabet.

With religion, politics and academia so entrenched in written Latin, any thought of changing it was virtually inconceivable. Consequently, instead of adding letters to the alphabet, a series of rules developed whereby some letters (but not all) would no longer have just one sound but could have other sounds depending on which of the other letters (and in what sequence) they preceded or followed (most, but not all, of the time). Sound pretty convoluted? The consequence of this 'hack' has haunted us ever since: the phonetic principle was broken, and the relationship between written letters and spoken sounds became complex and confusing.

 

What happened significantly altered the course of human history, and its effects are still felt today by over 700 million people. The result of making up for the shortage of letters was an ambiguous alphabetic 'code' that strained the process of learning to read - a process that had, for over fourteen hundred years, been based on the phonetic simplicity of one letter for one sound. No longer as quickly self-evident, reading now involved determining which of a letter's possible sounds it was supposed to make in the particular word in which it appeared. The stress involved in such decoding has remained deep in the 'overhead' of our reading process ever since.

To make matters worse, further complications followed as the words, spellings and accents of Greek philosophers, French clerks, Danish typesetters and others were added to the system. Now, in addition to idiomatic codes for the missing letters, spellings became incoherent as the various spelling conventions of non-English influences were imposed on the language. With the combination of Luther's Reformation, Gutenberg's printing press and the King James translation of the Bible, the cement began to harden. All these diverse and complicating stresses, heaped upon a writing system that was already inadequate, resulted in a seriously flawed and dysfunctional system.

 

TODAY

The underlying cause of our reading difficulties is that we have rigidly held to an inherited, technologically archaic symbol system (the alphabet) that was developed in an entirely different 'age of the world', for adults, not children, and that was never designed to represent the 44+ sounds of the English spoken language.

As a result, the number and duration of the mental processing iterations necessary to resolve the ambiguity of letter-sound correspondences all too frequently exceed the attention span of beginning readers. The consequences are 'reading stutters' and 'drop outs' in reading flow. The core problem is AMBIGUITY-OVERWHELM, and it is an artifact of the "technology" involved. For some reason - its 'sacredness' or simply its institutional inertia - we have been unable to update the technology to reflect what we know about human neurological processing and to make it friendly to the self-esteem and developing mental processes of our young people.

 

The first casualty is self-esteem: they soon grow ashamed… about half of youths with a history of substance abuse have reading problems.
        National Institute of Child Health and Human Development

 

35% of children with reading disabilities drop out of school, a rate twice that of their classmates. 
50% of juvenile delinquents manifest some kind of learning disability, primarily in the area of reading.
        National Center for Learning Disabilities  

There aren't many parallels to this. Under what other circumstances do people spend years trying to learn something that continually makes them feel bad about themselves as they do it? Most children and adults have very limited patience for repeatedly trying to do something that results in self-esteem-lowering feelings. Yet we must compel people to learn to read; they can't function in our modern world if they can't. However, as things stand, the technology is causing real and significant damage to people's lives (and costing us billions of dollars).

 

Up to this point, absent a new alphabet or a way of spelling phonetically with the one we have, our only course of action has been to facilitate the development of explicit skills and increases in attention span, such that developing readers might be better able to process the ambiguities we can't spare them from experiencing. This has been the role of explicit phonemic awareness exercises and explicit, systematic phonics, both of which are attempts to compensate for, not directly address, the ambiguity created by the archaic alphabet and spelling system.

 

But what if we could, without changing the alphabet or the way English is spelled, present our letters (on paper or screen) with cues embedded in them, or accompanying them, that significantly reduce the letter-sound ambiguity involved in reading?

 

This kind of thinking was impossible until very recently - until computers and modern font technology. Though the moveable type of the printing press was a breakthrough innovation in its day, it restricted us to thinking about printing through a paradigm based on what was and was not possible in a mechanism that used real physical objects to print letters. While moveable type made it relatively easy to set up any number of alternative typefaces, within a typeface it was impractical to offer letterfaces, or optional variations in the way each letter might appear.

 

Today, however, with modern font technology, it is possible and relatively easy to add another dimension to the idea of a character or letter. Specifically, it is possible to print (on paper or screen) letters with shape, size, intensity and spacing variations that, while retaining unambiguous letter-recognition features, allow the presentation of the letter to convey additional information or cues about how it sounds in the particular word in which it is encountered.

The P-CUES Concept: Mind your Ps and Qs - Phonic Cues - P-Cues

 

The intention is to prompt the reader with unambiguous CUES that reduce the number and complexity of the instances of ambiguity encountered during the immediate decoding-stream-flow of the reading process.  

For a detailed description of the Cues click here.
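To make the concept a little more concrete before turning to the software, here is a minimal sketch, in Python, of how the cue information for a single word might be represented. It is an illustration only: the cue categories loosely echo the pencil signals described in the Story section below (letter-name sound, one of the letter's other sounds, silent, blended with the letter before or after), and the class names, presentation parameters and the cued spelling of "cake" are hypothetical placeholders, not the actual Cues described at the link above.

    # Illustrative sketch only -- not the actual P-Cue set described at the link above.
    # The cue categories loosely echo the pencil-cue signals from the Story section.

    from dataclasses import dataclass
    from enum import Enum, auto

    class CueType(Enum):
        LETTER_NAME = auto()   # the letter sounds like its own name (the 'a' in "cake")
        OTHER_SOUND = auto()   # one of the letter's other sounds (the 'a' in "cat")
        SILENT = auto()        # the letter is not sounded (the final 'e' in "cake")
        BLEND_PREV = auto()    # sounds only together with the letter before it
        BLEND_NEXT = auto()    # sounds only together with the letter after it

    @dataclass
    class CuedLetter:
        """One letter of a word plus the presentation variation that cues its sound."""
        letter: str
        cue: CueType
        scale: float = 1.0      # hypothetical size variation (e.g. slightly larger)
        intensity: float = 1.0  # hypothetical intensity variation (e.g. grayed when silent)

    # A hypothetical cued rendering of the word "cake":
    cake = [
        CuedLetter("c", CueType.OTHER_SOUND),
        CuedLetter("a", CueType.LETTER_NAME, scale=1.1),
        CuedLetter("k", CueType.OTHER_SOUND),
        CuedLetter("e", CueType.SILENT, intensity=0.4),
    ]

Which variation maps to which cue - size, shape, intensity or spacing - is exactly the kind of design question the placeholder examples are meant to open, not settle.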

 

The Software and Font Technology Involved

 

Conceptually, the technology involved is relatively straightforward. The first component is the "carrier", or shell, which extends the font family so that it can store alternate presentations for each character in a font. The second component is the "P-Cue presentation dictionary" which, like the spell checker in a standard word processor, scans the words in a document and looks them up in its database. When a word match is found, the P-Cue dictionary reads the character presentation variations (P-Cues) for the letters in the word and substitutes the P-Cued letters into the publication.
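The sketch below, again in Python and again only a sketch, shows how these two components might fit together. The names (CARRIER_PRESENTATIONS, PCUE_DICTIONARY, apply_pcues) and the dictionary entries are assumptions made for illustration; a working version would substitute actual glyph variants drawn from the carrier font rather than the text labels used here.

    # A minimal sketch of the two components described above; all names are hypothetical.
    # Real output would swap in alternate glyphs from the carrier font, not labels.

    import re

    # Component 1: the "carrier" -- for each character, the alternate presentations
    # the extended font family is able to display (placeholder labels).
    CARRIER_PRESENTATIONS = {
        "a": {"default", "letter-name", "other-sound", "silent"},
        "e": {"default", "letter-name", "other-sound", "silent"},
        # ... one entry per character in the font
    }

    # Component 2: the P-Cue presentation dictionary -- for each word, which
    # presentation each of its letters should take (entries are illustrative only).
    PCUE_DICTIONARY = {
        "cake": ["other-sound", "letter-name", "other-sound", "silent"],
        "cat":  ["other-sound", "other-sound", "other-sound"],
    }

    def apply_pcues(text):
        """Scan the text like a spell checker: look up each word and, on a match,
        pair every letter with its cued presentation; otherwise use the default."""
        cued = []
        for word in re.findall(r"[A-Za-z]+", text):
            presentations = PCUE_DICTIONARY.get(word.lower())
            for i, letter in enumerate(word):
                cue = presentations[i] if presentations else "default"
                cued.append((letter, cue))
        return cued

    print(apply_pcues("a cat ate cake"))
    # [('a', 'default'), ('c', 'other-sound'), ('a', 'other-sound'), ('t', 'other-sound'), ...]

Because the dictionary lookup happens at publication time, ordinary spelling is left untouched; only the presentation of the letters changes.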

 

IN CLOSING

 

The examples I have put forth are placeholders. There is significant work ahead to map the territory of letter-sound ambiguities and to determine which metaphors and variations of letter presentation will best serve different types of developing readers. With that said, I believe it is possible to develop a system of variations that cues developing readers in ways that remove many times more 'overhead' from reading than the processing of the cues adds. Based on my preliminary and informal experiments with children, I am confident that, once fully developed as an overall system, this approach will dramatically simplify and speed up the process of learning to read.

 

What I am proposing bridges the phonic and whole language ideologies.  Instead of having to create ‘dumbed down’ reading materials or having to design reading materials around the awkward pedagogical requirements of cryptic decoding, the P-Cue model reduces the ambiguity involved in decoding and allows developing readers to access more meaningful materials faster (extending the ceiling on ‘decodable text’ to a more meaningful and enjoyable level). Finally, it does this without changing the alphabet or English spelling.

 

This is not meant as an alternative to learning other rules of decoding, as it won't eliminate all the ambiguities. Rather, what I am proposing will provide developing readers with the means to quickly filter out a significant portion of what would otherwise be ambiguities, leaving them with a less dissipated attention span to apply whatever rules remain appropriate (arguably new rules, based on an integrated approach that combines this technology with phonemic awareness and phonic instructional pedagogies).

 

I call this 'Training Wheels for Literacy' because the system is not intended to replace our colossal inventory of written materials, but rather to provide developing readers with an 'on-ramp' and the 'training wheels' that enable them to develop better phonemic awareness, phonic skills and greater attention span by making it easier for them to keep themselves from 'falling out' of reading. By enabling them to extend their reading flow, they will learn to associate the P-Cues with the phonemic distinctions available in written word structures and ultimately take the 'wheels off' - stretching into the next step of becoming an empowered reader.

 

In summary, this learning-to-read barrier - its pain, shame and life-disabling consequences, our arguments about methodologies, and the money we spend on efforts intended to compensate for it - stems not from some deficit or lack of natural capacities in our brains, but rather from the change-resistant technology of our 3,000-year-old alphabet and its poor interaction with the (nearly as change-resistant) 1,000-year-old technology of English spelling. For the sake of the children, and in the spirit of plain good science, let's acknowledge the fact and do something about it.

 

Learning to read is a process of acquiring an "inner-interface" between our biologically native all-at-onceness processing and our enculturated mind's one-at-a-time thought processes. Indeed, reading plays a significant role in creating the latter. Taking up this challenge could create a breakthrough in literacy, reduce damage to self-esteem, reduce the waste of billions of dollars and, perhaps, beyond all of that, change the ecology and efficiency of the "inner interface" that regulates our learning and who we are.


AN ANALOGY?

Imagine that a fictional product called AlphaPhon is the world’s leading English language GUI (Graphemic User Interface).  AlphaPhon is the entry-level product of a company (also fictitious) named USASoft. All is not well with USASoft. Market research has revealed that 92 million older AlphaPhon customers, due to their poor use of the product, are suffering major financial losses.

42 million adult Americans can't read; 50 million can recognize so few printed words they are limited to a 4th or 5th grade reading level. According to Literacy Volunteers of America, 237 billion dollars a year in unrealized earnings is forfeited by persons who lack basic reading skills.
               
The National Right to Read Foundation   

Perhaps even more alarming, user-test reports indicate that 60% of the company's new customers are less than proficient with AlphaPhon even after 12 to 13 years of day-in, day-out attempts to learn it: USASoft is at serious risk of losing its future customer base.

69% of 4th graders read below the proficiency level
60% remain below it in the 12th grade
                  National Assessment of Educational Progress 1998 Reading Report Card  

 

The first casualty is self-esteem: they soon grow ashamed… about half of youths with a history of substance abuse have reading problems.
               
National Institute of Child Health and Human Development

 

35% of children with reading disabilities drop out of school, a rate twice that of their classmates.  50% of juvenile delinquents manifest some kind of learning disability, primarily in the area of reading.
               
National Center for Learning Disabilities  

Disturbed by these reports, USASoft undertakes a massive research campaign to discover why its customers are having such difficulty learning to use AlphaPhon. Billions of dollars and thousands of research studies later, a scientific consensus emerges: the customers who have difficulty learning AlphaPhon exhibit a common 'core deficit' in something the researchers call 'alphaphonemic awareness'.

 

Phonemic Awareness: It’s the hottest topic in education.
              
National Adult Literacy and Learning Disabilities Center

 

Converging evidence from all research centers show that deficits in phonemic awareness reflect the core deficit in reading disabilities.
               National Institute of Child Health and Human Development

Moreover, they also lack ‘alphaphonic’ code knowledge and skills.

 

Letter knowledge, which provides the basis for forming connections between the letters in spellings and the sounds in pronunciations, has been identified as a strong predictor of reading success
              National Center to Improve the Tools of Educators

 

Moreover, if the letter-sound code (phonics) is not taught, all reliable studies concur that poor readers and nonreaders will not become fluent readers
             
National Adult Literacy and Learning Disabilities Center
 

Based on this new understanding, USASoft issues orders to all of its distributors to initiate a nationwide training program designed to train the minds of its users in the alphaphonemic awareness and alphaphonic skills required to use AlphaPhon.

 

zyxwvutsrqponmlkjihgfedcba

 

If the analogy worked, it started to sound absurd as USASoft began to act as though AlphaPhon's problems were exclusively in the minds of its users - as if it were inconceivable that anything could be wrong with AlphaPhon, or that, if something was wrong, AlphaPhon could be changed or improved in any way. What's disturbing, of course, is that this is exactly how we have come to think about our reading problems and the role our reading technologies play in creating them. How could USASoft be so blind and negligent about the usability implications of such a human-engineered human-interface product? How could we?

 

STORY

 

In 1989-90, my son Daaron, who had been my learning guide since his birth, began to want to read. Daaron grew up in an environment of rich dialogue and quickly developed a remarkable proficiency with language. By the time he was three he could engage in complex conversations about his own thought process, and he and I were able to travel together throughout our thoughts and feelings.

 

Daaron's kindergarten teacher described him as having 'the vocabulary of a 20-year-old'. He was masterfully, verbally dexterous. From the time he was a month old I read to him. I never read him baby books; I read him whatever I was reading. He picked up the alphabet sounds when he was 2 and, because he loved playing games, he had a good sight vocabulary by the time he was 3. He was never pressured to read before kindergarten. It was his starting school that precipitated our journey into learning to read, and it was then that, for the first time since I learned to read myself, I was drawn into the problems of reading.

 

His game playing had taught him to trust the game he was playing - that if only he learned what he needed to learn, he could win. This had enabled him to stay with his frustrations, because he knew it was possible to learn through them. Daaron became an exceptional learner because he had developed a refined sense of his own meaning needs, and he trusted them to direct him, like a compass, through any learning challenges he encountered.

 

When he began to read he couldn't believe it. He was incredibly frustrated by the lack of coherence in the process and the seemingly arbitrary ways in which reading was so different from what the ABCs had prepared him for. He had no problem with the alphabet letter sounds; he had no problem with understanding the meaning of words. It was the ambiguity in between that broke the flow in his mind. With his ability to articulate his confusion and our combined ability to track the flow of meaning in his mind, we bumped right into the sound-letter correspondence problem. Why was it so confusing? So many rules and exceptions. He asked me, "Why, Dad? Why does it work this way? It just doesn't make sense! How could it be such a mess?" He was sure there must be something simple that, once he understood it, would enable him to read.

 

At the time I couldn't answer the question very well. I explained to him that his struggle wasn't his fault; the problem was in the mess, not in him. It really annoyed us that something as fundamental and basic as reading could be so unsystematic, illogical and inconsiderate of the way our minds functioned most naturally. I explained that there was a very complex set of rules that made the mess make sense to adults who studied it, but that there wasn't any explanation I could give him that would make sense of it to him. Though I couldn't help him understand it, paying careful attention to what he was going through made it obvious how I could help him.

 

I began using a pencil in much the same way an orchestra conductor uses his wand - moving it up and down, in circles, and left and right. We developed a cueing system. When I sensed his flow stumble, I would move the pencil tip just above the letters he was reading and tap a letter when it was supposed to sound like its letter name. I would move the pencil tip in a circle when the letter took one of its other sounds, and down when it was silent. I would move it left when it blended with the letters before it, and right when it blended with the letters after it. He quickly understood these simple signals, and, as my cueing reduced the ambiguity he experienced, we both felt an ease and acceleration in the flow of his reading. It worked so well that soon Daaron was off and reading. I never did explain the complex rules and exceptions; I just reduced the ambiguity overhead of the process, and he learned his way through.

 

As my work was about learning, and my concern was how our insidious curricula damage our core capacities for learning, this reading issue really troubled me. Realizing that the key to real change lay in the direction of providing paper- and screen-based disambiguation cues to developing readers, I wrote the first article, Training Wheels for Literacy, in 1991.

 

For many reasons I never actively pursued or pushed the project - Daaron was no longer struggling, and my work went back to its earlier focus. But in 1999 my daughter Deanna began to learn to read, and once again I was drawn into the frustrating reality of a child's struggle with this process. Deanna's strengths are different from Daaron's. Like him, she is emotionally and somatically smart and conceptually dexterous. However, unlike him, she had a slight but noticeable auditory-memory-processing problem. In reading, this manifested itself as an inability to remember a word she had just decoded; she had to re-decode the same word she had decoded only a sentence or two earlier. Once again the spotlight was on decoding - or, more accurately stated, on overcoming the ambiguity. I began to use the 'wand' again, and also to write the words she was struggling with on the whiteboard, varying the way the letters looked to act as cues.

 

This time, because the process lasted so much longer, I experienced a side effect that I hadn't encountered with Daaron. Even though I helped her understand that the problem wasn't her fault - that the messy code made it hard on her - she couldn't help but ask, "Why can't I read as well as everyone else?" She began to doubt her own mind; her struggle with reading made her feel bad about herself, feel shame about her own mind. To avoid these negative feelings she began to comfort herself by saying, 'I am just not a good reader.' This time, as I felt such compassion for her struggle, I vowed to give the Training Wheels concept a full exploration... and that led to the work of this site.

 

 

 

©Copyright 2001 - 2003: Training Wheels for Literacy & Implicity