A Novel Idea: The History of the Science of Reading

2: Noah Webster Loves Phonics

Iowa Reading Research Center Season 1 Episode 2

Hear ye, hear ye! In this episode, we travel all the way back to the 17th and 18th centuries to examine the Enlightenment-era roots of reading instruction as we know it. We will introduce several major players in the early days of literacy education, and examine the ways in which their actions have influenced today’s conversations surrounding literacy. Plus, hear from Natalie Wexler (Twitter: @natwexler), author of The Knowledge Gap, and Kate Will of the Iowa Reading Research Center.


Learn More:

In addition to the philosophical and pedagogical changes discussed in this episode, the concept of literacy education was also a subject of political discourse during these years. These resources provide more information on this angle of American literacy learning, which is not covered in the episode, and are excellent starting points to learn more.


Confronting Anti-Black Racism: Education, Harvard Library

library.harvard.edu/confronting-anti-black-racism/education 

How literacy became a powerful weapon in the fight to end slavery, Colette Coleman, HISTORY

www.history.com/news/nat-turner-rebellion-literacy-slavery


Episode transcript and sources

irrc.education.uiowa.edu/transcript-and-sources-novel-idea-episode-2

A Novel Idea website:

irrc.education.uiowa.edu/resources/novel-idea-podcast

The 17th and 18th centuries marked the beginning of a period called the “Age of Enlightenment.” Coming out of the Renaissance era, people were really focused on things like philosophy, the arts, and social reform. Also, more people in Europe were learning to read than ever before. Some scholars estimate that literacy rates among free White men increased by as much as 30% during this period. This was also a time in which schools began to gain a little more structure and consistency, thanks to efforts by both religious and secular advocates of early education. Of course, most children still weren’t getting a full formal education, but things were slowly improving.

To an extent, we can actually trace the roots of the contemporary science of reading movement all the way back here, to the 17th century, when education in the Western world first started to gain a semblance of formal structure. At this time, most children who learned how to read would have done so via something called the “alphabet method.” First, students would learn the alphabet and the sounds of each letter. Next, they would learn short nonsense words and letter combinations, such as “ab,” “ip,” or “glo.” These combinations would start with two letters, then grow to three letters, four letters, and so on. In a way, this method of teaching would have looked something like a rudimentary form of phonics instruction, with an emphasis on teaching individual letter sounds and blending. 

From the Iowa Reading Research Center, I’m Meg Mechelke, and this is A Novel Idea: The History of the Science of Reading Movement. Today we’re going to take an in-depth look at Enlightenment-era educational reform and its lasting impacts on contemporary conversations around the science of reading.

With the type of explicit phonics instruction supported by science of reading advocates, students are directly taught the rules of language. For example, students are taught to associate specific sounds with specific letters or groups of letters — “b” makes the /b/ sound, “d” makes the /d/ sound, and so on. They are also taught phonemic awareness, or the ability to identify and manipulate the different sounds, or phonemes, present in spoken words. Today, many literacy researchers believe that effective reading instruction must include explicit instruction in multiple concepts, including phonics and phonemic awareness. For example, consider the following excerpt from a 2016 post on the Iowa Reading Research Center blog by Dr. Deborah K. Reed, director of the Tennessee Reading Research Center:

“The goal of reading is to make meaning from print, but a reader first has to know what the printed symbols represent before meaning can be assigned to them. By directly teaching phonics skills, it becomes possible for students to figure out the printed words they need to make meaning.” (Reed, 2016)

The idea that letters or combinations of letters can each be associated with a spoken sound, and the idea that these sound-letter combinations must be understood in order for reading to occur, is actually a very old idea. 

John Hart (voice actor portrayal): The teacher ought first to knowe the names of the figures portrayed following for demonstration and to name the Carecters or letters written ouer the sayde portraytures with the first sound or breath, or breath and sound togither, of the names of every of them. Wherefore to make them certaine howe to sounde them, the accustomed name of eche thing is written thereunder, as they are called in the Court, and Lŏdon speaces, where the generall flower of all English countrie speaches, are chosen and used. (Hart, 1570)


That’s an excerpt from a book written by English educator and grammarian John Hart in 1570. Hart is often credited for introducing what is now known as the “alphabetic principle,” or the idea that specific letters correspond to specific sounds. Hart follows this excerpt by sorting the letters of the alphabet into their respective “types” including vowels, voiced consonants, and unvoiced consonants. He then includes a series of charts depicting common letter pairings and describing their pronunciations.

John Hart (voice actor portrayal): Vowels doe come togither and shew in speech their severall soundes, often two togither, and seldom three. When two come togither, we [use] for it the Greeke word dipthong, which signifieth a double sounde: whereof the first is commonly short and the latter long. (Hart, 1570)


Hart’s method of teaching reading caught on quickly. In the 17th century, it was often supported by the use of devices called “hornbooks.” These were made by attaching a single sheet of paper to a small handheld board. The paper would include both the alphabet and a short bit of text—usually a prayer or Bible verse. They were called hornbooks because the paper was often protected by a thin, translucent layer of animal horn. This early protective laminate was allegedly made by soaking the horns in water for a few weeks, then peeling the thin layer away, boiling it, and pressing it. Many early hornbooks included a 24-letter alphabet instead of the 26 letters you and I are accustomed to. [The letters] “i” and “j” and “u” and “v” were sometimes used interchangeably during the early 18th century! Kids would get hornbooks as gifts, and then parents or teachers could use them as a tool for teaching young readers the alphabet and other basic phonetic skills. Hornbooks varied widely in quality. The board could be made of a range of materials, from cheap scrap wood and leather to luxuries like silver and polished bone. There is even record of edible hornbooks made from things like gingerbread being given out on special occasions. For example, check out this poem written by English poet Matthew Prior:

Voice (reads): “To Master John, the English Maid
A Horn Book gives of Ginger-bread:
And that the Child may learn the better,
As he can name, he eats the Letter;
Proceeding thus with Vast Delight,
He spells and gnaws from Left to Right.” (Prior, 1718)


In this poem, the child, Master John, is practicing naming the letters of the alphabet, which is an important pre-phonics skill that is still taught to children today.

In the 1750s, hornbooks started to go out of fashion. Soon, they were replaced with objects called “battledores.” Battledores were made of stiff sheets of paper that could be folded into pocket-sized booklets that kids could carry around with them. These booklets usually included an illustrated alphabet and a short Bible story that kids could practice reading.

At the same time, spelling books, or “primers,” were emerging as another method of phonics instruction. These books included the alphabet, lists of vowels and consonants, and pages of charts that laid out common English letter pairings. These charts would have looked somewhat similar to sound walls and phoneme charts that are now found in elementary classrooms across the country. Children could use them to learn things like “p-h” makes the /f/ sound or “s-h” makes the /sh/ sound, and to gain exposure to some basic consonant-vowel pairings.

Learning to break words into smaller parts is a vital component of learning to read. Every word is made up of both graphemes and phonemes. Graphemes are the letters or letter pairs that make up a word, and phonemes are the corresponding sounds. For example, the word “spoon” has four graphemes: “s,” “p,” “oo,” and “n.” It also has four phonemes: /s/, /p/, /oo/, and /n/. Spelling books helped 17th- and 18th-century readers learn to understand phoneme-grapheme pairings.

In 1783, the century’s most popular primer, The American Spelling Book, was published by a famously patriotic American lexicographer. Can you guess who it was? I’ll give you a hint. This scholar would later go on to create an even more iconic work: An American Dictionary of the English Language. That’s right! Noah Webster himself was an outspoken advocate for phonics instruction. Listen to this excerpt from the 1824 reprinting of The American Spelling Book:

Noah Webster (voice actor portrayal): Articulate sounds are those which are formed by the human voice, in pronouncing letters, syllables and words, and constitute the spoken language, which is addressed to the ear. Letters are the marks of sound, and the first elements of written language, which is presented to the eye. (Webster, 1824) 


Noah Webster believed that language should be broken down into its component parts and taught in a clear sequence. He thought students must learn to master each individual component of language before moving on to the next. This means that students would first learn to name the letters of the alphabet and then to associate those letters with sounds. Next, students could learn to read syllables, simple words, sentences, and so on. In his biography of Webster, Dr. Joseph Ellis, a Pulitzer Prize-winning American historian, states:

“The center of [Webster’s speller], and the core of its appeal, was the elegantly simple and orderly presentation of words and the rules specifying how they were to be spelled and pronounced. As a teacher, Webster had learned from experience that his pupils acquired knowledge most readily when he broke a complex problem into its component parts and then made certain that each pupil had mastered one part before proceeding.” (Ellis, 2002)

Webster’s idea of breaking large concepts into smaller parts and teaching them in a logical sequence is still common practice today. In fact, it’s actually one of the core concepts of something we call “structured literacy.”

The International Dyslexia Association defines structured literacy as an approach to teaching characterized by “systematic, explicit instruction that integrates listening, speaking, reading, and writing.” Structured literacy also includes the direct teaching of language features such as phonology, orthography, syntax, morphology, and semantics. Basically, this means that you take all the separate parts that make up language – letters, sounds, grammar, etc. – and then you make sure to teach each of them explicitly and in a logical order. The implementation of structured literacy can have significant benefits for all students, and the International Dyslexia Association describes it as vital for the success of readers with dyslexia and other reading disabilities. Many educators refer to this approach to teaching as “the science of reading put into practice,” as the concepts of structured literacy have a strong foundation in reading research. Consider this excerpt from a 2019 post on the Iowa Reading Research Center blog by Assistant Director for Education and Outreach Nina Lorimor-Easley and the aforementioned Deborah Reed:

“Structured Literacy instruction is built around a scope and sequence. This may vary somewhat among curricula, but the scope and sequence always dictate the order in which educational concepts and content are taught. Adhering to the instructional sequence encourages skill mastery, minimizes confusion and incorrect attempts, and gradually builds the complexity of students’ knowledge and skills.” (Lorimor-Easley & Reed, 2019)


This corresponds directly with the kind of literacy instruction that Webster favored. For example, the contemporary concept of “scope and sequence” is almost identical to Webster’s insistence that students be taught language in an explicit, step-by-step fashion.

By the end of the 19th century, The American Spelling Book had sold around 100 million copies. It influenced several generations of young American readers. The “blue-backed speller,” as it was commonly called, was found in homes and classrooms across the country. Even American icon Benjamin Franklin allegedly taught his granddaughter to read using “Old Blue Back.” But why was this speller so popular? Why was what we now know as phonics so dominant as an early form of literacy instruction? Well, it seems to me as though this question has two answers.

The first has to do with the fact that unlike speech, reading and writing are not natural human abilities. Neuroscientific evidence has repeatedly shown that the majority of people need explicit instruction to become proficient readers and writers, while most people are able to pick up oral language just by listening to others around them. This idea is also supported by the historical record. By most estimates, the earliest form of human writing appeared in Mesopotamia around 5,500 years ago, which is relatively recent when you consider the entire span of human development. And that early writing was pretty simple; it was really just a few pictorial signs that represented words with literal images. In contrast, oral language seems to have developed much earlier, coming about as a natural product of human evolution rather than invention. We also have several well-documented historical and contemporary examples of spoken languages with no written equivalent. Taken together, this historical and neuroscientific evidence makes clear that reading is truly not a natural process. This is one reason why systematic and explicit instruction in the rules of written language is so important.

Secondly, when it comes to recognizing the value of explicit language instruction, it is helpful to understand more about the characteristics of the English language itself.

Kate Will: My name is Kate Will, and I am the literacy research and program coordinator at Iowa Reading Research Center.


Will is a colleague of mine here at the IRRC. She has a master’s degree in linguistics.

Kate Will: One way that we can group languages is by looking at the relationship that the spoken language has to its writing system. And so three general categories of writing systems are alphabetic, syllabic, and logographic.


According to Will, in logographic languages, such as Chinese, written characters represent whole units of meaning, such as words or morphemes. In syllabic languages, such as Cherokee, written characters represent syllables. In alphabetic languages, languages like English, Spanish, and Italian, written characters usually represent individual sounds, or phonemes. While very few languages fit perfectly into one category or another, these language types are useful in understanding how our brains make meaning out of written words.

Alphabetic languages like English can be broken down into their individual graphemes and sounded out. For this reason, proficient users of these languages must understand the sounds, or phonemes, associated with each individual letter or letter pair. That’s why it's so important for early readers to develop a thorough understanding of phonics. It’s what allows us to see the letter “k” and know that it makes the /k/ sound. However, not all alphabetic languages are created equal.

Kate Will: Within alphabetic writing systems, there is a continuum from shallow, or transparent, orthography, to deep, or opaque, orthography.


The word “orthography” refers to the conventional spelling system of a written language. 

Kate Will: A language like Spanish has a relatively shallow orthography, which means there is close to a one to one relationship between the graphemes, or letters, and the sounds they represent, which are called phonemes.


For example, vowels in Spanish have relatively consistent pronunciations. The letter “a” almost always makes the /ah/ sound, the letter “e” almost always makes the /eh/ sound, and so on.

According to Will, English is on the other side of the continuum, with a relatively deep orthography.

Kate Will: Some graphemes can represent many different sounds. So, the letter A could sound like uh in about, aa in cat, ah in father, and ay in plate. And so this makes it hard to read. Similarly, the same sound can map onto two or more different graphemes. So, the /duh/ sound in metal and medal, or med-al and met-al, is the same unless I intentionally try to distinguish them like I did right there, but they’re represented with different graphemes. So, this overlap makes spelling difficult.


If English has such a complex and deep orthography, can explicit phonics instruction really help kids learn to read and write? Will says that it can.

Kate Will: I think that explicit phonics instruction is necessary in English specifically because it has a deep orthography. So, the less transparent that the grapheme to phoneme relationship is in a language, the more explicit instruction children will need in order to uncover those patterns.


For example, consider the words “jam,” j-a-m, and “jar,” j-a-r. Notice how the letter “a” makes two different sounds? 

Voice (sounds out): /j/ /a/ /m/
Voice (sounds out): /j/ /ar/


This may seem confusing, and even random, but there’s actually a simple phonics rule that students can be taught to help them understand this pronunciation difference. The /ar/ sound in “jar” is something we call an “r-controlled syllable,” which is sometimes taught to students using the catchphrase “Bossy R.” Most students who have received explicit phonics and syllable type instruction will be familiar with this concept.

With explicit instruction in syllable types and spelling rules, English’s deep orthography is made more transparent and accessible, giving students the tools they need to understand how and why the language works and therefore allowing them to decode words more fluently and accurately.

At this point you may be wondering, why is English so complex? Why can’t we have an orthographically shallow language like Spanish or Italian? Well, as far as I can tell, it’s a long story, involving the Norman Conquest, the difficulties of spelling Old English words with Latin characters, and something called “the Great Vowel Shift.” But that’s for another podcast.

Unfortunately, clever spelling rules like “Bossy R” didn’t really come into vogue until the last few decades. Early American readers probably didn’t have access to these kinds of mnemonic tools to help them navigate the complex orthography of the English language. What they did have, though, were hornbooks, battledores, and good old blue back!

However, when it came to literacy instruction, even during the Enlightenment Era, not everyone agreed that phonics was the way to go.

In 1762, the influential Genevan philosopher Jean-Jacques Rousseau published a book called Emile, which laid out his preferred method for educating children. The premise of Rousseau's argument was that education should focus on nurturing children’s innate interests, not on forcing them to sit through long, tedious lessons.

Natalie Wexler: Rousseau has, I'd say indirectly at this point, had a significant influence on the way, not just American educators think about education, but also in other mainly English-speaking countries.


This is Natalie Wexler, an American education writer whose book, The Knowledge Gap, analyzes the current literacy achievement gap in the context of its many historical roots. 

Natalie Wexler: Emile was this fictional young French aristocrat that Rousseau fictionally was the tutor for, and he just used very sort of indirect methods of education, basically letting Emile direct his own education to a large extent, and letting him learn things naturally through inquiry, and not even teaching him how to read, really, unless he asked to be taught. So, this was fiction, but it was taken as a model, really, for how education should proceed. 


Rousseau believed that it was much more important for children to enjoy learning than to be given direct, step-by-step instruction. A lot of Rousseau's ideas can be traced back to the earlier writings of a Czech philosopher and educator called John Amos Comenius. Comenius’s writings are important to note, as they would go on to influence whole language advocates such as Ken and Yetta Goodman. We will return to the Goodmans in a later episode, when we talk more in depth about whole language, a literacy teaching philosophy that does not typically include explicit instruction in foundational skills like phonics and is generally viewed as antithetical to the science of reading.

By the end of the century, Comenius and Rousseau's ideas on teaching were beginning to echo across the European continent. In 1791, a German teacher named Friedrich Gedike published an educational primer that was pretty explicit in its advocacy against phonics instruction. Gedike suggested that teaching children to break words into sounds was both unnatural and unhelpful, as he believed that these sounds would be completely meaningless to a child. Instead, Gedike advocated for an early form of what we now call the “whole-word” approach to reading. According to collector and Germanist David Paisey, Gedike’s primer was never particularly well-circulated. However, other scholars had similar ideas to Gedike’s, and soon the whole-word approach began to spread across the European continent.

Advocates of the whole-word approach believed that the best way for children to learn to read was for them to see whole words in context. The idea was that the whole sound of the word should be associated with the word’s whole visual appearance. Because of this, sight words were given an outsized emphasis. For example, a student reading the sentence “Jane pets the cat,” would not be taught to break the word “cat” or any of the words in this sentence into their respective phonemes and graphemes. Instead, students would be asked to memorize each whole word as one symbol that they could recognize instantly. In whole-word learning, students are rarely or never explicitly taught the sounds and letters that make up words. Rather, they are expected to develop an understanding of these rules and patterns through memorization and recitation. This was considered more natural and less restrictive than directly teaching phonetic rules. 

Of course, students were not generally able to learn every English word through memorization alone. That would be an impossible task. As a result, many students began to rely on context clues, such as accompanying pictures, to figure out what a text said.

By the 1800s, literacy rates were on the rise across the ocean in the newly formed United States. In the mid-19th century, an American named Horace Mann visited Prussia, in what is now Germany. On his trip, Mann was thoroughly impressed by the state of the German education system and the Germans’ use of whole-word language learning. When he returned to the United States, Mann brought these ideas with him and began to argue that American children should also be taught to read using the whole-word approach, rather than with phonics. 

Natalie Wexler: If you go back to the mid 18th century, Horace Mann, who was a pioneering education reformer, he also has some, I can't quote this off the top of my head, but he has some passage that's quoted about how these “skeleton-like shapes of letters will haunt these poor children.”


This again from education writer Natalie Wexler.

Natalie Wexler: So I think there was also this idea that “ohh, it's just incredibly boring and harmful to actually sit there and teach kids, this is the sound that the letter ‘a’ makes, and here is the letter ‘a.’” You know, that was just seen as a bad thing to do.


Due to the influence of people like Mann and his successors, classrooms across the United States began to transition to a style of literacy instruction in which meaning-making was held as the most important part of early reading. All that matters to proponents of this type of pedagogy is that students can figure out the meaning of a text, regardless of how they come to that conclusion. This means that a student who can understand the basic plot of a picture book is considered a proficient reader, even if they cannot actually read or pronounce the majority of the words in the text. 

Most science of reading advocates do believe that meaning-making is the ultimate goal of reading. However, they also believe that the only way students can reliably and accurately derive meaning from a text is through explicit instruction in phonics, phonemic awareness, vocabulary building, background knowledge, and other fundamental literacy skills. In order to understand what a text says, students must first have the decoding skills to read the words it includes. And according to the National Institutes of Health, the vast majority of readers require explicit support and instruction to develop these skills.

The problem is, when meaning is emphasized over the ability to read individual words, instruction in foundational literacy skills often becomes secondary or even nonexistent. For example, Francis Parker was one of Horace Mann’s most well-known successors. He began teaching at the age of 16, left to fight in the Civil War, where he rose to the rank of lieutenant colonel, and then returned to teaching. As an educator, Parker required his students to memorize over 200 sight words before exposing them to any explanation of the sounds that individual letters make. Thus, instead of sounding words out or breaking them into smaller parts, his students had to rely on visual memory and context clues to derive meaning. Parker went on to become the superintendent of a school district in Massachusetts, where he upended the traditional curriculum by eliminating the use of spellers, readers, and grammar textbooks entirely. Instead of learning the alphabet, children jumped right into learning whole words and sentences.

Many early whole-word advocates had backgrounds in philosophical and psychological movements that they used to justify their educational beliefs. 

For example, Gestaltism is a school of psychological thought that formally originated in the early 20th century. In its most basic form, Gestaltism can be summarized with the phrase “the whole is more than the sum of its parts.” Gestaltists tend to believe that human beings understand the things around them as unified wholes, rather than component parts. And the Gestaltist perspective isn’t completely inaccurate. In fact, we see a lot of Gestalt principles in advertising today. Do a web search for “gestaltism and logos” to see some examples. However, one thing this principle does not apply to is written language. Though early Gestaltists quickly latched onto the idea that children read all words as unified symbols rather than breaking them down, current research suggests that this is largely untrue.

Another influential education reformer inspired by the work of Horace Mann was John Dewey. An academic philosopher, teacher, and professor, Dewey used his own pragmatist philosophy to advocate for the whole-word approach to language. Pragmatists tend to understand all activities as inseparable from their function. In terms of literacy, this would mean that reading has no innate or absolute significance if it is not used to do something else, such as gaining meaning. For this reason, many pragmatists were outspoken advocates of meaning-first approaches to literacy instruction. This outlook has had a significant impact on the course of reading instruction, and it prompts a lot of questions that are still relevant to today’s debates around the topic. Is it reading if a student can figure out how to decode and pronounce nonsense words, even though they can’t construct a meaning from them? Alternatively, is it reading if a student can guess at the content of a sentence well enough to figure out what it says, even if they don’t know all of the individual words? To the latter question, most science of reading advocates would say no. However, as we will see in future episodes, there are others who would disagree. 

Additionally, to many educators and philosophers, the whole-word approach to literacy instruction represented a sort of progressivist freedom. Many prominent whole-word advocates identified as humanists, as do a fair number of outspoken phonics detractors today. To 19th-century humanists, whole-word reading instruction meant an emphasis on the development of the whole child, with a focus on skills such as independence and self-discovery, and an emphasis on the importance of “classic” literature. No longer would students be bound to the old-fashioned rigors of routine phonics instruction. Instead, students would be encouraged to learn on their own and grow at their own pace. This is not inherently a bad idea, and it would be unfair to assume poor intentions on the part of all early whole-word advocates. In fact, many of these people were educational reformers who instituted changes that would make school as a whole safer and more accessible for students across the country. And philosophically speaking, they were on the right track. It is beneficial for students to develop the ability to make their own discoveries and to enjoy learning for learning’s sake. The problem is, as contemporary research has shown, without the support of explicit and systematic instruction in foundational literacy skills, not all students will be equipped with the tools they need to make these discoveries. Despite the utopian philosophy of the whole-word movement, it simply lacks a basis in empirical evidence, and it has ultimately been disproven as a reliable method of reading instruction.

Nonetheless, as the years progressed, the divide between explicit phonics instruction and other approaches to reading instruction – primarily the meaning-first and whole-word approaches – began to grow more contentious. Throughout the century, curriculum creators actually attempted to straddle this gap, playing to both sides in order to sell their products. For example, in 1836, the first McGuffey Readers were published. This wildly popular series of instructional books, inspired by Webster’s work in the 1780s, included short stories, as well as sight words and some phonics instruction. 

The McGuffey Eclectic Readers were incredibly popular throughout the 19th century. The California Museum of Teaching and Learning estimates that the series sold over 120 million copies between 1836 and 1960. The readers have even made their way into popular culture. For example, they have appeared in various episodes of the TV series Little House on the Prairie, which also had this to say about the privilege of literacy learning:

Laura (voice actor portrayal): Oh Pa, do I have to go to school?
Pa (voice actor portrayal): It isn’t everybody that gets a chance to learn to read and write and cipher. Be thankful you’ve got the chance, Laura. (Wilder, 1953)


Early editions of the McGuffey Readers featured a note in the introduction defining both phonics and whole-word instruction and claiming that the readers could be used with either approach. However, despite their efforts to appeal to both phonics and whole-word advocates, the publishers of the McGuffey Readers failed to remain entirely neutral. Listen to this excerpt from McGuffey’s 1885 teacher manual:

Voice (reads): While McGuffey's Readers are prepared to meet the demands of each of these recognized methods, they are especially adapted to the Phonic Method and to the Combined Word and Phonic Method, which are the two methods most extensively used by successful teachers of primary reading. (McGuffey, 1885)


Here, the McGuffey Readers’ publishers take a clearly pro-phonics stance, which is interesting, as the mid-1800s marked the beginning of a steady decline in this type of instruction in American classrooms. By the turn of the century, the whole-word approach dominated classrooms across the country.

In 1917, American psychologist Edward Thorndike published an influential article in the Journal of Educational Psychology arguing strongly in favor of meaning-first and whole-word instruction. In this piece, titled “Reading as Reasoning: A Study of Mistakes in Paragraph Reading,” Thorndike claims that the correct way to read is to see a word and instinctively attach that word to its proper meaning. So basically, intelligent readers would see a word like “teacher” and just instinctively know what that meant, without any need for phonetic intervention. In fact, nowhere in Thorndike’s article does he write about phonics, decoding, or any other foundational literacy skills. From his perspective, all that matters is that students have the ability to draw conclusions from the text, regardless of how they get there.

Consider the following paragraph from Thorndike’s 1917 article:

“In correct reading (1) each word produces a correct meaning, (2) each such element of meaning is given a correct weight in comparison with the others, and (3) the resulting ideas are examined and validated to make sure that they satisfy the mental set or adjustment or purpose for whose sake the reading was done.” (Thorndike, 1917)


Many of Thorndike’s beliefs about meaning-first literacy persist in reading education today. Even now, there are some who believe that phonics is simplistic, unnecessary, and boring for children. They say children should be encouraged instead to use reasoning skills to figure out reading on their own. However, science of reading advocates see a major flaw in this kind of pedagogy. There are hundreds of thousands of words in the English language; it would be impossible for students to memorize even a fraction of these words as whole symbols. Because meaning-first and whole-word instruction often skip explicit phonics instruction, students must develop the ability to recognize patterns and break down words on their own if they hope to be successful in reading more complex texts. While this may work for some students, many others will be left behind.

In fact, Thorndike’s reading research has received sharp contemporary criticism. In a 1971 critique of the article, literacy researcher Dr. Wayne Otto points out that Thorndike’s study has been impossible to replicate and that its concluding data are riddled with flaws. Nonetheless, Thorndike’s work has had a lasting influence on the discourse surrounding literacy education in America.

Throughout the start of the next century, psychologists, educators, and even children’s book authors began to introduce instructional ideas that would change the course of literacy education for the foreseeable future. Learn more on our next episode of A Novel Idea.

A Novel Idea is a podcast from The Iowa Reading Research Center at the University of Iowa. It’s written, produced, and mixed by me, Meg Mechelke. Editing by Sean Thompson, and expert review by Nina-Lorimor Easley and Lindsay Seydel, with additional review provided by Grace Cacini, Natalie Schloss, and Olivia Tonelli. Fact checking by Maya Wald. Additional voiceover work from Kathleen Guerrero and Colin Payan.

For further credits, including audio and music attribution, please see the link in the show notes. 

Visit us online at irrc.education.uiowa.edu to find more episodes and additional literacy resources for educators and families. Again, that’s irrc.education.uiowa.edu. You can also follow us on Twitter at @IAReading. 

If you want to help spread the word about A Novel Idea, subscribe, rate, and leave us a review wherever you get your podcasts. Institutional support for this podcast comes from the University of Iowa College of Education and the Iowa Department of Education.