Feedback on Phonic Check

Moderators: Debbie Hepplewhite, maizie, Lesley Drake, Susan Godsland

Debbie Hepplewhite
Administrator
Posts: 3654
Joined: Mon Aug 29, 2005 4:13 pm
Location: Berkshire
Contact:

Re: Feedback on Phonic Check

Post by Debbie Hepplewhite » Fri Oct 18, 2013 9:14 am

It depends on how you define the word 'reading'.

My experience of children depending on the pictures and context to help them to 'get through the book' is that they invariably guessed the wrong word and they certainly wouldn't study the actual printed word - rather they would look at the picture and guess the first thing that sprang to their minds.

I use the words 'get through the book' deliberately, as that is what would happen without a good ability to decode the words - a process of trying to get through book after book.

During this usually painful process, some children might be able to get through the books more intelligently than others.

Book experience and superior oral vocabularies would help in this endeavour.

Systematic synthetic phonics revolutionises the capacity of really very young children not only to read books in a way which is focused on the printed words themselves (and lifting the words off the page underpins comprehension) - but also enables children to write very competently from an early age.

Children with a good experience of the systematic synthetic phonics teaching principles bear no resemblance to the multi-cueing and emergent readers and writers of days gone by (and currently in mixed methods settings).

Yes, I have seen numerous children read with apparent fluency and expression from a mixed methods experience - but with a marked tendency to guess whole swathes of the little in-between words (which actually are often the extremely simple decodable words), with a marked tendency to neglect whether a word was negative or plural, with a marked tendency NOT to check for true accuracy as they would make the text make sense to them from their capacity to read with apparent 'fluency and expression'.

Further, such highly articulate children would score well on reading comprehension exercises and more or less give an accurate description of the book - thereby to all intents and purposes their meaning-making had not been impeded (although in reality they were describing the gist whilst endangering their accuracy).

But, such precocious children often displayed significant mismatches between their reading comprehension levels and their capacity to write and record their ideas. Spelling would be so laboured and haphazard that you might not think it was the same children.

Then, you could say: but what happens when such children are taught with a rigorous systematic synthetic phonics approach and ALSO with mixed methods multi-cueing?

I suggest that such children fit the very profile that many teachers are currently up in arms about - the children who did not score so well reading the non-words of the Year One Phonics Screening Check.

They are the children used to taking an intelligent stab at words in their reading material. They do not necessarily pay attention to the minutiae of the code in the written words.

They do not necessarily study the printed word well enough; they don't want to impede their apparent 'flow' when reading their reading books.

But should not such children be able to see and read words such as 'varp' and 'blimp' as readily as any other reader?

I suggest that the so-called 'better readers' described by teachers - the ones who made more errors on the non-words than peers who were not apparently such good readers - are possibly, if not probably, showing the profile of children who are in mixed methods classrooms to this day.

Yes, they may well have benefited from the promotion and practice of systematic synthetic phonics compared to the past lack of such teaching - but are they benefiting from the multi-cueing reading strategies used alongside it, or being impeded by the tendency to take intelligent stabs at the printed words?

Intelligent stabs at words can work only partially, and only if children have the words concerned in their oral vocabularies; they cannot work in the long term with more advanced texts, when pictures and obvious themes disappear.

chew8
Posts: 4171
Joined: Sun Nov 02, 2003 6:26 pm

Re: Feedback on Phonic Check

Post by chew8 » Fri Oct 18, 2013 9:24 am

Toots wrote:So, for poor decoders, the slowing effect of having to segment and decode each word laboriously means they cannot grasp the meaning well enough to use context. While segmenting and blending is their only route to meaning it does not serve the purpose well in this situation. They have to reach a stage of fluency, or read books that are within their 'stage of fluency'. Then context can be used because the meaning within the context is accessed.
Just a quick response as I'm about to go out: I think Stanovich would agree that once reading is reasonably fluent, context can be used to refine comprehension, but that at this point decoding is likely to be good enough to kick in for word-identification before context has a chance. I have a vague memory that he (or someone else) has made this point somewhere, but I might have to re-read hundreds of pages to find it. Does anyone else remember this? The Stroop experiments may be relevant.

Even my grandson, who is not yet 4, now seems to have an irrepressible urge to decode letter-strings that he sees. He was sitting in our car the other day and saying something like /mgum/, /mgum/, /mgum/ - it turned out that he was looking at the back of the tax-disc holder, which has the initials MGM (the dealer's initials) on it.

Jenny C.

kenm
Posts: 1495
Joined: Sat Dec 17, 2005 5:19 pm
Location: Berkshire

Re: Feedback on Phonic Check

Post by kenm » Fri Oct 18, 2013 10:21 am

Beginning students of general relativity or quantum physics sometimes ask why these subjects are so unintuitive. The usual answer from anyone who combines knowledge of evolution with that of epistemology is that our brain has evolved over many millions of years to deal efficiently with objects and events on a length scale ranging from about a millimetre to a few tens of kilometres and a time scale from about a tenth of a second to at most a few thousand years. Quantum physics is needed when the events under investigation take place on time scales some 17 orders of magnitude shorter than that (less than one millionth of a millionth of a millionth of a second).
Toots wrote:I think I mainly differ with you about the fluent reading of text, which has to depend on an automatic decoding of words. Decoding in this sense is not segmenting and blending, it involves looking at the word and knowing it, and sometimes (see Rayner's eye movement studies) it involves not looking at the word and knowing it. To go back to the Ehri article, she clearly sees this instant decoding as fully-developed reading skill. Being able to segment and blend each word is not fully-developed reading skill, but is an effective route to learning words.
If you think that any of the processes of visual perception are instantaneous, you are bound to get a wrong answer. I translate the word "automatic" into "mysterious" and "instant" into "so quick that I give up trying". The overall impression I derive from this paragraph is "Toots is applying her macro thinking to a micro process, so no wonder she doesn't make sense and can't understand my questions."
Toots wrote:How dismissive you are of the expertise of teachers. You must have swallowed some line on that one.
I have not met many primary teachers. Most of them have been conscientious and reasonably intelligent, but variable in width and depth of knowledge. Many of the minority who show understanding of the reading process have acquired it through much study and thought, in order to overcome an initial training that set them off in the wrong direction. The announcements by their unions so gleefully quoted by the press are usually obviously idiotic or, at best, based on misunderstandings. My impression of teachers is influenced by the thought that these statements come from representatives of the profession.
Last edited by kenm on Fri Oct 18, 2013 2:00 pm, edited 1 time in total.
"... the innovator has as enemies all those who have done well under the old regime, and only lukewarm allies among those who may do well under the new." Niccolò Machiavelli, "The Prince", Chapter 6

Toots

Re: Feedback on Phonic Check

Post by Toots » Fri Oct 18, 2013 11:53 am

Ken, are you saying that skilled readers use segmenting and blending to read, but they use it so speedily that they themselves don't notice? I don't think the evidence suggests this, as there doesn't seem to be a left to right movement of the eyes along text, grapheme by grapheme, in reading. This is my understanding of the Rayner eye movement studies I have seen, but perhaps you know something else.

What evidence would you offer that experienced readers segment and blend words they already know? It certainly is not my personal experience. I would also point out that if you ask an adult to analyse the words they are reading into the component GPCs they do not always find it easy, especially with the so-called 'advanced code', although they can do it with thought. Does that mean that they decode GPCs that they do not know in order to read? Seems unlikely, but wouldn't that have to be the case if each word was read by very fast segmenting and blending?

Just to speculate about what evidence of your thesis (if I've understood it correctly) would look like. I wonder if the brains of readers reading pseudo words in pseudo sentences would process the material in the same way and at the same speed as if the words and sentences were real? I wonder if reading such material would be slower or the same speed as reading real text? A higher speed of reading real text would suggest that there is something in a word being familiar that speeds up the process, and that can only mean that the reader recognises the complete structure of the word at a different level from the individual grapheme (unless they are using context, which the Stanovich studies show is not the case with skilled readers).

As you say, it is counter-intuitive to imagine that unreal text is as quickly read as real text. I tend to think that that is because we can imagine the two situations and know that they would be dealt with differently by the reader. But perhaps you have evidence of something else?

Are you also saying that I am not understanding that the reader does not know words instantaneously? Well, yes, of course there must be a time needed for processing, as we read in time and do not read a whole text by glancing at it! Perhaps I have misled you by using the word 'instant', where it might have been better to say 'effortless'. As for knowing what print 'says' automatically or otherwise the same rules must apply whether we look at knowing GPCs or knowing words. In order to decode words using GPCs we have to 'know' the GPCs; in order to decode whole words we have to 'know' the words. If we can learn and therefore recognise and respond with sound to GPCs we can also learn words, can't we?
Last edited by Toots on Fri Oct 18, 2013 12:37 pm, edited 5 times in total.

Derrie Clark
Posts: 1174
Joined: Sun May 01, 2005 8:24 am
Location: Kent

Re: Feedback on Phonic Check

Post by Derrie Clark » Fri Oct 18, 2013 12:07 pm

kenm wrote:I have not met many primary teachers. Most of them have been conscientious and reasonably intelligent, but variable in width and depth of knowledge. Many of the minority who show understanding of the reading process have acquired it through much study and thought, in order to overcome an initial training that set them off in the wrong direction. The announcements by their unions so gleefully quoted by the press are usually obviously idiotic or, at best, based on misunderstandings. My impression of teachers is influenced by the thought that these statements come from representatives of the profession.
Beautiful Ken. Such an excellent explanation. It is really frustrating when things are turned round to make it appear that the criticism is of hard working (and naturally defensive) teachers trying to jump through hoops to fit in with all the various agendas and (often contradictory) directions.

Toots

Re: Feedback on Phonic Check

Post by Toots » Fri Oct 18, 2013 12:56 pm

chew8 wrote:
Toots wrote:So, for poor decoders, the slowing effect of having to segment and decode each word laboriously means they cannot grasp the meaning well enough to use context. While segmenting and blending is their only route to meaning it does not serve the purpose well in this situation. They have to reach a stage of fluency, or read books that are within their 'stage of fluency'. Then context can be used because the meaning within the context is accessed.
Just a quick response as I'm about to go out: I think Stanovich would agree that once reading is reasonably fluent, context can be used to refine comprehension, but that at this point decoding is likely to be good enough to kick in for word-identification before context has a chance. I have a vague memory that he (or someone else) has made this point somewhere, but I might have to re-read hundreds of pages to find it. Does anyone else remember this? The Stroop experiments may be relevant.

Jenny C.
The fluency of a novice reader would depend upon the difficulty of the material. I think Stanovich says that where the difficulty is at the right level for the child's decoding skills the child is able to use context. Presumably the word guessed through context would be one word, within a manageable text, which the child cannot tackle. This implies that there is a stage between the two you seem to refer to, Jenny (reasonably fluent and using context only for comprehension; unable to decode efficiently using phonics and unable to use context because unable to read the context). That stage would be: increasing in fluency and efficiency of decoding, and able to use context for more difficult words. The stage at which context would be used by the reader to support decoding would also depend on the difficulty of the material, so that context might not be used at all on one day because not needed, used to good effect on another, and not used at all on a third day because the child cannot decode the context. And somewhere within this mix (Debbie) would be occasions on which context is used to bad effect, needing input from the teacher, who would also no doubt be able to give support where phonic strategies were being inexpertly applied.

kenm
Posts: 1495
Joined: Sat Dec 17, 2005 5:19 pm
Location: Berkshire

Re: Feedback on Phonic Check

Post by kenm » Fri Oct 18, 2013 3:10 pm

Toots wrote:Ken, are you saying that skilled readers use segmenting and blending to read, but they use it so speedily that they themselves don't notice? I don't think the evidence suggests this, as there doesn't seem to be a left to right movement of the eyes along text, grapheme by grapheme, in reading. This is my understanding of the Rayner eye movement studies I have seen, but perhaps you know something else.

The lack of phoneme-by-phoneme eye movement is misleading. On page five of this thread I wrote:
According to a document I read several years ago, the eye has parallel processing, and during the reading process sends to the visual cortex the letters captured by the fovea (the central area of the retina, with closely packed cones and no rods) in a single saccade. In fluent readers, the next phase is to decode the captured group grapheme by grapheme, so as to determine how it sounds; more than one saccade will be needed for long words, and (presumably*) this process may need to be repeated if the word is a homograph, possibly some time later if the context only becomes clear later in the text.

* i.e. this is my deduction; the document did not have anything about this.
I should have made clear that the sequential decoding is entirely within the brain.
What evidence would you offer that experienced readers segment and blend words they already know? It certainly is not my personal experience.
I don't have any evidence, being an amateur in this field. Earlier in this thread, I described two conceivable mechanisms for word recognition. The first was equivalent to segmenting and blending. The second was a putative "word recognition" mechanism that I found rather dubious, but I would be prepared to accept a more plausible version of it.
I would also point out that if you ask an adult to analyse the words they are reading into the component GPCs they do not always find it easy, especially with the so-called 'advanced code', although they can do it with thought. Does that mean that they decode GPCs that they do not know in order to read? Seems unlikely, but wouldn't that have to be the case if each word was read by very fast segmenting and blending?
No, it just means that they don't know what a grapheme, a phoneme, or a GPC is, as was my case before I joined this forum. It is also conceivable that reasonably fluent reading can be achieved with a set of different correspondences, e.g. of letter groups some of which are longer than one grapheme. Thinking that "box" has only three phonemes is unlikely to impair one's reading, although it shows a flaw in the understanding desirable in someone who teaches it.
Just to speculate about what evidence of your thesis....
What I believe about reading mechanisms is not a thesis. Most of my comments on this forum are efforts to clarify contributions that I find difficult to understand or which seem to present an idea that is obviously wrong.
As you say, it is counter-intuitive to imagine that unreal text is as quickly read as real text.
I don't recall saying anything of the sort. I don't even know what you mean by the phrase (new to me) "unreal text".
Are you also saying that I am not understanding that the reader does not know words instantaneously?
That was the impression conveyed to me by your post, since you gave no explanation for the phenomenon.
Well, yes, of course there must be a time needed for processing, as we read in time and do not read a whole text by glancing at it! Perhaps I have misled you by using the word 'instant', where it might have been better to say 'effortless'. As for knowing what print 'says', automatically or otherwise, the same rules must apply whether we look at knowing GPCs or knowing words. In order to decode words using GPCs we have to 'know' the GPCs; in order to decode whole words we have to 'know' the words. If we can learn and therefore recognise and respond with sound to GPCs, we can also learn words, can't we?
What I have yet to understand is what people mean by "sight word" and what mechanisms would underlie their concepts (plural, because I am unconvinced that they all mean the same thing). Readers of ideographic writing usually recognise from 2000 to 5000 patterns. Many years ago we heard of the problems of a very intelligent person who had learnt some 10000 English words as patterns, but had to study maths (post graduation), her second choice, because this was an inadequate vocabulary for philosophy. How long can a list of sight words be?

There is also the consideration, familiar to anyone who makes some progress along the road to mastery of a musical instrument, or anyone who remembers their first efforts at riding a bicycle or driving a car, that thought processes or action sequences much repeated become subconscious. Fluent readers working entirely with GPCs might not know it.
Last edited by kenm on Sat Oct 19, 2013 8:30 am, edited 1 time in total.
"... the innovator has as enemies all those who have done well under the old regime, and only lukewarm allies among those who may do well under the new." Niccolò Machiavelli, "The Prince", Chapter 6

chew8
Posts: 4171
Joined: Sun Nov 02, 2003 6:26 pm

Re: Feedback on Phonic Check

Post by chew8 » Fri Oct 18, 2013 5:35 pm

Kenm: by ‘sight’-words, researchers who favour code-based learning mean written words which are recognised very quickly and without any overt decoding, but with all orthographic details accurately noted, having been worked through more consciously on previous encounters. In something I’ve already quoted, Share and Stanovich say that ‘Ironically, the outcome of this process of “lexicalization” is a skilled reader whose knowledge of the relationships between print and sound has evolved to a degree that makes it indistinguishable from a purely whole-word mechanism that maintains no spelling-sound correspondence rules at the level of individual letters and digraphs’. I think these researchers (and Ehri) would agree, however, that such readers will be far more accurate than those who rely more on a genuine ‘whole-word mechanism’ (e.g. the products of look-and-say). The 'whole word' readers may look 'indistinguishable' when they read words accurately, but if they often don't read words accurately, then they are distinguishable.
Toots wrote:The fluency of a novice reader would depend upon the difficulty of the material. I think Stanovich says that where the difficulty is at the right level for the child's decoding skills the child is able to use context. Presumably the word guessed through context would be one word, within a manageable text, which the child cannot tackle. This implies that there is a stage between the two you seem to refer to, Jenny (reasonably fluent and using context only for comprehension; unable to decode efficiently using phonics and unable to use context because unable to read the context). That stage would be: increasing in fluency and efficiency of decoding, and able to use context for more difficult words. The stage at which context would be used by the reader to support decoding would also depend on the difficulty of the material, so that context might not be used at all on one day because not needed, used to good effect on another, and not used at all on a third day because the child cannot decode the context. And somewhere within this mix (Debbie) would be occasions on which context is used to bad effect, needing input from the teacher, who would also no doubt be able to give support where phonic strategies were being inexpertly applied.
With children (as distinct from adults), I would also want to take into account the age-appropriateness of the text - with weak readers, one can find text 'where the difficulty is at the right level for the child's decoding skills' but the difficulty is far below what it should be in view of the child's age.

I’m thinking in terms of the real children with whom I work voluntarily. After years of doing this (13 years with 90-odd Y3 children per year, and 3+ years with 30-odd children in each of Years R to 2), I think I have a reasonable feel for the range of difficulty which is acceptable at each stage. If I take ‘good’ readers to be those who are able to read texts within this range, I would say that they fit your description of ‘increasing in fluency and efficiency of decoding and able to use context for more difficult words’, Toots. In my experience, however, the first strategy resorted to by children like this is usually to try and decode the word they are stuck on. For example, a Y1 child that I worked with yesterday was reading a ‘real’ book (i.e. not part of a reading scheme) and getting along well by reading some words apparently ‘at sight’ and others by very quick sounding out and blending. He came to ‘friends’ and started sounding it out as /f/-/r/-/igh/, but clearly felt that something was wrong when he was about to add sounds for the last three letters and, after a moment’s hesitation, came out with ‘friends’ – he may well have been using context. Then he came to ‘rowed’, in a sentence about a boat – he sounded it out as rhyming with ‘cloud’, but immediately self-corrected. If children try sounding out and blending first, then obviously they are not using context as a first resort and any subsequent use of context is likely to be influenced by whatever they have been able to do by way of decoding.

The children who don’t try decoding first and whose eyes stray away from the problem word are the ones who are poor readers in the sense of being able to cope only with texts which are nowhere near age-appropriate. I often notice them looking at the pictures for help – I can’t say I’ve noticed them looking carefully at the surrounding text, but I’ll try and look out for signs of this.

Jenny C.

Debbie Hepplewhite
Administrator
Posts: 3654
Joined: Mon Aug 29, 2005 4:13 pm
Location: Berkshire
Contact:

Re: Feedback on Phonic Check

Post by Debbie Hepplewhite » Fri Oct 18, 2013 7:34 pm

http://www.youtube.com/watch?v=4t_NNXO5xTU

OK - do find time to watch this.

English is the second language of Fernando - he is six years old. He lives in Argentina and his mother tongue is Spanish.

Decide for yourself how helpful the decoding through alphabetic code knowledge and application of the blending skill is for Fernando.

Toots

Re: Feedback on Phonic Check

Post by Toots » Fri Oct 18, 2013 7:34 pm

Ken, I would be interested to read the document you refer to. I wonder how it is proved that once the appearance of a word is captured the reader then decodes it 'phoneme by phoneme'. One imagines that if the whole word is captured the whole word is decoded, although of course a reader will be aware of the letters in order - otherwise the word would be a different one. Additionally, if the reader decodes phoneme by phoneme, how is it that irregular spellings are processed unproblematically by experienced readers? There is not a large incidence of misreadings of words such as 'was', 'tough' or 'cow' by good readers, despite the fact that all three words have alternative pronunciations. If we are decoding, but unconsciously, when we read, as you suggest towards the end of your post, do we also unconsciously decide which version of pronunciation we need when there are various possibilities? How does that work? Do we also unconsciously know what the whole word is, then know the right alternative, then re-build the whole word grapheme by grapheme before arriving at the correct version? Hmm - a bit of a reductio ad absurdum.

I don't think you quite answer my point about experienced readers being unable to identify all the GPCs in an irregular word immediately when asked. I envisaged that this would be after an explanation of what a GPC was and perhaps an example given. It is the sort of exercise that happens in phonics training. If experienced readers read words by decoding grapheme by grapheme surely they would be able to identify the GPCs in any word they could read in a similar way that they can say the names of the letters in order. Yet with words such as 'should' there can definitely be some hesitation in analysis as to which bit is /oo/ and which /d/. Would you concede that perhaps readers process larger chunks than the GPC? That seems self-evident to me. And for fully developed readers that would be the full word in most cases.

I used the phrase 'real text' to refer back to my idea that were readers to be presented with an extract written using familiar (real) words and phrases and then with a similar passage using pseudo words and phrases they might read the 'real text' more quickly and easily. Why would that be if they were segmenting and blending the words in both? What do you think? Or perhaps you disagree that the real text would be read more quickly. I would be interested to know.

volunteer
Posts: 755
Joined: Wed Nov 16, 2011 12:46 pm

Re: Feedback on Phonic Check

Post by volunteer » Fri Oct 18, 2013 10:02 pm

The reading of the nonsense text would be slower so long as the real words were ones already known to the child. If the real words were not already known to the child both passages would be read at the same speed.

maizie
Administrator
Posts: 3121
Joined: Sun Mar 14, 2004 10:38 pm
Location: N.E England

Re: Feedback on Phonic Check

Post by maizie » Fri Oct 18, 2013 10:42 pm

Think you made your point, volunteer; 8 duplicate posts ;-)

I've removed some of them with my mod's fingers :grin:

(Also note that I am thinking of locking this thread as it has wandered very far off topic. Anyone who wants to continue the discussion, please do feel free to start a new thread.)

Toots

Re: Feedback on Phonic Check

Post by Toots » Sat Oct 19, 2013 10:09 am

volunteer wrote:The reading of the nonsense text would be slower so long as the real words were ones already known to the child. If the real words were not already known to the child both passages would be read at the same speed.
Really I was thinking of adult skilled readers, volunteer; I was trying to think of a possible way of showing whether or not they read by decoding grapheme by grapheme. But, of course, the uneven reading skill of a child is an indication that some words become known and read effortlessly, no matter their irregularity ('the', for instance). The only problem is that the child's hesitation over nonwords could be due to not knowing the GPCs rather than not knowing the words.

Maizie, why lock a thread that has had so much interest and traffic? Yes, it's gone off track, as conversations tend to, following the interests of the participants. What do others think? Is there a way of duplicating it with a new title? Call it, 'Debate between Toots and the RRF' (joke).

Locked
