Sounds~Write: Data from the 1st 4 years of whole class usage

Moderators: Debbie Hepplewhite, maizie, Lesley Drake, Susan Godsland

Susan Godsland
Administrator
Posts: 4973
Joined: Thu Oct 30, 2003 11:10 pm
Location: Exeter UK


Post by Susan Godsland » Tue Nov 20, 2007 12:49 pm

Sounds~Write: Data from the first four years of whole class usage
The first Sounds~Write training courses were trialled in March 2003. From the outset we were determined to investigate the outcomes of teaching with the programme. Success for any literacy tuition programme can only be measured in terms of the numbers of pupils taught by it who actually end up literate, by which we mean being able both to read and to write. We have therefore started the process of tracking pupils taught by the Sounds~Write approach during their first four years of primary schooling, i.e., from the start of Reception until the end of Year Three. We now have data on a total of over 2500 pupils drawn from classes in thirty-six state primary schools located in three different parts of the United Kingdom.
http://sounds-write.co.uk/sw_research_report_07.pdf

g.carter
Posts: 1859
Joined: Wed Nov 05, 2003 7:41 pm

Post by g.carter » Tue Nov 20, 2007 3:04 pm

Congratulations - that is impressive and all the data gathering immensely hard work.

chew8
Posts: 4161
Joined: Sun Nov 02, 2003 6:26 pm

Post by chew8 » Wed Nov 21, 2007 6:11 pm

Spelling has always been a passion of mine, especially since I started teaching students aged 16+ in England in 1978 and realised how weak their spelling was compared with that of students I had taught abroad. So I think that what Sounds~Write is doing on the spelling front is great.

I have some queries/reservations, however, about certain points in the S~W report:

1. p. i, second para: '...we think it obvious that if pupils can spell words they can also read them'. I once thought similarly, but have recently realised that it may not be entirely true. The NLS has until now put far more emphasis on segmenting-for-spelling than on blending-for-reading, and this seems to be reflected in the performance of four weak Y3 children with whom I've been working this term. When I tested them at the outset, they were all better at applying code knowledge in spelling than in reading. I met someone at the RRF conference who was finding the same sort of thing. Maybe this doesn't happen with S~W teaching, but I think that some caution may be necessary about suggesting, as a general rule, that 'if pupils can spell words they can also read them'.

2. p. 1, last para: I agree that comprehension tests 'are only of use when pupils are actually literate', but surely they should be literate by Y2 or Y3 if taught in a code-based way? I can see reasons against using comprehension tests in the first 2 or 3 years, but after that I think we need to be able to show that good decoding does improve comprehension. Good decoding should allow children to start absorbing word-meanings through their reading, so it's important to get them as quickly as possible to the point where they can read anything that interests them and to encourage them to read widely. This means that code-based programmes have to enable children to cope comfortably with the sorts of words often regarded as 'sight' words - we have to beat the 'sight'-word brigade at their own game.

Still re. comprehension: the Clackmannanshire study has been criticised because the children in Primary 7 (Year 6) were not nearly as far above national norms in comprehension as they were in word-reading (3.5 months vs. 3.5 years). The critics have assumed (without proof) that children from similar backgrounds who have been taught to focus more on meaning than on decoding would do better. Johnston and Watson, however, have now looked at the comprehension of NLS-taught children from comparable backgrounds and have found them to be about 9 months behind the Clack. children - evidence that a code-based start enables children to beat the reading-for-meaning brigade at their own game.

3. p. 2, second para: What is said about the word 'said' is actually a bit too pedantic for my taste. Given that the 'ai' spelling of the /e/ sound occurs in only three words at most ('said', 'again', 'against'), I would want to deal with it in a way that sets the 'ai' apart, in terms of applicability, from the 'ai' in 'rain', 'paid', 'train' (and, indeed, 'again' and 'against' in some people's pronunciation). The fact is that there is a sense in which children do just have to memorise 'said' (and 'again' and 'against', if they pronounce them /agen/ and /agenst/).

4. p. 2, last para: Children don't have to remember a word-specific spelling for 'duck' in anything like the same way as they have to remember 'said' - the 'ck' pattern ('rule'??) is far more widely applicable. For spelling purposes, I would want children to know that the /k/ sound after any of the five conventional 'short' vowel sounds is almost invariably spelt 'ck' in single-syllable words and in the middle of many two-syllable words (back, peck, pick, sock, duck, packet, reckon, chicken, rocket, lucky etc.).

5. p. 3: I don't see how the Young test can have been standardised on children taught by the NLS if the version used was published only in 1998. The NLS wasn't implemented until autumn 1998.

6. p. 9, last para: I'd be interested in knowing where the figure of GCSE exams requiring a reading age of 10.5 years comes from. And is 10.5 a decoding age or a comprehension age? A study done in the early 1990s found that the reading age needed for GCSEs varied from subject to subject, but I seem to remember that the lowest at that time was something like 12 years, with some subjects needing a reading age of 14+ - though of course things may have changed.

It's not possible to make straight comparisons between the Sounds~Write and Clackmannanshire results because of the different tests used. If the correlation were perfect (Young spelling with Schonell spelling, Burt word-reading with BAS word-reading), it would look as though S~W might have the edge in spelling but Clack. might have the edge in reading - but in the circumstances, no really valid comparisons can be made. Presumably, by the way, 'spelling' in the headings of the last two columns in the tables on p. 15 is a typo for 'reading'.

Jenny C.

Debbie Hepplewhite
Administrator
Posts: 3653
Joined: Mon Aug 29, 2005 4:13 pm
Location: Berkshire

Post by Debbie Hepplewhite » Thu Nov 22, 2007 12:03 am

I agree with Jenny that no assumptions can be made that pupils who are good at spelling are necessarily as 'good' at reading.

This became more than apparent to me when studying programmes and their outcomes. Reading up on the Codebreakers programme, for example, first drew my attention to this issue. Even the authors admitted at the end of their literature that it became clear the pupils were better at spelling than reading and that teachers needed to place more emphasis on blending for reading. Pupils, after all, usually reflect the type of teaching emphasis they receive.

There are programmes which have endeavoured to follow the ** format which place so much emphasis on the notion of the written code being a 'sound to print' code that they tend to teach 'reading' through a spelling route. There may not be enough emphasis on the fact that reading starts with print and the reader translates the print to sound, notwithstanding that the written code is created from sound to print.

Thinking about the fact that pupils reflect their experience of the teaching, those pupils who are confused and in a mess with their reading and spelling skills arguably reflect that the teaching is confused and over-complicated or non-existent in terms of the content and skills that need to be taught effectively. [This comment relates to mixed methods and whole language teaching.]

I don't want all my comments above to detract from the level of effort of those associated with the Sounds~Write team - well done to them for the enormously difficult job of collating results across a wide number of settings.
:grin:

Judy
Posts: 1184
Joined: Tue Oct 18, 2005 9:57 pm

Post by Judy » Thu Nov 22, 2007 1:47 am

Jenny C wrote
this seems to be reflected in the performance of four weak Y3 children with whom I've been working this term. When I tested them at the outset, they were all better at applying code knowledge in spelling than in reading. I met someone at the RRF conference who was finding the same sort of thing.
I have only worked with a relatively small number of children but what I have noticed is that they all find spelling easier than reading at first, probably because there is no blending involved. I have recently also noticed that writing a word, saying the sounds out loud, is one way they seem to be able to 'hear' the word (ie for reading!) when all else seems to fail. So at that point, in a sense, the spelling seems to be supporting the reading.

Then a switch takes place, soon after they begin working on the Advanced Code, I think, when the reading leaps ahead and the spelling becomes much more challenging because there are choices of spelling alternatives to be made.

My pupils are all 'strugglers' and I wonder whether the pattern is the same in a class where ability is more mixed?

chew8
Posts: 4161
Joined: Sun Nov 02, 2003 6:26 pm

Post by chew8 » Thu Nov 22, 2007 9:11 am

That's interesting, Judy. How old are your strugglers, and have they previously been taught by methods that stress segmenting more than blending?

For my own part, I am pretty sure that what I am seeing among the weak Y3 children I'm working with is the result of the teaching they've had in the three previous years of school. The NLS has stressed segmenting-for-spelling much more than blending-for-reading, and its avowed policy re. blending has been for teachers to tell the children what the target word is, then get them to segment it, assign letters to the sounds, and reblend it - in other words, there has been little or no stress on blending to work out what the target word is. Many of the abler children have probably cracked the blending side of things for themselves - the strugglers haven't, but would (I think) have had a much better chance of doing so if blending had been more explicitly taught.

Jenny C.

Judy
Posts: 1184
Joined: Tue Oct 18, 2005 9:57 pm

Post by Judy » Thu Nov 22, 2007 10:45 am

The children I am referring to have all come to me at the end of Y3 or the beginning of Y4. Some have 'done Reading Recovery'. They know the sounds for most of the alphabet letters but that seems to be all; segmenting and blending are completely unknown to them. I have heard that the reception teacher 'does not believe in phonics'. They all either take a while to get the hang of blending or have difficulty recognising the word they have blended when it crops up again a few minutes later. But once they have overcome these difficulties, their reading makes good progress, though I have to stress that I can never be completely sure how much of their progress is due to my teaching as they spend far longer in school than they do with me (1 hour a week).

So far, none of them has found segmenting at all difficult; it's coping with choosing the correct spelling alternatives that presents a hurdle, and I have decided to spend more time on spelling when we reach that point. (Normally I switch back and forth between spelling and reading in roughly ten-minute slots.)

This is only a very small sample of children but it seems likely that a similar pattern would emerge in any group. There always seem to be far more queries on message boards about blending than segmenting so maybe that is the most challenging aspect of learning to read, while choosing the correct spelling alternatives, which comes later, is the hardest part of spelling.

David Philpot
Posts: 80
Joined: Sun Feb 26, 2006 8:22 am
Location: Wigan

Post by David Philpot » Sat Nov 24, 2007 5:09 pm

I have just read the various responses on this thread with some interest. The biggest problem that some respondents seem to have is with our use of spelling tests as a much better indicator of literacy development than reading and/or comprehension tests. From Jenny there is the response:
I have some queries/reservations, however, about certain points in the S~W report:

1. p. i, second para: '...we think it obvious that if pupils can spell words they can also read them.' I once thought similarly, but have recently realised that it may not be entirely true. The NLS has until now put far more emphasis on segmenting-for-spelling than on blending-for-reading, and this seems to be reflected in the performance of four weak Y3 children with whom I've been working this term. When I tested them at the outset, they were all better at applying code knowledge in spelling than in reading. I met someone at the RRF conference who was finding the same sort of thing. Maybe this doesn't happen with S~W teaching, but I think that some caution may be necessary about suggesting, as a general rule, that 'if pupils can spell words they can also read them'.
and from Debbie
I agree with Jenny that no assumptions can be made that pupils who are good at spelling are necessarily as ‘good’ at reading.
This merits a reply, beginning with the actual quote from the report (page 3, para. 1):
Importantly, we think it self evident that pupils should not be able to spell accurately words on a spelling test that they could not actually read if they met them in text: i.e., if they achieve a spelling age of X on a well constructed spelling test, their reading age must also be at a similar level X, or higher. (Though this does not necessarily apply if reading age is measured on currently available reading tests that have been standardised on populations taught by mixed methods involving varying proportions of whole language and traditional phonic ideas.)
We actually think, and our data seems to support our thinking, that literacy is a unitary concept and needs to be taught as such. However, the two major aspects of it, (i) reading and (ii) writing/spelling, are not identical and involve different emphases on underlying knowledge, skills and understanding. At its simplest level, word recognition can be triggered by the sight of the word.

So, for example, a young pupil who has recently learned that the sound 'ae' (as in maid, play, etc.) can be represented by the spellings < ai > < ay > < a > and < a-e > may encounter a written word previously unseen, such as David for example, and correctly sound it out. However, if a few days later you asked this pupil to write the word David as part of a dictation or spelling test, you might get Dayvid, Daivid, Daveid or David. This is because this pupil knows these four spellings of 'ae' but has not encountered the word David in text enough times to establish the memory of which spelling of 'ae' is the 'correct' one for this word.

Thus it is easy to see why the literacy development process, beyond the initial transparent one letter-one sound stage, usually results in each pupil being able to accurately read more words than s/he can accurately spell. (To some extent this then remains true for life, as no-one is likely to encounter every word in the dictionary enough times to remember all their spellings accurately.)
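Dave's 'David' example can be sketched in code. This is purely my own illustrative toy, not anything from the Sounds~Write materials: given the four spellings of 'ae' the pupil knows, it generates every candidate spelling, each of which decodes to the same word, though only one is conventional.

```python
# Toy illustration (my own, not from the Sounds~Write materials): a pupil
# who knows four spellings of the sound 'ae' can decode 'David' on sight,
# but when writing it has four equally readable candidates to choose from.

AE_SPELLINGS = ["ai", "ay", "a", "a-e"]  # the four spellings in Dave's example

def candidate_spellings(before: str, after: str) -> list[str]:
    """All spellings of a word whose 'ae' sound sits between two chunks."""
    out = []
    for g in AE_SPELLINGS:
        if g == "a-e":
            # split digraph: the 'e' comes after the following consonant
            out.append(before + "a" + after[0] + "e" + after[1:])
        else:
            out.append(before + g + after)
    return out

print(candidate_spellings("D", "vid"))
# -> ['Daivid', 'Dayvid', 'David', 'Daveid']
```

All four candidates are readable by the same code knowledge; only exposure to the word in text settles which one is 'correct'.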
Notice that the important quote from the report is 'that pupils should not be able to spell accurately words on a spelling test that they could not actually read if they met them in text.' (A reading test, of course, does come into the category of text.)

We now need to think about what reading and spelling tests are. Why do we have them? I would guess that just over a hundred years ago, with the establishment of compulsory schooling for all in the UK, teachers had become aware that many pupils who appeared to be making satisfactory progress with reading were having trouble with spelling - and that this led to the idea that there was a degree of independence between these two aspects of literacy. The development of independent measures of these two aspects of literacy was therefore probably seen as a 'good/useful' thing.

When establishing tests, researchers just trial lots of words to see which appear the most useful for discrimination purposes. So, for example, on the Burt Word Recognition Test the 17th word is boys and a score of 17 gives you a reading age of 6.0 years. So the researchers, from their initial samples, would have been looking for a word at this 'age-level' that half the pupils of age 6.0 could read (either by decoding or by sight recognition) and half couldn't. The same types of procedures are followed for constructing spelling tests.

Using various statistical jugglings, these tests are put together to parallel or mimic the normal distribution curve (Bell curve). They have to produce different results! [If their results were the same, then anyone with a reading age of X would also have a spelling age of X and the two tests would effectively be measuring the same thing and therefore be the same test - so of course test constructors ensure that their new tests show positive correlations with other similar tests purporting to measure the same things, but also ensure that their new tests produce results that are a bit different. Obviously, if they weren't different, it would be hard to persuade people to give up their old tests and buy these new ones.]
The end result is that the majority (i.e., over 90%) of pupils will always achieve different age-equivalent or standard scores if given both a reading and a spelling test. We are therefore faced with the fact that, as a result of test construction, we cannot test pupils without finding nearly half whose spelling scores are higher than their reading scores, with the reverse being true for the other half. What should we make of the results? John has a Spelling Age of 7.6 and a Reading Age of 7.0, whilst Mary has a Spelling Age of 7.0 and a Reading Age of 7.6. Is it John or Mary who is making the better progress in literacy development? Well, I don't know, and I'm not convinced that there is any meaningful information embedded somewhere in these figures that could tell me.
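Dave's statistical point can be illustrated with a rough simulation. This sketch is entirely my own, with invented noise levels and an assumed 12-months-per-standard-deviation scale, not figures from the report: if two tests track the same underlying ability but each adds its own test-specific variation, almost every pupil ends up with different age-equivalent scores, split roughly half and half as to which test comes out higher.

```python
# Rough sketch (my own invented parameters, not from the report): two tests
# measuring one shared ability plus independent test-specific noise will
# give nearly every pupil different age-equivalent scores, with roughly
# half scoring higher on one test and half on the other.
import random

random.seed(1)
N = 1000
diffs_in_months = []
for _ in range(N):
    ability = random.gauss(0, 1)               # shared literacy ability
    reading = ability + random.gauss(0, 0.4)   # reading-test noise
    spelling = ability + random.gauss(0, 0.4)  # spelling-test noise
    # assumed scale: 1 standard deviation corresponds to 12 months of "age"
    diffs_in_months.append(round((spelling - reading) * 12))

identical = sum(d == 0 for d in diffs_in_months) / N
spelling_higher = sum(d > 0 for d in diffs_in_months) / N
print(f"identical ages: {identical:.0%}; spelling higher: {spelling_higher:.0%}")
```

Even with the two tests correlating strongly (they share most of their variance), only a small minority of simulated pupils get identical ages on both, which is essentially Dave's 'over 90% differ' observation.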

It is particularly interesting to examine the words that get put into reading and spelling tests. When I inspect them I usually find that reading tests contain a surfeit of high-frequency words, often where the spelling is one that the children wouldn't be expected to have encountered at the level these words are placed. To use the Burt again as an example (because it is freely downloadable, and if I give examples from other tests most readers won't have immediate access to them), the first two words on the test are to and is, with the letter o representing the vowel sound in 'moon' and the letter s representing the sound /z/. If you can 'read' only these two words, your reading age will be below 5.3, and I doubt there are many teachers teaching those two phoneme-grapheme correspondences to rising fives.

However, when I inspect spelling tests I find that, although not perfect, the spellings involved follow much more rational patterns and show a reasonable degree of correlation with the teaching progressions that are being followed in many classrooms. Given that at Sounds~Write we are interested in teaching all pupils to be literate, meaning that we think all pupils should be able to write down their thoughts accurately in diaries, letters and essays, take notes in class, etc., we think that spelling tests, which are uncontaminated by the visual-memory issues that bedevil reading tests, are the most important and useful measurement we can make.

For those who have not looked at our report, it is interesting to see in Appendix A how the Bell-shaped curve for the distribution of reading scores disappears between Y1 and Y2, being replaced by a more rectangular-looking curve for pupils taught by the linguistic phonic approach. This is just what we expected, given the nature of reading tests that take no account of the difference between accurate decoding and sight memorisation of words. This change in the nature of the distribution does not occur with the spelling test results.

What was actually written in the report was:
'...we think it obvious that if pupils can spell words they can also read them.'
There seems to be some dispute as to whether this is true. It is easy enough to resolve, of course, and it provides a simple opportunity for anyone interested to do a piece of action research. All it requires is to give any pupils you are teaching a spelling test, recording for each one all the words spelled correctly. Then, a day or two later, present them with those words and see if they can read them. The reverse can also be tried, starting with reading and then going on to spelling. I would be really interested in hearing about all the various words that pupils can spell accurately but can't read.
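The tallying step of that exercise is trivial to mechanise. Here is a minimal sketch of the record-keeping, with wholly invented pupil data just to show the shape of the comparison:

```python
# Minimal sketch of the record-keeping for the proposed action research:
# compare the words a pupil spelled correctly with the words they later
# read correctly. The pupil data below is invented purely for illustration.

def mismatches(spelled_ok: set[str], read_ok: set[str]) -> dict[str, set[str]]:
    """Tally the words spelled but not read, and read but not spelled."""
    return {
        "spelled_but_not_read": spelled_ok - read_ok,
        "read_but_not_spelled": read_ok - spelled_ok,
    }

# hypothetical pupil: spells two words that they later fail to read
result = mismatches(
    spelled_ok={"net", "can", "fun", "then", "tree"},
    read_ok={"net", "can", "fun", "top"},
)
print(sorted(result["spelled_but_not_read"]))   # -> ['then', 'tree']
```

A non-empty 'spelled_but_not_read' set for any pupil would be exactly the kind of counter-example Jenny describes.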

Dave Philpot

g.carter
Posts: 1859
Joined: Wed Nov 05, 2003 7:41 pm

Post by g.carter » Sat Nov 24, 2007 6:12 pm

Most interesting. Thanks for taking the time to explain in such detail, Dave - it will take a couple more readings to assimilate properly - but worth putting in the effort!

Derrie Clark
Posts: 1174
Joined: Sun May 01, 2005 8:24 am
Location: Kent

Post by Derrie Clark » Sat Nov 24, 2007 7:52 pm

It is difficult not to be impressed by this data. It confirms all the anecdotal evidence from teachers who, within just a couple of weeks of starting the training, notice the difference in their pupils across all Key Stages.

I do have a concern with this continuing drive / focus on reading. KS2/3 teachers, for example, are typically on the S-W training not because their pupils can't read but because they can't spell and do not feel confident about writing.

It is also the case that while pupils can be brought up relatively quickly to an age-appropriate level with reading on the programme, they need continuing access to it to develop their spelling. The kinds of lengthy explanations and rules typically needed for a graphemic approach just overload memory with processing.

chew8
Posts: 4161
Joined: Sun Nov 02, 2003 6:26 pm

Post by chew8 » Sun Nov 25, 2007 3:50 pm

Dave writes 'Notice that the important quote from the report is, ‘that pupils should not be able to spell accurately words on a spelling test that they could not actually read if they met them in text.’'

I avoided quoting this because I was not sure that I fully understood it in relation to the point made earlier in the report which I did quote: '...we think it obvious that if pupils can spell words they can also read them' (second paragraph of the Executive Summary). The bit Dave quotes seems to deal with the situation as it should ideally be, whereas the bit I quoted seems to deal with the situation as it actually is. I agree that the situation should ideally be that if children can spell words they should also be able to read them, but my recent experience suggests that this is not true of some children at present. I therefore query whether it's as 'obvious' as the Executive Summary suggests that 'if pupils can spell words they can also read them'.

At the start of this term, I did some one-to-one testing of the reading and spelling (in that order, and in the same 15-20-minute session) of five weak Year 3 children. They were unable to read some three-letter words (e.g. 'egg', 'bun', 'wet', 'rub') but they all spelt all the following words correctly: 'net', 'can', 'fun', 'top', 'rag', 'let', 'cap'. In addition, one spelt 'then' correctly and another three spelt 'then', 'may' and 'tree' correctly - so they were able to apply some digraph knowledge, as well as single-letter-sound knowledge, in spelling. One child who misread 'tree' as 'there' spelt it correctly a few minutes later, segmenting it audibly as he did so.

In fact there is research going back two decades or so (e.g. by Bryant and his Oxford Group) documenting cases of children who could spell words but not read those same words. I felt that this was unsurprising during the era when teaching was overwhelmingly of a whole-word/whole-language type. I could understand why children would try to take in words as wholes in reading and would therefore misread many of them, and also why they would try to use whatever letter-sound knowledge they had in spelling and might therefore correctly spell words that they would misread.

Since the introduction of the NLS in 1998, however, the situation has not been quite so straightforward - there has been more phonics in the mix, but it has been much more in the context of segmenting-for-spelling than of blending-for-reading. This seems to have resulted in a situation where the weakest children (though perhaps not all the others) still show more ability to apply grapheme-phoneme knowledge in spelling than in reading.

Sounds~Write itself is probably not producing this kind of pattern in children, but I think that those involved with it may need to be aware that it's a kind of problem that can occur (and has occurred with the NLS) when people assume too easily that children will automatically transfer spelling skills to their reading. In the current climate, I would prefer it if the supporters of code-based teaching were not saying that 'if children can spell words, they can also read them'. Run-of-the-mill teachers who don't go on specialised courses need to understand that they need to teach both blending-for-reading and segmenting-for-spelling - they should not leave children to deduce one from the other.

Jenny C.
Last edited by chew8 on Sun Nov 25, 2007 5:31 pm, edited 1 time in total.

maizie
Administrator
Posts: 3121
Joined: Sun Mar 14, 2004 10:38 pm
Location: N.E England

Post by maizie » Sun Nov 25, 2007 5:07 pm

I tested 33 Y7s in September. Bearing in mind everything that Dave says about tests, I can report that 21 had a Spelling Age (Schonell) higher than their Word Reading Age (Burt). In some it was a matter of 2 or 3 months' difference, but 13 had a difference of 5 months or more.

These are all mixed-methods-taught children. The most common reading 'fault' that they exhibit is inaccuracy: not looking at words properly when they are reading.

Maybe the Sounds~Write contention holds more true for children who have been taught blending and segmenting equally well.

chew8
Posts: 4161
Joined: Sun Nov 02, 2003 6:26 pm

Post by chew8 » Sun Nov 25, 2007 5:50 pm

Apologies for referring to Derrie rather than Dave in my response. I drafted it in 'Word', having copied and pasted the opening quotation, and didn't check carefully enough who had written what I was responding to. I have now corrected what I wrote.

Jenny C.

Judy
Posts: 1184
Joined: Tue Oct 18, 2005 9:57 pm

Post by Judy » Sun Nov 25, 2007 6:08 pm

Might it not depend on which tests are used?

I use Burt for reading and Young's Parallel for spelling and it is only in the very early stages, when some children are still struggling with blending, that the Spelling Age is ever higher than the Reading Age. Admittedly, I only have a very small number of pupils and they are all 'strugglers'.

Dave, if I get time, I will try what you suggest about testing using the same words, but there will have to be a week in between the tests as I only see the children once a week.

Judy
Posts: 1184
Joined: Tue Oct 18, 2005 9:57 pm

Post by Judy » Sun Nov 25, 2007 7:33 pm

I can report that 21 had a Spelling Age (Schonell) higher than their Word Reading Age (Burt).
Maizie - when I made the changeover from using the Schonell Spelling Test to the Young's, I tested my pupils using both tests, with about a week's interval between the two. All but one of the 9 children scored a higher Spelling Age on the Schonell test than on the Young's.

The difference was 3 or 4 months in each case, the one who did better on the Young's (by far the best speller of them all) being 3 months higher on that test than on the Schonell.

All had higher scores on the Burt Word reading test than either of the Spelling tests, apart from the two who were just emerging from difficulties with blending, but who had learnt enough sight words at school to read some of the words on the test and sound out a few more.
