reading tests

Moderators: Debbie Hepplewhite, maizie, Lesley Drake, Susan Godsland

Judy
Posts: 1184
Joined: Tue Oct 18, 2005 9:57 pm

Post by Judy » Tue Nov 14, 2006 4:35 pm

I agree with you, Frances, because it is frustrating that it will take so long for some of the children I teach (once a week) to show any progress on a test of that sort.

For instance, it has taken several lessons of very hard work for one little 7 yr old girl to learn to blend at all, so we haven't had time to make much progress on learning more correspondences. I can foresee that if I were to test her again after several months, she would most likely still only be able to read the same few words she has learnt as sight words at school. Fortunately, in this case, her mother can understand that she is making progress without the need for a test to prove it.

But this is a thorny subject, so maybe that's why nobody else has replied as yet! :D

chew8
Posts: 4187
Joined: Sun Nov 02, 2003 6:26 pm

Post by chew8 » Tue Nov 14, 2006 9:00 pm

Burt is too old to count as a good modern standardised test.

Jenny.

willow
Posts: 72
Joined: Mon Dec 12, 2005 8:29 pm

Post by willow » Wed Nov 15, 2006 8:41 pm

But surely they must have some use if used consistently over time with the same children to show 'progress'?

Or am I just kidding myself they are good because I was so proud that dd had a 'reading age' of 8? :D

chew8
Posts: 4187
Joined: Sun Nov 02, 2003 6:26 pm

Post by chew8 » Wed Nov 15, 2006 10:54 pm

I'm sorry, but I have to say yet again that well-normed standardised tests do serve a useful purpose.

Very early on (in the first 3-4 months, say) there is little point in administering anything except programme-specific assessments, because the parts of the alphabetic code that will have been taught will vary from programme to programme. After that, however, children who have been taught good synthetic phonics already know so much of the alphabetic code that they can manage even non-programme-specific tests - they may not know all the 'sight-words' that other children have been taught, but they still get much higher overall scores because of their good decoding.

I have just dug out my copy of the Burt test, for which I have no norms more recent than 1972. The first 20 words are: to, is, he, at, up, for, an, of, his, or, sun, went, just, big, my, that, girl, day, pot, one. Children who had been taught good s.p., even without learning any 'tricky' words, would be able to read all or most of the phonically decodable words in that list after 3-4 months of instruction. That would give them a reading age, on the 1972 norms, of many months above chronological age, very much as is found in modern s.p. schools.

My problem is not with the idea that there should be programme-specific assessments - as I say, I agree that these are a good thing at first. My problem is with the implication that this situation should last a long time. I think that the recognition of the early need for programme-specific tests should be accompanied by a recognition of the stage at which this should no longer be the case. If children have been properly taught, this stage comes within the first few months. By the end of two years in school they should be able to read almost anything. In the UK context, at least, I would be worried if children were still needing programme-specific tests for longer than those first few months.

Jenny.
Last edited by chew8 on Thu Nov 16, 2006 7:37 am, edited 1 time in total.

Judy
Posts: 1184
Joined: Tue Oct 18, 2005 9:57 pm

Post by Judy » Thu Nov 16, 2006 12:46 am

Dick - do you have any ideas on how one is to know when a pupil has reached 'Certified Reader Status', given that each individual child will have its own level of understanding of spoken communication?

Debbie Hepplewhite
Administrator
Posts: 3663
Joined: Mon Aug 29, 2005 4:13 pm
Location: Berkshire

Post by Debbie Hepplewhite » Thu Nov 16, 2006 10:31 am

One of the main issues in my opinion is still the lack of crystal clear information.

Even the Rose Report is not crystal clear enough.

If Jim Rose had felt in a comfortable enough position to be more plain-spoken about which schools he visited, which programmes they were using, and how their results really compared with those of mixed-methods schools - then I think there would have been a clearer message still.

Also, the Rose Report is very clear about the evidence to support the 'simple view of reading' in appendix 1, but because of the Torgerson, Brooks and Hall analysis commissioned by the DfES, Jim appeared to make what I consider a woolly reference to the conclusions of research. What if he had felt free, or had thought, to refer to Professor Diane McGuinness's response to the Torgerson analysis? Would this have given a clearer picture?

Then, the RRF newsletters and numerous international websites refer to the flaws of whole language and mixed methods, so we on the RRF message board have a rich background and understanding of how long this debate has raged and the arguments for and against - ordinary teachers are not generally aware of this background information.

What I am trying to say is that if ordinary teachers were FULLY aware of the history of reading instruction, the breadth of issues, and the results of good phonics teaching over time compared to mixed methods and whole language, then they could make those PROPERLY informed choices. Consider teachers like 'Kat', who has recently described her experiences. What if she had not come across the RRF?

The issue, therefore, is surely that teachers are not fully informed, and I doubt very much whether information coming from most advisers will ever help to inform teachers fully - at least not for a long time.

Meanwhile, only some people in the government are truly wise to synthetic phonics and the history of reading instruction, and it is very clear, from the continued and simultaneous promotion of programmes such as Reading Recovery and Catch Up by some government departments, that we have a long way to go until even the government gives a clear message to its own members to enable them to share a common understanding! :?

Debbie Hepplewhite
Administrator
Posts: 3663
Joined: Mon Aug 29, 2005 4:13 pm
Location: Berkshire

Post by Debbie Hepplewhite » Thu Nov 16, 2006 10:39 am

I have no idea why I wrote the message above, but I think my mind wandered from the notion that if all schools used a common standardised test, such as the Burt word reading test, then they could compare the results they were achieving with one another's.

The Burt word reading test is provided on the websites I edit because:

1) It IS standardised and can therefore provide measures

2) Schools using it can compare their results with those achieved at St Michael's at Stoke Gifford, one of our synthetic phonics exemplar schools, which uses this test

3) We can provide a standardised test that is free of charge, quick to print off at any time, and easy to administer and mark

4) Schools can readily use this test at the start of an intervention as a baseline and again afterwards as a measure of progress

5) Schools can claim that their good results are not because they have used a purpose-designed 'regular words' test to fit in with their phonics teaching approach, but are using a traditional standardised test of generally common words

Using such tests ADDS TO CLEAR INFORMATION AND COMPARISONS which teachers can draw upon - that was my original thinking before I started to pontificate about clarity of information in the posting above!!!! :oops:

maizie
Administrator
Posts: 3121
Joined: Sun Mar 14, 2004 10:38 pm
Location: N.E England

Post by maizie » Thu Nov 16, 2006 4:01 pm

Dick wrote: 'When they are interested in reading and you pull out results on a spelling test, is it any wonder that they think SP is cock-eyed.'

Burt is a word reading test, Dick.

chew8
Posts: 4187
Joined: Sun Nov 02, 2003 6:26 pm

Post by chew8 » Thu Nov 16, 2006 4:05 pm

Dick wrote, about the Burt test, 'schools that are teaching "sight words" will do very well on the test'.

Actually, no, Dick. Schools teaching 'sight words' achieve fairly average scores on this type of test - unsurprisingly, since the tests were almost certainly normed on a population taught this way. It is the schools teaching synthetic phonics which do very well, and this is a very useful bit of ammunition for us.

I agree with Debbie that it is important that 'Schools can claim that their good results are not because they have used a purpose-designed "regular words" test to fit in with their phonics teaching approach, but are using a traditional standardised test of generally common words'. You can accuse us of 'buying right into the ignorance perpetrated by the mass ignorance re SP, the substance and structure of the Alphabetic Code and what is involved in acquiring reading expertise', but I think you are wrong.

Jenny.
Last edited by chew8 on Thu Nov 16, 2006 5:24 pm, edited 1 time in total.

Judy
Posts: 1184
Joined: Tue Oct 18, 2005 9:57 pm

Post by Judy » Thu Nov 16, 2006 4:15 pm

Maybe it would show more confidence in Synthetic Phonics if we were to challenge advocates of mixed methods to use a test which revealed the children's knowledge of the Alphabet Code, their ability to blend etc.

It has been my experience so far that children can do well on a standardised test and yet come seriously unstuck when tested on the skills and knowledge involved in SP.

chew8
Posts: 4187
Joined: Sun Nov 02, 2003 6:26 pm

Post by chew8 » Thu Nov 16, 2006 8:39 pm

Where on earth, Dick, do you get the idea that 'even when SP proponents "have the goods" to answer the question, we are unwilling/unable to do so'?

Of course we can produce the goods to answer the question. Synthetic phonics children outperform the rest whatever measure is used. I myself know, from listening to hundreds of 7-8-year-olds read over the past 6 years, that those who are reading far above age-norms on standardised single-word reading tests read well by any standards - e.g. if they read a passage from a Harry Potter book aloud, it's obvious that they can decode all the words, and the expression that they put into their reading shows their understanding. Parents, too, are not stupid about this sort of thing. In fact it's probably true to say that most parents never find out standardised test scores for their children because very few schools use them - or if they do, they don't publicise the results.

Where such tests are used is mainly in research projects such as the Clackmannanshire one, and those results are genuinely meaningful by any standards. The researchers would not have been taken seriously if they had not been able to show that from an early stage until the 7-year follow-up, the children had not only learnt what they had been taught but could also outperform similar children on non-programme-specific tests.

Jenny.

chew8
Posts: 4187
Joined: Sun Nov 02, 2003 6:26 pm

Post by chew8 » Thu Nov 16, 2006 9:22 pm

How does one show the politicians etc. enough children reading for them to be sure that they are seeing the whole picture and not just a sample selected to impress? Whole language teachers, as well as phonics teachers, could produce a select few who would impress, but that would leave a lot of the picture hidden. Statistics may not be perfect, but at least all children are included in them.

Why do you mention 'the adults reading to the kids' Dick? Even the whole-language people don't rely on that to demonstrate children's achievement.

Jenny.

CuriousMum
Posts: 17
Joined: Wed Apr 12, 2006 10:23 am

Post by CuriousMum » Thu Nov 16, 2006 10:34 pm

Well, intrigued by this thread (though somewhat confused as to what a word-reading test would demonstrate), I gave the Burt test to my synthetically-taught 5y3m old. It gave him a reading age of 12. This is because his phonic knowledge allows him to read words like 'phlegmatic', 'domineer', 'microscopical' and 'terminology' without having the faintest idea what they mean!

I'm not sure many 'whole word' taught children would manage this at the same age.

I think the test is a bit out of date though - 'luncheon', in particular, which my son decoded very sensibly as 'lunch-ee-on', is hardly ever used today.

Debbie Hepplewhite
Administrator
Posts: 3663
Joined: Mon Aug 29, 2005 4:13 pm
Location: Berkshire

Post by Debbie Hepplewhite » Fri Nov 17, 2006 12:56 am

Dear Dick,

I am afraid to say that no matter what the topic of conversation, your only recourse is to say 'ah but...'.

What the RRF provides, or what I provide through my www.syntheticphonics.com website, is produced with no funding, no authority other than a 'natural' authority, and limited time.

There is a kind of science when readers take the same test - of whatever nature - and the results are compared.

There are a number of tests, they can be used in a number of ways, and the information gleaned from them can be put to a variety of uses.

The point is that people can choose whether or not to use those we provide, and they can decide for themselves whether they are helpful.

People can devise their own tests for their own purposes, and programmes may provide tests appropriate to the programme.

:lol:

Judy
Posts: 1184
Joined: Tue Oct 18, 2005 9:57 pm

Post by Judy » Fri Nov 17, 2006 1:44 am

I apologise if this is going slightly off topic, but it has occurred to me while following this thread that the heart of the issue might be in how the different teaching methods affect those children at the tail end, who struggle to learn to read.

Mixed methods can produce plenty of children with impressive reading ages on standardised tests, as can SP. But has anyone produced any figures for comparison, showing what percentage of children still need extra help by, say, Y3?
