
Prof. D.McG's analysis of the Project X CODE research report

Posted: Thu Oct 18, 2012 6:29 pm
by Susan Godsland
The following report appeared a few days ago:
An independently analysed research trial of OUP's Project X CODE by Ros Fisher. ... report.pdf

I sent it to Prof. Diane McGuinness for a thorough analysis and here is what she had to say:

''Well, it's hard to know where to begin. The so-called "research results" from this project are a sham. In a proper study, one needs to eliminate all sources of error (variance or variability) by restricting subjects to a uniform group, in a specific (contained) environment. With such a small sample of children per school (4 to 6 children), using 13 different schools, taught by multiple teachers with no particular training for this task, any one of these uncontrolled variables will void the study. When your study contains multiple uncontrolled variables, you have chaos. No one really knows what is causing changes in reading skill, assuming these skills are actually changed. We don't know, for example, whether there is overlap between the Letters and Sounds instruction and the stories the children are asked to read. We have no idea, from reading this report, exactly how the children are taught. Do they learn to read isolated words, or in sentences, or guessing by looking at the pictures, or all of the above? Nor do we know whether the words on the pretest are different to the words on the post-test on the Hodder reading assessment test. To make matters worse, the Hodder test (PERA) is based on Letters and Sounds, which means the test is measuring some or all of what many children have already been taught.

To do this kind of study correctly you have to eliminate (rule out) all possible contaminating variables. One needs a large sample of children in the same age range, who are taught by the same teacher/trainer throughout, one who is properly trained to teach reading. This simply isn't happening. And there are other problems. There appears to be no strict time lines in this study (i.e. exact number of hours taught per day, number of days in the study).

The goal in research of this type is to isolate the "independent variable" (the only variable which is free to vary - i.e. teaching a method of reading). All other variables must be controlled (made constant across all conditions). The data per se (the dependent variable) is the reading test score. When these variables are locked down and in place, we can argue that a particular method improves learning a particular skill. Reading test scores should have mainly one cause - the method employed by a knowledgeable teacher. Instead, we have contamination from multiple teachers, teaching in different schools, in different locations (geographic or the hallway), with no standard approach, and no control over the exact hours, days, weeks, this teaching occurs.

As the early part of this instruction is tightly linked to Letters and Sounds, there is a strong possibility that the children have already been taught lessons using Letters and Sounds. Hence they would be familiar with the corpus of words in the PERA assessment test. We know nothing about this connection. This would make it easier to "teach to the test" if the test bears a strong relationship to what was taught in the classroom (Letters and Sounds). So there is a serious issue of confounding.''
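The confounding Prof. McGuinness describes can be sketched with a short simulation. Every number here is made up purely for illustration (an assumed method effect, an assumed spread of teacher ability); nothing comes from the actual trial, and `simulate_gains` is a hypothetical helper, not anything from the report:

```python
import random
import statistics

random.seed(42)  # fixed seed so the illustration is repeatable

TRUE_METHOD_EFFECT = 3.0  # assumed gain (in score points) from the method alone

def simulate_gains(n_schools, pupils_per_school, teacher_sd):
    """Per-pupil score gains when each school's teacher adds an
    uncontrolled effect of their own on top of the method effect."""
    gains = []
    for _ in range(n_schools):
        teacher_effect = random.gauss(0, teacher_sd)  # the uncontrolled variable
        for _ in range(pupils_per_school):
            pupil_noise = random.gauss(0, 2)
            gains.append(TRUE_METHOD_EFFECT + teacher_effect + pupil_noise)
    return gains

# One trained teacher throughout: teacher effect held constant...
controlled = simulate_gains(n_schools=13, pupils_per_school=5, teacher_sd=0)
# ...versus 13 different, untrained teachers: teacher effect free to vary.
confounded = simulate_gains(n_schools=13, pupils_per_school=5, teacher_sd=4)

print(f"spread of gains, one teacher:      {statistics.stdev(controlled):.1f}")
print(f"spread of gains, 13 teachers:      {statistics.stdev(confounded):.1f}")
```

In the confounded case the teacher effect is mixed into every pupil's score, so any observed gain could as easily be down to a particular teacher as to the method; with only 4-6 pupils per school there is no way to separate the two.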

Re: Prof. D.McG's analysis of the Project X CODE research report

Posted: Wed Oct 24, 2012 3:52 pm
by Susan Godsland
Several RRFers have seen the materials.

OUP's Project X CODE intervention programme is for Y2-4 Wave 2 or 'lighter touch' Wave 3 support.

A synthetic phonics expert said that in Project X CODE, 'the phonics is much too simple and slowly taught and loads of time is wasted on comprehension for children whose language comprehension is fine'.

CODE is NOT included in the DfE's synthetic phonics match-funding catalogue, perhaps because it doesn't meet the criteria for a systematic synthetic phonics programme.

In a publicity flyer for the programme OUP said, ''Project X CODE was trialled with 61 children in Years 1 to 4 in 11 schools in 8 local authorities in 2012. After only 2½ months of support, they had made an average of 8 months of progress on a standardised reading test''. The test used was the PERA test, standardised for 'Reception, Years 1 and 2'.
The actual report says that some children in the trial, ''fell outside the age range within which standardised scores could be calculated''. The figures given in the report show that nearly half (29) of the children in the trial were in Years 3 and 4.

The researcher, who analysed the results and wrote the report, dismissed the trial's lack of a randomised control design because, she says, an RCT would not ''give sufficient detail of local conditions or give information about how the intervention worked'' (p5). Hence the majority of the report consists of non-scientific qualitative data (description, views and opinions).

The researcher admits, ''It is well established that interventions may have a short term affect because they are new and all those involved have much invested in the programme. Their very newness provides a motivation to teachers and pupils'' (p4). Her candid assessment of CODE's possible limitations may be intended to forestall criticism should the claimed (meagre) gains 'wash out' rather rapidly.

Re: Prof. D.McG's analysis of the Project X CODE research report

Posted: Wed Oct 24, 2012 6:09 pm
by geraldinecarter
Thanks, Susan, very succinctly put. And Oxford is spending 500,000 pounds on these wretched books and on training up volunteers for its large number of failing 7-year-olds. Schools are already awash with Reading Recovery or RR clones; the penny never drops, apparently.

Clackmannanshire 11-year-olds achieved comprehension results an average of 3 months ahead of their chronological age, in the second most deprived county in the country. I wonder whether Oxford, with a far more privileged population (and a few areas of deprivation), achieves stellar comprehension results in SATs, even with so many children remaining barely literate?

There was much humming and hawing and wringing of hands, and meetings to decide what was to be done. And, in spite of Oxford having some superb SP programmes, the Oxford group of eminent educationalists making up the literacy element of 'Oxford Inspires' has chosen Phonics Code X.

Nothing changes - just buckets more money spent and educationalists proud to continue blindly with their mixed-methods strategies.

Re: Prof. D.McG's analysis of the Project X CODE research report

Posted: Thu Mar 14, 2013 3:28 pm
by Susan Godsland
I've just found a booklet on the web which highlights the 'research' behind ProjectXCode ... ET_WEB.pdf

The booklet also promotes an arithmetic intervention, '1st class number' ... lassnumber

I've been reading on the SENCo forum that people attending their training are told not to use squared paper in arithmetic 'as it confuses children'.

EP Philip MacMillan took a quick look at the introductory web page for '1st class number'. He commented:
It states that pupils get 3 x 30 minutes of '1st class number' per week for 12-15 weeks whilst continuing with their normal maths lessons. This means they are getting 90 minutes more access to maths lessons/teaching per week than their 'normal' counterparts. 90 minutes out of a school working week is a substantial chunk of time. The web page goes on to say that the treatment group increased maths scores by 10 months in a term. No mention of the test used, no mention of how the 'normal' group did in the same time period, no spread of scores; so how do we evaluate the results quoted? Is it the method or the extra time that produces the increase in maths performance? If it is the method, what parts of the method account for the increase above and beyond the additional time? What would happen to the 'normal' group if they too had an extra 90 minutes per week of tuition? This all harks back to Reading Recovery research. Any information on the above will be gratefully received.

The problem with most educational research is that it lacks theoretical, experimental and statistical rigour. It tends to go for correlation rather than causation and to deal in qualitative outcomes. Qualitative methods have their place, and I use them myself, but to test theories you need numbers, i.e. quantitative approaches that stand up to scrutiny. No wonder the powers that be (HMG) cannot get it right.
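The extra-time confound MacMillan raises is easy to quantify from the figures he quotes (3 sessions of 30 minutes per week, over 12-15 weeks). A back-of-envelope check:

```python
# Figures as quoted from the '1st class number' web page above.
SESSIONS_PER_WEEK = 3
MINUTES_PER_SESSION = 30

extra_per_week = SESSIONS_PER_WEEK * MINUTES_PER_SESSION  # 90 extra minutes/week

# Total extra teaching time over the stated 12-15 week intervention.
for weeks in (12, 15):
    total_hours = extra_per_week * weeks / 60
    print(f"{weeks} weeks -> {total_hours:.1f} extra hours of maths teaching")
```

So the intervention group receives roughly 18 to 22.5 additional hours of maths teaching that the comparison group does not, which is exactly why the method and the extra time cannot be disentangled without a control group given equivalent time.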

Re: Prof. D.McG's analysis of the Project X CODE research report

Posted: Fri Mar 15, 2013 10:07 am
by Susan Godsland
Prof. Wheldall describes the booklet as ''an advertising brochure not a report. Cut the pretty pics and give us detail!''

How is a busy SENCo meant to choose an effective intervention when this is the kind of pap they get sent?

I'm looking forward to viewing the 4th edition of Greg Brooks' 'What works...' when it appears online soon.

Re: Prof. D.McG's analysis of the Project X CODE research report

Posted: Fri Mar 15, 2013 10:43 am
by Derrie Clark
Unfortunately, Susan, schools, politicians and policy makers don't have time to read, or to reach a good understanding of, the detail.

Re: Prof. D.McG's analysis of the Project X CODE research report

Posted: Sun Mar 17, 2013 11:41 pm
by Debbie Hepplewhite
In any case, many teachers might not understand serious research studies properly even if they did read them.
Whether or not this is the case, surely it does not absolve universities of their duty to provide more transparent information than we see in the glossy report from Edge Hill University.

Teachers may never take research seriously, get better at understanding 'serious research studies', or appreciate the need to look analytically at research, if it is not made readily available to them as a matter of course.