Memorandum submitted by William Wallace
A report on the Questionnaire to Local Authorities prepared on behalf of AHEd 'Action for Home Education'
William Wallace, BSc, MSc, MPhil, FSS, AFIMA
has served as Local Government Statistician, Research Fellow and Senior University Lecturer in Statistics
1. The lack of consistent definitions used by Local Authorities in recording client transactions would militate against making sensible comparisons. Far more rigorous preparation of the groundwork for data collection would be needed for sound comparability than has been demonstrated in this attempt. The poor response to the questionnaire shows clearly that local authorities are not confident in presenting their data for purposes other than the daily client management for which it was intended.
2. There has been some suggestion in the Review report and subsequent discussion that the risk of child abuse is higher within the home education sector. No evidence of this has been shown.
3. The report has shown an inappropriate use of one measure instead of the target measure. That is, using 'known to Social Services' rather than recorded abuse (often termed an error of operationalisation).
4. The data collection approach has used a non-probability sample, so no standard errors or confidence intervals can be computed; at best only some qualitative value may be obtained. This is an error in sample selection.
5. The methodology used to produce an estimate of the proportion of EHE children known to Social Services, disclosed through FOI, does not stand up as statistically plausible or acceptable.
6. It has come to light that G Badman has been granted extra time to collect more data from local authorities. The data collection methodology for this shows very serious flaws that would render the results worthless.
A. The report on Review of Elective Home Education in England presents: 'First, on the basis of local authority evidence and case studies presented, even acknowledging the variation between authorities, the number of children known to children's social care in some local authorities is disproportionately high relative to the size of their home educating population.' No valid supporting evidence has been shown for this statement. In fact, through FOI disclosures, it would appear that the opposite is the case.
B. We have been shown a calculation presented by the DCSF which is an attempt to produce an estimate of the proportion of EHE children known to Social Services. The method is not statistically valid: using a sample median to gross up to a national value requires that all LA's have the same number of EHE children, which they do not. The statistical error introduced by doing this is not easily estimated, but suffice it to say that the standard error of such an estimate would be so large as to render the statistic worthless.
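The flaw in the median-based gross-up can be illustrated with a small worked example. The per-LA counts and EHE populations below are entirely hypothetical (the real figures were not published), but they show how a median multiplied across unequal authorities diverges from the quantity actually of interest:

```python
import statistics

# Hypothetical counts of EHE children known to Social Services per LA,
# alongside each LA's (very unequal) EHE population. These figures are
# illustrative only and are NOT taken from the Review.
known_to_ss = [2, 3, 3, 4, 40]           # per-LA counts known to SS
ehe_population = [50, 60, 80, 100, 900]  # per-LA EHE children

n_las = 150  # notional number of LA's nationally

# Median-based gross-up, of the kind the DCSF calculation appears to use:
# this treats every LA as though it matched the median authority.
median_estimate = statistics.median(known_to_ss) * n_las  # 3 * 150 = 450

# A ratio estimate over the same data uses the actual totals instead:
proportion = sum(known_to_ss) / sum(ehe_population)  # 52 / 1190

print(median_estimate)  # 450
print(proportion)       # ~0.044
```

Under these made-up numbers the median-based figure bears no fixed relationship to the true proportion; the larger the variation in EHE populations between authorities, the worse the distortion becomes.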
C. Also quoted are 477 registered EHE children known to Social Services (SS) in the 25 LA's that responded, out of the 90 asked to respond as part of the review. We have no way of knowing how representative the 25 are of the 150 LA's, and this needs to be checked before any statistics can be quoted. Are we comparing like with like? If we were confident that these 25 reasonably reflected the total 150, then we might take the 477 and divide by the total number of EHE children in the 25 LA's. This would give us an estimate of the proportion of EHE children known to SS. This is what they could have done but did not.
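The arithmetic of that alternative estimate is straightforward. The total number of EHE children in the 25 responding LA's was not published, so the value below is a hypothetical placeholder used purely to show the calculation:

```python
# Ratio estimate sketched in the paragraph above.
known_to_ss = 477     # figure quoted for the 25 responding LA's
ehe_total_25 = 6000   # HYPOTHETICAL total EHE children in those 25 LA's

# Proportion of EHE children known to SS, conditional on the 25 LA's
# being representative of the 150 -- which remains unverified.
proportion_known = known_to_ss / ehe_total_25

print(proportion_known)
```

Even this simpler estimate would carry the caveat that, coming from a non-probability sample, it admits no standard error or confidence interval.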
D. There are at least two main flaws to be noted:
1. The inappropriate use of one measure instead of the target measure. Using 'known to SS' rather than recorded abuse (often termed an error of operationalisation).
2. Using a non-probability sample, so no standard errors or confidence intervals can be computed; at best only some qualitative value may be obtained. This is an error in sample selection.
E. It has come to light that G Badman has been granted extra time to collect more data from local authorities. An example of a requested piece of information is shown below (taken from the published proposed local authority questionnaire). Notwithstanding the earlier references to inadequacies in data definitions at local authority level, it can be seen that an education might be deemed inadequate merely because it has not yet been assessed, or a disagreement might cause a case to appear to be one of non-cooperation. Such blatant errors in data collection cannot be tolerated. Allowing such distortion would bring into disrepute any attempt to clarify suitable education. The lack of consistent definition across local authorities that would result is, quite frankly, unbelievable.
F. I can say without any hesitation that the information on methodology casts grave doubt on any use of the results from the Badman Review, or its follow up, to sensibly inform government decisions with respect to elective home education.
ATTACHMENT CASE STUDY
A Critical Assessment of the Additional Data Collection from LA's by G. Badman for the Select Committee.
William Wallace, BSc, MSc, MPhil, FSS, AFIMA
Has served as Local Government Statistician, Research Fellow and Senior University Lecturer in Statistics
1. The sample of local authorities used by Mr Badman in his additional data collection and analysis is not representative of all local authorities and needs some post-hoc stratification adjustment before comparison of statistics can be made. This renders the child protection plan (cpp) comparisons between elective home educated (EHE) children and all children unreliable. Special care is needed since the proportions being compared are particularly small. As is noted in paragraph 3 below, some authorities have been using cpp's as a way of forcing EHE children back to school. With such small numbers involved in EHE, and certainly in cpp's, any distortion in the representativeness of the sample is likely to result in statistical error.
2. The response categories for the question on inadequate education do not all relate to inadequate education and suggest some negative bias against elective home education.
3. On the point of School Attendance Orders (SAO's) no comparison is made with national figures, so little can be said on this issue. It is known that some local authorities are using cpp's as a device to force children back to school rather than using the SAO. This is seen as a misuse of proper procedures and may present cases for maladministration. This aspect is discussed further in my Case Study, which is attached to this critical assessment.
4. The use of Connexions service administration (CCIS) data for NEET comparison is flawed since it has resulted in a serious data loss. It is generally not safe to report findings when the response rate is so low.
5. The hastily constructed runaways question needs further attention.
6. A case study is attached below. This study presents a documented case of a local authority that inappropriately used a cpp as a device to attempt to force EHE children back to school. There was no welfare issue, and the only ground for the issue of a cpp, through emotional abuse, was that the children were electively home educated. The issue of a SAO was repeatedly threatened but not carried out, no doubt because one would not have succeeded since a suitable education was being delivered. The cpp was subsequently removed after the case was thrown out at Court. This case will have cost the taxpayer hundreds of thousands of pounds.
Representativeness of the new sample of LA's
G. Badman maintains in section 5 that the responding 74 LA's (49% of all English LA's) present a representative sample of all LA's in the country. We are not advised on what criteria this claim is based. We will now show that it is far from being the case.
The 74 (49%) responding LA's reported 11,700 elective home educated children of statutory school age. These LA's account for 4,303,700 school children (mid-year population 2008 from ONS) according to Mr Badman. The mid-2008 population estimate from ONS for the number of children of statutory school age across all English local authorities is 7,876,700. If we gross up the school age population from the sample using the fraction .49 we have a national total estimate of 8,783,000, which is too large. This suggests that the sample is population 'top heavy' and contains more large LA's than it should to be representative.
The responding local authorities noted there to be 10,025 child protection plans (cpp's) for children of statutory school age, and this yields a national total of 20,461 child protection plans when the grossing up fraction of .49 is used. This figure needs to be compared with the 15,713 taken from DCSF SFR Local Authority Tables, Referrals and Assessments of Children in Need, Year ending March 2008. Again the estimate is much larger than the actual figure.
On these two counts the representativeness of Mr Badman's sample is questioned, since there is such a large difference between the actual 15,713 cpp's and those predicted from the sample, and the estimated school age populations also differ by almost one million children.
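The two grossing-up checks above can be reproduced directly from the figures quoted in the text, using the stated sampling fraction of .49:

```python
# Consistency checks on the two grossing-up calculations, using the
# figures quoted in the text above.
fraction = 0.49  # 74 responding LA's out of all English LA's

sample_school_age = 4_303_700    # school-age children in the 74 LA's
national_school_age = 7_876_700  # ONS mid-2008 estimate, all English LA's
estimated_school_age = sample_school_age / fraction
print(round(estimated_school_age))  # ~8,783,061 -- c. 900,000 too many

sample_cpps = 10_025     # cpp's reported by the responding LA's
national_cpps = 15_713   # DCSF SFR, year ending March 2008
estimated_cpps = sample_cpps / fraction
print(round(estimated_cpps))        # ~20,459 -- far above the actual 15,713
```

Both grossed-up estimates overshoot their known national totals by a wide margin, which is the basis of the 'top heavy' conclusion.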
Mr Badman's sample is population 'top heavy', suggesting that it contains an excess of large local authorities. The 'behaviour' of large local authorities is likely to differ from that of smaller local authorities for obvious reasons. This can distort comparisons in a number of ways; that is why representative samples are preferred. Where a sample is not representative it is necessary to perform some post-hoc stratification adjustment. Since I am not in possession of the detailed dataset I have not been able to perform the further analysis and adjustment that is needed.
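In outline, a post-hoc stratification adjustment of the kind referred to above reweights each stratum of the sample by its known share of the national population rather than its (distorted) share of the sample. The strata and all figures below are hypothetical, since the detailed dataset was not available:

```python
# Post-stratification sketch. All stratum figures are HYPOTHETICAL.
# Each stratum contributes its within-stratum cpp rate, weighted by
# that stratum's known share of the national school-age population.

strata = {
    # stratum: (sample school-age pop, sample cpp's, national pop share)
    "large LA's": (3_000_000, 7_500, 0.40),
    "small LA's": (1_300_000, 2_500, 0.60),
}

national_pop = 7_876_700  # ONS mid-2008 estimate, all English LA's

adjusted_cpps = 0.0
for name, (pop, cpps, national_share) in strata.items():
    rate = cpps / pop  # cpp rate within the stratum
    adjusted_cpps += rate * national_share * national_pop

print(round(adjusted_cpps))  # reweighted national cpp estimate
```

A crude gross-up, by contrast, implicitly weights each stratum by its sample share, so an over-representation of large LA's feeds straight through into the national estimate.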
The question asked of local authorities is given below. It may be observed that only the first category identifiably relates to an inadequate education, since none is being given. All the other response categories could reflect disputed, disagreed, or not-yet-completed classifications of the education provided. Local authorities have been interpreting government guidance on EHE in a variety of ways which may be at variance with the aims or wishes of parents. Even Mr Badman has used terms such as monitoring and assessment which are not appropriate when viewed alongside government guidance documentation and the Education Act in relation to EHE. The questions as asked appear to pre-empt any sensible dialogue concerning what is 'full time' and 'suitable'. There has been much disagreement between parents and local authority personnel on the autonomous approach to learning. This has in some cases led to attempts to force children back into school because local authority personnel were not familiar with this learning approach or were sceptical of its value.
The question as asked of LA's:
Please can you provide information about the number of electively home educated children of compulsory school age not receiving a suitable education:
Number not receiving any education
Number receiving some education but not a full time education
Number receiving a full time but not 'suitable' education
Number not cooperating with monitoring so no assessment can be made
Known to home educate but no assessment yet
School Attendance Orders
Firstly, no comparison is made with national figures, so little can be said on this issue. It is known that some local authorities are using cpp's as a device to force children back to school rather than using the School Attendance Order. This is seen as a misuse of proper procedures and may present cases for maladministration. This aspect is discussed further in my Case Study, which is attached.
NEETS and Runaways
With the NEET results it needs to be asked what has become of the children from the non-responding LA's. We are not informed of the extent of this data loss. While 47 LA's responded to this question, they account for only 1,220 elective home educated children. Are these children from only one year cohort? The Connexions service administration data (CCIS) relates only to those young people whose status is known to Connexions, so it is unlikely to be appropriate for home educated children. It is generally not safe to report findings when the response rate is so low. With some LA's showing 100 percent or close to 100 percent of EHE children who were NEET, the figures do not look plausible and again point to variable definitions and poor quality database management by LA's. I have already alluded to the fact that LA's maintain transactional databases suitable only for case management, and these may not be fit for the purpose of comparative statistical reporting. What is certain is that it is not justified to compare the 22% NEET for EHE children with the national figure of 5.2%. We have no way of knowing what relationship this small number of EHE children had with the Connexions service, but we are sure that it needs exploring before any attempt at comparison can be made.
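The effect of the data loss can be bounded with a simple worst-case calculation. The national total of EHE children in the relevant cohort is unknown, so an illustrative figure is assumed below; under such assumptions, the unobserved children could be anywhere from all NEET to none NEET, and the reported 22% tells us very little:

```python
# Worst-case (Manski-style) bounds on the NEET proportion under
# nonresponse. The 1,220 observed EHE children and the 22% NEET rate
# are from the text; the assumed national total is HYPOTHETICAL.
observed_children = 1_220
observed_neet = round(0.22 * observed_children)  # = 268 children

assumed_national_ehe = 20_000  # HYPOTHETICAL national EHE total
unobserved = assumed_national_ehe - observed_children

# Bounds: every unobserved child NEET, versus none of them NEET.
upper = (observed_neet + unobserved) / assumed_national_ehe
lower = observed_neet / assumed_national_ehe

print(f"bounds on national EHE NEET rate: {lower:.1%} to {upper:.1%}")
```

With this level of nonresponse the bounds span nearly the whole range from roughly 1% to over 95%, so the observed 22% cannot meaningfully be set against the national 5.2% without evidence about the missing authorities.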
The definition of runaways (missing children) is not given and so is obscure. We are not sure whether LA's also found this to be the case. It seems very likely that we have another situation of variable and unsure definitions. The often used expression 'garbage in, garbage out' is appropriate here. To say that further scrutiny is required begs the question of why these results have been published. Further research is needed on this question to ensure stability and comparability. There is a serious loss of credibility in this additional piece of work on these two aspects alone.
 Not published.