Memorandum submitted by Tania Berlow, Jacquie Cox, and Dr Ben Anderson (CS 22)
This submission concerns Schedule 1 of the Children, Schools and Families Bill, which contains the proposed legislation to register, monitor, and support Elective Home Education (EHE).
We have been advised that, since the 'Bullet Point Critique' and Dr Ben Anderson's 'Critique of the Children's, Schools, and Families Impact Assessment' have been published on the internet, they cannot be submitted to the Scrutiny Committee.
Here are the links to the two previously published documents. The online documents contain all the necessary references.
Bullet Point Critique
Children Schools Families Bill Impact Assessment
The linked documents show specifically why the Impact Assessment figures regarding the numbers of children not receiving a suitable education, and the percentages of NEET young people, are incorrect. The recently released revised Impact Assessment still incorrectly uses a figure of 20% for EHE children who 'may' not be receiving a suitable education. Dr Anderson's critique of the original Impact Assessment shows how using the correct 7.1% (the 1.8% + 5.3% rates), rather than the erroneous 20% figure, results in a negative cost/benefit scenario when looking at the possible savings to society from having a child who can attain '5 GCSE's'. Note that the scenario below does not include the costs of issuing SAO's, the costs of a PRU place for each child returned to school, costs for training, SEN provision or exams, nor other possible 'benefits' for home educators.
Numbers of children who 'may' not be receiving a suitable education.
In his letter to Barry Sheerman, Chairman of the Select Committee dated October 2009, the Review author states that 1.8% of registered EHE children were considered as not receiving any education - a figure that comes from the supplemental questionnaire data submitted by 74 LA's.
In this letter the Review author quotes a further 5.3% not receiving a suitable or a full time education. The raw data returns show that the rate for education considered 'not suitable' was fewer than 2%, whereas 'not full time' was 3.35%.
The guidelines on Elective Home Education specifically state that 'full time' school hours do not apply to EHE. However, the initial questionnaire sent to Local Authorities stated 'full time' to be 20 hours per week. Therefore, included in the 3.35% 'not full time' data are answers from Local Authorities who are redefining guidelines and law. Many home educators cannot and do not segment their child's learning experiences into timetabled units.
These three categories - 'no education', 'not full time', and 'not suitable' - when added together give a 7.1% figure where the LA's state some concern about educational provision. In his letter to the Select Committee, the Review author goes on to detail a further 5.8% 'not co-operating' with monitoring and 9.3% 'not yet assessed'.
The original Impact Assessment (page 85) states that 8% are receiving no education, adds together the figures from all four categories above, and rounds the total down, stating that '20% are not receiving a suitable education.' These figures are inaccurate, yet they have been used as the basis for estimating the financial costs and benefits of the proposals in the CSF Bill. In the revised Impact Assessment the DCSF continues to refer to the erroneous 20% figure, although the correct figure of 1.8% indicated by the responding LAs is finally used.
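The arithmetic behind the two competing figures is simple enough to set out explicitly. The sketch below (in Python, using only the category rates quoted above; the variable names are ours) shows how the 7.1% concern rate and the inflated total rounded to '20%' each arise:

```python
# Category rates quoted from the Review author's letter
# (all figures are percentages of registered EHE children).
no_education = 1.8        # receiving no education at all
not_suitable_or_ft = 5.3  # not suitable / not full time
not_cooperating = 5.8     # 'not co-operating' with monitoring
not_yet_assessed = 9.3    # 'not yet assessed'

# Only the first two categories describe an actual finding about provision.
concern_rate = round(no_education + not_suitable_or_ft, 1)   # 7.1

# Summing all four categories - including families not yet even assessed -
# gives the inflated total that the Impact Assessment rounds down to '20%'.
inflated = round(no_education + not_suitable_or_ft
                 + not_cooperating + not_yet_assessed, 1)    # 22.2
print(concern_rate, inflated)
```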
The parents 'not co-operating' are not necessarily families who have refused to have any contact with the LA (it is considered unwise to do so), but include those who submit written plans and philosophies. Many LAs, in keeping with current law and guidelines, would not consider such families to be non-co-operative. However, many LA's do consider families who do not wish a home visit to be non-co-operative.
The 9.3% who were 'not yet assessed' are also included in the DCSF Impact Assessment as possibly 'not receiving a suitable education'. The supplemental data were requested at the beginning of a new academic year, and most LA's had numerous children they had not yet processed. It is simply illogical to include the un-assessed children in the data for those apparently 'not receiving a suitable education'.
In the weeks before the supplemental questionnaire was sent out to LA's, home educators themselves submitted 152 Freedom of Information requests asking LA's to detail their rates of educational concern and the reasons for those concerns. Across the 142 LA's that responded, the total concern rate gathered was 6%, and this included a few LA's who 'redefine' the guidelines.
It is evident from Home Educators' own Freedom of Information requests and the supplemental questionnaire that LAs are not using the powers they already have when they have educational concerns, i.e. by issuing School Attendance Orders. Only approximately 8% of 'concerns' result in SAO's being issued, but the rate varies substantially across LAs, with over three-quarters of those with concerns issuing no SAO's, even where concern rates are relatively high.
The NEET statistics.
During oral evidence to the Scrutiny Committee yesterday, Chloe Watson commented that she is counted in the NEET statistics. The relevant section of the 'Bullet Point Critique' (see link above) explains why Chloe Watson was 'almost' correct. Sadly, as far as the statistics used in the Review are concerned, her 'achievements' and those of many home educated young people are completely excluded from the calculations when assessing NEET figures.
If 10 out of 15 young people (equally well adjusted, qualified and productive, like Chloe) did not continue any contact with the LA or Connexions, and 5 young people did have contact, the LAs would only have commented on the 5 they knew about. If only one of the five were indeed NEET, that would appear in the Review statistics as 20% - when in reality it would be only one of the 15 total leavers. This is obviously a much smaller percentage (6.7%) and more in line with the regional averages for NEET.
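The worked example above can be sketched directly (the 15-leaver cohort is hypothetical, as in the text):

```python
# Hypothetical cohort from the example above: 15 EHE leavers in total,
# of whom only 5 were known to the LA/Connexions, and 1 of those was NEET.
total_leavers = 15
known_to_la = 5
neet = 1

# The rate as it would appear in the Review: NEET among known leavers only.
review_rate = 100 * neet / known_to_la            # 20.0
# The true rate across all leavers, including those invisible to the LA.
true_rate = round(100 * neet / total_leavers, 1)  # 6.7
print(review_rate, true_rate)
```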
The Review author uses the Statistical First Release data to compare with the answers given by the Local Authorities who use the regional Connexions data (CCIS) for Electively Home Educated leavers. The SFR rate is lower than the CCIS rate. The CCIS rate spikes during the summer period as young people leave school and await placements that start in October. The data from the supplemental questionnaire which informed the CSF Bill were collected in September and therefore will be higher.
The figure quoted by the Review author of 5.2% is the SFR data for 16 year olds only, and does not take into account the seasonal spike. Therefore any EHE young people who were no longer of compulsory education age and who did not care to inform Connexions of their plans were counted by some of the responding LAs as NEET.
Many EHE young people do go to college to take GCSE exams and would also be counted as NEET while they were awaiting placements. A proportion of EHE young people would simply continue learning at home. They too will have been counted as NEET by some of the responding LAs.
In the supplemental returns, only 36 of the 74 Local Authorities answered this question. No regional comparisons were made for these 36 LA's to see if they were comparable to the CCIS national averages.
The responding 74 LAs did not know about many of the Electively Home Educated leavers, as it is not compulsory for EHE families to register with Connexions. Those who chose not to register with Connexions, or who were simply continuing with home education, were not counted in the total number of leavers. The LA's therefore only commented on those they knew about. For example, to highlight the lack of professionalism in the reasoning that forms the backbone of the CSF Bill, one LA commented that 23% of its 4 leavers were NEET. This equates to 92% of one whole leaver. Percentages are covered in Year 5 of the National Curriculum (which Home Educators do not have to follow).
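The arithmetic of that single return, set out explicitly as a minimal sketch:

```python
# One LA reported that 23% of its 4 leavers were NEET.
leavers = 4
neet_reported = round(0.23 * leavers, 2)
print(neet_reported)  # 0.92 - i.e. 92% of one whole leaver
```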
Supplemental data about the number of Child Protection Plans (CPP's) (CPR3 Part B Child protection register) on 31st March 2009 revealed 51 such plans among 11,700 EHE children aged 5 to 17 (0.4%).
In the Review author's letter to Barry Sheerman, this figure was compared to a national figure for 5-17 year olds taken from the Statistical First Release 2009.
The Statistical First Release for 31st March 2009 shows that the national figure for all children aged 0-18 (i.e. including children aged 17) is 0.31%, which is 34,100 CPP's in 11 million children.
Nationally, for children aged 5-17, there were 18,590 CPP's amongst 7,201,200 children, which is 0.26%.
Only 20 of the 74 responding Local Authorities had any EHE children with child protection plans.
1 LA had 8 CPP's
1 LA had 6 CPP's
1 LA had 5 CPP's
1 LA had 4 CPP's
2 LA's had 3 CPP's
8 LA's had 2 CPP's
6 LA's had 1 CPP
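The distribution above, and the rates quoted in this section, can be checked with a short sketch (all figures are as quoted in the text):

```python
# (number_of_LAs, CPP's_each) - the distribution listed above.
distribution = [(1, 8), (1, 6), (1, 5), (1, 4), (2, 3), (8, 2), (6, 1)]

total_las = sum(n for n, _ in distribution)        # 20 LAs
total_cpps = sum(n * c for n, c in distribution)   # 51 CPP's
print(total_las, total_cpps)

# Rates quoted in the surrounding text (as percentages).
ehe_rate = round(100 * 51 / 11_700, 1)               # 0.4 (EHE, aged 5-17)
national_0_18 = round(100 * 34_100 / 11_000_000, 2)  # 0.31 (national, 0-18)
national_5_17 = round(100 * 18_590 / 7_201_200, 2)   # 0.26 (national, 5-17)
print(ehe_rate, national_0_18, national_5_17)
```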
The 0.4% rate of EHE children with CPP's does not take into account that one large family could account for all the CPP's in some of the figures mentioned. When dealing with small sample sizes the figures become skewed - in a small LA, for example, a single large family could account for the entire rate.
According to the Laming report, for the entire year 2008, across 11 million children nationally, there were 29,000 CPP's in place (0.26%) and a further 37,000 Care Orders (0.34%).
On 27 March 2009, home educators submitted Freedom of Information requests to all 152 LA's to obtain data on substantiated abuse or neglect cases within the EHE community. This would include cases of children taken into care. Analysis of the returns shows that the total rate for 129 Local Authorities who provided data was 0.31%.
Both Home Education witnesses said in oral evidence that they cannot see, and have not seen, any evidence from the DCSF on which they can rely. This is because, through our work, they can look at the analysis and click on a link which takes them directly to each Freedom of Information response from each Local Authority. We know what 64 of the 74 Local Authorities actually said. We have 20 of the 25 responses from the LAs used for the actual Review statistics, before the Review author had to gather more evidence through special permission from the Star Chamber.
This more specific evidence cannot be submitted to the Scrutiny Committee today, as it has been published far and wide on the web. The information was published in order to reveal, to those interested, what the real situation is and what misinterpretations have been used. If the Home Educating community had any faith in the accuracy of the statistics used in the Review, we would agree that there was cause for concern.