Memorandum submitted by Michael Crawshaw (CS 21)

 

New Impact Assessment - Same Old Mistakes

The author is a financial analyst and management consultant. He was formerly a Head of Research for Citigroup. He and his wife Carolyn have home educated for over ten years. They have five children.

 

The DCSF's final impact assessment of home education contains essentially the same errors and omissions as the first report. Diana Johnson and the DCSF were made aware of these mistakes in December 2009 but did not correct them. We trust that this is due to incompetence rather than deliberate misrepresentation.

 

There are at least six major flaws in the analysis (there are also several smaller ones which we leave aside). The first is subtle, but fundamental, so please persevere!

 

A summary of the mistakes

1. Double counting of lifetime earnings benefits.

2. Overstatement of their own contentious figures for the percentage of home educated children apparently receiving an 'unsuitable education'.

3. Use of subjective opinion from a self-selecting sample of LAs in place of proper research.

4. No account taken of independent academic research suggesting that home educated children, especially those of lower socio-economic groups, actually achieve better outcomes than their school-attending peers.

5. No account taken of the additional costs to the education budget of the inevitable increase in the school population.

6. No provision made or costs incorporated for genuine support to home educators (exam fees, tuition, labs, sporting facilities etc).

 

1. They have double counted the lifetime earnings benefits by confusing the analysis of individual children with the analysis of the population. They assume that the new monitoring and/or school attendance improves qualifications for the 20% of the home educated 'population' that they consider currently receives an unsuitable education[i], and accordingly grant that group a lifetime improvement in their salaries[ii].

 

But the assessment then also adds in benefits over the subsequent nine years to an additional population. There is no additional population: individuals come in and individuals go out, but the 'population' remains relatively static. It is possible to analyse benefits by looking at the population or at individuals. To do both, as has been built into the impact assessment, is double counting.

 

To clarify ... they could have done either of the following:

A: An analysis of the home education 'population'. Let's say it is 20,000. Take the 20% of the population who are said to be receiving an unsuitable education and fix the problem. Outcomes then improve for 4,000 of the population; sum their lifetime benefits from higher salaries. Compare this against the costs of monitoring the 20,000 population over the school-age lifetime of eleven years. This is close to the calculation made in the first part of the benefit analysis, called the 'one-off effect'. On its own that methodology would have been reasonable.

B: An analysis of 'individual children'. This would have to incorporate an investment period of a number of years in which the new measures were in place but had not yet produced any benefits. After this investment period you could then look at the roughly 2,000 children leaving home education each year and assume that 20% of them were going out into the world with better qualifications than they would otherwise have had. Calculate the lifetime benefits for this pool of 400 each year, perform the calculation for a number of years into the future, and sum the gains[iii]. This would then be compared to the costs of running the new regime over the same period (including the investment period, during which no benefits had yet accrued). This is similar to the calculation performed in the second part of their benefit analysis, entitled 'those benefiting in the following nine years'. That methodology on its own would have been reasonable.

 

But they can't include both[1]. To do so is double counting. The mistake was pointed out to the DCSF. They could not see that there was a problem.
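
For readers who prefer to see the arithmetic, here is a minimal sketch (in Python) of the two methodologies and of what happens when both are combined. It uses the illustrative numbers from footnote [1]: a static population of 20,000, an eleven-year school-age lifetime, and, for simplicity, every child assumed to benefit.

```python
# Minimal sketch of the double counting, using the illustrative numbers
# from footnote [1]: every child assumed to benefit, a static population
# of 20,000, and an eleven-year school-age lifetime.
POPULATION = 20_000
SCHOOL_YEARS = 11
LEAVERS_PER_YEAR = POPULATION // SCHOOL_YEARS  # roughly 1,818 per year

# Methodology A ('population'): a one-off improvement for everyone.
beneficiaries_a = POPULATION

# Methodology B ('individual children'): count each year's leavers; over
# eleven years of improved leavers this also sums to roughly 20,000.
beneficiaries_b = LEAVERS_PER_YEAR * SCHOOL_YEARS

# The impact assessment does both: the one-off 20,000 PLUS nine further
# years of leavers, crediting benefits to more children than exist.
beneficiaries_assessment = POPULATION + 9 * LEAVERS_PER_YEAR

print(beneficiaries_a)           # 20000
print(beneficiaries_b)           # 19998 (roughly 20,000)
print(beneficiaries_assessment)  # 36362 from a population of 20,000
```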

 

 

 

2. The error is compounded by a misinterpretation of the stated percentage of children believed to be receiving an unsuitable education. They say that LAs have told them that 20% of home educated children may be receiving an unsuitable education, and they have turned this into an assumption that 20% are receiving an unsuitable education.

But in fact Graham Badman's report (see table below) clearly shows the LAs judged only 5.3% to be receiving an unsuitable education. The other 15% needed to arrive at the 20% figure are simply children on whom the LA was not in a position to give a view because it was not in contact with the child: 5.8% who are not cooperating with assessments and 9.3% who have not yet been assessed. All we know about this 15% is that they have not yet been evaluated, yet the impact assessment assumes all of them are not receiving a suitable education. They may be NOT KNOWN to be receiving a suitable education, but the assessment cannot treat them as KNOWN to be NOT receiving a suitable education. So the figures have been overstated. The sketch after Table 1 makes the arithmetic explicit.

Table 1. Copied from Graham Badman's report

Extent of education received                                | Number of children | % of total
Not receiving any education                                 | 210                | 1.8%
Receiving education but not full-time or suitable education | 609                | 5.3%
Not co-operating with monitoring - no assessment made       | 656                | 5.8%
Not yet assessed                                            | 1,063              | 9.3%
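
A short sketch of the overstatement, using the Table 1 percentages (the 1.8% receiving no education is treated separately, as footnote [i] notes):

```python
# Decomposing the ~20% figure using the Table 1 percentages.
judged_unsuitable = 5.3   # receiving education, but not full-time or suitable
not_cooperating = 5.8     # no assessment made
not_yet_assessed = 9.3    # no assessment made
# (The 1.8% receiving no education is handled separately; see footnote [i].)

status_unknown = not_cooperating + not_yet_assessed   # simply not evaluated
treated_as_unsuitable = judged_unsuitable + status_unknown

print(f"Judged unsuitable by LAs:             {judged_unsuitable:.1f}%")
print(f"Not assessed at all (status unknown): {status_unknown:.1f}%")        # 15.1%
print(f"Assumed unsuitable by the assessment: {treated_as_unsuitable:.1f}%") # 20.4%
```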

 

3. They persist in using subjective opinion in place of fact. We continue to challenge the validity of using LA opinions that 5.3% of home educated children do not receive a full-time or suitable education and that 1.8% do not receive any education. By Graham Badman's own admission, many LA officers do not understand the different nature of home education and so will assess the provision inaccurately. Moreover, nobody has yet provided a definition of what constitutes a 'suitable education'. Until that is done, any opinions on the subject are irrelevant. These figures should be inadmissible.

4. They refuse to incorporate into the impact assessment the body of independent academic research suggesting that home educated children, especially those of lower socio-economic groups, actually achieve better outcomes than their school-attending peers do. On that evidence there are no improved outcomes to be achieved, no lifetime benefits to be calculated, and no problem to fix.

Diana Johnson explained (in a letter to Graham Stuart in December 2009) that she would not accept the findings of this research because "the sample base did not consist of all home educated children and has of necessity been drawn from those who are motivated to participate." But she had no such reservations about accepting the opinions (not research) of a self-selecting sample of just 74 Local Authorities.

5. The impact assessment does not include the additional costs to the education budget of the inevitable increase in the school population: local authorities will use their new powers to order some home educated children into school and to prevent some children leaving school to be home educated, and a number of home educating parents (who feel demoralised and disempowered by the transfer of responsibility for education from parent to state) will give up on home education and send their children to school.

Diana Johnson wrote (in the same letter to Graham Stuart) that these costs were not included because the numbers are estimated to be very small. Yet over time the number could quite easily reach 10,000 children (out of a possible home education population of 20,000-80,000). The cost of educating those children at school would be over £100m pa.
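
The order of magnitude is easy to check. The per-pupil figure below is our own round assumption for illustration, not a DCSF number:

```python
# Rough check of the schooling cost omitted from the assessment.
# The per-pupil cost is an assumed round figure, NOT a DCSF estimate.
children_returning = 10_000
assumed_cost_per_pupil = 10_000  # GBP pa, hypothetical illustration
annual_cost = children_returning * assumed_cost_per_pupil
print(f"GBP {annual_cost / 1e6:.0f}m pa")  # GBP 100m pa
```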

The cost could even run into the hundreds of millions under an 'extinction' scenario in which the changes collapse the home education community in England. This scenario is not as outlandish as it might first appear. The home educating community is small (one might think of it as an already endangered species). If the population falls further, it becomes more difficult for parents to share the costs of tutoring, educational visits and social events; to find sufficient numbers of children to make classes and events viable; and to find other home educating parents with whom to exchange lessons, gain emotional support and spread best practice. It also becomes more difficult and less attractive for the child to be home educated as the population falls (fewer home educated friends, fewer social and sporting activities, fewer group classes etc). Self-reinforcing feedback loops come into play that could drive the home educated population towards extinction.

6. The impact assessment does not include any costs for provision of genuine support to home educators (exam fees, tuition, science labs, sporting facilities etc). Diana Johnson wrote in the letter to Graham Stuart that she expects that support to cost £20m pa, yet this has not been included in the impact assessment. If the home education population is 20,000, that amounts to £1,000 per child per year. The impact assessment uses an upper-range population of 80,000, where the costs would then be £80m pa. These clearly significant costs have been omitted from the impact assessment.
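
Again the arithmetic is simple, using only the DCSF's own £20m estimate and the two population figures from the impact assessment:

```python
# Scaling the DCSF's own support-cost estimate to both population figures.
support_cost_pa = 20_000_000          # GBP 20m pa, the DCSF's estimate
base_population = 20_000
per_child_pa = support_cost_pa / base_population   # GBP 1,000 per child pa

upper_population = 80_000             # the assessment's upper-range figure
upper_cost_pa = per_child_pa * upper_population    # GBP 80m pa

print(f"GBP {per_child_pa:,.0f} per child pa")
print(f"GBP {upper_cost_pa / 1e6:.0f}m pa at the 80,000 population")
```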

Corrections to the cost-benefit analysis in the impact assessment

 

We have performed two recalculations. In the first we removed the double counting of benefits (taking only the 'population' estimates) and then brought those figures down to roughly a quarter of the estimate, to reflect the fact that only 5.3% were found to be receiving an unsuitable education, not 20%.

In the second recalculation we also recalculated the costs to include £20m pa of support costs, as estimated by the DCSF for the 20,000 population (and thus £80m pa for the 80,000 population). We did not feel it prudent to guess and factor in the additional cost to the state from a rise in the state school population (though over ten years that figure could run into the billions).

As the recalculations show, under every scenario the net present value is negative, i.e. a cost to the state of anywhere up to £1bn. The sketch following Table 4 reproduces the arithmetic.

Table 2. DCSF impact assessment cost-benefit summary (£m)

EHE population | Total 10 year benefits | Total 10 year costs | Net benefit
20,000         | 332.8                  | 137                 | 195.8
80,000         | 1333                   | 567                 | 765.6

Table 3. Corrected DCSF impact assessment, version 1 - author's recalculations (removing double counting and using 5.3% rather than 20% for children receiving an unsuitable education) (£m)

EHE population | Total 10 year benefits | Total 10 year costs | Net benefit (i.e. net cost where negative)
20,000         | 75.2                   | 137                 | -62
80,000         | 300.8                  | 567                 | -266

Table 4. Corrected DCSF impact assessment, version 2 - author's recalculations (removing double counting, using 5.3% rather than 20% for children receiving an unsuitable education, and including the extra costs of support) (£m)

EHE population | Total 10 year benefits | Total 10 year costs | Net benefit (i.e. net cost where negative)
20,000         | 75.2                   | 337                 | -262
80,000         | 300.8                  | 1367                | -1066
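
The net figures in Tables 2-4 can be reproduced in a few lines. This is a sketch only: the published figures round the underlying numbers, which is why the output shows 766.0 where Table 2 prints 765.6.

```python
# Reproducing the net figures in Tables 2-4 (all in GBP m, ten-year totals).
# Table 4 adds ten years of support costs: 20m pa and 80m pa respectively.
tables = {
    "Table 2 (DCSF as published)": [(332.8, 137), (1333.0, 567)],
    "Table 3 (double counting removed, 5.3% not 20%)": [(75.2, 137),
                                                        (300.8, 567)],
    "Table 4 (as Table 3, plus support costs)": [(75.2, 137 + 10 * 20),
                                                 (300.8, 567 + 10 * 80)],
}
for name, rows in tables.items():
    for benefits, costs in rows:
        print(f"{name}: net = {benefits - costs:+.1f}")
```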

 

 

January 2010

 



[1] To illustrate, for those still struggling to grasp this subtle but fundamental point: assume for the moment that all home educated children are receiving an inadequate education and all get 'corrected' under the new system. If the home educated population is 20,000 then we should have 20,000 lifetime earnings improvements, to be compared against the costs of monitoring this 20,000 population over the school-age lifetime. But under the methodology of the impact assessment, in year one alone we would get 20,000 lifetime earnings improvements, and then in year two we would find another population of 1,818 (that is, 20,000 divided by eleven school years) with better lifetime earnings. In year three we would find another 1,818, and so on for nine years. So after ten years of implementation we would have a population of over 36,000 with better lifetime earnings, yet our 'problem' population is only 20,000! Still not sure? Are you still worrying about those new individual children coming into home education each year? You want to include them? Then you need to look purely at individual children over time. In year one there are no gains to be measured: each child has had one visit from an LA inspector, and you cannot now grant them a lifetime of earnings improvements. You can do this for the pool of 1,818 children leaving home education in, say, five years' time, and for each year thereafter for as many years as you care to project. After the five-year investment period and an eleven-year run of improved leavers, you would have improved outcomes for 20,000 individual children. All well and good. But under the methodology of the impact assessment you would also have improved outcomes for 20,000 in year one alone - a total of 40,000. Double counting again. You have to look at one or the other - population or individual children - not both.

 

 



[i] They also double count in their analysis of the 1.8% of the population that they believe is receiving no education, but for the sake of simplicity we leave that aside.

[ii] In fact the calculation awards benefits to only 46.8% of the group, which is the National Average Achievement; but that is a technicality which might be confusing, and it has no bearing on the flawed methodology.

[iii] In practice you would discount the sums back to a present value and add up the gains, in the manner the assessment has correctly done, but for the sake of simplicity we can also leave that aside.
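
For completeness, a minimal sketch of the discounting this footnote describes. The 3.5% rate is HM Treasury's standard Green Book discount rate, assumed here purely for illustration; we do not claim it is the rate the assessment used.

```python
# Present-value discounting as footnote [iii] describes. The 3.5% rate is
# the standard HM Treasury Green Book rate, assumed purely for illustration.
def present_value(annual_gains, rate=0.035):
    """Discount a stream of annual gains (year 1, year 2, ...) to today."""
    return sum(gain / (1 + rate) ** year
               for year, gain in enumerate(annual_gains, start=1))

# Ten years of a constant gain of 10 (GBP m) is worth ~83.2 (GBP m) today.
print(round(present_value([10] * 10), 1))
```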