120,000 deaths revisited…

Although I had intended to move on from this topic, a couple of things seem worth updating.

The first of these is a likely explanation for the discrepancy in the scale of age-standardised values in the Watkins paper, noted in my previous blog. I am grateful to colleagues at Public Health England for suggesting the possibility that the authors have mistakenly used the 1976 European Standard Population (ESP) rather than the current, 2013 revision. This looks plausible, although the difference is surprisingly large. Here is a useful Scottish paper on the change between those versions of the ESP. The newer version gives more weight to older age groups, reshaping the standard population pyramid to reflect the ageing of recent decades.

[Figure: comparison of the 1976 and 2013 European Standard Populations]
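To make the mechanics concrete, here is a minimal sketch of direct age standardisation in Python. The rates and weights below are illustrative placeholders rather than the published ESP values; the point is simply that a standard population weighted towards older ages, where death rates are highest, yields a noticeably higher age-standardised rate from exactly the same underlying data.

```python
# Minimal sketch of direct age standardisation. The figures are illustrative
# placeholders only; substitute the published ESP 1976 / ESP 2013 weights and
# real age-specific death rates to reproduce actual ASDR values.

def asdr(age_specific_rates, standard_weights):
    """Directly age-standardised death rate per 100,000.

    age_specific_rates: deaths per 100,000 in each age band of the observed population
    standard_weights:   size of the corresponding age bands in the standard population
    """
    weighted = sum(r * w for r, w in zip(age_specific_rates, standard_weights))
    return weighted / sum(standard_weights)

# Three illustrative age bands (young / middle / old) with rising death rates:
rates = [50.0, 400.0, 5000.0]             # deaths per 100,000 in each band
younger_standard = [40000, 40000, 20000]  # placeholder standard weighted towards younger ages
older_standard = [30000, 40000, 30000]    # placeholder standard weighted towards older ages

print(asdr(rates, younger_standard))  # lower ASDR under the younger-weighted standard
print(asdr(rates, older_standard))    # higher ASDR under the older-weighted standard
```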

It is difficult to be sure what this does to the rest of the calculations in the paper, but it does not resolve any of the other major problems – for example, that it cannot demonstrate causation, or that it uses an essentially arbitrary selection of data points upon which to base its extrapolations.

Secondly, and for fun, it is worth looking at an earlier major change in social policy that was associated with a change in the trajectory of life expectancy and mortality.

The NHS commenced on 5th July 1948, with (according to Rudolf Klein’s The Politics of the NHS) a clear expectation that universal healthcare would improve health and render itself progressively cheaper. We are still waiting for this to happen, and the same naïveté survives in the Five Year Forward View’s perspective on prevention. Here is the graph of life expectancy, based upon data in the Human Mortality Database, for the UK from 1941 to 1951:

[Figure: UK life expectancy, 1941–1951]

It beggars belief that the arrival of the NHS had a negative effect on health, wellbeing and life expectancy. I don’t believe it and neither, I would guess, do you. On the other hand, this illustrates a key message about the relationship between the provision of care and mortality – it has far less effect than most people, including Watkins et al, think.

In their paper, the factors incorporated as potential mediators of an effect are numbers of doctors, nurses, social care staff and so on (see their supplementary file 3, tables S8-S10), so they are clearly suggesting that such provision is causal. Their projection to 2020 implies a shortfall of about 26% against the otherwise projected increase in life expectancy – that is, they suggest it will rise by 1.86 years instead of 2.53 years between 2010 and 2020.
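For transparency, the 26% figure follows directly from those two numbers; here is a trivial check (the 1.86 and 2.53 values are as quoted from the paper):

```python
# Shortfall in the projected 2010-2020 gain in life expectancy implied by the paper.
projected_gain = 2.53  # years of gain projected on the pre-2010 trend
implied_gain = 1.86    # years of gain under the authors' funding scenario

shortfall = (projected_gain - implied_gain) / projected_gain
print(f"{shortfall:.0%}")  # ~26%
```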

Subjective estimates of the contribution of health care to long-term falls in mortality are probably swayed by the triumphs – leukaemia in childhood, for example – and by regular press coverage of ‘breakthroughs’. An American study in 2014 demonstrated a public perception that 80% of increased life expectancy was attributable to medical care.

Objective estimates put that figure at less than 20%. Life expectancy rose in the UK by about 30 years during the 20th century, with about six of those years attributable to the cumulative and acute effects of health care. It is possible, of course, that health care has become more effective in recent years, though, again, that is a more marginal effect than most imagine.

Which leads us back to the question of “120,000 deaths” and its plausibility. If the shortfall in funding were to cut 26% from the rate of increase, as proposed, then that would be a larger effect on the trajectory of life expectancy than the entire impact of health care seen over the last century. It is implausible that so relatively marginal a change in funding (and I use those words with caution, as it is not a small amount of money) could produce so major an impact upon mortality.
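To put the two proportions side by side, here is a rough sketch using the approximate figures quoted above (the 30-year and 6-year values are the round numbers given earlier, and 26% is the shortfall implied by the paper):

```python
# Rough comparison of the two proportions discussed above.
century_gain = 30.0      # approximate rise in UK life expectancy over the 20th century (years)
healthcare_share = 6.0   # approximate portion of that rise attributable to health care (years)
share_of_century = healthcare_share / century_gain
print(share_of_century)  # 0.2: health care's share of a century of gains

funding_shortfall = 0.26  # share of the projected 2010-2020 gain attributed to funding cuts
print(funding_shortfall > share_of_century)  # True: the claimed effect exceeds that share
```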

Big claims need powerful evidence.

As a final note: while writing this I have also been following the Twitter feed of the Lancet Public Health Science Conference (#phsci), and it seems apposite to flag the conclusions on complexity drawn by my colleague Harry Rutter at the same conference last year:

  • don’t treat complex systems as simple: of course many things appear not to ‘work’; we’re judging them on the wrong criteria;
  • linear cause and effect is (relatively) easy to look for, but that doesn’t mean it’s helpful to do so;
  • a whole-systems approach requires a shift from reductionist, linear, single-perspective thinking to transdisciplinary, non-linear systems thinking;
  • don’t forget the lessons of Rose: population-level prevention involves shifting the population curve in ways that may be indiscernible at individual level;
  • conceptualise actions as driving perturbations within systems, not as hammers applied to nails;
  • move from ‘does it work?’ to ‘does it contribute?’.

With some small tweaks, these lessons could usefully have been applied in this case.

Why the “120,000 deaths” claim is unsupportable

The press has been full of a new paper in BMJ Open which claims that austerity has been killing people in their thousands. The Independent called it a ‘landmark study’ and UCL trumpeted its findings.

Unfortunately, this paper – in common with the earlier paper by Hiam et al. – is fatally flawed (see here for a description of problems with that paper).

Its dishonesty starts in its title – “Effects [my emphasis] of health and social care spending on mortality in England: a time trend analysis”. This explicitly lays claim to causation, which an observational study of this kind cannot do. The title alone should not have been allowed through peer review.

But it gets much worse. Here is fig 1 of the Watkins paper:

[Figure 1 from Watkins et al]

The legend reads thus:

Figure 1 Time trend projections of age-standardised death rate (ASDR) per 100 000 individuals. ASDR (left hand y-axis) and the difference in the number of deaths between actual and predicted mortality (right hand y-axis) per year from 2001 to 2014 are shown. The black and blue lines represent actual ASDR for the 2001–2010 and 2011–2014 periods, respectively. The red line represents predicted ASDR using 2001–2010 as an observation base while the 95% CIs are denoted by the beige-coloured area. The grey bars denote the differences between the number of deaths observed and the number predicted for 2011–2014 where positive values correspond to excess deaths and negative values represent lower than expected deaths. Error bars represent 95% CIs. *p<0.05; **p<0.01; ***p<0.001.
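For readers unfamiliar with this kind of analysis, the sketch below shows roughly how such a projection is constructed: fit a trend to the 2001–2010 rates, extrapolate it forward, and convert the gap between observed and predicted rates into ‘excess deaths’. The numbers are synthetic placeholders and the model is a plain linear fit, which may well differ from the authors’ actual method; it is here only to make the mechanics, and hence the criticisms that follow, easier to follow.

```python
# Sketch of the projection mechanics (synthetic data, simple linear trend; the
# paper's own model and figures may differ).
import numpy as np

base_years = np.arange(2001, 2011)
base_asdr = 660.0 - 12.0 * (base_years - 2001)       # synthetic declining ASDR, per 100,000
slope, intercept = np.polyfit(base_years, base_asdr, 1)

later_years = np.arange(2011, 2015)
predicted = slope * later_years + intercept              # extrapolated ASDR for 2011-2014
observed = predicted + np.array([2.0, 5.0, 8.0, 12.0])   # synthetic "observed" rates

population = 54e6                                        # rough England population (assumption)
excess_deaths = (observed - predicted) * population / 100_000
for year, excess in zip(later_years, excess_deaths):
    print(year, round(excess))
```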

Notice firstly that the illustrated data are not 2001-2014 but actually 2002-2014. This may have a small effect on the gradient of the projected curve, but I point it out mostly because it is sloppy, which I find worrying at the core of an argument.

Note secondly, and more importantly, that the mortality rate illustrated falls from 650 per 100,000 to about 500 per 100,000* over this period. Below is a graph of directly standardised all-cause, all-age mortality per 100,000 population for England for 1995 to 2014. This is taken from data you can check in this download of official statistics:

[Figure: directly standardised all-cause, all-age mortality per 100,000 population, England, 1995–2014]

The rates for 2002 and 2014 here are, respectively, 1,225 and 947 per 100,000 population. I have no idea how these figures can be reconciled with those quoted in the paper other than to guess that the authors have got their calculations wrong. Which, in turn, throws into doubt all of the other figures in the paper.

Moreover, as can be seen in the longer timescale of my graph, the downward trend of mortality over time does not necessarily suggest a unique slowing of decline since 2010. The overall picture is plausibly one of noise around a continuing fall. One might have made as good a case for a pause in mortality in 2001-2003.

2010 itself was below the trend line, as was 2011, while 2013 and 2014 lay above it. In combination, these would largely cancel one another out. The pattern described by Watkins et al arises only by selective use of segments of the overall mortality curve.
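To make the point about residuals concrete, here is a minimal sketch. The series is synthetic (a steady decline plus random noise) rather than the ONS data, so it only illustrates the mechanics of judging individual years against a single long-run trend line.

```python
# Fit one long-run trend to annual rates and inspect the signed residuals.
# Synthetic data: a steady decline plus noise, standing in for the ONS series.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1995, 2015)
underlying = 1400.0 - 23.0 * (years - 1995)          # illustrative steady fall
rates = underlying + rng.normal(0, 15, len(years))   # noise around that fall

slope, intercept = np.polyfit(years, rates, 1)
residuals = rates - (slope * years + intercept)

for year, resid in zip(years, residuals):
    print(year, round(float(resid), 1), "above trend" if resid > 0 else "below trend")
# Runs of years below the trend followed by years above it can make a short
# segment look flat even though the long-run decline is unchanged; summed over
# the whole period the signed residuals largely cancel.
```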

Extension to 2015 data would illustrate a significant kick upwards – but as discussed elsewhere, that can be explained as a winter effect (itself similar to winter excesses seen at other times in the last couple of decades).

None of this means that austerity is not a huge problem for health and social care. It is. We can see that in child poverty statistics, the use of food banks, stress and mental health problems relating to the implementation of Universal Credit, and so on. But the claim that the mortality data show an excess of deaths relating to funding cuts is unsustainable.

 

*Note: this figure was corrected from 530 on 21st November 2017.