The first UK city to lower child obesity?

One of the great joys of the internet age (and yes, this says much about the geek in me) is that it is now possible to pick up a news story, read the scientific paper upon which it is based and check out the original data before getting out of bed. Which I did this morning on reading about Leeds, the “first UK city to lower its childhood obesity rate” according to the Guardian and many other news outlets. It is a great story. Something we would love to have been able to claim for Newcastle, but we can’t. Sadly, neither can Leeds.

The study, published in Pediatric Obesity, is based upon data readily available on Public Health England’s Fingertips website – in the section on NCMP and Child Obesity Profile, the figures are those for “Reception: Prevalence of obesity (including severe obesity), 5-years data combined”. The study was based upon 4 data points, equivalent to 4 consecutive 5-year rolling averages. For Leeds these showed a fall from 9.42% to 8.83% over the reported period from 2009/10-2013/14 to 2012/13-2016/17.
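For anyone wanting to check the arithmetic, the “5 years combined” figure is in effect a rolling summary of five consecutive school years (Fingertips pools the data across the five years; a simple mean of the annual rates is only an approximation of this). Here is a minimal sketch with invented annual figures, purely to show how slowly such a measure responds:

```python
# Sketch of a 5-year rolling summary of annual obesity prevalence.
# The annual values are invented, NOT the actual NCMP data, and a simple
# mean only approximates Fingertips' pooling across the five years.
annual = {
    "2009/10": 9.6, "2010/11": 9.4, "2011/12": 9.3, "2012/13": 9.5,
    "2013/14": 9.3, "2014/15": 9.0, "2015/16": 8.8, "2016/17": 8.7,
    "2017/18": 9.5,  # a higher final year only partly feeds into the last rolling figure
}

years = list(annual)
for i in range(len(years) - 4):
    window = years[i:i + 5]
    rolling = sum(annual[y] for y in window) / 5
    print(f"{window[0]} to {window[-1]}: {rolling:.2f}%")
```

The point to hold on to is that consecutive rolling periods share four of their five years of data, so four such points contain far less independent information than four separate annual figures would.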

This prompted much admiration and praise on Twitter, including a Tweet by the Secretary of State, Matt Hancock, saying “Childhood obesity rates have fallen in Leeds after bringing in parenting classes, study shows – highlighting pivotal role parents play in tackling childhood obesity. Terrific step forward we must build on if we’re to protect health of future generations”.

I twitched a little at this narrative of parental choice and responsibility, particularly coming the day after publicity for a much larger study showing a powerful relationship between breastfeeding and obesity.

But what was missing from the news coverage and responses was the fact that there is now a fifth data point in the Fingertips data – for the period 2013/14-2017/18 – and this shows the prevalence of obesity in Leeds to have risen again to 8.98%. This is because the single-year data for the city in 2017/18 showed a rise to a prevalence figure of 9.5% in the reception age group.

OK, you might say, it is still a fall, and possibly a bit more than the national change over the same period. But let’s look at Newcastle’s figures for the same periods. These show that for the time covered by the published study, obesity in Newcastle, by the same measure, fell from 12.10% to 11.25%. And its figure for 2017/18 was 11.16%.

So, between 2009/10 and 2017/18, obesity by this yardstick fell in Leeds by 4.68% in relative terms, whereas in Newcastle it fell by 7.81%.
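To be clear, those are relative falls rather than percentage-point falls; a quick check against the start and end figures quoted above (the small discrepancies from 4.68% and 7.81% presumably reflect rounding in the published prevalence values):

```python
# Relative fall = (start - end) / start, using the start and end
# prevalence figures quoted in the text above.
for city, start, end in [("Leeds", 9.42, 8.98), ("Newcastle", 12.10, 11.16)]:
    fall_points = start - end
    fall_relative = 100 * fall_points / start
    print(f"{city}: {fall_points:.2f} percentage points, {fall_relative:.1f}% relative")
```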

Does this make Newcastle the first UK city to lower its obesity rate? Or does it mean that the Leeds data have been misleadingly over-interpreted?

This matters, because, however worthy the parenting programme (HENRY) in Leeds may be, this study is not evidence that it works. But it has fuelled the individual, behavioural-choice paradigm of childhood obesity, against which there is a mass of evidence, and that strikes me as irresponsible epidemiology.

 

MUP

In the recent budget, the Chancellor failed to act on the price of alcohol other than cheap ciders, despite a widespread public health consensus that minimum unit pricing would be an effective step. A year ago today, one of our students at Newcastle University died of alcohol poisoning.  A day or so later I received the following note from our press office:


A Daily Mail reporter is writing a feature about bars and pubs who are still selling cheap alcohol offers to young people like students, in light of the 20-year-old Newcastle University student, Ed Farmer, who recently died of excessive alcohol consumption. She knows that most Student Unions have now banned society drinking games and initiations within campus bars. The reporter noticed that pubs like the Soho Rooms – which is part of the Council’s “Raising the Bar” scheme promoting responsible drinking (it holds the silver position) – are currently selling 9 shots of vodka for £5. The reporter wants to know what checks are in place by the local authority for such pubs, who are openly advertising (online), basically saying to people “why not get smashed with us!”

  • What line do pubs have to cross for the local authority to say “we won’t renew your licence”?
  • She wants to know what role the council has / our duty to set standards for bars and pubs to adhere to?
  • What preventative measures are put in place by the Council?

For interest, I enclose below the full text of my response:

“We agree completely that vodka, or any other alcoholic drink, being sold at £5 for 9 shots is dangerous and inappropriate. A shot of vodka is equivalent to about 1 unit of alcohol, so £5 for 9 shots works out at about 56p per unit. This is comfortably higher than the proposed Scottish minimum unit price for alcohol of 50p per unit, which has been fought through the courts by the alcohol industry, and reinforces the view that even a rate of 50p per unit would be too low.

“At present the Government has no plans to introduce minimum unit pricing in England despite overwhelming evidence to support its effectiveness as a means of reducing alcohol-related harm – recently supported in a review of evidence by Public Health England.

“The ‘Raising the Bar’ policy is based upon working with businesses in the city to try to promote healthy behaviours and health protection to the greatest extent that we can within the existing law. We consider that it would be irresponsible not to attempt to act in this way, given the restrictions of existing legislation. At present the council is not permitted by law to remove a licence on the basis of the price of alcohol. We are, however, working to strengthen our collaborative approach to health protection, which extends beyond alcohol to cover action on other areas of personal risk, and sexual health in particular – review of each business’s status will consider its adherence to good practice.

“Newcastle City Council welcomes the Daily Mail’s support for minimum unit pricing and the power for local authorities to enforce this as part of its approach to protecting the health of the public.”
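For anyone unfamiliar with how minimum unit pricing works, the arithmetic behind that response is easy to check; a small sketch (the £5 offer of 9 shots, at roughly 1 unit per shot, is as described above, and 50p is the proposed Scottish rate):

```python
# Price per unit of the advertised offer, and the floor price the same
# drinks would have under a 50p minimum unit price (MUP).
offer_price = 5.00   # £5 for the offer
units = 9            # 9 shots at roughly 1 unit of alcohol each

price_per_unit = offer_price / units
print(f"Offer price per unit: £{price_per_unit:.2f}")       # about £0.56

mup = 0.50           # proposed Scottish minimum price per unit, in £
floor_price = mup * units
print(f"Floor price under a 50p MUP: £{floor_price:.2f}")   # £4.50
```

Because the offer already works out at about 56p per unit, a 50p floor would leave it untouched, which is exactly the point about that rate being set too low.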

Here is the article as it finally appeared, which (surprise!) doesn’t follow through on the logic of the Mail’s questions.

I offer my deepest sympathy to the Farmer family – I, too, have had a young, close relative die of alcohol poisoning. We will not forget, and we will not stop arguing for public policy that prioritises wellbeing and health over profit.

120,000 deaths revisited…

Although I had intended to move on from this topic, a couple of things seem worth updating.

The first of these is a likely explanation for the discrepancy in the scale of age-standardised values in the Watkins paper, noted in my previous blog. I am grateful to colleagues at Public Health England for suggesting the possibility that the authors have mistakenly used the 1976 European Standard Population (ESP) rather than the current, 2013 revision. This looks plausible, although the difference is surprisingly large. Here is a useful Scottish paper on the change between those versions of the ESP. The newer version reflects today’s more aged populations, shifting the shape of the standard population to match the changes of recent decades.
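To see why the choice of standard population matters, it helps to recall how direct standardisation works: the observed age-specific rates are weighted by a fixed standard age structure, so a standard with more weight at older ages produces a higher headline figure from exactly the same data. A minimal sketch with invented rates and deliberately coarse, made-up weights (the real ESPs use 5-year age bands with published weights; only the direction of the 1976-to-2013 shift is represented here):

```python
# Direct age standardisation: ASR = sum(rate_i * weight_i) / sum(weight_i).
# Rates and weights below are invented for illustration only - the real
# European Standard Populations use 5-year age bands; the key feature is
# that the 2013 revision gives more weight to older ages.
rates = {"0-44": 80, "45-64": 600, "65-84": 4000, "85+": 15000}  # deaths per 100,000, illustrative

weights_old = {"0-44": 63000, "45-64": 24000, "65-84": 12000, "85+": 1000}   # "1976-like" standard
weights_new = {"0-44": 55000, "45-64": 26000, "65-84": 16500, "85+": 2500}   # "2013-like" standard

def asr(rates, weights):
    return sum(rates[a] * weights[a] for a in rates) / sum(weights.values())

print(f"ASR with the older, younger-shaped standard: {asr(rates, weights_old):.0f} per 100,000")
print(f"ASR with the newer, older-shaped standard:   {asr(rates, weights_new):.0f} per 100,000")
```

With identical underlying rates, the “younger” standard gives a markedly lower figure simply because it down-weights the ages at which most deaths occur, which is why using the wrong ESP is a plausible explanation for a surprisingly large discrepancy.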

[Figure: the 1976 and 2013 European Standard Populations compared]

It is difficult to be sure what this does to the rest of the calculations in the paper, but it does not resolve any of the other major problems – for example, that it cannot demonstrate causation, or that it uses an essentially arbitrary selection of data points upon which to base its extrapolations.

Secondly, and for fun, it is worth looking at another major change in social policy associated with a change in the trajectory of life expectancy / mortality from an earlier period.

The NHS commenced on 5th July 1948, with (according to Rudolf Klein’s The Politics of the NHS) a clear expectation that universal healthcare would improve health and render itself progressively cheaper. We are still waiting for this to happen, and its naïveté survives in the Five Year Forward View’s perspective on prevention. Here is the graph of life expectancy, based upon data in the Human Mortality Database, for the UK from 1941 to 1951:

[Figure: UK life expectancy, 1941–1951, from the Human Mortality Database]

It beggars belief that the arrival of the NHS had a negative effect on health, wellbeing and life expectancy. I don’t believe it and neither, I would guess, do you. On the other hand, this illustrates a key message about the relationship between the provision of care and mortality – it has far less effect than most people, including Watkins et al, think.

In their paper, the factors incorporated as potential mediators of an effect are numbers of doctors, nurses, social care staff and so on (see their supplementary file 3, tables S8-S10), so they are clearly suggesting such provision is causal. Their projection to 2020 implies a shortfall against the otherwise projected increase in life expectancy of about 26% – that is, they suggest it will rise by 1.86 years instead of 2.53 years between 2010 and 2020.

Subjective estimates of the contribution of health care to long term falls in mortality are probably swayed by the triumphs – leukaemia in childhood, for example – and regular press coverage of ‘breakthroughs’. An American study in 2014 demonstrated a public perception that 80% of increased life expectancy was attributable to medical care.

Objective estimates put that figure at less than 20%. Life expectancy rose in the UK by about 30 years during the 20th century, with about 6 of those years attributable to the cumulative and acute effects of health care. It is possible, of course, that health care has become more effective in recent years, though, again, that is a more marginal effect than most imagine.

Which leads us back to the question of “120,000 deaths” and its plausibility. If the shortfall in funding were to cut 26% from the rate of increase, as proposed, then that would be a larger effect on the trajectory of life expectancy than the entirety of health care impact seen over the last century. It is implausible that so relatively marginal (and I use those words with caution as it is not a small amount of money) an effect on funding could produce so major an impact upon mortality.
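To make that comparison concrete, here is the back-of-envelope arithmetic, using only figures already quoted above:

```python
# Figures quoted above: Watkins et al's projection implies life expectancy
# rising 1.86 years rather than 2.53 years between 2010 and 2020, and roughly
# 6 of the ~30 years gained over the 20th century are attributed to health care.
projected_rise, implied_rise = 2.53, 1.86
shortfall = projected_rise - implied_rise
print(f"Implied shortfall: {shortfall:.2f} years "
      f"({100 * shortfall / projected_rise:.0f}% of the projected rise)")

care_contribution_per_decade = 6 / 10   # ~6 years spread over 10 decades
print(f"Average health-care contribution per decade last century: "
      f"~{care_contribution_per_decade:.1f} years")
```

On those numbers, the funding shortfall alone is credited with a larger per-decade effect on life expectancy than the average contribution of the whole of health care across the last century, which is the implausibility at issue.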

Big claims need powerful evidence.

As a final note: while writing this I have also been following the Twitter feed of the Lancet Public Health Science Conference (#phsci), and it seems apposite to flag the conclusions on complexity drawn by my colleague Harry Rutter at the same conference last year, thus:

  • Don’t treat complex systems as simple: of course many things appear not to ‘work’; we’re judging them on the wrong criteria;
  • Linear cause and effect is (relatively) easy to look for, but that doesn’t mean it’s helpful to do so;
  • A whole-systems approach requires a shift from reductionist, linear, single-perspective thinking to transdisciplinary, non-linear systems thinking;
  • Don’t forget the lessons of Rose: population-level prevention involves shifting the population curve in ways that may be indiscernible at individual level;
  • Conceptualise actions as driving perturbations within systems, not as hammers applied to nails;
  • Move from ‘does it work?’ to ‘does it contribute?’

With some small tweaks, these lessons could usefully have been applied in this case.

Why the “120,000 deaths” claim is unsupportable

The press has been full of a new paper in BMJ Open which claims that austerity has been killing people in their thousands. The Independent called it a ‘landmark study’ and UCL trumpeted its findings.

Unfortunately, this paper – in common with the earlier paper by Hiam et al. – is fatally flawed (see here for a description of problems with that paper).

Its dishonesty starts in its title – “Effects [my emphasis] of health and social care spending on mortality in England: a time trend analysis”. This explicitly lays claim to causation, which an observational study of this kind cannot do. The title alone should not have been allowed through peer review.

But it gets much worse. Here is fig 1 of the Watkins paper:

[Figure 1 from Watkins et al]

The legend reads thus:

Figure 1 Time trend projections of age-standardised death rate (ASDR) per 100 000 individuals. ASDR (left hand y-axis) and the difference in the number of deaths between actual and predicted mortality (right hand y-axis) per year from 2001 to 2014 are shown. The black and blue lines represent actual ASDR for the 2001–2010 and 2011–2014 periods, respectively. The red line represents predicted ASDR using 2001–2010 as an observation base while the 95% CIs are denoted by the beige-coloured area. The grey bars denote the differences between the number of deaths observed and the number predicted for 2011–2014 where positive values correspond to excess deaths and negative values represent lower than expected deaths. Error bars represent 95% CIs. *p<0.05; **p<0.01; ***p<0.001.

Notice firstly that the illustrated data are not 2001-2014 but actually 2002-2014. This may have a small effect in changing the gradient of the projected curve, but I point it out mostly because it is sloppy, which I find worrying in the core of an argument.

Note secondly, and more importantly, that the mortality rate illustrated falls from 650 per 100,000 to about 500 per 100,000* over this period. Below is a graph of directly standardised all-cause, all-age mortality per 100,000 population for England for 1995 to 2014. This is taken from data you can check in this download of official statistics:

[Figure: directly age-standardised all-cause mortality per 100,000 population, England, 1995–2014]

The rates for 2002 and 2014 here are, respectively, 1,225 and 947 per 100,000 population. I have no idea how these figures can be reconciled with those quoted in the paper other than to guess that the authors have got their calculations wrong. Which, in turn, throws into doubt all of the other figures in the paper.

Moreover, as can be seen in the longer timescale of my graph, the downward trend of mortality over time does not necessarily suggest a unique slowing of decline since 2010. The overall picture is plausibly one of noise around a continuing fall. One might have made as good a case for a pause in mortality in 2001-2003.

2010 itself was below the trend line, as was 2011, while 2013 and 2014 lay above it. In combination, these would largely cancel one another out. The pattern described by Watkins et al arises only by selective use of segments of the overall mortality curve.
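One way to see the force of that point is to fit the trend yourself and look at the residuals, both over the whole series and over the 2001–2010 segment that Watkins et al project forward. A minimal sketch, with placeholder rates standing in for the official series available from the download linked above (only the 2002 and 2014 values are taken from the figures quoted earlier; the rest are invented):

```python
# Sketch: fit a linear trend to the age-standardised mortality series over
# (a) the whole period and (b) only 2001-2010, the baseline Watkins et al
# project forward, and compare residuals. Rates are placeholders - substitute
# the official ASMR series (only 2002 and 2014 match the figures quoted above).

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

years = list(range(1995, 2015))
asmr = [1300, 1280, 1260, 1255, 1240, 1210, 1225, 1225, 1215, 1160, 1140,
        1105, 1090, 1075, 1030, 1010, 985, 1000, 990, 947]  # illustrative values

for label, (lo, hi) in [("whole-series trend", (1995, 2014)), ("2001-2010 baseline", (2001, 2010))]:
    xs = [y for y in years if lo <= y <= hi]
    ys = [r for y, r in zip(years, asmr) if lo <= y <= hi]
    slope, intercept = linear_fit(xs, ys)
    print(label)
    for y, r in zip(years, asmr):
        print(f"  {y}: observed {r}, residual {r - (intercept + slope * y):+.0f}")
```

If the pre-2010 residuals are of the same order as the post-2010 ones, the ‘excess deaths’ are doing little more than relabelling ordinary year-to-year noise, and the answer depends heavily on which segment is chosen as the baseline.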

Extension to 2015 data would illustrate a significant kick upwards – but as discussed elsewhere, that can be explained as a winter effect (itself similar to winter excesses seen at other times in the last couple of decades).

None of this means that austerity is not a huge problem for health and social care. It is. We can see that in child poverty statistics, use of food banks, stress and mental health problems relating to implementation of Universal Credit, and so on. But the claim that mortality data show an excess of deaths relating to funding cuts is unsustainable.

 

*Note: this figure was corrected from 530 on 21st November 2017.

The flawed concept of ‘heart age’

Age is a curious, ill-defined thing even among those who study the subject for a living. In its most pragmatic form, it is ‘chronological’ – a useful tautology, as if age could be marked truly by anything other than the passage of time.

My grandfather, frustratingly to a 4-year-old, would reply when asked his age, “as old as my eyes and a little older than my teeth”; though, typically of his generation, in later years he became quite a lot older than his teeth.

The real problem arises when people start talking about ‘biological’ age, as if that were something real and definable, which they then try to define, through telomere length, or frailty indices, collagen cross-linkage and any number of other measures.

Particularly problematic in defining ‘biological age’ is the circular confusion of age-related and age-specific – or if you prefer, age-consequent – disease; that is, the confusion between disease that is more likely to be present because of the cumulative experience of insults, and disease which arises because of some theoretical, intrinsic process of ageing. Do you become ill because you are old, or old because you are ill? The truth clearly lies somewhere between the two, but the manifest variability of the latter element tempts us into an assumption that it operates exclusively. Particularly in America.

Public Health England are re-promoting their ‘Heart Age’ test, which purports to tell you how old your heart is, relative to your chronological age. It has good intentions. It remains to be seen whether those good intentions have any effect. But I find it profoundly flawed theoretically.

To make real sense, it would be necessary that a heart has a basic trajectory of wearing out. I am not sure how one would measure this, but let’s suppose that it might be by an increasing likelihood of arrhythmia, or of muscle diminution and weakness, or of diminishing patency of the blood vessels. The trouble is that there is no great evidence for this. Instead, the increasing probability of a dysfunctional heart in our society is largely a consequence of disease – which itself may be entirely avoidable.

This is not to say that there are not also underlying age-specific processes. Increasing stiffness of blood vessels, for example, could be determined by physico-chemical changes in non-renewable parts of their structure, similar to the racemisation and cross-linkage of proteins in the lens of the eye, which varies to a degree with UV exposure and so on, but explains why we all need reading glasses by about the same age. Importantly, the latter is not a disease. You can’t avoid it. It is a consequence of living for a sufficient period of time with a body of this particular design.

Hearts are different. Most of the heart ‘age’ risk in PHE’s calculations is Coronary Heart Disease (CHD) – which is, to coin a phrase, what it says on the tin: a disease. It isn’t ageing. It may be ubiquitous but it is not universal. There is no ‘natural’ level of CHD for an 80 year old, merely that which is typical of an 80 year old in our society at this particular time. And therein lies the problem with PHE’s concept of ‘Heart Age’. To say I have a Heart Age of 57 implies only that I am typical of men aged 57 with my behaviour and physical characteristics right now – or, more accurately, a little while ago – in the mortality figures. My heart by this measure is only averagely crap.
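It is worth spelling out how such a figure is constructed, because it shows that ‘Heart Age’ is a re-expression of predicted relative risk, not a measurement of the heart itself. As I understand it, the calculator in essence finds the age at which a reference person with ‘healthy’ values of the modifiable risk factors would have the same predicted risk as you. A toy sketch with an entirely invented risk function, purely to show that inversion step (the real tool sits on top of QRISK-style equations, which are nothing like this):

```python
# Toy illustration of the 'heart age' construction: find the age at which a
# hypothetical reference person's predicted risk equals your own predicted risk.
# The risk function is INVENTED for illustration; the real calculator uses
# QRISK-style equations with many more variables.

def predicted_risk(age, smoker=False, sbp=120):
    """Made-up 10-year CVD risk (%) rising with age, smoking and blood pressure."""
    risk = 0.3 * (age - 30) + (3 if smoker else 0) + 0.04 * (sbp - 120)
    return max(risk, 0.0)

def heart_age(age, **risk_factors):
    """Age at which the reference person (non-smoker, SBP 120) matches your risk."""
    my_risk = predicted_risk(age, **risk_factors)
    candidate = 30
    while predicted_risk(candidate) < my_risk and candidate < 120:
        candidate += 1
    return candidate

print(heart_age(55, smoker=True, sbp=150))   # 69 on this toy function: an 'older' heart
print(heart_age(55))                         # 55: matches the reference person
```

Nothing in that construction measures the state of anyone’s heart: it simply maps your predicted risk onto an age scale derived from the current population, which is precisely the objection set out above.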

For fun (admittedly not very much) I changed some things around to try to make my Heart Age younger. This turns out to be pretty difficult to achieve, and even varying it downward by 3 years only increased my predicted event-free survival by a year.

On the other hand, cardiovascular risk is falling at a spectacular rate. If it carried on falling over the next two decades as it has over the past two, the actual level of risk would be a fraction of that predicted here. As far as I can see, this isn’t taken into account in the calculator, which is based upon QRISK, which, in turn, is adjusted at intervals for the changing population profile of risk. Indeed, QRISK became necessary because the Framingham equations proved not only to be non-exportable to other populations and cultures but also to become less accurate with time.

All of this renders the whole concept a bit of a gimmick. A different way to tell people to quit smoking, lose some weight and take more exercise. These are excellent messages that we need to continue communicating and facilitating.

But if I told someone of 55 they had a Heart Age of 65, I would feel that I was making up a story to manipulate them, and I don’t think we should do that. That’s why we have science.

 

 

Misinterpreting mortality

On the same day that the story about a levelling of life expectancy (LE) and possible links to austerity hit the press, the Journal of the Royal Society of Medicine published my letter responding to the paper that sparked a flurry of interest in the same topic earlier this year. That paper was entitled “Why has mortality in England and Wales been increasing? An iterative demographic analysis“.

Irritatingly, I have no budget to pay for open-access publication, so although the original paper is freely available, my letter outlining the paper’s errors is not. In addition, as the JRSM does not publish letters as “advance access”, the points I wished to make have been held back for 3 months. This feels pretty unsatisfactory.

One side of the argument is out there and its refutation is not, which is a shame as the paper has some profound flaws – none of which have stopped the authors from quoting the paper as if it were reliable:

[Screenshot: one of the authors citing the paper]

In addition, the authors’ response to my letter suggests that they consider the critique to make little difference to their conclusions. I am not persuaded.

So here is a summary of what is wrong with its analysis.

  1. The paper claims to be an analysis of age-standardised mortality and makes some play of the need for standardisation. But the data it quotes and illustrates are not standardised. Instead, they are crude mortality data both in the text and in figure 1 of the paper. The proportional changes described at the top of the right hand column of text on the 2nd page are of the crude mortality rates. It is not true, as the paper states, that age-standardised rates are higher than at any point since 2008, as they were higher in 2010 – an error also stated in the abstract. Subsequent analysis is manifestly predicated upon this error.
  2. The methods section does not accurately describe the source or handling of the data. The citations imply the use of standard release data; however, after I ran into a brick wall trying to replicate the analysis from these, it emerged that the authors had used a specific run of data obtained from ONS. How they handled data above the age of 90 is unclear.
  3. The methods section also omits to mention that the comparison of years used mid-year to mid-year data rather than calendar years. This renders the findings of the paper non-replicable if the described method is followed. The labelling of those data in figure 2 is also incorrect as a consequence – they should indicate 2013-14 and 2014-15. Moreover, this very markedly amplifies the winter excess effect of 2014-15. Excess winter deaths data for 2015-16 are now available and show a fall of the excess rate back to the average of recent years. [Figure: excess winter deaths by year] It is worth noting here also that the excess of winter deaths in 2014-15 is not unprecedented, but occurred at similar levels in the 1990s.
  4. Taken together, the errors of non-standardised rates and the exaggeration of the winter 2014-15 effect in the overall figures render dubious the claim of a clear change in the trajectory of mortality. When the genuinely age-standardised fall to 2014 is considered it looks like this: [Figure: age-standardised mortality trend to 2014]
    It is only when the 2015 data (which include most of the bad winter effect) are added into the trend that it looks to have stalled: [Figure: the same trend with 2015 added]
    Visually, the up-tick of the graph changes the way we perceive the data quite markedly.
  5. In the last paragraph of page 2 of the paper it is stated that the supposed trend in deaths “cannot be explained by population ageing”. This is a conclusion that could only genuinely be drawn from analysis of the age-standardised figures. But having drawn that conclusion from what are actually crude data, the authors then present the crude excess of deaths by month in figure 3 against a baseline of average deaths in 2006-2014. No explanation is offered for the curious choice of a 9-year baseline. The annual excess of deaths over the crude average for that period of 9 years is 30,515. However, these should be compared with standardised figures. In fact, the excess age-standardised rate in 2015 when compared with 2006-2014 is -32 per 100,000, which would be equivalent to -18,331 deaths if applied to the mid-year population estimate for that year (the arithmetic is sketched below this list). Either way, this is an essentially arbitrary analysis, since there is no logic to the use of a 9-year baseline as the comparator. I note also that the paper states in relation to the excess figure of 30,515 that “calculations of excess deaths in this paper vary slightly due to differences in standardisation in various comparisons”. I think the juxtaposition of that number and statement is misleading, since it implies the figure to be standardised when it is not.
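On the arithmetic in point 5: converting an excess age-standardised rate into an implied number of deaths is simply the rate difference multiplied by the population and divided by 100,000. A small sketch using the figures quoted above (the population value is a rough mid-2015 England and Wales figure used for illustration, so the result only approximates the -18,331 quoted, which will reflect the exact ONS estimate and an unrounded rate):

```python
# Implied deaths ~= rate difference per 100,000 x population / 100,000.
rate_difference = -32          # per 100,000: 2015 versus the 2006-2014 average, as quoted above
population = 57_900_000        # rough mid-2015 England and Wales population, for illustration

implied_deaths = rate_difference * population / 100_000
print(f"Implied excess deaths: {implied_deaths:,.0f}")   # roughly -18,500
```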

Altogether, the errors in the paper are extensive and fundamental to its entire thesis. They seem to me to kill the assertion that there is yet evidence of a real change in mortality trend. What remains is simply the very substantial excess winter deaths figure for 2014-15. The key point is surely to ask whether there is an underlying trend once the noise of winter variation is removed. I do not believe that has been demonstrated in this paper.

Life expectancy…

Has it stopped rising? From what I can gather, even Michael Marmot only really said a) that its rate of increase had slowed and b) that it was worth investigating whether austerity might be involved. None of which stopped the story being way more extreme on both counts, fuelled in some cases by people who really ought to know better.

So what to think? The first thing to remember is that Life Expectancy (LE) is a fairly artificial construct. It doesn’t actually describe the likely lifespan of anyone, given that it is constructed from contemporary risks – as if a baby born today were to experience today’s age-specific risks throughout life. As a result, LE as a measure can change rapidly because of contemporary pressures.
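That construction is worth seeing in miniature: period life expectancy runs a hypothetical cohort through this year’s age-specific death rates, as if today’s risks applied at every age of its life. A minimal sketch with invented rates and very coarse age bands (real life tables use single years of age and more careful assumptions):

```python
# Period life expectancy at birth from current age-specific mortality rates:
# run a synthetic cohort through today's risks. Rates and age bands are
# invented and far coarser than a real life table, purely to show the idea.
age_bands = [(0, 1), (1, 15), (15, 45), (45, 65), (65, 85), (85, 110)]
annual_mortality = [0.004, 0.0001, 0.001, 0.005, 0.04, 0.2]  # illustrative yearly probabilities of dying

alive = 1.0        # proportion of the synthetic cohort still alive
life_years = 0.0   # person-years lived per member of the cohort
for (start, end), q in zip(age_bands, annual_mortality):
    for _ in range(end - start):
        deaths = alive * q
        life_years += alive - deaths / 2   # assume deaths occur mid-year on average
        alive -= deaths

print(f"Period life expectancy at birth: {life_years:.1f} years")
```

Because every term depends on this year’s rates, a single bad winter feeds straight into the published figure, which is why period LE can move around far more than anyone’s actual prospects.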

Secondly, the excess winter deaths of 2014-15 make a substantial difference to the calculations and visual appearance of the data. We will only really know what is happening with that trend when we can see if that was a one-off event or something more sinister.

Thirdly, the appearance of a slowed increase in LE may be more the result of an upward deviation from trend between 2009 and 2011 than of a subsequent downward deviation, thus:

[Figure: life expectancy trend, with points lying above the trend line between 2009 and 2011]
Without those better, earlier figures, the furthest-right points would not appear anomalous, notwithstanding the 2014-15 excess winter deaths.

This sort of pattern also invites speculation that there is a ‘frail survivor’ effect here, whereby older individuals, having avoided earlier insults, are particularly vulnerable when subsequently exposed to e.g. a new circulating viral strain.

On Radio 4 this morning, the ‘Thought for the Day’ speaker reflected on the suggestion that we could be reaching a limit to the extension of life expectancy. To which it is worth pointing out that people have been predicting that for many years and have repeatedly found it necessary to revise their assumptions (e.g. Oeppen J, Vaupel JW. Broken Limits to Life Expectancy. Science. 2002;296(5570):1029–1031).