Britain’s Welfare State: Benevolence Backfires

A common refrain amongst left-wingers is that the U.S. ought to adopt a more comprehensive welfare state, especially in terms of healthcare, akin to those of Western Europe. The implicit assumption underlying this claim is that these nations are providing a better quality of life for their citizens. While this may in fact be the case for some nations in Western Europe, it is utterly incorrect in the case of the U.K.; the English welfare state is a failure on virtually every front and is particularly deleterious to those whom it ostensibly seeks to help. The social safety net (what Britons refer to as “social protection”) and public healthcare, the U.K.’s two largest government expenditures, are most often cited as exemplary programs for the U.S., despite their waste, graft and unintended consequences for their beneficiaries.

In the wake of the Great Depression and in the midst of World War II, William Beveridge, former Director of the London School of Economics and then Master of University College, Oxford, was commissioned by the wartime coalition government to write a report detailing the existing state of social insurance. Beveridge’s report, officially titled Social Insurance and Allied Services, was published in December 1942 and contained recommendations to eliminate the “five giants on the road of reconstruction” (6): want, disease, ignorance, squalor and idleness. The U.K.’s modern welfare state can be traced back to these specified provisions. To remedy the first of these “five giants,” want, i.e. deprivation, the English welfare state includes means-tested aid in the form of incapacity benefits for those who are unable to work and are therefore most likely to experience such deprivation.

However, determining who legitimately qualifies for incapacity benefits, i.e. who is physically incapable of working, has proven incredibly challenging for Britain. As James Bartholomew reports in The Welfare State We’re In (2004), the text upon which this essay is largely predicated, “Out of 2.4 million people claiming incapacity and similar benefits in May 2002, by far the biggest categories were 819,000 people apparently suffering ‘mental and behavioral disorders,’ a category which includes ‘stress’, and 523,000 people afflicted with ‘diseases of the musculoskeletal system and connective tissue,’ which includes backache” (122). One might be inclined to assume good faith on the part of the claimants, but this becomes increasingly difficult when one compares the percentage of the working-age population on incapacity benefits across Europe: 7% in the U.K., 2% in France, 3% in Spain, and 4% in Germany as of 2002 (70). By tying cash benefits to an inability to work that rests, in large part, on dubious and frankly unverifiable claims of illness, including minor afflictions such as backache, the U.K. welfare state creates perverse incentives to take advantage of it and thereby encourages one of its chief architect’s five enemies: idleness.

That said, this raises the question of why “idleness”, i.e. not having a job, is even a bad thing in the first place; maybe we would all be better off not working. While treating employment as an important metric for evaluating welfare programs may seem anachronistic and moralistic at first, employment turns out to be crucial to the happiness of Englishmen. Professor David Blanchflower of Dartmouth College and Professor Andrew Oswald of Warwick University, using data gathered by the Eurobarometer survey from 1975 to 1998, calculated that the depressing effect of being unemployed, controlling for all the other depressing circumstances scrutinized in their study, is 1.18, more than twice the size of the second-greatest effect: being separated from one’s spouse. Given that the overarching goal of the U.K.’s welfare state is to improve the health and happiness of its citizens, it is troubling that it encourages a condition so strongly correlated with the antithesis of both: depression.

Still, there are certainly those who are legitimately unable to provide for themselves due to infirmity caused by old age or serious illness, or simply because, despite their best efforts, they cannot find a job. Proponents of the welfare state might suggest that society can only address this problem through government, i.e. an organization with the ability to forcibly tax and redistribute wealth to the needy. Victorian England would beg to differ; it had a non-coercive, non-governmental solution: friendly societies, a.k.a. mutual aid organizations, and charities. From 1803 to 1910, membership in such voluntary insurance schemes grew from 704,350 to 6,600,000 (26), from approximately 7.62% to 18.3% of the entire English population. Even more spectacularly, Bartholomew cites reports from The Times that, in 1885, “the combined incomes of London charities came to more than the revenues of several European governments” and, in 1895, that “the average middle-class household spent 10 percent of its income on charities.” This level of charitable giving is particularly inspiring when one appreciates that modern Britons contribute a paltry 1% of income. So, to those who contend that private individuals wouldn’t contribute to welfare services of their own volition, the historical record says, “Bollocks!”
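A quick back-of-the-envelope check of those percentages (my own arithmetic, offered as an illustration rather than a figure from Bartholomew): dividing the membership totals by the stated shares gives the implied base populations,

$$
\frac{704{,}350}{0.0762} \approx 9.2 \text{ million}, \qquad \frac{6{,}600{,}000}{0.183} \approx 36.1 \text{ million},
$$

which line up roughly with the census populations of England and Wales in 1801 (about 9 million) and 1911 (about 36 million), suggesting that this is the base against which the percentages were calculated.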

What’s more, Victorian institutions performed their charitable functions in a manner that did not create perverse incentives to remain unemployed and dependent on assistance. Since the church and the mutual aid organizations would have been bankrupted by people needlessly taking advantage of them, they had strong motives for making sure that those who claimed to be unable to work were actually infirm and for helping the unemployed find work. Gorsky, in “Self Help and Mutual Aid: Friendly Societies in 19th Century Britain”, cites work by D.G. Green in “Reinventing Civil Society: The Rediscovery of Welfare Without Politics” to explain how mutual aid organizations in Victorian England “were ‘character-building associations’ whose ‘wholesome influence’ inculcated in their members ‘a commitment to fraternity’, ‘good character’, ‘status and self respect’” and even implemented elaborate initiation rituals to establish this conduct. The work of these scholars shows that Victorian mutual aid organizations were deeply committed to mitigating the moral hazard of living needlessly on the dole rather than contributing to the general coffers that provided aid to those who truly needed it.

Furthermore, these friendly societies avoided distorting incentives by providing just enough to subsist on, which, recognizing that anything more would lead to corruption, graft and abuse, is precisely what Beveridge himself called for in his report: “It is, first and foremost, a plan of insurance — of giving in return for contributions benefits up to subsistence level.” John Chodes, writing for the Foundation for Economic Education, offers an example of such subsistence-level support provided by the Union Provident Sick Society: “Sick benefits were roughly 25 to 33 percent of weekly wages for a year, and 15 to 20 percent for the remainder of the illness. For members over the age of 20, contributions and benefits were double. The surplus was divided each December, the members receiving shares in proportion to their contributions.” Membership in such austere yet robust institutions was remarkable: according to the Shepherds Friendly Society’s historical record, membership in mutual aid organizations peaked at a whopping 14 million in the 1940s, out of a total population of approximately 47 million. That figure is all the more considerable when one recognizes that these societies were composed almost exclusively of male heads of households and provided insurance for them as well as their nuclear families (figure 5).

Evidently, the subsistence level of support did not discourage workers from seeking membership. Instead, the modest benefits encouraged members not to “loaf”, as the British say, but to provide a higher-than-subsistence quality of life for themselves and their families and to fulfil their avowed obligations to fellow society members. This assertion is substantiated by data from the U.K.’s own Office for National Statistics which, while lacking continuous figures on work-force participation stretching back through the 19th century, does reveal that two of the three highest employment rates on record came before the passage of the National Insurance Act of 1946, which all but crowded friendly societies out of the market: “The highest employment rates recorded were in the years 1872, 1943 and 2018, at 76% of the working age population.”

Moving right along to public healthcare, how has England’s National Health Service been performing? Well, to evaluate its level of success, let’s first look at what it sought to achieve at its conception. In 1943, the Labour Party published a pamphlet entitled National Service for Health, calling for an unprecedented degree of government intervention in the healthcare sector; its vision would later be carried out by Aneurin Bevan, who became Labour’s Minister of Health in 1945 and founded the NHS in 1948. The pamphlet’s top five goals for the public control of healthcare were as follows: “Planned as a whole, so that there are no gaps in it… Preventative as well as curative… Complete, covering all kinds of treatment required… Open to all, irrespective of means or social position… Efficient and up to date” (3). These are undeniably laudable goals and, if they had actually been achieved by the U.K. government, I too would be a stalwart supporter of the National Health Service. Sadly, judged by Labour’s own criteria, the reality of the English healthcare system is disappointing.

More than half a century after the inception of the NHS, Bartholomew cites Dr. Maurice Slevin of the London NHS Trust, who determined that there were a whopping 269,080 managers, administrators and support staff, a higher number than the 266,170 nurses employed by the NHS (113). So the NHS is certainly “planned as a whole”, but what about the “no gaps” part? As for being “preventative as well as curative,” the NHS pales in comparison to the U.S. For example, with regard to screening and treating prostate cancer (the second most prevalent male cancer worldwide), 92% of men diagnosed in the U.S. survive for at least five years whereas only 49% do so in Britain (115). Lest one mistakenly believe that this discrepancy pertains only to prostate cancer, the Centers for Disease Control and Prevention published a report in 2015 showing that the U.S. has better five-year survival rates than the U.K. for the most common cancers: female breast (88.6% vs. 81.1%), colon (64.7% vs. 53.8%) and lung (18.7% vs. 9.6%).

In terms of its comprehensiveness, Professor Karol Sikora, former head of the World Health Organization’s cancer forum, concluded that “10,000 people a year would be saved from death if Britain matched the average performance of medical services in Europe” (118). Perhaps the NHS, despite its poor quality of service relative to both the U.S. and the rest of Europe, is at least “open to all.” Not even this can be said of it; the King’s Fund found in 2002 that three-quarters of senior healthcare managers across England believed that age discrimination occurred in the NHS (129). One might assume that the NHS nevertheless provides coverage to more Englishmen than had access to care before its creation. But, as the saying goes, assumptions make a donkey out of you and me: Bartholomew details that “many – well over half the population – would pay for their general practitioner through membership of a friendly society” in the late Victorian Age (104). The Welfare State We’re In similarly reveals that, in 1936, private charitable hospitals treated 60% of all acute care patients and were funded through voluntary contributions from locals, wealthy philanthropists and parishes (101).

Finally, the NHS is anything but “up to date.” Bartholomew cites Dr. Colin Connolly’s audit, conducted on behalf of the World Health Organization: “About a fifth of the equipment used in cancer treatment is obsolete. More than half the anesthesia machines need replacing… And over half the machines used for intensive care are past their use-by dates” (131). Furthermore, the U.K. made its paradigm-shifting contributions to medical knowledge, some of the most important in world history to this day, before the creation of the NHS: Dr. Edward Jenner’s smallpox vaccine (1798), the development of anesthesia (1800), the discovery that cholera is transmitted through the water supply (1854), antiseptics (1865) and antibiotics (1928). In 2019, only one of the top biotechnology and pharmaceutical firms by market capitalization was British, AstraZeneca, while the majority, fully fifteen, were American (rankings). A study published by the Brookings Institution on “Pharmaceutical industry profits and research and development” explains that the profit incentive present in the U.S. is strongly correlated with its higher rate of innovation relative to countries, such as the U.K., whose governments provide universal healthcare and negotiate prescription drug prices. And while the U.K. does not produce nearly as many pharmaceuticals as the U.S., it pays far less for them: according to a report published by the House Ways and Means Committee in September 2019, the average drug price in the U.K. is $111.52, whereas the U.S. pays $466.15, more than four times as much. In short, Americans are subsidizing prescription drug innovation for the rest of the world.

It is unfathomable that either Beveridge or Bevan had anything but the most patriotic and benevolent intentions: to utilize the legitimacy, authority and coffers of the British government to provide for the common welfare, specifically to make sure that no Englishman or woman would go without the basic needs of life or life-saving medical treatment. The disappointing reality of the matter is that the British government does not perform these functions well. Moreover, the Labour Party’s own pamphlet calling for the creation of the NHS acknowledged how much healthcare already existed outside state control, plainly stating that “many important health services… and great unofficial services such as the voluntary hospitals, the insurance companies, and all the doctors in private practice remain outside [The Ministry of Health’s] control.” The party only sought to extend that already high quality of healthcare to more citizens in a coordinated manner through its National Health Service.

To end on an optimistic note, the English needn’t delegate the important tasks of social security and healthcare to their incompetent government; as Victorian England showed, generous citizens and competent healthcare professionals are more than willing and able to provide these services through charities, friendly societies, community organizations and other voluntary mechanisms.