
You Only Live Twice – Cool Britannia to Cold Brexit: The United Kingdom, 1999-2019. Part One: Economics, Culture & Society.


Cold Shoulder or Warm Handshake?

On 29 March 2019, the United Kingdom of Great Britain and Northern Ireland will leave the European Union after forty-six years of membership, having joined the European Economic Community on 1 January 1973, on the same day and at the same hour as the Republic of Ireland. Yet in 1999, it looked as if the long-standing debate over Britain’s membership had been resolved. The Maastricht Treaty establishing the European Union had been signed by all the member states of the preceding European Community in February 1992 and was followed by a further treaty, signed in Amsterdam in 1997 and in force from 1999. What, then, has happened in the space of twenty years to so fundamentally change the ‘settled’ view of the British Parliament and people, bearing in mind that Scotland and Northern Ireland voted to remain in the EU, while England and Wales both voted to leave? At the time of writing, the manner of our going has not yet been determined, but the invocation of ‘Article 50’ by the Westminster Parliament and the UK government means that the date has been set. So either we will have to leave without a deal, turning a cold shoulder to our erstwhile friends and allies on the continent, or we will finally ratify the deal negotiated with the EU Commission, acting on behalf of the twenty-seven remaining member states, and leave with a warm handshake and most of our trading and cultural relations intact.

As yet, the possibility of a second referendum – or third, if we count the 1975 referendum called by Harold Wilson, which was also a binary leave/remain decision – seems remote. In any event, it is quite likely that the result would be the same, and that it would kill off any chance of the UK returning to EU membership for at least another generation. As Ian Fleming’s James Bond tells us, ‘you only live twice’. That certainly seems to be the mood in Brussels too. I was too young to vote in 1975 by just five days, and another membership referendum would be unlikely to occur in my lifetime. So much has been said about following ‘the will of the people’, or at least 52% of them, that it would be a foolish government, in an age of rampant populism, that chose to revoke Article 50, even if Westminster voted for this. At the same time, and in that same populist age, we know from recent experience that in politics and international relations, nothing is inevitable…


One of the major factors in the 2016 Referendum Campaign was the country’s public spending priorities, compared with those of the European Union. The ‘Leave’ campaign sent a double-decker bus around England stating that by ending the UK’s payments into the EU, more than 350 million pounds per week could be redirected to the National Health Service (NHS).

A British Icon Revived – The NHS under New Labour:

To understand the power of this statement, it is important to recognise that the NHS is unique in Europe in being funded almost wholly from general taxation rather than through social-insurance contributions, as in many other European countries. As a service created in 1948 to be ‘free at the point of delivery’, it is seen as a ‘British icon’, and funding has been a central issue in national election campaigns since 2001, when Tony Blair was confronted by an irate voter, Sharon Storer, outside a hospital. In its first election manifesto of 1997, ‘New Labour’ promised to ‘safeguard the basic principles of the NHS, which we founded’. The ‘we’ here was the post-war Labour government, whose socialist Health Minister, Aneurin Bevan, had established the service in the teeth of considerable opposition from within both parliament and the medical profession. ‘New Labour’ protested that under the Tories there had been fifty thousand fewer nurses but a rise of no fewer than twenty thousand managers – red tape which Labour would pull away and burn. Though critical of the internal markets the Tories had introduced, Blair promised to keep a split between those who commissioned health services and those who provided them.


Under Frank Dobson, Labour’s new Health Secretary, there was little reform of the NHS, but there was, year by year, just enough extra money to stave off the winter crises. Then a series of tragic individual cases hit the headlines, and one of them was raised by a Labour peer and well-known medical scientist and fertility expert, Professor Robert Winston, who was greatly admired by Tony Blair. He launched a furious denunciation of the government over the treatment of his elderly mother. Far from upholding the NHS’s iconic status, Winston said that Britain’s health service was the worst in Europe and was getting worse under the New Labour government, which was being deceitful about the true picture. Labour’s own polling showed that, in the view of the country as a whole, Winston’s assessment was broadly correct. In January 2000, therefore, Blair announced directly to the country that he would bring Britain’s health spending up to the European average within five years. That was a huge promise, because it meant spending a third as much again in real terms, and his ‘prudent’ Chancellor of the Exchequer, Gordon Brown, was unhappy that Blair had not spoken enough on television about the need for health service reform to accompany the money, and had also ‘stolen’ his budget announcements. On Budget day itself, Brown announced that until 2004 health spending would rise at more than six per cent above inflation every year, …

… by far the largest sustained increase in NHS funding in any period in its fifty-year history … half as much again for health care for every family in this country.       

The tilt away from Brown’s sharp spending controls during the first three years of the New Labour government had begun by the first spring of the new millennium, and there was more to come. With a general election looming in 2001, Brown also announced a review of the NHS and its future by the former banker Derek Wanless. As soon as the election was over, broad hints about necessary tax rises were dropped. When the Wanless Report was finally published, it confirmed much that the winter crisis of 1999-2000 had exposed. The NHS was not, whatever Britons fondly believed, better than health systems in other developed countries, and it needed a lot more money. ‘Wanless’ also rejected a radical change in funding, such as a switch to insurance-based or semi-private health care. Brown immediately used this as objective proof that taxes had to rise in order to save the NHS. In his next budget, in 2002, Brown broke with a political convention that had reigned since the mid-eighties: that direct taxes would not be raised again. He imposed a special one per cent national insurance levy, equivalent to a penny on income tax, to fund the huge reinvestment in Britain’s health.

Public spending shot up with this commitment and, in some ways, it paid off, since by 2006 there were around 300,000 extra NHS staff compared to 1997. That included more than ten thousand extra senior hospital doctors (about a quarter more) and 85,000 more nurses. But there were also nearly forty thousand more managers, twice as many as Blair and Brown had ridiculed the Tory government for hiring. An ambitious computer project for the whole NHS became an expensive catastrophe. Meanwhile, the health service budget rose from thirty-seven billion to more than ninety-two billion pounds a year. The investment produced results: waiting lists, a source of great public anger from the mid-nineties, fell by 200,000, and by 2005 Blair was able to talk of the best waiting-list figures since 1988. Hardly anyone was left waiting for an inpatient appointment for more than six months. Death rates from cancer for people under the age of seventy-five fell by 15.7 per cent between 1996 and 2006, and death rates from heart disease fell by just under thirty-six per cent. Meanwhile, the private finance initiative meant that new hospitals were being built around the country. But, unfortunately for New Labour, that was not the whole story of the Health Service under their stewardship. As Andrew Marr has attested,

…’Czars’, quangos, agencies, commissions, access teams and planners hunched over the NHS as Whitehall, having promised to devolve power, now imposed a new round of mind-dazing control.

By the autumn of 2004, hospitals were subject to more than a hundred inspections. War broke out between Brown and the Treasury on one side and the ‘Blairite’ Health Secretary, Alan Milburn, on the other, over the basic principles of running the hospitals. Milburn wanted more competition between them, but Brown did not see how this was possible when most people had only one major local hospital. Polling suggested that Brown was making a popular point: most people simply wanted better hospitals, not more choice. A truce was eventually declared with the establishment of a small number of independent ‘foundation’ hospitals. By the 2005 general election, Michael Howard’s Conservatives were attacking Labour for wasting money and allowing people’s lives to be put at risk in dirty, badly run hospitals. Just as Labour once had, they were promising to cut bureaucracy and the number of organisations within the NHS. By the summer of 2006, despite the huge injection of funds, the Service was facing a cash crisis. Although the shortfall was not huge as a percentage of the total budget, trusts in some of the most vulnerable parts of the country were on the edge of bankruptcy, from Hartlepool to Cornwall and across to London. Throughout Britain, seven thousand jobs had gone, and the Royal College of Nursing, the professional association to which most nurses belonged, was predicting that thirteen thousand more would go soon. Many newly and expensively qualified doctors and even specialist consultants could not find work. It seemed that wage costs, expensive new drugs, poor management and the money poured into endless bureaucratic reforms had resulted in a still-inadequate service. Bupa, the leading private operator, had covered some 2.3 million people in 1999; six years later, the figure was more than eight million. This partly reflected greater affluence, but it was hardly a resounding vote of confidence in Labour’s management of the NHS.

Public Spending, Declining Regions & Economic Development:

As public spending began to flow during the second Blair administration, vast amounts of money went on pay rises, new bureaucracies and bills for outside consultants. Ministries, unused to spending again after the initial period of ‘prudence’, did not always spend it well. Brown and his Treasury team resorted to double and triple counting of early spending increases in order to give the impression that they were doing more for hospitals, schools and transport than they actually could. As Marr has pointed out, …

… In trying to achieve better policing, more effective planning, healthier school food, prettier town centres and a hundred other hopes, the centre of government ordered and cajoled, hassled and harangued, always high-minded, always speaking for ‘the people’.  

The railways, after yet another disaster, were shaken up again. In very controversial circumstances Railtrack, the once-profitable monopoly company operating the lines, was driven to bankruptcy and a new system of Whitehall control was imposed. At one point, Tony Blair boasted of having five hundred targets for the public sector. Parish councils, small businesses and charities found that they were loaded with directives. Schools and hospitals had many more. Marr has commented, …

The interference was always well-meant but it clogged up the arteries of free decision-taking and frustrated responsible public life. 


Throughout the New Labour years, with steady growth and low inflation, most of the country grew richer. Growth since 1997, at 2.8 per cent per year, was above the post-war average; GDP per head was above that of France and Germany; and the country had the second-lowest jobless figures in the EU. The number of people in work increased by 2.4 million. Incomes grew, in real terms, by about a fifth. Pensions were in trouble, but house-price inflation soared, so owners found their properties more than doubling in value and came to think of themselves as prosperous. By 2006, analysts were assessing the disposable wealth of the British at forty thousand pounds per household. However, the wealth was not spread evenly across the country, averaging sixty-eight thousand pounds in the south-east of England but a little over thirty thousand in Wales and north-east England. Even in the historically poorer parts of the UK, though, house prices had risen so fast that government plans to bulldoze worthless northern terraces had to be abandoned when the houses started to regain value. Cheap mortgages, easy borrowing and high property prices meant that millions of people felt far better off, despite the overall rise in the tax burden. Cheap air travel gave the British opportunities for easy journeys both to traditional resorts and to every part of the European continent. British expatriates were able to buy properties across the French countryside and in southern Spain. Some even began to commute weekly to jobs in London or Manchester from Mediterranean villas, and regional airports boomed as a result.

Sir Tim Berners-Lee arriving at the Guildhall to receive the Honorary Freedom of the City of London

The World-Wide Web, ‘invented’ by the British computer scientist Tim Berners-Lee (pictured above in 2014) at the end of 1989, was carrying the internet from the colleges and institutions into everyday life by the mid-‘noughties’. It first began to attract popular interest in the mid-nineties: Britain’s first internet café and first internet magazine, reviewing a few hundred early websites, were both launched in 1994. The following year saw the beginning of internet shopping as a major pastime, with both ‘eBay’ and ‘Amazon’ arriving, though to begin with they attracted only tiny numbers of people.

But the introduction of new forms of mail-order and ‘click and collect’ shopping quickly attracted significant numbers of adherents from different ‘demographics’. The growth of the internet fed a mood of optimism, despite warnings – taken seriously at the time – that the whole digital world would collapse because older computers stored years as two digits and could not cope with the rollover to ‘2000’. In fact, the ‘dot-com’ bubble was burst by its own excessive expansion, as with any bubble, and following a pause and a lot of ruined dreams, the ‘new economy’ roared on again. By 2000, according to the Office for National Statistics (ONS), around forty per cent of Britons had accessed the internet at some time. Three years later, nearly half of British homes were ‘online’. By 2004, the spread of ‘broadband’ connections had brought a new mass market in ‘downloading’ music and video. By 2006, three-quarters of British children had internet access at home.


Simultaneously, the rich of America, Europe and Russia began buying up parts of London, and then other ‘attractive’ parts of the country, including Edinburgh, the Scottish Highlands, Yorkshire and Cornwall. ‘Executive housing’, with pebbled driveways, brick facing and dormer windows, spread across farmland and along rivers with no thought of flood-plain constraints. Parts of the country far from London, such as the English south-west and Yorkshire, enjoyed a ripple of wealth that pushed their house prices to unheard-of levels. From Leith to Gateshead, Belfast to Cardiff Bay, once-derelict shorefront areas were transformed. The nineteenth-century buildings of the Albert Dock in Liverpool now house a maritime museum, an art gallery, a shopping centre and a television studio, and the dock has become a tourist attraction in its own right. For all the problems and disappointments, and the longer-term difficulties with their financing, new schools and public buildings sprang up – new museums, galleries, vast shopping complexes and corporate headquarters in a biomorphic architecture of glass and steel, more imaginative and better-looking than their predecessors from the dreary age of concrete.


Supermarket chains exercised huge market power, bringing cheap meat and dairy products within almost everyone’s budget. Factory-made ready-meals were imported via the new global air-freight market and moved in refrigerated lorries travelling freely across a Europe shorn of internal barriers. Out-of-season fruit and vegetables, fish from the Pacific, exotic foods of all kinds and freshly cut flowers appeared in superstores everywhere. Hardly anyone was out of reach of a ‘Tesco’, a ‘Morrisons’, a ‘Sainsbury’s’ or an ‘Asda’. By the mid-noughties, the four supermarket giants owned more than 1,500 superstores throughout the UK. They spread the consumption of goods that in the eighties and nineties had seemed like luxuries. Students now had to take out loans in order to go to university, but they were far more likely to go than previous generations, as well as to travel more widely on a ‘gap’ year, not just to study or work abroad.

Those ‘Left Behind’ – Poverty, Pensions & Public Order:

Materially, for the majority of people, this was, to use Marr’s term, a ‘golden age’, which perhaps helps to explain why real anger about earlier pension decisions and stealth taxes did not translate into anti-Labour voting in successive general elections. The irony is that in pleasing ‘Middle Englanders’, the Blair-Brown government lost contact with traditional Labour voters, especially in the North of Britain, who did not benefit from these ‘golden years’ to the same extent. Gordon Brown, from the first, made much of New Labour’s anti-poverty agenda, especially on child poverty, which had become a particularly emotive issue since the launch of the Child Poverty Action Group. Labour policies took a million children out of relative poverty between 1997 and 2004, though the numbers rose again later. Brown’s emphasis was on the working poor and the virtue of work. So his major innovations were the national minimum wage, the ‘New Deal’ for the young unemployed, and the working families’ tax credit, as well as tax credits aimed at children. There was also a minimum income guarantee and, later, a pension credit for poorer pensioners.

The minimum wage was first set at three pounds sixty an hour, rising year by year; by 2006 it was £5.35 an hour. Because the figures were low, it did not destroy the two million jobs the Tories claimed it would. Neither did it produce higher inflation; employment continued to grow while inflation remained low. It even seemed to have cut red tape. By the mid-noughties, the minimum wage covered two million people, the majority of them women. Because it was uprated ahead of inflation, the wages of the poor also rose faster. It was so successful that even the Tories were forced to embrace it ahead of the 2005 election. The New Deal was funded by a windfall tax on the privatised utility companies; by 2000 Blair said it had helped a quarter of a million young people back into work, and it was still being claimed as a major factor in lower rates of unemployment as late as 2005. But the National Audit Office, looking back on its effect in the first parliament, reckoned the number of under-twenty-five-year-olds helped into real jobs was as low as 25,000, at a cost of eight thousand pounds per person. A second initiative was targeted at the babies and toddlers of the most deprived families. ‘Sure Start’ was meant to bring mothers together in family centres across Britain – 3,500 were planned for 2010, ten years after the scheme had been launched – and to help them become more effective parents. However, some of the most deprived families failed to show up. As Andrew Marr wrote, back in 2007:

Poverty is hard to define, easy to smell. In a country like Britain, it is mostly relative. Though there are a few thousand people living rough or who genuinely do not have enough to keep them decently alive, and many more pensioners frightened of how they will pay for heating, the greater number of poor are those left behind the general material improvement in life. This is measured by income compared to the average and by this yardstick in 1997 there were three to four million children living in households of relative poverty, triple the number in 1979. This does not mean they were physically worse off than the children of the late seventies, since the country generally became much richer. But human happiness relates to how we see ourselves relative to those around us, so it was certainly real. 

The Tories, now under new management in the shape of a media-marketing executive and Old Etonian, David Cameron, also declared that they believed in this concept of relative poverty. After all, it was on their watch, during the Thatcher and Major governments, that it had tripled; even so, it was only towards the end of the New Labour years that they could accept the definition used by the left-of-centre Guardian columnist Polly Toynbee. A world of ‘black economy’ work also remained below the minimum wage, in private care homes, where migrant workers were exploited, and in other nooks and crannies. Some 336,000 jobs remained on ‘poverty pay’ rates. Yet ‘redistribution of wealth’, a socialist phrase which had become unfashionable under New Labour lest it scare away Middle Englanders, was stronger in Brown’s Britain than in other major industrialised nations. Despite the growth of the super-rich, many of whom were immigrants anyway, overall equality increased in these years. One factor in this was the return to means-testing of benefits, particularly for pensioners and through the working families’ tax credit, subsequently divided into a child tax credit and a working tax credit. This was a U-turn by Gordon Brown, who had opposed means-testing when in Opposition. As Chancellor, he concluded that if he was to direct scarce resources at those in real poverty, he had little choice.

Apart from its demoralising effect on pensioners, the other drawback to means-testing was that a huge bureaucracy was needed to track people’s earnings and to establish exactly what they should be getting in benefits. Billions were overpaid, and as people did better and earned more from more stable employment, they then found themselves facing huge demands to hand back money they had already spent. Thousands of extra civil servants were needed to deal with the subsequent complaints, and the scheme became extremely expensive to administer. There were also controversial drives to oblige more disabled people back to work, and the ‘socially excluded’ were confronted by a range of initiatives designed to make them more middle class. Compared with Mrs Thatcher’s ‘Victorian Values’ and Mr Major’s ‘Back to Basics’ campaigns, Labour was supposed to be non-judgemental about individual behaviour. But a form of moralism did begin to reassert itself. Parenting classes were sometimes mandated through the courts and, for the minority who made life hell for their neighbours on housing estates, Labour introduced the Anti-Social Behaviour Order (‘Asbo’). These were first given out in 1998, granted by magistrates on application from either the police or the local council. It became a criminal offence to break the curfew or other sanction, which could be highly specific. Asbos could be given out for swearing at others in the street, harassing passers-by, vandalism, making too much noise, graffiti, organising ‘raves’, flyposting, taking drugs, sniffing glue, joyriding, prostitution, hitting people and drinking in public.


Although they served a useful purpose in many cases, there were fears that, for the really rough elements in society and their tough children, Asbos became a badge of honour. Since breaking an Asbo could result in an automatic prison sentence, people were sent to jail for behaviour that had not warranted it before. But as they were refined in use and strengthened, they became more effective and routine. By 2007, seven and a half thousand had been given out in England and Wales alone, and Scotland had introduced its own version in 2004. Some civil liberties campaigners saw this development as part of a wider authoritarian and surveillance agenda, which also led to the widespread use of CCTV (closed-circuit television) cameras by the police and private security guards, especially in town centres. In 2007 it was estimated that the British were being observed and recorded by 4.2 million such cameras – one camera for every fourteen people, a higher ratio than in any other country in the world, with the possible exception of China. In addition, the number of mobile phones was already equivalent to the number of people in Britain. With Global Positioning System (GPS) chips, these could show exactly where their users were, and the use of such systems in cars and even out on the moors meant that Britons were losing their age-old prowess at map-reading.


The ‘Seven Seven’ Bombings – The Home-grown ‘Jihadis’:

Despite these increasing means of mass surveillance, Britain’s cities have remained vulnerable to terrorist attacks, more recently by so-called ‘Islamic terrorists’ rather than by the Provisional IRA, who abandoned their bombing campaign in 1998. On 7 July 2005, at rush-hour, four young Muslim men from West Yorkshire and Buckinghamshire murdered fifty-two people and injured 770 others by blowing themselves up on London Underground trains and on a London bus. The report into this, the worst such attack in Britain, later concluded that they were not part of an al-Qaeda cell, though two of them had visited camps in Pakistan, and that the rucksack bombs had been constructed at a cost of a few hundred pounds. Despite the government’s insistence that the war in Iraq had not made Britain more of a target for terrorism, the Home Office investigation asserted that the four had been motivated, in part at least, by ‘British foreign policy’.

They had picked up the information they needed for the attack from the internet. It was a particularly grotesque attack because of the terrifying and bloody conditions in the underground tunnels, and it vividly reminded the country that it was as much a target as the United States or Spain. Indeed, the long-standing and intimate relationship between Great Britain and Pakistan, with constant and heavy air traffic between them, provoked fears that the British would prove uniquely vulnerable. Tony Blair heard of the attack at the most poignant time, just after London’s great success in winning the bid to host the 2012 Olympic Games. The ‘Seven Seven’ bombings are unlikely to have been stopped by CCTV surveillance, of which there was plenty at the tube stations; nor by ID cards (which had recently been under discussion), since the killers were British subjects; nor by financial surveillance, since little money was involved and the materials were paid for in cash. Even better intelligence might have helped, but the Security Services, ‘MI5’ and ‘MI6’ as they are known, were already in receipt of huge increases in their budgets as they tracked down other murderous cells. In 2006, police arrested suspects in Birmingham, High Wycombe and Walthamstow, in east London, believing there was a plot to blow up as many as ten passenger aircraft over the Atlantic.

After many years of allowing dissident clerics and activists from the Middle East asylum in London, Britain had more than its share of inflammatory and dangerous extremists, who admired al Qaeda and preached violent jihad. Once 11 September 2001 had changed the climate, new laws were introduced to allow the detention without trial of foreigners suspected of being involved in supporting or fomenting terrorism. They could not be deported because human rights legislation forbade sending back anyone to countries where they might face torture. Seventeen were picked up and held at Belmarsh high-security prison. But in December 2004, the House of Lords ruled that these detentions were discriminatory and disproportionate, and therefore illegal. Five weeks later, the Home Secretary Charles Clarke hit back with ‘control orders’ to limit the movement of men he could not prosecute or deport. These orders would also be used against home-grown terror suspects. A month later, in February 2005, sixty Labour MPs rebelled against these powers too, and the government only narrowly survived the vote. In April 2006 a judge ruled that the control orders were an affront to justice because they gave the Home Secretary, a politician, too much power. Two months later, the same judge ruled that curfew orders of eighteen hours per day on six Iraqis were a deprivation of liberty and also illegal. The new Home Secretary, John Reid, lost his appeal and had to loosen the orders.


Britain found itself in a struggle between its old laws and liberties and a new, borderless world in which the hallowed principles of habeas corpus, free speech, the presumption of innocence, asylum, the right of British subjects to travel freely in their own country without identifying papers, and the sanctity of the law-abiding home were all placed in increasing jeopardy. To government ministers, the new political powers seemed the least they needed to deal with a threat that might last another thirty years – in order, paradoxically, to secure Britain’s liberties for the long term beyond it. They were sure that most British people agreed, and that the judiciary, media, civil rights campaigners and elected politicians who protested were an ultra-liberal minority. Tony Blair, John Reid and Jack Straw were emphatic about this, and it was left to liberal Conservatives and the Liberal Democrats to mount the barricades in defence of civil liberties. Andrew Marr conceded at the time that the New Labour ministers were ‘probably right’. With the benefit of hindsight, others will probably agree. As Gordon Brown eyed the premiership, his rhetoric was similarly tough, but as Blair was forced to turn to the ‘war on terror’ and Iraq, he failed to concentrate enough on domestic policy. By 2005, neither man could be bothered to disguise their mutual enmity. A gap seemed to open up between Blair’s enthusiasm for market ideas in the reform of health and schools and Brown’s determination to deliver better lives for the working poor. Brown was also keen on bringing private capital into public services, but there was a difference in emphasis which both men played up. Blair claimed that the New Labour government was ‘best when we are at our boldest’. Brown retorted that it was ‘best when we are Labour’.

Tony Blair’s legacy continued to be paraded on the streets of Britain, here blaming him and George Bush for the rise of ‘Islamic State’ in Iraq.

Asylum Seekers, EU ‘Guest’ Workers & Immigrants:

One result of the long Iraqi conflict, the major combat phase of which President Bush declared over on 1 May 2003, was the arrival of many Iraqi asylum-seekers in Britain: Kurds as well as Shiites and Sunnis. This attracted little comment at the time, because there had been both Iraqi and Iranian refugees in Britain since the 1970s, especially as students, and the fresh influx was only a small part of a much larger migration into the country which changed it fundamentally during the Blair years. This was a multi-lingual migration, including many Poles, some Hungarians and other Eastern Europeans whose countries had joined the EU and its single market in 2004. When the EU expanded, Britain decided that, unlike France or Germany, it would not try to delay opening the country to migrant workers. The accession treaties gave nationals from these countries the right to freedom of movement and settlement, and with average earnings three times higher in the UK, this was an opportunity which many Eastern Europeans were keen to take. Some member states, however, exercised their right to ‘derogation’ from the treaties, permitting migrant workers to be employed only if an employer could not find a local candidate. In European Union legislation, a derogation means that a member state has opted not to enforce a specific provision of a treaty owing to internal circumstances – in this case, delaying full implementation of free movement for up to five years. The UK decided not to exercise this option.

There were also sizeable inflows of western Europeans, though these were mostly students, who (somewhat controversially) were also counted in the immigration statistics, and young professionals with multi-national companies. At the same time, there was continued immigration from Africa, the Middle East and Afghanistan, as well as from Russia, Australia, South Africa and North America. In 2005, according to the Office for National Statistics, ‘immigrants’ were arriving to live in Britain at the rate of 1,500 a day. Since Tony Blair had been in power, more than 1.3 million had arrived. By the mid-2000s, English was no longer the first language of half the primary school children in London, and the capital had more than 350 different first languages. Five years later, the same could be said of many towns in Kent and other Eastern counties of England.

The poorer of the new migrant groups were almost entirely unrepresented in politics, but radically changed the sights, sounds and scents of urban Britain, and even some of its market towns. The veiled women of the Muslim world or its more traditionalist Arab, Afghan and Pakistani quarters became common sights on the streets, from Kent to Scotland and across to South Wales. Polish tradesmen, fruit-pickers and factory workers were soon followed by shops owned by Poles or stocking Polish and East European delicacies and selling Polish newspapers and magazines. Even road signs appeared in Polish, though in Kent these were mainly put in place along trucking routes used by Polish drivers, where for many years signs had been in French and German, a recognition of the employment changes in the long-distance haulage industry. Even as far north as Cheshire (see below), these were put in place to help monolingual truckers using trunk roads, rather than local Polish residents, most of whom had enough English to understand such signs either upon arrival or shortly afterwards. Although specialist classes in English had to be laid on in schools and community centres, there was little evidence that multi-lingual migrants had a negative long-term impact on local children and wider communities. In fact, schools were soon reporting a positive effect on attitudes toward learning and on general educational standards.

001

Problems were posed, however, by the operations of people-smugglers and criminal gangs. Chinese villagers were involved in a particular tragedy in 2004, when a group of them, picking cockles in Morecambe Bay, were caught by the notorious tides and drowned. Many more were working for ‘gang-masters’ as virtual, in some cases actual, ‘slaves’. Russian voices became common on the London Underground, and among prostitutes on the streets. The British Isles found themselves ‘islands in the stream’ of international migration, the chosen ‘sceptred isle’ destinations of millions of newcomers. Unlike Germany, Britain was no longer a dominant manufacturing country but had become, by the late twentieth century, a popular place to develop digital and financial products and services. Together with the United States, it had been determined throughout the Cold War to preserve a system of representative democracy and the free market against the Soviet Union. Within the EU, Britain maintained its earlier determination to resist the Franco-German federalist model, with its ‘social chapter’ involving ever tighter controls over international corporations and ever closer political union. Britain had always gone out into the world. Now, increasingly, the world came to Britain, whether poor immigrants, rich corporations or Chinese manufacturers.

005

Multilingual & Multicultural Britain:

Immigration had always been a constant factor in British life; now it was also a fact of life which Europe and the whole world had to come to terms with. Earlier post-war migrations to Britain had provoked a racialist backlash, riots, the rise of extreme right-wing organisations, and a series of new laws aimed at controlling both immigration from the Commonwealth and the backlash to it. The later migrations were controversial in different ways. The ‘Windrush’ arrivals from the Caribbean and those from the Indian subcontinent were people who looked different but who spoke the same language and had, in many ways, received a similar education to that of the ‘native’ British. Many of the later migrants from Eastern Europe looked similar to the white British but shared little by way of a common linguistic and cultural background. However, it is not entirely true to suggest, as Andrew Marr seems to, that they did not have a shared history. Certainly, through no fault of their own, the Eastern Europeans had been cut off from their western counterparts by their absorption into the Soviet Russian empire after the Second World War, but in the first half of the century Poland had helped the British Empire to subdue its greatest rival, Germany, as had most of the peoples of the former Yugoslavia. Even during the Soviet ‘occupation’ of these countries, many of their citizens had found refuge in Britain.

Moreover, by the early 1990s, Britain had already become a multilingual nation. In 1991, Safder Alladina and Viv Edwards published a book for the Longman Linguistics Library which detailed the Hungarian, Lithuanian, Polish, Ukrainian and Yiddish speech communities of previous generations. Growing up in Birmingham, I certainly heard many Polish, Yiddish, Yugoslav and Greek accents among my neighbours and the parents of school friends, at least as often as I heard Welsh, Irish, Caribbean, Indian and Pakistani accents. The Longman book begins with a foreword by Debi Prasanna Pattanayak, in which she stated that the Language Census of 1987 had shown 172 different languages spoken by children in the schools of the Inner London Education Authority. In an interesting precursor of the controversy to come, she related how the reaction in many quarters was stunned disbelief, and how one British educationalist had told her that England had become a third world country. She commented:

After believing in the supremacy of English as the universal language, it was difficult to acknowledge that the UK was now one of the greatest immigrant nations of the modern world. It was also hard to see that the current plurality is based on a continuity of heritage. … Britain is on the crossroads. It can take an isolationist stance in relation to its internal cultural environment. It can create a resilient society by trusting its citizens to be British not only in political but in cultural terms. The first road will mean severing dialogue with the many heritages which have made the country fertile. The second road would be working together with cultural harmony for the betterment of the country. Sharing and participation would ensure not only political but cultural democracy. The choice is between mediocrity and creativity.

002

Language and dialect in the British Isles, showing the linguistic diversity in many English cities by 1991 as a result of Commonwealth immigration as well as the survival and revival of many of the older Celtic languages and dialects of English.

Such ‘liberal’, ‘multi-cultural’ views may be unfashionable now, more than a quarter of a century later, but it is perhaps worth stopping to look back on that cultural crossroads, and to ask whether we are now back at that same crossroads or have arrived at another one. By the 1990s, the multilingual setting in which new Englishes evolved had become far more diverse than it had been in the 1940s, owing to immigration from the Indian subcontinent, the Caribbean, the Far East, and West and East Africa. The largest of the ‘community languages’ was Punjabi, with over half a million speakers, but there were also substantial communities of Gujarati speakers (perhaps a third of a million) and a hundred thousand Bengali speakers. In some areas, such as East London, public signs and notices recognised this (see below). Bengali-speaking children formed the most recent and largest linguistic minority within the ILEA, and because the majority of them had been born in Bangladesh, they were inevitably in the greatest need of language support within the schools. In this way, Commonwealth immigration introduced a new level of linguistic and cultural diversity.

003

007

Birmingham’s booming postwar economy attracted West Indian settlers from Jamaica, Barbados and St Kitts in the 1950s. By 1971, the South Asian and West Indian populations were equal in size and concentrated in the inner city wards of North and Central Birmingham (see the map above). After the hostility towards New Commonwealth immigrants among some sections of the local white population in the 1960s and ’70s, they had become more established in cities like Birmingham, where places of worship, ethnic groceries, butchers and, perhaps most significantly, ‘balti’ restaurants began to proliferate in the 1980s and ’90s. The settlers materially changed the cultural and social life of the city, most of the ‘white’ population believing that these changes were for the better. By 1991, Pakistanis had overtaken West Indians and Indians to become the largest single ethnic minority in Birmingham. The concentration of West Indian and South Asian British people in the inner city areas changed little by the end of the century, though there was an evident flight to the suburbs by Indians. As well as being poorly paid, the factory work available to South Asian immigrants, like the man in a Bradford textile factory below, was unskilled. By the early nineties, the decline of the textile industry over the previous two decades had led to high long-term unemployment in the immigrant communities of the Northern towns, leading to serious social problems.

006

Nor is it entirely true to suggest, as referred to above, that Caribbean arrivals in Britain faced few linguistic obstacles in integrating themselves into British life from the late 1940s to the late 1980s. By the end of those forty years, the British West Indian community had developed its own “patois”, which had a special place as a token of identity. One Jamaican schoolgirl living in London in the late eighties explained the social pressures that frowned on Jamaican English in Jamaica but made it almost obligatory in London. She had not been allowed to speak Jamaican Creole in front of her parents in Jamaica. When she arrived in Britain and went to school, she naturally tried to fit in by speaking the same patois, but some of her British Caribbean classmates told her that, as a “foreigner”, she should not try to be like them and should speak only English. She persevered with the patois, however, lost her British accent within a year, and was accepted by her classmates. Yet for many Caribbean visitors to Britain, the patois of Brixton and Notting Hill was a stylized form that was not truly Jamaican, not least because British West Indians had come from all parts of the Caribbean. When another British West Indian girl, born in Britain, was taken to visit Jamaica, she found herself being teased about her London patois and told to speak English.

003

The predicament that still faced the ‘Black British’ in the late eighties and into the nineties was that, for all the rhetoric, they were still not fully accepted by the established ‘White community’. Racism remained an everyday reality for large numbers of British people, and there was plenty of evidence of the ways in which Black people were systematically denied access to employment in all sections of the job market. That a racist calamity like the murder in London of the black teenager Stephen Lawrence could happen in 1993 was testimony to how little British society’s ability to face up to racism had changed since the 1950s. As a result, the British-Caribbean population could still not feel itself to be fully British. This was the poignant outcome of what the British Black writer Caryl Phillips has called “The Final Passage”, the title of his novel, which is narrated in Standard English with the direct speech of the characters rendered in Creole. Phillips migrated to Britain as a baby with his parents in the 1950s, and sums up his linguistic and cultural experience as follows:

“The paradox of my situation is that where most immigrants have to learn a new language, Caribbean immigrants have to learn a new form of the same language. It induces linguistic schizophrenia – you have an identity that mirrors the larger cultural confusion.”

One of his older characters in The Final Passage characterises “England” as a “college for the West Indian”, and, as Phillips himself put it, that is “symptomatic of the colonial situation; the language is divided as well”. As the “Windrush Scandal”, involving the deportation of British West Indians from the UK, has recently shown, this post-colonial “cultural confusion” still ‘colours’ political and institutional attitudes twenty-five years after the death of Stephen Lawrence, leading to discriminatory judgements by officials. The example also shows how difficult it is to arrive at any simple chronological classification of migrations to Britain: the economic expansion of the 1950s and 1960s; the asylum-seekers of the 1970s and 1980s; and the EU expansion and integration of the 1990s and the first decades of the 2000s. Such an approach assumes stereotypical patterns of settlement for the different groups, whereas the reality was much more diverse. Most South Asians, for example, arrived in Britain in the post-war period, but they were joining a migration ‘chain’ which had been established at the beginning of the twentieth century. Similarly, most Eastern European migrants arrived in Britain in several quite distinct waves of population movement. This led the authors of the Longman Linguistics book to organise it into geolinguistic areas, as shown in the figure below:

001

The Poles and Ukrainians of the immediate post-war period, the Hungarians in the 1950s, the Vietnamese refugees in the 1970s and the Tamils in the 1980s sought asylum in Britain as refugees. In contrast, settlers from India, Pakistan, Bangladesh and the Caribbean had, in the main, come from areas of high unemployment and/or low wages, for economic reasons. It was not possible, even then, to make a simple split between political and economic migrants since, even within the same group, motivations differed through time. The Eastern Europeans who had arrived in Britain since the Second World War had come for a variety of reasons; in many cases, they were joining earlier settlers, trying either to escape poverty in the home country or to better their lot. A further important factor in the discussion about the various minority communities in Britain was the pattern of settlement. Some groups were concentrated in a relatively small geographical area, which made it possible to develop and maintain strong social networks; others were more dispersed and so found it more difficult to maintain a sense of community. Most Spaniards, Turks and Greeks were found in London, whereas Ukrainians and Poles were scattered throughout the country. In the case of the Poles, the communities outside London were sufficiently large to be able to sustain an active community life; in the case of Ukrainians, however, the small numbers and the dispersed nature of the community made the task of forging a separate linguistic and cultural identity a great deal more difficult.

Groups who had little contact with the home country also faced very real difficulties in retaining their distinct identities. Until 1992, Lithuanians, Latvians, Ukrainians and Estonians were unable to travel freely to their countries of origin; neither could they receive visits from family members left behind; and until the mid-noughties, there was no possibility of new immigration which would have the effect of revitalizing these communities in Britain. Nonetheless, they showed great resilience in maintaining their ethnic minority identities, not only through community involvement in the UK but by building links with similar groups in Europe and even in North America. The inevitable consequence of settlement in Britain was a shift from the mother tongue to English. The extent of this shift varied according to individual factors, such as the degree of identification with the mother tongue culture; it also depended on group factors, such as the size of the community, its degree of self-organisation and the length of time it had been established in Britain. For more recently arrived communities such as the Bangladeshis, the acquisition of English was clearly a more urgent priority than the maintenance of the mother tongue, whereas, for the settled Eastern Europeans, the shift to English was so complete that mother tongue teaching was often a more urgent community priority. There were reports of British-born Ukrainians and Yiddish-speaking Jews, brought up in predominantly English-speaking homes, who were striving to produce an environment in which their children could acquire their ‘heritage’ language.

Blair’s Open Door Policy & EU Freedom of Movement:

During the 1980s and ’90s, under the ‘rubric’ of multiculturalism, a steady stream of immigration into Britain continued, especially from the Indian subcontinent. But an unspoken consensus existed whereby immigration, while always gradually increasing, was controlled. What happened after the Labour Party’s landslide victory in 1997 was a breaking of that consensus, according to Douglas Murray, the author of the recent (2017) book, The Strange Death of Europe. He argues that once in power, Tony Blair’s government oversaw an opening of the borders on a scale unparalleled even in the post-war decades. His government abolished the ‘primary purpose rule’, which had been used to filter out bogus marriage applications. The borders were opened to anyone deemed essential to the British economy, a definition so broad that it included restaurant workers as ‘skilled labourers’. And as well as opening the door to the rest of the world, they opened the door to the new EU member states after 2004. It was the effects of all of this, and more, that created the picture of the country eventually revealed in the 2011 Census, published at the end of 2012.

004

The numbers of non-EU nationals moving to settle in Britain were expected only to increase from 100,000 a year in 1997 to 170,000 in 2004. In fact, the government’s predictions for the number of new arrivals over the five years 1999-2004 were out by almost a million people. It also failed to anticipate that the UK might be an especially attractive destination for people from countries with significantly lower average income levels or without a minimum wage. For these reasons, the number of Eastern European migrants living in Britain rose from 170,000 in 2004 to 1.24 million in 2013. Whether the surge in migration went unnoticed or was officially approved, successive governments did not attempt to restrict it until after the 2015 election, by which time it was too late.

(to be continued)

Posted January 15, 2019 by TeamBritanniaHu in Affluence, Africa, Arabs, Assimilation, asylum seekers, Belfast, Birmingham, Black Market, Britain, British history, Britons, Bulgaria, Calais, Caribbean, Celtic, Celts, Child Welfare, Cold War, Colonisation, Commonwealth, Communism, Compromise, Conservative Party, decolonisation, democracy, Demography, Discourse Analysis, Domesticity, Economics, Education, Empire, English Language, Europe, European Economic Community, European Union, Factories, History, Home Counties, Humanism, Humanitarianism, Hungary, Immigration, Imperialism, India, Integration, Iraq, Ireland, Journalism, Labour Party, liberal democracy, liberalism, Linguistics, manufacturing, Margaret Thatcher, Midlands, Migration, Militancy, multiculturalism, multilingualism, Music, Mythology, Narrative, National Health Service (NHS), New Labour, Old English, Population, Poverty, privatization, Racism, Refugees, Respectability, Scotland, Socialist, south Wales, terror, terrorism, Thatcherism, Unemployment, United Kingdom, United Nations, Victorian, Wales, Welsh language, xenophobia, Yugoslavia


Years of Transition – Britain, Europe & the World: 1992-1997.   Leave a comment

Epilogue to the Eighties & Prologue to the Nineties:

I can recall the real sense of optimism which resulted from the end of the Cold War, formally ending with President Gorbachev’s announcement of the dissolution of the Soviet Union on Christmas Day 1991. Although never an all-out global war, it had resulted in the deaths of up to forty million people throughout the world, involving more than a hundred and fifty smaller ‘proxy’ conflicts. Moreover, we had lived under a continual sense of doom, that it was only a matter of time until our brief, young lives would be snuffed out by a nuclear apocalypse. Now, politicians and journalists in the West talked of a coming ‘peace dividend’ and the end of the surveillance, spy and secret state in both east and west. The only continuing threat to British security came from the Provisional IRA. They hit Downing Street with a triple mortar attack in February 1991, coming close to killing the new Prime Minister, John Major, and his team of ministers and officials directing the Gulf War.

By the time Margaret Thatcher left office in tears on 28 November 1990, ‘Thatcherism’ was also a spent force, though its influence lingered on until at least the end of the century, and not just among Conservatives. Only a minority even among the ‘party faithful’ had been true believers, and the Tory MPs would have voted her out had her cabinet ministers not beaten them to it. As Andrew Marr has written, history is harshest to a leader just as they fall. She had been such a strident presence for so long that many who had first welcomed her as a gust of fresh air now felt the need for gentler breezes. Those who wanted a quieter, less confrontational leader found one in John Major.

Yet most people, in the end, had done well under her premiership: not just the ‘yuppies’ but also her lower-middle-class critics, who developed their own entrepreneurial sub-cultures rather than depending on traditional sponsorship from arts councils and local authorities. By the early nineties, Britons were on average much wealthier than they had been in the late seventies and enjoyed a wider range of holidays, better food, and a greater variety of television channels and other forms of home entertainment. Nor was everything the Thatcher governments did out of tune with social reality. The sale of council houses corresponded to the long passion of the British to be kings and queens of their own little castles. Sales of state utilities, on the other hand, presupposed a hunger for stakeholdership that was much less deeply rooted in British habits, and the subsequently mixed fortunes of those stocks did nothing to help change those habits. Most misguided of all was the decision to implement the regressive ‘poll tax’. In the end, Thatcher’s 1987-90 government became just the latest in a succession of post-war British governments that had seen their assumptions rebound on them disastrously, a trend which was to continue under John Major. The upper-middle-class ‘Victorian values’ of the grocer’s daughter from Grantham were replaced by the ‘family values’ of the lower-middle-class garden-gnome salesman from Brixton, only for him to be overwhelmed by an avalanche of sexual and financial scandals.

The single most important event of the early nineties in Britain, and possibly globally too, had nothing to do with politics and diplomacy or warfare and terrorism, at least not in the nineties. Tim Berners-Lee, a British scientist, invented the World Wide Web, often conflated with the internet itself. His idea was for a worldwide ‘hypertext’, the computer-aided reading of electronic documents, allowing people to work together remotely and share their knowledge in a ‘web’ of linked documents. His creation would give the internet’s hardware its global voice. An Oxford graduate who had made his first computer with a soldering iron, he moved in 1980 to CERN, the European physics laboratory in Switzerland and the world’s largest scientific research centre. There he wrote his first proposal for the system in 1989, and his hypertext revolution arrived at CERN in December 1990, with the first browser and server. The Web was opened to the wider world the following summer. He chose not to patent his creation, so that it would be free to everyone.

The Election of 1992 – A Curious Confidence Trick?:

002

John Major called an election for April 1992. Under a pugnacious Chris Patten, now party chairman, the Tories targeted Labour’s enthusiasm for high taxes. During the campaign itself, Major found himself returning to his roots in Brixton and mounting a ‘soap-box’, from which he addressed raucous crowds through a megaphone. John Simpson, the BBC correspondent, was given the task of covering Major’s own campaign, and on 15 March he travelled to Sawtry, in the PM’s constituency of Huntingdon, where Major was due to ‘meet the people’. I have written elsewhere about the details of this, and of his soap-box campaign, as reported by Simpson. Although Simpson described the soap-box as ‘a wooden construction of some kind’, Andrew Marr claims it was ‘a plastic container’. Either way, it has gone down in political history, together with the megaphone, as the prop that won him the election. The stark visual contrast with the carefully stage-managed Labour campaign struck a chord with the media, and Major kept up an act his father would have been proud of, playing the underdog to Neil Kinnock’s government-in-waiting. Right at the end, at an eve-of-poll rally in Sheffield, Kinnock’s self-control finally gave way and he began punching the air and crying “We’re all right!” as if he were an American presidential candidate. It was totally ‘cringe-worthy’ TV viewing, alienating if not repulsing swathes of the very middle-England voters he needed to attract.

On 9 April 1992, Major’s Conservatives won fourteen million votes, more than any party in British political history. It was a great personal victory for the ‘new’ Prime Minister, but one which was also based on people’s fears of higher taxes under a Labour government. It was also one of the biggest victories in percentage terms since 1945, though the vagaries of the electoral system gave the Tories a majority of just twenty-one seats in parliament. Neil Kinnock was even more devastated than he had been in 1987, when he had not been expected to defeat Thatcher. The only organ of the entire British press which had called the election correctly was the Spectator. Its editor, Dominic Lawson, headlined the article which John Simpson wrote for him ‘The Curious Confidence of Mr Major’, so that the magazine seemed to suggest that the Conservatives might pull off a surprise win. Simpson himself admitted to not having the slightest idea who would win, though it seemed more likely to him that Labour would. Yet he felt that John Major’s own apparent certainty was worth mentioning. When the results started to become clear on that Friday morning, 10 April, the Spectator stood out favourably on the newsagents’ shelves, surrounded by the late, or rather early, editions of newspapers and magazines which had all been predicting a Labour victory.

003 (2)

The only politician possibly more disappointed than Neil Kinnock, who immediately left front-line politics, was Chris Patten, who had been the real magician behind Major’s remarkable victory. He lost his seat to the Liberal Democrats in marginal Bath and went off to become the final governor of Hong Kong ahead of the long-agreed handover of Britain’s last major colony in 1997. Kinnock, a former long-term opponent of Britain’s membership of the EEC/EC, went off to Brussels to become a European Commissioner. Despite the triumph in the popular vote, never has such a famous victory produced so rotten an outcome for the victors. The smallness of Major’s majority meant that his authority could easily be eaten away in the Commons. As a consequence, he would not go down as a great leader in parliamentary posterity, though he remained popular in the country as a whole for some time, if not with the Thatcherites and Eurosceptic “bastards” in his own party. Even Margaret Thatcher could not have carried through her revolutionary reforms after the 1979 and 1983 elections with the kind of parliamentary arithmetic which was dealt to her successor. In rugby terms, although the opposition’s three-quarters had been foiled by this artful dodger of a full-back, he had been dealt a ‘hospital pass’ by his own side. For the moment, he had control of the slippery ball, but he was soon to be forced back into a series of crushing rucks and mauls among his own twenty-stone forwards.

John Smith – Labour’s Lost Leader & His Legacy:

002 (2)

After Neil Kinnock gave up the Labour leadership following his second electoral defeat in 1992, he was replaced by John Smith (pictured above), a placid, secure, self-confident Scottish lawyer. As Shadow Chancellor, he had been an effective cross-examiner of Nigel Lawson, John Major and Norman Lamont and had he not died of a heart attack in 1994, three years ahead of the next election, most political pundits agreed that, following the tarnishing of the Major administration in the mid-nineties, he would have become Prime Minister at that election. Had he done so, Britain would have had a traditional social democratic government, much like those of continental Europe. He came from a family of herring fishermen on the West Coast of Scotland, the son of a headmaster. Labour-supporting from early youth, bright and self-assured, he got his real political education at Glasgow University, part of a generation of brilliant student debaters from all parties who would go on to dominate Scottish and UK politics including, in due succession, Donald Dewar, Gordon Brown, Alistair Darling and Douglas Alexander. Back in the early sixties, Glasgow University Labour Club was a hotbed not of radicals, but of Gaitskell-supporting moderates. This was a position that Smith never wavered from, as he rose as one of the brightest stars of the Scottish party, and then through government under Wilson and Callaghan as a junior minister dealing with the oil industry and devolution before entering cabinet as President of the Board of Trade, its youngest member at just forty. In opposition, John Smith managed to steer clear of the worst in-fighting, eventually becoming Kinnock’s shadow chancellor. In Thatcher’s England, however, he was spotted as a tax-raising corporatist of the old school. One xenophobic letter he received brusquely informed him:

You’ll not get my BT shares yet, you bald, owl-looking Scottish bastard. Go back to Scotland and let that other twit Kinnock go back to Wales.

Smith came from an old-fashioned Christian egalitarian background which put him naturally out of sympathy with the hedonistic culture of southern England.  Just before he became Labour leader he told a newspaper he believed above all in education, because…

 … it opens the doors of the imagination, breaks down class barriers and frees people. In our family … money was looked down on and education was revered. I am still slightly contemptuous of money.

Smith was never personally close to Kinnock but was scrupulously loyal to him as leader, and he succeeded him by a huge margin in 1992. By then he had already survived a serious cardiac arrest and had taken up hill-walking. Though Smith swiftly advanced the careers of his bright young lieutenants, Tony Blair and Gordon Brown, they soon became disappointed by his view that the Labour Party needed simply to be improved, not radically transformed. In particular, he was reluctant to take on the party hierarchy and the unions over issues of internal democracy, such as the introduction of a one-member-one-vote system for future leadership elections. He was sure that Labour could regain power with a revival of its traditional spirit. At one point, Tony Blair was so dispirited by Smith’s leadership style that he considered leaving politics altogether and going back to practising law. Smith died of a second heart attack on 12 May 1994. After the initial shock and grief subsided, Labour moved rapidly away from his policy of ‘gradualism’ towards ‘Blairite’ transformation. One part of his legacy still remains, however, shaping modern Britain today. As the minister who had struggled to achieve devolution for Scotland in 1978-9, he remained a passionate supporter of the ‘unfinished business’ of re-establishing a Scottish Parliament at Holyrood and setting up a Welsh Assembly. With his friend Donald Dewar he had committed Labour so utterly to the idea in Opposition, despite Kinnock’s original strong anti-nationalist stance, that Blair, no great fan of devolution himself, found that he had to implement Smith’s unwelcome bequest.

Black Wednesday and the Maastricht Treaty:

The crisis that soon engulfed the Major government in the early autumn of 1992 was a complicated economic one. From August 1992 to July 1996 I was mainly resident in Hungary and so, although an economic historian, never really understood the immediate series of events that led to it or the effects that followed. These were still pre-internet days, so I had little access to English-language sources, except via my short-wave radio and intermittent newspapers bought during brief visits to Budapest. I had also spent most of 1990 and half of 1991 in Hungary, so there were longer-term gaps in my understanding of these matters, which I have written about in earlier articles in this series dealing with the end of the Thatcher years. Hungary itself was still using an inconvertible currency throughout the nineties, which only became seriously devalued in 1994-96, and when my income from my UK employers also fell in value, we decided as a family to move back to Britain to seek full-time sterling salaries. The first thing that happened to the Major government was that it lost its entire economic policy in a single day, when the pound fell out of the ERM (European Exchange Rate Mechanism). In his memoirs, John Major described the effect of this event in stark terms:

Black Wednesday – 16 September 1992, the day the pound toppled out of the ERM – was a political and economic calamity. It unleashed havoc in the Conservative Party and it changed the political landscape of Britain.

For Major and his government, the point was that as the German central bank had a deserved reputation for anti-inflationary rigour, having to follow or ‘shadow’ the mark meant that Britain had purchased a respected off-the-shelf policy. Sticking to the mighty mark was a useful signal to the rest of the world that this government, following all the inflationary booms of the seventies and eighties, was serious about controlling inflation. On the continent, however, the point of the ERM was entirely different, intended to lead to a strong new single currency that the countries of central Europe would want to join as members of an enlarged EC/EU. So a policy which Margaret Thatcher had earlier agreed to, in order to bring down British inflation, was now a policy she and her followers abhorred since it drew Britain ever closer towards a European superstate in the ‘Delors Plan’. This was a confused and conflicted state of affairs for most of the Tories, never mind British subjects at home and abroad.

The catalyst for sterling’s fall was a fall in the value of the dollar, which pulled the pound down with it. Worse still, money flowed into the Deutschmark, which duly rose; so the British government raised interest rates to an eye-watering ten per cent in order to lift the pound. When this failed to work, the next obvious step would have been for the German central bank to cut its interest rates, lowering the value of the mark and keeping the ERM formation intact. This would have helped the Italian lira and other weak currencies as well as the pound. But since Germany had just reunited after the ‘fall of the wall’, the cost of bringing the poorer East Germans into line with their richer compatriots in the West led to a real fear of renewed inflation, as well as to memories of the Berlin Crisis of 1948-49 and the hyperinflation of the Weimar period. So the Germans, regardless of the pain being experienced by Britain, Italy and the rest, wanted to keep their high-value mark and their high interest rates. Major put considerable and concerted pressure on Chancellor Kohl, warning of the danger of the Maastricht Treaty failing completely, since the Danes had just rejected it in a referendum and the French were also holding a plebiscite. None of this had any effect on Kohl who, like a previous German Chancellor, would not move.


In public, the British government stuck to the line that the pound would stay in the ERM at all costs. The ERM was not simply a European ‘joint-venture’ mechanism but had been part of the anti-inflation policy of both the Lawson and Major chancellorships. As Chancellor, Major had told the unemployed workers and repossessed homeowners of Britain that ‘if it isn’t hurting, it isn’t working’, so his credibility had been tied to the success of the ERM ever since. Placing Britain ‘at the heart of Europe’ had also been his foreign policy, first as Foreign Secretary and now as Prime Minister: it was his big idea for both economic and diplomatic survival in an increasingly ‘globalised’ environment. Norman Lamont, who as Chancellor was as committed as Major, told ‘the markets’ that Britain would neither leave the mechanism nor deviate from it by devaluing the pound: ERM membership was at the centre of policy, and there should not be one scintilla of doubt that it would continue. Major went even further, telling a Scottish audience that, with inflation down to 3.7 per cent and falling, it would be madness to leave the ERM. He added that:

“The soft option, the devaluer’s option, the inflationary option, would be a betrayal of our future.”

However, then the crisis deepened with the lira crashing out of the ERM formation. International money traders, such as the Hungarian-born György Soros, began to turn their attention to the weak pound and carried on selling. They were betting that Major and Lamont would not keep interest rates so high that the pound could remain up there with the mark – an easy, one-way bet. In the real world, British interest rates were already painfully high. On the morning of ‘Black Wednesday’, at 11 a.m., the Bank of England raised them by another two points. This was to be agonising for home-owners and businesses alike, but Lamont said he would take whatever measures were necessary to keep the pound in the mechanism. Panic mounted and the selling continued: a shaken Lamont rushed round to tell Major that the interest rate hike had not worked, but Major and his key ministers decided to stay in the ERM. The Bank of England announced that interest would go up by a further three points, to fifteen per cent. Had it been sustained, this would have caused multiple bankruptcies across the country, but the third rise made no difference either. Eventually, at 4 p.m., Major phoned the Queen to tell her that he was recalling Parliament. At 7.30 p.m., Lamont left the Treasury to announce to the press and media in Whitehall that he was suspending sterling’s membership of the ERM and was reversing the day’s rise in interest rates.

Major considered resigning. It was the most humiliating moment in British politics since the IMF crisis of 1976, sixteen years earlier. But had he done so, Lamont would have had to go as well, leaving the country without its two most senior ministers in the midst of a terrible crisis. Major decided to stay on, though he was forever diminished by what had happened. Lamont also stayed at his post and was delighted as the economy began to respond to lower interest rates and a slow recovery began. While others suffered further unemployment, repossession and bankruptcy, he was forever spotting the ‘green shoots’ of recovery. In the following months, Lamont created a new unified budget system and took tough decisions to repair the public finances. But as the country wearied of recession, he became an increasingly easy ‘butt’ of media derision. To Lamont’s complete surprise, Major sacked him as Chancellor some eight months after Black Wednesday. Lamont retaliated in a Commons statement in which he said that the government gave ‘the impression of being in office, but not in power’. Major appointed Kenneth Clarke, one of the great characters of modern Conservatism, to replace him.

In the Commons, the struggle to ratify the Maastricht Treaty, hailed as a great success for Major before the election, became a long and bloody one. Major’s small majority was more than wiped out by the number of ‘Maastricht rebels’, egged on by Lady Thatcher and Lord Tebbit. Black Wednesday had emboldened those who saw the ERM and every aspect of European federalism as disastrous for Britain. Major himself wrote in his memoirs that it turned …

… a quarter of a century of unease into a flat rejection of any wider involvement in Europe … emotional rivers burst their banks.

Most of the newspapers which had welcomed Maastricht were now just as vehemently against it. The most powerful Conservative voices in the media were hostile both to the treaty and to Major. His often leaden use of English and lack of ‘panache’ led many of England’s snobbish ‘High Tories’ to brand him shockingly ill-educated and third-rate as a national leader. A constantly shifting group of between forty and sixty Tory MPs regularly worked with the Labour opposition to defeat key parts of the Maastricht bill, so that Major’s day-to-day survival was always in doubt. Whenever he called a vote of confidence and threatened his rebellious MPs with an election, he won. Whenever John Smith’s Labour Party and the Tory rebels could find some common cause, however thin, he was in danger of losing. In the end, Major got his legislation and Britain ratified the Maastricht Treaty, but it came at an appalling personal and political cost. Talking in the general direction of an eavesdropping microphone, he spoke of three anti-European ‘bastards’ in his own cabinet, an obvious reference to Michael Portillo, Peter Lilley and John Redwood. The country watched a divided party tearing itself apart and was not impressed.

By the autumn of 1993, Norman Lamont was speaking openly about the possibility that Britain might have to leave the European Union altogether, and there were moves to force a national referendum. The next row was over the voting system to be used when the EU expanded. Forced to choose between a deal which weakened Britain’s hand and stopping the enlargement from happening at all by vetoing it, Foreign Secretary Douglas Hurd went for a compromise. All hell broke loose, as Tory MPs began talking of a leadership challenge to Major. This subsided, but battle broke out again over the European budget and fisheries policy. Eight MPs had the Conservative whip withdrawn. By this point, John Smith’s sudden death had brought Tony Blair to the fore as leader of the Opposition. When Major readmitted the Tory rebels, Blair jibed: ‘I lead my party, you follow yours.’ Unlike Lamont’s remark in the Commons, Blair’s comment struck a chord with the country.

The concluding chapter of the Thatcher Revolution:

While the central story of British politics in the seven years between the fall of Thatcher and Blair’s arrival in power was taken up by Europe, on the ‘home front’ the government tried to maintain the momentum of the Thatcher revolution. After many years of dithering, British Rail was divided up and privatised, as was the remaining coal industry. After the 1992 election, it was decided that over half the remaining coal-mining jobs must go, in a closure programme of thirty-one pits to prepare the industry for privatisation. This angered many Tory MPs who felt that the strike-breaking effect of the Nottinghamshire-based Union of Democratic Mineworkers in the previous decade deserved a better reward, and it aroused public protest as far afield as Cheltenham. Nevertheless, with power companies moving towards gas and oil, and the industrial muscle of the miners long since broken, the closures and sales went ahead within the next two years, 1992-4. The economic effect on local communities was devastating, as the 1996 film Brassed Off shows vividly, with its memorable depiction of the social impact of the 1992 closure programme on the Yorkshire village of Grimethorpe and its famous brass band. Effectively, the only coalfields left working after this were those of North Warwickshire and South Derbyshire.

Interfering in the railway system became, and remained, a favourite ‘boys with toys’ hobby, and a dangerous obsession, of governments of different colours. Margaret Thatcher, not being a boy, knew that the railways were much too much a part of the working life of millions to be lightly broken up or sold off. When Nicholas Ridley, as Transport Secretary, suggested this, Thatcher is said to have replied:

“Railway privatisation will be the Waterloo of this government. Please never mention the railways to me again.”

It was taken up again enthusiastically by John Major. British Rail had become a national joke: loss-making, accident-prone, with elderly tracks and rolling stock, and serving curled-up sandwiches. But the challenge of selling off a system on which millions of people depended was obvious. Making it profitable would mean significant and unpopular fare rises and cuts in services. Moreover, different train companies could hardly compete with each other directly, racing up and down the same rails. There was, therefore, a binary choice: either ‘BR’ could be cut up geographically, with both trains and track sold off for each region so that the system would look much the way it had in the thirties, or the railway could be split ‘vertically’, so that the State would continue to own the track while the stations and the trains would be owned by private companies. The latter solution was the one chosen by the government, and a vast, complicated new system of subsidies, contracts, bids, pricing, cross-ticketing and regulation was created; but rather than keeping the track under public control, it too was to be sold off, to a single private monopoly to be called Railtrack. Getting across the country would become a complicated proposition and transaction, involving two or three separate rail companies. A Franchise Director was to be given powers over the profits, timetables and ticket-pricing of the new companies, and a Rail Regulator would oversee the track. Both would report directly to the Secretary of State, so that any public dissatisfaction, commercial problem or safety issue would ultimately be the responsibility of the government. This was a strange and pointless form of privatisation which ended up costing the taxpayer far more than British Rail had. The journalist Simon Jenkins concluded:

The Treasury’s treatment of the railway in the 1990s was probably the worst instance of Whitehall industrial management since the Second World War.


One success story in the rail network was the completion of the Channel Tunnel link to France in 1994 (the Folkestone terminal is pictured above), providing a good example of the inter-relationship between transport links and general economic development. The Kent town of Ashford had a relationship with the railways going back to 1842, yet the closure of the town’s railway works between 1981 and 1993 did not undermine the local economy. Instead, Ashford benefited from the Channel Tunnel rail link, which made use of railway lines running through the town, and its population actually grew by ten per cent in the 1990s. The completion of the ‘Chunnel’ gave the town an international catchment area of eighty-five million people within a single day’s journey. The opening of the Ashford International railway station, the main terminal for the rail link to Europe, attracted a range of engineering, financial, distribution and manufacturing companies. In addition to the fourteen business parks that were opened in and around the town itself, four greenfield sites were opened on the outskirts, including a science park owned by Trinity College, Cambridge. As the map above shows, Ashford is now closer to Paris and Brussels in travelling time than it is to Manchester and Liverpool. By the end of the century, the town, with its position at the hub of a huge motorway network as well as its international rail link, was ready to become part of a truly international economy.


Many of the improvements in transport infrastructure across Britain and Ireland were the result of EU funding, especially in Northern Ireland, and it was also having an impact on transport planning in Britain, with projects in the Highlands and Islands. In 1993 the EU decided to create a Europe-wide transport network. Of the fourteen priority projects associated with this aim, three were based in Britain and Ireland: a rail link from Cork to Northern Ireland and the ferry route to Scotland; a road link from the Low Countries across England and Wales to Ireland; and the West Coast rail route in Britain.

As a Brixton man, Major had experienced unemployment and was well prepared to take on the arrogant and inefficient quality of much so-called public service. But under the iron grip of the Treasury, there was little prospect of a revival of local democracy to take charge of local services again. This left as the only option a highly bureaucratic centralism, which had gained momentum in the Thatcher years. Under Major, the centralised Funding Agency for Schools was formed, and schools in England and Wales were ranked by crude league tables depending on how well their pupils did in exams. The university system was vastly expanded simply by allowing colleges and polytechnics to rename themselves as universities. The hospital system was further centralised and given a host of new targets. The police, faced with a review of their pay and demands by the Home Secretary, Kenneth Clarke, for their forces to be amalgamated, were given their own performance league tables. The Tories had spent seventy-four per cent more, in real terms, on law and order since 1979, yet crime was at an all-time high. Clarke’s contempt for many of the forces as ‘vested interests’ was not calculated to win them round to reform. Across England and Wales, elected councillors were turfed off police boards and replaced by businessmen. In 1993 Clarke, the old Tory dog who had clearly learned new tricks during his time at the Department of Health, where he was said to have castrated the regional health authority chairmen, defended his new police league tables in the ‘newspeak’ of governments yet to come:

The new accountability that we seek from our public services will not be achieved simply because men of good will and reasonableness wish that it be so. The new accountability is the new radicalism.

Across Britain, from the auditing of local government to the running of courts and the working hours of nurses, an army of civil servants, accountants, auditors and inspectors marched into workplaces. From time to time, ministers would weakly blame Brussels for the imposition of this cult of central control and measurement. But it was mostly a home-grown ‘superstate’. Major called this centralising policy the ‘Citizen’s Charter’, ignoring the fact that Britons are ‘subjects’ rather than citizens; he himself did not much like the headline because of its unconscious echoes of Revolutionary France. Every part of the government dealing with public service was ordered to come up with proposals for improvement at ‘grass-roots level’, to be pursued from the centre by questionnaires, league tables and a system of awards, called ‘Charter Marks’, for organisations that achieved the required standards. He spoke of ‘empowering’, ‘helping the customer’ and ‘devolving’, and thought that regulation from the centre would not last long, rather like a Marxist-Leninist anticipating the ‘withering away’ of the state; in his case, though, it would come about as the effects of growing competition were felt. In practice, of course, the regulators grew more powerful, not less so. Despite the rhetoric, public servants were not being given real freedom to manage, and elected office-holders were being sacked. Major’s ‘withering away’ of the state was no more successful than Lenin’s.

Britain and Ireland – first steps on the road to peace:

Above: US President Bill Clinton addressing a peace rally in Belfast during his visit in 1995. Clinton played a significant role as a ‘peace broker’ in the negotiations leading up to the Good Friday Agreement.

In December 1993, John Major stood outside the steel-armoured door of Number Ten Downing Street with the ‘Taoiseach’ of the Irish Republic, Albert Reynolds. He declared a new principle which offended many traditional Conservatives and Unionists. If both parts of Ireland voted to be reunited, Britain would not stand in the way. She had, said Major, no selfish strategic or economic interest in Northern Ireland. He also stated that if the Provisional IRA, which had lately bombed the very building Major was standing in front of and murdered two young boys in Cheshire, renounced violence, Sinn Fein could be recognised as a legitimate political party. In the run-up to this Downing Street Declaration, which some saw as a betrayal of the Tory Party’s long-held dedication to the Union of Great Britain and Northern Ireland, the government had been conducting ‘back channel’ negotiations with the terrorist organisation. In August 1994 the IRA finally declared a complete cessation of military operations which, though it stopped a long way short of renouncing the use of violence altogether, was widely welcomed and was followed a month later by a Loyalist ceasefire. A complicated choreography of three-strand talks, framework documents and discussions about the decommissioning of weapons followed, while on the streets, extortion, knee-capping and occasional ‘executions’ continued. But whereas the number of those killed in sectarian violence and bombings in 1993 had been eighty-four, the toll fell to sixty-one the following year, and in 1995 it was in single figures, at just nine deaths.

Long negotiations between London and Dublin led to cross-border arrangements. These negotiations had also involved the United States, where an influential pro-Irish lobby had helped to sustain the IRA campaign into the nineties through finance provided via ‘Noraid’. In the mid-nineties, President Clinton acted as a peace broker, visiting Belfast in 1995 and helping to maintain the fragile ceasefire in the North. The contradictory demands of Irish Republicanism and Ulster Unionism meant that Major failed to get a final agreement, which was left to Tony Blair, with the ongoing help of the American ex-senator George Mitchell. The fact that in 1992 both countries had signed the Maastricht Treaty for closer political and economic unity in Europe set a broader context for a bilateral agreement. However, while Irish political leaders eagerly embraced the idea of European integration, their British counterparts, as we have seen, remained deeply divided over it.

Economic decline/ growth & political resuscitation:


The closure of the Swan Hunter shipyard on the Tyne in May 1993 is an illuminating example of the impact of de-industrialisation. Swan Hunter was the last working shipyard in the region but had failed to secure a warship contract. An old, established firm, it was suffering the same long-term decline that cut shipbuilding employment nationally to 26,000 by the end of the century. The closure devastated the local economy, especially as a bitter legal wrangle over redundancy payments left many former workers with no compensation whatever for the loss of what they had believed was employment for life. But the effects of de-industrialisation could spread much further than local communities: as the map above shows, the failure of the firm also had a ‘knock-on’ effect as suppliers as far afield as London and Glasgow lost valuable orders and, as a result, jobs.


By 1994, employment in manufacturing in Britain had fallen to four million from the nine million it had reached at its peak in 1966. The resulting mass unemployment hurt the older industries of the Northwest worst, but the losses were proportionately as high in the Southeast, reflecting the decline in newer manufacturing industry. Across most of Britain and Ireland, there was also a decline in the number of manufacturing jobs continuing into and throughout the 1990s. The service sector, however, expanded, and general levels of unemployment, especially in Britain, fell dramatically in the nineties. Financial services showed strong growth, particularly in such places as London’s Docklands, with its new ‘light railway’, and Edinburgh. By the late nineties, the financial industry was the largest employer in northern manufacturing towns and cities like Leeds, which grew rapidly throughout the decade, aided by its ability to offer a range of cultural facilities that helped to attract an array of UK company headquarters. Manchester, similarly, enjoyed a renaissance, particularly in the spheres of music, the media and sport.

In July 1995, tormented by yet more rumours of right-wing conspiracies against him, Major riposted with a theatrical gesture of his own, resigning as leader of the Conservative Party and inviting all-comers to take him on. He told journalists gathered in the Number Ten garden that it was “put up or shut up time”. If he lost he would resign as Prime Minister. If he won, he would expect the party to rally around him. This was a gamble, since other potential leaders were available, not least Michael Heseltine, who had become Deputy Prime Minister, and Michael Portillo, then the pin-up boy of the Thatcherites, whose supporters prepared a campaign headquarters for him, only for him to then decide against standing. In the event, the challenger was John Redwood, the Secretary of State for Wales and a highly intelligent anti-EU right-winger. Major won his fight, though 109 Tory MPs refused to back him.

Fighting the return of Fascism in Europe:

Major was also having to deal with the inter-ethnic wars breaking out in the former Yugoslavia, following the recognition of Slovenia, Croatia and Bosnia as independent states in the early nineties. The worst violence occurred during the Serbian assault on Bosnia (I have written about the bloody 1992-94 Siege of Sarajevo, its capital, in an article elsewhere on this site based on John Simpson’s reporting). The term ‘ethnic cleansing’ was used for the first time as woeful columns of refugees fled in different directions. A nightmare which Europeans thought was over in 1945 was returning, only a couple of days’ drive away from London and half a day’s drive from where I was living on the southern borders of Hungary with Serbia and Croatia.

Six years after the siege, during a school visit to The Hague, I sat in the courtroom of the International War Crimes Tribunal for the former Yugoslavia and listened, in horror, to the testimonies of those who had been imprisoned and tortured in concentration camps during the Bosnian War. I could not believe that what I was hearing had happened in the final decade of the twentieth century in Europe. Those on trial at that time were the prison camp guards who had carried out the atrocities, claiming what had become known as the Nuremberg Defence. Later on, those giving the orders, Ratko Mladic and Radovan Karadzic (pictured below with John Simpson in 1993), the military and political leaders of the Bosnian Serbs, went on trial in the same courtroom, were convicted of war crimes and duly locked away; the former Serbian President, Slobodan Milosevic, was also tried there, though he died in custody before a verdict could be reached. Major had asked how many troops it would take to keep the three warring sides apart and was told the number was four hundred thousand, three times the total size of the British Army at that time. He sent 1,800 men to protect the humanitarian convoys that were rumbling south from the UN bases in Hungary.


Although many British people sent food parcels, warm clothes, medicine and blankets, loaded onto trucks and driven across the Croatian border and into Bosnia, many in the government were reluctant for Britain to become further involved. But the evening news bulletins showed pictures of starving refugees, the uncovered mass graves of civilians shot dead by death squads, and children with appalling injuries. There was a frenzied campaign for Western intervention, but President Clinton was determined not to risk the lives of American soldiers on the ground. Instead, he considered less costly alternatives, such as air strikes. This would have put others who were on the ground, including the British and other nationalities involved in the UN operation, directly into the line of retaliatory fire of the Serbian troops. When the NATO air-strikes began, the Serbs took the UN troops hostage, including British soldiers, who were then used as human shields. When the Serbs captured the town of Srebrenica and carried out a mass slaughter of its Muslim citizens, there were renewed calls for ‘boots on the ground’, but they never came.

Following three years of fighting, sanctions on Serbia and the success of the Croat Army in fighting back, a peace agreement was finally made in Dayton, Ohio. The UN convoys and troops left Hungary. Major became the first British Prime Minister of the post-war world to grapple with the question of what the proper role of the West should be in ‘regional’ conflicts such as the Balkan wars. They showed quite clearly both the dangers and the limitations of intervention. When a civil conflict is relayed in all its horror to tens of millions of voters every night by television, the pressure to ‘do something’ is intense. But mostly this requires not air strikes but a full-scale ground force, which will then be drawn into the war itself, and which must then be followed by years of neo-colonial aid and rebuilding. Major and his colleagues were accused of moral cowardice and cynicism in allowing the revival of fascist behaviour in one corner of Europe. Yet, especially with the benefit of hindsight about what happened subsequently in Iraq and Afghanistan, perhaps Western leaders were right to be wary of full-scale intervention.

Back to basics?

For many British voters, the Major years were associated with the sad, petty and lurid personal scandals that attended so many of his ministers, after he made an unwise speech calling for a return to old-style morality. In fact, ‘back to basics’ referred to almost everything except personal sexual morality: he spoke of public service, industry, sound money, free trade, traditional teaching, respect for the family and the law, and the defeat of crime. It gave the press, however, a fail-safe headline charge of hypocrisy whenever ministers were caught out. A series of infidelities were exposed: children born out of wedlock, a death from a sex stunt which went wrong, rumours about Major’s own affairs (which later turned out to be truer than realised at the time). More seriously, there was also an inquiry into whether Parliament had been misled over the sale of arms to Iraq. These were all knitted together into a single pattern of misbehaviour, referred to as ‘sleaze’.

In 1996, a three-year inquiry into whether the government had allowed a trial to go ahead against directors of an arms company, Matrix Churchill, knowing that they were, in fact, acting inside privately accepted guidelines, resulted in two ministers being publicly criticised. It showed that the government had allowed a more relaxed régime of military-related exports to Saddam Hussein even after the horrific gassing of five thousand Kurds at Halabja, also revealing a culture of secrecy and double standards in the process. Neil Hamilton MP was accused of accepting cash from Mohammed al-Fayed, the owner of Harrods, for asking questions in the Commons. One of the most dramatic episodes of the 1997 election was the overwhelming defeat he suffered in his Tatton constituency at the hands of the former BBC war reporter Martin Bell, who had been badly injured in Sarajevo and who became Britain’s first independent MP for nearly fifty years. Jonathan Aitken, a Treasury minister, was accused of accepting improper hospitality from an Arab business contact. He resigned to fight the Guardian over the claims with ‘the simple sword of truth and the trusty shield of fair play’, but was found guilty of perjury and sentenced to eighteen months in prison.


By the end of Major’s government, it seemed that the Tories might have learned the lesson that disagreements over the EU were capable of splitting their party. However, there was a general mood of contempt for politicians, and the press in particular had lost any sense of deference. The reforms of the health service, the police and the schools had produced few significant improvements. The post-Cold War world was turning out to be nastier and less predictable than the early-nineties days of the ‘peace dividend’ had promised. The Labour Opposition would, in due course, consider how the country might be better governed and reformed, as well as what the right British approach would be to peace-keeping and intervention now that the United States was the last superpower left standing. But in the early months of 1997, Tony Blair and his fresh young ‘New Labour’ team, including Alastair Campbell, were oiling their effective election-winning machine and moving in to roll over a tired-looking John Major and his tarnished old Tories.

Sources:

Andrew Marr (2008), A History of Modern Britain. Basingstoke: Pan-Macmillan.

Simon Schama (2018), A History of Britain, Volume Three: The Fate of Empire, 1776-2000. London: BBC Worldwide.

John Simpson (1999), Strange Places, Questionable People. Basingstoke: Pan-Macmillan.

Peter Catterall, Roger Middleton & John Swift (2001), The Penguin Atlas of British & Irish History. London: Penguin Books.

Posted October 17, 2018 by TeamBritanniaHu in Apocalypse, Arabs, Balkan Crises, Britain, British history, Britons, Brussels, Christian Faith, Christian Socialism, Christianity, Church, Coalfields, Cold War, devolution, Egalitarianism, Ethnic cleansing, Europe, European Economic Community, European Union, Family, France, Genocide, German Reunification, Germany, Gorbachev, Humanism, Hungary, Immigration, Ireland, Irish history & folklore, Italy, Journalism, Labour Party, manufacturing, Margaret Thatcher, Marxism, morality, National Health Service (NHS), Refugees, Revolution, Scotland, Security, terrorism, Thatcherism, Unemployment, Wales