Archive for the ‘Gordon Brown’ Tag

You Only Live Twice – Cool Britannia to Cold Brexit: The United Kingdom, 1999-2019. Part Two: Identity, Immigration & Islam.

 


British Identity at the Beginning of the New Millennium:

As Simon Schama pointed out in 2002, it was a fact that even though only half of the British-Caribbean population and a third of the British-Asian population were born in Britain, they continued to constitute only a small proportion of the total population. It was also true that any honest reckoning of the post-imperial account needed to take account of the appeal of separatist fundamentalism in Muslim communities. At the end of the last century, an opinion poll found that fifty per cent of British-born Caribbean men and twenty per cent of British-born Asian men had, or once had, white partners. In 2000, Yasmin Alibhai-Brown found that, when polled, eighty-eight per cent of white Britons between the ages of eighteen and thirty had no objection to inter-racial marriage; eighty-four per cent of West Indians and East Asians and fifty per cent of those from Indian, Pakistani or Bangladeshi backgrounds felt the same way. Schama commented:

The colouring of Britain exposes the disintegrationalist argument for the pallid, defensive thing that it is. British history has not just been some sort of brutal mistake or conspiracy that has meant the steamrollering of Englishness over subject nations. It has been the shaking loose of peoples from their roots. A Jewish intellectual expressing impatience with the harping on ‘roots’ once told me that “trees have roots; Jews have legs”. The same could be said of Britons who have shared the fate of empire, whether in Bombay or Bolton, who have encountered each other in streets, front rooms, kitchens and bedrooms.


Britain, the European Union, NATO & the Commonwealth, 2000

Until the Summer of 2001, this ‘integrationist’ view of British history and contemporary society was the broadly accepted orthodoxy among intellectuals and politicians, if not more popularly. At that point, however, partly as a result of riots in the north of England involving ethnic minorities, including young Muslim men, and partly because of events in New York and Washington, the existence of parallel communities began to be discussed more widely, and the concept of ‘multiculturalism’ came under fundamental criticism from both the right and the left of the political spectrum. In the ‘noughties’, dissenters from the multicultural consensus could be found everywhere along the political continuum. Already in the eighties and nineties, there had been critics who warned that the emphasis on mutual tolerance and equality between cultures ran the risk of encouraging separate development, rather than fostering a deeper sense of mutual understanding through interaction and integration between cultures. The ‘live and let live’ outlook which dominated ‘race relations’ quangos in the 1960s and ’70s had already begun to be replaced by a more active interculturalism, particularly in communities where that outlook had proven ineffective in countering the internecine conflicts of the 1980s. Good examples of this development can be found in the ‘Education for Mutual Understanding’ and ‘Inter-Cultural’ educational projects in Northern Ireland and the North and West Midlands of England in which this author was involved and has written about elsewhere on this site.

Politicians also began to break with the multicultural consensus, and their views began to have an impact because while commentators on the right were expected to have ‘nativist’ if not ‘racist’ tendencies in the ‘Powellite’ tradition, those from the left could generally be seen as having less easily assailable motives.

Trevor Phillips (pictured left), whom I had known as the first black President of the National Union of Students in 1979, became the Chair of the Commission for Racial Equality in 2003 and opened up territory in discussion and debate that others had not dared to ‘trespass’ into. His realisation that the race-relations ‘industry’ was part of the problem, and that, partly as a result of talking up diversity, the country was ‘sleepwalking to segregation’, was an insight that others began to share.

Simon Schama also argued that Britain should not have to choose between its own multi-cultural, global identity and its place in Europe. Interestingly, he put the blame for this pressure at least partly on the EU bureaucracy in Brussels, suggesting that…

… the increasing compulsion to make the choice that General de Gaulle imposed on us between our European and our extra-European identity seems to order an impoverishment of our culture. It is precisely the roving, unstable, complicated, migratory character of our history that ought to be seen as a gift for Europe. It is a past, after all, that uniquely in European history combines a passion for social justice with a tenacious attachment to bloody-minded liberty, a past designed to subvert, not reinforce, the streamlined authority of global bureaucracies and corporations. Our place at the European table ought to make room for that peculiarity or we should not bother showing up for dinner. What, after all, is the alternative? To surrender that ungainly, eccentric thing, British history, with all its warts and disfigurements, to the economic beauty parlour that is Brussels will mean a loss. But properly smartened up, we will of course be fully entitled to the gold-card benefits of the inward-looking club… Nor should Britain rush towards a re-branded future that presupposes the shame-faced repudiation of the past. For our history is not the captivity of our future; it is, in fact, the condition of our maturity.


‘Globalisation’

Fourteen years later, this was exactly the choice facing the British people, though now it was not De Gaulle or even the Brussels ‘Eurocrats’ who were asking the question, but the British Prime Minister, David Cameron, and the ‘Brexiteer’ Conservatives in his cabinet and on the back benches. The people themselves had not asked to be asked, but when they answered at the 2016 Referendum, they decided, by a very narrow majority, that they preferred the vision (some would say ‘unicorn’) of a ‘global’ Britain to the ‘gold-card benefits’ available at the European table it was already sitting at. Their ‘tenacious attachment’ to ‘bloody-minded liberty’ led them to express their desire to detach themselves from the European Union, though it is still not clear whether they want to remain semi-detached or move to a detached property at the very end of the street which has not yet been planned, let alone built. All we have is a glossy prospectus which may or may not be delivered or even deliverable.

An internet poster from the 2016 Referendum Campaign


Looking back to 2002, the same year in which Simon Schama published the book of his BBC series, The Fate of Empire, the latest census for England and Wales was also published. Enumerated and compiled the previous year, it showed the extent to which the countries had changed in the decade since the last census was taken. Douglas Murray, in the first chapter of his recent book, The Strange Death of Europe, first published in 2017, challenges us to imagine ourselves back in 2002, speculating about what England and Wales might look like in the 2011 Census. Imagine, he asks us, that someone in our company had projected:

“White Britons will become a minority in their own capital city by the end of this decade and the Muslim population will double in the next ten years.”

How would we have reacted in 2002? Would we have used words like ‘alarmist’, ‘scaremongering’, ‘racist’ or ‘Islamophobic’? In 2002, a Times journalist made far less startling statements about likely future immigration, which were denounced by David Blunkett, then Home Secretary (using parliamentary privilege), as bordering on fascism. Yet, however much abuse they received for saying or writing it, anyone offering this analysis would have been proved absolutely right at the end of 2012, when the results of the 2011 Census were published. They showed that only 44.9 per cent of London residents identified themselves as ‘white British’. They also revealed far more significant changes: the number of people living in England and Wales who had been born ‘overseas’ had risen by nearly three million since 2001, and nearly three million people in England and Wales were living in households where not one adult spoke English or Welsh as their main language.


These were very major ethnic and linguistic changes, but there were equally striking findings about changing religious beliefs. The Census statistics showed that adherence to every faith except Christianity was on the rise. Since the previous census, the proportion of people identifying themselves as Christian had declined from seventy-two per cent to fifty-nine per cent. The number of Christians in England and Wales dropped by more than four million, from thirty-seven million to thirty-three million. While the Churches witnessed this collapse in their members and attendees, mass migration assisted a near doubling in the number of Muslim worshippers. Between 2001 and 2011 the number of Muslims in England and Wales rose from 1.5 million to 2.7 million. While these were the official figures, it is possible that they were an underestimate, because many newly-arrived immigrants might not have filled in the forms at the beginning of April 2011, when the Census was taken, not yet having a registered permanent residence. The two local authorities whose populations were growing fastest in England, by twenty per cent in the previous ten years, were Tower Hamlets and Newham in London, and these were also among the areas with the largest non-response to the census, with around one in five households failing to return the forms.


Yet the results of the census clearly revealed that mass migration was in the process of altering England completely. In twenty-three of London’s thirty-three boroughs (see map above) ‘white Britons’ were now in a minority. A spokesman for the Office for National Statistics regarded this as demonstrating ‘diversity’, which it certainly did, but by no means all commentators regarded this as something positive or even neutral. When politicians of all the main parties addressed the census results, they greeted them in positive terms. This had been the ‘orthodox’ political view since 2007, when the then Mayor of London, Ken Livingstone, had spoken with pride about the fact that thirty-five per cent of the people working in London had been born in a foreign country. For years a sense of excitement and optimism about these changes in London and the wider country seemed the only appropriate tone to strike. This was bolstered by the sense that what had happened in the first decade of the twenty-first century was simply a continuation of what had worked well for Britain in the previous three decades. This soon turned out to be a politically-correct pretence, though what was new in this decade was not so much growth in immigration from Commonwealth countries and the Middle East, or from the war-torn former Yugoslavia, but the impact of white European migrants from the new EU countries, under the terms of the accession treaties and the ‘freedom of movement’ regulations of the single market. As I noted in the previous article, the British government could have delayed the implementation of these provisions but chose not to.

Questions about the Quality & Quantity of Migration:


Besides the linguistic and cultural factors already dealt with, there were important economic differences between the earlier and the more recent migrations of Eastern Europeans. After 2004, young, educated Polish, Czech and Hungarian people had moved to Britain to earn money to send home or to take home with them in order to acquire good homes, marry and have children in their rapidly developing countries. And for Britain, as the host country, the economic growth of the 2000s was fuelled by the influx of energetic and talented people who, in the process, were also denying their own countries their skills for a period. But the UK government had seriously underestimated the number of these workers who wanted to come to Britain. Ministers suggested that the number arriving would be around 26,000 over the first two years. This turned out to be wildly wrong, and in 2006 a Home Office minister was forced to admit that since EU expansion in 2004, 427,000 people from Poland and seven other new EU nations had applied to work in Britain. If the self-employed were included, he added, then the number might be as high as 600,000. There were also at least an additional 36,000 spouses and children who had arrived, and 27,000 child benefit applications had been received. These were very large numbers indeed, even if most of these turned out to be temporary migrants.

It has to be remembered, of course, that inward migration was partially offset by the outflow of around sixty thousand British people each year, mainly permanent emigrants to Australia, the United States, France and Spain. By the winter of 2006-07, one policy institute reckoned that there were 5.5 million British people living permanently overseas, nearly ten per cent of Britons, or more than the population of Scotland. In addition, another half a million were living abroad for a significant part of the year. Aside from Europe, the Middle East and Asia were seeing rising ‘colonies’ of expatriate British. A worrying proportion of them were graduates; Britain was believed to be losing one in six of its graduates to emigration. Many others were retired or better-off people looking for a life in the sun, just as many of the newcomers to Britain were young, ambitious and keen to work. Government ministers tended to emphasise these benign effects of immigration, but their critics looked around and asked where all the extra people would go, where they would live, and where their children would go to school, not to mention where the extra hospital beds, road space and local services would come from, and how these would be paid for.

Members of the campaign group Citizens UK hold a ‘refugees welcome’ event outside Lunar House in Croydon. Photograph: John Stillwell/PA

A secondary issue to that of ‘numbers’ was the system for asylum seekers. In 2000, there were thirty thousand failed asylum seekers in the United Kingdom, a third of those who had applied in 1999, when only 7,645 had been removed from the country. It was decided that it was impossible to remove more, and that to try to do so would prove divisive politically and financially costly. Added to this was the extent of illegal immigration, which had caught the ‘eye’ of the British public. There were already criminal gangs of Albanians and Kosovars, operating from outside the EU, who in the eyes of many were undermining the legal migration streams from Central-Eastern Europe. The social service bill for these ‘illegal’ migrants became a serious burden for the Department of Social Security. Towns like Slough protested to the national government about the extra cost in housing, education and other services.

In addition, there was the sheer scale of the migration and the inability of the Home Office’s immigration and nationality department to regulate what was happening: to prevent illegal migrants from entering Britain, to spot those abusing the asylum system in order to settle in Britain, and to apprehend and deport people who had no right to remain. Large articulated lorries filled with migrants, who had paid over their life savings to be taken to Britain, rumbled through the Channel Tunnel and the ferry ports. A Red Cross camp at Sangatte, near the French entrance to the ‘Chunnel’ (the photo below shows the Folkestone entrance), was blamed by Britain for exacerbating the problem. By the end of 2002, an estimated 67,000 had passed through the camp to Britain. The then Home Secretary, David Blunkett, finally agreed on a deal with the French to close the camp down, but by then many African, Asian and Balkan migrants, believing the British immigration and benefits systems to be easier than those of other EU countries, had simply moved across the continent and waited patiently for their chance to board a lorry to Britain.


Successive Home Secretaries from Blunkett to Reid tried to deal with the trade, the latter confessing that his department was “not fit for purpose”. He promised to clear a backlog of 280,000 failed asylum claims whose applicants were still in the country after five years. The historic Home Office was split up, creating a separate immigration and nationality service. Meanwhile, many illegal immigrants had succeeded in bypassing the asylum system entirely. In July 2005, the Home Office produced its own estimate of what their number had been four years earlier. It reckoned that this was between 310,000 and 570,000, or up to one per cent of the total population. A year later, unofficial estimates pushed this number up to 800,000. The truth was that no-one really knew, but official figures showed that the number applying for asylum was now falling, with the former Yugoslavia returning to relative peace. Thousands of refugees were also being returned to Iraq, though the signs were already apparent that further wars in the Middle East and the impact of global warming on sub-Saharan Africa would soon send more disparate groups across the continents.

Britain’s Toxic Politics of Immigration:


To begin with, the arrival of workers from the ten countries which joined the EU in 2004 was a different issue, though it involved an influx of roughly the same size. By the government’s own figures, annual net inward migration had reached 185,000 and had averaged 166,000 over the previous seven years. This was significantly more than the average net inflow of fifty thousand New Commonwealth immigrants which Enoch Powell (pictured above) had referred to as ‘literally mad’ in his 1968 Rivers of Blood speech, though he had been criticising the immigration of East African Asians, of course. But although Powell’s speech was partly about race, colour and identity, it was also about the numbers of immigrants and the practical concerns of his Wolverhampton constituents in finding hospital and school places in an overstretched public sector. It seems not unreasonable, and not at all racist, to suggest that it is a duty of central government to predict and provide for the number of newcomers it permits to settle in the country. In 2006, projections based on many different assumptions suggested that the UK population would grow by more than seven million by 2031. Of that growth, eighty per cent would be due to immigration. The organisation Migration Watch UK, set up to campaign for tighter immigration controls, said this was equivalent to requiring the building of a new town the size of Cambridge each year, or five new cities the size of Birmingham over the predicted quarter century.

But such characterisations were surely caricatures of the situation, since many of these new Eastern European migrants did not intend to settle permanently in the UK and could be expected to return to their countries of origin in due course. However, the massive underestimation of the scale of the inward migration was, of course, predictable to anybody with any knowledge of the history of post-war migration, which is replete with vast underestimates of the numbers expected. But it did also demonstrate that immigration control was simply not a priority for New Labour, especially in its early manifestations. It gave the impression that it regarded all immigration control, and even discussion of it, as inherently ‘racist’ (even the restriction of white European migration), which made any internal or external opposition hard to voice. The public response to the massive upsurge in immigration and to the swift transformation of parts of Britain it had not really reached before was exceptionally tolerant. There were no significant or sustained outbreaks of racist abuse or violence before 2016, and the only racist political party, the British National Party (BNP), was subsequently destroyed at the polls, especially in London.

In April 2006, Margaret Hodge, the Labour MP for Barking since 1996 (pictured right), commented in an interview with The Sunday Telegraph that eight out of ten white working-class voters in her constituency might be tempted to vote for the British National Party (BNP) in the local elections on 4 May 2006 because “no one else is listening to them” about their concerns over unemployment, high house prices and the housing of asylum seekers in the area. She said the Labour Party must promote…

“… very, very strongly the benefits of the new, rich multi-racial society which is part of this part of London for me”.

There was widespread media coverage of her remarks, and Hodge was strongly criticised for giving the BNP publicity. The BNP went on to gain 11 seats in the local election out of a total of 51, making them the second largest party on the local council. It was reported that Labour activists accused Hodge of generating hundreds of extra votes for the BNP and that local members began to privately discuss the possibility of a move to deselect her. The GMB union wrote to Hodge in May 2006, demanding her resignation. The Mayor of London, Ken Livingstone, later accused Hodge of “magnifying the propaganda of the BNP” after she said that British residents should get priority in council house allocations. In November 2009, the Leader of the BNP, Nick Griffin, announced that he intended to contest Barking at the 2010 general election. In spite of the unions’ position, Hodge was returned as Member for Barking in 2010, doubling her majority to over 16,000, whilst Griffin came third behind the Conservatives. The BNP lost all of its seats on Barking and Dagenham Council. The same general election in 2010 saw New Labour defeated under Gordon Brown’s leadership.

Opinion polls and the simple, anecdotal evidence of living in the country showed that most people continued to feel zero personal animosity towards immigrants or people of different ethnic backgrounds. But poll after poll did show that a majority were deeply worried about what ‘all this’ migration meant for the country and its future. But even the mildest attempts to put these issues on the political agenda, such as the concerns raised by Margaret Hodge (and the 2005 Conservative election campaign poster suggesting ‘limits’ on immigration) were often met with condemnation by the ruling political class, with the result that there was still no serious public discussion of them. Perhaps successive governments of all hues had spent decades putting off any real debate on immigration because they suspected that the public disagreed with them and that it was a matter they had lost control over anyway.

Perhaps it was because of this lack of control that the principal reaction to the developing reality began to be to turn on those who expressed any concern about it, even when they reflected the views of the general public. This was done through charges of ‘racism’ and ‘bigotry’, such as the accidental ‘caught-on-mike’ remark made by Gordon Brown while getting into his car during the 2010 election campaign, after he had been confronted by a lifelong Labour supporter in a northern English town about the sheer numbers of migrants. It is said to have represented a major turning point in the campaign. A series of deflecting tactics became a replacement for action in the wake of the 2011 census, including the demand that the public should ‘just get over it’, which came back to haunt David Cameron’s ministers in the wake of the 2016 Referendum. In his Daily Telegraph column of December 2012, titled Let’s not dwell on immigration but sow the seeds of integration, Boris Johnson, then Mayor of London, responded to the census results by writing…

We need to stop moaning about the dam-burst. It’s happened. There is nothing we can now do except make the process of absorption as eupeptic as possible … 

The Mayor, who as an MP and member of David Cameron’s front-bench team later became a key leader of the ‘Leave’ campaign and an ardent Brexiteer, may well have been right in making this statement, saying what any practical politician in charge of a multi-cultural metropolis would have to say. But there is something cold about the tone of his remark, not least the absence of any sense that there were other people out there in the capital city not willing simply to ‘get over it’, who disliked the alteration of their society and had never asked for it. It did not seem to have occurred to Johnson that there were those who might be nursing a sense of righteous indignation about the fact that for years all the main parties had taken decisions that were so at variance with the opinions of their electors, or that there was something profoundly disenfranchising about such decisions, especially when addressed to a majority of the voting public.

In the same month as Johnson’s admonition, a poll by YouGov found two-thirds of the British public believed that immigration over the previous decade had been ‘a bad thing for Britain’. Only eleven per cent thought it had been ‘a good thing’. This included majorities among voters for every one of the three main parties. Poll after poll conducted over the next five years showed the same result. As well as routinely prioritising immigration as their top concern, a majority of voters in Britain regularly described immigration as having a negative impact on their public services and housing through overcrowding, as well as harming the nation’s identity. By 2012 the leaders of every one of the major parties in Britain had conceded that immigration was too high, but even whilst doing so all had also insisted that the public should ‘get over it’. None had any clear or successful policy on how to change course. Public opinion surveys suggest that a failure to do anything about immigration even while talking about it is one of the key areas of the breakdown in trust between the electorate and their political representatives.

At the same time, the coalition government of 2010-15 was fearful of the attribution of base motives if it got ‘tough on immigrants’. The Conservative leadership was trying to reposition itself as more socially ‘liberal’ under David Cameron. Nevertheless, at the election, they had promised to cut immigration from hundreds of thousands to tens of thousands per year, but they never succeeded in getting near that target. To show that she meant ‘business’, however, in 2013, Theresa May’s Home Office organised a number of vans with advertising hoardings to drive around six London boroughs where many illegal immigrants and asylum seekers lived. The posters on the hoardings read, In the UK illegally? Go home or face arrest, followed by a government helpline number. The posters became politically toxic immediately. The Labour Shadow Home Secretary, Yvette Cooper, described them as “divisive and disgraceful” and the campaign group Liberty branded them “racist and illegal”.

After some months it was revealed that the pilot scheme had successfully persuaded only eleven illegal immigrants to leave the country voluntarily. Theresa May admitted that the scheme had been a mistake and too “blunt”. Indeed, it was a ‘stunt’ designed to reassure the ‘native’ population that their government was getting tough, and it was not repeated, but the overall ‘hostile environment’ policy it was part of continued into the next majority Conservative government, leading to the illegal deportation of hundreds of ‘Windrush generation’ migrants from the Caribbean who had settled in Britain before 1968 and therefore lacked passports and papers identifying them as British subjects. The Tories repeated their promise on immigration more recently, in both David Cameron’s majority government of 2015 and Theresa May’s minority one of 2017, but are still failing to get levels down to tens of thousands. In fact, under Cameron, net immigration reached a record level of 330,000 per year, numbers which would fill a city the size of Coventry.

The movement of people, even before the European migration crisis of 2015, was of an entirely different quantity, quality and consistency from anything that the British Isles had experienced before, even in the postwar period. Yet the ‘nation of immigrants’ myth continued to be used to cover over the vast changes of recent years and to pretend that history provides precedents for what has happened since the turn of the millennium. The 2011 Census could have provided an opportunity to address the recent transformation of British society but, like other opportunities in the second half of the twentieth century to discuss immigration, it was missed. If the fact that ‘white Britons’ now comprised a minority of the London population was seen as a demonstration of ‘diversity’, then the census had shown that some London boroughs were already lacking in ‘diversity’, not because there weren’t enough people of immigrant origin but because there weren’t enough ‘white Britons’ still around to make those boroughs diverse.

Brexit – The Death of Diversity:

Since the 2011 Census, net migration into Britain has continued to be far in excess of three hundred thousand per year. The rising population of the United Kingdom is now almost entirely due to inward migration, and to higher birthrates among the predominantly young migrant population. In 2014 women who were born overseas accounted for twenty-seven per cent of all live births in England and Wales, and a third of all newborn babies had at least one overseas-born parent, a figure that had doubled since the 1990s. However, since the 2016 Brexit vote, statistics have shown that many recent migrants to Britain from the EU have been returning to their home countries, so that it is difficult to know, as yet, how many of these children will grow up in Britain, or for how long. On the basis of current population trends, and without any further rise in net inward migration, the most modest estimate by the ONS of the future British population is that it will rise from its current level of sixty-five million to seventy million within a decade, to seventy-seven million by 2050 and to more than eighty million by 2060. But if the post-2011 levels were to continue, the UK population would go above eighty million as early as 2040 and to ninety million by 2060. In this context, Douglas Murray asks the following rhetorical questions of the leaders of the mainstream political parties:

All these years on, despite the name-calling and the insults and the ignoring of their concerns, were your derided average white voters not correct when they said that they were losing their country? Irrespective of whether you think that they should have thought this, let alone whether they should have said this, said it differently or accepted the change more readily, it should at some stage cause people to pause and reflect that the voices almost everybody wanted to demonise and dismiss were in the final analysis the voices whose predictions were nearest to being right.

An Ipsos poll published in July 2016 surveyed public attitudes towards immigration across Europe. It revealed just how few people thought that immigration had had a beneficial impact on their societies. To the question, Would you say that immigration has generally had a positive or negative impact on your country?, very low percentages of people in each country thought that it had had a positive effect. Britain had a comparatively positive attitude, with thirty-six per cent of people saying that they thought it had had a very or fairly positive impact. Meanwhile, only twenty-four per cent of Swedes felt the same way and just eighteen per cent of Germans. In Italy, France and Belgium only ten to eleven per cent of the population thought that it had made even a fairly positive impact on their countries. Despite the Referendum result, the British figure may well have been higher because Britain had not experienced the same level of immigration from outside the EU as the countries caught up in the inter-continental migration crisis of the previous summer.


Indeed, the issue of immigration as it affected the 2016 Referendum in Britain was largely about the numbers of Eastern European migrants arriving in the country, rather than about illegal immigrants from outside the EU, or asylum seekers. Inevitably, all three issues became confused in the public mind, something that UKIP (the United Kingdom Independence Party) used to good effect in its campaigning posters. The original version of the poster above, featuring UKIP leader Nigel Farage, caused considerable controversy by using pictures from the 2015 crisis in Central-Eastern Europe to suggest that Europe was at ‘Breaking Point’ and that, once in the EU, refugees and migrants would be able to enter Britain and settle there. This was untrue, as the UK is not in the ‘Schengen’ area. Campaigners against ‘Brexit’ pointed out the facts of the situation in the adapted internet poster. In addition, during the campaign, Eastern European leaders, including the Poles and the Hungarians, complained about the misrepresentation of their citizens as ‘immigrants’ like many of those who had recently crossed the EU’s Balkan borders in order to get to Germany or Sweden. As far as they were concerned, their citizens were temporary internal migrants within the EU’s arrangements for ‘freedom of movement’ between member states. Because this was largely a one-way movement in numeric terms, however, the distinction was lost on many voters, and ‘immigration’ became the dominant factor in their backing of Brexit by a margin of 52% to 48%.

In Britain, the issue of Calais remained the foremost one in discussion in the autumn of 2016. The British government announced that it was going to have to build a further security wall near to the large migrant camp there. The one-kilometre wall was designed to further protect the entry point to Britain, and specifically to prevent migrants from trying to climb onto passing lorries on their way to the UK. Given that there were fewer than 6,500 people in the camp most of the time, a solution to Calais always seemed straightforward. All that was needed, argued activists and politicians, was a one-time generous offer and the camp could be cleared. But the reality was that once the camp was cleared it would simply be filled again. For 6,500 was an average day’s migration to Italy alone.

Map legend. Blue: Schengen Area; Green: countries with open borders; Ochre: countries legally obliged to join.

In the meantime, while the British and French governments argued over who was responsible for the situation at Calais, both day and night migrants threw missiles at cars, trucks and lorries heading to Britain in the hope that the vehicles would stop and they could climb aboard as stowaways for the journey across the Channel. The migrants who ended up in Calais had already broken all the EU’s rules on asylum in order to get there. They had not applied for asylum in their first country of entry, Greece, nor even in Hungary. Instead, they had pushed on through the national borders of the ‘Schengen’ free passage area (see map above right) until they reached the north of France. If they were cold, poor or just worse off, they were seen as having the right to come into a Europe which could no longer be bothered to turn anyone away.


Migrants/ Asylum Seekers arriving on the shores of the Greek island of Lesbos.

The Disintegration of Multiculturalism, ‘Parallel Development’ & the Populist Reaction in Britain:

After the 9/11 attacks on the USA, the wars in Iraq and Afghanistan and the 7/7 London bombings, there was no bigger cultural challenge to the British sense of proportion and fairness than the threat of ‘militant Islam’. There were plenty of angry young Muslim men prepared to listen to fanatical ‘imams’ and to act on their narrow-minded and bloodthirsty interpretations of ‘Jihad’. Their views, at odds with those of the well-established South Asian Muslim communities referred to above, were those of the ultra-conservative ‘Wahhabi’ Arabs and Iranian mullahs who insisted, for example, on women being fully veiled. But some English politicians, like Norman Tebbit, felt justified in asking whether Muslim communities throughout Britain really wanted to fully integrate. Would they, in Tebbit’s notorious ‘test’, support the English Cricket team when it played against Pakistan?

Britain did not have as high a proportion of Muslims as France and, outside London and parts of the South East, not many of Arab and North African origin. But the large urban centres of the Home Counties, the English Midlands and the North of England had third-generation Muslim communities of hundreds of thousands. They felt like they were being watched in a new way and were perhaps right to feel more than a little uneasy. In the old industrial towns on either side of the Pennines and in areas of West London there were such strong concentrations of Muslims that the word ‘ghetto’ was being used by ministers and civil servants, not just, as in the seventies and eighties, by right-wing organisations and politicians. White working-class people had long been moving, quietly, to more semi-rural commuter towns in the Home Counties and on the South Coast.

But those involved in this ‘white flight’, as it became known, were a minority if polling was an accurate guide. Only a quarter of Britons said that they would prefer to live in white-only areas. Yet even this measure of ‘multiculturalism’, defined as ‘live and let live’, was being questioned. How much should the new Britons ‘integrate’ or ‘assimilate’, and how much was the retention of traditions a matter of their rights to a distinctive cultural identity? After all, Britain had a long heritage of allowing newcomers to integrate on their own terms, retaining and contributing elements of their own culture. Speaking in December 2006, Blair cited forced marriages, the importation of ‘sharia’ law and the ban on women entering certain mosques as being on the wrong side of this line. In the same speech he used new, harder language. He claimed that, after the London bombings, …

“… for the first time in a generation there is an unease, an anxiety, even at points a resentment that our very openness, our willingness to welcome difference, our pride in being home to many cultures, is being used against us … Our tolerance is part of what makes Britain, Britain. So conform to it; or don’t come here. We don’t want the hate-mongers … If you come here lawfully, we welcome you. If you are permitted to stay here permanently, you become an equal member of our community and become one of us.”

His speech was not just about security and the struggle against terrorism. He was defining the duty to integrate. Britain’s strong economic growth over the previous two decades, despite its weaker manufacturing base, was partly the product of its long tradition of hospitality. The question now was whether the country was becoming so overcrowded that this tradition of tolerance was finally eroding. England, in particular, had the highest population density of any major country in the Western world. It would require wisdom and frankness from politicians, together with watchfulness and efficiency from Whitehall, to keep the ship on an even keel. Without these qualities and trust from the people, how can we hope for meaningful reconciliation between Muslim, Christian, Jew and Humanist; between newcomers, sojourners, old-timers and exiles; between white Europeans, black Africans, South Asians and West Indians?

Map showing the location of Rotherham in South Yorkshire

In January 2011, a gang of nine Muslim men, seven of Pakistani heritage and two from North Africa, were convicted and sentenced at the Old Bailey in London for the sex trafficking of children between the ages of eleven and fifteen. One of the victims sold into a form of modern-day slavery was a girl of eleven who was branded with the initial of her ‘owner’ and abuser: ‘M’ for Mohammed. The court heard that he had branded her to make her his property and to ensure others knew about it. This did not happen in a Saudi or Pakistani backwater, nor even in one of the northern English towns that so much of the country had forgotten about until similar crimes involving men of Pakistani heritage were brought to light. This happened in Oxfordshire between 2004 and 2012. Nobody could argue that gang rape and child abuse are the preserve of immigrants, but these court cases and the official investigations into particular types of child-rape gangs, especially in the case of Rotherham, have identified specific cultural attitudes towards women, especially non-Muslim women, that are similar to those held by men in parts of Pakistan. These have sometimes been extended into intolerant attitudes toward other religions, ethnic groups and sexual minorities. They are cultural attitudes which are anathema to the teachings of the Qur’an and mainstream Imams, but fears of being accused of ‘racism’ for pointing out such factual connections had been at least partly responsible for these cases taking years to come to light.

British Muslims and members of the British-Pakistani community condemned both the abuse and the fact that it had been covered up. Nazir Afzal (pictured right), Chief Crown Prosecutor of the Crown Prosecution Service (CPS) for North West England from 2011 to 2015, himself a Muslim, made the decision in 2011 to prosecute the Rochdale child sex abuse ring after the CPS had turned the case down. Responding to the Jay report, he argued that the abuse had no basis in Islam:

“Islam says that alcohol, drugs, rape and abuse are all forbidden, yet these men were surrounded by all of these things. … It is not the abusers’ race that defines them. It is their attitude toward women that defines them.” 

Below left: The front page of The Times, 24 September 2012.

Even then, however, in the Oxfordshire case, the gangs were described as ‘Asian’ by the media, rather than as men of Pakistani and Arabic origin. In addition, the fact that their victims were chosen because they were not Muslim was rarely mentioned in court or dwelt upon by the press. But despite sections of the media beginning to focus on Pakistani men preying on young white girls, a 2013 report by the UK Muslim Women’s Network found that British Asian girls were also being abused across the country in situations that mirrored the abuse in Rotherham. The unfunded, small-scale report found 35 cases of young Muslim girls of Pakistani heritage being raped and passed around for sex by multiple men. In the report, one local Pakistani women’s group described how Pakistani-heritage girls were targeted by taxi drivers and, on occasion, by older men lying in wait outside school gates at dinner times and after school. They also cited cases in Rotherham where Pakistani landlords had befriended Pakistani women and girls on their own for purposes of sex, then passed on their names to other men who had then contacted them for sex. The Jay Report, published in 2014, acknowledged that the 2013 report of abuse of Asian girls was ‘virtually identical’ to the abuse that occurred in Rotherham, and also acknowledged that British Asian girls were unlikely to report their abuse due to the repercussions on their family. Asian girls were ‘too afraid to go to the law’ and were being blackmailed into having sex with different men, while others were forced at knife-point to perform sexual acts on men. Support workers described how one teenage girl had been gang-raped at a party:

“When she got there, there was no party, there were no other female members present. What she found was that there were five adults, their ages ranging between their mid-twenties going on to the late-forties and the five men systematically, routinely, raped her. And the young man who was supposed to be her boyfriend stood back and watched”.

Groups would photograph the abuse and threaten to show it to the victims’ fathers and brothers, and in the mosques, if the victims went to the police.

In June 2013, the polling company ComRes carried out a poll for BBC Radio 1 asking a thousand young British people about their attitudes towards the world’s major religions. The results were released three months later and showed that of those polled, twenty-seven per cent said that they did not trust Muslims (compared with 15% saying the same of Jews, 13% of Buddhists, and 12% of Christians). More significantly, perhaps, forty-four per cent said that they thought Muslims did not share the same views or values as the rest of the population. The BBC and other media in Britain then set to work to try to discover how Britain could address the fact that so many young people thought this way. Part of the answer may have had something to do with the timing of the poll, the fieldwork being carried out between 7-17 June. It had only been a few weeks before this that Drummer Lee Rigby, a young soldier on leave from Afghanistan, had been hit by a car in broad daylight outside an army barracks in South London, dragged into the middle of the road and hacked to death with machetes. The two murderers, Michael Adebolajo and Michael Adebowale, were Muslims of African origin who were carrying letters claiming justification for killing “Allah’s enemies”. It’s therefore reasonable to suppose that, rather than making assumptions about a religious minority without any evidence, those who were asked their opinions connected Muslims with a difference in basic values because they had been very recently associated with an act of extreme violence on the streets of London.

Unfortunately, attempts to provide a more balanced view and to separate these acts of terrorism from Islam have been dwarfed by the growing public perception of a problem which will not simply go away through the repetition of ‘mantras’. The internet has provided multiple and diverse sources of information, but the simple passage of the various events related above, and the many other available examples, have meant that the public have been able to make their own judgements about Islam, and they are certainly not as favourable as they were at the start of the current century. By 2015, one poll showed that only thirty per cent of the general public in Britain thought that the values of Islam were ‘compatible’ with the values of British society. The passage of terrorist events on the streets of Europe continued through 2016 and 2017. On 22 March 2017, a 52-year-old British-born convert to Islam, Khalid Masood, ploughed his car across Westminster Bridge, killing two tourists, one American and the other Romanian, and two British nationals. Dozens more were injured as they scattered, some falling into the River Thames below. Crashing into the railings at the side of Parliament, Masood then ran out of the hired vehicle and through the gates of the palace, where he stabbed the duty policeman, PC Keith Palmer, who died a few minutes later. Masood was then shot dead by armed police, his last phone messages revealing that he believed he was “waging jihad.” Two weeks later, at an inter-faith ‘Service of Hope’ at Westminster Abbey, its Dean, the Very Reverend John Hall, spoke for a nation he described as ‘bewildered’:

What could possibly motivate a man to hire a car and take it from Birmingham to Brighton to London, and then drive it fast at people he had never met, couldn’t possibly know, against whom he had no personal grudge, no reason to hate them and then run at the gates of the Palace of Westminster to cause another death? It seems that we shall never know.

Then on 22 May thousands of young women and girls were leaving a concert by the US pop singer Ariana Grande at Manchester Arena. Waiting for them as they streamed out was Salman Abedi, a twenty-two-year-old British-born man, whose Libyan parents had arrived in the UK in the early nineties after fleeing from the Gadaffi régime. In the underground foyer, Abedi detonated a bomb he was carrying which was packed with nuts, bolts and other shrapnel. Twenty-two people, children and parents who had arrived to pick them up, were killed instantly. Hundreds more were injured, many of them suffering life-changing wounds. Then, in what began to seem like a remorseless series of events, on 3 June three men drove a van into pedestrians crossing London Bridge. They leapt out of it and began slashing at the throats of pedestrians, appearing to be targeting women in particular. They then ran through Borough Market area shouting “this is for Allah”. Eight people were murdered and many more seriously injured before armed police shot the three men dead. Two of the three, all of whom were aged twenty to thirty, were born in Morocco. The oldest of them, Rachid Redouane, had entered Britain using a false name, claiming to be a Libyan and was actually five years older than he had pretended. He had been refused asylum and absconded. Khurram Butt had been born in Pakistan and had arrived in the UK as a ‘child refugee’ in 1998, his family having moved to the UK to claim asylum from ‘political oppression’, although Pakistan was not on the UNHCR list.

On the evening of 19 June, at the end of the Muslim sabbath, in what appeared to be a ‘reprisal’, a forty-seven-year-old father of four from Cardiff drove a van into crowds of worshippers outside Finsbury Park mosque who were crossing the road to go to the nearby Muslim Welfare House. One man, who had collapsed on the road and was being given emergency aid, was run over and died at the scene. Almost a dozen more were injured. Up to this point, all the Islamist terror attacks, from 7/7/2005 onwards, had been planned and carried out by ‘home-grown’ terrorists. Even the asylum seekers involved in the June attack in London had been in the country since well before the 2015 migration crisis. But in mid-September, an eighteen-year-old Iraqi who had arrived in the UK illegally in 2015, and had been living with British foster parents ever since, left a crudely-manufactured bomb on the London Underground District line during the rush hour, when the carriages were also crowded with schoolchildren. The detonator exploded but failed to ignite the home-made device itself, causing flash burns to dozens of people in the carriage. A more serious blast would have led to those dozens being taken away in body bags, and many more injured in the stampede which would have followed at the station exit with its steep steps. As it was, the passengers remained calm during their evacuation, and the subsequent emphasis, once again, fell on the ubiquitous Blitz slogan, ‘Keep Calm and Carry On!’

Conclusion: Brexit at its ‘Best’.


Of course, it would have been difficult to predict and prevent these attacks, either by erecting physical barriers or by identifying individuals who might be at risk of ‘radicalisation’, much of which takes place online. Most of the attackers had been born and radicalised in the UK, so no reinforcements at the borders, whether in Calais or Kent, would have kept them from enacting their atrocities. But the need for secure borders is not simply a symbolic or psychological reinforcement for the British people if it is combined with a workable and efficient asylum policy. We are repeatedly told that one of the two main reasons for the 2016 referendum decision for Britain to leave the EU was to take back control of its borders and immigration policy, though it was never demonstrated how exactly it had lost control of these, or at least how its EU membership had made it lose control over them.


There are already signs that, as much due to the fall in the value of the pound since the referendum as to Brexit itself, many Eastern European migrants are returning to their home countries, but the vast majority of them had already declared that they did not intend to settle permanently in the UK. The fact that so many came from 2004 onwards was entirely down to the decision of the British government not to delay or derogate from the operation of the accession treaties. But the reality remains that, even if they were to be replaced by other European ‘immigrants’ in future, the UK would still need to control, as ever, the immigration of people from outside the EU, including asylum seekers, and that returning failed or bogus applicants would become more difficult. So, too, would the sharing of intelligence information about the potential threats of terrorists attempting to enter Britain as bogus refugees. Other than these considerations, the home-grown threat from Islamist terrorists is likely to be unaffected by Brexit one way or another, and can only be dealt with by anti-radicalisation strategies, especially through education and more active inter-cultural community relations aimed at full integration, not ‘parallel’ development.

‘Populism’

Since the Brexit referendum in 2016 and the election of Donald Trump, it seems that journalists just cannot get enough of populism. In 1998, the Guardian published about three hundred articles that contained the term. In 2015, it was used in about a thousand articles, and one year later this number had doubled to almost two thousand. Populist parties have tripled their vote across Europe over the past twenty years, and more than a quarter of Europeans voted populist in their last elections. So, in deciding to leave the EU, the British are, ironically, becoming more like their continental cousins in supporting populist causes and parties. In a recent article in The Guardian Weekly (30 November 2018), Fintan O’Toole, a columnist for The Irish Times, points out that for many pro-Brexit journalists and politicians Brexit takes the form of a populist ‘Britain alone’ crusade (see the picture and text below) which has been endemic in Britain’s political discourse about Europe since it joined ‘the common market’ in 1973:

Europe’s role in this weird psychodrama is entirely pre-scripted. It doesn’t greatly matter what the European Union is or what it is doing – its function in the plot is to be a more insidious form of Nazism. This is important to grasp, because one of the key arguments in mainstream pro-Brexit political and journalistic discourse would be that Britain had to leave because the Europe it had joined was not the Europe it found itself part of in 2016…

… The idea of Europe as a soft-Nazi superstate was vividly present in 1975, even when the still-emerging EU had a much weaker, less evolved and less intrusive form…

Yet what brings these disparate modes together is the lure of self-pity, the weird need to dream England into a state of awful oppression… Hostility to the EU thus opens the way to a bizarre logic in which a Nazi invasion would have been, relatively speaking, welcome…

It was a masochistic rhetoric that would return in full force as the Brexit negotiations failed to produce the promised miracles.


Certainly, the rejection of Mrs May’s deal in the House of Commons by large numbers of ‘Brexiteer’ MPs from her own Conservative Party was largely, by their own admission, because they felt they could not trust the assurances given by the Presidents of the Council and the Commission of the European Union, who were, some MPs stated, trying to trick them into accepting provisions which would tie the UK indefinitely to EU regulations. It is undoubtedly true that the British people mostly don’t want to spend any more time arguing about Brexit. But when ‘leavers’ and ‘remainers’ are united only in disliking Mrs May’s solution, that offers no way forward. The Brexiteers can only offer a “managed no deal” as an alternative, which means just strapping on seat belts as your car heads for the cliff edge. Brexit has turned out to be an economic and political disaster already, fuelling, not healing, the divisions in British society which have opened up over the last twenty years, and which have widened into a chasm in the six years since the triumph of the London Olympics and the Diamond Jubilee celebrations. The extent of this folly has grown clearer with each turn of the page. But the ending is not fully written.

Sources (for both parts):

The Guardian Weekly,  30 November 2018. London.

Douglas Murray (2018), The Strange Death of Europe: Immigration, Identity, Islam. London: Bloomsbury.

Simon Schama (2002), A History of Britain III: 1776-2000, The Fate of Empire. London: BBC Worldwide.

Andrew Marr (2009), A History of Modern Britain. London: Pan Macmillan.

John Morrill (ed.), (2001), The Penguin Atlas of British and Irish History. Harmondsworth: Penguin Books.

 

Posted January 16, 2019 by AngloMagyarMedia in Affluence, Africa, Arabs, Assimilation, asylum seekers, Australia, Balkan Crises, BBC, Brexit, Britain, British history, Britons, Brussels, Caribbean, Cartoons, Christian Faith, Christianity, Church, Colonisation, Commonwealth, Compromise, decolonisation, democracy, Demography, devolution, Discourse Analysis, Education, Empire, English Language, Europe, European Economic Community, European Union, Factories, Germany, History, Home Counties, Humanitarianism, Hungary, Immigration, India, Integration, Iraq, Ireland, Jews, Journalism, Labour Party, liberalism, Midlands, Migration, multiculturalism, multilingualism, Mythology, New Labour, Population, populism, Reconciliation, Refugees, Respectability, Satire, Second World War, terror, terrorism, United Kingdom, United Nations, West Midlands, World War Two, xenophobia


You Only Live Twice – Cool Britannia to Cold Brexit: The United Kingdom, 1999-2019. Part One: Economics, Culture & Society.   Leave a comment


Cold Shoulder or Warm Handshake?

On 29 March 2019, the United Kingdom of Great Britain and Northern Ireland will leave the European Union after forty-six years of membership, having joined the European Economic Community on 1 January 1973 on the same day and hour as the Republic of Ireland. Yet in 1999, it looked as if the long-standing debate over Britain's membership had been resolved. The Maastricht Treaty establishing the European Union had been signed by all the member states of the preceding European Community in February 1992 and was succeeded by a further treaty, signed in Amsterdam in 1997, which came into force in 1999. What, then, has happened in the space of twenty years to so fundamentally change the 'settled' view of the British Parliament and people, bearing in mind that both Scotland and Northern Ireland voted to remain in the EU, while England and Wales both voted to leave? At the time of writing, the manner of our going has not yet been determined, but the invocation of 'article fifty' by the Westminster Parliament and the UK government means that the date has been set. So either we will have to leave without a deal, turning a cold shoulder to our erstwhile friends and allies on the continent, or we will finally ratify the deal agreed between the UK government and the EU Commission, acting on behalf of the twenty-seven remaining member states, and leave with a warm handshake and most of our trading and cultural relations intact.

As yet, the possibility of a second referendum – or third, if we take into account the 1975 referendum called by Harold Wilson, which was also a binary leave/remain decision – seems remote. In any event, it is quite likely that the result would be the same and would kill off any opportunity of the UK returning to EU membership for at least another generation. As Ian Fleming's James Bond tells us, 'you only live twice'. That certainly seems to be the mood in Brussels too. I was too young to vote in 1975 by just five days, and another membership referendum would be unlikely to occur in my lifetime. So much has been said about following 'the will of the people', or at least 52% of them, that it would be a foolish government, in an age of rampant populism, that chose to revoke article fifty, even if Westminster voted for this. At the same time, and in that same populist age, we know from recent experience that in politics and international relations, nothing is inevitable…


One of the major factors in the 2016 referendum campaign was the question of the country's public spending priorities, set against the cost of its contributions to the European Union. The 'Leave' campaign sent a double-decker bus around England stating that by ending the UK's payments into the EU, more than 350 million pounds per week could be redirected to the National Health Service (NHS).

A British Icon Revived – The NHS under New Labour:

To understand the power of this statement, it is important to recognise that the NHS is funded largely from general taxation rather than through social insurance contributions, as is the case in many other European countries. As a service created in 1948 to be 'free at the point of delivery', it is seen as a 'British icon', and funding has been a central issue in national election campaigns since 2001, when Tony Blair was confronted by an irate voter, Sharon Storer, outside a hospital. In its first election manifesto of 1997, 'New Labour' promised to safeguard the basic principles of the NHS, which we founded. The 'we' here was the post-war Labour government, whose socialist Health Minister, Aneurin Bevan, had established the service in the teeth of considerable opposition from within both parliament and the medical profession. 'New Labour' protested that under the Tories there had been fifty thousand fewer nurses but a rise of no fewer than twenty thousand managers – red tape which Labour would pull away and burn. Though critical of the internal markets the Tories had introduced, Blair promised to keep a split between those who commissioned health services and those who provided them.


Under Frank Dobson, Labour's new Health Secretary, there was little reform of the NHS, but there was, year by year, just enough extra money to stave off the winter crises. Then a series of tragic individual cases hit the headlines, and one of them was raised by a Labour peer and well-known medical scientist and fertility expert, Professor Robert Winston, who was greatly admired by Tony Blair. He launched a furious denunciation of the government over the treatment of his elderly mother. Far from upholding the NHS's iconic status, Winston said that Britain's health service was the worst in Europe and was getting worse under the New Labour government, which was being deceitful about the true picture. Labour's own polling showed that the country as a whole broadly shared Winston's assessment. In January 2000, therefore, Blair announced directly to the country that he would bring Britain's health spending up to the European average within five years. That was a huge promise, because it meant spending a third as much again in real terms, and his 'prudent' Chancellor of the Exchequer, Gordon Brown, was unhappy that Blair had not spoken enough on television about the need for health service reform to accompany the money, and had also 'stolen' his budget announcements. On Budget day itself, Brown announced that until 2004 health spending would rise at above six per cent beyond inflation every year, …

… by far the largest sustained increase in NHS funding in any period in its fifty-year history … half as much again for health care for every family in this country.       

The tilt away from Brown's sharp spending controls during the first three years of the New Labour government had begun by the first spring of the new millennium, and there was more to come. With a general election looming in 2001, Brown also announced a review of the NHS and its future by a former banker. As soon as the election was over, broad hints about necessary tax rises were dropped. When the Wanless Report was finally published, it confirmed much that the winter crisis of 1999-2000 had exposed. The NHS was not, whatever Britons fondly believed, better than health systems in other developed countries, and it needed a lot more money. 'Wanless' also rejected a radical change in funding, such as a switch to insurance-based or semi-private health care. Brown immediately used this as objective proof that taxes had to rise in order to save the NHS. In his next budget, in 2002, Brown broke with a political convention which had reigned since the mid-eighties: that direct taxes would not be raised. He introduced a special one per cent national insurance levy, equivalent to a penny on income tax, to fund the huge reinvestment in Britain's health.

Public spending shot up with this commitment and, in some ways, it paid off, since by 2006 there were around 300,000 extra NHS staff compared to 1997. That included more than ten thousand extra senior hospital doctors (about a quarter more) and 85,000 more nurses. But there were also nearly forty thousand more managers, twice as many as Blair and Brown had ridiculed the Tory government for hiring. An ambitious computer project for the whole NHS became an expensive catastrophe. Meanwhile, the health service budget rose from thirty-seven billion to more than ninety-two billion pounds a year. But the investment produced results, with waiting lists, a source of great public anger from the mid-nineties, falling by 200,000. By 2005, Blair was able to talk of the best waiting list figures since 1988. Hardly anyone was left waiting for an inpatient appointment for more than six months. Death rates from cancer for people under the age of seventy-five fell by 15.7 per cent between 1996 and 2006, and death rates from heart disease fell by just under thirty-six per cent. Meanwhile, the private finance initiative meant that new hospitals were being built around the country. But, unfortunately for New Labour, that was not the whole story of the Health Service under their stewardship. As Andrew Marr has attested,

…’Czars’, quangos, agencies, commissions, access teams and planners hunched over the NHS as Whitehall, having promised to devolve power, now imposed a new round of mind-dazing control.

By the autumn of 2004, hospitals were subject to more than a hundred inspections. War broke out between Brown and the Treasury on one side and the 'Blairite' Health Secretary, Alan Milburn, on the other over the basic principles of running the hospitals. Milburn wanted more competition between them, but Brown didn't see how this was possible when most people had only one major local hospital. Polling suggested that Brown was making a popular point: most people simply wanted better hospitals, not more choice. A truce was eventually declared with the establishment of a small number of independent, 'foundation' hospitals. By the 2005 general election, Michael Howard's Conservatives were attacking Labour for wasting money and allowing people's lives to be put at risk in dirty, badly run hospitals. Just as Labour once had, they were promising to cut bureaucracy and the number of organisations within the NHS. By the summer of 2006, despite the huge injection of funds, the Service was facing a cash crisis. Although the shortfall was not huge as a percentage of the total budget, trusts in some of the most vulnerable parts of the country were on the edge of bankruptcy, from Hartlepool to Cornwall and across to London. Throughout Britain, seven thousand jobs had gone, and the Royal College of Nursing, the professional association to which most nurses belonged, was predicting that thirteen thousand more would go soon. Many newly and expensively qualified doctors and even specialist consultants could not find work. It seemed that wage costs, expensive new drugs, poor management and the money poured into endless bureaucratic reforms had resulted in a still inadequate service. Bupa, the leading private operator, had covered some 2.3 million people in 1999; six years later, the figure was more than eight million. This partly reflected greater affluence, but it was hardly a resounding vote of confidence in Labour's management of the NHS.

Public Spending, Declining Regions & Economic Development:

As public spending had begun to flow during the second Blair administration, vast amounts of money had gone on pay rises, new bureaucracies and bills for outside consultants. Ministries, unused to spending again after the initial period of 'prudence', did not always do it well. Brown and his Treasury team resorted to double and triple counting of early spending increases in order to give the impression that they were doing more for hospitals, schools and transport than they actually were. As Marr has pointed out, …

… In trying to achieve better policing, more effective planning, healthier school food, prettier town centres and a hundred other hopes, the centre of government ordered and cajoled, hassled and harangued, always high-minded, always speaking for ‘the people’.  

The railways, after yet another disaster, were shaken up again. In very controversial circumstances Railtrack, the once-profitable monopoly company operating the lines, was driven to bankruptcy and a new system of Whitehall control was imposed. At one point, Tony Blair boasted of having five hundred targets for the public sector. Parish councils, small businesses and charities found that they were loaded with directives. Schools and hospitals had many more. Marr has commented, …

The interference was always well-meant but it clogged up the arteries of free decision-taking and frustrated responsible public life. 


Throughout the New Labour years, with steady growth and low inflation, most of the country grew richer. Growth since 1997, at 2.8 per cent per year, was above the post-war average; GDP per head was above that of France and Germany, and the country had the second lowest jobless figures in the EU. The number of people in work increased by 2.4 million. Incomes grew, in real terms, by about a fifth. Pensions were in trouble, but house price inflation soared, so owners found their properties more than doubling in value and came to think of themselves as prosperous. By 2006 analysts were assessing the disposable wealth of the British at forty thousand pounds per household. However, the wealth was not spread evenly across the country, averaging sixty-eight thousand in the south-east of England but a little over thirty thousand in Wales and north-east England. Yet even in the historically poorer parts of the UK house prices had risen fast, so much so that government plans to bulldoze worthless northern terraces had to be abandoned when they started to regain value. Cheap mortgages, easy borrowing and high property prices meant that millions of people felt far better off, despite the overall rise in the tax burden. Cheap air travel gave the British opportunities for easy travel both to traditional resorts and to every part of the European continent. British expatriates were able to buy properties across the French countryside and in southern Spain. Some even began to commute weekly to jobs in London or Manchester from Mediterranean villas, and regional airports boomed as a result.

Sir Tim Berners-Lee arriving at the Guildhall to receive the Honorary Freedom of the City of London

The internet, and with it the 'World-Wide Web' 'invented' by the British computer scientist Tim Berners-Lee at the end of 1989 (pictured right in 2014), was advancing from the colleges and institutions into everyday life by the mid-'noughties'. It first began to attract popular interest in the mid-nineties: Britain's first internet café and magazine, reviewing a few hundred early websites, were both launched in 1994. The following year saw the beginning of internet shopping as a major pastime, with both 'eBay' and 'Amazon' arriving, though to begin with they only attracted tiny numbers of people.

But the introduction of new forms of mail-order and 'click and collect' shopping quickly attracted significant numbers of adherents from different 'demographics'. The growth of the internet fed a feeling of optimism, despite warnings, taken seriously at the time, that the whole digital world would collapse because computers would be unable to cope with the change of the last two digits of the year in 2000. In fact, the 'dot-com' bubble was burst by its own excessive expansion, as with any bubble, and following a pause and a lot of ruined dreams, the 'new economy' roared on again. By 2000, according to the Office for National Statistics (ONS), around forty per cent of Britons had accessed the internet at some time. Three years later, nearly half of British homes were 'online'. By 2004, the spread of 'broadband' connections had brought a new mass market in 'downloading' music and video. By 2006, three-quarters of British children had internet access at home.


Simultaneously, the rich of America, Europe and Russia began buying up parts of London, and then other 'attractive' parts of the country, including Edinburgh, the Scottish Highlands, Yorkshire and Cornwall. 'Executive housing' with pebbled driveways, brick facing and dormer windows was spreading across farmland and along rivers with little thought for flood-plain constraints. Parts of the country far from London, such as the English south-west and Yorkshire, enjoyed a ripple of wealth that pushed their house prices to unheard-of levels. From Leith to Gateshead, Belfast to Cardiff Bay, once-derelict shorefront areas were transformed. The nineteenth-century buildings of the Albert Dock in Liverpool now house a maritime museum, an art gallery, a shopping centre and a television studio, and the dock has become a tourist attraction in its own right. For all the problems and disappointments, and the longer-term problems with their financing, new schools and public buildings sprang up – new museums, galleries, vast shopping complexes, corporate headquarters in a biomorphic architecture of glass and steel, more imaginative and better-looking than their predecessors from the dreary age of concrete.


Supermarket chains exercised huge market power, bringing cheap meat and dairy products within almost everyone's budget. Factory-made ready-meals were imported and distributed through the new global air-freight market and by refrigerated lorries moving freely across a Europe shorn of internal barriers. Out-of-season fruit and vegetables, fish from the Pacific, exotic foods of all kinds and freshly cut flowers appeared in superstores everywhere. Hardly anyone was out of reach of a 'Tesco', a 'Morrisons', a 'Sainsbury's' or an 'Asda'. By the mid-noughties, the four supermarket giants owned more than 1,500 superstores throughout the UK. They spread the consumption of goods that in the eighties and nineties had seemed like luxuries. Students had to take out loans in order to go to university, but they were far more likely to do so than previous generations, as well as to travel more widely on a 'gap' year, not just to study or work abroad.

Those ‘Left Behind’ – Poverty, Pensions & Public Order:

Materially, for the majority of people, this was, to use Marr's term, a 'golden age', which perhaps helps to explain why real anger about earlier pension decisions and stealth taxes did not translate into anti-Labour voting in successive general elections. The irony is that in pleasing 'Middle Englanders', the Blair-Brown government lost contact with traditional Labour voters, especially in the North of Britain, who did not benefit from these 'golden years' to the same extent. Gordon Brown, from the first, made much of New Labour's anti-poverty agenda, and especially child poverty. Since the launch of the Child Poverty Action Group, this latter problem had become particularly emotive. Brown's emphasis was on the working poor and the virtue of work. So his major innovations were the national minimum wage, the 'New Deal' for the young unemployed and the working families' tax credit, as well as tax credits aimed at children. There was also a minimum income guarantee and, later, a pension credit for poorer pensioners.

The minimum wage was first set at three pounds sixty an hour, rising year by year; by 2006 it was five pounds thirty-five an hour. Because the figures were low, it did not destroy two million jobs, as the Tories had claimed it would. Neither did it produce higher inflation; employment continued to grow while inflation remained low, and it even seemed to have cut red tape. By the mid-noughties, the minimum wage covered two million people, the majority of them women. Because it was updated ahead of rises in inflation, the wages of the poor also rose faster. It was so successful that even the Tories were forced to embrace it ahead of the 2005 election. The New Deal was funded by a windfall tax on privatised utility companies, and by 2000 Blair said it had helped a quarter of a million young people back into work; it was still being claimed as a major factor in lower rates of unemployment as late as 2005. But the National Audit Office, looking back on its effect in the first parliament, reckoned the number of under-twenty-five-year-olds helped into real jobs was as low as 25,000, at a cost per person of eight thousand pounds. A second initiative was targeted at the babies and toddlers of the most deprived families. 'Sure Start' was meant to bring mothers together in family centres across Britain – 3,500 were planned for 2010, ten years after the scheme had been launched – and to help them to become more effective parents. However, some of the most deprived families failed to show up. As Andrew Marr wrote, back in 2007:

Poverty is hard to define, easy to smell. In a country like Britain, it is mostly relative. Though there are a few thousand people living rough or who genuinely do not have enough to keep them decently alive, and many more pensioners frightened of how they will pay for heating, the greater number of poor are those left behind the general material improvement in life. This is measured by income compared to the average and by this yardstick in 1997 there were three to four million children living in households of relative poverty, triple the number in 1979. This does not mean they were physically worse off than the children of the late seventies, since the country generally became much richer. But human happiness relates to how we see ourselves relative to those around us, so it was certainly real. 

The Tories, now under new management in the shape of a media-marketing executive and old Etonian, David Cameron, also declared that they believed in this concept of relative poverty. After all, it was on their watch, during the Thatcher and Major governments, that it had tripled, which is perhaps why it was only towards the end of the New Labour governments that they could accept the definition used by the left-of-centre Guardian columnist, Polly Toynbee. A world of 'black economy' work also remained below the minimum wage, in private care homes, where migrant servants were exploited, and in other nooks and crannies; some 336,000 jobs remained on 'poverty pay' rates. Yet 'redistribution of wealth' – a socialist phrase which had become unfashionable under New Labour lest it scare away middle Englanders – was stronger in Brown's Britain than in other major industrialised nations. Despite the growth of the super-rich, many of whom were immigrants anyway, overall equality increased in these years. One factor in this was the return to means-testing of benefits, particularly for pensioners and through the working families' tax credit, subsequently divided into a child tax credit and a working tax credit. This was a U-turn by Gordon Brown, who had opposed means-testing when in Opposition. As Chancellor, he concluded that if he was to direct scarce resources at those in real poverty, he had little choice.

Apart from the demoralising effect it had on pensioners, the other drawback to means-testing was that a huge bureaucracy was needed to track people’s earnings and to try to establish exactly what they should be getting in benefits. Billions were overpaid and as people did better and earned more from more stable employment, they then found themselves facing huge demands to hand back the money they had already spent. Thousands of extra civil servants were needed to deal with the subsequent complaints and the scheme became extremely expensive to administer. There were also controversial drives to oblige more disabled people back to work, and the ‘socially excluded’ were confronted by a range of initiatives designed to make them more middle class. Compared with Mrs Thatcher’s Victorian Values and Mr Major’s Back to Basics campaigns, Labour was supposed to be non-judgemental about individual behaviour. But a form of moralism did begin to reassert itself. Parenting classes were sometimes mandated through the courts and for the minority who made life hell for their neighbours on housing estates, Labour introduced the Anti-Social Behaviour Order (‘Asbo’). These were first given out in 1998, granted by magistrates to either the police or the local council. It became a criminal offence to break the curfew or other sanction, which could be highly specific. Asbos could be given out for swearing at others in the street, harassing passers-by, vandalism, making too much noise, graffiti, organising ‘raves’, flyposting, taking drugs, sniffing glue, joyriding, prostitution, hitting people and drinking in public.


Although they served a useful purpose in many cases, there were fears that for the really rough elements in society and their tough children they became a badge of honour. Since breaking an Asbo could result in an automatic prison sentence, people were sent to jail for behaviour that would not previously have warranted imprisonment. But as they were refined in use and strengthened, they became more effective and routine. By 2007, seven and a half thousand had been given out in England and Wales alone, and Scotland had introduced its own version in 2004. Some civil liberties campaigners saw this development as part of a wider authoritarian and surveillance agenda which also led to the widespread use of CCTV (closed-circuit television) cameras by the police and private security guards, especially in town centres. Also in 2007, it was estimated that the British were being observed and recorded by 4.2 million such cameras. That amounted to one camera for every fourteen people, a higher ratio than in any other country in the world, with the possible exception of China. In addition, the number of mobile phones was already equivalent to the number of people in Britain. With Global Positioning System (GPS) chips, these could show exactly where their users were, and the use of such systems in cars and even out on the moors meant that Britons were losing their age-old prowess for map-reading.


The ‘Seven Seven’ Bombings – The Home-grown ‘Jihadis’:

Despite these increasing means of mass surveillance, Britain's cities have remained vulnerable to terrorist attacks, more recently by so-called 'Islamic terrorists' rather than by the Provisional IRA, which abandoned its bombing campaign in 1998. On 7 July 2005, at rush-hour, four young Muslim men from West Yorkshire and Buckinghamshire murdered fifty-two people and injured 770 others by blowing themselves up on London Underground trains and on a London bus. The report into this, the worst such attack in Britain, later concluded that they were not part of an al Qaeda cell, though two of them had visited camps in Pakistan, and that the rucksack bombs had been constructed at a cost of a few hundred pounds. Despite the government's insistence that the war in Iraq had not made Britain more of a target for terrorism, the Home Office investigation asserted that the four had been motivated, in part at least, by 'British foreign policy'.

They had picked up the information they needed for the attack from the internet. It was a particularly grotesque attack because of the terrifying and bloody conditions in the underground tunnels, and it vividly reminded the country that it was as much a target as the United States or Spain. Indeed, the long-standing and intimate relationship between Great Britain and Pakistan, with constant and heavy air traffic between them, provoked fears that the British would prove uniquely vulnerable. Tony Blair heard of the attack at the most poignant time, just after London's great success in winning the bid to host the 2012 Olympic Games. The 'Seven Seven' bombings are unlikely to have been stopped by CCTV surveillance, of which there was plenty at the tube stations, nor by ID cards (which had recently been under discussion), since the killers were British subjects, nor by financial surveillance, since little money was involved and the materials were paid for in cash. Even better intelligence might have helped, but the Security Services, 'MI5' and 'MI6' as they are known, were already in receipt of huge increases in their budgets as they tracked down other murderous cells. In August 2006, police arrested suspects in Birmingham, High Wycombe and Walthamstow, in east London, believing there was a plot to blow up as many as ten passenger aircraft over the Atlantic.

After many years of granting asylum in London to dissident clerics and activists from the Middle East, Britain had more than its share of inflammatory and dangerous extremists who admired al Qaeda and preached violent jihad. Once 11 September 2001 had changed the climate, new laws were introduced to allow the detention without trial of foreigners suspected of supporting or fomenting terrorism. They could not be deported because human rights legislation forbade sending anyone back to countries where they might face torture. Seventeen were picked up and held at Belmarsh high-security prison. But in December 2004, the House of Lords ruled that these detentions were discriminatory and disproportionate, and therefore illegal. Five weeks later, the Home Secretary, Charles Clarke, hit back with 'control orders' to limit the movement of men he could not prosecute or deport. These orders would also be used against home-grown terror suspects. A month later, in February 2005, sixty Labour MPs rebelled against these powers too, and the government only narrowly survived the vote. In April 2006 a judge ruled that the control orders were an affront to justice because they gave the Home Secretary, a politician, too much power. Two months later, the same judge ruled that curfew orders of eighteen hours per day imposed on six Iraqis were a deprivation of liberty and also illegal. The new Home Secretary, John Reid, lost his appeal and had to loosen the orders.


Britain found itself in a struggle between its old laws and liberties and a new, borderless world in which the hallowed principles of 'habeas corpus', free speech, the presumption of innocence, asylum, the right of British subjects to travel freely in their own country without identifying papers, and the sanctity of the homes of the law-abiding were all placed in increasing jeopardy. The new political powers seemed to government ministers the least they needed to deal with a threat that might last another thirty years – in order, paradoxically, to secure Britain's liberties for the long term. They were sure that most British people agreed, and that the judiciary, media, civil rights campaigners and elected politicians who protested were an ultra-liberal minority. Tony Blair, John Reid and Jack Straw were emphatic about this, and it was left to liberal Conservatives and the Liberal Democrats to mount the barricades in defence of civil liberties. Andrew Marr conceded at the time that the New Labour ministers were 'probably right'; with the benefit of hindsight, others will probably agree. As Gordon Brown eyed the premiership, his rhetoric was similarly tough, but as Blair was forced to turn to the 'war on terror' and Iraq, he failed to concentrate enough on domestic policy. By 2005, neither of them could be bothered to disguise their mutual enmity. A gap seemed to open up between Blair's enthusiasm for market ideas in the reform of health and schools and Brown's determination to deliver better lives for the working poor. Brown was also keen on bringing private capital into public services, but there was a difference in emphasis which both men played up. Blair claimed that the New Labour government was best when we are at our boldest. But Brown retorted that it was best when we are Labour.


Tony Blair's legacy continued to be paraded on the streets of Britain, here blaming him and George Bush for the rise of 'Islamic State' in Iraq.

Asylum Seekers, EU ‘Guest’ Workers & Immigrants:

One result of the long Iraqi conflict, in which President Bush declared major combat operations to be over on 1 May 2003, was the arrival of many Iraqi asylum-seekers in Britain: Kurds as well as Shiites and Sunnis. This attracted little comment at the time because there had been both Iraqi and Iranian refugees in Britain since the 1970s, especially as students, and the fresh influx was only a small part of a much larger migration into the country which changed it fundamentally during the Blair years. This was a multi-lingual migration, including many Poles, some Hungarians and other Eastern Europeans whose countries had joined the EU and its single market in 2004. When the EU expanded, Britain decided that, unlike France or Germany, it would not try to delay opening the country to migrant workers. The accession treaties gave nationals from these countries the right to freedom of movement and settlement, and with average earnings three times higher in the UK, this was a benefit which the Eastern Europeans were keen to take advantage of. Some member states, however, exercised their right to 'derogation' from the treaties, whereby they would only permit migrant workers to be employed if employers were unable to find a local candidate. In terms of European Union legislation, a derogation means that a member state has opted not to enforce a specific provision in a treaty due to internal circumstances (typically a state of emergency), or to delay full implementation of the treaty for five years. The UK decided not to exercise this option.

There were also sizeable inflows of western Europeans, though these were mostly students, who (somewhat controversially) were also counted in the immigration statistics, and young professionals with multi-national companies. At the same time, there was continued immigration from Africa, the Middle East and Afghanistan, as well as from Russia, Australia, South Africa and North America. In 2005, according to the Office for National Statistics, ‘immigrants’ were arriving to live in Britain at the rate of 1,500 a day. Since Tony Blair had been in power, more than 1.3 million had arrived. By the mid-2000s, English was no longer the first language of half the primary school children in London, and the capital had more than 350 different first languages. Five years later, the same could be said of many towns in Kent and other Eastern counties of England.

The poorer of the new migrant groups were almost entirely unrepresented in politics, but they radically changed the sights, sounds and scents of urban Britain, and even of some of its market towns. The veiled women of the Muslim world, or of its more traditionalist Arab, Afghan and Pakistani quarters, became common sights on the streets, from Kent to Scotland and across to South Wales. Polish tradesmen, fruit-pickers and factory workers were soon followed by shops owned by Poles, or stocking Polish and East European delicacies and selling Polish newspapers and magazines. Even road signs appeared in Polish, though in Kent these were mainly put in place along trucking routes used by Polish drivers, where for many years signs had been in French and German, a recognition of the employment changes in the long-distance haulage industry. Even as far north as Cheshire, these were put in place to help monolingual truckers using trunk roads rather than local Polish residents, most of whom had enough English to understand such signs either upon arrival or shortly afterwards. Although specialist classes in English had to be laid on in schools and community centres, there was little evidence that multi-lingual migrants had a lasting negative impact on local children and wider communities. In fact, schools were soon reporting a positive impact in terms of attitudes toward learning and improvements in general educational standards.


Problems were posed, however, by the operations of people-smugglers and criminal gangs. Chinese villagers were involved in one particular tragedy when nineteen of them, picking cockles in Morecambe Bay, were caught by the notorious tides and drowned. Many more were working for 'gang-masters' as virtual, and in some cases actual, 'slaves'. Russian voices became common on the London Underground, and among prostitutes on the streets. The British Isles found themselves to be 'islands in the stream' of international migration, the chosen 'sceptred isle' destinations of millions of newcomers. Unlike Germany, Britain was no longer a dominant manufacturing country but had become, by the late twentieth century, a popular place to develop digital and financial products and services. Together with the United States and against the Soviet Union, it had been determined to preserve a system of representative democracy and the free market. Within the EU, Britain maintained its earlier determination to resist the Franco-German federalist model, with its 'social chapter' involving ever tighter controls over international corporations and ever closer political union. Britain had always gone out into the world. Now, increasingly, the world came to Britain, whether in the form of poor immigrants, rich corporations or Chinese manufacturers.


Multilingual & Multicultural Britain:

Immigration had always been a constant factor in British life; now it was also a fact of life which Europe and the whole world had to come to terms with. Earlier post-war migrations to Britain had provoked a racialist backlash, riots, the rise of extreme right-wing organisations and a series of new laws aimed at controlling both immigration from the Commonwealth and the backlash to it. The later migrations were controversial in different ways. The 'Windrush' arrivals from the Caribbean and those from the Indian subcontinent were people who looked different but who spoke the same language and in many ways had had a similar education to that of the 'native' British. Many of the later migrants from Eastern Europe looked similar to the white British but shared little by way of a common linguistic and cultural background. However, it is not entirely true to suggest, as Andrew Marr seems to, that they did not have a shared history. Certainly, through no fault of their own, the Eastern Europeans had been cut off from their western counterparts by their absorption into the Soviet Russian Empire after the Second World War, but in the first half of the century Poland had helped the British Empire to subdue its greatest rival, Germany, as had most of the peoples of the former Yugoslavia. Even during the Soviet 'occupation' of these countries, many of their citizens had found refuge in Britain.

Moreover, by the early 1990s, Britain had already become both a multilingual and a multicultural nation. In 1991, Safder Alladina and Viv Edwards published a book for the Longman Linguistics Library which detailed the Hungarian, Lithuanian, Polish, Ukrainian and Yiddish speech communities of previous generations. Growing up in Birmingham, I certainly heard many Polish, Yiddish, Yugoslav and Greek accents among my neighbours and the parents of school friends, at least as often as I heard Welsh, Irish, Caribbean, Indian and Pakistani accents. The Longman book begins with a foreword by Debi Prasanna Pattanayak in which she states that the Language Census of 1987 had shown that there were 172 different languages spoken by children in the schools of the Inner London Education Authority. In an interesting precursor of the controversy to come, she related how the reaction in many quarters was stunned disbelief, and how one British educationalist had told her that England had become a third world country. She commented:

After believing in the supremacy of English as the universal language, it was difficult to acknowledge that the UK was now one of the greatest immigrant nations of the modern world. It was also hard to see that the current plurality is based on a continuity of heritage. … Britain is on the crossroads. It can take an isolationist stance in relation to its internal cultural environment. It can create a resilient society by trusting its citizens to be British not only in political but in cultural terms. The first road will mean severing dialogue with the many heritages which have made the country fertile. The second road would be working together with cultural harmony for the betterment of the country. Sharing and participation would ensure not only political but cultural democracy. The choice is between mediocrity and creativity.


Language and dialect in the British Isles, showing the linguistic diversity in many English cities by 1991 as a result of Commonwealth immigration as well as the survival and revival of many of the older Celtic languages and dialects of English.

Such 'liberal', 'multi-cultural' views may be unfashionable now, more than a quarter of a century later, but it is perhaps worth stopping to look back on that cultural crossroads, and on whether we are now back at that same crossroads or have arrived at another one. By the 1990s, the multilingual setting in which new Englishes evolved had become far more diverse than it had been in the 1940s, due to immigration from the Indian subcontinent, the Caribbean, the Far East, and West and East Africa. The largest of the 'community languages' was Punjabi, with over half a million speakers, but there were also substantial communities of Gujarati speakers (perhaps a third of a million) and around a hundred thousand Bengali speakers. In some areas, such as East London, public signs and notices recognise this. Bengali-speaking children formed the most recent and largest linguistic minority within the ILEA, and because the majority of them had been born in Bangladesh, they were inevitably in the greatest need of language support within the schools. A new level of linguistic and cultural diversity had thus been introduced through Commonwealth immigration.



Birmingham's booming post-war economy attracted West Indian settlers from Jamaica, Barbados and St Kitts in the 1950s. By 1971, the South Asian and West Indian populations were equal in size and concentrated in the inner-city wards of north and central Birmingham. After the hostility shown towards New Commonwealth immigrants by some sections of the local white population in the 1960s and '70s, the settlers became more established in cities like Birmingham, where places of worship, ethnic groceries, butchers and, perhaps most significantly, 'balti' restaurants began to proliferate in the 1980s and '90s. They materially changed the cultural and social life of the city, and most of the 'white' population came to believe that these changes were for the better. By 1991, Pakistanis had overtaken West Indians and Indians to become the largest single ethnic minority in Birmingham. The concentration of West Indian and South Asian British people in the inner-city areas had changed little by the end of the century, though there was an evident flight to the suburbs by Indians. As well as being poorly paid, the factory work available to South Asian immigrants, like the man in a Bradford textile factory pictured below, was unskilled. By the early nineties, the decline of the textile industry over the previous two decades had led to high long-term unemployment in the immigrant communities of the northern towns, creating serious social problems.


Nor is it entirely true to suggest that, as referred to above, Caribbean arrivals in Britain faced few linguistic obstacles integrating themselves into British life from the late 1940s to the late 1980s. By the end of these forty years, the British West Indian community had developed its own “patois”, which had a special place as a token of identity. One Jamaican schoolgirl living in London in the late eighties explained the social pressures that frowned on Jamaican English in Jamaica, but which made it almost obligatory in London. She wasn’t allowed to speak Jamaican Creole in front of her parents in Jamaica. When she arrived in Britain and went to school, she naturally tried to fit in by speaking the same patois, but some of her British Caribbean classmates told her that, as a “foreigner”, she should not try to be like them, and should speak only English. But she persevered with the patois and lost her British accent after a year and was accepted by her classmates. But for many Caribbean visitors to Britain, the patois of Brixton and Notting Hill was a stylized form that was not truly Jamaican, not least because British West Indians had come from all parts of the Caribbean. When another British West Indian girl, born in Britain, was taken to visit Jamaica, she found herself being teased about her London patois and told to speak English.


The predicament that still faced the 'Black British' in the late eighties and into the nineties was that, for all the rhetoric, they were still not fully accepted by the established 'White community'. Racism was still an everyday reality for large numbers of British people, and there was plenty of evidence of the ways in which Black people were systematically denied access to employment in all sections of the job market. The fact that a racist calamity like the murder in London of the black teenager Stephen Lawrence could happen in 1993 was testimony to how little had changed since the 1950s in British society's willingness to face up to racism. As a result, the British-Caribbean population could still not feel itself to be fully British. This was the poignant outcome of what the British Black writer Caryl Phillips has called "The Final Passage", the title of his novel, which is narrated in Standard English with the direct speech of the characters rendered in Creole. Phillips migrated to Britain as a baby with his parents in the 1950s, and sums up his linguistic and cultural experience as follows:

"The paradox of my situation is that where most immigrants have to learn a new language, Caribbean immigrants have to learn a new form of the same language. It induces linguistic schizophrenia – you have an identity that mirrors the larger cultural confusion."

One of his older characters in The Final Passage characterises "England" as a "college for the West Indian", and, as Phillips himself put it, that is "symptomatic of the colonial situation; the language is divided as well". As the "Windrush Scandal", involving the deportation of British West Indians from the UK, has recently shown, this post-colonial "cultural confusion" still 'colours' political and institutional attitudes twenty-five years after the death of Stephen Lawrence, leading to discriminatory judgements by officials. This example shows how difficult it is to arrive at any neat chronological classification of migrations to Britain into the period of economic expansion of the 1950s and 1960s; the asylum-seekers of the 1970s and 1980s; and the era of EU expansion and integration in the 1990s and the first decades of the 2000s. Such an approach assumes stereotypical patterns of settlement for the different groups, whereas the reality was much more diverse. Most South Asians, for example, arrived in Britain in the post-war period, but they were joining a migration 'chain' which had been established at the beginning of the twentieth century. Similarly, most Eastern European migrants arrived in Britain in several quite distinct waves of population movement. This led the authors of the Longman Linguistics book to organise it by geolinguistic area rather than by period of arrival.


The Poles and Ukrainians of the immediate post-war period, the Hungarians in the 1950s, the Vietnamese in the 1970s and the Tamils in the 1980s sought asylum in Britain as refugees. In contrast, settlers from India, Pakistan, Bangladesh and the Caribbean had, in the main, come from areas of high unemployment and/or low wages, for economic reasons. It was not possible, even then, to make a simple split between political and economic migrants since, even within the same group, motivations differed over time. The Eastern Europeans who had arrived in Britain since the Second World War had come for a variety of reasons; in many cases, they were joining earlier settlers, trying either to escape poverty in the home country or to better their lot. A further important factor in the discussion about the various minority communities in Britain was the pattern of settlement. Some groups were concentrated in a relatively small geographical area, which made it possible to develop and maintain strong social networks; others were more dispersed and so found it more difficult to maintain a sense of community. Most Spaniards, Turks and Greeks were found in London, whereas Ukrainians and Poles were scattered throughout the country. In the case of the Poles, the communities outside London were sufficiently large to sustain an active community life; in the case of the Ukrainians, however, the small numbers and the dispersed nature of the community made the task of forging a separate linguistic and cultural identity a great deal more difficult.

Groups who had little contact with the home country also faced very real difficulties in retaining their distinct identities. Until 1992, Lithuanians, Latvians, Ukrainians and Estonians were unable to travel freely to their countries of origin, nor could they receive visits from family members left behind; until the mid-noughties, there was no possibility of new immigration which might have revitalised these communities in Britain. Nonetheless, they showed great resilience in maintaining their ethnic minority identities, not only through community involvement in the UK but by building links with similar groups in Europe and even in North America. The inevitable consequence of settlement in Britain was a shift from the mother tongue to English. The extent of this shift varied according to individual factors such as the degree of identification with the mother-tongue culture; it also depended on group factors such as the size of the community, its degree of self-organisation and the length of time it had been established in Britain. For more recently arrived communities such as the Bangladeshis, the acquisition of English was clearly a more urgent priority than the maintenance of the mother tongue, whereas, for the settled Eastern Europeans, the shift to English was so complete that mother-tongue teaching was often the more urgent community priority. There were reports of British-born Ukrainians and Yiddish-speaking Jews, brought up in predominantly English-speaking homes, who were striving to produce an environment in which their children could acquire their 'heritage' language.

Blair’s Open Door Policy & EU Freedom of Movement:

During the 1980s and '90s, under the 'rubric' of multiculturalism, a steady stream of immigration into Britain continued, especially from the Indian subcontinent. But an unspoken consensus existed whereby immigration, while always gradually increasing, was controlled. What happened after the Labour Party's landslide victory in 1997 was a breaking of that consensus, according to Douglas Murray, the author of the recent book The Strange Death of Europe (2017). He argues that once in power, Tony Blair's government oversaw an opening of the borders on a scale unparalleled even in the post-war decades. His government abolished the 'primary purpose rule', which had been used to filter out bogus marriage applications. The borders were opened to anyone deemed essential to the British economy, a definition so broad that it included restaurant workers as 'skilled labourers'. And as well as opening the door to the rest of the world, they opened the door to the new EU member states after 2004. It was the effects of all of this, and more, that created the picture of the country eventually revealed in the 2011 Census, published at the end of 2012.


The numbers of non-EU nationals moving to settle in Britain were expected to increase only from 100,000 a year in 1997 to 170,000 in 2004. In fact, the government's predictions for the number of new arrivals over the five years 1999-2004 were out by almost a million people. It also failed to anticipate that the UK might be an attractive destination for people from countries with significantly lower average incomes or without a minimum wage. For these reasons, the number of Eastern European migrants living in Britain rose from 170,000 in 2004 to 1.24 million in 2013. Whether the surge in migration went unnoticed or was officially approved, successive governments did not attempt to restrict it until after the 2015 election, by which time it was too late.

(to be continued)

Posted January 15, 2019 by AngloMagyarMedia in Affluence, Africa, Arabs, Assimilation, asylum seekers, Belfast, Birmingham, Black Market, Britain, British history, Britons, Bulgaria, Calais, Caribbean, Celtic, Celts, Child Welfare, Cold War, Colonisation, Commonwealth, Communism, Compromise, Conservative Party, decolonisation, democracy, Demography, Discourse Analysis, Domesticity, Economics, Education, Empire, English Language, Europe, European Economic Community, European Union, Factories, History, Home Counties, Humanism, Humanitarianism, Hungary, Immigration, Imperialism, India, Integration, Iraq, Ireland, Journalism, Labour Party, liberal democracy, liberalism, Linguistics, manufacturing, Margaret Thatcher, Midlands, Migration, Militancy, multiculturalism, multilingualism, Music, Mythology, Narrative, National Health Service (NHS), New Labour, Old English, Population, Poverty, privatization, Racism, Refugees, Respectability, Scotland, Socialist, south Wales, terror, terrorism, Thatcherism, Unemployment, United Kingdom, United Nations, Victorian, Wales, Welsh language, xenophobia, Yugoslavia


Britain, Ireland and Europe, 1994-99: Peace, Devolution & Development.   Leave a comment


Unionists & Nationalists – The Shape of Things to Come:

In Northern Ireland, optimism was the only real force behind the peace process. Too often, it is remembered only for one of Blair's greatest soundbites, delivered as the talks reached their climax: This is no time for soundbites … I feel the hand of history on my shoulder. Despite the comic nature of this remark, it would be churlish not to acknowledge the eventual agreement as one of his greatest achievements. The tenacious efforts of John Major to bring Republicans and Unionists to the table had ended in stalemate, and Tony Blair had already decided in Opposition that an Irish peace settlement would be one of his top priorities in government. He made the province his first visit after winning power and focused Number Ten on the negotiations as soon as the IRA, sensing a fresh opportunity, announced a further ceasefire. In Mo Mowlam, Blair's brave new Northern Ireland Secretary, he had someone who was prepared to be tough in negotiations with the Unionists and encouraging towards Sinn Feiners in order to secure a deal. Not surprisingly, the Ulster Unionist politicians soon found her to be too much of a 'Green'. She concentrated her charm and bullying on the Republicans, while a Number Ten team dealt with the Unionists. Blair emphasised his familial links with Unionism in order to win their trust.


There were also direct talks between the Northern Irish political parties, aimed at producing a return of power-sharing in the form of an assembly in which they could all sit. These were chaired by the former US Senator George Mitchell and were the toughest part. There were also talks between the Northern Irish parties and the British and Irish governments about the border and the future constitutional position of Northern Ireland. Finally, there were direct talks between London and Dublin on the wider constitutional and security settlement. This tripartite process was long and intensely difficult for all concerned; it appeared to have broken down at numerous points and was kept going mainly thanks to Blair himself. He took big personal risks, such as when he invited Gerry Adams and Martin McGuinness of Sinn Fein-IRA to Downing Street. Some in the Northern Ireland Office still believe that Blair gave too much away to the Republicans, particularly over the release of terrorist prisoners and the amnesty which indemnified known terrorists, like those responsible for the Birmingham bombings of 1974, from prosecution. At one point, when talks had broken down again over these issues, Mo Mowlam made the astonishing personal decision to go into the notorious Maze prison herself and talk to both Republican and Loyalist terrorist prisoners. Hiding behind their politicians, the hard men still saw themselves as being in charge of their 'sides' in the sectarian conflict. But Blair spent most of his time trying to keep the constitutional Unionists 'on board', having moved Labour policy away from support for Irish unification. In Washington, indeed, Blair was seen as being too Unionist.

005

Given a deadline of Easter 1998, a deal was finally struck, just in time, on Good Friday, hence the alternative name of 'the Belfast Agreement'. Northern Ireland would stay part of the United Kingdom for as long as the majority in the province wished it so. The Republic of Ireland would give up its territorial claim to the North, amending its constitution to this effect. The parties would combine in a power-sharing executive, based on a newly elected assembly. There would also be a North-South body knitting the two political parts of the island together for various practical purposes and mundane matters. The paramilitary organisations would surrender or destroy their weapons, monitored by an independent body. Prisoners would be released and the policing of Northern Ireland would be made non-sectarian by the setting up of a new police force to replace the Royal Ulster Constabulary (RUC), whose bias towards the Unionist community had long been a sore point for Nationalists. The deal involved a great deal of pain, particularly for the Unionists. It was only the start of a true peace and would be threatened frequently afterwards, as when, only a few months after its signing, the centre of Omagh was bombed by a renegade splinter group of the IRA calling itself 'the Real IRA' (see the photo below), murdering twenty-nine people and injuring two hundred. Yet this time the violent extremists were unable to stop the rest from talking.

004 (2)

Once the agreement had been ratified on both sides of the border, the decommissioning of arms proved a seemingly endless and wearisome game of bluff. Though the two leaders of the moderate parties in Northern Ireland, David Trimble of the Ulster Unionists and John Hume of the Nationalist SDLP, won the Nobel Prize for Peace, both these parties were soon replaced in elections by the harder-line Democratic Unionist Party led by Rev. Dr Ian Paisley, and by Sinn Fein, under Adams and McGuinness. Initially, this made it harder to set up an effective power-sharing executive at Stormont (pictured below). Yet to almost everyone’s surprise, Paisley and McGuinness sat down together and formed a good working relationship. The thuggery and crime attendant on years of paramilitary activity took another decade to disappear. Yet because of the agreement hundreds more people are still alive who would have died had the ‘troubles’ continued. They are living in relatively peaceful times. Investment has returned and Belfast has been transformed into a busier, more confident city. Large businesses increasingly work on an all-Ireland basis, despite the continued existence of two currencies and a border. The fact that both territories are within the European Union enables this to happen without friction at present, though this may change when the UK leaves the EU and the Republic becomes a ‘foreign country’ to it for the first time since the Norman Conquest. Tony Blair can take a sizeable slice of credit for this agreement. As one of his biographers has written:

He was exploring his own ability to take a deep-seated problem and deal with it. It was a life-changing experience for him.

003

If the Good Friday Agreement changed the future relationship of the UK and Ireland, Scottish and Welsh devolution changed the future political shape of Great Britain. The relative indifference of the eighteen-year Tory ascendancy to the plight of the industrial areas of Scotland and Wales had transformed the prospects of the nationalist parties in both countries. Through the years of Tory rule, the case for a Scottish parliament had been bubbling under north of the border. Margaret Thatcher had been viewed as a conspicuously English figure imposing harsh economic penalties on Scotland, which had always considered itself to be inherently more egalitarian and democratic. The Tories, who had successfully played the Scottish card against centralising Labour in 1951, had themselves become labelled as a centralising and purely English party. Local government had already been reorganised in Britain and Northern Ireland in the early 1990s with the introduction of ‘unitary’ authorities.

002

Scotland had a public culture further to the left than that of southern England, and the initiatives on devolution therefore came from the respectable middle classes. A group of pro-devolution activists – SNP, Labour and Liberal supporters, churchmen, former civil servants and trade unionists – came together to found the Campaign for a Scottish Assembly. In due course, this produced a Constitutional Convention meant to bring a wider cross-section of Scottish life in behind their 'Claim of Right'. It argued that if Scots were to stand on their own two feet as Mrs Thatcher had insisted, they needed control over their own affairs. Momentum increased when the Scottish Tories lost half their remaining seats in the 1987 election, and, following the poll tax rebellion, the Convention got going in March 1989, after Donald Dewar, Labour's leader in Scotland, decided to work with other parties. The Convention brought together the vast majority of Scottish MPs, all but two of Scotland's regional, district and island councils, the trade unions, churches, charities and many other organisations – in fact, almost everyone except the Conservatives, who were sticking with the original Union, and the SNP, who wanted full independence.

Scottish Tories, finding themselves increasingly isolated, fought back vainly. They pointed out that if a Tory government, based on English votes, was regarded as illegitimate by the Scots, then in future a Labour government based on Scottish votes might be regarded as illegitimate by the English. In a 1992 poll in Scotland, fifty per cent of those asked said they were in favour of independence within the European Union. In the 1992 election, John Major had made an impassioned appeal for the survival of the Union. Had the four countries never come together, he argued, their joint history would never have been as great: Are we, in our generation, to throw all that away?  He won back a single Scottish seat. Various minor sops were offered to the Scots during his years in office, including the return of the Stone of Destiny, with much ceremony. However, the minor Tory recovery of 1992 was wiped out in the Labour landslide of 1997, when all the Conservative seats north of the border, where the party had once held a majority of them, were lost, as they were in Wales. Formerly just contestants in middle-class, rural and intellectual constituencies, Scottish and Welsh nationalists now made huge inroads into former Conservative areas, and even into the Labour heartlands, the latter despite the Labour leadership having been held consecutively by a Welshman and a Scot.

By the time Tony Blair became the party leader, Labour's commitment to devolution was long-standing. Unlike his predecessor, he was not much interested in devolution or impressed by it, particularly not for Wales, where support had been far more muted. The only thing he could do by this stage was to insist that a Scottish Parliament and Welsh Assembly would only be set up after referenda in the two countries, which in Scotland's case would include a second question as to whether the parliament should be given the power to vary the rate of income tax by 3p in the pound. In September 1997, Scotland voted by three to one for the new Parliament, and by nearly two to one to give it tax-varying powers. The vote for the Welsh Assembly was far closer, with a wafer-thin majority secured by the final constituency to declare, that of Carmarthen. The Edinburgh parliament would have clearly defined authority over a wide range of public services – education, health, welfare, local government, transport and housing – while Westminster kept control over taxation, defence, foreign affairs and some lesser matters. The Welsh assembly in Cardiff would have fewer powers and no tax-raising powers. The Republic of Ireland was similarly divided between two regional assemblies but, unlike the assemblies in the UK, these were not directly elected.

In 1999, therefore, devolved governments, with varying powers, were introduced in Scotland, Wales and, following the ratification referendum on the Belfast Agreement, in Northern Ireland. After nearly three hundred years, Scotland got its parliament with 129 MSPs, and Wales got its assembly with sixty members. Both were elected by proportional representation, making coalition governments almost inevitable. In Scotland, Labour provided the first 'first minister' in Donald Dewar, a much-loved intellectual, who took charge of a small group of Labour and Liberal Democrat ministers. To begin with, Scotland was governed from the Church of Scotland's general assembly buildings. The devolution promised by John Smith and instituted by Tony Blair's new Labour government in the late 1990s did, initially, seem to take some of the momentum out of nationalist fervour, but apparently at the expense of stoking the fires of English nationalism, resentful that Scotland and Wales were now represented both in their own assemblies and at Westminster. Yet there was no early crisis at Westminster over the unfairness of Scottish and Welsh MPs being able to vote on England-only business, the so-called West Lothian Question, even though the cabinet was so dominated by Scots. Despite these unresolved issues, the historic constitutional changes brought about by devolution and the Irish peace process reshaped both Britain and Ireland, producing irrevocable results. In his television series A History of Britain, first broadcast on the BBC in 2000, Simon Schama argued that…

Histories of Modern Britain these days come not to praise it but to bury it, celebrating the denationalization of Britain, urging on the dissolution of ‘Ukania’ into the constituent European nationalities of Scotland, Wales and England (which would probably tell the Ulster Irish either to absorb themselves into a single European Ireland or to find a home somewhere else – say the Isle of Man). If the colossal asset of the empire allowed Britain, in the nineteenth and early twentieth century, to exist as a genuine national community ruled by Welsh, Irish and (astonishingly often) Scots, both in Downing Street and in the remote corners of the empire, the end of that imperial enterprise, the theory goes, ought also to mean the decent, orderly liquidation of Britannia Inc. The old thing never meant anything anyway, it is argued; it was just a spurious invention designed to seduce the Celts into swallowing English domination where once they had been coerced into it, and to persuade the English themselves that they would be deeply adored on the grouse moors of the Trossachs as in the apple orchards of the Weald. The virtue of Britain’s fall from imperial grace, the necessity of its European membership if only to avoid servility to the United States, is that it forces ‘the isles’ to face the truth: that they are many nations, not one.

However, in such a reduction of false British national consciousness to the 'true' identities and entities of Scotland, Wales and England, he argued, self-determination could go beyond the 'sub-nations', each of which was just as much an invention, or a re-invention, as was Britain. An independent Scotland would therefore not be able to resist the rights to autonomy of the Orkney and Shetland islands, with their Nordic heritage, or of the remaining Gaelic-speaking isles of the Outer Hebrides. Similarly, the still primarily Anglophone urban south-Walians and the inhabitants of the Welsh borders and the south coast of Pembrokeshire might in future wish to assert their linguistic and cultural differences from the Welsh-speakers of rural West and North Wales. With the revival of their Celtic culture, the Cornish might also wish to seek devolution from a country from which all the other Celts had retreated into their ethnolinguistic heartlands. Why shouldn't post-imperial Britain undergo a process of 'balkanization' like that of the former Yugoslavia?


Well, many like Schama seemed to answer at that time, and still do today: precisely because of what ethnonationalism had done in the Balkans, especially in Bosnia and Kosovo, where the conflicts were only just being brought to an end in 1999 by air strikes, amid tides of refugees escaping brutal ethnic cleansing. The breaking up of Britain into ever smaller and purer ethnic units was to be resisted. Instead, a multi-national, multi-ethnic and multi-cultural Britain was coming into being through a gradual and peaceful process of devolution of power to the various national, ethnic and regional groups and a more equal re-integration of them into a 'mongrel' British nation within a renewed United Kingdom.

Economic Development, the Regions of Britain & Ireland and the Impact of the EU:

004

The late twentieth century saw the transformation of the former docklands of London into offices and fashionable modern residential developments, with a new focus on the huge Canary Wharf scheme (pictured above) to the east of the city. The migration of some financial services and much of the national press to the major new developments in London’s Docklands prompted the development of the Docklands Light Railway and the Jubilee line extension. The accompanying modernisation of the London Underground was hugely expensive in legal fees and hugely complex in contracts. Outside of London, improvements in public transport networks were largely confined to urban and suburban centres with light railway networks developed in Manchester, Sheffield and Croydon.

Beyond Canary Wharf to the east, the Millennium Dome, which Blair's government inherited from the Tories, was a billion-pound gamble which Peter Mandelson and 'Tony's cronies' decided to push ahead with, despite cabinet opposition. Architecturally, the Dome was striking and elegant, a landmark for London which can be seen from the air by passengers arriving in the capital. The millennium was certainly worth celebrating, but the conundrum ministers and their advisers faced was what to put in their 'pleasure' Dome. It would be magnificent, unique, a tribute to British daring and 'can-do'. Blair himself said that it would provide the first paragraph of his next election manifesto. But this did not answer the awkward question of what it was for, exactly. When the Dome finally opened at New Year, the Queen, the Prime Minister and assembled celebrities were treated to a mish-mash of a show which embarrassed many of them. When it opened to the public, the range of mildly interesting exhibits was greeted as a huge disappointment. Optimism and daring, it seemed, were not enough to fill the people's expectations. Later that year, Londoners were given a greater gift in the form of a mayor and regional assembly with powers over local planning and transport. This new authority in part replaced the Greater London Council, abolished by the Thatcher government in 1986.

However, there were no signs that the other conurbations in the regions of England wanted regionalisation, except for some stirrings in the Northeast and Cornwall. The creation of nine Regional Development Agencies in England in 1998-99 did not seek to meet a regionalist agenda. In fact, these new bodies to a large extent matched the existing structures set up since the 1960s for administrative convenience and to encourage inward investment. Improving transport links was seen as an important means of stimulating regional development and combating congestion. Major road developments included the completion of the M25 orbital motorway around London in 1986 and of the M40 link between London and Birmingham in 1991. However, despite this construction programme, congestion remained a problem: the M25, for example, became the butt of jokes labelling it the largest car park on the planet, while traffic speeds in central London continued to fall, reaching fifteen kilometres per hour by 1997, about the same as they had been in 1907. Congestion was not the only problem, however, as environmental protests led to much of the road-building programme begun by the Tory governments being shelved after 1997. The late nineties also saw the development of some of the most expensive urban motorways in Europe.

In the Scottish Highlands and Islands, the new Skye road bridge connected the Isle of Skye to the mainland. A group led by the Bank of America built and ran the new bridge. It was one of the first projects built under a 'private finance initiative', or PFI, an idea which had started life under the Tory Chancellor Norman Lamont five years before Labour came to power, when he experimented with privatising public projects and allowing private companies to run them, keeping the revenue. Although the basic idea was simple enough, it represented a major change in how government schemes worked, big enough to arouse worry even outside the tribes of political obsessives. There were outraged protests from some islanders about paying tolls to a private consortium and eventually the Scottish Executive bought the bridge back. At the opposite corner of the country, the Queen Elizabeth II road bridge was built joining Kent and Essex across the Thames at Dartford, easing congestion on both sides of the Dartford tunnel. It was the first bridge across the river in a new location for more than half a century and was run by a company called 'Le Crossing', successfully taking tolls from motorists.

006

Undoubtedly the most important transport development was the Channel Tunnel rail link to France, completed in 1994. It was highly symbolic of Britain's commitment to European integration, and by the end of the century millions of people and vehicles had travelled from London to Paris in under three hours. The town of Ashford in Kent was one of the major beneficiaries of the 'Chunnel' rail link, making use of the railway links running through the town. Its population grew by over ten per cent in the 1990s. By the end of that decade, the town had an international catchment area of some eighty-five million people within a single day's journey. This, and the opening of Ashford International railway station as the main terminal in the rail link to the continent, attracted a range of engineering, financial, distribution and manufacturing companies to the town. In addition to the fourteen business parks that were established in the town, new retail parks were opened. Four green-field sites were also opened on the outskirts of the town, including a science park owned by Trinity College, Cambridge. Ashford became closer in travelling time to Paris and Brussels than to Manchester and Liverpool, as can be seen on the map below. With its international rail link and its position at the hub of a huge motorway network, the town was well placed to become an integral part of a truly international transport system.

005

002

Modern-day affluence at the turn of the century was reflected in the variety of goods and services concentrated in shopping malls, by then often built on major roads outside towns and cities to make them accessible to the maximum number of people in a region.

Economic change was most dramatic in the Irish Republic, which enjoyed the highest growth rates in Europe in the 1990s. The so-called ‘Celtic Tiger’ economy boomed, aided by inward investment so that by the end of the decade GDP per capita had surpassed that of the UK. Dublin, which remained if anything more dominant than London as a capital city, flourished as a result of a strong growth in the service industries. Growth rates for the ‘new economy’ industries such as information and communications technology were among the highest in the world. Generous tax arrangements and the city’s growing reputation as a cultural centre meanwhile helped to encourage the development of Dublin’s ‘rockbroker belt’. Even agriculture in the Irish Republic, in decline in the early 1990s, still contributed nine per cent of Ireland’s GDP, three times the European average. In the west of Ireland, it was increasingly supplemented by the growth of tourism.

Nevertheless, while the expansion of Ireland's prosperity lessened the traditional east-west divide, it did not eliminate it. Low population density and a dispersed pattern of settlement were felt to make rail developments unsuitable. Consequently, Ireland's first integrated transport programme, the Operational Programme for Peripherality, concentrated on improving the routes from the west of Ireland to the ferry port of Rosslare; the routes from Belfast to Cork, Dublin and the southwest; and east-west routes across the Republic. Many of these improvements benefited from EU funding. The EU also aided, through its 'peace programme', the development of transport planning in Britain, with infrastructure projects in, for example, the Highlands and Islands of Scotland. In 1993, the EU had decided to create a combined European transport network. Of the fourteen projects associated with this aim, three were based in Britain and Ireland: a rail link from Cork to Larne in Northern Ireland, the ferry port for Scotland; a road link from the Low Countries across England and Wales to Ireland; and the West Coast mainline railway route in Britain.

The old north-south divide in Britain reasserted itself with a vengeance in the late 1990s as people moved south in search of jobs and prosperity, and as prices and wages rose. Even though the shift towards service industries was reducing regional economic diversity, the geographical distribution of the regions eligible for European structural funds for economic development reflected the continuing north-south divide. Transport was only one way in which the EU increasingly came to shape the geography of the British Isles in the nineties. It was, for example, a key factor in the creation of the new administrative regions of Britain and Ireland in 1999. At the same time, a number of British local authorities opened offices in Brussels for lobbying purposes, and attempts to maximise receipts from European structural funds also encouraged the articulation of regionalism. Cornwall, for instance, 'closed' its 'border' with Devon briefly in 1998 in protest at not receiving its EU social funds, while enthusiasm for the supposed economic benefits of 'independence in Europe' helped to explain the revival of the Scottish National Party following devolution. 'Silicon Glen' in central Scotland was, by the end of the decade, the largest producer of computing equipment in Europe.

The European connection was less welcome in other quarters, however. Fishermen, particularly in Devon and Cornwall and on the North Sea coast of England, felt themselves the victims of the Common Fisheries Policy quota system. There was also a continuing strong sense of 'Euroscepticism' in England, fuelled at this stage by a mixture of concerns about 'sovereignty' and economic policy, which I will deal with in a separate article. Here, it is worth noting that even the most enthusiastic Europhiles, the Irish, voted to reject recent EU initiatives which they felt were not in their interests in their 2001 referendum on the Treaty of Nice. Nevertheless, the growth of physical links with Europe, like the Channel Tunnel, the connections between the British and French electricity grids, and the development of 'budget' airlines, made it clear that both of the main 'offshore' islands, Britain and Ireland, were, at the turn of the century, becoming increasingly integrated, both in economic and administrative terms, with the continent of Europe.

006 (2)

At the beginning of 1999, however, a debate began over British membership of the euro, the single currency which was finally taking shape within the EU. Though Blair was never a fanatic on the subject, his pro-European instincts and his desire to be a leading figure inside the EU predisposed him to announce that Britain would join, not in the first wave, but soon afterwards. He briefed that this would happen. British business seemed generally in favour, but the briefing and guesswork in the press were completely baffling. For Gordon Brown, stability came first, and he concluded that it was not likely that Britain could safely join the euro within the first Parliament. When he told Blair this, the two argued and then eventually agreed on a compromise. Britain would probably stay out during the first Parliament, but the door should be left slightly ajar. Pro-European business people and those Tories who had lent Blair and Brown their conditional support, as well as Blair's continental partners, should be kept on board, as should the anti-euro press. The terms of the delicate compromise were meant to be revealed in an interview given by Brown to The Times. Since Brown was more hostile to entry than Blair, and was talking to an anti-euro newspaper, his team briefed more strongly than Blair would have liked. By the time the story was written, the pound had been saved from extinction for the lifetime of the Parliament. Blair was aghast at this.


The chaos surrounding this important matter was ended and the accord with Blair patched up by Brown and his adviser Ed Balls, who quickly produced five economic tests which would be applied before Britain would enter the euro. They required more detailed work by the Treasury; the underlying point was that the British and continental economies must be properly aligned before Britain would join. Brown then told the Commons that though it was likely that, for economic reasons, Britain would not join the euro until after the next election, there was no constitutional or political reason not to join. Preparations for British entry would therefore begin. This gave the impression that once the tests were met there would be a post-election referendum, followed by the demise of sterling.

In 1999, at a full-scale launch in a London cinema, Blair was joined by the Liberal Democrat leader Charles Kennedy and the two former Tory cabinet ministers Ken Clarke and Michael Heseltine to unveil 'Britain in Europe' as a counter-blast to the anti-euro campaign of 'Business for Sterling'. Blair promised that together they would demolish the arguments against the euro, and there was alarmist media coverage about the loss of eight million jobs if Britain pulled out of the EU. But the real outcome of this conflict was that the power to decide on membership of the euro passed decisively from Blair to Brown, whose Treasury fortress became the guardian of the economic tests. Brown would keep Britain out on purely economic grounds, something which won him great personal credit among the Conservative 'press barons'. There was to be no referendum on the pound versus the euro, however much the Prime Minister wanted one.

Very little of what New Labour had achieved up to 1999 was what it was really about, however, and most of its achievements had been in dealing with problems and challenges inherited from previous governments or with 'events' to which it had to react. Its intended purpose was to deliver a more secure economy, radically better public services and a new deal for those at the bottom of British society. Much of this was the responsibility of Gordon Brown, as agreed in the leadership contest accord between the two men. The Chancellor would become a controversial figure later in government, but in his early period at the Treasury, he imposed a new way of governing. He had run his time in Opposition with a tight team of his own, dominated by Ed Balls, later an MP and Treasury minister before becoming shadow chancellor under Ed Miliband following the 2010 general election. Relations between Team Brown and the Treasury officials began badly and remained difficult for a long time. Brown's handing of control over interest rates to the Bank of England was theatrical, planned secretly in Opposition and unleashed, to widespread astonishment, as soon as New Labour won. Other countries, including Germany and the US, had long run monetary policy independently of politicians, but this was an unexpected step for a left-of-centre British Chancellor. It turned out to be particularly helpful to Labour ministers since it removed at a stroke the old suspicion that they would favour high employment over low inflation. As one of Brown's biographers commented, he…

 …could only give expression to his socialist instincts after playing the role of uber-guardian of the capitalist system.

The bank move has gone down as one of the clearest achievements of the New Labour era. Like the Irish peace process and the devolution referenda, it was an action which followed on immediately after Labour won power, though, unlike those achievements, it was not something referred to in the party's election manifesto. Brown also stripped the Bank of England of its old job of overseeing the rest of the banking sector. Otherwise, it would have had a potential conflict of interest if it had had to concern itself with the health of commercial banks at the same time as managing interest rates. As a result of these early actions, New Labour won a reputation for being economically trustworthy and its Chancellor was identified with 'prudent' management of the nation's finances. Income tax rates did not increase, which reassured the middle classes. Even when Brown found what has more recently been referred to as 'the magic money tree', he did not automatically harvest it. And when the 'dot-com bubble' was at its most swollen, he sold off licences for the next generation of mobile phones for £22.5 billion, vastly more than they were soon worth. The proceeds went not into new public spending but into repaying the national debt, £37 billion of it. By 2002, government interest payments were, as a proportion of revenue, at their lowest since 1914.

Despite his growing reputation for prudence, Brown's introduction of 'stealth taxes' proved controversial, however. These included the freezing of income tax thresholds, so that an extra 1.5 million people found themselves paying the top rate; the freezing of personal allowances; rises in stamp duty on houses; and a hike in national insurance. In addition, some central government costs were palmed off onto the devolved administrations or local government, so that council tax rose sharply, and tax credits on share dividends were removed. Sold at the time as a 'prudent' technical reform, enabling companies to reinvest in their core businesses, this latter measure had a devastating effect on the portfolios of pension funds, wiping a hundred billion pounds off the value of retirement pensions. This was a staggering sum, amounting to more than twice the combined pension deficits of Britain's top 350 companies. Pensioners and older workers were angered when faced with great holes in their pension funds. They were even more outraged when Treasury papers released in 2007 showed that Brown had been warned about the effect this measure would have. The destruction of a once-proud pension industry had more complex causes than Brown's decision alone; Britain's fast-ageing population, for one, was also a major factor. But the pension fund hit produced more anger than any other single act by the New Labour Chancellor.

Perhaps the most striking long-term effect of Brown's careful running of the economy was the stark, dramatic shape of public spending. For his first two years, he stuck fiercely to the promise he had made to continue the Major government's spending levels. These were so tight that even the man who had set them, Kenneth Clarke, said that he would not actually have kept to them had the Tories been re-elected and had he been reappointed as Chancellor. Brown brought down the State's share of public spending from nearly 41% of GDP to 37.4% by 1999-2000, the lowest percentage since 1960 and far below anything achieved under Thatcher. He was doing the opposite of what previous Labour Chancellors had done. On arriving in office, they had immediately started spending, in order to stimulate the economy in classical Keynesian terms; when they had reached their limits, they had then had to raise taxes. He began by putting a squeeze on spending and then loosened up later, with an abrupt and dramatic surge in public spending, particularly on health, back up to 43%. The lean years were immediately followed by the fat ones, famine by feast. But the consequence of the squeeze was that the first New Labour government of 1997-2001 achieved far less in public services than it had promised. For example, John Prescott had promised a vast boost in public transport, telling the Commons in 1997:

I will have failed if in five years’ time there are not many more people using public transport and far fewer journeys by car. It’s a tall order, but I urge you to hold me to it.

Because of ‘Prudence’, and Blair’s worries about being seen as anti-car, Prescott had nothing like the investment to follow through and failed completely. Prudence also meant that Brown ploughed ahead with cuts in benefit for lone-parent families, angering Labour MPs and resulting in a Scottish Labour conference which labelled their Westminster government and their own Scots Chancellor as economically inept, morally repugnant and spiritually bereft. Reform costs money and without money, it barely happened in the first term, except in isolated policy areas where Blair and Brown put their heads down and concentrated. The most dramatic programme was in raising literacy and numeracy among younger children, where Number Ten worked closely with the Education Secretary, David Blunkett, and scored real successes. But unequivocally successful public service reforms were rare.

At first, Labour hated the idea of PFIs, which were a mixture of two things associated with Thatcherite economic policies: the privatisation of capital projects, with the government paying a fee to private companies over many years, and the contracting out of services – waste collection, school meals, cleaning – which had been imposed on unwilling socialist councils from the eighties. Once in power, however, Labour ministers began to realise that those three little letters were political magic, because they allowed ministers to announce and oversee exciting new projects and take the credit for them in the full knowledge that the bill would be left to the taxpayers of twenty to fifty years hence. In this way, paying for new hospitals or schools would be a problem for a future health or education minister.

PFIs were particularly attractive when other kinds of spending were tightly controlled by 'Prudence'. Large amounts of capital for public buildings were declared to be 'investment', not spending, and put to one side of the public accounts. The justification was that private companies would construct and run this infrastructure so much more efficiently than the State that the profits paid to them by taxpayers would be more than compensated for by the savings. Ministers replied to criticisms of these schemes by pointing out that, without them, Britain would not get the hundreds of new school buildings, hospitals, health centres, fire stations, army barracks, helicopter training schools, prisons, government offices, roads and bridges that it so obviously needed by the nineties. Significantly, the peak year for PFIs was 1999-2000, just as the early Treasury prudence in conventional spending had bitten hardest and was being brought to an end.

Sources:

Andrew Marr (2008), A History of Modern Britain. Basingstoke: Pan Macmillan.

Simon Schama (2000), A History of Britain: The Fate of Empire, 1776-2000. London: BBC Worldwide.

Peter Catterall et al. (2001), The Penguin Atlas of British & Irish History. London: Penguin Books.


‘Celebrity’ Britain: The Arrival of ‘New Labour’ & Diana’s Demise, 1994-99.   Leave a comment

The Advent of Brown & Blair:


Tony Blair was far more of an establishment figure than his mentor John Smith, or his great political 'friend' and future rival, Gordon Brown. He was the son of a Tory lawyer and went to preparatory school in Durham and then to a fee-paying boarding school in Edinburgh. He then went 'up' to Oxford and became a barrister, joining the Labour Party before he fell in love with a young Liverpudlian socialist called Cherie Booth, who sharpened his left-wing credentials. He became an MP at the 1983 General Election, winning a safe Labour seat in the north-east of England. Once in the Commons, he quickly fell in with Gordon Brown, another new MP, who was much that Blair was not. Brown was a tribal Labour Party man from a family which was strongly political and had rarely glimpsed the English Establishment, even the middle ranks from which Blair sprang. He had been Scotland's best-known student politician and a player in Scottish Labour politics from the age of twenty-three, followed by a stint in television. Yet the two men had their Christian beliefs in common, Anglo-Catholic in Blair's case and Presbyterian in Brown's. Most importantly, they were both deeply impatient with the state of the Labour Party. For seven or eight years they seemed inseparable, sharing a small office together. Brown tutored Blair in some of the darker arts of politics, while Blair explained the thinking of the English metropolitan and suburban middle classes to Brown. Together they made friends with Westminster journalists, both maturing as performers in the Commons, and together they worked their way up the ranks of the shadow cabinet.

After the 1992 defeat, Blair made a bleak public judgement about why Labour had lost so badly. The reason was simple: Labour has not been trusted to fulfil the aspirations of the majority of people in a modern world. As shadow home secretary he began to put that right, promising to be tough on crime and tough on the causes of crime. He was determined to return his party to the common-sense values of Christian Socialism, influenced also by the mixture of socially conservative and economically liberal messages used by Bill Clinton and his 'New Democrats'. So too was Gordon Brown, but as shadow chancellor his job was to demolish the cherished spending plans of his colleagues, and his support for the ERM left him ineffective when Major and Lamont suffered their great defeat. By 1994, the Brown-Blair relationship was less strong than it had been, but they visited the States together to learn the new political style of the Democrats which, to Blair's advantage, relied heavily on charismatic leadership. Back home, Blair pushed Smith to reform the party rulebook, falling out badly with him in the process. Media commentators began to tip Blair as the next leader, and slowly but surely the Brown-Blair relationship was turning into a Blair-Brown one.

002 (2)

In the days following the sudden death of the Labour leader, John Smith (pictured right), Tony Blair decided almost immediately to run as his replacement, while Gordon Brown hesitated, perhaps more grief-stricken. But he had assumed he would eventually inherit the leadership, and was aghast when he heard of Blair’s early declaration. There were at least ten face-to-face confrontations between the two men, in London and Edinburgh. In the opinion polls, Blair was shown to be more popular, and he had the backing of more MPs as well as that of the press. Crucial to Blair’s case was his use of received pronunciation which, after Neil Kinnock and John Smith’s heavily accented English, would reassure those more prejudiced parts of the Kingdom which were the main battlegrounds for Labour, and in which Celtic tones were not perhaps as appreciated as they might be nowadays. They were alright when heard from actors and BBC presenters, but they made politicians seem more ‘peripheral’. Brown had a deeper knowledge of the Labour movement and broader support among the trade unions, however, and had also thought through his policy agenda for change in greater detail. Given the vagaries of Labour’s electoral college system, it is impossible to judge, even now, what might have happened had the ‘young English hart’ locked horns with the ‘tough Scottish stag’, but they agreed at the time that it would be disastrous for them to fight each other as two ‘modernisers’ since Brown would have to attack Blair from the left and the unions would then demand their tribute from him if he won.

So the two men came to a deal, culminating in a dinner at a 'chic' Islington restaurant. The outcome is still a matter of some dispute, but we know that Blair acknowledged that Brown, as Chancellor in a Labour government, would have complete authority over a wide range of policy, which he would direct through the Treasury, including the 'social justice' agenda. But it is unlikely that Blair would have been so arrogant as to agree, as some have suggested, that he would hand over the premiership to Brown after seven years. After all, at that time Labour was still three years away from winning its first term, and not even the sharpest crystal ball could have projected a second term at that juncture. The most significant result of their dinner-table deal was that, following all the battles between Tory premiers and chancellors in the recent and current Conservative governments, Brown's Treasury would become a bastion for British home affairs, while Blair was left to concentrate on foreign policy largely unimpeded, with all the tragic consequences that twenty years of hindsight have now made familiar.

Team Tony & ‘Blair’s Babes’:

When it arrived, the 1997 General Election demonstrated just what a stunningly efficient and effective election-winning team Tony Blair led, comprising those deadly masters of spin, Alastair Campbell and Peter Mandelson. 'New Labour', as it was now officially known, won 419 seats, the largest number ever for the party and comparable only with the seats won by the National Government of 1935. Its Commons majority was also a modern record, 179 seats, thirty-three more than Attlee's landslide majority of 1945. The swing of ten per cent from the Conservatives was another post-war record, roughly double that which the 1979 Thatcher victory had produced in the opposite direction. But the turn-out was very low, at seventy-one per cent the lowest since 1935. Labour had won a famous victory but nothing like as many actual votes as John Major had won five years earlier. Yet Blair's party also won heavily across the south and in London, in parts of Britain which it had been unable to reach or represent in recent times.

As the sun came up on a jubilant, celebrating Labour Party returning to power after an eighteen-year absence, there was a great deal of Bohemian rhapsodizing about a new dawn for Britain. Alastair Campbell had assembled crowds of party workers and supporters to stand along Downing Street waving union flags as the Blairs strode up to claim their victory spoils. Briefly, at least, it appeared that the whole country had turned out to cheer the champions. In deepest Lib-Con 'marginal' Somerset, many of us had been up all night, secretly sipping our Cava in front of the incredible scenes unfolding before our disbelieving eyes, and when the results came in from Basildon and Birmingham Edgbaston (my first constituency at the age of eighteen, when it had already been a safe seat for the Tory matron Jill Knight for at least a decade), we were sure that this would indeed be a landslide victory, even if we had had to vote for the Liberal Democrats in the West Country just to make sure that there was no way back for the Tories. The victory was due to a small group of self-styled modernisers who had seized the Labour Party and made it a party of the 'left and centre-left', at least for the time being, though by the end of the following thirteen years, and after two more elections, they had taken it further to the right than anyone expected on that balmy early summer morning. At the time, there was no room for cynicism amid all the euphoria: Labour was rejuvenated, and that was all that mattered.

A record number of women were elected to Parliament, 119, of whom 101 were Labour MPs, the so-called 'Blair's babes'. Although the UK had been one of the first countries in the world to have a female prime minister, in 1987 women had made up just 6.3% of its MPs, compared with 10% in Germany and about a third in Norway and Sweden. Only France came below the UK, with 5.7%.

Before the large group of women MPs joined her in 1997, Margaret Hodge (pictured below, c.1992, and right, c.2015) had already become MP for Barking in a 1994 by-election, following the death of the Labour MP Jo Richardson. While still a new MP, Hodge endorsed the candidature of Tony Blair, a former Islington neighbour, for the Labour Party leadership, and she was appointed Junior Minister for Disabled People in 1998. Before entering the Commons, she had been Leader of Islington Council and had not been short of invitations from constituencies to stand in the 1992 General Election. Given that she is now referred to as a 'veteran MP', it is interesting to note that she had turned these offers down, citing her family commitments:

002

“It’s been a hard decision; the next logical step is from local to national politics and I would love to be part of a Labour government influencing change. But it’s simply inconsistent with family life, and I have four children who mean a lot to me. 

“It does make me angry that the only way up the political ladder is to work at it twenty-four hours a day, seven days a week. That’s not just inappropriate for a woman who has to look after children or relatives, it’s inappropriate for any normal person.

“The way Parliament functions doesn’t attract me very much. MPs can seem terribly self-obsessed, more interested in their latest media appearance than in creating change.” 

003


Patricia Hewitt (pictured above, in 1992, and more recently, right) had first begun looking for a seat in the 1970s, when she was general secretary of the National Council for Civil Liberties (NCCL). She later commented that … looking for a seat takes an enormous amount of time, and money too, if you're travelling a lot. Eventually, she was chosen to fight Leicester East in 1983, a contest which she lost by only nine hundred votes to the Conservative in what was then a relatively safe Tory seat. She later recalled driving up to Leicester on two evenings every week:

“I was planning to have a child after the elections – looking back I don’t know I imagined I was going to cope if Labour had won the seat… Even without children, I was leading such a pressured life – and my partner was doing the same as a Labour councillor – that it did put a strain on our relationship.”

She then became Neil Kinnock’s press and broadcasting secretary. In this role, she was a key player in the first stages of the ‘modernisation’ of the Labour Party, and along with Clive Hollick, helped set up the Institute for Public Policy Research and was its deputy director 1989–1994. By the time of the 1992 General Election she had two small children, so she decided not to look for a seat. Following Labour’s defeat in 1992, Hewitt was asked by the new Labour Leader, John Smith, to help establish the Commission on Social Justice, of which she became deputy chair. She then became head of research with Andersen Consulting, remaining in the post during the period 1994–1997. Hewitt was finally elected to Parliament to the House of Commons as the first female MP for Leicester West at the 1997 General Election, following the retirement of Greville Janner. She was elected with a majority of 12,864 and remained the constituency MP until stepping down in 2010.

001

Mary Kaldor (pictured right in the 1980s, and below in 2000), by contrast, never became an MP: one of the 'loves' Labour lost. A British academic, currently Professor of Global Governance at the London School of Economics, where she is also Director of the Civil Society and Human Security Research Unit, she was the daughter of Nicholas Kaldor, the exiled Hungarian economist who became an adviser to Harold Wilson in the 1960s. In the nineties, she was a senior research fellow at the Science Policy Research Unit at Sussex and a former foreign policy adviser to the Labour Party. She was shortlisted for Hackney and Dulwich in 1981, attending masses of meetings, many of which were boring and at which she endlessly had to be nice to people. Her youngest child was two years old at the time, and she was therefore ambivalent about the idea of becoming an MP:

“I was very well-equipped with baby minders and a nice understanding husband, but what on earth is the point of having children if you’re not going to see them?

“Building links with eastern Europe through the peace movement was more exciting than anything I could ever have done as an MP … (which seemed) entirely about competitiveness and being in the limelight, giving you no time to think honestly about your political views.”


In 1999, Kaldor supported international military intervention over Kosovo on humanitarian grounds, calling for NATO ground forces to follow the aerial bombardment in an article for The Guardian. I have written about the war in Kosovo in a separate article in this series. Significantly, however, by the end of the next decade Kaldor had lost faith in the principle and practice of humanitarian intervention, telling the same paper:

The international community makes a terrible mess wherever it goes…

It is hard to find a single example of humanitarian intervention during the 1990s that can be unequivocally declared a success. Especially after Kosovo, the debate about whether human rights can be enforced through military means is ever more intense.

Moreover, the wars in Afghanistan and Iraq, which have been justified in humanitarian terms, have further called into question the case for intervention.

002

Blair needed the support and encouragement of admirers and friends who would coax and goad him. There was Mandelson, the brilliant but temperamental former media boss, who had now become an MP. Although adored by Blair, he was so mistrusted by other members of the team that Blair's inner circle gave him the codename 'Bobby' (as in Bobby Kennedy). Alastair Campbell, Blair's press officer and attack-dog, is pictured above in a characteristic 'pose'. A former journalist and natural propagandist, he had helped orchestrate the campaign of mockery against Major. Then there was Anji Hunter, the contralto charmer who had known Blair as a young rock-singer and was his best hotline to middle England. Derry Irvine was a brilliant Highlands lawyer who had first found a place in his chambers for Blair and Booth; he advised on constitutional change and became Lord Chancellor in due course. These people, with the Brown team working in parallel, formed the inner core. The young David Miliband, son of a well-known Marxist philosopher, provided research support. Among the MPs who were initially close were Marjorie 'Mo' Mowlam and Jack Straw, but the most striking aspect of 'Tony's team' was how few elected politicians it included.

The small group of people who put together the New Labour 'project' wanted to find a way of governing which helped the worst off, particularly by giving them better chances in education and in finding jobs, while not alienating the mass of middle-class voters. They were extraordinarily worried by the press and media, bruised by what had happened to Kinnock, with whom they had all worked, and ruthlessly focused on winning over anyone who could be won. But they were ignorant of what governing would be like. They were able to take power at a golden moment when it would have been possible to fulfil all the pledges they had made. Blair had the wind at his back, as the Conservatives would pose no serious threat to him for many years to come. Far from inheriting a weak or crisis-ridden economy, he was actually taking over at the best possible time, when the country was recovering strongly but had not yet quite noticed that this was the case. Blair had won by being ruthless, and never forgot it, but he also seemed not to realise quite what an opportunity 'providence' had handed him.

Cool Britannia and the Celebrity Princess:

001

Above: a page from a recent school text.

Tony Blair arrived in power in a country with a revived fashion for celebrity, offering a few politicians new opportunities but at a high cost. It was not until 1988, when Hello! was launched as the first of the truly modern glossy glamour magazines, that the full shape of modern celebrity culture became apparent. Its successful formula was copied by OK! from 1993, and many other magazines followed suit, to the point where yards of coloured 'glossies' filled the newsagents' shelves in every town and village in the country. Celebrities were paid handsomely for being interviewed and photographed in return for coverage which was always fawningly respectful and never hostile. The rich and famous, no matter how flawed in real life, were able to shun the mean-minded sniping of the 'gutter press', the tabloid newspapers. In the real world, the sunny, airbrushed world of Hello! was inevitably followed by divorces, drunken rows, accidents and ordinary scandals. But people were happy to read good news about these beautiful people, even if they knew that there was more to their personalities and relationships than met the eye. In the same year that Hello! went into publication, ITV also launched the most successful of its daytime television shows, This Morning, hosted from Liverpool by Richard Madeley and Judy Finnigan, providing television's celebrity breakthrough moment.

This celebrity fantasy world, which continued to open up in all directions throughout the nineties, served to re-emphasise to alert politicians, broadcasting executives and advertisers the considerable power of optimism. The mainstream media in the nineties was giving the British an unending stream of bleakness and disaster, so millions tuned in and turned over to celebrity. That they did so in huge numbers did not mean that they thought celebrities had universally happy lives. And in the eighties and nineties, no celebrity gleamed more brightly than the beautiful yet troubled Princess Diana. For fifteen years she was an ever-present figure: an aristocratic girl whose childhood had been blighted by her parents' divorce, and whose fairytale marriage in 1981 found her pledging her life to a much older man who shared few of her interests and did not even seem to be truly in love with her. Just as the monarchy had gained from its marriages, especially the filmed-for-television romance, engagement and wedding of Charles and Diana, the latter attracting a worldwide audience of at least eight hundred million, so it lost commensurately from the failure of those unions.

050

Above: Hello! looks back on the 1981 Royal Wedding from that of 2011.

Diana quickly learned how to work the crowds and to seduce the cameras like Marilyn Monroe. By the end of the eighties, she had become a living fashion icon. Her eating disorder, bulimia, was one suffered by a growing number of young women and teenage girls from less privileged homes. When AIDS was in the news, she hugged its victims to show that it was safe to do so, and she went on to campaign for a ban on the use of land-mines. The slow disintegration of the marriage transfixed Britain, as Diana moved from china-doll debutante to painfully thin young mother to an increasingly charismatic and confident public figure, surprising her husband, who had always assumed she would remain in his shadow. After the birth of their second son, Harry, in 1984, Charles and Diana's marriage was visibly failing.

When rumours spread of her affairs, they no longer had the moral impact that they might have had in previous decades. By the nineties, Britain was now a divorce-prone country, in which ‘what’s best for the kids’ and ‘I deserve to be happy’ were phrases which were regularly heard in suburban kitchen-diners. Diana was not simply a pretty woman married to a king-in-waiting but someone people felt, largely erroneously, would understand them. There was an obsessive aspect to the admiration of her, something that the Royal Family had not seen before, and its leading members found it very uncomfortable and even, at times, alarming. They were being challenged as living symbols of Britain’s ‘family values’ and found wanting, just as John Major’s government would also be hoisted by its own petard as its ‘Back to Basics’ campaign was overwhelmed by an avalanche of sexual and financial scandals.

By the mid-1990s, the monarchy was looking shaky, perhaps even mortal. The strain of being at once a ceremonial and a familial institution was proving a bit much. The year 1992, referred to by the Queen as her 'annus horribilis' in her Christmas speech, first saw the separation of the other royal couple, Andrew and Sarah, followed by a major fire at Windsor Castle in November. The journalist Andrew Morton claimed to tell Diana's True Story in a book which described suicide attempts, blazing rows, her bulimia and her growing certainty that Prince Charles had resumed an affair with his old love, Camilla Parker Bowles, something he later confirmed in a television interview with Jonathan Dimbleby. In December, John Major announced the separation of Charles and Diana to the House of Commons. There was a further blow to the Royal Family's prestige in 1994 when it was announced that the royal yacht Britannia, the floating emblem of the monarch's global presence, was to be decommissioned.

046

 Above: Prince William with his mother, c. 1994.

Then came the revelatory 1995 interview on BBC TV’s Panorama programme between Diana and Martin Bashir. Breaking every taboo left in Royal circles, she freely discussed the breakup of her marriage, claiming that there were ‘three of us’ in it, attacked the Windsors for their cruelty and promised to be ‘a queen of people’s hearts’. Finally divorced in 1996, she continued her charity work around the world and began a relationship with Dodi al-Fayed, the son of the owner of Harrods, Mohammed al-Fayed. To many in the establishment, she was a selfish, unhinged woman who was endangering the monarchy. To many millions more, however, she was more valuable than the formal monarchy, her readiness to share her pain in public making her even more fashionable. She was followed all around the world, her face and name selling many papers and magazines. By the late summer of 1997, Britain had two super-celebrities, Tony Blair and Princess Diana.

It was therefore grimly fitting that Tony Blair's most resonant words as Prime Minister, the words which brought him to the height of his popularity, came on the morning when Diana was killed in a car crash, together with Dodi, in a Paris underpass. Blair was woken from a deep sleep at his constituency home, first to be told about the accident, and then to be told that Diana had died. Deeply shocked and worried about what his proper role should be, Blair spoke first to Campbell and then to the Queen, who told him that neither she nor any other senior member of the Royal Family would be making a statement. He decided, therefore, that he had to say something himself. Later that Sunday morning, standing in front of his local parish church, he spoke words which were transmitted live around the world:

“I feel, like everyone else in this country today, utterly devastated. Our thoughts and prayers are with Princess Diana’s family, in particular her two sons, her two boys – our hearts go out to them. We are today a nation in a state of shock…

“Her own life was often sadly touched by tragedy. She touched the lives of so many others in Britain and throughout the world with joy and with comfort. How many times shall we remember her, in how many different ways, with the sick, the dying, with children, with the needy? With just a look or a gesture that spoke so much more than words, she would reveal to all of us the depth of her compassion and her humanity.

“People everywhere, not just here in Britain, kept faith with Princess Diana. They liked her, they loved her, they regarded her as one of the people. She was – the People’s Princess and that is how she will stay, how she will remain in our hearts and our memories for ever.”

Although these words seem, more than twenty years on, to be reminiscent of past tributes paid to religious leaders, at the time they were widely welcomed and assented to. They were the sentiments of one natural charismatic public figure to another. Blair regarded himself as the people's Prime Minister, leading the people's party, beyond left and right, beyond faction or ideology, with a direct line to the people's instincts. After his impromptu eulogy, his approval rating rose to over ninety per cent, a figure not normally witnessed in democracies. Blair and Campbell then paid their greatest service to the ancient institution of the monarchy itself. The Queen, still angry and upset about Diana's conduct and concerned for the welfare of her grandchildren, wanted a quiet funeral and to remain at Balmoral, away from the scenes of public mourning in London. However, this was potentially disastrous for her public image. There was a strange mood in the country, deriving from Diana's charisma, which Blair had referenced in his words at Trimdon. His words had seemed to suggest that Diana was almost a saint, and a sub-religious hysteria responded to the thought. People queued to sign a book of condolence at St James' Palace, rather than signing it online on the website of the Prince of Wales. Some of those queuing even reported supernatural appearances of the dead Princess' image. By contrast, the lack of any act of public mourning by the Windsors and the suggestion of a quiet funeral seemed to confirm Diana's television criticisms of the Royal Family as cold, if not cruel, towards her.

001

In particular, the Queen was criticised for following protocol, which prohibited the flying of flags at Buckingham Palace when she was not in residence, rather than fulfilling the deep need of a grief-stricken public to see the Union flag flying there at half-mast. According to another protocol, flags were only flown at half-mast on the deaths of the monarch or their immediate blood relatives. But the crown lives or dies by such symbolic moments, and the Queen relented. Also, with Prince Charles’ full agreement, Blair and his aides put pressure on the Palace first into accepting that there would have to be a huge public funeral so that the public could express their grief, and second into accepting that the Queen should return to London. She did, just in time to quieten the genuine and growing anger about her perceived attitude towards Diana. This was a generational problem as well as a class one. The Queen had been brought up in a land of buttoned lips, stoicism and private grieving. She now reigned over a country which expected and almost required exhibitionism. For some years, the deaths of children, or the scenes of fatal accidents had been marked by little shrines of cellophane-wrapped flowers, soft toys and cards. In the run-up to Diana’s funeral parts of central London seemed almost Mediterranean in their public grieving. There were vast mounds of flowers, people sleeping out, holding up placards and weeping in the streets, strangers hugging each other.

The immense outpouring of public emotion in the weeks that followed seemed both to overwhelm and to distinguish itself from the more traditional devotion to the Queen herself and to her immediate family. The crisis was defused by a live, televised speech she made from the Palace, which was striking in its informality and its obviously sincere expression of personal sorrow. As Simon Schama has put it,

The tidal wave of feeling that swept over the country testified to the sustained need of the public to come together in a recognizable community of sentiment, and to do so as the people of a democratic monarchy.

003

The funeral itself was like no other before, bringing the capital to a standstill. In Westminster Abbey, campaigners stood alongside aristocrats, entertainers with politicians and rock musicians with charity workers. Elton John performed a hastily rewritten version of ‘Candle in the Wind’, originally his lament for Marilyn Monroe, now dedicated to ‘England’s Rose’, and Princess Diana’s brother Earl Spencer made a half-coded attack from the pulpit on the Windsors’ treatment of his sister. This was applauded when it was relayed outside and clapping was heard in the Abbey itself. Diana’s body was driven to her last resting place at the Spencers’ ancestral home of Althorp in Northamptonshire. Nearly a decade later, and following many wild theories circulated through cyberspace which reappeared regularly in the press, an inquiry headed by a former Metropolitan Police commissioner concluded that she had died because the driver of her car was drunk and was speeding in order to throw off pursuing ‘paparazzi’ photographers. The Queen recovered her standing after her live broadcast about her wayward former daughter-in-law. She would later rise again in public esteem to be seen to be one of the most successful monarchs for centuries and the longest-serving ever. A popular film about her, including a sympathetic portrayal of these events, sealed this verdict.

012

HM Queen Elizabeth II in 2001.

Tony Blair never again quite captured the mood of the country as he did in those sad late summer days. It may be that his advice and assistance to the Queen in 1997 was as vital to her as it was, in the view of Palace officials, thoroughly impertinent. His instinct for popular culture when he arrived in power was certainly uncanny. The New Age spiritualism which came out into the open when Diana died was echoed among Blair's Downing Street circle. What other politicians failed to grasp, and what he did, was the power of optimism expressed in the glossy world of celebrity, and the willingness of people to forgive their favourites not just once, but again and again. One of the negative longer-term consequences of all this was that charismatic celebrities discovered that, if they apologised and bared a little of their souls in public, they could get away with most things short of murder. For politicians, even charismatic ones like Blair, life would prove a little tougher, and the electorate would be less forgiving of oft-repeated mistakes.

(to be continued).

Posted October 22, 2018 by AngloMagyarMedia


Years of Transition – Britain, Europe & the World: 1992-1997.

Epilogue to the Eighties & Prologue to the Nineties:

I can recall the real sense of optimism which resulted from the end of the Cold War, formally ending with President Gorbachev’s announcement of the dissolution of the Soviet Union on Christmas Day 1991. Although never an all-out global war, it had resulted in the deaths of up to forty million people throughout the world, involving more than a hundred and fifty smaller ‘proxy’ conflicts. Moreover, we had lived under a continual sense of doom, that it was only a matter of time until our brief, young lives would be snuffed out by a nuclear apocalypse. Now, politicians and journalists in the West talked of a coming ‘peace dividend’ and the end of the surveillance, spy and secret state in both east and west. The only continuing threat to British security came from the Provisional IRA. They hit Downing Street with a triple mortar attack in February 1991, coming close to killing the new Prime Minister, John Major, and his team of ministers and officials directing the Gulf War.

By the time Margaret Thatcher left office in tears on 28 November 1990, 'Thatcherism' was also a spent force, though its influence lingered on until at least the end of the century, and not just among Conservatives. Only a minority even among the 'party faithful' had been true believers, and the Tory MPs would have voted her out had her cabinet ministers not beaten them to it. As Andrew Marr has written, History is harshest to a leader just as they fall. She had been such a strident presence for so long that many who had first welcomed her as a 'gust' of fresh air now felt the need for gentler breezes. Those who wanted a quieter, less confrontational leader found one in John Major.

Yet most people, in the end, had done well under her premiership, not just the 'yuppies' but also her lower-middle-class critics, who developed their own entrepreneurial sub-cultures rather than depending on traditional sponsorship from arts councils and local authorities. By the early nineties, Britons were on average much wealthier than they had been in the late seventies and enjoyed a wider range of holidays, better food, and a greater variety of television channels and other forms of home entertainment. Nor was everything the Thatcher governments did out of tune with social reality. The sale of council houses corresponded to the long passion of the British to be kings and queens of their own little castles. Sales of state utilities, on the other hand, presupposed a hunger for stakeholdership that was much less deeply rooted in British habits, and the subsequently mixed fortunes of those stocks did nothing to help change those habits. Most misguided of all was the decision to implement the regressive 'poll tax'. In the end, Thatcher's 1987-90 government became just the latest in a succession of post-war British governments that had seen their assumptions rebound on them disastrously. This 'trend' was to continue under John Major. The upper middle-class 'Victorian Values' of the grocer's daughter from Grantham were replaced by the 'family values' of the lower middle-class garden gnome salesman from Brixton, only for him to be overwhelmed by an avalanche of sexual and financial scandals.

The single most important event of the early nineties in Britain, possibly globally too, had nothing to do with politics and diplomacy or warfare and terrorism, at least not in the nineties. Tim Berners-Lee, a British scientist, invented the World Wide Web, often conflated with the internet itself. His idea was for a worldwide 'hypertext', the computer-aided reading of electronic documents, allowing people to work together remotely, sharing their knowledge in a 'web' of documents. His creation would give the internet's hardware its global voice. He was an Oxford graduate who had made his first computer with a soldering iron, before moving in 1980 to CERN, the European physics laboratory in Switzerland and the world's largest scientific research centre. There he wrote his proposal for the system in 1989; his hypertext revolution arrived at CERN in December 1990, when the first web server and browser began running, and the web went public on the wider internet the following summer. He chose not to patent his creation so that it would be free to everyone.
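To make the idea of a 'web' of documents concrete, here is a minimal, illustrative sketch (not drawn from the sources used here) of what hypertext means in practice: a program fetches one document and lists the other documents it points to. It uses only the Python standard library; the address info.cern.ch, which today hosts a restored copy of the first website, is used purely as an example.

```python
# A minimal sketch of hypertext in practice: one document refers to others
# by URL, and following those references traces out the 'web'.
# Standard library only; the example URL is illustrative.
from html.parser import HTMLParser
from urllib.request import urlopen


class LinkCollector(HTMLParser):
    """Collects the href targets of <a> tags in an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def outgoing_links(url):
    """Fetch one hypertext document and return the documents it links to."""
    with urlopen(url) as response:
        html = response.read().decode("utf-8", errors="replace")
    parser = LinkCollector()
    parser.feed(html)
    return parser.links


if __name__ == "__main__":
    # info.cern.ch serves a copy of the first website ever published.
    for link in outgoing_links("http://info.cern.ch/"):
        print(link)
```

Following such links from one page to the next is, in essence, all that 'browsing the web' has ever meant.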

The Election of 1992 – A Curious Confidence Trick?:

002

John Major called an election for April 1992. Under a pugnacious Chris Patten, now Party chairman, the Tories targeted Labour’s enthusiasm for high taxes. During the campaign itself, Major found himself returning to his roots in Brixton and mounting a ‘soap-box’, from which he addressed raucous crowds through a megaphone. John Simpson, the BBC correspondent, was given the task of covering Major’s own campaign, and on 15 March he travelled to Sawley, in the PM’s constituency of Huntingdon, where Major was due to Meet the People. I have written elsewhere about the details of this, and his soap-box campaign, as reported by Simpson. Although Simpson described it as ‘a wooden construction of some kind’, Andrew Marr claims it was ‘a plastic container’. Either way, it has gone down in political history, together with the megaphone, as the prop that won him the election. The stark visual contrast achieved with the carefully stage-managed Labour campaign struck a chord with the media and he kept up an act that his father would have been proud of, playing the underdog to Neil Kinnock’s government in waiting. Right at the end, at an eve of poll rally in Sheffield, Kinnock’s self-control finally gave way and he began punching the air and crying “y’awl’ right!” as if he were an American presidential candidate. It was totally ‘cringe-worthy’ TV viewing, alienating if not repulsing swathes of the very middle England voters he needed to attract.

On 9 April 1992 Major’s Conservatives won fourteen million votes, more than any party in British political history. It was a great personal victory for the ‘new’ Prime Minister, but one which was also based on people’s fears of higher taxes under a Labour government. It was also one of the biggest victories in percentage terms since 1945, though the vagaries of the electoral system gave the Tories a majority of just twenty-one seats in parliament. Neil Kinnock was even more devastated than he had been in 1987 when he had not been expected to defeat Thatcher. The only organ of the entire British press which had called the election correctly was the Spectator. Its editor, Dominic Lawson, headlined the article which John Simpson wrote for him The Curious Confidence of Mr Major so that the magazine seemed to suggest that the Conservatives might pull off a surprise win. Simpson himself admitted to not having the slightest idea who would win, though it seemed more likely to him that Labour would. Yet he felt that John Major’s own apparent certainty was worth mentioning. When the results started to become clear on that Friday morning, 10 April, the Spectator stood out favourably from the shelves of newsagents, surrounded by even the late, or early editions of newspapers and magazines which had all been predicting a Labour victory.

003 (2)

The only politician possibly more disappointed than Neil Kinnock, who immediately left front-line politics, was Chris Patten, who had been the real magician behind Major's remarkable victory. He lost his seat to the Liberal Democrats in marginal Bath and went off to become the final governor of Hong Kong ahead of the long-agreed handover of Britain's last colony in 1997. Kinnock, a former long-term opponent of Britain's membership of the EEC/EC, went off to Brussels to become a European Commissioner. Despite Major's triumph in the popular vote, never has such a famous victory produced so rotten an outcome for the victors. The smallness of Major's majority meant that his authority could easily be eaten away in the Commons. As a consequence, he would not go down as a great leader in parliamentary posterity, though he remained popular in the country as a whole for some time, if not with the Thatcherites and Eurosceptic "bastards" in his own party. Even Margaret Thatcher could not have carried through her revolutionary reforms after the 1979 and 1983 elections with the kind of parliamentary arithmetic which was dealt to her successor. In Rugby terms, although the opposition's three-quarters had been foiled by this artful dodger of a full-back, he had been dealt a 'hospital pass' by his own side. For the moment, he had control of the slippery ball, but he was soon to be forced back into a series of crushing rucks and mauls among his own twenty-stone forwards.

 John Smith – Labour’s lost leader and his legacy:

002 (2)

After Neil Kinnock gave up the Labour leadership following his second electoral defeat in 1992, he was replaced by John Smith (pictured above), a placid, secure, self-confident Scottish lawyer. As Shadow Chancellor, he had been an effective cross-examiner of Nigel Lawson, John Major and Norman Lamont and had he not died of a heart attack in 1994, three years ahead of the next election, most political pundits agreed that, following the tarnishing of the Major administration in the mid-nineties, he would have become Prime Minister at that election. Had he done so, Britain would have had a traditional social democratic government, much like those of continental Europe. He came from a family of herring fishermen on the West Coast of Scotland, the son of a headmaster. Labour-supporting from early youth, bright and self-assured, he got his real political education at Glasgow University, part of a generation of brilliant student debaters from all parties who would go on to dominate Scottish and UK politics including, in due succession, Donald Dewar, Gordon Brown, Alistair Darling and Douglas Alexander. Back in the early sixties, Glasgow University Labour Club was a hotbed not of radicals, but of Gaitskell-supporting moderates. This was a position that Smith never wavered from, as he rose as one of the brightest stars of the Scottish party, and then through government under Wilson and Callaghan as a junior minister dealing with the oil industry and devolution before entering cabinet as President of the Board of Trade, its youngest member at just forty. In opposition, John Smith managed to steer clear of the worst in-fighting, eventually becoming Kinnock’s shadow chancellor. In Thatcher’s England, however, he was spotted as a tax-raising corporatist of the old school. One xenophobic letter he received brusquely informed him:

You’ll not get my BT shares yet, you bald, owl-looking Scottish bastard. Go back to Scotland and let that other twit Kinnock go back to Wales.

Smith came from an old-fashioned Christian egalitarian background which put him naturally out of sympathy with the hedonistic culture of southern England.  Just before he became Labour leader he told a newspaper he believed above all in education, because…

 … it opens the doors of the imagination, breaks down class barriers and frees people. In our family … money was looked down on and education was revered. I am still slightly contemptuous of money.

Smith was never personally close to Kinnock but was scrupulously loyal to him as leader, and he succeeded him by a huge margin in 1992. By then he had already survived a serious cardiac arrest and had taken up hill-walking. Though Smith swiftly advanced the careers of his bright young lieutenants, Tony Blair and Gordon Brown, they soon became disappointed by his view that the Labour Party needed simply to be improved, not radically transformed. In particular, he was reluctant to take on the party hierarchy and the unions over issues of internal democracy, such as the introduction of a one-member, one-vote system for future leadership elections. He was sure that Labour could regain power with a revival of its traditional spirit. At one point, Tony Blair was so dispirited by Smith's leadership style that he considered leaving politics altogether and going back to practising law. Smith died of a second heart attack on 12 May 1994. After the initial shock and grief subsided, Labour moved rapidly away from his policy of 'gradualism' towards 'Blairite' transformation. One part of his legacy still remains, however, shaping modern Britain today. As the minister who had struggled to achieve devolution for Scotland in 1978-9, he remained a passionate supporter of the 'unfinished business' of re-establishing a Scottish Parliament and setting up a Welsh Assembly. With his friend Donald Dewar he had committed Labour so utterly to the idea in Opposition, despite Kinnock's original strong anti-nationalist stance, that Blair, no great fan of devolution himself, found that he had to implement Smith's unwelcome bequest to him.

Black Wednesday and the Maastricht Treaty:

The crisis that soon engulfed the Major government back in the early autumn of 1992 was a complicated economic one. From August 1992 to July 1996 I was mainly resident in Hungary, and so, although an economic historian, never really understood the immediate series of events that led to it or the effects that followed. This was still in pre-internet days, so I had little access to English language sources, except via my short-wave radio and intermittent newspapers bought during brief visits to Budapest. I had also spent most of 1990 and half of 1991 in Hungary, so there were also longer-term gaps in my understanding of these matters. I have written about them in earlier articles in this series, dealing with the end of the Thatcher years. Hungary itself was still using an inconvertible currency throughout the nineties, which only became seriously devalued in 1994-96, and when my income from my UK employers also fell in value, as a family we decided to move back to Britain to seek full-time sterling salaries. The first thing that happened to the Major government was that it lost its economic policy in a single day, when the pound fell out of the ERM (the European Exchange Rate Mechanism). In his memoirs, John Major described the effect of this event in stark terms:

Black Wednesday – 16 September 1992, the day the pound toppled out of the ERM – was a political and economic calamity. It unleashed havoc in the Conservative Party and it changed the political landscape of Britain.

For Major and his government, the point was that as the German central bank had a deserved reputation for anti-inflationary rigour, having to follow or ‘shadow’ the mark meant that Britain had purchased a respected off-the-shelf policy. Sticking to the mighty mark was a useful signal to the rest of the world that this government, following all the inflationary booms of the seventies and eighties, was serious about controlling inflation. On the continent, however, the point of the ERM was entirely different, intended to lead to a strong new single currency that the countries of central Europe would want to join as members of an enlarged EC/EU. So a policy which Margaret Thatcher had earlier agreed to, in order to bring down British inflation, was now a policy she and her followers abhorred since it drew Britain ever closer towards a European superstate in the ‘Delors Plan’. This was a confused and conflicted state of affairs for most of the Tories, never mind British subjects at home and abroad.

The catalyst for sterling's fall was the fall in the value of the dollar, which pulled the pound down with it. Worse still, the money flowed into the Deutschmark, which duly rose; so the British government raised interest rates to an eye-watering ten per cent in order to lift the pound. When this failed to work, the next obvious step would have been for the German central bank to cut its interest rates, lowering the value of the mark and keeping the ERM formation intact. This would have helped the Italian lira and other weak currencies as well as the pound. But since Germany had just reunited after the 'fall of the wall', the whole cost of bringing the poorer East Germans into line with their richer compatriots in the West led to a real fear of renewed inflation, as well as to memories of the Berlin Crisis of 1948-49 and the hyperinflation of the Weimar period. So the Germans, regardless of the pain being experienced by Britain, Italy and the rest, wanted to keep their high-value mark and their high interest rates. Major put considerable and concerted pressure on Chancellor Kohl, warning of the danger of the Maastricht treaty failing completely, since the Danes had just rejected it in a referendum and the French were also holding a plebiscite. None of this had any effect on Kohl who, like a previous German Chancellor, would not move.

002 (62)

In public, the British government stuck to the line that the pound would stay in the ERM at all costs. The ERM was not simply a European 'joint-venture' mechanism but had been part of the anti-inflation policy of both the Lawson and Major chancellorships. As Chancellor, the now Prime Minister had told the unemployed workers and repossessed homeowners of Britain that if it isn't hurting, it isn't working, so his credibility had been tied to the success of the ERM ever since. Placing Britain 'at the heart of Europe' had also been his foreign policy, first as Foreign Secretary and now as Prime Minister. It was his big idea for both economic and diplomatic survival in an increasingly 'globalised' environment. Norman Lamont, who as Chancellor was as committed as Major, told 'the markets' that Britain would neither leave the mechanism nor deviate from it by devaluing the pound: ERM membership was at the centre of our policy and there should not be one scintilla of doubt that it would continue. Major went even further, telling a Scottish audience that with inflation down to 3.7 per cent and falling, it would be madness to leave the ERM. He added that:

“The soft option, the devaluer’s option, the inflationary option, would be a betrayal of our future.”

However, then the crisis deepened with the lira crashing out of the ERM formation. International money traders, such as the Hungarian-born György Soros, began to turn their attention to the weak pound and carried on selling. They were betting that Major and Lamont would not keep interest rates so high that the pound could remain up there with the mark – an easy, one-way bet. In the real world, British interest rates were already painfully high. On the morning of ‘Black Wednesday’, at 11 a.m., the Bank of England raised them by another two points. This was to be agonising for home-owners and businesses alike, but Lamont said he would take whatever measures were necessary to keep the pound in the mechanism. Panic mounted and the selling continued: a shaken Lamont rushed round to tell Major that the interest rate hike had not worked, but Major and his key ministers decided to stay in the ERM. The Bank of England announced that interest would go up by a further three points, to fifteen per cent. Had it been sustained, this would have caused multiple bankruptcies across the country, but the third rise made no difference either. Eventually, at 4 p.m., Major phoned the Queen to tell her that he was recalling Parliament. At 7.30 p.m., Lamont left the Treasury to announce to the press and media in Whitehall that he was suspending sterling’s membership of the ERM and was reversing the day’s rise in interest rates.

Major considered resigning. It was the most humiliating moment in British politics since the IMF crisis of 1976, sixteen years earlier. But if he had done so Lamont would have had to go as well, leaving the country without its two most senior ministers in the midst of a terrible crisis. Major decided to stay on, though he was forever diminished by what had happened. Lamont also stayed at his post and was delighted as the economy began to respond to lower interest rates, and a slow recovery began. While others suffered further unemployment, repossession and bankruptcy, he was forever spotting the ‘green shoots’ of recovery. In the following months, Lamont created a new unified budget system and took tough decisions to repair the public finances. But as the country wearied of recession, he became an increasingly easy ‘butt’ of media derision. To Lamont’s complete surprise, Major sacked him as Chancellor a little over six months after Black Wednesday. Lamont retaliated in a Commons statement in which he said: We give the impression of being in office, but not in power. Major appointed Kenneth Clarke, one of the great characters of modern Conservatism, to replace him.

In the Commons, the struggle to ratify the Maastricht Treaty, hailed as a great success for Major before the election, became a long and bloody one. Major's small majority was more than wiped out by the number of 'Maastricht rebels', egged on by Lady Thatcher and Lord Tebbit. Black Wednesday had emboldened those who saw the ERM and every aspect of European federalism as disastrous for Britain. Major himself wrote in his memoirs that it turned …

… a quarter of a century of unease into a flat rejection of any wider involvement in Europe … emotional rivers burst their banks.

Most of the newspapers which had welcomed Maastricht were now just as vehemently against it. The most powerful Conservative voices in the media were hostile both to the treaty and to Major. His often leaden use of English and lack of 'panache' led many of England's snobbish 'High Tories' to brand him shockingly ill-educated and third-rate as a national leader. A constantly shifting group of between forty and sixty Tory MPs regularly worked with the Labour opposition to defeat key parts of the Maastricht bill, so that Major's day-to-day survival was always in doubt. Whenever he called a vote of confidence and threatened his rebellious MPs with an election, he won. Whenever John Smith's Labour Party and the Tory rebels could find some common cause, however thin, he was in danger of losing. In the end, Major got his legislation and Britain ratified the Maastricht Treaty, but it came at an appalling personal and political cost. Talking in the general direction of an eavesdropping microphone, he spoke of three anti-European 'bastards' in his own cabinet, an obvious reference to Michael Portillo, Peter Lilley and John Redwood. The country watched a divided party tearing itself apart and was not impressed.

By the autumn of 1993, Norman Lamont was speaking openly about the possibility that Britain might have to leave the European Union altogether, and there were moves to force a national referendum. The next row was over the voting system to be used when the EU expanded. Forced to choose between a deal which weakened Britain's hand and stopping the enlargement from happening at all by vetoing it, Foreign Secretary Douglas Hurd went for a compromise. All hell broke loose, as Tory MPs began talking of a leadership challenge to Major. This subsided, but battle broke out again over the European budget and fisheries policy. Eight MPs had the Tory whip withdrawn. By this point, John Smith's sudden death had brought Tony Blair to the fore as leader of the Opposition. When Major readmitted the Tory rebels, Blair jibed: I lead my party, he follows his. Unlike Lamont's remark in the Commons, Blair's comment struck a chord with the country.

The concluding chapter of the Thatcher Revolution:

While the central story of British politics in the seven years between the fall of Thatcher and Blair's arrival in power was taken up by Europe, on the 'home front' the government tried to maintain the momentum of the Thatcher revolution. After many years of dithering, British Rail was divided up and privatised, as was the remaining coal industry. After the 1992 election, it was decided that over half the remaining coal-mining jobs must go, in a closure programme of thirty-one pits to prepare the industry for privatisation. This angered many Tory MPs, who felt that the strike-breaking effect of the Nottinghamshire-based Union of Democratic Mineworkers in the previous decade deserved a better reward, and it aroused public protest as far afield as Cheltenham. Nevertheless, with power companies moving towards gas and oil, and the industrial muscle of the miners long since broken, the closures and sales went ahead within the next two years, 1992-4. The economic effect on local communities was devastating, as the 1996 film Brassed Off shows vividly, with its memorable depiction of the social impact of the 1992 closure programme on the Yorkshire village of Grimethorpe and its famous Brass Band. Effectively, the only coalfields left working after this were those of North Warwickshire and South Derbyshire.

Interfering in the railway system became, and remained, a favourite 'boys with toys' hobby and a dangerous obsession of governments of different colours. Margaret Thatcher, not being a boy, knew that the railways were much too much a part of the working life of millions to be lightly broken up or sold off. When Nicholas Ridley, as Transport Secretary, had suggested this, Thatcher is said to have replied:

“Railway privatisation will be the Waterloo of this government. Please never mention the railways to me again.”

It was taken up again enthusiastically by John Major. British Rail had become a national joke: loss-making, accident-prone, with elderly tracks and rolling stock, and serving curled-up sandwiches. But the challenge of selling off a system on which millions of people depended was obvious. Making it profitable would result in significant and unpopular fare rises and cuts in services. Moreover, different train companies could hardly compete with each other directly, racing up and down the same rails. There was, therefore, a binary choice: either 'BR' could be cut up geographically, with both trains and track sold off for each region, so that the system would look much the way it had in the thirties, or the railway could be split 'vertically', so that the State would continue to own the track while the stations and the trains would be owned by private companies. This latter solution was the one chosen by the government, and a vast, complicated new system of subsidies, contracts, bids, pricing, cross-ticketing and regulation was created; but rather than keeping the track under public control, it too was to be sold off, to a single private monopoly to be called Railtrack. Getting across the country would become a complicated proposition and transaction, involving two or three separate rail companies. A Franchise Director was to be given powers over the profits, timetables and ticket-pricing of the new companies, and a Rail Regulator would oversee the track. Both would report directly to the Secretary of State, so that any public dissatisfaction, commercial problem or safety issue would ultimately be the responsibility of the government. This was a strange and pointless form of privatisation which ended up costing the taxpayer far more than British Rail had. The journalist Simon Jenkins concluded:

The Treasury’s treatment of the railway in the 1990s was probably the worst instance of Whitehall industrial management since the Second World War.

006 (2)

005

One success story in the rail network was the completion of the Channel Tunnel link to France in 1994 (the Folkestone terminal is pictured above), providing a good example of the inter-relationship between transport links and general economic development. The Kent town of Ashford had a relationship with the railways going back to 1842, and the closure of the town’s railway works between 1981 and 1993 did not, however, undermine the local economy. Instead, Ashford benefited from the Channel Tunnel rail link, which made use of railway lines running through the town, and its population actually grew by ten per cent in the 1990s. The completion of the ‘Chunnel’ gave the town an international catchment area of eighty-five million within a single day’s journey. The opening of the Ashford International railway station, the main terminal for the rail link to Europe, attracted a range of engineering, financial, distribution and manufacturing companies. In addition to the fourteen business parks that were opened in and around the town itself, four greenfield sites were opened on the outskirts, including a science park owned by Trinity College, Cambridge. As the map above shows, Ashford is now closer to Paris and Brussels in travelling time than it is to Manchester and Liverpool. By the end of the century, the town, with its position at the hub of a huge motorway network as well as its international rail link, was ready to become part of a truly international economy.

006

Many of the improvements in transport infrastructure on both islands, Britain and Ireland, were the result of EU funding, especially in Northern Ireland, and EU money was also having an impact on transport planning in Britain, with projects in the Highlands and Islands. In 1993 the EU decided to create a Europe-wide transport network. Of the fourteen priority projects associated with this aim, three were based in Britain and Ireland: a rail link from Cork to Northern Ireland and the ferry route to Scotland; a road link from the Low Countries across England and Wales to Ireland; and the West Coast rail route in Britain.

As a Brixton man, Major had experienced unemployment and was well prepared to take on the arrogant and inefficient quality of much so-called public service. But under the iron grip of the Treasury, there was little prospect of a revival of local democracy taking charge of local services again. This left a highly bureaucratic centralism, which had gained momentum in the Thatcher years, as the only remaining option. Under Major, the centralised Funding Agency for Schools was formed and schools in England and Wales were ranked by crude league tables, depending on how well their pupils did in exams. The university system was vastly expanded by simply allowing colleges and polytechnics to rename themselves as universities. The hospital system was further centralised and given a host of new targets. The police, faced with a review of their pay and with demands by the Home Secretary, Kenneth Clarke, for their forces to be amalgamated, were given their own performance league tables. The Tories had spent seventy-four per cent more, in real terms, on law and order since 1979, yet crime was at an all-time high. Clarke's contempt for many of the forces as 'vested interests' was not calculated to win them round to reform. Across England and Wales, elected councillors were turfed off police boards and replaced by businessmen. In 1993 Clarke, the old Tory dog who had clearly learned new tricks during his time at the Department of Health, where he was said to have castrated the regional health authority chairmen, defended his new police league tables in the 'newspeak' of governments yet to come:

The new accountability that we seek from our public services will not be achieved simply because men of good will and reasonableness wish that it be so. The new accountability is the new radicalism.

Across Britain, from the auditing of local government to the running of courts and the working hours of nurses, an army of civil servants, accountants, auditors and inspectors marched into workplaces. From time to time, ministers would weakly blame Brussels for the imposition of the cult of central control and measurement. But this was mostly a home-grown 'superstate'. Major called this centralising policy the 'Citizen's Charter', ignoring the fact that Britons are 'subjects' rather than citizens. He himself did not like the 'headline' very much because of its unconscious echoes of Revolutionary France. Every part of the government dealing with public service was ordered to come up with proposals for improvement at 'grass-roots level', to be pursued from the centre by questionnaires, league tables and a system of awards, called 'Charter Marks', for organisations that achieved the required standards. He spoke of 'empowering', 'helping the customer' and 'devolving', and thought that regulation from the centre would not last long, rather like a Marxist-Leninist anticipating the 'withering away' of the state. In his case, though, this was supposed to come about as the effects of growing competition were felt. In practice, of course, the regulators grew more powerful, not less so. Despite the rhetoric, public servants were not being given real freedom to manage. Elected office-holders were being sacked. Major's 'withering away' of the state was no more successful than Lenin's.

Britain and Ireland – first steps on the road to peace:

009

Above: US President Bill Clinton addressing a peace rally in Belfast during his visit in 1995. Clinton played a significant role as a 'peace broker' in negotiations leading up to 'the Good Friday Agreement'.

In December 1993, John Major stood outside the steel-armoured door of Number Ten Downing Street with the ‘Taoiseach’ of the Irish Republic, Albert Reynolds. He declared a new principle which offended many traditional Conservatives and Unionists. If both parts of Ireland voted to be reunited, Britain would not stand in the way. She had, said Major, no selfish strategic or economic interest in Northern Ireland. He also stated that if the Provisional IRA, which had lately bombed the very building Major was standing in front of and murdered two young boys in Cheshire, renounced violence, Sinn Fein could be recognised as a legitimate political party. In the run-up to this Downing Street Declaration, which some saw as a betrayal of the Tory Party’s long-held dedication to the Union of Great Britain and Northern Ireland, the government had been conducting ‘back channel’ negotiations with the terrorist organisation. In August 1994 the IRA finally declared a complete cessation of military operations which, though it stopped a long way short of renouncing the use of violence altogether, was widely welcomed and was followed a month later by a Loyalist ceasefire. A complicated choreography of three-strand talks, framework documents and discussions about the decommissioning of weapons followed, while on the streets, extortion, knee-capping and occasional ‘executions’ continued. But whereas the number of those killed in sectarian violence and bombings in 1993 had been eighty-four, the toll fell to sixty-one the following year, and in 1995 it was in single figures, at just nine deaths.

Long negotiations between London and Dublin led to cross-border arrangements. These negotiations also involved the United States, where an influential pro-Irish lobby had helped to sustain the IRA campaign into the nineties through finance channelled via 'Noraid'. In the mid-nineties, President Clinton acted as a peace-broker, visiting Belfast in 1995 and helping to maintain the fragile cease-fire in the North. The contradictory demands of Irish Republicanism and Ulster Unionism meant that Major failed to get a final agreement, which was left to Tony Blair, with the ongoing help of the former American senator George Mitchell. The fact that both countries had signed the Maastricht Treaty in 1992, committing themselves to closer political and economic unity in Europe, set a broader context for a bilateral agreement. However, while Irish political leaders eagerly embraced the idea of European integration, their British counterparts, as we have seen, remained deeply divided over it.

Economic decline/ growth & political resuscitation:

008

The closure of the Swan Hunter shipyard on the Tyne in May 1993 is an illuminating example of the impact of de-industrialisation. Swan Hunter was the last working shipyard in the region but had failed to secure a warship contract. An old, established firm, it was suffering some of the same long-term decline that had reduced shipbuilding employment nationally to 26,000 by the end of the century. This devastated the local economy, especially as a bitter legal wrangle over redundancy payments left many former workers with no compensation whatever for the loss of what they had believed was employment for life. But the effects of de-industrialisation could spread much further than local communities. The closure hit Tyneside hardest but, as shown in the map above, the failure of the firm also had a 'knock-on' effect, as suppliers as far afield as London and Glasgow lost valuable orders and, as a result, jobs.

004

By 1994, employment in manufacturing in Britain had fallen to four million from the nine million it had reached at its peak in 1966. The resulting mass unemployment hurt the older industries of the Northwest worst, but the losses were proportionately as high in the Southeast, reflecting the decline in newer manufacturing industry. Across most of Britain and Ireland, there was also a decline in the number of manufacturing jobs continuing into and throughout the 1990s. The service sector, however, expanded, and general levels of unemployment, especially in Britain, fell dramatically in the nineties. Financial services showed strong growth, particularly in such places as London’s Docklands, with its new ‘light railway’, and Edinburgh. By the late nineties, the financial industry was the largest employer in northern manufacturing towns and cities like Leeds, which grew rapidly throughout the decade, aided by its ability to offer a range of cultural facilities that helped to attract an array of UK company headquarters. Manchester, similarly, enjoyed a renaissance, particularly in the spheres of music, the media and sport.

In July 1995, tormented by yet more rumours of right-wing conspiracies against him, Major riposted with a theatrical gesture of his own, resigning as leader of the Conservative Party and inviting all-comers to take him on. He told journalists gathered in the Number Ten garden that it was "put up or shut up time". If he lost, he would resign as Prime Minister; if he won, he would expect the party to rally around him. This was a gamble, since other potential leaders were available, not least Michael Heseltine, who would shortly become Deputy Prime Minister, and Michael Portillo, then the pin-up boy of the Thatcherites, whose supporters prepared a campaign headquarters for him, only for him then to decide against standing. In the event, the challenger was John Redwood, the Secretary of State for Wales and a highly intelligent anti-EU right-winger. Major won his fight, though 109 Tory MPs refused to back him.

Fighting the return of Fascism in Europe:

Major was also having to deal with the inter-ethnic wars breaking out in the former Yugoslavia, following the recognition of Slovenia, Croatia and Bosnia as independent states in the early nineties. The worst violence occurred during the Serbian assault on Bosnia (I have written about the bloody 1992-94 Siege of Sarajevo, its capital, in an article elsewhere on this site based on John Simpson’s reporting). The term ‘ethnic cleansing’ was used for the first time as woeful columns of refugees fled in different directions. A nightmare which Europeans thought was over in 1945 was returning, only a couple of days’ drive away from London and half a day’s drive from where I was living on the southern borders of Hungary with Serbia and Croatia.

Six years after the siege, during a school visit to The Hague, I sat in the courtroom of the International War Crimes Tribunal for the former Yugoslavia and listened, in horror, to the testimonies of those who had been imprisoned and tortured in concentration camps during the Bosnian War. I couldn't believe that what I was hearing had happened in the final decade of the twentieth century in Europe. Those on trial at that time were the prison camp guards who had carried out the atrocities, claiming what had become known as the Nuremberg Defence. Later on, those giving the orders, Ratko Mladic and Radovan Karadzic (pictured below with John Simpson in 1993), the military and political leaders of the Bosnian Serbs, went on trial in the same courtroom, were convicted of war crimes and duly locked away; the former Serbian President, Slobodan Milosevic, was also brought before the tribunal, though he died in custody before a verdict was reached. Major had asked how many troops it would take to keep the three warring sides apart and was told the number was four hundred thousand, three times the total size of the British Army at that time. He sent 1,800 men to protect the humanitarian convoys that were rumbling south from the UN bases in Hungary.

001

Although many British people sent food parcels, warm clothes, medicine and blankets, loaded onto trucks and driven across the Croatian border and into Bosnia, many in the government were reluctant for Britain to become further involved. But the evening news bulletins showed pictures of starving refugees, the uncovered mass graves of civilians shot dead by death squads, and children with appalling injuries. There was a frenzied campaign for Western intervention, but President Clinton was determined not to risk the lives of American soldiers on the ground. Instead, he considered less costly alternatives, such as air strikes. This would have put others who were on the ground, including the British and other nationalities involved in the UN operation, directly into the line of retaliatory fire of the Serbian troops. When the NATO air-strikes began, the Serbs took the UN troops hostage, including British soldiers, who were then used as human shields. When the Serbs captured the town of Srebrenica and carried out a mass slaughter of its Muslim citizens, there were renewed calls for ‘boots on the ground’, but they never came.

Following three years of fighting, sanctions on Serbia and the success of the Croat Army in fighting back, a peace agreement was finally made in Dayton, Ohio. The UN convoys and troops left Hungary. Major became the first British Prime Minister of the post-war world to grapple with the question of what the proper role of the West should be in 'regional' conflicts such as the Balkan wars. They showed quite clearly both the dangers and the limitations of intervention. When a civil conflict is relayed in all its horror to tens of millions of voters every night by television, the pressure to 'do something' is intense. But mostly this requires not air strikes but a full-scale ground force, which will then be drawn into the war itself. Then it must be followed by years of neo-colonial aid and rebuilding. Major and his colleagues were accused of moral cowardice and cynicism in allowing the revival of fascist behaviour in one corner of Europe. Yet, especially given the benefit of hindsight about what happened subsequently in Iraq and Afghanistan, perhaps Western leaders were right to be wary of full-scale intervention.

Back to basics?

For many British voters, the Major years were associated with the sad, petty and lurid personal scandals that attended so many of his ministers, after he made an unwise speech calling for the return of old-style morality. In fact, 'back to basics' referred to almost everything except personal sexual morality: he spoke of public service, industry, sound money, free trade, traditional teaching, respect for the family and the law, and the defeat of crime. It gave the press, however, a fail-safe headline charge of hypocrisy whenever ministers were caught out. A series of infidelities were exposed: children born out of wedlock, a death from a sex stunt which went wrong, rumours about Major's own affairs (which later turned out to be truer than realised at the time). More seriously, there was also an inquiry into whether Parliament had been misled over the sale of arms to Iraq, but these were all knitted together into a single pattern of misbehaviour, referred to as 'sleaze'.

In 1996, a three-year inquiry into whether the government had allowed a trial to go ahead against the directors of an arms company, Matrix Churchill, knowing that they had, in fact, been acting inside privately accepted guidelines, resulted in two ministers being publicly criticised. It showed that the government had allowed a more relaxed régime of military-related exports to Saddam Hussein even after the horrific gassing of five thousand Kurds at Halabja, also revealing a culture of secrecy and double standards in the process. Neil Hamilton MP was accused of accepting cash from Mohammed al-Fayed, the owner of Harrods, for asking questions in the Commons. One of the most dramatic episodes of the 1997 election was the overwhelming defeat he suffered in his Tatton constituency at the hands of the former BBC war reporter Martin Bell, who had been badly injured in Sarajevo and who became Britain's first independent MP for nearly fifty years. Jonathan Aitken, a Treasury minister, was accused of accepting improper hospitality from an Arab business contact. He resigned to fight the Guardian over the claims, with the simple sword of truth and the trusty shield of fair play. He was found guilty of perjury and sentenced to eighteen months in prison.

002

By the end of Major's government, it seemed that the Tories might have learned the lesson that disagreements over the EU were capable of splitting their party. However, there was a general mood of contempt for politicians, and the press, in particular, had lost any sense of deference. The reforms of the health service, police and schools had produced few significant improvements. The post-Cold War world was turning out to be nastier and less predictable than the early-nineties talk of a 'peace dividend' had promised. The Labour Opposition would, in due course, consider how the country might be better governed and reformed, as well as what would be the right British approach to peace-keeping and intervention now that the United States was the last superpower left standing. But in the early months of 1997, Tony Blair and his fresh young 'New Labour' team, including Alastair Campbell (pictured above), were oiling their effective election-winning machine and moving in to roll over a tired-looking John Major and his tarnished old Tories.

Sources:

Andrew Marr (2008), A History of Modern Britain. Basingstoke: Pan-Macmillan.

Simon Schama (2018), A History of Britain, 1776-2000; The Fate of Empire. London: BBC Worldwide.

John Simpson (1999), Strange Places, Questionable People. Basingstoke: Pan-Macmillan.

Peter Caterall, Roger Middleton, John Swift (2001), The Penguin Atlas of British & Irish History. London: Penguin Books.

Posted October 17, 2018 by AngloMagyarMedia
