
You Only Live Twice – Cool Britannia to Cold Brexit: The United Kingdom, 1999-2019. Part Two: Identity, Immigration & Islam.

 


British Identity at the Beginning of the New Millennium:

As Simon Schama pointed out in 2002, it was a fact that even though only half of the British-Caribbean population and a third of the British-Asian population were born in Britain, they continued to constitute only a small proportion of the total population. It was also true that any honest reckoning of the post-imperial account needed to take account of the appeal of separatist fundamentalism in Muslim communities. At the end of the last century, an opinion poll found that fifty per cent of British-born Caribbean men and twenty per cent of British-born Asian men had, or once had, white partners. In 2000, Yasmin Alibhai-Brown found that, when polled, eighty-eight per cent of white Britons between the ages of eighteen and thirty had no objection to inter-racial marriage; eighty-four per cent of West Indians and East Asians and fifty per cent of those from Indian, Pakistani or Bangladeshi backgrounds felt the same way. Schama commented:

The colouring of Britain exposes the disintegrationalist argument for the pallid, defensive thing that it is. British history has not just been some sort of brutal mistake or conspiracy that has meant the steamrollering of Englishness over subject nations. It has been the shaking loose of peoples from their roots. A Jewish intellectual expressing impatience with the harping on ‘roots’ once told me that “trees have roots; Jews have legs”. The same could be said of Britons who have shared the fate of empire, whether in Bombay or Bolton, who have encountered each other in streets, front rooms, kitchens and bedrooms.


Britain, the European Union, NATO & the Commonwealth, 2000

Until the Summer of 2001, this ‘integrationist’ view of British history and contemporary society was the broadly accepted orthodoxy among intellectuals and politicians, if not more popularly. At that point, however, partly as a result of riots in the north of England involving ethnic minorities, including young Muslim men, and partly because of events in New York and Washington, the existence of parallel communities began to be discussed more widely and the concept of ‘multiculturalism’ began to become subject to fundamental criticism on both the right and left of the political spectrum. In the ‘noughties’, the dissenters from the multicultural consensus began to be found everywhere along the continuum. In the eighties and nineties, there were critics who warned that the emphasis on mutual tolerance and equality between cultures ran the risk of encouraging separate development, rather than fostering a deeper sense of mutual understanding through interaction and integration between cultures. The ‘live and let live’ outlook which dominated ‘race relations’ quangos in the 1960s and ’70s had already begun to be replaced by a more active interculturalism, particularly in communities where that outlook had proven to be ineffective in countering the internecine conflicts of the 1980s. Good examples of this development can be found in the ‘Education for Mutual Understanding’ and ‘Inter-Cultural’ Educational projects in Northern Ireland and the North and West Midlands of England in which this author was involved and has written about elsewhere on this site.

Politicians also began to break with the multicultural consensus, and their views began to have an impact because while commentators on the right were expected to have ‘nativist’ if not ‘racist’ tendencies in the ‘Powellite’ tradition, those from the left could generally be seen as having less easily assailable motives.

Trevor Phillips (pictured left), whom I had known as the first black President of the National Union of Students in 1979, before he became Chair of the Commission for Racial Equality in 2003, opened up territory in discussion and debate that others had not dared to ‘trespass’ into. His realisation that the race-relations ‘industry’ was part of the problem, and that, partly as a result of talking up diversity, the country was ‘sleepwalking to segregation’, was an insight that others began to share.

Simon Schama also argued that Britain should not have to choose between its own multi-cultural, global identity and its place in Europe. Interestingly, he put the blame for this pressure at least partly on the EU bureaucracy in Brussels, suggesting that…

 … the increasing compulsion to make the choice that General de Gaulle imposed on us between our European and our extra-European identity seems to order an impoverishment of our culture. It is precisely the roving, unstable, complicated, migratory character of our history that ought to be seen as a gift for Europe. It is a past, after all, that uniquely in European history combines a passion for social justice with a tenacious attachment to bloody-minded liberty, a past designed to subvert, not reinforce, the streamlined authority of global bureaucracies and corporations. Our place at the European table ought to make room for that peculiarity or we should not bother showing up for dinner. What, after all, is the alternative? To surrender that ungainly, eccentric thing, British history, with all its warts and disfigurements, to the economic beauty parlour that is Brussels will mean a loss. But properly smartened up, we will of course be fully entitled to the gold-card benefits of the inward-looking club… Nor should Britain rush towards a re-branded future that presupposes the shame-faced repudiation of the past. For our history is not the captivity of our future; it is, in fact, the condition of our maturity.  


‘Globalisation’

Fourteen years later, this was exactly the choice facing the British people, though now it was not De Gaulle or even the Brussels ‘Eurocrats’ who were asking the question, but the British Prime Minister, David Cameron, and the ‘Brexiteer’ Conservatives in his cabinet and on the back benches. The people themselves had not asked to be asked, but when they answered at the 2016 Referendum, they decided, by a very narrow majority, that they preferred the vision (some would say ‘unicorn’) of a ‘global’ Britain to the ‘gold-card benefits’ available at the European table it was already sitting at. Their ‘tenacious attachment’ to ‘bloody-minded liberty’ led to them expressing their desire to detach themselves from the European Union, though it is still not clear whether they want to remain semi-detached or move to a detached property at the very end of the street, one which has not yet been planned, let alone built. All we have is a glossy prospectus which may or may not be delivered, or even deliverable.

An internet poster from the 2016 Referendum Campaign


Looking back to 2002, the same year in which Simon Schama published his BBC series book, The Fate of Empire, the latest census for England and Wales was published. Enumerated and compiled the previous year, it showed the extent to which the countries had changed in the decade since the last census was taken. Douglas Murray, in the first chapter of his recent book, The Strange Death of Europe, first published in 2017, challenges us to imagine ourselves back in 2002 speculating about what England and Wales might look like in the 2011 Census. Imagine, he asks us, that someone in our company had projected:

“White Britons will become a minority in their own capital city by the end of this decade and the Muslim population will double in the next ten years.”

How would we have reacted in 2002? Would we have used words like ‘alarmist’, ‘scaremongering’, ‘racist’, ‘Islamophobic’? In 2002, a Times journalist made far less startling statements about likely future immigration, which were denounced by David Blunkett, then Home Secretary (using parliamentary privilege) as bordering on fascism. Yet, however much abuse they received for saying or writing it, anyone offering this analysis would have been proved absolutely right at the end of 2012, when the 2011 Census was published. It proved that only 44.9 per cent of London residents identified themselves as ‘white British’. It also revealed far more significant changes, showing that the number of people living in England and Wales who had been born ‘overseas’ had risen by nearly three million since 2001. In addition, nearly three million people in England and Wales were living in households where not one adult spoke English or Welsh as their main language.


These were very major ethnic and linguistic changes, but there were equally striking findings of changing religious beliefs. The Census statistics showed that adherence to every faith except Christianity was on the rise. Since the previous census, the number of people identifying themselves as Christian had declined from seventy-two per cent to fifty-nine. The number of Christians in England and Wales dropped by more than four million, from thirty-seven million to thirty-three. While the Churches witnessed this collapse in their members and attendees, mass migration assisted a near doubling of worshippers of Islam. Between 2001 and 2011 the number of Muslims in England and Wales rose from 1.5 million to 2.7 million. While these were the official figures, it is possible that they are an underestimate, because many newly-arrived immigrants might not have filled in the forms at the beginning of April 2011 when the Census was taken, not yet having a registered permanent residence. The two local authorities whose populations were growing fastest in England, by twenty per cent in the previous ten years, were Tower Hamlets and Newham in London, and these were also among the areas with the largest non-response to the census, with around one in five households failing to return the forms.


Yet the results of the census clearly revealed that mass migration was in the process of altering England completely. In twenty-three of London’s thirty-three boroughs (see map above) ‘white Britons’ were now in a minority. A spokesman for the Office for National Statistics regarded this as demonstrating ‘diversity’, which it certainly did, but by no means all commentators regarded this as something positive or even neutral. When politicians of all the main parties addressed the census results, they greeted them in positive terms. This had been the ‘orthodox’ political view since 2007, when the then Mayor of London, Ken Livingstone, had spoken with pride about the fact that thirty-five per cent of the people working in London had been born in a foreign country. For years a sense of excitement and optimism about these changes in London and the wider country seemed the only appropriate tone to strike. This was bolstered by the sense that what had happened in the first decade of the twenty-first century was simply a continuation of what had worked well for Britain in the previous three decades. This soon turned out to be a politically-correct pretence, though what was new in this decade was not so much growth in immigration from Commonwealth countries and the Middle East, or from wartorn former Yugoslavia, but the impact of white European migrants from the new EU countries, under the terms of the accession treaties and the ‘freedom of movement’ regulations of the single market. As I noted in the previous article, the British government could have delayed the implementation of these provisions but chose not to.

Questions about the Quality & Quantity of Migration:


Besides the linguistic and cultural factors already dealt with, there were important economic differences between the earlier and the more recent migrations of Eastern Europeans. After 2004, young, educated Polish, Czech and Hungarian people had moved to Britain to earn money to send home, or to take home with them in order to acquire good homes, marry and have children in their rapidly developing countries. And for Britain, as the host country, the economic growth of the 2000s was fuelled by the influx of energetic and talented people who, in the process, were also denying their own countries their skills for a period. But the UK government had seriously underestimated the number of these workers who wanted to come to Britain. Ministers suggested that the number arriving would be around 26,000 over the first two years. This turned out to be wildly wrong, and in 2006 a Home Office minister was forced to admit that since EU expansion in 2004, 427,000 people from Poland and seven other new EU nations had applied to work in Britain. If the self-employed were included, he added, then the number might be as high as 600,000. There were also at least an additional 36,000 spouses and children who had arrived, and 27,000 child benefit applications had been received. These were very large numbers indeed, even if most of these turned out to be temporary migrants.

It has to be remembered, of course, that inward migration was partially offset by the outflow of around sixty thousand British people each year, mainly permanent emigrants to Australia, the United States, France and Spain. By the winter of 2006-07, one policy institute reckoned that there were 5.5 million British people living permanently overseas, nearly ten per cent of Britons, or more than the population of Scotland. In addition, another half a million were living abroad for a significant part of the year. Aside from Europe, the Middle East and Asia were seeing rising ‘colonies’ of expatriate British. A worrying proportion of them were graduates; Britain was believed to be losing one in six of its graduates to emigration. Many others were retired or better-off people looking for a life in the sun, just as many of the newcomers to Britain were young, ambitious and keen to work. Government ministers tended to emphasise these benign effects of immigration, but their critics looked around and asked where all the extra people would go, where they would live, and where their children would go to school, not to mention where the extra hospital beds, road space and local services would come from, and how these would be paid for.

Members of the campaign group Citizens UK hold a ‘refugees welcome’ event outside Lunar House in Croydon. Photograph: John Stillwell/PA

A secondary issue to that of ‘numbers’ was the system for asylum seekers. In 2000, there were thirty thousand failed asylum seekers in the United Kingdom, a third of those who had applied in 1999, when only 7,645 had been removed from the country. It was decided that it was impossible to remove more, and that to try to do so would prove divisive politically and financially costly. Added to this was the extent of illegal immigration, which had caught the ‘eye’ of the British public. There were already criminal gangs of Albanians and Kosovars, operating from outside the EU, who in the eyes of many were undermining the legal migration streams from Central-Eastern Europe. The social service bill for these ‘illegal’ migrants became a serious burden for the Department of Social Security. Towns like Slough protested to the national government about the extra cost in housing, education and other services.

In addition, there was the sheer scale of the migration and the inability of the Home Office’s immigration and nationality department to regulate what was happening: to prevent illegal migrants from entering Britain, to spot those abusing the asylum system in order to settle in Britain, and to apprehend and deport people. Large articulated lorries filled with migrants, who had paid over their life savings to be taken to Britain, rumbled through the Channel Tunnel and the ferry ports. A Red Cross camp at Sangatte, near the French entrance to the ‘Chunnel’ (the photo below shows the Folkestone entrance), was blamed by Britain for exacerbating the problem. By the end of 2002, an estimated 67,000 had passed through the camp to Britain. The then Home Secretary, David Blunkett, finally agreed on a deal with the French to close the camp down, but by then many African, Asian and Balkan migrants, believing the British immigration and benefits systems to be easier than those of other EU countries, had simply moved across the continent and waited patiently for their chance to board a lorry to Britain.


Successive Home Secretaries from Blunkett to Reid tried to deal with the trade, the latter confessing that his department was “not fit for purpose”. He promised to clear a backlog of 280,000 failed asylum claims, whose seekers were still in the country after five years. The historic Home Office was split up, creating a separate immigration and nationality service. Meanwhile, many illegal immigrants had succeeded in bypassing the asylum system entirely. In July 2005, the Home Office produced its own estimate of what their number had been four years earlier. It reckoned that this was between 310,000 and 570,000, or up to one per cent of the total population. A year later, unofficial estimates pushed this number up to 800,000. The truth was that no-one really knew, but official figures showed that the number applying for asylum was now falling, with the former Yugoslavia returning to relative peace. Thousands of refugees were also being returned to Iraq, though the signs were already apparent that further wars in the Middle East and the impact of global warming on sub-Saharan Africa would soon send more disparate groups across the continents.

Britain’s Toxic Politics of Immigration:


To begin with, the arrival of workers from the ten countries who joined the EU in 2004 was a different issue, though it involved an influx of roughly the same size. By the government’s own figures, annual net inward migration had reached 185,000 and had averaged 166,000 over the previous seven years. This was significantly more than the average net inflow of fifty thousand New Commonwealth immigrants which Enoch Powell (pictured above) had referred to as ‘literally mad’ in his 1968 Rivers of Blood speech, though he had been criticising the immigration of East African Asians, of course. But although Powell’s speech was partly about race, colour and identity, it was also about numbers of immigrants and the practical concerns of his Wolverhampton constituents in finding hospital and school places in an overstretched public sector. It seems not unreasonable, and not at all racist, to suggest that it is a duty of central government to predict and provide for the number of newcomers it permits to settle in the country. In 2006, projections based on many different assumptions suggested that the UK population would grow by more than seven million by 2031. Of that, eighty per cent would be due to immigration. The organisation Migration Watch UK, set up to campaign for tighter immigration controls, said this was equivalent to requiring the building of a new town the size of Cambridge each year, or five new cities the size of Birmingham over the predicted quarter century.

But such characterisations were surely caricatures of the situation, since many of these new Eastern European migrants did not intend to settle permanently in the UK and could be expected to return to their countries of origin in due course. However, the massive underestimations of the scale of the inward migration were, of course, predictable to anybody with any knowledge of the history of post-war migration, replete as it is with vast underestimates of the numbers expected. But it did also demonstrate that immigration control was simply not a priority for New Labour, especially in its early manifestations. It gave the impression that it regarded all immigration control, and even discussion of it, as inherently ‘racist’ (even the restriction of white European migration), which made any internal or external opposition hard to voice. The public response to the massive upsurge in immigration, and to the swift transformation of parts of Britain it had not really reached before, was exceptionally tolerant. There were no significant or sustained outbreaks of racist abuse or violence before 2016, and the only racist political party, the British National Party (BNP), was subsequently destroyed, especially in London.

In April 2006, Margaret Hodge, the Labour MP for Barking since 1996 (pictured right), commented in an interview with The Sunday Telegraph that eight out of ten white working-class voters in her constituency might be tempted to vote for the British National Party (BNP) in the local elections on 4 May 2006 because “no one else is listening to them” about their concerns over unemployment, high house prices and the housing of asylum seekers in the area. She said the Labour Party must promote…

“… very, very strongly the benefits of the new, rich multi-racial society which is part of this part of London for me”.

There was widespread media coverage of her remarks, and Hodge was strongly criticised for giving the BNP publicity. The BNP went on to win eleven of the fifty-one seats in the local elections, making it the second largest party on the local council. It was reported that Labour activists accused Hodge of generating hundreds of extra votes for the BNP and that local members began privately to discuss the possibility of a move to deselect her. The GMB union wrote to Hodge in May 2006, demanding her resignation. The Mayor of London, Ken Livingstone, later accused Hodge of “magnifying the propaganda of the BNP” after she said that British residents should get priority in council house allocations. In November 2009, the Leader of the BNP, Nick Griffin, announced that he intended to contest Barking at the 2010 general election. In spite of the unions’ position, Hodge was returned as Member for Barking in 2010, doubling her majority to over 16,000, whilst Griffin came third behind the Conservatives. The BNP lost all of its seats on Barking and Dagenham Council. The same general election, however, saw New Labour defeated nationally under Gordon Brown’s leadership.

Opinion polls and the simple, anecdotal evidence of living in the country showed that most people continued to feel zero personal animosity towards immigrants or people of different ethnic backgrounds. But poll after poll did show that a majority were deeply worried about what ‘all this’ migration meant for the country and its future. But even the mildest attempts to put these issues on the political agenda, such as the concerns raised by Margaret Hodge (and the 2005 Conservative election campaign poster suggesting ‘limits’ on immigration) were often met with condemnation by the ruling political class, with the result that there was still no serious public discussion of them. Perhaps successive governments of all hues had spent decades putting off any real debate on immigration because they suspected that the public disagreed with them and that it was a matter they had lost control over anyway.

Perhaps it was because of this lack of control that the principal reaction to the developing reality began to be to turn on those who expressed any concern about it, even when they reflected the views of the general public. This was done through charges of ‘racism’ and ‘bigotry’, most notoriously in the accidental ‘caught-on-mike’ remark made by Gordon Brown while getting into his car during the 2010 election campaign, after he had been confronted by a lifelong Labour-supporting voter in Rochdale about the sheer numbers of migrants. It is said to have represented a major turning point in the campaign. A series of deflecting tactics became a replacement for action in the wake of the 2011 census, including the demand that the public should ‘just get over it’, which came back to haunt David Cameron’s ministers in the wake of the 2016 Referendum. In his Daily Telegraph column of December 2012, titled Let’s not dwell on immigration but sow the seeds of integration, Boris Johnson, then Mayor of London, responded to the census results by writing…

We need to stop moaning about the dam-burst. It’s happened. There is nothing we can now do except make the process of absorption as eupeptic as possible … 

The Mayor, who as an MP and member of David Cameron’s front-bench team later became a key leader of the ‘Leave’ campaign and an ardent Brexiteer, may well have been right in making this statement, saying what any practical politician in charge of a multi-cultural metropolis would have to say. But there is something cold about the tone of his remark, not least the absence of any sense that there were other people out there in the capital city not willing simply to ‘get over it’, who disliked the alteration of their society and had never asked for it. It did not seem to have occurred to Johnson that there were those who might be nursing a sense of righteous indignation about the fact that for years all the main parties had taken decisions so at variance with the opinions of their electors, or that there was something profoundly disenfranchising about such decisions, especially when addressed to a majority of the voting public.

In the same month as Johnson’s admonition, a poll by YouGov found two-thirds of the British public believed that immigration over the previous decade had been ‘a bad thing for Britain’. Only eleven per cent thought it had been ‘a good thing’. This included majorities among voters for every one of the three main parties. Poll after poll conducted over the next five years showed the same result. As well as routinely prioritising immigration as their top concern, a majority of voters in Britain regularly described immigration as having a negative impact on their public services and housing through overcrowding, as well as harming the nation’s identity. By 2012 the leaders of every one of the major parties in Britain had conceded that immigration was too high, but even whilst doing so all had also insisted that the public should ‘get over it’. None had any clear or successful policy on how to change course. Public opinion surveys suggest that a failure to do anything about immigration even while talking about it is one of the key areas of the breakdown in trust between the electorate and their political representatives.

At the same time, the coalition government of 2010-15 was fearful of the attribution of base motives if it got ‘tough on immigrants’. The Conservative leadership was trying to reposition itself as more socially ‘liberal’ under David Cameron. Nevertheless, at the election, they had promised to cut immigration from hundreds of thousands to tens of thousands per year, but they never succeeded in getting near that target. To show that she meant ‘business’, however, in 2013, Theresa May’s Home Office organised a number of vans with advertising hoardings to drive around six London boroughs where many illegal immigrants and asylum seekers lived. The posters on the hoardings read, In the UK illegally? Go home or face arrest, followed by a government helpline number. The posters became politically toxic immediately. The Labour Shadow Home Secretary, Yvette Cooper, described them as “divisive and disgraceful” and the campaign group Liberty branded them “racist and illegal”.

After some months it was revealed that the pilot scheme had successfully persuaded only eleven illegal immigrants to leave the country voluntarily. Theresa May admitted that the scheme had been a mistake and too “blunt”. Indeed, it was a ‘stunt’ designed to reassure the ‘native’ population that their government was getting tough, and it was not repeated, but the overall ‘hostile environment’ policy it was part of continued into the next majority Conservative government, leading to the illegal deportation of hundreds of ‘Windrush generation’ migrants from the Caribbean who had settled in Britain before 1968 and therefore lacked passports and papers identifying them as British subjects. The Tories repeated their promise on immigration more recently, in both David Cameron’s majority government of 2015 and Theresa May’s minority one of 2017, but are still failing to get levels down to tens of thousands. In fact, under Cameron, net immigration reached a record level of 330,000 per year, numbers which would fill a city the size of Coventry.

The movement of people, even before the European migration crisis of 2015, was of an entirely different quantity, quality and consistency from anything that the British Isles had experienced before, even in the postwar period. Yet the ‘nation of immigrants’ myth continued to be used to cover over the vast changes of recent years and to pretend that history provides precedents for what has happened since the turn of the millennium. The 2011 Census could have provided an opportunity to address the recent transformation of British society, but, like other opportunities in the second half of the twentieth century to discuss immigration, it was missed. If the fact that ‘white Britons’ now comprised a minority of the London population was seen as a demonstration of ‘diversity’, then the census had also shown that some London boroughs were already lacking in ‘diversity’: not because there weren’t enough people of immigrant origin, but because there weren’t enough ‘white Britons’ still around to make those boroughs diverse.

Brexit – The Death of Diversity:

Since the 2011 Census, net migration into Britain has continued to be far in excess of three hundred thousand per year. The rising population of the United Kingdom is now almost entirely due to inward migration, and to higher birthrates among the predominantly young migrant population. In 2014, women who were born overseas accounted for twenty-seven per cent of all live births in England and Wales, and a third of all newborn babies had at least one overseas-born parent, a figure that had doubled since the 1990s. However, since the 2016 Brexit vote, statistics have shown that many recent migrants to Britain from the EU have been returning to their home countries, so that it is difficult to know, as yet, how many of these children will grow up in Britain, or for how long. On the basis of current population trends, and without any further rise in net inward migration, the most modest estimate by the ONS of the future British population is that it will rise from its current level of sixty-five million to seventy million within a decade, to seventy-seven million by 2050 and to more than eighty million by 2060. But if the post-2011 levels were to continue, the UK population would go above eighty million as early as 2040 and reach ninety million by 2060. In this context, Douglas Murray asks the following rhetorical questions of the leaders of the mainstream political parties:

All these years on, despite the name-calling and the insults and the ignoring of their concerns, were your derided average white voters not correct when they said that they were losing their country? Irrespective of whether you think that they should have thought this, let alone whether they should have said this, said it differently or accepted the change more readily, it should at some stage cause people to pause and reflect that the voices almost everybody wanted to demonise and dismiss were in the final analysis the voices whose predictions were nearest to being right.

An Ipsos poll published in July 2016 surveyed public attitudes towards immigration across Europe. It revealed just how few people thought that immigration had had a beneficial impact on their societies. To the question, Would you say that immigration has generally had a positive or negative impact on your country?, very low percentages of people in each country thought that it had had a positive effect. Britain had a comparatively positive attitude, with thirty-six per cent of people saying that they thought it had had a very or fairly positive impact. Meanwhile, only twenty-four per cent of Swedes felt the same way, and just eighteen per cent of Germans. In Italy, France and Belgium, only ten to eleven per cent of the population thought that it had made even a fairly positive impact on their countries. Despite the Referendum result, the British figure may well have been higher because Britain had not experienced the same level of immigration from outside the EU as the countries caught up in the inter-continental migration crisis of the previous summer.

whos-in-control-7

Indeed, the issue of immigration as it affected the 2016 Referendum in Britain was largely about the numbers of Eastern European migrants arriving in the country, rather than about illegal immigrants from outside the EU, or asylum seekers. Inevitably, all three issues became confused in the public mind, something that UKIP (the United Kingdom Independence Party) used to good effect in its campaign posters. The original version of the poster above, featuring UKIP leader Nigel Farage, caused considerable controversy by using pictures from the 2015 crisis in Central-Eastern Europe to suggest that Europe was at ‘Breaking Point’ and that, once in the EU, refugees and migrants would be able to enter Britain and settle there. This was untrue, as the UK is not in the ‘Schengen’ area. Campaigners against ‘Brexit’ pointed out the facts of the situation in the adapted internet poster. In addition, during the campaign, Eastern European leaders, including the Poles and the Hungarians, complained about the misrepresentation of their citizens as ‘immigrants’ like many of those who had recently crossed the EU’s Balkan borders in order to get to Germany or Sweden. As far as they were concerned, their citizens were temporary internal migrants within the EU’s arrangements for ‘freedom of movement’ between member states. Because this was largely a one-way movement in numeric terms, however, the distinction was lost on many voters, and ‘immigration’ became the dominant factor in their backing of Brexit by a margin of 52% to 48%.

In Britain, the issue of Calais remained the foremost one in discussion in the autumn of 2016. The British government announced that it was going to have to build a further security wall near to the large migrant camp there. The one-kilometre wall was designed to further protect the entry point to Britain, and specifically to prevent migrants from trying to climb onto passing lorries on their way to the UK. Given that there were fewer than 6,500 people in the camp most of the time, a solution to Calais always seemed straightforward. All that was needed, argued activists and politicians, was a one-time generous offer and the camp could be cleared. But the reality was that once the camp was cleared it would simply be filled again. For 6,500 was an average day’s migration to Italy alone.

Map legend. Blue: Schengen Area; Green: countries with open borders; Ochre: legally obliged to join.

In the meantime, while the British and French governments argued over who was responsible for the situation at Calais, both day and night migrants threw missiles at cars, trucks and lorries heading to Britain in the hope that the vehicles would stop and they could climb aboard as stowaways for the journey across the Channel. The migrants who ended up in Calais had already broken all the EU’s rules on asylum in order to get there. They had not applied for asylum in their first country of entry, Greece, nor even in Hungary. Instead, they had pushed on through the national borders of the ‘Schengen’ free passage area (see map above right) until they reached the north of France. If they were cold, poor or just worse off, they were seen as having the right to come into a Europe which could no longer be bothered to turn anyone away.


Migrants/ Asylum Seekers arriving on the shores of the Greek island of Lesbos.

The Disintegration of Multiculturalism, ‘Parallel Development’ & the Populist Reaction in Britain:

After the 9/11 attacks on the USA, the wars in Iraq and Afghanistan and the 7/7 London bombings, there was no bigger cultural challenge to the British sense of proportion and fairness than the threat of ‘militant Islam’. There were plenty of angry young Muslim men prepared to listen to fanatical ‘imams’ and to act on their narrow-minded and bloodthirsty interpretations of ‘Jihad’. Their views, at odds with those of the well-established South Asian Muslim communities referred to above, were those of the ultra-conservative ‘Wahhabi’ Arabs and Iranian mullahs who insisted, for example, on women being fully veiled. But some English politicians, like Norman Tebbit, felt justified in asking whether Muslim communities throughout Britain really wanted to fully integrate. Would they, in Tebbit’s notorious ‘test’, support the English Cricket team when it played against Pakistan?

Britain did not have as high a proportion of Muslims as France, and not many, outside London and parts of the South East, of Arab and North African origin. But the large urban centres of the Home Counties, the English Midlands and the North of England had third generation Muslim communities of hundreds of thousands. They felt like they were being watched in a new way and were perhaps right to feel more than a little uneasy. In the old industrial towns on either side of the Pennines and in areas of West London there were such strong concentrations of Muslims that the word ‘ghetto’ was being used by ministers and civil servants, not just, as in the seventies and eighties, by rightwing organisations and politicians. White working-class people had long been moving, quietly, to more semi-rural commuter towns in the Home Counties and on the South Coast.

But those involved in this ‘white flight’, as it became known, were a minority if polling was an accurate guide. Only a quarter of Britons said that they would prefer to live in white-only areas. Yet even this measure of ‘multiculturalism’, defined as ‘live and let live’, was being questioned. How much should the new Britons ‘integrate’ or ‘assimilate’, and how much was the retention of traditions a matter of their rights to a distinctive cultural identity? After all, Britain had a long heritage of allowing newcomers to integrate on their own terms, retaining and contributing elements of their own culture. Speaking in December 2006, Blair cited forced marriages, the importation of ‘sharia’ law and the ban on women entering certain mosques as being on the wrong side of this line. In the same speech he used new, harder language. He claimed that, after the London bombings, …

“… for the first time in a generation there is an unease, an anxiety, even at points a resentment that our very openness, our willingness to welcome difference, our pride in being home to many cultures, is being used against us … Our tolerance is part of what makes Britain, Britain. So conform to it; or don’t come here. We don’t want the hate-mongers … If you come here lawfully, we welcome you. If you are permitted to stay here permanently, you become an equal member of our community and become one of us.”

His speech was not just about security and the struggle against terrorism. He was defining the duty to integrate. Britain’s strong economic growth over the previous two decades, despite its weaker manufacturing base, was partly the product of its long tradition of hospitality. The question now was whether the country was becoming so overcrowded that this tradition of tolerance was finally eroding. England, in particular, had the highest population density of any major country in the Western world. It would require wisdom and frankness from politicians, together with watchfulness and efficiency from Whitehall, to keep the ship on an even keel. Without these qualities, and without trust from the people, there could be little hope of meaningful reconciliation between Muslim, Christian, Jew and Humanist; between newcomers, sojourners, old-timers and exiles; between white Europeans, black Africans, South Asians and West Indians.

Map showing the location of Rotherham in South Yorkshire

In January 2011, a gang of nine Muslim men, seven of Pakistani heritage and two from North Africa, were convicted and sentenced at the Old Bailey in London for the sex trafficking of children between the ages of eleven and fifteen. One of the victims sold into a form of modern-day slavery was a girl of eleven who was branded with the initial of her ‘owner’ and abuser: ‘M’ for Mohammed. The court heard that he had branded her to make her his property and to ensure others knew about it. This did not happen in a Saudi or Pakistani backwater, nor even in one of the northern English towns that so much of the country had forgotten about until similar crimes involving Pakistani heritage men were brought to light. This happened in Oxfordshire between 2004 and 2012. Nobody could argue that gang rape and child abuse are the preserve of immigrants, but these court cases and the official investigations into particular types of child-rape gangs, especially in the case of Rotherham, have identified specific cultural attitudes towards women, especially non-Muslim women, that are similar to those held by men in parts of Pakistan. These have sometimes been extended into intolerant attitudes towards other religions, ethnic groups and sexual minorities. They are cultural attitudes which are anathema to the teachings of the Qur’an and mainstream Imams, but fears of being accused of ‘racism’ for pointing out such factual connections had been at least partly responsible for these cases taking years to come to light.

British Muslims and members of the British-Pakistani community condemned both the abuse and the fact that it had been covered up. Nazir Afzal, Chief Crown Prosecutor of the Crown Prosecution Service (CPS) for North West England from 2011 to 2015, and himself a Muslim, made the decision in 2011 to prosecute the Rochdale child sex abuse ring after the CPS had turned the case down. Responding to the Jay Report, he argued that the abuse had no basis in Islam:

“Islam says that alcohol, drugs, rape and abuse are all forbidden, yet these men were surrounded by all of these things. … It is not the abusers’ race that defines them. It is their attitude toward women that defines them.” 

Below left: The front page of The Times, 24 September 2012.

Even then, however, in the Oxfordshire case, the gangs were described as ‘Asian’ by the media, rather than as men of Pakistani and Arabic origin. In addition, the fact that their victims were chosen because they were not Muslim was rarely mentioned in court or dwelt upon by the press. But despite sections of the media beginning to focus on Pakistani men preying on young white girls, a 2013 report by the UK Muslim Women’s Network found that British Asian girls were also being abused across the country in situations that mirrored the abuse in Rotherham. The unfunded, small-scale report found thirty-five cases of young Muslim girls of Pakistani heritage being raped and passed around for sex by multiple men. In the report, one local Pakistani women’s group described how Pakistani-heritage girls were targeted by taxi drivers and, on occasion, by older men lying in wait outside school gates at dinner times and after school. They also cited cases in Rotherham where Pakistani landlords had befriended Pakistani women and girls on their own for purposes of sex, then passed on their names to other men who had then contacted them for sex. The Jay Report, published in 2014, acknowledged that the 2013 report of abuse of Asian girls was ‘virtually identical’ to the abuse that occurred in Rotherham, and also acknowledged that British Asian girls were unlikely to report their abuse because of the repercussions for their families. Asian girls were ‘too afraid to go to the law’; some were being blackmailed into having sex with different men, while others were forced at knife-point to perform sexual acts on men. Support workers described how one teenage girl had been gang-raped at a party:

“When she got there, there was no party, there were no other female members present. What she found was that there were five adults, their ages ranging between their mid-twenties going on to the late-forties and the five men systematically, routinely, raped her. And the young man who was supposed to be her boyfriend stood back and watched”.

Groups would photograph the abuse and threaten to show it to the girls’ fathers and brothers, and to publish it in the mosques, if their victims went to the police.

In June 2013, the polling company ComRes carried out a poll for BBC Radio 1 asking a thousand young British people about their attitudes towards the world’s major religions. The results were released three months later and showed that of those polled, twenty-seven per cent said that they did not trust Muslims (compared with 15% saying the same of Jews, 13% of Buddhists, and 12% of Christians). More significantly, perhaps, forty-four per cent said that they thought Muslims did not share the same views or values as the rest of the population. The BBC and other media in Britain then set to work to try to discover how Britain could address the fact that so many young people thought this way. Part of the answer may have had something to do with the timing of the poll, the fieldwork being carried out between 7-17 June. It had only been a few weeks before this that Drummer Lee Rigby, a young soldier on leave from Afghanistan, had been hit by a car in broad daylight outside an army barracks in South London, dragged into the middle of the road and hacked to death with machetes. The two murderers, Michael Adebolajo and Michael Adebowale, were Muslims of African origin who were carrying letters claiming justification for killing “Allah’s enemies”. It’s therefore reasonable to suppose that, rather than making assumptions about a religious minority without any evidence, those who were asked their opinions connected Muslims with a difference in basic values because they had been very recently associated with an act of extreme violence on the streets of London.

Unfortunately, attempts to provide a more balanced view and to separate these acts of terrorism from Islam have been dwarfed by the growing public perception of a problem which will not simply go away through the repetition of ‘mantras’. The internet has provided multiple and diverse sources of information, but the simple accumulation of the events related above, among many other available examples, has meant that the public have been able to make their own judgements about Islam, and these are certainly not as favourable as they were at the start of the current century. By 2015, one poll showed that only thirty per cent of the general public in Britain thought that the values of Islam were ‘compatible’ with the values of British society. The succession of terrorist attacks on the streets of Europe continued through 2016 and 2017. On 22 March 2017, a 52-year-old British-born convert to Islam, Khalid Masood, ploughed his car across Westminster Bridge, killing two tourists, one American and the other Romanian, and two British nationals. Dozens more were injured as they scattered, some falling into the River Thames below. Crashing into the railings at the side of Parliament, Masood then ran out of the hired vehicle and through the gates of the palace, where he stabbed the duty policeman, PC Keith Palmer, who died a few minutes later. Masood was then shot dead by armed police, his last phone messages revealing that he believed he was “waging jihad.” Two weeks later, at an inter-faith ‘Service of Hope’ at Westminster Abbey, its Dean, the Very Reverend John Hall, spoke for a nation he described as ‘bewildered’:

What could possibly motivate a man to hire a car and take it from Birmingham to Brighton to London, and then drive it fast at people he had never met, couldn’t possibly know, against whom he had no personal grudge, no reason to hate them and then run at the gates of the Palace of Westminster to cause another death? It seems that we shall never know.

Then on 22 May thousands of young women and girls were leaving a concert by the US pop singer Ariana Grande at Manchester Arena. Waiting for them as they streamed out was Salman Abedi, a twenty-two-year-old British-born man, whose Libyan parents had arrived in the UK in the early nineties after fleeing from the Gadaffi régime. In the underground foyer, Abedi detonated a bomb he was carrying which was packed with nuts, bolts and other shrapnel. Twenty-two people, children and parents who had arrived to pick them up, were killed instantly. Hundreds more were injured, many of them suffering life-changing wounds. Then, in what began to seem like a remorseless series of events, on 3 June three men drove a van into pedestrians crossing London Bridge. They leapt out of it and began slashing at the throats of pedestrians, appearing to be targeting women in particular. They then ran through Borough Market area shouting “this is for Allah”. Eight people were murdered and many more seriously injured before armed police shot the three men dead. Two of the three, all of whom were aged twenty to thirty, were born in Morocco. The oldest of them, Rachid Redouane, had entered Britain using a false name, claiming to be a Libyan and was actually five years older than he had pretended. He had been refused asylum and absconded. Khurram Butt had been born in Pakistan and had arrived in the UK as a ‘child refugee’ in 1998, his family having moved to the UK to claim asylum from ‘political oppression’, although Pakistan was not on the UNHCR list.

On the evening of 19 June, at the end of the Muslim sabbath, in what appeared to be a ‘reprisal’, a forty-seven-year-old father of four from Cardiff drove a van into crowds of worshippers outside Finsbury Park mosque who were crossing the road to go to the nearby Muslim Welfare House. One man, who had collapsed on the road and was being given emergency aid, was run over and died at the scene. Almost a dozen more were injured. Up to this point, all the Islamist terror attacks, from 7/7/2005 onwards, had been planned and carried out by ‘home-grown’ terrorists. Even the asylum seekers involved in the June attack in London had been in the country since well before the 2015 migration crisis. But in mid-September, an eighteen-year-old Iraqi who had arrived in the UK illegally in 2015, and had been living with British foster parents ever since, left a crudely-manufactured bomb on the London Underground District line during the rush hour, when the carriages were also crowded with schoolchildren. The detonator exploded but failed to ignite the home-made device itself, causing flash burns to dozens of people in the carriage. A more serious blast would have led to those dozens being taken away in body bags, and many more injured in the stampede which would have followed at the station exit, with its steep steps. As it was, the passengers remained calm during their evacuation, prompting much subsequent emphasis on the ubiquitous Blitz slogan, ‘Keep Calm and Carry On!’

Conclusion: Brexit at its ‘Best’.


Of course, it would have been difficult to predict and prevent these attacks, either by erecting physical barriers or by identifying individuals who might be at risk of ‘radicalisation’, much of which takes place online. Most of the attackers had been born and radicalised in the UK, so no reinforcements at the borders, whether at Calais or in Kent, would have kept them from carrying out their atrocities. But the need for secure borders is not simply a symbolic or psychological reassurance for the British people if it is combined with a workable and efficient asylum policy. We are repeatedly told that one of the two main reasons for the 2016 referendum decision for Britain to leave the EU was to take back control of its borders and immigration policy, though it was never demonstrated exactly how it had lost control of these, or at least how its EU membership had made it lose control over them.


There are already signs that, as much due to the fall in the value of the pound since Brexit as to Brexit itself, many Eastern European migrants are returning to their home countries, but the vast majority of them had already declared that they did not intend to settle permanently in the UK. The fact that so many came from 2004 onwards was entirely down to the decision of the British government not to delay or derogate the operation of the accession treaties. But the reality remains that, even if they were to be replaced by other European ‘immigrants’ in future, the UK would still need to control, as ever, the immigration of people from outside the EU, including asylum seekers, and that returning failed or bogus applicants would become more difficult. So, too, would the sharing of intelligence information about the potential threats of terrorists attempting to enter Britain as bogus refugees. Other than these considerations, the home-grown threat from Islamist terrorists is likely to be unaffected by Brexit one way or another, and can only be dealt with by anti-radicalisation strategies, especially through education and more active inter-cultural community relations aimed at full integration, not ‘parallel’ development.

‘Populism’

Since the Brexit referendum in 2016 and the election of Donald Trump, it seems that journalists just cannot get enough of populism. In 1998, the Guardian published about three hundred articles that contained the term. In 2015, it was used in about a thousand articles, and one year later this number had doubled to almost two thousand. Populist parties have tripled their vote across Europe over the past twenty years, and more than a quarter of Europeans voted populist in their last elections. So, in deciding to leave the EU, the British are, ironically, becoming more like their continental cousins in supporting populist causes and parties. In a recent article in The Guardian Weekly (30 November 2018), Fintan O’Toole, a columnist for The Irish Times, points out that for many pro-Brexit journalists and politicians Brexit takes the form of a populist ‘Britain alone’ crusade (see the picture and text below) which has been endemic in Britain’s political discourse about Europe since it joined ‘the common market’ in 1973:

Europe’s role in this weird psychodrama is entirely pre-scripted. It doesn’t greatly matter what the European Union is or what it is doing – its function in the plot is to be a more insidious form of Nazism. This is important to grasp, because one of the key arguments in mainstream pro-Brexit political and journalistic discourse would be that Britain had to leave because the Europe it had joined was not the Europe it found itself part of in 2016…

… The idea of Europe as a soft-Nazi superstate was vividly present in 1975, even when the still-emerging EU had a much weaker, less evolved and less intrusive form…

Yet what brings these disparate modes together is the lure of self-pity, the weird need to dream England into a state of awful oppression… Hostility to the EU thus opens the way to a bizarre logic in which a Nazi invasion would have been, relatively speaking, welcome…

It was a masochistic rhetoric that would return in full force as the Brexit negotiations failed to produce the promised miracles.


Certainly, the rejection of Mrs May’s deal in the House of Commons by large numbers of ‘Brexiteer’ MPs from her own Conservative Party was largely, by their own admission, because they felt they could not trust the assurances given by the Presidents of the Council and Commission of the European Union who were, some MPs stated, trying to trick them into accepting provisions which would tie the UK indefinitely to EU regulations. It is undoubtedly true that the British people mostly don’t want to spend any more time arguing about Brexit. But when ‘leavers’ and ‘remainers’ are united only in disliking Mrs May’s solution, that offers no way forward. The Brexiteers can only offer a “managed no deal” as an alternative, which means just strapping on seat belts as your car heads for the cliff edge. Brexit has turned out to be an economic and political disaster already, fuelling, not healing the divisions in British society which have opened up over the last twenty years, and have widened into a chasm in the last six years since the triumph of the London Olympics and the Diamond Jubilee Celebrations. The extent of this folly has grown clearer with each turn of the page. But the ending is not fully written.

Sources (for both parts):

The Guardian Weekly, 30 November 2018. London.

Douglas Murray (2018), The Strange Death of Europe: Immigration, Identity, Islam. London: Bloomsbury.

Simon Schama (2002), A History of Britain III: 1776-2000, The Fate of Empire. London: BBC Worldwide.

Andrew Marr (2009), A History of Modern Britain. London: Pan Macmillan.

John Morrill (ed.), (2001), The Penguin Atlas of British and Irish History. Harmondsworth: Penguin Books.

 

Posted January 16, 2019 by TeamBritanniaHu in Affluence, Africa, Arabs, Assimilation, asylum seekers, Australia, Balkan Crises, BBC, Brexit, Britain, British history, Britons, Brussels, Caribbean, Cartoons, Christian Faith, Christianity, Church, Colonisation, Commonwealth, Compromise, decolonisation, democracy, Demography, devolution, Discourse Analysis, Education, Empire, English Language, Europe, European Economic Community, European Union, Factories, Germany, History, Home Counties, Humanitarianism, Hungary, Immigration, India, Integration, Iraq, Ireland, Jews, Journalism, Labour Party, liberalism, Midlands, Migration, multiculturalism, multilingualism, Mythology, New Labour, Population, populism, Reconciliation, Refugees, Respectability, Satire, Second World War, terror, terrorism, United Kingdom, United Nations, West Midlands, World War Two, xenophobia


You Only Live Twice – Cool Britannia to Cold Brexit: The United Kingdom, 1999-2019. Part One: Economics, Culture & Society.   Leave a comment


Cold Shoulder or Warm Handshake?

On 29 March 2019, the United Kingdom of Great Britain and Northern Ireland will leave the European Union after forty-six years of membership, since it joined the European Economic Community on 1 January 1973 on the same day and hour as the Republic of Ireland. Yet in 1999, it looked as if the long-standing debate over Britain’s membership had been resolved. The Maastricht Treaty establishing the European Union had been signed by all the member states of the preceding European Community in February 1992 and was succeeded by a further treaty, signed in Amsterdam in 1997 and in force from 1999. What, then, has happened in the space of twenty years to so fundamentally change the ‘settled’ view of the British Parliament and people, bearing in mind that both Scotland and Northern Ireland voted to remain in the EU, while England and Wales both voted to leave? At the time of writing, the manner of our going has not yet been determined, but the invocation of ‘article fifty’ by the Westminster Parliament and the UK government means that the date has been set. So either we will have to leave without a deal, turning a cold shoulder to our erstwhile friends and allies on the continent, or we will finally ratify the deal agreed with the EU Commission, acting on behalf of the twenty-seven remaining member states, and leave with a warm handshake and most of our trading and cultural relations intact.

As yet, the possibility of a second referendum – or third, if we take into account the 1975 referendum, called by Harold Wilson (above) which was also a binary leave/ remain decision – seems remote. In any event, it is quite likely that the result would be the same and would kill off any opportunity of the UK returning to EU membership for at least another generation. As Ian Fleming’s James Bond tells us, ‘you only live twice’. That certainly seems to be the mood in Brussels too. I was too young to vote in 1975 by just five days, and another membership referendum would be unlikely to occur in my lifetime. So much has been said about following ‘the will of the people’, or at least 52% of them, that it would be a foolish government, in an age of rampant populism, that chose to revoke article fifty, even if Westminster voted for this. At the same time, and in that same populist age, we know from recent experience that in politics and international relations, nothing is inevitable…


One of the major factors in the 2016 Referendum Campaign was the country’s public spending priorities, compared with those of the European Union. The ‘Leave’ campaign sent a double-decker bus around England stating that by ending the UK’s payments into the EU, more than 350 million pounds per week could be redirected to the National Health Service (NHS).

A British Icon Revived – The NHS under New Labour:

To understand the power of this statement, it is important to recognise that the NHS is unique in Europe in that it is wholly funded from direct taxation, and not via National Insurance, as in many other European countries. As a service created in 1948 to be ‘free at the point of delivery’, it is seen as a ‘British icon’ and funding has been a central issue in national election campaigns since 2001, when Tony Blair was confronted by an irate voter, Sharon Storer, outside a hospital. In its first election manifesto of 1997, ‘New Labour’ promised to safeguard the basic principles of the NHS, which we founded. The ‘we’ here was the post-war Labour government, whose socialist Health Minister, Aneurin Bevan, had established the service in the teeth of considerable opposition from within both parliament and the medical profession. ‘New Labour’ protested that under the Tories there had been fifty thousand fewer nurses but a rise of no fewer than twenty thousand managers – red tape which Labour would pull away and burn. Though critical of the internal markets the Tories had introduced, Blair promised to keep a split between those who commissioned health services and those who provided them.


Under Frank Dobson, Labour’s new Health Secretary, there was little reform of the NHS, but there was, year by year, just enough extra money to stave off the winter crises. But then a series of tragic individual cases hit the headlines, and one of them came from a Labour peer and well-known medical scientist and fertility expert, Professor Robert Winston, who was greatly admired by Tony Blair. He launched a furious denunciation of the government over the treatment of his elderly mother. Far from upholding the NHS’s iconic status, Winston said that Britain’s health service was the worst in Europe and was getting worse under the New Labour government, which was being deceitful about the true picture. Labour’s own polling on the issue showed that the country as a whole broadly shared Winston’s assessment. In January 2000, therefore, Blair announced directly to the nation that he would bring Britain’s health spending up to the European average within five years. That was a huge promise because it meant spending a third as much again in real terms, and his ‘prudent’ Chancellor of the Exchequer, Gordon Brown, was unhappy that Blair had not spoken enough on television about the need for health service reform to accompany the money, and had also ‘stolen’ his budget announcements. On Budget day itself, Brown announced that until 2004 health spending would rise at above six per cent beyond inflation every year, …

… by far the largest sustained increase in NHS funding in any period in its fifty-year history … half as much again for health care for every family in this country.       

The tilt away from Brown’s sharp spending controls during the first three years of the New Labour government had begun by the first spring of the new millennium, and there was more to come. With a general election looming in 2001, Brown also announced a review of the NHS and its future by a former banker. As soon as the election was over, broad hints about necessary tax rises were dropped. When the Wanless Report was finally published, it confirmed much that the winter crisis of 1999-2000 had exposed. The NHS was not, whatever Britons fondly believed, better than health systems in other developed countries, and it needed a lot more money. ‘Wanless’ also rejected a radical change in funding, such as a switch to insurance-based or semi-private health care. Brown immediately used this as objective proof that taxes had to rise in order to save the NHS. In his next budget, in 2002, Brown broke with a political convention which had reigned since the mid-eighties: that direct taxes would not be raised again. He raised a special one per cent national insurance levy, equivalent to a penny on income tax, to fund the huge reinvestment in Britain’s health.

Public spending shot up with this commitment and, in some ways, it paid off, since by 2006 there were around 300,000 extra NHS staff compared to 1997. That included more than ten thousand extra senior hospital doctors (about a quarter more) and 85,000 more nurses. But there were also nearly forty thousand managers, twice as many as Blair and Brown had ridiculed the Tory government for hiring. An ambitious computer project for the whole NHS became an expensive catastrophe. Meanwhile, the health service budget rose from thirty-seven billion to more than ninety-two billion pounds a year. But the investment produced results, with waiting lists, a source of great public anger from the mid-nineties, falling by 200,000. By 2005, Blair was able to talk of the best waiting list figures since 1988. Hardly anyone was left waiting for an inpatient appointment for more than six months. Death rates from cancer for people under the age of seventy-five fell by 15.7 per cent between 1996 and 2006, and death rates from heart disease fell by just under thirty-six per cent. Meanwhile, the private finance initiative meant that new hospitals were being built around the country. But, unfortunately for New Labour, that was not the whole story of the Health Service under their stewardship. As Andrew Marr has attested,

…’Czars’, quangos, agencies, commissions, access teams and planners hunched over the NHS as Whitehall, having promised to devolve power, now imposed a new round of mind-dazing control.

By the autumn of 2004, hospitals were subject to more than a hundred inspections. War broke out between Brown and the Treasury on one side and the ‘Blairite’ Health Secretary, Alan Milburn, on the other, over the basic principles of running the hospitals. Milburn wanted more competition between them, but Brown did not see how this was possible when most people had only one major local hospital. Polling suggested that Brown was making a popular point: most people simply wanted better hospitals, not more choice. A truce was eventually declared with the establishment of a small number of independent, ‘foundation’ hospitals. By the 2005 general election, Michael Howard’s Conservatives were attacking Labour for wasting money and allowing people’s lives to be put at risk in dirty, badly run hospitals. Just as Labour once had, they were promising to cut bureaucracy and the number of organisations within the NHS. By the summer of 2006, despite the huge injection of funds, the Service was facing a cash crisis. Although the shortfall was not huge as a percentage of the total budget, trusts in some of the most vulnerable parts of the country were on the edge of bankruptcy, from Hartlepool to Cornwall and across to London. Throughout Britain, seven thousand jobs had gone, and the Royal College of Nursing, the professional association to which most nurses belonged, was predicting thirteen thousand more would go soon. Many newly and expensively qualified doctors and even specialist consultants could not find work. It seemed that wage costs, expensive new drugs, poor management and the money poured into endless bureaucratic reforms had resulted in a still inadequate service. Bupa, the leading private operator, had been covering some 2.3 million people in 1999; six years later, the figure was more than eight million. This partly reflected greater affluence, but it was hardly a resounding vote of confidence in Labour’s management of the NHS.

Public Spending, Declining Regions & Economic Development:

As public spending had begun to flow during the second Blair administration, vast amounts of money had gone on pay rises, new bureaucracies and bills for outside consultants. Ministries, unused to spending after the initial period of ‘prudence’, did not always do it well. Brown and his Treasury team resorted to double and triple counting of early spending increases in order to give the impression they were doing more for hospitals, schools and transport than they actually could. As Marr has pointed out, …

… In trying to achieve better policing, more effective planning, healthier school food, prettier town centres and a hundred other hopes, the centre of government ordered and cajoled, hassled and harangued, always high-minded, always speaking for ‘the people’.  

The railways, after yet another disaster, were shaken up again. In very controversial circumstances Railtrack, the once-profitable monopoly company operating the lines, was driven to bankruptcy and a new system of Whitehall control was imposed. At one point, Tony Blair boasted of having five hundred targets for the public sector. Parish councils, small businesses and charities found that they were loaded with directives. Schools and hospitals had many more. Marr has commented, …

The interference was always well-meant but it clogged up the arteries of free decision-taking and frustrated responsible public life. 

002

Throughout the New Labour years, with steady growth and low inflation, most of the country grew richer. Growth since 1997, at 2.8 per cent per year, was above the post-war average, GDP per head was above that of France and Germany, and the country had the second lowest jobless figures in the EU. The number of people in work increased by 2.4 million. Incomes grew, in real terms, by about a fifth. Pensions were in trouble, but house price inflation soared, so that owners found their properties more than doubling in value and came to think of themselves as prosperous. By 2006 analysts were assessing the disposable wealth of the British at forty thousand pounds per household. However, the wealth was not spread evenly geographically, averaging sixty-eight thousand in the south-east of England but a little over thirty thousand in Wales and north-east England (see map above). Even in the historically poorer parts of the UK, house prices had risen fast, so much so that government plans to bulldoze worthless northern terraces had to be abandoned when the houses started to regain value. Cheap mortgages, easy borrowing and high property prices meant that millions of people felt far better off, despite the overall rise in the tax burden. Cheap air travel gave the British opportunities for easy travel both to traditional resorts and to every part of the European continent. British expatriates were able to buy properties across the French countryside and in southern Spain. Some even began to commute weekly to jobs in London or Manchester from Mediterranean villas, and regional airports boomed as a result.

Sir Tim Berners-Lee arriving at the Guildhall to receive the Honorary Freedom of the City of London

The World-Wide Web, ‘invented’ by the British computer scientist Tim Berners-Lee at the end of 1989 (pictured right in 2014), was carrying the internet from the colleges and institutions into everyday life by the mid-‘noughties’. It first began to attract popular interest in the mid-nineties: Britain’s first internet café and magazine, reviewing a few hundred early websites, were both launched in 1994. The following year saw the beginning of internet shopping as a major pastime, with both ‘eBay’ and ‘Amazon’ arriving, though to begin with they attracted only tiny numbers of people.

But the introduction of new forms of mail-order and ‘click and collect’ shopping quickly attracted significant adherents from different ‘demographics’. The growth of the internet led to a feeling of optimism, despite warnings, taken seriously at the time, that the whole digital world would collapse because older computers stored years as two digits and could not cope with the rollover to the year 2000. In fact, the ‘dot-com’ bubble was burst by its own excessive expansion, as with any bubble, and following a pause and a lot of ruined dreams, the ‘new economy’ roared on again. By 2000, according to the Office of National Statistics (ONS), around forty per cent of Britons had accessed the internet at some time. Three years later, nearly half of British homes were ‘online’. By 2004, the spread of ‘broadband’ connections had brought a new mass market in ‘downloading’ music and video. By 2006, three-quarters of British children had internet access at home.

001

Simultaneously, the rich of America, Europe and Russia began buying up parts of London, and then other ‘attractive’ parts of the country, including Edinburgh, the Scottish Highlands, Yorkshire and Cornwall. ‘Executive housing’ with pebbled driveways, brick facing and dormer windows, was growing across farmland and by rivers with no thought of flood-plain constraints. Parts of the country far from London, such as the English south-west and Yorkshire, enjoyed a ripple of wealth that pushed their house prices to unheard-of levels. From Leith to Gateshead, Belfast to Cardiff Bay, once-derelict shorefront areas were transformed. The nineteenth-century buildings in the Albert Dock in Liverpool (above) now house a maritime museum, an art gallery, shopping centre and television studio. It has also become a tourist attraction. For all the problems and disappointments, and the longer-term problems with their financing, new schools and public buildings sprang up – new museums, galleries, vast shopping complexes (see below), corporate headquarters in a biomorphic architecture of glass and steel, more imaginative and better-looking than their predecessors from the dreary age of concrete.

002

Supermarket chains exercised huge market power, bringing cheap meat and dairy products within almost everyone’s budget. Factory-made ready-meals were imported through the new global air-freight market and carried by refrigerated trucks and lorries moving freely across a Europe shorn of internal barriers. Out-of-season fruit and vegetables, fish from the Pacific, exotic foods of all kinds and freshly cut flowers appeared in superstores everywhere. Hardly anyone was out of reach of a ‘Tesco’, a ‘Morrison’s’, a ‘Sainsbury’s’ or an ‘Asda’. By the mid-noughties, the four supermarket giants owned more than 1,500 superstores throughout the UK. They spread the consumption of goods that in the eighties and nineties had seemed like luxuries. Students now had to take out loans in order to go to university, but they were far more likely to attend than previous generations, as well as to travel more widely on a ‘gap’ year, not just to study or work abroad.

Those ‘Left Behind’ – Poverty, Pensions & Public Order:

Materially, for the majority of people, this was, to use Marr’s term, a ‘golden age’, which perhaps helps to explain why real anger about pension decisions and stealth taxes did not translate into anti-Labour voting in successive general elections. The irony is that in pleasing ‘Middle Englanders’, the Blair-Brown government lost contact with traditional Labour voters, especially in the North of Britain, who did not benefit from these ‘golden years’ to the same extent. Gordon Brown, from the first, made much of New Labour’s anti-poverty agenda, and especially child poverty. Since the launch of the Child Poverty Action Group, this problem had become particularly emotive. Labour policies took a million children out of relative poverty between 1997 and 2004, though the numbers rose again later. Brown’s emphasis was on the working poor and the virtue of work, so his major innovations were the national minimum wage, the ‘New Deal’ for the young unemployed, and the working families’ tax credit, as well as tax credits aimed at children. There was also a minimum income guarantee and, later, a pension credit for poorer pensioners.

The minimum wage was first set at three pounds sixty an hour, rising year by year; in 2006 it was £5.35 an hour. Because the figures were low, it did not destroy two million jobs, as the Tories had claimed it would. Neither did it produce higher inflation; employment continued to grow while inflation remained low. It even seemed to have cut red tape. By the mid-noughties, the minimum wage covered two million people, the majority of them women. Because it was updated ahead of rises in inflation rates, the wages of the poor also rose faster. It was so successful that even the Tories were forced to embrace it ahead of the 2005 election. The New Deal was funded by a windfall tax on privatised utility companies, and by 2000 Blair said it had helped a quarter of a million young people back into work; it was still being claimed as a major factor in lower rates of unemployment as late as 2005. But the National Audit Office, looking back on its effect in the first parliament, reckoned the number of under twenty-five-year-olds helped into real jobs was as low as 25,000, at a cost per person of eight thousand pounds. A second initiative was targeted at the babies and toddlers of the most deprived families. ‘Sure Start’ was meant to bring mothers together in family centres across Britain – 3,500 were planned for 2010, ten years after the scheme had been launched – and to help them to become more effective parents. However, some of the most deprived families failed to show up. As Andrew Marr wrote, back in 2007:

Poverty is hard to define, easy to smell. In a country like Britain, it is mostly relative. Though there are a few thousand people living rough or who genuinely do not have enough to keep them decently alive, and many more pensioners frightened of how they will pay for heating, the greater number of poor are those left behind the general material improvement in life. This is measured by income compared to the average and by this yardstick in 1997 there were three to four million children living in households of relative poverty, triple the number in 1979. This does not mean they were physically worse off than the children of the late seventies, since the country generally became much richer. But human happiness relates to how we see ourselves relative to those around us, so it was certainly real. 

The Tories, now under new management in the shape of a media-marketing executive and old Etonian, David Cameron, also declared that they believed in this concept of relative poverty. After all, it was on their watch, during the Thatcher and Major governments, that it had tripled; only towards the end of the New Labour governments could they accept the definition used by the left-of-centre Guardian columnist, Polly Toynbee. A world of ‘black economy’ work also remained below the minimum wage, in private care homes, where migrant servants were exploited, and in other nooks and crannies. Some 336,000 jobs remained on ‘poverty pay’ rates. Yet ‘redistribution of wealth’, a socialist phrase which had become unfashionable under New Labour lest it should scare away middle Englanders, was stronger in Brown’s Britain than in other major industrialised nations. Despite the growth of the super-rich, many of whom were immigrants anyway, overall equality increased in these years. One factor in this was the return to the means-testing of benefits, particularly for pensioners and through the working families’ tax credit, subsequently divided into a child tax credit and a working tax credit. This was a U-turn by Gordon Brown, who had opposed means-testing when in Opposition. As Chancellor, he concluded that if he was to direct scarce resources at those in real poverty, he had little choice.

Apart from the demoralising effect it had on pensioners, the other drawback to means-testing was that a huge bureaucracy was needed to track people’s earnings and to try to establish exactly what they should be getting in benefits. Billions were overpaid and as people did better and earned more from more stable employment, they then found themselves facing huge demands to hand back the money they had already spent. Thousands of extra civil servants were needed to deal with the subsequent complaints and the scheme became extremely expensive to administer. There were also controversial drives to oblige more disabled people back to work, and the ‘socially excluded’ were confronted by a range of initiatives designed to make them more middle class. Compared with Mrs Thatcher’s Victorian Values and Mr Major’s Back to Basics campaigns, Labour was supposed to be non-judgemental about individual behaviour. But a form of moralism did begin to reassert itself. Parenting classes were sometimes mandated through the courts and for the minority who made life hell for their neighbours on housing estates, Labour introduced the Anti-Social Behaviour Order (‘Asbo’). These were first given out in 1998, granted by magistrates to either the police or the local council. It became a criminal offence to break the curfew or other sanction, which could be highly specific. Asbos could be given out for swearing at others in the street, harassing passers-by, vandalism, making too much noise, graffiti, organising ‘raves’, flyposting, taking drugs, sniffing glue, joyriding, prostitution, hitting people and drinking in public.

001 (2)

Although they served a useful purpose in many cases, there were fears that, among the roughest elements in society and their children, Asbos became a badge of honour. Since breaking an Asbo could result in an automatic prison sentence, people were sent to jail for offences that had not previously warranted imprisonment. But as they were refined in use and strengthened, they became more effective and routine. By 2007, seven and a half thousand had been given out in England and Wales alone, and Scotland had introduced its own version in 2004. Some civil liberties campaigners saw this development as part of a wider authoritarian and surveillance agenda, which also led to the widespread use of CCTV (closed-circuit television) cameras by the police and private security guards, especially in town centres (see above). In 2007, it was estimated that the British were being observed and recorded by 4.2 million such cameras. That amounted to one camera for every fourteen people, a higher ratio than in any other country in the world, with the possible exception of China. In addition, the number of mobile phones was already equivalent to the number of people in Britain. With Global Positioning System (GPS) chips, these could show exactly where their users were, and the use of such systems in cars and even out on the moors meant that Britons were losing their age-old prowess for map-reading.

002003

The ‘Seven Seven’ Bombings – The Home-grown ‘Jihadis’:

Despite these increasing means of mass surveillance, Britain’s cities have remained vulnerable to terrorist attacks, more recently by so-called ‘Islamic terrorists’ rather than by the Provisional IRA, who abandoned their bombing campaign in 1998. On 7 July 2005, at rush-hour, four young Muslim men from West Yorkshire and Buckinghamshire murdered fifty-two people and injured 770 others by blowing themselves up on London Underground trains and on a London bus. The report into this, the worst such attack in Britain, later concluded that they were not part of an al Qaeda cell, though two of them had visited camps in Pakistan, and that the rucksack bombs had been constructed at a cost of a few hundred pounds. Despite the government’s insistence that the war in Iraq had not made Britain more of a target for terrorism, the Home Office investigation asserted that the four had been motivated, in part at least, by ‘British foreign policy’.

They had picked up the information they needed for the attack from the internet. It was a particularly grotesque attack, because of the terrifying and bloody conditions in the underground tunnels, and it vividly reminded the country that it was as much a target as the United States or Spain. Indeed, the long-standing and intimate relationship between Great Britain and Pakistan, with constant and heavy air traffic between them, provoked fears that the British would prove uniquely vulnerable. Tony Blair heard of the attack at the most poignant time, just following London’s great success in winning the bid to host the 2012 Olympic Games (see above). The ‘Seven Seven’ bombings are unlikely to have been stopped by CCTV surveillance, of which there was plenty at the tube stations, nor by ID cards (which had recently been under discussion), since the killers were British subjects, nor by financial surveillance, since little money was involved and the materials were paid for in cash. Even better intelligence might have helped, but the Security Services, ‘MI5’ and ‘MI6’ as they are known, were already in receipt of huge increases in their budgets as they tracked down other murderous cells. In August 2006, police arrested suspects in Birmingham, High Wycombe and Walthamstow, in east London, believing there was a plot to blow up as many as ten passenger aircraft over the Atlantic.

After many years of granting asylum in London to dissident clerics and activists from the Middle East, Britain had more than its share of inflammatory and dangerous extremists who admired al Qaeda and preached violent jihad. Once 11 September 2001 had changed the climate, new laws were introduced to allow the detention without trial of foreigners suspected of supporting or fomenting terrorism. They could not be deported because human rights legislation forbade sending anyone back to countries where they might face torture. Seventeen were picked up and held at Belmarsh high-security prison. But in December 2004, the House of Lords ruled that these detentions were discriminatory and disproportionate, and therefore illegal. Five weeks later, the Home Secretary, Charles Clarke, hit back with ‘control orders’ to limit the movement of men he could not prosecute or deport. These orders would also be used against home-grown terror suspects. A month later, in February 2005, sixty Labour MPs rebelled against these powers too, and the government only narrowly survived the vote. In April 2006 a judge ruled that the control orders were an affront to justice because they gave the Home Secretary, a politician, too much power. Two months later, the same judge ruled that curfew orders of eighteen hours per day on six Iraqis were a deprivation of liberty and also illegal. The new Home Secretary, John Reid, lost his appeal and had to loosen the orders.

006

Britain found itself in a struggle between its old laws and liberties and a new, borderless world in which the hallowed principles of ‘habeas corpus’, free speech, the presumption of innocence, asylum, the right of British subjects to travel freely in their own country without identifying papers, and the sanctity of the law-abiding home were all placed in increasing jeopardy. The new political powers seemed to government ministers the least that they needed to deal with a threat that might last for another thirty years, in order, paradoxically, to secure Britain’s liberties for the long term beyond it. They were sure that most British people agreed, and that the judiciary, media, civil rights campaigners and elected politicians who protested were an ultra-liberal minority. Tony Blair, John Reid and Jack Straw were emphatic about this, and it was left to liberal Conservatives and the Liberal Democrats to mount the barricades in defence of civil liberties. Andrew Marr conceded at the time that the New Labour ministers were ‘probably right’. With the benefit of hindsight, others will probably agree.

As Gordon Brown eyed the premiership, his rhetoric was similarly tough, but as Blair was forced to turn to the ‘war on terror’ and Iraq, he failed to concentrate enough on domestic policy. By 2005, neither man could be bothered to disguise their mutual enmity, as pictured above. A gap seemed to open up between Blair’s enthusiasm for market ideas in the reform of health and schools and Brown’s determination to deliver better lives for the working poor. Brown was also keen on bringing private capital into public services, but there was a difference in emphasis which both men played up. Blair claimed that the New Labour government was ‘best when we are at our boldest’. Brown retorted that it was ‘best when we are Labour’.

002 (2)

Tony Blair’s legacy continued to be paraded on the streets of Britain,

here blaming him and George Bush for the rise of ‘Islamic State’ in Iraq.

Asylum Seekers, EU ‘Guest’ Workers & Immigrants:

One result of the long Iraqi conflict, in which President Bush declared major combat operations over on 1 May 2003, was the arrival of many Iraqi asylum-seekers in Britain: Kurds as well as Shiites and Sunnis. This attracted little comment at the time, because there had been both Iraqi and Iranian refugees in Britain since the 1970s, especially as students, and the fresh influx was only a small part of a much larger migration into the country which changed it fundamentally during the Blair years. This was a multi-lingual migration, including many Poles, some Hungarians and other Eastern Europeans whose countries had joined the EU and its single market in 2004. When the EU expanded, Britain decided that, unlike France or Germany, it would not try to delay opening the country to migrant workers. The accession treaties gave nationals from these countries the right to freedom of movement and settlement, and with average earnings three times higher in the UK, this was a benefit which the Eastern Europeans were keen to take advantage of. Some member states, however, exercised their right to ‘derogation’ from the treaties, whereby they would only permit migrant workers to be employed if employers were unable to find a local candidate. In European Union legislation, a derogation means that a member state has opted not to apply a specific provision of a treaty, owing to internal circumstances (typically a state of emergency), or to delay its full implementation, in this case for up to five years. The UK decided not to exercise this option.

There were also sizeable inflows of western Europeans, though these were mostly students, who (somewhat controversially) were also counted in the immigration statistics, and young professionals with multi-national companies. At the same time, there was continued immigration from Africa, the Middle East and Afghanistan, as well as from Russia, Australia, South Africa and North America. In 2005, according to the Office for National Statistics, ‘immigrants’ were arriving to live in Britain at the rate of 1,500 a day. Since Tony Blair had been in power, more than 1.3 million had arrived. By the mid-2000s, English was no longer the first language of half the primary school children in London, and the capital had more than 350 different first languages. Five years later, the same could be said of many towns in Kent and other Eastern counties of England.

The poorer of the new migrant groups were almost entirely unrepresented in politics, but radically changed the sights, sounds and scents of urban Britain, and even some of its market towns. The veiled women of the Muslim world, or of its more traditionalist Arab, Afghan and Pakistani quarters, became common sights on the streets, from Kent to Scotland and across to South Wales. Polish tradesmen, fruit-pickers and factory workers were soon followed by shops owned by Poles or stocking Polish and East European delicacies and selling Polish newspapers and magazines. Even road signs appeared in Polish, though in Kent these were mainly put in place along trucking routes used by Polish drivers, where for many years signs had been in French and German, a recognition of the employment changes in the long-distance haulage industry. Even as far north as Cheshire (see below), these were put in place to help monolingual truckers using trunk roads, rather than local Polish residents, most of whom had enough English to understand such signs either upon arrival or shortly afterwards. Although specialist classes in English had to be laid on in schools and community centres, there was little evidence that multi-lingual migrants had any negative long-term impact on local children and wider communities. In fact, schools were soon reporting a positive effect on attitudes toward learning and on general educational standards.

001

Problems were posed, however, by the operations of people-smugglers and criminal gangs. Chinese villagers were involved in a particular tragedy when nineteen of them, picking cockles in Morecambe Bay, were caught by its notorious tides and drowned. Many more were working for ‘gang-masters’ as virtual, and in some cases actual, ‘slaves’. Russian voices became common on the London Underground, and among prostitutes on the streets. The British Isles found themselves to be ‘islands in the stream’ of international migration, the chosen ‘sceptred isle’ destinations of millions of newcomers. Unlike Germany, Britain was no longer a dominant manufacturing country but had rather become, by the late twentieth century, a popular place to develop digital and financial products and services. Together with the United States, it had stood against the Soviet Union in defence of representative democracy and the free market. Within the EU, Britain maintained its earlier determination to resist the Franco-German federalist model, with its ‘social chapter’ involving ever tighter controls over international corporations and ever closer political union. Britain had always gone out into the world. Now, increasingly, the world came to Britain, whether poor immigrants, rich corporations or Chinese manufacturers.

005

Multilingual & Multicultural Britain:

Immigration had always been a constant factor in British life; now it was also a fact of life which Europe and the whole world had to come to terms with. Earlier post-war migrations to Britain had provoked a racialist backlash, riots and the rise of extreme right-wing organisations, and new laws had been passed to control both immigration from the Commonwealth and the backlash to it. The later migrations were controversial in different ways. The ‘Windrush’ arrivals from the Caribbean and those from the Indian subcontinent were people who looked different but who spoke the same language and in many ways had had a similar education to that of the ‘native’ British. Many of the later migrants from Eastern Europe looked similar to the white British but shared little by way of a common linguistic and cultural background. However, it is not entirely true to suggest, as Andrew Marr seems to, that they did not have a shared history. Certainly, through no fault of their own, the Eastern Europeans had been cut off from their western counterparts by their absorption into the Soviet Russian Empire after the Second World War, but in the first half of the century Poland had helped the British Empire to subdue its greatest rival, Germany, as had most of the peoples of the former Yugoslavia. Even during the Soviet ‘occupation’ of these countries, many of their citizens had found refuge in Britain.

Moreover, by the early 1990s, Britain had already become a multilingual nation. In 1991, Safder Alladina and Viv Edwards published a book for the Longman Linguistics Library which detailed the Hungarian, Lithuanian, Polish, Ukrainian and Yiddish speech communities of previous generations. Growing up in Birmingham, I certainly heard many Polish, Yiddish, Yugoslav and Greek accents among my neighbours and the parents of school friends, at least as often as I heard Welsh, Irish, Caribbean, Indian and Pakistani accents. The Longman book begins with a foreword by Debi Prasanna Pattanayak, in which she stated that the Language Census of 1987 had shown that there were 172 different languages spoken by children in the schools of the Inner London Education Authority. In an interesting precursor of the controversy to come, she related how the reaction in many quarters was stunned disbelief, and how one British educationalist had told her that England had become a third-world country. She commented:

After believing in the supremacy of English as the universal language, it was difficult to acknowledge that the UK was now one of the greatest immigrant nations of the modern world. It was also hard to see that the current plurality is based on a continuity of heritage. … Britain is on the crossroads. It can take an isolationist stance in relation to its internal cultural environment. It can create a resilient society by trusting its citizens to be British not only in political but in cultural terms. The first road will mean severing dialogue with the many heritages which have made the country fertile. The second road would be working together with cultural harmony for the betterment of the country. Sharing and participation would ensure not only political but cultural democracy. The choice is between mediocrity and creativity.

002

Language and dialect in the British Isles, showing the linguistic diversity in many English cities by 1991 as a result of Commonwealth immigration as well as the survival and revival of many of the older Celtic languages and dialects of English.

Such ‘liberal’, ‘multi-cultural’ views may be unfashionable now, more than a quarter of a century later, but it is perhaps worth stopping to look back on that cultural crossroads, and on whether we are now back at that same crossroads or have arrived at another one. By the 1990s, the multilingual setting in which new Englishes evolved had become far more diverse than it had been in the 1940s, due to immigration from the Indian subcontinent, the Caribbean, the Far East, and West and East Africa. The largest of the ‘community languages’ was Punjabi, with over half a million speakers, but there were also substantial communities of Gujarati speakers (perhaps a third of a million) and a hundred thousand Bengali speakers. In some areas, such as East London, public signs and notices recognised this (see below). Bengali-speaking children formed the most recent and largest linguistic minority within the ILEA and, because the majority of them had been born in Bangladesh, they were inevitably in the greatest need of language support within the schools.

003

007

Birmingham’s booming postwar economy attracted West Indian settlers from Jamaica, Barbados and St Kitts in the 1950s. By 1971, the South Asian and West Indian populations were equal in size and concentrated in the inner city wards of North and Central Birmingham (see the map above). After the hostility shown towards New Commonwealth immigrants by some sections of the local white population in the 1960s and ’70s, the settlers had become more established in cities like Birmingham, where places of worship, ethnic groceries, butchers and, perhaps most significantly, ‘balti’ restaurants began to proliferate in the 1980s and ’90s. The settlers materially changed the cultural and social life of the city, and most of the ‘white’ population believed that these changes were for the better. By 1991, Pakistanis had overtaken West Indians and Indians to become the largest single ethnic minority in Birmingham. The concentration of West Indian and South Asian British people in the inner city areas changed little by the end of the century, though there was an evident flight to the suburbs by Indians. The factory work available to South Asian immigrants, like the man in a Bradford textile factory below, was poorly paid as well as unskilled. By the early nineties, the decline of the textile industry over the previous two decades had led to high long-term unemployment in the immigrant communities of the northern towns, leading to serious social problems.

006

Nor is it entirely true to suggest, as above, that Caribbean arrivals in Britain faced few linguistic obstacles in integrating themselves into British life from the late 1940s to the late 1980s. By the end of those forty years, the British West Indian community had developed its own “patois”, which had a special place as a token of identity. One Jamaican schoolgirl living in London in the late eighties explained the social pressures that frowned on Jamaican Creole in Jamaica but made it almost obligatory in London. She had not been allowed to speak Creole in front of her parents in Jamaica. When she arrived in Britain and went to school, she naturally tried to fit in by speaking the same patois, but some of her British Caribbean classmates told her that, as a “foreigner”, she should not try to be like them and should speak only English. She persevered with the patois, however, lost her British accent within a year and was accepted by her classmates. Yet for many Caribbean visitors to Britain, the patois of Brixton and Notting Hill was a stylized form that was not truly Jamaican, not least because British West Indians had come from all parts of the Caribbean. When another British West Indian girl, born in Britain, was taken to visit Jamaica, she found herself being teased about her London patois and told to speak English.

003

The predicament that still faced the ‘Black British’ in the late eighties and into the nineties was that, for all the rhetoric, they were still not fully accepted by the established ‘White community’. Racism remained an everyday reality for large numbers of British people, and there was plenty of evidence of the ways in which Black people were systematically denied access to employment in all sections of the job market. That a racist calamity like the murder in London of the black teenager Stephen Lawrence could happen in 1993 was testimony to how little British society’s willingness to face up to racism had changed since the 1950s. As a result, the British-Caribbean population could still not feel itself to be fully British. This was the poignant outcome of what the British Black writer Caryl Phillips has called “The Final Passage”, the title of his novel, which is narrated in Standard English with the direct speech of the characters rendered in Creole. Phillips migrated to Britain as a baby with his parents in the 1950s, and sums up his linguistic and cultural experience as follows:

“The paradox of my situation is that where most immigrants have to learn a new language, Caribbean immigrants have to learn a new form of the same language. It induces linguistic schizophrenia – you have an identity that mirrors the larger cultural confusion.”

One of his older characters in The Final Passage characterises “England” as a “college for the West Indian”, and, as Phillips himself put it, that is “symptomatic of the colonial situation; the language is divided as well”. As the “Windrush Scandal”, involving the deportation of British West Indians from the UK, has recently shown, this post-colonial “cultural confusion” still ‘colours’ political and institutional attitudes twenty-five years after the death of Stephen Lawrence, leading to discriminatory judgements by officials. This example shows how difficult it is to arrive at any simple chronological classification of migrations to Britain into the economic expansion of the 1950s and 1960s, the asylum-seekers of the 1970s and 1980s, and the EU expansion and integration of the 1990s and the first decades of the 2000s. Such an approach assumed stereotypical patterns of settlement for the different groups, whereas the reality was much more diverse. Most South Asians, for example, arrived in Britain in the post-war period, but they were joining a migration ‘chain’ which had been established at the beginning of the twentieth century. Similarly, most Eastern European migrants arrived in Britain in several quite distinct waves of population movement. This led the authors of the Longman Linguistics book to organise it into geolinguistic areas, as shown in the figure below:

001

The Poles and Ukrainians of the immediate post-war period, the Hungarians in the 1950s, the Vietnamese refugees in the 1970s and the Tamils in the 1980s sought asylum in Britain as refugees. In contrast, settlers from India, Pakistan, Bangladesh and the Caribbean had, in the main, come from areas of high unemployment and/or low wages, for economic reasons. It was not possible, even then, to make a simple split between political and economic migrants since, even within the same group, motivations differed through time. The Eastern Europeans who had arrived in Britain since the Second World War had come for a variety of reasons; in many cases, they were joining earlier settlers, trying either to escape poverty in the home country or to better their lot. A further important factor in the discussion about the various minority communities in Britain was the pattern of settlement. Some groups were concentrated in a relatively small geographical area, which made it possible to develop and maintain strong social networks; others were more dispersed and so found it more difficult to maintain a sense of community. Most Spaniards, Turks and Greeks were found in London, whereas Ukrainians and Poles were scattered throughout the country. In the case of the Poles, the communities outside London were sufficiently large to sustain an active community life; in the case of the Ukrainians, however, the small numbers and the dispersed nature of the community made the task of forging a separate linguistic and cultural identity a great deal more difficult.

Groups who had little contact with the home country also faced very real difficulties in retaining their distinct identities. Until 1992, Lithuanians, Latvians, Ukrainians and Estonians were unable to travel freely to their countries of origin; neither could they receive visits from family members left behind; and until the mid-noughties there was no possibility of new immigration which might have revitalised these communities in Britain. Nonetheless, they showed great resilience in maintaining their ethnic identity, not only through community involvement in the UK but also by building links with similar groups in Europe and even in North America. The inevitable consequence of settlement in Britain was a shift from the mother tongue to English. The extent of this shift varied according to individual factors, such as the degree of identification with the mother tongue culture; it also depended on group factors, such as the size of the community, its degree of self-organisation and the length of time it had been established in Britain. For more recently arrived communities such as the Bangladeshis, the acquisition of English was clearly a more urgent priority than the maintenance of the mother tongue, whereas, for the settled Eastern Europeans, the shift to English was so complete that mother tongue teaching was often a more urgent community priority. There were reports of British-born Ukrainians and Yiddish-speaking Jews, brought up in predominantly English-speaking homes, who were striving to produce an environment in which their children could acquire their ‘heritage’ language.

Blair’s Open Door Policy & EU Freedom of Movement:

During the 1980s and ’90s, under the ‘rubric’ of multiculturalism, a steady stream of immigration into Britain continued, especially from the Indian subcontinent. But an unspoken consensus existed whereby immigration, while always gradually increasing, was controlled. What happened after the Labour Party’s landslide victory in 1997 was a breaking of that consensus, according to Douglas Murray, the author of the recent (2017) book, The Strange Death of Europe. He argues that once in power, Tony Blair’s government oversaw an opening of the borders on a scale unparalleled even in the post-war decades. His government abolished the ‘primary purpose rule’, which had been used to filter out bogus marriage applications. The borders were opened to anyone deemed essential to the British economy, a definition so broad that it included restaurant workers as ‘skilled labourers’. And as well as opening the door to the rest of the world, it opened the door to the new EU member states after 2004. It was the effects of all of this, and more, that created the picture of the country eventually revealed in the 2011 Census, published at the end of 2012.

004

The numbers of non-EU nationals moving to settle in Britain were expected to increase only modestly, from 100,000 a year in 1997 to 170,000 in 2004. In fact, the government’s predictions for the number of new arrivals over the five years 1999-2004 were out by almost a million people. It had also failed to anticipate that the UK might be an attractive destination for people from countries with significantly lower average income levels or without a minimum wage. For these reasons, the number of Eastern European migrants living in Britain rose from 170,000 in 2004 to 1.24 million in 2013. Whether the surge in migration went unnoticed or was officially approved, successive governments did not attempt to restrict it until after the 2015 election, by which time it was too late.

(to be continued)

Posted January 15, 2019 by TeamBritanniaHu in European Economic Community


Roots of Liberal Democracy, Part Four: Liberation & Democratic Transition in Hungary, 1988-2004.   Leave a comment

003

Goodbye János Kádár!

By the end of 1988, Mikhail Gorbachev had clearly abandoned the ‘Brezhnev doctrine’, under which the Soviet Union had undertaken to resort to military force in critical situations in the ‘eastern bloc’ countries. In other words, he intimated that the events of 1956 in Hungary, 1968 in Czechoslovakia and 1981 in Poland, where an invasion was only prevented by the announcement of martial law, would not be repeated. Kádár, the one-time pioneer of reforms in the bloc, was deeply disturbed by Gorbachev’s aspirations, for they now made any depth of reform possible, whereas the ones enacted up to 1985 in Hungary were the maximum he was willing to concede. It was rumoured among the broad segment of reformers in the party rank-and-file, whose expectations were heightened by Glasnost and Perestroika, that Gorbachev’s statements were being censored in Hungary as well as in the more rigid socialist countries. In the final stage of Kádár’s reforms in Hungary, ‘multiple candidacy’ was introduced for future general elections, allowing ‘independent’, non-party candidates to stand, with the result that ten per cent of the new parliament was composed of such deputies in 1985. Any further step in the opening up of the public sphere would have posed a fundamental challenge to the régime’s power base.

Supported by a faceless crowd of yes-men of his own age in the upper echelons of the party hierarchy, Kádár stubbornly denied any allegation that Hungary was in crisis. When he could no longer maintain this facade, in July 1987 he dropped his long-standing Prime Minister, György Lázár, replacing him with one of the several vigorous, relatively young figures who were biding their time in the lower echelons. Károly Grósz was the most characteristic representative of the new technocratic cadres, who were in favour of going forward with economic reforms without changing the political system. The policy of transition to a mixed economy based on mixed forms of property (state, co-operative and private) was therefore carried forward with the elimination of subsidised prices; the return, after four decades, of a two-level banking system; and the introduction of a new tax system, including progressive personal income tax. Grósz also continued the ‘openness’ policy towards the West by abolishing all travel restrictions, winning Gorbachev’s confidence in the process. The Soviet leader had no objection to getting rid of Kádár, who was aged, sick and tired in every sense of the word. As he lived out his days, the stage was set for a succession struggle.

Besides Grósz, the main contenders included Nyers, the architect of the 1968 economic reforms, and Imre Pozsgay, whose commitment to reform extended to the political sphere, in favour of democratisation. He was supported by a sizeable reform wing within the party, as well as by a group of social scientists who prepared under his protection a 1986 scenario for a transition to pluralism, Turning Point and Reform. In addition, Pozsgay communicated with a segment of the opposition led by ‘populist’ intellectuals. An investigation within the party and the expulsion of four prominent reformist intellectuals from it in the spring of 1988 were intended by the ‘old guard’ to deter the opposition within the party, but the measure missed its target. Then on 22 May 1988, Kádár’s long rule came to an abrupt end: the party conference elevated him to the entirely impotent post of Party Chairman, electing Grósz as Party Secretary in his place and completely reshuffling the Political Committee. By this time, the different opposition groups that had been germinating for a considerable period in the ‘secondary public sphere’ stepped forward into the primary one and started to develop as political parties, presenting the public with analyses of past and present communism, diagnoses of Hungary’s predicament, and antidotes to it, which proved to be more credible than the versions presented by officialdom.

From its inception in the late 1970s, the opposition that arose as a viable political alternative a decade later was distinguishable from the post-1968 dissidents both by their ideological orientation and their strategy. Instead of grafting pluralism and democracy onto Marxism, which the experience of 1956 had shown to be futile, they drew on the liberal-democratic and Christian national traditions; and instead of the similarly futile effort to represent these endeavours in the ‘primary’ public sphere, whose organs and institutions were dominated by the party, they created and maintained autonomous organisations. At the outset, these initiatives were confined to a few dozen individuals, maintaining contacts with a few hundred others among the intellectuals of research institutes, university departments, editorial offices and student circles. Through these, their views started to infiltrate the pages of the literary and social science journals of the ‘primary’ sphere that were testing the limits of free speech. From the mid-1980s on, some of them also developed contacts with reformers within the party. Of course, the authorities continued to possess detailed and up-to-date information about the activities of the opposition and the groups linked with it. But given the developing dialogue with the West and its increasing dependence on western loans, the régime could not afford to show its iron fist. Whenever the opposition made itself visible by coming out on the streets for alternative commemorations of the 1848 and 1956 Revolutions, arrests, detentions and beatings invariably followed, at least up to 1988. Otherwise, the régime contented itself with occasional harassment: sporadic searches, the confiscation of illegal publications, the rejection of travel permits, the censorship of writers and the replacement of editorial boards.

Far from being homogeneous, the opposition showed clear divisions from the outset, reflecting the old urban-populist divide, although the two camps maintained a co-operative dialogue until the eve of the transition process. The ‘populists’ identified national ‘questions of fate’ as their main commitment: the conditions of the Hungarian minorities in the neighbouring countries, types of social delinquency, demographic problems, the conditions of the Churches, the loosening of communal ties and the effects of communism on the national consciousness. The government’s neglect of these issues, especially the first, had prompted the emergence of these ‘populist’ nationalist trends, also at the end of the 1970s. From 1983, Sándor Csoóri became a dominant figure among the ‘populists’, with polemical writings combining the above-mentioned themes with a critique of the morally detrimental effects of socialism. New social science periodicals succeeded in outmanoeuvring censorship and discussing in a more objective manner an extensive range of sensitive themes: not just Stalinism and the 1956 Revolution, but also anti-Semitism, the condition of the Roma minority, poverty and the anomalies of the social security system. Both liberal democrats and populists established links with Hungarian émigré organisations in the West, benefiting in the shape of scholarships from the New York-based Open Society Foundation launched by the Hungarian-American businessman George Soros in 1982, which also opened a registered office in Budapest five years later.

In the first half of the 1980s, the endeavour of anti-communist co-operation dominated the relationship between the two camps of the opposition, so different in outlook. A conference was held at Monor in June 1985, whose speakers addressed and analysed the most pressing issues of the then generalised crisis. As the transformation of the system responsible for that crisis came onto the agenda, and programmes started to be worked out, the ways of the ‘urbanists’ and ‘populists’ parted. In June 1987, the programme of the democratic opposition was published, entitled ‘Social Contract’. Its authors were uncompromising in claiming that the current political leadership was unsuitable to guide the process, and their document concluded that Kádár must go. This was too radical for the populists, who envisaged a more gradual transition, with an active role for reform communists within it. As a result, the democratic opposition was not invited to the meeting of the ‘populist’ camp which took place at Lakitelek, near Kecskemét, where the Hungarian Democratic Forum (MDF) was founded on 27 September 1987, in the presence of Pozsgay and other reform Communists, as a recognised movement with the goal of transforming itself into a political party.

006

The Young ‘Liberal’ Democrat, Viktor Orbán, speaking at the re-interment of Imre Nagy in June 1989. These days, neither Liberal Democracy nor Nagy’s Social Democracy are any more fashionable for Orbán and his now ultra-Conservative party and government.

The Alliance of Young Democrats (FIDESZ), established on 30 March 1988, originally as an alternative to the Communist Youth League, endeavoured to supersede the urbanist-populist divide and submitted a programme in which a mixed economy, human rights, political pluralism and national values were given equal emphasis. At the same time, it also identified itself as a radical liberal initiative, and for some time during the ‘Transition’ it remained the closest political ally of the former democratic opposition. The ‘urbanist’ counterpart of the MDF was the Network of Free Initiatives, launched on 1 May 1988, which then developed into the Alliance of Free Democrats (SZDSZ) on 13 November that same year, after its hope of integrating most or all of the democratic opposition was thwarted by the mushroom-like growth of quasi-political organisations, together with professional associations and trade unions, in the intervening six months. Shortly afterwards, the ‘historical parties’ reformed themselves: the Independent Smallholder Party re-emerged on 18 November 1988, followed by the Social Democrats in January and the Christian Democrats in April 1989.

Meanwhile, in November 1988, Grósz had passed the premiership to Miklós Németh who, contrary to expectations, became one of the engineers of the transition. He drew reinforcement from the successful manoeuvring of Pozsgay, who arose as an emblematic figure of reform Communist policies by sharpening the divisions within the party through a number of publicly made statements from late 1988 onwards. Pozsgay had avoided getting involved on either side in the 1956 Uprising because he was based in a provincial town at the time. He was an intellectual by instinct and training, who had worked his way up through the system until he and his fellow reformers had been strong enough to vote Kádár, who had once referred to him as ‘impertinent’, out of power in May 1988. It was then that Pozsgay became a member of the Politburo, and soon afterwards he, not Grósz, emerged as the dominant figure in the party leadership. Most notably, his announcements included breaking the taboo of 1956, redefining the ‘counter-revolution’ as a ‘popular uprising’, and urging the introduction of a multi-party system. The latter was ratified by the legislature on 11 January and acknowledged by the party on 11 February 1989. Through a cabinet reshuffle in May 1989, the followers of Grósz were replaced in most posts by pragmatic reformers like Németh himself. This did much to undermine hard-liner positions in the party and to push it towards disintegration. The founder of the party did not live to see it. In early May 1989, Kádár was relieved of his offices; he died on 6 July, the same day that Imre Nagy was officially rehabilitated.

Even before his total removal from power, it was already being openly said that the Kádár period had come to an end. What had come into existence under his aegis was now in ruins economically. The régime’s attempts at reform had won excessive, flattering judgements in the West, making it more suspect within the Eastern Bloc. But the end of the third decade of Kádár’s rule was overshadowed by the previously whispered, but later admitted, information that Hungary had accumulated a foreign debt of twenty billion dollars, most of it in a couple of years of recklessness. This was where the contradictory, limited national consensus had ended up: in a cul-de-sac of national bankruptcy. This was what the divergence of production and consumption, the maintenance of a tolerable standard of living, and the erroneous use of the loans received had amounted to. The heavy interest burden on these debts alone was to have its effects for decades, crippling many early attempts at renewal.

001

By July 1989, Hungary had become a de facto multi-party democracy again. Although these parties, new or old, were not mass parties with large numbers of activists, they were able to show that Grósz was wrong to suggest, as he once did at the end of 1988, that ‘the streets belong to us’. There were few mass demonstrations during this period, but those that did take place were organised by the opposition and were effective in conveying clear messages. They included mass protests over Ceausescu’s treatment of the Hungarian minority in Transylvania, reminding the Communists of their neglect of nationalist issues, and against the proposed construction of the hydro-electric dam system on the Danube Bend, which called attention to the ecological spoliation of communism. On 15 March, the anniversary of the 1848 Revolution, there was keen competition to dominate the commemorative events, in which the opposition scored a sweeping triumph; its main message was that the demands for civil liberty and representative government first raised a hundred and forty years earlier were still on the national agenda.

005

Above: The Danube Bend at Visegrád, where the river, hemmed in by the Börzsöny and Pilis Hills, meanders beneath the castle at Visegrád. After the foundation of the Hungarian State, Visegrád was one of the first ecclesiastical centres, as well as being a royal estate and a county seat. After the Turkish Conquest in the sixteenth century, the ‘Hungarian Versailles’ was laid low and almost completely razed to the ground. In the 1980s, the area was again brought to the forefront of public attention. Czechoslovakia and Hungary had long ago planned the building of a dam, of which the main Slovak installation would be at Bős and the main Hungarian installation at Nagymaros, north of Visegrád, in close proximity to the royal castle and palace. In East Central Europe during the 1980s, growing political dissatisfaction and civic opposition found an object of focus in this gigantic project, in which ecological and environmental considerations played a major part, with national and international ramifications. The Hungarian domestic opposition had two main areas of activity: the publication and distribution of pamphlets and the struggle against the Danube dam. In response, the new Hungarian government elected in 1990 stopped all construction work on its side of the river and started to restore the bank to its natural state. Later, the ‘Visegrád’ group of four neighbouring countries was formed at the palace.

003

012

The most dramatic of all the public demonstrations was the official re-burial of the remains of Imre Nagy and his fellow ‘martyrs’ on the anniversary of their execution, 16 June 1989, which amounted to a public confession that in its origins the régime was built on terror and injustice. Nagy’s body, along with those of the others executed in 1958, had been found in waste ground at the Újköztemető (cemetery), wrapped in tar paper. After its exhumation, Nagy’s coffin lay in state in Heroes’ Square before being formally reburied. Over three hundred thousand citizens paid their respects to the martyrs of 1956, together with the tributes of government ministers. The fact that only a year beforehand the police had used force to disperse a group of a few hundred demonstrators commemorating the martyrs illustrates the rapid erosion of the régime’s authority and the simultaneous occupation of the public space by the opposition by the middle of 1989.

002

006

The Hole in the Curtain:

005

At last Hungary had come to terms with its past. Its future was determined by a decision taken by the Central Committee of the HSWP to put the rapidly developing multi-party system on an official basis. Pozsgay’s own position had often seemed closer to that of the opposition Hungarian Democratic Forum (MDF) than to that of his own party. In the midst of these preparations for a peaceful transition of power and democratic elections, Kádár’s successors surprised the world at large. The summer of the annus mirabilis continued with its most internationally conspicuous achievement: the dismantling of the ‘iron curtain’, the barbed-wire fence on the Austrian frontier, a process which had begun in May. On 23 August, the Foreign Minister, Gyula Horn, spent a sleepless night worrying about the changes going on around him and the irritated reactions of Hungary’s Warsaw Pact allies to them. He had been telephoned by the East German Foreign Minister, who was determined to know what was happening to Hungary’s border with Austria. Horn had assured him that sections had been removed for repair and would shortly be replaced.

007

008

001

Again at Pozsgay’s instigation, the border gates were opened to allow for a ‘pan-European picnic’ in the woods on the Austrian side, through which several hundred East Germans (‘holidaying’ at Lake Balaton) were able to stream (pictured above). Hungarian citizens already had the right to visa-free travel to the West, but thousands of disenchanted East Germans, hearing from compatriots of the ‘hole’ in the curtain, had been making their way into Hungary via Czechoslovakia to escape from their own unpopular hard-line régime. Hungary had signed a treaty with East Germany in 1968 pledging not to allow East Germans to leave for the West through its territory. Horn sounded out Moscow as to whether the Soviet leadership would object if Hungary abandoned this undertaking. This was an urgent practical problem for the Hungarians, as about twenty thousand citizens from the DDR were seeking refuge at the FRG Embassy in Budapest. The Soviets did not object, so Horn resolved to open the main border crossings on the roads to the West. He said later that…

… It was quite obvious to me that this would be the first step in a landslide-like series of events. 

013

Above: (left) Demonstrators in Budapest keep up the momentum; (right and below) East Germans, holidaying in Hungary, cross the border and head West, to the fury of their government, and to their own freedom.

010

On 10 September, despite strenuous objections from the East German government, Hungary’s border with Austria was opened to the East German refugees. Within three days, thirteen thousand East Germans, mostly young couples with children, had fled west. This was the biggest exodus across the ‘iron curtain’ since the Berlin Wall was built in 1961, and it was only the beginning. Eschewing its erstwhile role as ‘gendarme’, still expected of it within the Eastern camp, Hungary decided to let the refugees go West without exit visas, thereby playing the role of catalyst in the disintegration of the whole Soviet bloc. Over the next few months the international situation was transformed. Liberalisation in Hungary had led directly to the collapse of the Husák régime in Prague and the breaching of the Berlin Wall in November 1989. Writing in 1990, the historian István Lázár commented:

Naturally, all this can, or should, be seen in connection with the rise of Mikhail Gorbachev in the Soviet Union, even if in history questions of cause and effect are not entirely settled. However the question of what went before and what happened afterwards is constantly debated in history. Hungary, desperate and euphoric at the same time, turning away from the road followed for almost a half century and hardly able to see the path of the future … took  state, national and political risks with some of its decisions in 1989 in a context of a rather uncertain international situation which was not moving towards stability. This is how we arrived at the 1990s. 

004

Queues on the road to Sopron and the border, with cardboard Trabants and boxes.

Tradition and Transition:

004

Simultaneously, the scenario worked out by the opposition and Németh’s pragmatists to facilitate an orderly transition was launched. Between June and September 1989, representatives of the HSWP, the Opposition ‘Round Table’ (established in March by eight organisations) and the ‘third side’ (the Patriotic Popular Front and the trade unions) discussed the central issues of the transition process at national meetings. By the time President Bush visited Budapest in July (11-13), Hungary had effectively ceased to be a Communist country or a Soviet satellite state. I have written elsewhere on this site about this first ever visit by a US President, its importance and its outcomes. John Simpson, the BBC’s correspondent, was standing on the balcony of a flat overlooking Kossúth Square where the President was due to make a speech. The owner of the flat was an Anglophile in his mid-forties from a wealthy background. There were English touches on the walls: mementoes of visits by at least two generations of the family. From his balcony they looked down on the enthusiastic crowds that were starting to gather:

“These little Communists of ours are acting like real politicians”, he said; “they’re giving people what they want, instead of what they ought to want. The trouble is, they can never give us so much that we can forget that they are Communists”. …

… He was right about the fundamental unpopularity of the Party. I went to see Imre Pozsgay a few days later and asked him whether he and his colleagues would really be the beneficiaries of the changes they were introducing.

“Who can say? Naturally I hope so. That’s why we’re doing these things. But to be honest with you, there’s nothing else we can do. Even if others win the elections, there’s no serious alternative to doing what we have done”.

On 18 September, an agreement was signed which emphasised a mutual commitment to the creation of the legal and political conditions under which a multi-party democracy could be established and the rule of law upheld. In addition, it put forward plans for surmounting the ongoing economic crisis. It required the amending of the communist constitution of 1949, the establishment of a constitutional court and the re-regulation of the order of national elections, legislation on the operation and finances of political parties and the amendment of the penal code. The two ‘liberal’ parties, the SZDSZ and FIDESZ refused to sign the agreement because it stipulated the election of a head of state before the elections, which they thought would benefit the only obvious candidate and most popular reform-politician, Imre Pozsgay. They also hoped to drive a wedge between the reform Communists and the MDF by insisting on a referendum on the issue, the result of which went in their favour. It was a sure sign of what was to come the following spring.

On 6 October, Gorbachev began a two-day visit to East Germany to celebrate the fortieth anniversary of the German Democratic Republic (DDR). The government there, led for almost half of its life by the now seventy-seven-year-old Erich Honecker, remained perhaps the most repressive régime in Eastern Europe. Only four days earlier, it had sealed its border with Czechoslovakia to prevent its people from voting with their feet and flooding to the West through Hungary. When Gorbachev suggested that a more permanent solution might be for the DDR to introduce a version of perestroika to satisfy people’s material needs and demands, Honecker refused to listen. He pointed out that on his last visit to Moscow, he had been shocked by the empty shops. How dare Gorbachev tell the leader of what many believed was the most prosperous country in the socialist world how he should run his economy! But Gorbachev persisted, telling a large rally that East Germany should introduce Soviet-style reforms, adding that the country’s policies should, however, be determined “not in Moscow, but in Berlin”. Less than a fortnight after he left, on 18 October, Honecker was ousted within the DDR’s Politburo and replaced by Egon Krenz, who represented himself as the East German Gorbachev.

004

The crowds outside the Parliament welcoming the proclamation of the institution of a Liberal Democratic Constitution for the new ‘Republic of Hungary’, October 1989.

Meanwhile, meeting in Budapest, the Fourteenth Congress of the HSWP also proved to be its last. It officially abandoned Leninism. On the 7th, the vast majority of its deputies voted in favour of creating a new Hungarian Socialist Party (MSZP), which defined its aims in terms akin to those of Western European socialist parties. Out of seven hundred thousand Communist Party members, only fifty thousand transferred their membership to the new Socialist Party before the first free elections of March 1990. Shortly after the dissolution of the HSWP, the party’s paramilitary organisation, the Workers’ Guard, was also disbanded. In another ‘gesture’ to the memory of 1956, reparation payments were authorised by Parliament to those imprisoned after the Uprising. On 23 October, the thirty-third anniversary of the Revolution, Acting President Mátyás Szűrös proclaimed the new “Republic of Hungary”. The “People’s Republic”, created forty years earlier, had ceased to exist.

003

Parliament changed eighty per cent of the 1949 constitution in the interim document that replaced it. It defined the peaceful transition to a market economy and the rule of law as the goal of the state. Its fundamental principles were defined as ‘civil democracy’ and ‘democratic socialism’. It guaranteed civil and human rights and declared the establishment of a multi-party system, not only eliminating the clause referring to the leading role of the Marxist-Leninist party of the working class but also outlawing the exercise of power by any single party. It was the first time that a ruling Communist Party anywhere had rejected its ideological faith and authorised a shift to liberal democracy and capitalism. Shortly after the promulgation and proclamation of the new constitution both inside and outside parliament (see the picture below), the red star was removed from the top of the building, demonstrating the end of the system of state socialism.

Yet now the full vulnerability of the economy was already being revealed, and the necessary decrease in consumption had to be forced on a society which was expecting a contrary shift. The past, both the pre-1949 and the post-1958 periods, began to be viewed with nostalgia, as ‘old-new’ ideas resurfaced alongside ‘brand-new’ ones. On the political scene, in both parliamentary and extra-parliamentary spheres, a faltering democracy continued to develop amidst struggles of bitter and frequently depressing content and form. In the meantime, both Eastern and Western visitors to Hungary at the beginning of the 1990s found the country more affluent and resourceful than did its own citizens, who saw it being forced into worrying straits. Eastern visitors were influenced by their own, often more miserable position, while Westerners found things better than their out-dated stereotypes of life behind the iron curtain would have led them to expect. This was Hungary’s paradox: almost every outside observer valued the apparent dynamism of the country greatly, but only those who became inhabitants themselves, as some of us did, began to see the burdens of ‘the changes’ borne by ‘ordinary’ Hungarians and to understand their caution and pessimism.

011

Above: The famous MDF (Hungarian Democratic Forum) poster from the 1990 Election Campaign: Comrades Go Home!

On 2 November, as Minister of State, Imre Pozsgay met President Bush in Washington to discuss Hungary’s transition to democracy, a week before the fall of the Berlin Wall. The following January, Hungary announced its withdrawal from the Warsaw Pact, at the same time as Czechoslovakia and Poland, at a meeting of Foreign Ministers in Budapest, with effect from 1 July. In February, the United States signed an agreement providing for a Peace Corps Program in Hungary, to begin the following September. In March, the Soviet Union reached an agreement to remove all Soviet troops from Hungary by July 1991, two-thirds of them by the end of 1990. John Simpson’s friend in Budapest had promised his father that he would not drink the bottle of Bell’s Scotch Whisky he had placed in the cupboard in 1947 until the day the Soviet troops left Budapest. That day was now approaching. When the final round of elections took place on 8 April 1990, the reform Communists won only eight per cent of the seats, and Pozsgay and his colleagues were out of office. A centre-right government came to power, led by the MDF, which had won 164 out of the 386 seats. Looking back from later in 1990, John Simpson commented:

As in 1918, Hungary had emerged from an empire and found itself on its own; though this time, unlike the violence and destruction which followed the abortive Communist republic of Béla Kun in 1919, the transition was peaceable and relaxed. Hungary’s economy and environment had been horribly damaged by thirty-three years of Marxism-Leninism; but now, at least, it had shown the way to the rest of Central and Eastern Europe. There are dozens of men and women … who had a part in encouraging the revolutions (which followed) … But the stout figure of Imre Pozsgay, who now stays at home and cooks for his family while he tries to work out what to do next, is one of the more important of them.

014

Rather than bringing stability and calm, however, the 1990s in Hungary were a time of intensive movement across the political spectrum from right to left and back again, with a minority persisting on both extremes and an undercurrent of the old ‘populist-urbanist’ divide surfacing from time to time to emphasise patriotism over cosmopolitanism. Of the sixty-five parties formed in 1988-89, only twelve could run a national list at the elections of March-April 1990, and the four per cent ‘threshold’ required to make it into parliament eliminated half of them. Of the six parties that surpassed this, the highest-scoring MDF invited the Smallholders and the Christian Democrats to form a centre-right coalition. József Antall, a historian and museum curator who had become President of the MDF the previous year, became Hungary’s first prime minister in the new democratic era. Pledging itself to uphold Christian and national values besides democracy and the market economy, the coalition enjoyed a comfortable sixty per cent majority. The opposition consisted of the two liberal parties, the SZDSZ, which came second in the elections, and FIDESZ. The Socialists struggled hard to emerge from the isolation the past had thrown them into. Based on a ‘pact’ between Antall and the SZDSZ leadership, the prominent writer, translator and victim of the 1956 reprisals, Árpád Göncz, was elected by parliament as its Speaker and the President of the Republic. Over the next four years, he made periodic use of his limited powers to act as a counterweight to governmental power. He was re-elected in 1995.

As a result of the first free elections after the fall of state socialism, there was a comprehensive change in the highest echelons of the political élite: ninety-five per cent of the MPs were new in that position. Nearly as dramatic was the change in their social and cultural backgrounds. The first setback for the coalition government came in the municipal elections of the autumn of 1990. In the larger settlements, the two liberal parties scored much better than the government parties. The prominent SZDSZ politician, Gábor Demszky, became Mayor of Budapest and was subsequently re-elected four times, becoming the most successful politician in post-1989 Hungary. Following a protracted illness, József Antall died in late 1993. His funeral, in December 1993, was attended by world leaders including US Vice President Albert Gore. He was replaced by Péter Boross, his Minister of the Interior. With Antall’s untimely death, the MDF lost a politician whose stature was unparalleled among its inexperienced ranks.

The shift in political sympathies among a considerable proportion of voters, which started well before the parliamentary elections of 1994 (whose outcome astounded many people from more than one point of view), was accompanied by a recasting of roles and ideological commitments and a realignment of partnerships among the parties from roughly halfway through the electoral cycle. The MDF had first emerged as a grassroots democratic movement and had advocated a ‘third way’ between capitalism and communism. It had also been open towards ‘democratic socialism’. In government, it had adjusted itself to the personality of Antall, a ‘conservative liberal’, and had had to work hard to purge itself of its radical nationalist right-wing, which seceded in 1993 as the Party of Hungarian Justice and Life (MIÉP) led by the writer István Csurka. After its 1990 electoral victory, the MDF had indulged in militantly anti-communist rhetoric. This contrasted with the trajectory of the SZDSZ, which had initially tried to undermine the MDF’s credibility with allegations of collaboration with the former communists. Following the ‘media war’ which broke out between the two major parties, while the SZDSZ refused to abandon its core liberal values of upholding human rights, civil liberties and multi-culturalism, it re-evaluated its policies towards the left. This enabled the MSZP to re-emerge from the shadows and paved the way for the Democratic Charter, an initiative by intellectuals from both parties to counter the tide of radical nationalism that was threatening to engulf Hungarian political life.

009

Viktor Orbán in the mid-1990s, looking Right.

In these circumstances, the earlier affinity and sometimes close collaboration between the SZDSZ and FIDESZ began to unravel as the inherent differences between them became ever more obvious. Of FIDESZ’s initial platform – anti-communism, youth culture and political liberalism – only the first was entirely preserved, while the second was quickly abandoned and the third was increasingly modified by an emphasis on Christian values, conservative traditions and strong central government. By 1994, FIDESZ had thus redefined itself as a party of the centre-right, with the ambition to become the dominant and integrative force of that segment of the political spectrum. This process was cemented in the public eye by the addition of the title Hungarian Civic Party (MPP) to its name. In 1999, it resigned from the ‘Liberal International’ and joined the ‘European People’s Party’, the conservative-Christian Democrat alliance in the EU. But in 1994, there was a general recovery in the fortunes of European socialists and social democrats, and the pledges of the MSZP to the values of social democracy looked credible enough to earn it widespread respectability in Europe and admission to the ‘Socialist International’. Its pragmatism and its emphasis on modernisation and technological development won it a landslide victory in an election which showed that the country was tired of ideological strife and disappointed with the lack of progress in the economic transition. Although the Socialists won over fifty per cent of the seats in parliament, the SZDSZ accepted the offer of Gyula Horn, MSZP chairman, to join a coalition. The other four parties of the previous parliament constituted the opposition. The Socialist-Liberal coalition government faced urgent economic tasks.

In the early to mid-nineties, Western corporations and investors came to Hungary hoping, in the long run, for a strong revival of the Hungarian economy. They procrastinated over possible investment, however, due to the threat of uncontrolled inflation. In an economy which was rapidly polarising society, with increasing unemployment and poverty while the rich got visibly richer, Hungarian citizens were already gloomy when they looked around them. According to the journalist Paul Lendvai, between 1988 and 1993 GDP fell by twenty per cent, twelve per cent in 1991 alone; in 1990-91 real wages fell by twelve per cent, while inflation was thirty-five per cent in 1991, twenty-three per cent in 1992 and only sank below twenty per cent in 1993. Unemployment had risen sharply as thousands of firms were liquidated and half a million jobs disappeared. When citizens contemplated, beyond the borders, a crisis-ridden Eastern Europe beset by nationality problems and compelled to starve before the much-promised economic upturn, they were gloomier still. As Lázár commented:

Looking at the recent changes, perhaps ungratefully, this is how we stand in East Central Europe in the middle of the Carpathian Basin, before the 1100th anniversary of the Hungarian Conquest, which, in five years’ time, will be followed by the opening of the third millennium…

In spite of the differences in their fundamental values, socialist and liberal, the MSZP and SZDSZ had similar policies on a number of pressing transitional tasks, such as Hungary’s Euro-Atlantic integration and monetarist reform, providing a wide scope for collaboration between them. In both of these priorities they were successful, but neither did much to assuage the resentment many voters felt towards the post-1989 politicians in general. In addition, many SZDSZ supporters were puzzled by the party’s reconciliation with the Socialists, which they felt had robbed the party of its original liberal character. In the light of this, it is perhaps unsurprising that the SZDSZ followed the other great party of the 1990 régime change, the MDF, into relative obscurity following the 1998 general election. The remodelled FIDESZ-MPP attracted growing support during the second part of the election cycle, capitalising on mistakes made by the Socialists. While the latter maintained much of their popularity, FIDESZ-MPP won the election narrowly on the platform of a ‘civic Hungary’ in which the post-communist heritage would be forever buried while the state would accept greater responsibility in supporting the growth of a broad middle-class following Christian-nationalist values.

To obtain a secure parliamentary majority, the FIDESZ chairman and new PM, Viktor Orbán, formed a coalition with the MDF and the Independent Smallholder Party (FKGP). While the historic FKGP had a respectable place in the liberal democratic endeavour in post-1945 Hungary, its reincarnation was an anti-élitist, populist force, notorious throughout the 1990s for its stormy internal relations. In addition, although not part of the government, the radical-nationalist MIÉP (anti-communist, anti-capitalist, anti-liberal, anti-globalist and anti-Semitic) frequently lent its support to the first Orbán government. On the other extreme of the political palette, the radical remnant of the HSWP, the Workers’ Party, openly cherished the heritage of the Kádár era and remained a part of the extra-parliamentary opposition throughout the post-1989 period. Whereas a fairly constant proportion of the electorate supported a traditional conservative-liberal line with national and Christian commitments, in whichever of the pirouetting parties it appeared at any given election, the values and endeavours of the Socialists also continued to break through until recent elections. On the other hand, support for the Liberals fell to a level equal to that of the radical Right, a picture not very different from some Western European countries.

With regard to European integration, all significant political forces except MIÉP were in favour of it. Although the Council of Europe responded to the Hungarian application as early as November 1990, and Hungary became an associate member in December 1991, the ensuing process was considerably longer than optimistically hoped for. Alongside the Czech Republic, Estonia, Poland and Slovenia, Hungary gained full membership of the European Union on 1 May 2004. By this time, public opinion in the West was increasingly sceptical about both the broadening and deepening of the EU. I have written extensively about Hungary’s more rapid progression into NATO membership elsewhere on this site, but its involvement in peacekeeping in former Yugoslavia, from 1994-1999, undoubtedly aided its process of accession to the EU. In an atmosphere of growing anxiety for global safety, neither the requirements concerning border security nor other developments caused a further postponement.

(to be continued…)

002

Moments of Régime Change, Budapest (2009): Volt Produkció.

Posted January 2, 2019 by TeamBritanniaHu in anti-Communist, anti-Semitism, Austerity, Austria-Hungary, Balkan Crises, Brussels, Castles, Christian Faith, Christianity, Church, Co-operativism, Communism, Compromise, Conservative Party, democracy, Discourse Analysis, Education, Egalitarianism, Empire, Europe, European Economic Community, European Union, German Reunification, Germany, Gorbachev, History, Humanism, Humanitarianism, Humanities, Hungarian History, Hungary, Immigration, Integration, Iraq, liberal democracy, liberalism, Marxism, Migration, monetarism, Mythology, Narrative, nationalism, Nationality, NATO, Population, populism, Poverty, privatization, Proletariat, Racism, Reconciliation, Refugees, Respectability, Revolution, Serbia, Statehood, Uncategorized, Yugoslavia


The ‘Other England’ of the Sixties and Seventies: The Changing Fortunes of East Anglia.

007

Looking across the River Deben towards Woodbridge from Sutton Hoo.

East of England; the Country from the Stour to the Wash:

001

After the far West of England, East Anglia was the most neglected region of the country until the sixties. In the fashionable division of the nation into North and South, it tended to be lumped in with the South. The South-east Study of 1964 was less vague, however, drawing an arbitrary line from the Wash to the Dorset Coast at Bournemouth and defining the area to the east of this boundary as ‘South-east England’. In the same year, Geoffrey Moorhouse (pictured below), a well-known contemporary Guardian correspondent, wrote that, in time, if policies to encourage a counter-drift of the population from the South were not adopted, the whole of the vast area delineated might well become one in character, in relative wealth and in disfigurement. As far as he was concerned, the ‘carving out’ of this area encroached upon the traditional regions of the West Country, beginning at Alfred’s ancient capital of Winchester in Hampshire, and East Anglia, incorporating Norfolk, Suffolk and Essex, or at least that part of it lying to the north of Colchester. To the south, most of Essex was already part of the ‘Golden Circle’ commuter area for the metropolis, stretching from Shoeburyness at the end of the Thames estuary, around the edge of ‘Greater London’ and up the Hertfordshire border to the north of Harlow. Suffolk and Norfolk, however, still remained well ‘beyond the pale’ between the Stour Valley and the Wash, occupying most of the elliptical ‘knob’ sticking out into the North Sea. It was an ‘East Country’ which still seemed as remote from the metropolitan south-east of England as that other extremity in the far south-west peninsula.

003

In the fifties, as the wartime airfields were abandoned and the Defence Ministry personnel went back to London, East Anglia went back to its old ways of underemployment, rural depopulation, low land and property values. By the mid-fifties, the people of East Anglia were not yet having it as good as many parts of the Home Counties that Macmillan probably had in mind when he remarked in 1957 that Britons had ‘never had it so good’. Urban growth continued, however, into the early sixties. For the most part, development was unimaginative, as council estates were built to replace war-time damage and cater for the growing town populations. Where, in 1959, the Norfolk County Council was getting four thousand applications a year for planning permission, by 1964 the figure had risen to ten thousand. Issues of planned town growth became urgent. Old properties, particularly thatched cottages and timber-framed farmhouses, were eagerly sought. For all the talk of imminent development, with all the benefits and drawbacks that this implied, East Anglia did not look as if it had changed much by the early sixties. The most noticeable signs of the times were the great number of abandoned railway stations. Railway traffic had declined throughout England as British road transport had eclipsed railways as the dominant carrier of freight. Several branch lines, such as the Long Melford to Bury St Edmunds and sections of the Waveney Valley, had already closed before the celebrated ‘Beeching Axe’ was wielded in 1963. Neither Suffolk nor Norfolk enjoyed a share in the slow growth of national prosperity of the fifties, but then the boom came suddenly and Suffolk became the fastest growing county by the end of the decade. It began in the early sixties when many new industries came to the East Anglian towns and cities.

Photo0300

The abandoned railway station at Needham Market, Suffolk.

The ‘neglected’ Suffolk of the fifties was ready to be rediscovered in the sixties. Companies escaping from the high overheads in London and the Home Counties realised that they could find what they were looking for in Ipswich, Bury, Sudbury and Haverhill. Executives discovered that they could live in an area of great peace and beauty and yet be within commuting distance of their City desks. Moreover, the shift in the balance of international trade focused attention once more on the eastern approaches. When the bulk of Britain’s trade was with the empire and North America it was logical that London, Southampton and Liverpool should have been the main ports. The railway network had been constructed in the nineteenth century in such a way as to convey manufactured goods to these ports. But the Empire had been all but disbanded and Britain was being drawn, inexorably if sometimes reluctantly, into the European Common Market. More and more industrial traffic took to the road; heavy lorries at first, then containers. Now producers were looking for the shortest routes to the continent, and many of them lay through Suffolk, shown below in Wilson’s 1977 map of the county.

002

One of the benefits of East Anglia’s poor communications was that, at the height of summer, it was the only region south of the Bristol-Wash line which was not crammed with holidaymakers and their traffic. The seaboard caught it a little, as of course did the Norfolk Broads. Norfolk reckoned, for instance, that caravans were worth two million pounds a year to it one way or another and, like Cornwall, saw this as a mixed blessing; as Moorhouse was writing his book (in 1964), the County Council was in the process of spending fifty thousand pounds on buying up caravan sites which had been placed with an eye more to income than to landscape. But inland and away from the waterways crowds of people and cars were hard to find; out of the holiday season, East Anglia was scarcely visited by any ‘outsiders’ apart from occasional commercial travellers. Local difficulties, small by comparison with those of the North, were lost from sight. As the sixties progressed, more and more British people and continental visitors discovered the attractions the two counties had to offer. As Derek Wilson wrote at the end of the following decade,

They realised that a century or more of economic stagnation had preserved from thoughtless development one of the loveliest corners of England. They came in increasing numbers by their, now ubiquitous, motor-cars to spend quiet family holidays at the coast, to tour the unspoilt villages, to admire the half-timbering, the thatch, the pargetting and the great wool churches. Some decided to stake a claim by buying up old cottages for ‘week-ending’ or retirement.

DSC09565

So great was the demand for even derelict old properties that prices trebled in the period 1969-73. Village communities were no longer so tight-knit, so the arrival of these ‘strangers’ could not be said to have disrupted a traditional culture. Only in those areas where the newcomers congregated in large numbers, buying up properties at inflated prices which ‘locals’ could no longer afford, was any real and lasting cultural damage inflicted. At first, the seaside towns found it difficult to come to terms with the expansion in tourism, having been ignored for so long. Even the established Suffolk holiday resorts – Aldeburgh, Southwold, Dunwich, even Felixstowe – were ‘genteel’ places; compared with Clacton on the Essex coast, which was far closer in time and space for day-trippers from London, they did not bristle with amusement arcades, Wimpy bars, holiday camps and the assorted paraphernalia that urban man seems to expect at the seaside. Derek Wilson commented that Suffolk was more like a coy maiden prepared to be discovered than an accomplished seductress thrusting her charms at every single passer-by. 

dscn08091.jpg

Three centuries of properties in Aldeburgh, Suffolk.

A Metropolitan ‘Refugee’ in Dunwich:

001

Greyfriars, The Simpson coastal ‘pile’ in Dunwich.

One of the earliest of these ‘refugees’ from the metropolis was John Simpson (who was to become the BBC’s World Affairs Editor). When he was fifteen, in 1959, the family moved from Putney to Dunwich. His holidays had already been taken up with following his father’s genealogical enthusiasms, and they went from village church to county archive to cathedral vault searching for records of births, marriages and deaths, and transcribing inscriptions on gravestones. Having discovered the full extent of the Simpsons’ Suffolk roots, Roy Simpson insisted that they should look for a country house there. John recalled,

We spent a wintry week driving from one depressing place to another and talking to lonely farmers’ wives whose ideal in life was to leave their fourteenth-century thatched manor-houses and move to a semi near the shops. We had almost given up one evening and were setting out on the road to London when I spotted a brief mention at the end of an estate agent’s list of a rambling place on a clifftop overlooking the sea at Dunwich. …

From the moment I saw it I knew I would never be happy until I lived there. No one could call ‘Greyfriars’ handsome. It was the left hand end of an enormous 1884 mock-Elizabethan pile which had been split up into three separate sections at the end of the war. Our part had around eight bedrooms and five bathrooms. … It was always absurdly unsuitable … four hours’ drive from London, and nowhere near the shops or anything else. Its eleven acres of land were slowly being swallowed up by the ravenous North Sea, and it cost a small fortune to keep warm and habitable. … 

The village of Dunwich immediately formed another element of that sense of the past, faded glory which had haunted so much of my life. In the early Middle Ages it had been the greatest port in England, sending ships and men and hundreds of barrels of herrings to the Kings of England, and possessing a bishopric and forty churches and monasteries. But it was built on cliffs of sand, and the storms of each winter undermined it and silted up the port. In the twelfth century, and again in the thirteenth, large parts of the town collapsed into the sea. … Our land ran down to the cliff edge, and we watched it shrink as the years went by. 

The stories about hearing bells under the sea were always just fantasy, but Dunwich was certainly a place of ghosts. A headless horseman was said to drive a phantom coach and four along one of the roads nearby. … In the grounds of our house two Bronze Age long-barrows stood among the later trees, and when the moon shone hard and silver down onto the house, and the thin clouds spread across the sky, and a single owl shrieked from the bare branches of the dead holm-oak outside my bedroom window, it was more than I could do to get out of bed and look at them. I would think of those cold bones and the savage gold ornaments around them, and shiver myself to sleep.

The winter of 1962 was the worst since 1947, and that was the worst since the 1660s, people said. The snow fell in early December and dug in like an invading army, its huge drifts slowly turning the colour and general consistency of rusty scrap iron. In our vast, uneconomic house at Dunwich the wind came off the North Sea with the ferocity of a guillotine blade and the exposed pipes duly froze hard. The Aga stood in the corner of the kitchen like an icy coffin. … We wandered round the house in overcoats, with scarves tied round our heads like the old women at Saxmundham market. None of the lavatories worked.

In October 1963, Roy Simpson drove his son ‘up’ to Cambridge from the Suffolk coast in his old Triumph. John Simpson set down his cases, as had many Suffolk boys before him, outside the porter’s lodge in the gateway of Magdalene College. For the next three years, his life revolved around the University city in the Fens until he joined the BBC in 1966.

Coast, Cathedral City & Inland Industrial Development:

2b54e4b900000578-3196566-image-m-9_1439473593698

The curvature of the eastern coastline had been responsible for the lack of metropolitan infiltration hitherto. Norfolk and Suffolk were in a cul-de-sac; even today, apart from the ports of Felixstowe and Harwich, on opposite sides of the mouth of the River Stour, they do not lie on transport routes to anywhere else, and their lines of communication with other parts of the country, except with London, were still poor in the early sixties, and remain relatively underdeveloped half a century later, despite the widening of the A12 and the extension of the A14. The disadvantages of remoteness could be severe, but at the same time, this saved the two counties from the exploitation that had occurred in places with comparable potential. Had there been better communications, Norwich might have been as badly ravaged by the Industrial Revolution as Bradford, but the great East Anglian woollen trade and cloth-making industry were drawn to Yorkshire as much by the promise of easier transport as by the establishment of the power-loom on faster-flowing water sources. Instead, Norwich still retained the air of a medieval city in its centre with its cathedral, its castle, and its drunken-looking lollipop-coloured shops around Elm Hill, Magdalen Street, and St. Benedict’s. Its industries, like the Colman’s mustard factory, were already discreetly tucked away on its flanks, and there they did not intrude.

005 (61)

Norwich itself was poised to move forward by the sixties, and though its hopes had received a setback as a result of Britain’s early failures to get into the Common Market, it still saw itself as playing an important part in the development of trade between this country and the Continent. European connections were already strong in East Anglia. From the obvious Dutch gables widespread throughout the region (see the example below from a farmhouse near Woodbridge, Suffolk) and concentrated in places like Kings Lynn, to the names beginning with the prefix ‘Van’ in the telephone directories, Flemish influences could, and still can, be found everywhere. Dutch farmers had been settling in the two counties since the late seventeenth century. There were two Swiss-owned boatyards on the Norfolk Broads and one of Norwich’s biggest manufacturers, Bata Shoes, was Swiss in origin. In the early sixties, two Danish firms had set themselves up near the city.

DSCN0672

For Suffolk, the sixties and seventies saw a most astonishing growth in the population, which had been decreasing for over a century. The population of Suffolk showed a comparatively modest but significant growth from 475,000 in 1951 to 560,000 in 1961. Most of this increase was in West Suffolk, where the growth of Haverhill, Bury and Sudbury accounted for most of the extra population. These were designated in the mid-fifties as London overspill areas. In Haverhill, the notion of town expansion had been pioneered in 1955; by the time Geoffrey Moorhouse published his survey in 1964, there was already a plan for a further massive transfusion of people to the town from London. Thetford, Bury St Edmunds, and Kings Lynn were to be transformed within the next two decades. Between the two censuses of 1961 and 1971, the population of Suffolk jumped by over eighteen per cent (the national average was 5.8 per cent). There were many reasons for this unprecedented growth, which brought Suffolk a prosperity it had not known since the great days of the cloth trade.

Photo0304

A variety of restored properties in Needham Market today.

But the hinterland towns of central East Anglia presented a bigger problem for the local planners and county authorities. They had grown up as market-places for the sale of agricultural produce, like those in other parts of rural England. By the mid-sixties, they had held on to this function much longer than most. But the markets, and particularly the cattle markets, had recently become more and more concentrated in the biggest towns – Norwich, King’s Lynn, Bury and Cambridge – and the justification for places like Stowmarket, Diss, Eye, Downham Market and Needham Market (pictured above) in their traditional form had been rapidly disappearing. Their populations were in need of new industries to take the place of old commerce and, in part, they got them. As early as the sixties, a new town at Diss, on the Norfolk-Suffolk border, was already talked of. Carefully planned industrial and housing estates were built, and a variety of service industries and light engineering concerns moved their machines and desks to spacious premises from whose windows the workers could actually see trees and green fields. Writing in the late seventies, Derek Wilson concluded that, while such examples of economic planning and ‘social engineering’ could only be described as revolutionary, they were still too recent to invite accurate assessment.

DSC09732

Above: The Centre of Ipswich is now undergoing an extensive renovation, including that of its historic Corn Exchange area, complete with a statue to one of its more famous sons, Giles, the Daily Express cartoonist, popular in the sixties and seventies, when rapid development engulfed many earlier buildings in concrete.

Paradoxically, Suffolk’s depressed isolation gave a boost to the new development. Some of Suffolk’s most beautiful countryside was no further from the metropolis than the ‘stockbroker belt’ of Sussex, Hampshire, Wiltshire, Berkshire and Buckinghamshire. Yet land and property prices in Suffolk were less than half of what they were in the desirable areas of those counties. Most of the county was within eighty miles of London, served by still reasonable rail connections and improving road connections from the capital. The population was now more mobile, and light industry less tied to traditional centres. But development in the sixties and seventies was not restricted to the eastern side of the two counties. Ipswich, the other town in the two counties which was relatively industrialised, had been, like Norwich, comparatively unscathed by that industrialisation. Its growth occurred largely as a result of migration within Suffolk. Even so, its population increased from a hundred thousand to a hundred and twenty-two thousand between 1961 and 1971. It became the only urban centre in the county to suffer the same fate as many large towns and cities across England in that period – haphazard and largely unplanned development over many years. In the late seventies, farmers could still remember when the county town was just that, a large market town, where they could hail one another across the street. By then, however, dual carriageways and one-way systems had been built in an attempt to relieve its congested centre, while old and new buildings jostled each other in what Derek Wilson called irredeemable incongruity.

East Anglia as Archetypal Agricultural England:

003 (64)

Life on the land had already begun to change more generally in the sixties. East Anglia is an important area to focus on in this respect, because it was, and still is, agricultural England. In the sixties and seventies, agriculture was revitalised: farmers bought new equipment and cultivated their land far more intensively than ever before. Industry here remained tied to the main purpose of life, which was to grow food and raise stock. Many of the industries in the two counties were secondary, and complementary, to this purpose. Of the thirty-nine major industrial firms in East Suffolk, for example, twelve were concerned with food processing, milling, or making fertilisers, and of the five engineering shops most were turning out farm equipment among other things. These industries varied from the firm in Brandon which employed three people to make and export gun-flints to China and Africa, to the extensive Forestry Commission holding at Thetford, where it was calculated that the trees grew at the rate of seventeen tons an hour, or four hundred tons a day. But a quarter of the total workforce in Norfolk and Suffolk was employed in the primary industry of farming; there were more regular farm-workers in Norfolk than in any other English county. The county produced two of the founders of modern British agriculture, Coke of Holkham and Townshend of Raynham, and it had kept its place at the head of the field, quite literally.

DSCN0671

East Anglia was easily the biggest grain-producing region of the country and the biggest producer of sugar-beet. During the First World War, farmers had been encouraged to grow sugar beet in order to reduce the country’s dependence on imported cane sugar. This had been so successful that in 1924 the government offered a subsidy to beet producers. The crop was ideally suited to the heavy soil of central Suffolk and without delay, a number of farmers formed a co-operative and persuaded a Hungarian company to build a sugar factory near Bury St Edmunds. Five thousand acres were planted immediately and the acreage grew steadily over the next half-century. In 1973, the factory was considerably enlarged by the building of two huge new silos, which came to dominate the skyline along the A14 trunk road. The factory became the largest plant of its kind in Europe and by the late seventies was playing an important part in bringing Britain closer to its goal of self-sufficiency in sugar.

50

Local ingenuity and skill had devised and built many agricultural machines during the nineteenth century, like this threshing and grain-crushing machine from the Richard Garrett works at Leiston, which made various farming machines, including tractors.

Of all the English counties, Norfolk had the biggest acreage of vegetables and the heaviest yield per acre of main-crop potatoes. It was also the second biggest small-fruit producer and the second biggest breeder of poultry. Suffolk came close behind Norfolk in barley crops, while it had the biggest acreage of asparagus and more pigs than any other county. The region’s importance to agriculture was symbolised by the Royal Agricultural Society having its headquarters in Norfolk, and the region also played host to the British-Canadian Holstein-Friesian Association, the Poll Friesian Cattle Society, the British Goat Society, and the British Waterfowl Association. No other county had as many farms over three hundred acres as Norfolk, and most of the really enormous farms of a thousand acres or more were to be found in the two easternmost counties. The biggest farm in England, excluding those owned by the Crown, was to be found on the boundary of Bury St Edmunds: the ten-thousand-acre Iveagh estate, covering thirteen farmsteads, and including a piggery, three gamekeepers’ lodgings and homes for its cowmen, foresters and its works department foreman.

DSCN0666

The most significant change taking place on the land throughout England was in the size of farms. The big ones were getting bigger and the small ones were slowly dwindling and going out of business. Mechanisation was reducing the number of jobs available to agricultural workers, and from this followed the steady decline of rural communities. By the end of the sixties, however, the employment position in Norfolk was beginning to stabilise, as the old farm hands who had been reared as teams-men and field-workers and kept on by benevolent employers retired and were not replaced. Although it employed fewer people than ever before, farming was still Suffolk’s largest single industry in the mid-seventies. After Britain joined the Common Market in 1973, access to European markets had led to a certain amount of diversification. There were numerous farmers specialising in poultry, pigs and dairying. Yet persistently high world grain prices led to the intensive production of what the heavy soils of central Suffolk were best suited to – cereal crops. The tendency for large estates to be split up and fields to remain unploughed had been dramatically reversed. The larger the unit, the more productive and efficient the farm, with every producer determined to get the maximum yield from their acres.

71

The field patterns between Leiston and Sizewell (from the model detailed below).

As the big farms grew bigger and farming became more highly mechanised, farmers were tending to re-organise the shapes and sizes of their fields, making them as large as possible so that the tractor and the combine harvester could work them with greater ease and maximum efficiency. They uprooted trees and whole copses, which were awkward to plough and drill around, cut out hedges which for centuries had bounded small parcels of land, and filled in ditches. To the farmer, this meant the promise of greater productivity, but to the ecologist, it meant the balance of nature was being upset in a way that the farmer and the general countryside population, animals as well as people, would have to pay for, later if not sooner. The practical answer to this problem was the increasing use of chemicals to control pests which, as soon became obvious, was a double-edged blade. In addition, poor land was treated with chemical fertilizers. East Anglia provided a classic example of what could happen as a result of the indiscriminate chemical warfare being conducted in the English countryside. As reported in the New Statesman (20 March 1964), …

… a Norfolk fruit-grower was persuaded by a pesticide salesman that the best way of keeping birds off his six acres of blackcurrants was to use an insecticide spray. Two days after he did so the area was littered with the silent corpses of dozens of species of insects, birds and mammals.

This was very far removed, of course, from the idealised conception of the rural life that most people carried around in their imaginations, and perhaps many of us still do today, especially when we look back on childhood visits to the countryside and relatives living in rural villages.  Moorhouse characterised this contrast as follows:

Smocked labourers, creaking hay carts, farmyard smells, and dew-lapped beasts by the duck-pond – these are still much more to the forefront of our consciousness than DDT, aldrin, dieldrin, and fluoroacetemide. In most of us, however completely we may be urbanised, there lurks some little lust for the land and a chance to work it.  

Rustic Life; Yeomen Farmers and Yokels:

Farmers had to become hard-nosed professional businessmen. The profits from their labour had to be extracted while they were there, for it was never certain what might be around the next bend. This emphasis on business sense, both in himself and in others, and his passion for getting the maximum work out of his men and machines, were what made Moorhouse’s Norfolk farmer sound indistinguishable from any high-powered industrialist in the Midlands. In a sense, he was one. He was prepared to try any method which would increase his productivity. In the early sixties, something very odd had been happening in his part of the world. Traditionally, ‘big’ Norfolk farmers like him had tended to be distant neighbours, seeing each other at the market but otherwise scarcely at all. But he and three other men had taken to sharing their equipment for harvesting quick-freeze peas; this work had to be done particularly fast on a day appointed by the food factory, and ‘Farmer Giles’ and his neighbours had decided that it could be done most efficiently and cheaply by pooling their men and machines and having this unit move from property to property in the course of one day. In 1964, they also clubbed together to hire a contractor’s helicopter to spray their crops. He and his friends, being staunch Tories, might not have accepted that they were putting co-operative principles into farming practice, but that was precisely what they were doing, just as the Suffolk sugar-beet growers had done forty years earlier.

For all his business acumen, however, ‘Farmer Giles’ measured up to the popular stereotypical image of a yeoman farmer. He was a warden at his local church, had a couple of horses in his stables and during ‘the season’ he went shooting for four days a week. He cared about the appearance of his patch of countryside and spent an impressive amount of time doing up the tied cottages of his men, rather than selling them to them, as some of them would have liked. This was not simply because, in the long run, it resulted in a contented workforce, but because he could control what they looked like on the outside, as pretty as an antique picture, thatched and whitewashed. Fundamentally, he belonged to the land as completely as he possessed it. Though he no longer had any real need to, he did some manual work himself, as well as prowling around the farm to make sure everything was going to his overall plan. He was organic, like his 1,200 acres, which nonetheless produced a profit of sixteen thousand pounds a year. As he himself commented, overlooking his fields, there is something good about all this! A cynic might have responded by suggesting that any life that could produce such a profit was indeed a good life.

17

Above & Below: Cattle grazing on the Deben meadows near Woodbridge, Suffolk.

But how had the tied agricultural workers, the eternal rustics, fared in this changing pattern of agriculture? The farm labourer interviewed by Moorhouse worked on the Norfolk-Suffolk border. He had left school at fourteen, the son of a mid-Norfolk cowman of thirty-five years’ standing. He first worked on a poultry farm for a couple of years, had four years as assistant cowman to his father, five years as a stock feeder, then two years ‘on the land’ working with tractors and horses. He had come, fifteen years previously, to the farm on which Moorhouse found him working, just after getting married, as a relief man. At the age of forty-two, with a teenage daughter, he was head cowman for a ‘gaffer’ with 450 arable acres and a hundred acres of pasture which carried fifty Friesian milking cows, forty-six calves, and a bull. His farmer was nearing seventy and didn’t hold with too many of the new ways. It was only in that year, 1964, that the modern method of milking – straight from the cow through a pipeline to a common container – had been adopted by his gaffer. Farmer Giles had been doing it this way ever since it was proved to be the quickest and easiest way. ‘Hodge’ got up at 5.30 a.m. to milk the cows and feed the calves. After breakfast until mid-day, he was busy about the yards, mixing meal, washing up and sterilizing equipment. From 1.30 p.m. he was out again, feeding the calves and doing various seasonal jobs until milking, which generally finished by 5 o’clock. Very often he went out again before bed-time, to check on the cows and the calves. He worked a six-and-a-half-day week, for which he was paid twenty-two per cent more than the basic farm worker’s wage for a forty-six-hour week.

16

When he first came to the farm, ‘Hodge’ was given, rent-free, a cottage which was in rather worse shape than the shelters which housed the cows in winter. It had one of the tin-can lavatories described below and was lit with paraffin lamps. He had to tramp eighty yards to a well for water. There was one room downstairs plus a tiny kitchen, and two bedrooms, one of which was so small you couldn’t fit a full-size bed in it. After a while, the farmer modernised it at a cost of a thousand pounds, knocking it together with the next-door cottage. The renewed place, though still cramped, had all the basic necessities, and Hodge paid twelve shillings a week for it. He accepted his situation, though the National Union of Agricultural Workers (NUAW) did not, since it had been trying for forty years to abolish tied cottages because of the threat of eviction they carried. Although a socialist and chairman of his local union branch, Hodge argued that tied cottages were necessary because the farm worker had to be near his job so that, as in his case, he could hop across the road before bedtime to check on the cows. Other changes had taken place in his lifetime on Norfolk land. The drift to the towns had fragmented the old society, and traditions had been quietly petering out. The parish church was generally full for the harvest festival, but otherwise ill-attended; the rector had three parishes to cope with.

Rural Poverty & Village Life:

DSC09763

A former labourer’s cottage in Saxmundham marketplace.

The poverty of the inland, rural villages was the result of far more basic concerns than the pressures on property prices created by newcomers, or the changes in agriculture, which did little to improve the lives of villagers. Their cottages may have looked attractive enough on the outside, but too often they offered their home-grown dwellers little encouragement to remain, and those who got the chance to move out did, while there was no help at all for those who might have been interested in trying their hand at rural life. Moorhouse found one village within ten miles of Ipswich which, apart from its electricity and piped water supplies, had not changed at all since the Middle Ages. Some of its cottages were without drains, and in these, the housewife had to put a bucket under the plughole every time she wanted to empty the sink; she then carried it out and emptied it onto the garden. Sewerage was unknown in the community of 586 people, none of whom had a flush toilet. They used tins, lacing them with disinfectant to keep down the smell and risk of infection. In some cases, these were housed in cubicles within the kitchens, from where they had to be carried out, usually full to the brim, through the front door. Every Wednesday night, as darkness fell, the Rural District Council bumble cart, as the villagers called it, arrived in the village street to remove the tins from the doorsteps. Moorhouse commented that this was…

… for nearly six hundred people … a regular feature of life in 1964 and the joke must long since have worn thin. There are villages in the remoter parts of the North-west Highlands of Scotland which are better equipped than this.

001

This was not by any means an isolated example. While in both counties the coverage of electricity and water supplies was almost complete, drainage and sewerage were far from being so. In the Clare rural district of Suffolk, villages were expected to put up with the humiliating visitations of the ‘night cart’ for another five years; in the whole of West Suffolk there were twenty-four villages which could not expect sewerage until sometime between 1968 and 1981, and both county councils accepted that there were some villages which would never get these basic amenities. In East Suffolk, only those places within the narrow commuting belts around the biggest towns could be sure that they would one day soon become fully civilised. In Norfolk, it was estimated that as many as a hundred would never be so. Again, this was the price that East Anglia was paying for being off the beaten track. It was not the indolence of the county councils which ensured the continuance of this residue of highly photogenic rural slums, as Moorhouse put it, so much as cold economics. Both counties had, acre for acre, among the smallest population densities in England; in neither was there very much industry. Therefore, under the rating system of that time, based on property values and businesses, they were unable to raise sufficient funds to provide even these basic services, as we would see them now. Norfolk claimed to have the lowest rateable value among the English counties, and Suffolk was not much better off. They simply did not have the ‘wherewithal’ to make these small communities fit for human habitation. But this simple fact was little ‘comfort’ to those who had to live in them.

img_9755

County Hall, Norwich.

For a survey which it undertook for its 1951 development plan, East Suffolk County Council had decided that basic communal necessities consisted of at least a food shop, a non-food shop, a post office, a school, a doctor’s surgery and/or clinic, a village hall, and a church. When it took a long, hard look at its villages, it found that only forty-seven had all of these things, that ninety-three had the three most basic requirements (food shop, school, village hall), that 133 had only one or two of them, and that thirty-one had none. A similar survey by the West Suffolk County Council showed that only sixteen per cent of its 168 parishes had all the facilities and that about the same proportion had none. When the county authorities made a follow-up survey in 1962, using the same criteria, they found that the position of these rural communities had hardly changed in a decade. There were many more surgeries, due to the growing provisions of the NHS, but the number of village schools had dropped from 103 to 92, and of non-food shops from fifty to twenty-seven.

001

 Suffolk County flag.

In 1964, a regional South-East Plan was being considered, which included both Suffolk and Norfolk. Moorhouse considered that it might transform the whole of East Anglia into something more approximating to Hertfordshire or Essex in terms of economic development. But he also felt that, unless there was a change of national direction, the East Country could not stay as it was, virtually inviolate, its people so conscious of their inaccessibility that they frequently referred to the rest of England as ‘The Shires’, and with so many of them eking out a living in small rural communities as their forefathers had done for generations. It was scarcely surprising, wrote Moorhouse, that the young were leaving, looking for something better. The appeal of bigger towns and cities, with their exciting anonymity, was great enough for many whose childhood and adolescence had been spent wholly in the confining atmosphere of the village. Combined with the lack of basic amenities and work opportunities, this left young people with few reasons to stay.

Power, Ports & Progress:

74

A lonely stretch of coast near Leiston, still enjoyed by caravanners and campers, was the site of another important development. There, at Sizewell, Britain’s second nuclear power station was built in the early 1960s (the first was built at Windscale in Cumbria in the late fifties). In 1966, power began surging out from the grey, cuboid plant (a model of which – pictured above – can be seen at the Richard Garrett museum in Leiston) into the national grid. By the late seventies, Sizewell’s 580,000 kilowatts were going a long way towards meeting eastern England’s electricity needs.

DSC09797

Sizewell Nuclear Power Station (2014)

The docks also began to be modernised, with ports like Tilbury and Felixstowe hastening the decline of London, which could not handle containerised freight. In addition, most of the Suffolk ports were no further from London than those of Kent and they were a great deal closer to the industrial Midlands and North. In 1955 the Felixstowe Dock and Railway Company had on its hands a dilapidated dock that needed dredging, and warehouses, quays and sea walls all showing signs of storm damage. The total labour force was nine men. By the mid-seventies, the dock area covered hundreds of acres, many reclaimed, made up of spacious wharves, warehouses and storage areas equipped with the latest cargo handling machinery. The transformation began in 1956 as the direct result of foresight and careful planning. The Company launched a three million pound project to create a new deepwater berth geared to the latest bulk transportation technique – containerisation. It calculated that changing trading patterns and Felixstowe’s proximity to Rotterdam and Antwerp provided exciting prospects for an efficient, well-equipped port. Having accomplished that, it set aside another eight million for an oil jetty and bulk liquid storage facilities. In addition, a passenger terminal was opened in 1975. The dock soon acquired a reputation for fast, efficient handling of all types of cargo, and consignments could easily reach the major industrial centres by faster road and rail networks.

DSC09982

Looking across the estuary from Harwich to the Felixstowe container port today.

DSC09983

Increasing trade crammed Suffolk’s main roads with lorries and forced an expansion and improvement of port facilities. The development of new industries and the growth of the east coast ports necessitated a considerable programme of trunk road improvement. From the opening of the first stretches of motorway in the winter of 1958/59, including the M1, there was a major improvement in the road network. By 1967, motorways totalled 525 miles in length, at a cost of considerable damage to the environment. This continued into the mid-seventies at a time when economic stringency was forcing the curtailment of other road-building schemes. East Anglia’s new roads were being given priority treatment for the first time. Most of the A12, the London-Ipswich road, was made into a dual carriageway. The A45, the artery linking Ipswich and Felixstowe with the Midlands and the major motorways, had been considerably improved. Stowmarket, Bury St Edmunds and Newmarket had been bypassed. By the end of the decade, the A11/M11 London-Norwich road was completed, bringing to an end the isolation of central Norfolk and Suffolk.

021 (10)

DSC09844


Above Left: An old milestone in the centre of Woodbridge, Suffolk; Right: The M1 at Luton Spur, opened 1959.

Culture, Landscape & Heritage; Continuity & Conflict:

 

DSCN0790

Suffolk remained a haven for artists, writers and musicians. Indeed, if the county had any need to justify its existence it would be sufficient to read the roll call of those who have found their spiritual home within its borders. Among them, and above them, towers Benjamin Britten, who lived in Aldeburgh and drew inspiration from the land and people of Suffolk for his opera Peter Grimes. The composer moved to the seaside town in 1947 on his return from the USA and almost at once conceived the idea of holding a festival of arts there. It began quietly the following year but grew rapidly thereafter as the activities multiplied – concerts, recitals, operas and exhibitions – and every suitable local building was made use of. Many great artists came to perform and the public came, from all over the world, to listen. Britten had long felt the need for a large concert hall with good acoustics but he did not want to move the festival away from Aldeburgh and the cost of building a new hall was prohibitive.

DSCN0792

In October 1965, the lease of part of a disused ‘maltings’ at nearby Snape became available. It was in a beauty spot at a bridge over the River Alde (pictured above), and architects and builders were soon drafted in to transform the site into a concert hall and other facilities for making music. Queen Elizabeth II opened the buildings in June 1967, but almost exactly two years later disaster struck when the Maltings was burnt out. Only the smoke-blackened walls were left standing, but there was an almost immediate determination that the concert hall would be rebuilt. Donations poured in from all over the world and in less than forty-two weeks the hall had been reconstructed to the original design, and the complex was extended by adding rehearsal rooms, a music library, an art gallery, an exhibition hall and other facilities.

003

The Suffolk shore or, to be more accurate, ‘off-shore’, also made a crucial contribution to the breakthrough of popular or ‘pop’ music in Britain. At Easter 1964 the first illegal ‘pirate’ radio station, Radio Caroline, began broadcasting from a ship just off the Suffolk coast (see map, right). Within months, millions of young people were listening to Radio Caroline North and Radio Caroline South, Radio London and the other pirate stations that sprang up. Not only did they broadcast popular music records, but they also reminded their listeners that any attempt to silence them would constitute a direct ‘attack on youth’.

007 (25)

With the advent of these radio stations, the BBC monopoly on airtime was broken, and bands were able to get heard beyond their concerts. Eventually, the Government acted to bring an end to its ‘cold war’ with the British record industry. The BBC set up Radio One to broadcast popular records and in August 1967, the Marine Offences Act outlawed the pirate ships.

Back on dry land, there were areas of conflict, then as now, in which the interests of farmers, businessmen, holidaymakers and country residents clashed. When the farmer rooted out hedges, sprayed insecticides indiscriminately and ploughed up footpaths he soon had conservationists and countryside agencies on his back. When schedule-conscious truck drivers thundered their way through villages, there were angry protests.

019 (17)

002

Saxtead Green’s post mill (see OS map above for location near Framlingham) as it looked in the 1970s when it was maintained by the Department of the Environment; it is now managed (2018) by English Heritage.

w290 (1)

There were also, still, many for whom the images of Constable’s rolling landscapes were set in their mind’s eye. For them, this was, above all, his inviolable country. It was also dotted with windmills, another echo of earlier continental associations, many of them still working. Every new building project was examined in great detail by environmentalists.

Many local organisations were formed to raise awareness about and resist specific threats to rural heritage, such as the Suffolk Preservation Society and Suffolk Historic Churches Trust.

001

DSC09864

Most of the churches, like the very early example at Rendlesham (right), were built of flint, both in Suffolk and in Norfolk, where a great number of them have round towers, a feature particularly characteristic of that county. The farming people of Barsham in the Waveney Valley added their church to its Norman round tower in the fourteenth century (pictured above). After that, they could not afford elaborate additions. When the nave needed re-roofing, modest thatch seemed to offer the best solution. Suffolk, in particular, had an incredibly rich and well-preserved heritage which gave it its distinct county identity.

DSC09863

Almost every church had a superb timber roof, described by Moorhouse as a complex of rafters, kingposts and hammerbeams which look, as you crane your neck at them, like the inverted hold of a ship (the one pictured left is, again, from Rendlesham). Very often these medieval churches were miles from any kind of community, emphasising the peculiarly lonely feeling of most of the area. Most are the remains of the Black Death villages, where the plague killed off the entire population and no one ever came back.

 

Around its magnificent ‘wool church’ (pictured below), the half-timbered ‘perfection’ of Lavenham might not have survived quite so completely had it been located in the South of England. This was one of the hidden benefits of the county’s relative isolation which had, nevertheless, come to an end by the late seventies.

023

On the other hand, Wilson has reminded us that the wool-rich men of the town rebuilt their church almost entirely between 1485 and 1530 in the magnificent, new Perpendicular style, yet it remains today and is widely viewed as the crowning glory of ecclesiastical architecture in Suffolk.

DSC09666

Many more of the county’s churches are not as medieval as they look (see the fifteenth-century additions to the transepts of St Michael’s, Framlingham, above), which may challenge our contemporary view of the balance between preservation and progress. In 1974 the Department of the Environment produced a report called Strategic Choice for East Anglia. It forecast a population of over eight hundred thousand in Suffolk alone by the end of the century. It saw the major towns growing much larger and suggested that the counties would inevitably lose some of their individuality:

We know … that the change and the growth … will make East Anglia more like other places. For some, this will mean the growth should be resisted, and the opportunities which it brings should be foregone. Whether or not we sympathise with this point of view, we do not think it is practicable. Much of the change and growth that is coming cannot be prevented by any of the means that is likely to be available. The only realistic approach is to recognize this, and take firm, positive steps to maintain and even enhance the environment of the region, using the extra resources that growth will bring …

By the time the report was published, the people of East Anglia had already begun, as they had always done in earlier times, to face up to many of the problems which change and development brought their way.

 

Sources:

Joanna Bourke et al. (2001), The Penguin Atlas of British & Irish History. London: Penguin Books.

John Simpson (1999), Strange Places, Questionable People. Basingstoke: Pan Macmillan.

Derek Wilson (1977), A Short History of Suffolk. London: Batsford.

Geoffrey Moorhouse (1964), … Harmondsworth: Penguin Books.

004

Posted November 1, 2018 by TeamBritanniaHu


British Foreign Policy, NATO & the Shape of the World to Come, 1994-1999.   Leave a comment

Back to Attacking Iraq – Operation Desert Fox:

The Iraq War will no doubt remain the most important and controversial part of Tony Blair’s legacy. But long before it, during the first Clinton administration, two events had taken place which help to explain something of what followed. The first was the bombing of Iraq by the RAF and the US Air Force as punishment for Saddam Hussein’s dodging of UN inspections. The second was the bombing of Serbia during the Kosovo crisis and the threat of a ground-force invasion. These crises made Blair believe he had to be personally and directly involved in overseas wars. They emphasised the limitations of air power and the importance to him of media management. Without them, Blair’s reaction to the changed world politics of 11 September 2001 would undoubtedly have been less resolute and well-primed. Evidence of Saddam Hussein’s interest in weapons of mass destruction had been shown to Blair soon after he took office. He raised it in speeches and privately with other leaders. Most countries in NATO and on the UN Security Council were angry about the dictator’s expulsion of UN inspectors when they tried to probe his huge palace compounds for biological and chemical weapons. Initially, however, diplomatic pressure was brought to bear on him to allow the inspectors back. The Iraqi people were already suffering badly from the international sanctions imposed on them. He readmitted the inspectors, but then began a game of cat-and-mouse with them.

Desert fox missile.jpg

A Tomahawk cruise missile is fired from an Arleigh Burke-class destroyer during Operation Desert Fox in December 1998

In October 1998, the United States and Britain finally lost patience and decided to smash Baghdad’s military establishment with missiles and bombing raids. In a foretaste of things to come, Blair presented MPs with a dossier about Saddam’s weapons of mass destruction. At the last minute, the Iraqi leader backed down again and the raids were postponed. The US soon concluded that this was just another ruse, however, and in December, British and American planes attacked, hitting 250 targets over four days. Operation Desert Fox, as it was called, probably delayed Iraq’s weapons programme by only a year or so, though it was sold as a huge success. As was the case later, Britain and the United States were operating without a fresh UN resolution. But Blair faced little opposition either in Parliament or outside it, other than from a handful of protesters chanting ‘don’t attack Iraq’ with accompanying placards. Nonetheless, there was a widespread suspicion around the world that Clinton had ordered the attacks to distract from his troubles at home. The raids were thus nicknamed ‘the war of Clinton’s trousers’, and during them Congress was indeed debating impeachment proceedings, formally impeaching the President on their final day.

Rebuilding the Peace in Bosnia:  Dayton to Mostar, 1995-1999.

The break-up of Yugoslavia in the later stages of the long Balkan tragedy had haunted John Major’s time in office as UK Prime Minister. Finally, the three years of bitter warfare in Bosnia, in which more than two million people had been displaced and over a hundred thousand killed, were brought to an end. In March 1994 the Bosnian Muslims and Croats formed a fragile federation, and in 1995 Bosnian Serb successes against the Muslim enclaves of Žepa, Srebrenica and Goražde provoked NATO to intervene. In November 1995, facing military defeat, the Serbian President Slobodan Milosevic bowed to international pressure to accept a settlement. A peace conference between the three sides involved in the conflict (the Serbs, Croats and Bosnian Muslims) ended with their joining an uneasy federation, sealed by the initialling of an agreement in Dayton, Ohio, USA (shown below).


Seated from left to right: Slobodan Milošević, Alija Izetbegović, Franjo Tuđman initialling the Dayton Peace Accords at the Wright-Patterson Air Force Base on 21 November 1995.

After the initialling in Dayton, Ohio, the full and formal agreement was signed in Paris on 14 December 1995 (right) and witnessed by Spanish Prime Minister Felipe Gonzalez, French President Jacques Chirac, U.S. President Bill Clinton, UK Prime Minister John Major, German Chancellor Helmut Kohl and Russian Prime Minister Viktor Chernomyrdin.

At the time, I was in my fourth academic year in southern Hungary, running a teachers’ exchange programme for Devon County Council and its ‘twin’ council in Hungary, Baranya County Assembly, based in Pécs. Even before the Dayton Accords, NATO was beginning to enlarge and expand itself into Central Europe. Participants at a Summit Meeting in January 1994 formally announced the Partnership for Peace programme, which provided for closer political and military cooperation with Central European countries looking to join NATO. Then, President Clinton, accompanied by Secretary of State Christopher, met with leaders of the ‘Visegrád’ states (Hungary, Poland, the Czech Republic and Slovakia) in Prague. In December 1994, Clinton and Christopher attended a Conference on Security and Cooperation in Europe (CSCE) summit in Budapest. During this, the Presidents of the United States, Russia, Kazakhstan, Belarus and Ukraine signed the START 1 nuclear arms reduction treaty. A decision was also made to change the name of the CSCE to the Organisation for Security and Cooperation in Europe (OSCE) and to expand its responsibilities.

001

In particular, the Republic of Hungary, long before it joined NATO officially in 1999, had taken a number of steps to aid the mission of the Western Alliance. On 28 November 1995, following the initialling of the Dayton Accords, the Hungarian Government of Gyula Horn announced that Kaposvár would be the principal ground logistics and supply base for the US contingents of the international peace-keeping force in Bosnia, the NATO-led Implementation Force (IFOR). The Hungarian Parliament then voted almost unanimously to allow NATO air forces to use its bases, including the airfield at Taszár. The Kaposvár bases became operational in early December and the first American soldiers assigned to IFOR arrived at Taszár on 9 December. Most of the three thousand soldiers were charged with logistical tasks. The forces stationed at Kaposvár, units of the US First Armored Division, regularly passed through our home city of Pécs ‘en route’ to Bosnia, in convoys of white military vehicles, trucks and troop-carriers. In mid-January 1996, President Clinton paid a flying visit to Taszár and met some of the US soldiers there, together with Hungarian state and government ministers. The Hungarian National Assembly also approved the participation of a Hungarian engineering unit in the operation of IFOR, which left for Okucani in Croatia at the end of January. The following December the Hungarian Engineering Battalion was merged into the newly established Stabilization Force (SFOR) in former Yugoslavia.

002

By the end of 1996, therefore, Hungary – one of the former Warsaw Pact countries applying to join NATO – had already been supporting the peace operation in Bosnia for over a year as a host and transit country for British and American troops, providing infrastructural support, placing both military and civilian facilities at their disposal and ensuring the necessary conditions for ground, water and air transport and the use of frequencies. In addition, the Hungarian Defence Forces had been contributing to the implementation of the Dayton Peace Accords with an engineering contingent at the battalion level of up to 416 troops during the IFOR/SFOR operation. It had carried out two hundred tasks, constructed twenty-two bridges and a total of sixty-five kilometres of railroads and taken part in the resurfacing of main roads. It had also carried out mine-clearing, searching over a hundred thousand square metres for explosives.

004

 

In February 1998, the Hungarian National Assembly voted unanimously to continue to take part in the SFOR operation in Bosnia. One event of major significance was the Hungarian forces’ participation in the restoration of the iconic ‘Old Bridge’ in Mostar, famously painted by the Hungarian artist Csontváry (his painting, shown below, is exhibited in the museum which bears his name in Pécs), which had been blown up in the Bosnian War in the early 1990s.

(Photos above and below: The Old Bridge and Old Town area of Mostar today)

Mostar Old Town Panorama

A monumental project to rebuild the Old Bridge to the original design and restore surrounding structures and historic neighbourhoods was initiated in 1999, begun by the sizeable contingent of peacekeeping troops stationed in the surrounding area during the conflict, and was mostly completed by spring 2004. A grand re-opening was finally held on 23 July 2004 under heavy security.

 

001 (2)

Crisis & Civil War in Kosovo, 1997-98:

The Dayton peace agreement had calmed things down in former Yugoslavia, and by 1997 international peace-keeping forces such as IFOR and SFOR were able to successfully monitor the cease-fire and separate both the regular and irregular forces on the ground in Bosnia leading to relative stability. However, in 1997-98, events showed that much remained to be done to bring the military conflicts to an end. Bosnian Serbs and Croats sought closer ties for their respective areas with Serbia and Croatia proper. Then, the newly formed Kosovo Liberation Army (KLA) triggered a vicious new conflict. Kosovo, a province of Serbia, was dominated by Albanian-speaking Muslims but was considered almost a holy site in the heritage of the Serbs, who had fought a famous medieval battle there against the invading Ottoman forces. When Albania had won its independence from the Ottoman empire in 1912, over half the Albanian community was left outside its borders, largely in the Yugoslav-controlled regions of Kosovo and Macedonia. In 1998, the KLA stepped up its guerrilla campaign to win independence for Kosovo. The ex-communist Serbian leader, Slobodan Milosevic, having been forced to retreat from Bosnia, had now made himself the hero of the minority Kosovar Serbs. Serb forces launched a campaign of ethnic cleansing against the Albanians. Outright armed conflict in Kosovo started in late February 1998 and lasted until 11 June 1999. By the beginning of May 1998, the situation in the former Yugoslavia was back on the agenda of the Meeting of the NATO Military Committee. For the first time, this was attended by the Chiefs of Staff of the three ‘accession’ countries – Hungary, Poland and the Czech Republic.

Map 1: The Break-up of Yugoslavia, 1994-97

002

The map shows the areas still in conflict, 1994-1997, in eastern Bosnia and southern central Serbia. The area in grey shows the territory secured as the ‘independent’ Serbian Republic of Bosnia by Serb forces as of February 1994. The blue areas are those where ethnic minorities form the overall majority, while the purple areas show Serb-majority areas with significant minorities. The green line shows the border between the Serb Republic component and the Croat-Muslim Federation component of Bosnia-Herzegovina according to the Dayton Peace Agreement, November 1995.

In a poll taken in August 1998, the Hungarian public expressed a positive view of NATO’s role in preventing and managing conflicts in the region. With respect to the situation in Kosovo, fifty-five per cent of those asked had expressed the view that the involvement of NATO would reduce the probability of a border conflict between Albania and Serbia and could prevent the outbreak of a full-scale civil war in Kosovo. At the same time, support for direct Hungarian participation in such peace-keeping actions was substantially smaller. While an overwhelming majority of those asked accepted the principle of making airspace available, as many as forty-six per cent were against even the continued participation of the engineering contingent in Bosnia and only twenty-eight per cent agreed with the involvement of Hungarian troops in a NATO operation in Kosovo. Other European countries, including Poland, the Czech Republic and the existing members of NATO were no more keen to become involved in a ground war in Kosovo. In Chicago, Tony Blair declared a new doctrine of the international community which allowed a just war, based on… values. President Clinton, however, was not eager to involve US troops in another ground war so soon after Bosnia, so he would only consider the use of air power at this stage.

Map 2: Position of Kosovo in Former Yugoslavia, 1995-99

On 13 October 1998, the North Atlantic Council issued activation orders (ACTORDs) for the execution of both limited air strikes and a phased air campaign in Yugoslavia which would begin in approximately ninety-six hours. On 15 October 1998, the Hungarian Parliament gave its consent to the use of its airspace by reconnaissance, combat and transport aircraft taking part in the NATO actions aimed at the enforcement of the UN resolutions on the settlement of the crisis in Kosovo.

At this time, however, the United States and Britain were already involved in the stand-off with Saddam Hussein leading up to Operation Desert Fox in Iraq in December 1998, and so could not afford to be involved in two bombing campaigns simultaneously. Also on 15 October, the NATO Kosovo Verification Mission (KVM) Agreement for a ceasefire was signed, and the deadline for withdrawal was extended to 27 October. The Serbian withdrawal had, in fact, commenced on or around 25 October, and the KVM began what was known as Operation Eagle Eye on 30 October. But, despite the use of international monitors, the KVM ceasefire broke down almost immediately. A large contingent of unarmed Organization for Security and Co-operation in Europe (OSCE) peace monitors (officially known as ‘verifiers’) had moved into Kosovo, but their inadequacy was evident from the start. They were nicknamed the “clockwork oranges” in reference to their brightly coloured vehicles.

NATO’s Intervention & All Out War in Kosovo, 1998-99:

Map 3: Albanians in the Balkans, 1998-2001.

001 (2)

Milosevic used the break-down of the OSCE Mission and the world’s preoccupation with the bombing of Iraq to escalate his ethnic cleansing programme in Kosovo. The death squads went back to work and forced thousands of people to become refugees on wintry mountain tracks, producing uproar around the world.  As the winter of 1998-99 set in, the civil war was marked by increasingly savage Serb reprisals. Outright fighting resumed in December 1998 after both sides broke the ceasefire, and this surge in violence culminated in the killing of Zvonko Bojanić, the Serb mayor of the town of Kosovo Polje. Yugoslav authorities responded by launching a crackdown against KLA ‘militants’. On the ground in Kosovo, the January to March 1999 phase of the war brought increasing insecurity in urban areas, including bombings and murders. Such attacks took place during the Rambouillet talks in February and as the Kosovo Verification Agreement unravelled completely in March. Killings on the roads continued and increased and there were major military confrontations. Pristina, the capital of Kosovo, had been subjected to heavy firefights and segregation according to OSCE reports.

The worst incident, known as the Račak massacre, occurred on 15 January 1999. The slaughter of forty-five civilians in the town provoked international outrage and comparisons with Nazi crimes. The Kosovar Albanian farmers were rounded up, led up a hill and massacred. The bodies were discovered later by OSCE monitors, including Head of Mission William Walker, and foreign news correspondents. This massacre was the turning point of the war, though Belgrade denied that a massacre had taken place. It was the culmination of the KLA attacks and Yugoslav reprisals that had continued throughout the winter of 1998-1999. The incident was immediately condemned as a massacre by the Western countries and the United Nations Security Council, and later became the basis of one of the charges of war crimes levelled against Milošević and his top officials at The Hague. Hundreds of thousands of people were on the move; eventually, roughly a million ethnic Albanians fled Kosovo and an estimated ten to twelve thousand were killed. According to Downing Street staff, Tony Blair began to think he might not survive as Prime Minister unless something was done. The real problem, though, was that, after the Bosnian War, only the genuine threat of an invasion by ground troops would convince Milosevic to pull back; air power by itself was not enough. Blair tried desperately to convince Bill Clinton of this. He visited a refugee camp and declared angrily:

“This is obscene. It’s criminal … How can anyone think we shouldn’t intervene?”

Yet it would be the Americans whose troops would once again be in the line of fire, since the European Union was far from having any coherent military structure and lacked the basic tools for carrying armies into other theatres. On 23 March 1999, Richard Holbrooke, US Assistant Secretary of State for Europe, returned to Brussels, announced that peace talks had failed and formally handed the matter to NATO for military action. Hours before the announcement, Yugoslavia declared a state of emergency on national television, citing an imminent threat of war, and began a huge mobilisation of troops and resources. Later that night, the Secretary-General of NATO, Javier Solana, announced he had directed the Supreme Allied Commander to initiate air operations in the Federal Republic of Yugoslavia. On 24 March NATO started its bombing campaign against Yugoslavia. The BBC correspondent John Simpson was in Belgrade when the bombs started to fall. In the capital, he recalled, dangerous forces had been released. A battle was underway between the more civilised figures in Slobodan Milosevic’s administration and the savage nationalist faction headed by Vojislav Seselj, vice-premier of the Serbian government, whose supporters had carried out appalling atrocities in Croatia and Bosnia some years earlier. Earlier in the day, the large international press corps, three hundred strong, had attended a press conference held by the former opposition leader Vuk Draskovic, now a member of Milosevic’s government:

“You are all welcome to stay,” he told us grandly, looking more like Tsar Nicholas II than ever, his cheeks flushed with the first ‘slivovica’ of the day. Directly we arrived back at the Hyatt Hotel, where most of the foreign journalists were staying, we were told that the communications minister, a sinister and bloodless young acolyte of Seselj’s, had ordered everyone working for news organisations from the NATO countries to leave Belgrade at once. It was clear who had the real power, and it wasn’t Draskovic.

That morning Christiane Amanpour, the CNN correspondent, white-faced with nervousness, had been marched out of the hotel by a group of security men from a neutral embassy, put in a car and driven straight to the Hungarian border for her own safety. Arkan, the paramilitary leader who was charged with war-crimes as the war began, had established himself in the Hyatt’s coffee-shop in order to keep an eye on the Western journalists. His thugs, men and women dressed entirely in black, hung around the lobby. Reuters Television and the European Broadcasting Union had been closed down around noon by units of the secret police. They slapped some people around, and robbed a BBC cameraman and producer… of a camera.

Simpson was in two minds. He wanted to stay in Belgrade but yet wanted to get out with all the others. The eight of them in the BBC team had a meeting during which it quickly became clear that everyone else wanted to leave. He argued briefly for staying, but he didn’t want to be left entirely on his own in Belgrade with such lawlessness all around him. It felt like a re-run of the bombing of Baghdad in 1991, but then he had been hustled out of Iraq with the other Western journalists after the first five days of the bombing; now he was leaving Belgrade after only twenty-four hours, which didn’t feel right. At that point, he heard that an Australian correspondent whom he knew from Baghdad and other places was staying. Since Australia was not part of NATO, he couldn’t simply be ordered to leave. So, with someone else to share the risk, he decided he would try to stay too:

… I settled back on the bed, poured myself a generous slug of ‘Laphroaig’ and lit an Upmann’s Number 2. I had selected a CD with some care, and it was playing now:

‘There may be trouble ahead; But while there’s moonlight, and music, and love and romance; Let’s face the music and dance’.

Outside, a familiar wailing began: the air-raid siren. I took my Laphroaig and my cigar over to the window and looked out at the anti-aircraft fire which was already arcing up, red and white, into the night sky.

The bombing campaign lasted from 24 March to 11 June 1999, involving up to 1,000 aircraft operating mainly from bases in Italy and aircraft carriers stationed in the Adriatic. With the exception of Greece, all NATO members were involved to some degree. Over the ten weeks of the conflict, NATO aircraft flew over thirty-eight thousand combat missions. The proclaimed goal of the NATO operation was summed up by its spokesman as “Serbs out, peacekeepers in, refugees back”. That is, Yugoslav troops would have to leave Kosovo and be replaced by international peacekeepers to ensure that the Albanian refugees could return to their homes. The campaign was initially designed to destroy Yugoslav air defences and high-value military targets. But it did not go very well at first, with bad weather hindering many sorties early on.

Three days after John Simpson had decided to remain behind in Belgrade, still alone and having slept a total of seven hours since the war began, and with every programme of the BBC demanding reports from him, he had to write his weekly column for the Sunday Telegraph. At five-thirty in the morning, he described the situation as best he could, then paused to look at the television screens across the room. BBC World, Sky and CNN were all showing an immense flood of refugees crossing the Macedonian border from Kosovo. Yet protecting these people was surely the main purpose of the NATO bombing – that, and encouraging people in Serbia itself to turn against their President, Slobodan Milosevic. But NATO had seriously underestimated Milošević’s will to resist. Most of the people in Belgrade who had once been against him now seemed to have rallied to his support. Some of them had already been shouting at the journalist. And the ethnic Albanians of Kosovo certainly weren’t being protected. He went back to his word-processor and wrote:

If that was the purpose of the bombing, then it isn’t working yet.

He added a few more paragraphs, and then hurriedly faxed the article to London before the next wave of demands from BBC programmes could break over him. The Sunday Telegraph ran the article ‘rather big’ the next day, under the imposing but embarrassing headline, I’m sorry, but this war isn’t working. Tony Blair read the headline and was reported to be furious, yet he must have realised that it was true. His aim and that of Bill Clinton had been to carry out a swift series of air attacks that would force Milosevic to surrender. But the NATO onslaught had been much too feeble and much too circumscribed. Besides the attacks on Belgrade itself, British and American jets had attacked targets only in Kosovo and not in the rest of Serbia, so that other towns and cities had not been touched. Neither had the centre of the Serbian capital itself. President Clinton, as worried as ever about domestic public opinion, had promised that there would be no ground war. Significantly, for the future of the war, an American stealth bomber had crashed, or just possibly been shot down, outside Belgrade. After four days of the war, it began to look as if it might not be such a walkover for NATO after all.

Milošević couldn’t make a quick climb-down in the face of NATO’s overwhelming force now; his own public opinion, intoxicated by its unexpected success, wouldn’t accept it. In any case, the force didn’t seem quite so overwhelming, and Serbia didn’t seem quite so feeble, as had been predicted in Western ‘propaganda’. NATO was clearly in for a far longer campaign than it had anticipated, and there was a clear possibility that the alliance might fall apart over the next few weeks. So the machinery of the British government swung into action to deal with the problem, or rather the little local difficulty that a BBC journalist, also ‘freelancing’ for the Sunday Telegraph, had had the audacity to suggest that things were not quite going to plan. Backbench Labour MPs began complaining publicly about Simpson’s reporting. So Simpson decided to go out onto the streets of Belgrade to sample opinion for himself. Other foreign camera crews had already had a difficult time trying to do this, and Simpson admitted to being distinctly nervous, as were his cameraman and the Serbian producers he had hired.

People crowded around them and jostled them in order to scream their anger against NATO. These were not stereotypical supporters of the Belgrade régime; many of them had taken part in the big anti-Milošević demonstrations two years earlier. But they felt that, in the face of the bombing, they had no alternative but to regard themselves first and foremost as Serbian patriots, and therefore to support him as their leader. There was little doubt about the intensity of feeling: the men and women who gathered around the BBC team were on the very edge of violence. Before they started their interviews, the team asked a couple of passing policemen if they would provide some protection; the policemen walked off laughing. After their report was broadcast on that night’s Nine O’Clock News, the British government suggested, off the record, that the people interviewed were obviously afraid of Milošević’s secret police and had said only what they had been instructed to tell the BBC, or that they had been planted by the authorities for the team to interview. It was strange, the anonymous voices suggested, that someone as experienced as John Simpson should have failed to realise this.

But the criticism of the bombing campaign was beginning to hit home. The bombers began hitting factories, television stations, bridges, power stations, railway lines, hospitals and many government buildings. This was, however, no more successful. Many innocent civilians were killed and daily life was disrupted across much of Serbia and Kosovo.

The worst incident was when sixty people were killed by an American cluster bomb in a market.

(Pictured above: Smoke in Novi Sad (Újvidék) after NATO bombardment. The aerial photo (below) on the right shows post-strike damage assessment of the Sremska Mitrovica ordnance storage depot, Serbia).

NATO military operations switched increasingly to attacking Yugoslav units on the ground, hitting targets as small as individual tanks and artillery pieces, as well as continuing with the strategic bombardment.

This activity was, however, heavily constrained by politics, as each target needed to be approved by all nineteen member states. By the start of April, the conflict appeared little closer to a resolution, and NATO countries began seriously to consider conducting ground operations in Kosovo. In mid-April, a NATO aircraft attacked an Albanian refugee convoy ‘by mistake’, believing it was a Yugoslav military convoy (the pilots may have mistaken the ‘Raba’ farm trucks for troop carriers of a similar make and shape), killing around fifty people. NATO admitted its mistake five days later, but only after the Yugoslavs had accused NATO of deliberately attacking the border-bound refugees; however, a later report conducted by the International Criminal Tribunal for the former Yugoslavia (ICTY) gave its verdict that…

… civilians were not deliberately attacked in this incident … neither the aircrew nor their commanders displayed the degree of recklessness in failing to take precautionary measures which would sustain criminal charges.

Reporting the War: Blair & the BBC.

At the time, in reply to these charges, NATO put forward all sorts of explanations for what had happened, insisting that the convoy had been escorted by the Serbian military, thus making it a legitimate target. An American general suggested that, after the NATO jets attacked, the Serbian soldiers travelling with the convoy had leapt out of the vehicles and in a fit of rage had massacred the civilians. It wasn’t all that far-fetched as a possible narrative; both before and after the incident, Serbian soldiers and paramilitaries carried out the most disgusting reprisals against innocent ethnic Albanian civilians. But it wasn’t true in this case. It later transpired that British pilots had recognised the convoy as a refugee one, and had warned the Americans not to attack. In a studio interview for the Nine O’Clock News on the night of the incident, John Simpson was asked who might have been responsible for the deaths of the refugees. He replied that if it had been done by the Serb forces, they would try to hush it up quickly; but if it had been NATO, then the Serbian authorities would probably take the journalists and TV crews to the site of the disaster and show them, as had happened on several occasions already when the evidence seemed to bear out the Serbian narratives.

The following day, the military press centre in Belgrade duly provided a coach, and the foreign journalists were taken down to see the site. The Serbs had left the bodies where they lay so that the cameramen could get good pictures of them; such pictures made excellent propaganda for them, of course. It was perfectly clear that NATO bombs had been responsible for the deaths, and eventually, NATO was obliged to give an unequivocal acceptance of culpability and to issue a full apology. But Downing Street was worried that disasters like this would turn public opinion against the war. As the person who had suggested that the Serbian version of events might actually be true, John Simpson became the direct target of the Blair government’s public relations machine. Tony Blair had staked everything on the success of NATO’s war against Milosevic, and it wasn’t going well. So he did precisely what the Thatcher government had done in the Falklands War in 1982, and during the Libyan bombing campaign of 1986, when the US planes used British bases, and what the Major administration did in 1991 when civilian casualties began to mount in the Gulf War: he attacked the BBC’s reporting as being biased. As an experienced war correspondent, Simpson had been expecting this knee-jerk reaction from the government:

Things always go wrong in war, and it’s important that people should know about it when it happens, just as they should know when things are going well. … No doubt arrogantly … I reckoned that over the years I had built up some credibility with the BBC’s audiences, so that people wouldn’t automatically believe it if they were told that I was swallowing the official Serbian line or deliberately trying to undermine NATO’s war effort. I did my utmost to report fairly and openly; and then I sat back and waited for the sky to fall in.

On 14 April, twenty-two days into the war, it did. Simpson started to get calls from friends at Westminster telling him that Alastair Campbell, Tony Blair’s press spokesman, had criticised his reporting in the Westminster press lobby, briefing about the BBC correspondent’s lack of objectivity. Anonymous officials at the Ministry of Defence were also ‘whispering’ that he was blatantly pro-Serbian. The British Foreign Secretary, Robin Cook, called on him to leave Belgrade, and Clare Short, the international development secretary, suggested that his reporting was akin to helping Hitler in the Second World War. Soon, Tony Blair himself was complaining to the House of Commons that Simpson was reporting under the instruction and guidelines of the Serbian authorities. If he had made this statement outside Parliament, it would have been actionable. Simpson later asserted that:

It was absolutely and categorically untrue: I was neither instructed nor guided by the Serbs in what I said, and in fact my reports were more frequently censored by the Serbian authorities than those of any correspondent working in Belgrade throughout this period. Not only that, but our cameraman was given twenty-four hours to leave the country at the very time these accusations were being made, in order to punish the BBC for its ‘anti-Serbian reporting’.

The political editor of The Times, Philip Webster, then wrote a story which appeared on its front page on 15 April, reporting that the British government was accusing Simpson of pro-Serbian bias. This resulted in each of the mainstream broadsheet newspapers criticising the government for its attacks on the BBC, and several of the tabloids also made it clear that they didn’t approve, including the Sun and the Daily Mail, neither of which was particularly friendly to the BBC. MPs from all sides of the House of Commons and various members of the Lords spoke up on behalf of Simpson and the BBC. Martin Bell, the war reporter turned MP, also came to his defence, as did John Humphrys, the BBC radio presenter.

The BBC itself, which had not always rallied around its staff when they came under fire from politicians, gave Simpson unequivocal backing of a type he had not experienced before. Downing Street immediately backed away; when he wrote a letter of complaint to Alastair Campbell, he did not get an apology in reply, but an assurance that his professional abilities had not been called into question. As far as Whitehall was concerned, that was the end of it. Still, the predictable suggestion that there was some sort of similarity between the bombing of Serbia and the Second World War clearly struck a chord with some people. Simpson started to get shoals of angry and often insulting letters. The following example, in a ‘spidery hand’ from Anglesey, was typical:

Dear Mr Simpson,

When your country is at war and when our young men are putting their lives at risk on a daily basis, it is only a fool that would say or write anything to undermine their bravery. … in Hitler’s day you would be put in a safe place … where you probably belong.

Of course, the air campaign against Serbia was nothing like the Second World War. There was no conceivable threat to British democracy, nor to Britain’s continued existence as a nation. In this case, the only danger was to NATO’s cohesion, and to the reputation of Tony Blair’s government. The problem was, as we had seen under Thatcher, that politicians had their own way of identifying their own fate with that of the country as a whole. The attacks on John Simpson attracted a great deal of attention from around the world, as the international media saw them as an attempt by the British government to censor the BBC. In Belgrade, the story was given huge attention, as the Serbian press and television seemed to think that it put the BBC on the same basis as themselves: totally controlled by the state. Simpson refused on principle to be interviewed by any Serbian journalist, especially from state television, and pointed out to any of them who asked…

the difference between a free press and the kind of pro-government reporting that President Milosevic liked. None was quick-witted enough to reply that Tony Blair might have liked it too.

The Posturing PM & A Peculiar Way to Make a Living:

On 7 May, an allegedly ‘stealthy’ US bomber blew down half the Chinese Embassy in Belgrade, causing a huge international row. The NATO bombs killed three Chinese journalists and outraged Chinese public opinion.

Pictured left: Yugoslav anti-aircraft fire over Belgrade at night.

The United States and NATO later apologised for the bombing, saying that it occurred because of an outdated map provided by the CIA, although this was challenged by a joint report from The Observer (UK) and Politiken (Denmark), which claimed that NATO had intentionally bombed the embassy because it was being used as a relay station for Yugoslav army radio signals. Meanwhile, low cloud and the use of decoys by Milošević’s generals limited the military damage in general.

Pictured right: Post-strike bomb damage assessment photo of Zastava car plant.

In another incident at the Dubrava prison in Kosovo in May 1999, the Yugoslav government attributed as many as 85 civilian deaths to NATO bombing of the facility after NATO sighted Serbian and Yugoslav military activity in the area. However, a Human Rights Watch report later concluded that at least nineteen ethnic Albanian prisoners had been killed by the bombing, but that an uncertain number – probably more than seventy – were killed by Serbian Government forces in the days immediately following the bombing.

But Washington was alarmed by the British PM’s moral posturing, and it was only after many weeks of shuttle diplomacy that things began to move. Blair ordered that fifty thousand British soldiers, most of the available army, should be made ready to invade Kosovo. This would mean a huge call-up of reserves, and if it was designed to call Milošević’s bluff, it was gambling on a massive scale, as other European nations had no intention of taking part in a ground campaign. But he did have the backing of NATO, which had decided that the conflict could only be settled by introducing a military peacekeeping force under its auspices in order to forcibly restrain the two sides. The Americans therefore began to toughen their language and worked together with the Russians to apply pressure on Milošević. Finally, at the last minute of this brinkmanship, the Serb Parliament and President buckled and agreed to withdraw their forces from Kosovo, accepting its virtual independence under an international mandate. Milošević finally recognised that Russia would not intervene to defend Yugoslavia despite Moscow’s strong anti-NATO rhetoric. He thus accepted the conditions offered by a Finnish–Russian mediation team and agreed to a military presence within Kosovo headed by the UN, but incorporating NATO troops.

From June 1999, therefore, Kosovo found itself administered by the international community. Many Kosovar Serbs migrated into Serbia proper, and in 2001 there was further Albanian guerilla activity in ‘northern Macedonia’, where an ethnic Albanian insurgent group, the NLA, threatened to destabilise that new country, in which over a third of the population is ethnic Albanian. Blair had won a kind of victory. Eight months later, Milošević was toppled from power and ended up in The Hague, charged with war crimes. John Simpson managed to hang on in Belgrade for fourteen weeks altogether, and would have stayed there longer had he not been thrown out by the security police for ‘non-objective’ reporting; that is, reporting that was too objective for their taste. By that stage, the war was all but over. By then, too, his wife Dee had been with him for almost a month, braving NATO bombs and the sometimes angry crowds in order to make some of their Simpson’s World programmes there (she is pictured below with John, back at their home near Dublin). Simpson himself ended up in hospital following a pool-side accident at the Hyatt Hotel. The hospital was surrounded by potential NATO targets, and part of it had been hit. Power-cuts happened every day, and operations were affected as a result. After his operation, he lay in a large ward listening to the NATO planes flying overhead:

Most of my war had been spent in the Hyatt hotel, which even NATO seemed unlikely to regard as a target. The hospital was different. Every now and then there would be the sound of a heavy explosion, not far away. The patients up and down the corridor groaned or yelled out in their sleep. It was completely dark, because the power had been cut again… Sometimes one of the fifty or so people would call urgently for a nurse… No one would come. The hospital tried to minimise the danger to its staff by keeping as few people on at night as possible. There were only two nurses in our part of the hospital… What would happen, I wondered, if the ward were hit by NATO? … How would I get out, given that I couldn’t even move?…

… I drifted into a kind of sleep, … the sound of bombers overhead and the shudder of explosions. In many ways, I suppose, it was unpleasant and frightening. Yet even then I saw it as something slightly different, as though I were standing outside myself observing. It was an extraordinary experience, what journalists would call a story, and for once I was the participant as well as the onlooker. … This is really why I do the work I do, and live the strange, rootless, insecure life I do; and even when it goes wrong I can turn it into a story. Lying in my hospital bed I fished a torch out of my bag, reached for my notebook, and started writing a despatch for ‘From Our Own Correspondent’ about being in a Serbian hospital during the bombing.

001

As far as the British Prime Minister’s foreign policy was concerned, first Operation Desert Fox and then Kosovo were vital to the ‘learning curve’ which determined his response to the 9/11 attacks in New York and Washington, and in particular his backing for the full-scale invasion of Iraq. They taught him that bombing, by itself, rarely worked. They suggested that, threatened with a ground invasion by superior forces, dictators would back down. They confirmed him in his view of himself as a moral war leader, combating dictators. After working well with Clinton over Desert Fox, however, Blair was concerned that he had tried to bounce the President too obviously over Kosovo. He learned that US Presidents needed careful handling, but that he could not rely on Britain’s European allies very much in military matters. Nevertheless, he later pressed the case for the establishment of a European ‘rapid reaction force’ to shoulder more of the burden in future regional wars. He learned to ignore criticism from both left and right at home, which became deafening during the bombing of Belgrade and Kosovo. He learned to cope with giving orders which would result in much loss of life. And he learned an abiding hostility to the media, and in particular to the BBC, whose reporting of the Kosovo bombing campaign, especially that of John Simpson, had infuriated him.

The Beginnings of Euro-Atlantic Reintegration, 1998-99:

Map 4:

001

(Nagorno-Karabakh, Chechnya and Tatarstan asserted their independence after 1990)

007

The close working relationship between the United States, the United Kingdom and Hungary, and their cooperation at all levels throughout the period 1989-99, had helped to pave the way for a smooth transition to full NATO membership for the Republic at the end of those years. During the NATO summit in Madrid, Secretary-General Javier Solana had invited Hungary, the Czech Republic and Poland to consider joining NATO. A national referendum in Hungary had approved NATO membership on 16 November 1997. At the end of January 1999, following the completion of the ratification process in the existing member states, including the UK (in August 1998), the Minister of Foreign Affairs, János Martonyi, received a letter from Solana formally inviting Hungary to join NATO. The same letter was sent to the Foreign Ministries of the Czech Republic and Poland. The National Assembly in Hungary voted overwhelmingly (96%) for accession on 9 February, and on 12 March the solemn ceremony of the accession of the three countries was held in Independence, Missouri, the birthplace of the former US President Harry S Truman, in the library named after him. In her speech praising the three countries, US Secretary of State Madeleine Albright emphasised the significance of the 1956 Hungarian Revolution for world history and welcomed the country of King Stephen and Cardinal Mindszenty into the Atlantic Alliance.

003

006

Later that year, Martonyi wrote that…

The tragic events that have been taking place in the territory of the former Yugoslavia, most lately in Kosovo, have made us realise in a dramatic way that security means much more than just its military definition and that the security of Europe is indivisible. Crisis situations have also warned us that one single organisation, however efficient, is not able to solve the economic, environmental or security problems of a region, let alone of the whole continent, on its own. … Another important lesson of the crisis in the former Yugoslavia has been that no durable peace can be achieved in the region in the absence of genuine democracy and functioning democratic institutions in the countries concerned.  

005

When Hungary acceded to NATO and its flag was raised outside the Alliance’s HQ in Brussels on 16 March, along with those of Poland and the Czech Republic, it finally became a formal ally of the United States and the United Kingdom. By 2001 many of the former eastern bloc countries had submitted applications for membership of the EU, eventually joining in 2004. The European Community had formally become the European Union on 1 November 1993, following the ratification of the Maastricht Treaty, and in 1994 Hungary became the first of the newly liberated Central European countries to apply for membership. Poland, Slovakia, the Czech Republic, Romania and Bulgaria followed soon after. The European Free Trade Association (EFTA), which had been set up by Britain in 1959 as an alternative to the EEC (to which De Gaulle later said “Non!” when Britain applied to join), gradually lost members to the EC/EU. Most of the remaining EFTA countries – Finland, Sweden and Austria – joined the EU in 1995, although Norway rejected membership in a referendum.

Map 5:

001 (3)

Despite all the bullets and bombs which had been flying in the course of the wars in the former Yugoslavia, and to some extent because of them, Europe emerged from the nineties as a more politically and economically integrated continent than it had been at the end of the eighties, and possibly than at any time since before the Balkan Wars of the early twentieth century. Through the expansion of NATO, and despite the posturing of the Blair government, the Atlantic Alliance was also in its strongest shape since the end of the Cold War, able to adapt to the re-shaping of the world which was to follow the millenarian events of the early years of the twenty-first century.

Sources:

Mark Almond, András Bereznay, et. al. (2001), The Times History of Europe. London: Times Books/ Harper Collins Publishers.

Andrew Marr (2008), A History of Modern Britain. Basingstoke: Pan Macmillan.

John Simpson (1999), Strange Places, Questionable People. Basingstoke: Pan Macmillan.

Rudolf Joó (ed.)(1999), Hungary: A Member of NATO. Budapest: Ministry of Foreign Affairs of the Republic of Hungary.

 

Posted October 27, 2018 by TeamBritanniaHu in Baghdad, Balkan Crises, BBC, Britain, British history, Britons, Bulgaria, Cold War, Communism, Conservative Party, democracy, Ethnic cleansing, Europe, European Economic Community, European Union, Falklands, Genocide, guerilla warfare, Gulf War, History, Hungary, Iraq, John Major, Labour Party, liberal democracy, Margaret Thatcher, Migration, Militancy, Narrative, nationalism, Nationality, NATO, New Labour, Ottoman Empire, Population, Refugees, Russia, Seasons, Security, Serbia, Statehood, terror, terrorism, tyranny, United Nations, USA, USSR, War Crimes, Warfare, Yugoslavia


‘Celebrity’ Britain: The Arrival of ‘New Labour’ & Diana’s Demise, 1994-99.

The Advent of Brown & Blair:


Tony Blair was far more of an establishment figure than his mentor John Smith, or his great political ‘friend’ and future rival, Gordon Brown. He was the son of a Tory lawyer and went to preparatory school in Durham and then to a fee-paying boarding school in Edinburgh. He then went ‘up’ to Oxford, becoming a barrister and joining the Labour Party before he fell in love with a young Liverpudlian socialist called Cherie Booth, who sharpened his left-wing credentials before he became an MP at the 1983 General Election, winning a safe Labour seat in the north-east of England. Once in the Commons, he quickly fell in with Gordon Brown, another new MP, who was much of what Blair was not: a tribal Labour Party man from a family which was strongly political and had rarely glimpsed the English Establishment, even the middle ranks from which Blair sprang. Brown had been Scotland’s best-known student politician and a player in Scottish Labour politics from the age of twenty-three, followed by a stint in television. Yet the two men had their Christian beliefs in common, Anglo-Catholic in Blair’s case and Presbyterian in Brown’s. Most importantly, they were both deeply impatient with the state of the Labour Party. For seven or eight years they seemed inseparable, sharing a small office together. Brown tutored Blair in some of the darker arts of politics, while Blair explained the thinking of the English metropolitan and suburban middle classes to Brown. Together they made friends with Westminster journalists, both maturing as performers in the Commons, and together they worked their way up the ranks of the shadow cabinet.

After the 1992 defeat, Blair made a bleak public judgement about why Labour had lost so badly. The reason was simple: Labour has not been trusted to fulfil the aspirations of the majority of people in a modern world. As shadow home secretary he began to put that right, promising to be tough on crime and tough on the causes of crime. He was determined to return his party to the common-sense values of Christian Socialism, influenced also by the mixture of socially conservative and economically liberal messages used by Bill Clinton and his ‘New Democrats’. So too was Gordon Brown, but as shadow chancellor his job was to demolish the cherished spending plans of his colleagues, and his support for the ERM left him ineffective when Major and Lamont suffered their great defeat. By 1994, the Brown-Blair relationship was less strong than it had been, but they visited the States together to learn the new political style of the Democrats which, to the advantage of Blair, relied heavily on charismatic leadership. Back home, Blair pushed Smith to reform the party rulebook, falling out badly with him in the process. Media commentators began to tip Blair as the next leader, and slowly but surely, the Brown-Blair relationship was turning into a Blair-Brown one.

002 (2)

In the days following the sudden death of the Labour leader, John Smith (pictured right), Tony Blair decided almost immediately to run as his replacement, while Gordon Brown hesitated, perhaps more grief-stricken. But he had assumed he would eventually inherit the leadership, and was aghast when he heard of Blair’s early declaration. There were at least ten face-to-face confrontations between the two men, in London and Edinburgh. In the opinion polls, Blair was shown to be more popular, and he had the backing of more MPs as well as that of the press. Crucial to Blair’s case was his use of received pronunciation which, after Neil Kinnock and John Smith’s heavily accented English, would reassure those more prejudiced parts of the Kingdom which were the main battlegrounds for Labour, and in which Celtic tones were not perhaps as appreciated as they might be nowadays. They were alright when heard from actors and BBC presenters, but they made politicians seem more ‘peripheral’. Brown had a deeper knowledge of the Labour movement and broader support among the trade unions, however, and had also thought through his policy agenda for change in greater detail. Given the vagaries of Labour’s electoral college system, it is impossible to judge, even now, what might have happened had the ‘young English hart’ locked horns with the ‘tough Scottish stag’, but they agreed at the time that it would be disastrous for them to fight each other as two ‘modernisers’ since Brown would have to attack Blair from the left and the unions would then demand their tribute from him if he won.

So the two men came to a deal, culminating in a dinner at a ‘chic’ Islington restaurant. The outcome is still a matter of some dispute, but we know that Blair acknowledged that Brown, as Chancellor in a Labour government, would have complete authority over a wide range of policy which he would direct through the Treasury, including the ‘social justice’ agenda. But it is unlikely that he would have been so arrogant as to agree, as some have suggested, that he would hand over the premiership to Brown after seven years. After all, at that time Labour was still three years away from winning its first term, and not even the sharpest crystal ball could have projected a second term at that juncture. The most significant result of their dinner-table deal was that, following all the battles between Tory premiers and chancellors of the then recent and current Conservative governments, Brown’s Treasury would become a bastion for British home affairs, while Blair was left to concentrate on foreign policy largely unimpeded, with all the tragic consequences with which, with the benefit of twenty years’ hindsight, we are now familiar.

Team Tony & ‘Blair’s Babes’:

When it arrived, the 1997 General Election demonstrated just what a stunningly efficient and effective election-winning team Tony Blair led, comprising those deadly masters of spin, Alastair Campbell and Peter Mandelson. ‘New Labour’, as it was now officially known, won 419 seats, the largest number ever for the party and comparable only with the seats won by the National Government of 1935. Its Commons majority was also a modern record, 179 seats, thirty-three more than Attlee’s landslide majority of 1945. The swing of ten per cent from the Conservatives was another post-war record, roughly double that which the 1979 Thatcher victory had produced in the opposite direction. But the turn-out was very low: at seventy-one per cent, it was the lowest since 1935. Labour had won a famous victory, but with nothing like as many actual votes as John Major had won five years earlier. Blair’s party also won heavily across the south and in London, in parts of Britain which it had been unable to reach or represent in recent times.

As the sun came up on a jubilant, celebrating Labour Party returning to power after an eighteen-year absence, there was a great deal of Bohemian rhapsodizing about a new dawn for Britain. Alastair Campbell had assembled crowds of party workers and supporters to stand along Downing Street waving union flags as the Blairs strode up to claim their victory spoils. Briefly, at least, it appeared that the whole country had turned out to cheer the champions. In deepest Lib-Con ‘marginal’ Somerset, many of us had been up all night, secretly sipping our Cava in front of the incredible scenes unfolding before our disbelieving eyes, and when the results came in from Basildon and Birmingham Edgbaston (my first constituency at the age of eighteen, when it had already been a safe seat for Tory matron Jill Knight for at least a decade), we were sure that this would indeed be a landslide victory, even if we had had to vote for the Liberal Democrats in the West Country just to make sure that there was no way back for the Tories. The victory was due to a small group of self-styled modernisers who had seized the Labour Party and made it a party of the ‘left and centre-left’, at least for the time being, though by the end of the following thirteen years, and after two more elections, they had taken it further to the right than anyone expected on that balmy early summer morning. But there was no room for cynicism amid all the euphoria: Labour was rejuvenated, and that was all that mattered.

A record number of women were elected to Parliament, 119, of whom 101 were Labour MPs, the so-called ‘Blair’s Babes’. Despite Britain having been one of the first countries in the world to have a female prime minister, in 1987 women made up just 6.3% of MPs in the UK, compared with 10% in Germany and about a third in Norway and Sweden. Only France, with 5.7%, came below the UK.

Before the large group of women MPs joined her in 1997, Margaret Hodge (pictured below, c.1992, and right, c. 2015) had already become MP for Barking in a 1994 by-election, following the death of the Labour MP Jo Richardson. While still a new MP, Hodge endorsed the candidature of Tony Blair, a former Islington neighbour, for the Labour Party leadership, and was appointed Junior Minister for Disabled People in 1998. Before entering the Commons, she had been Leader of Islington Council and had not been short of invitations from constituencies to stand in the 1992 General Election. Given that she is now referred to as a ‘veteran MP’, it is interesting to note that she had turned these offers down, citing her family commitments:


“It’s been a hard decision; the next logical step is from local to national politics and I would love to be part of a Labour government influencing change. But it’s simply inconsistent with family life, and I have four children who mean a lot to me. 

“It does make me angry that the only way up the political ladder is to work at it twenty-four hours a day, seven days a week. That’s not just inappropriate for a woman who has to look after children or relatives, it’s inappropriate for any normal person.

“The way Parliament functions doesn’t attract me very much. MPs can seem terribly self-obsessed, more interested in their latest media appearance than in creating change.” 



Patricia Hewitt (pictured above, in 1992, and more recently, right) had first begun looking for a seat in the 1970s, when she was general secretary of the National Council for Civil Liberties (NCCL). She later commented that … looking for a seat takes an enormous amount of time, and money too, if you’re travelling a lot. Eventually, she was chosen to fight Leicester East in 1983, a contest which she lost by only nine hundred votes to the Conservative in what was then a relatively safe Tory seat. She later recalled driving up to Leicester on two evenings every week:

“I was planning to have a child after the elections – looking back I don’t know I imagined I was going to cope if Labour had won the seat… Even without children, I was leading such a pressured life – and my partner was doing the same as a Labour councillor – that it did put a strain on our relationship.”

She then became Neil Kinnock’s press and broadcasting secretary. In this role, she was a key player in the first stages of the ‘modernisation’ of the Labour Party; along with Clive Hollick, she helped set up the Institute for Public Policy Research and was its deputy director from 1989 to 1994. By the time of the 1992 General Election she had two small children, so she decided not to look for a seat. Following Labour’s defeat in 1992, Hewitt was asked by the new Labour leader, John Smith, to help establish the Commission on Social Justice, of which she became deputy chair. She then became head of research with Andersen Consulting, remaining in the post from 1994 to 1997. Hewitt was finally elected to the House of Commons as the first female MP for Leicester West at the 1997 General Election, following the retirement of Greville Janner. She was elected with a majority of 12,864 and remained the constituency’s MP until stepping down in 2010.


Mary Kaldor (pictured right in the 1980s, and below in 2000), by contrast, never became an MP: one of the ‘loves’ Labour lost. A British academic, currently Professor of Global Governance at the London School of Economics, where she is also Director of the Civil Society and Human Security Research Unit, she is the daughter of Nicholas Kaldor, an exiled Hungarian economist who became an adviser to Harold Wilson in the 1960s. In the nineties, she was a senior research fellow at the Science Policy Research Unit at the University of Sussex, and a former foreign policy adviser to the Labour Party. She was shortlisted for Hackney and Dulwich in 1981, attending masses of meetings, many of which were boring, at which she was endlessly having to be nice to people. Her youngest child was two years old at the time, and she was therefore ambivalent about the idea of becoming an MP:

“I was very well-equipped with baby minders and a nice understanding husband, but what on earth is the point of having children if you’re not going to see them?

“Building links with eastern Europe through the peace movement was more exciting than anything I could ever have done as an MP … (which seemed) entirely about competitiveness and being in the limelight, giving you no time to think honestly about your political views.”


In 1999, Kaldor supported international military intervention over Kosovo on humanitarian grounds, calling in an article for The Guardian for NATO ground forces to follow the aerial bombardment. I have written about the war in Kosovo in a separate article in this series. Significantly, however, by the end of the next decade Kaldor had lost faith in the principle and practice of humanitarian intervention, telling the same paper:

The international community makes a terrible mess wherever it goes…

It is hard to find a single example of humanitarian intervention during the 1990s that can be unequivocally declared a success. Especially after Kosovo, the debate about whether human rights can be enforced through military means is ever more intense.

Moreover, the wars in Afghanistan and Iraq, which have been justified in humanitarian terms, have further called into question the case for intervention.


Blair needed the support and encouragement of admirers and friends who would coax and goad him. There was Mandelson, the brilliant but temperamental former media boss, who had now become an MP. Although adored by Blair, he was so mistrusted by other members of the team that Blair’s inner circle gave him the codename ‘Bobby’ (as in Bobby Kennedy). Alastair Campbell, Blair’s press officer and attack-dog, is pictured above in a characteristic ‘pose’. A former journalist and natural propagandist, he had helped orchestrate the campaign of mockery against Major. Then there was Anji Hunter, the contralto charmer who had known Blair as a young rock-singer and was his best hotline to middle England. Derry Irvine, a brilliant Highlands lawyer who had first found places in his chambers for Blair and Booth, advised on constitutional change and became Lord Chancellor in due course. These people, with the Brown team working in parallel, formed the inner core. The young David Miliband, son of a well-known Marxist philosopher, provided research support. Among the MPs who were initially close were Marjorie ‘Mo’ Mowlam and Jack Straw, but the most striking aspect of ‘Tony’s team’ was how few elected politicians it included.

The small group of people who put together the New Labour ‘project’ wanted to find a way of governing which helped the worst off, particularly by giving them better chances in education and in finding jobs, while not alienating the mass of middle-class voters. They were extraordinarily worried by the press and media, bruised by what had happened to Kinnock, with whom they had all worked, and ruthlessly focused on winning over anyone who could be won. But they were ignorant of what governing would be like. They were able to take power at a golden moment when it would have been possible to fulfil all the pledges they had made. Blair had the wind at his back, as the Conservatives would pose no serious threat to him for many years to come. Far from inheriting a weak or crisis-ridden economy, he was actually taking over at the best possible time, when the country was recovering strongly but had not yet quite noticed that this was the case. Blair had won by being ruthless, and never forgot it, but he also seemed not to realise quite what an opportunity ‘providence’ had handed him.

Cool Britannia and the Celebrity Princess:


Above: a page from a recent school text.

Tony Blair arrived in power in a country with a revived fashion for celebrity, one offering a few politicians new opportunities, but at a high cost. It was not until 1988 that the full shape of modern celebrity culture became apparent. That year saw the launch of Hello!, the first of the truly modern glossy glamour magazines. Its successful formula was soon copied by OK! from 1993, and many other magazines followed suit, to the point where yards of coloured ‘glossies’ filled the newsagents’ shelves in every town and village in the country. Celebrities were paid handsomely for being interviewed and photographed in return for coverage which was always fawningly respectful and never hostile. The rich and famous, no matter how flawed in real life, were able to shun the mean-minded sniping of the ‘gutter press’, the tabloid newspapers. In the real world, the sunny, airbrushed world of Hello! was inevitably followed by divorces, drunken rows, accidents and ordinary scandals. But people were happy to read good news about these beautiful people, even if they knew that there was more to their personalities and relationships than met the eye. In the same year that Hello! went into publication, ITV also launched the most successful of its daytime television shows, This Morning, hosted from Liverpool by Richard Madeley and Judy Finnigan, providing television’s celebrity breakthrough moment.

This celebrity fantasy world, which continued to open up in all directions throughout the nineties, served to re-emphasise to alert politicians, broadcasting executives and advertisers the considerable power of optimism. The mainstream media in the nineties was giving the British an unending stream of bleakness and disaster, so millions tuned in and turned over to celebrity. That they did so in huge numbers did not mean that they thought celebrities had universally happy lives. And in the eighties and nineties, no celebrity gleamed more brightly than the beautiful yet troubled Princess Diana. For fifteen years she was an ever-present figure: an aristocratic girl whose childhood had been blighted by her parents’ divorce, whose fairytale marriage in 1981 found her pledging her life to a much older man who shared few of her interests and did not even seem to be truly in love with her. Just as the monarchy had gained from its marriages, especially the filmed-for-television romance, engagement and wedding of Charles and Diana, the latter attracting a worldwide audience of at least eight hundred million, so it lost commensurately from the failure of those unions.


Above: Hello! looks back on the 1981 Royal Wedding from that of 2011.

Diana quickly learned how to work the crowds and to seduce the cameras like Marilyn Monroe. By the end of the eighties, she had become a living fashion icon. Her eating disorder, bulimia, was one suffered by a growing number of young women and teenage girls from less privileged homes. When AIDS was in the news, she hugged its victims to show that it was safe, and she went on to campaign for a ban on the use of land-mines. The slow disintegration of this marriage transfixed Britain, as Diana moved from china-doll debutante to painfully thin young mother, to an increasingly charismatic and confident public figure, surprising her husband, who had always assumed she would be in his shadow. After the birth of their second son, Harry, in 1984, Charles and Diana’s marriage was visibly failing.

When rumours spread of her affairs, they no longer had the moral impact that they might have had in previous decades. By the nineties, Britain was now a divorce-prone country, in which ‘what’s best for the kids’ and ‘I deserve to be happy’ were phrases which were regularly heard in suburban kitchen-diners. Diana was not simply a pretty woman married to a king-in-waiting but someone people felt, largely erroneously, would understand them. There was an obsessive aspect to the admiration of her, something that the Royal Family had not seen before, and its leading members found it very uncomfortable and even, at times, alarming. They were being challenged as living symbols of Britain’s ‘family values’ and found wanting, just as John Major’s government would also be hoisted by its own petard as its ‘Back to Basics’ campaign was overwhelmed by an avalanche of sexual and financial scandals.

By the mid-1990s, the monarchy was looking shaky, perhaps even mortal. The strain of being at once a ceremonial and a familial institution was proving a bit much. The year 1992, referred to by the Queen as her ‘annus horribilis’ in her Christmas speech, first saw the separation of the other royal couple, Andrew and Sarah, followed by a major fire at Windsor Castle in November. The journalist Andrew Morton claimed to tell Diana’s True Story in a book which described suicide attempts, blazing rows, her bulimia and her growing certainty that Prince Charles had resumed an affair with his old love, Camilla Parker-Bowles, something he later confirmed in a television interview with Jonathan Dimbleby. In December, John Major announced the separation of Charles and Diana to the House of Commons. There was a further blow to the Royal Family’s prestige in 1994, when it was announced that the royal yacht Britannia, the floating emblem of the monarch’s global presence, was to be decommissioned.


 Above: Prince William with his mother, c. 1994.

Then came the revelatory 1995 interview on BBC TV’s Panorama programme between Diana and Martin Bashir. Breaking every taboo left in Royal circles, she freely discussed the breakup of her marriage, claiming that there were ‘three of us’ in it, attacked the Windsors for their cruelty and promised to be ‘a queen of people’s hearts’. Finally divorced in 1996, she continued her charity work around the world and began a relationship with Dodi al-Fayed, the son of the owner of Harrods, Mohammed al-Fayed. To many in the establishment, she was a selfish, unhinged woman who was endangering the monarchy. To many millions more, however, she was more valuable than the formal monarchy, her readiness to share her pain in public making her even more fashionable. She was followed all around the world, her face and name selling many papers and magazines. By the late summer of 1997, Britain had two super-celebrities, Tony Blair and Princess Diana.

It was therefore grimly fitting that Tony Blair’s most resonant words as Prime Minister, the words which brought him to the height of his popularity, came on the morning when Diana was killed in a car crash, together with Dodi, in a Paris underpass. Blair was woken from a deep sleep at his constituency home, first to be told about the accident, and then to be told that Diana had died. Deeply shocked and worried about what his proper role should be, Blair spoke first to Campbell and then to the Queen, who told him that neither she nor any other senior member of the Royal Family would be making a statement. He decided, therefore, that he had to say something. Later that Sunday morning, standing in front of his local parish church, he spoke words which were transmitted live around the world:

“I feel, like everyone else in this country today, utterly devastated. Our thoughts and prayers are with Princess Diana’s family, in particular her two sons, her two boys – our hearts go out to them. We are today a nation in a state of shock…

“Though her own life was often sadly touched by tragedy, she touched the lives of so many others in Britain and throughout the world with joy and with comfort. How many times shall we remember her in how many different ways, with the sick, the dying, with children, with the needy? With just a look or a gesture that spoke so much more than words, she would reveal to all of us the depth of her compassion and her humanity.

“People everywhere, not just here in Britain, kept faith with Princess Diana. They liked her, they loved her, they regarded her as one of the people. She was – the People’s Princess and that is how she will stay, how she will remain in our hearts and our memories for ever.”

Although these words seem, more than twenty years on, reminiscent of past tributes paid to religious leaders, at the time they were much welcomed and assented to. They were the sentiments of one natural charismatic public figure about another. Blair regarded himself as the people’s Prime Minister, leading the people’s party, beyond left and right, beyond faction or ideology, with a direct line to the people’s instincts. After his impromptu eulogy, his approval rating rose to over ninety per cent, a figure not normally witnessed in democracies. Blair and Campbell then paid their greatest service to the ancient institution of the monarchy itself. The Queen, still angry and upset about Diana’s conduct and concerned for the welfare of her grandchildren, wanted a quiet funeral and to remain at Balmoral, away from the scenes of public mourning in London. However, this was potentially disastrous for her public image. There was a strange mood in the country deriving from Diana’s charisma, which Blair had referenced in his words at Trimdon. If those words had seemed to suggest that Diana was a saint, a sub-religious hysteria responded to the thought. People queued to sign a book of condolence at St James’ Palace, rather than signing it online on the website of the Prince of Wales. Those queuing even reported supernatural appearances of the dead Princess’ image. By contrast, the lack of any act of public mourning by the Windsors, and the suggestion of a quiet funeral, seemed to confirm Diana’s television criticisms of the Royal Family as being cold, if not cruel, towards her.


In particular, the Queen was criticised for following protocol, which prohibited the flying of flags at Buckingham Palace when she was not in residence, rather than fulfilling the deep need of a grief-stricken public to see the Union flag flying there at half-mast. According to another protocol, flags were flown at half-mast only on the deaths of the monarch or their immediate blood relatives. But the crown lives or dies by such symbolic moments, and the Queen relented. Also, with Prince Charles’ full agreement, Blair and his aides put pressure on the Palace, first into accepting that there would have to be a huge public funeral so that the public could express their grief, and second into accepting that the Queen should return to London. She did, just in time to quieten the genuine and growing anger about her perceived attitude towards Diana. This was a generational problem as well as a class one. The Queen had been brought up in a land of buttoned lips, stoicism and private grieving. She now reigned over a country which expected and almost required exhibitionism. For some years, the deaths of children and the scenes of fatal accidents had been marked by little shrines of cellophane-wrapped flowers, soft toys and cards. In the run-up to Diana’s funeral, parts of central London seemed almost Mediterranean in their public grieving. There were vast mounds of flowers, people sleeping out, holding up placards and weeping in the streets, strangers hugging each other.

The immense outpouring of public emotion in the weeks that followed seemed both to overwhelm and distinguish itself from the more traditional devotion to the Queen herself and to her immediate family. The crisis was rescued by a live, televised speech she made from the Palace which was striking in its informality and obviously sincere expression of personal sorrow. As Simon Schama has put it,

The tidal wave of feeling that swept over the country testified to the sustained need of the public to come together in a recognizable community of sentiment, and to do so as the people of a democratic monarchy.


The funeral itself was like no other before it, bringing the capital to a standstill. In Westminster Abbey, campaigners stood alongside aristocrats, entertainers alongside politicians, and rock musicians alongside charity workers. Elton John performed a hastily rewritten version of ‘Candle in the Wind’, originally his lament for Marilyn Monroe, now dedicated to ‘England’s Rose’, and Princess Diana’s brother, Earl Spencer, made a half-coded attack from the pulpit on the Windsors’ treatment of his sister. This was applauded when it was relayed outside, and clapping was heard in the Abbey itself. Diana’s body was then driven to her last resting place at the Spencers’ ancestral home of Althorp in Northamptonshire. Nearly a decade later, and following many wild theories circulated through cyberspace which reappeared regularly in the press, an inquiry headed by a former Metropolitan Police commissioner concluded that she had died because the driver of her car was drunk and was speeding in order to throw off pursuing ‘paparazzi’ photographers. The Queen recovered her standing after her live broadcast about her wayward former daughter-in-law. She would later rise again in public esteem, coming to be seen as one of the most successful monarchs for centuries and the longest-serving ever. A popular film about her, including a sympathetic portrayal of these events, sealed this verdict.


HM Queen Elizabeth II in 2001.

Tony Blair never again quite captured the mood of the country as he did in those sad late-summer days. It may be that his advice and assistance to the Queen in 1997 was as vital to her as it was, in the view of Palace officials, thoroughly impertinent. His instinct for popular culture when he arrived in power was certainly uncanny. The New Age spiritualism which came out into the open when Diana died was echoed among Blair’s Downing Street circle. What other politicians failed to grasp, and what he did grasp, was the power of optimism expressed in the glossy world of celebrity, and the willingness of people to forgive their favourites not just once, but again and again. One of the negative longer-term consequences of all this was that charismatic celebrities discovered that, if they apologised and bared a little of their souls in public, they could get away with most things short of murder. For politicians, even charismatic ones like Blair, life would prove a little tougher, and the electorate would be less forgiving of oft-repeated mistakes.

(to be continued).

Posted October 22, 2018 by TeamBritanniaHu in Affluence, Agriculture, BBC, Belfast Agreement, Belgium, Birmingham, Britain, Brussels, Christian Faith, Christianity, Church, Conquest, Conservative Party, devolution, Europe, European Economic Community, European Union, France, History, Integration, Ireland, Irish history & folklore, Journalism, Labour Party, Margaret Thatcher, Migration, Millenarianism, Monarchy, Narrative, nationalism, Nationality, New Labour, Population, Respectability, Scotland, Uncategorized, West Midlands


Years of Transition – Britain, Europe & the World: 1992-1997.

Epilogue to the Eighties & Prologue to the Nineties:

I can recall the real sense of optimism which resulted from the end of the Cold War, formally ending with President Gorbachev’s announcement of the dissolution of the Soviet Union on Christmas Day 1991. Although never an all-out global war, it had resulted in the deaths of up to forty million people throughout the world, involving more than a hundred and fifty smaller ‘proxy’ conflicts. Moreover, we had lived under a continual sense of doom, that it was only a matter of time until our brief, young lives would be snuffed out by a nuclear apocalypse. Now, politicians and journalists in the West talked of a coming ‘peace dividend’ and the end of the surveillance, spy and secret state in both east and west. The only continuing threat to British security came from the Provisional IRA. They hit Downing Street with a triple mortar attack in February 1991, coming close to killing the new Prime Minister, John Major, and his team of ministers and officials directing the Gulf War.

By the time Margaret Thatcher left office in tears on 28 November 1990, ‘Thatcherism’ was also a spent force, though its influence lingered on until at least the end of the century, and not just among Conservatives. Only a minority even among the ‘party faithful’ had been true believers, and the Tory MPs would have voted her out had her cabinet ministers not beaten them to it. As Andrew Marr has written, History is harshest to a leader just as they fall. She had been such a strident presence for so long that many who had first welcomed her as a ‘gust’ of fresh air now felt the need for gentler breezes. Those who wanted a quieter, less confrontational leader found one in John Major.

Yet most people, in the end, had done well under her premiership: not just the ‘yuppies’ but also her lower-middle-class critics, who developed their own entrepreneurial sub-cultures rather than depending on traditional sponsorship from arts councils and local authorities. By the early nineties, Britons were on average much wealthier than they had been in the late seventies and enjoyed a wider range of holidays, better food, and a greater variety of television channels and other forms of home entertainment. Nor was everything the Thatcher governments did out of tune with social reality. The sale of council houses corresponded to the long passion of the British to be kings and queens of their own little castles. Sales of state utilities, on the other hand, presupposed a hunger for stakeholdership that was much less deeply rooted in British habits, and the subsequently mixed fortunes of those stocks did nothing to help change those habits. Most misguided of all was the decision to implement the regressive ‘poll tax’. In the end, Thatcher’s 1987-90 government became just the latest in a succession of post-war British governments that had seen their assumptions rebound on them disastrously, a ‘trend’ that was to continue under John Major. The upper middle-class ‘Victorian Values’ of the grocer’s daughter from Grantham were replaced by the ‘family values’ of the lower middle-class garden gnome salesman from Brixton, only for him to be overwhelmed by an avalanche of sexual and financial scandals.

The single most important event of the early nineties in Britain, possibly globally too, had nothing to do with politics and diplomacy or warfare and terrorism, at least not in the nineties. Tim Berners-Lee, a British scientist, invented the World Wide Web (often conflated with the Internet itself, the older network over which it runs). His idea was for a worldwide ‘hypertext’, the computer-aided reading of electronic documents, to allow people to work together remotely, sharing their knowledge in a ‘web’ of documents. His creation would give the internet’s hardware its global voice. An Oxford graduate who had made his first computer with a soldering iron, he moved in 1980 to CERN, the European physics laboratory in Switzerland and the world’s largest scientific research centre. Here he wrote his first programme in 1989, and a year later he proposed his hypertext revolution, which arrived at CERN in December 1990. The ‘web’ was born the following summer. He chose not to patent his creation so that it would be free to everyone.

The Election of 1992 – A Curious Confidence Trick?:


John Major called an election for April 1992. Under a pugnacious Chris Patten, now Party chairman, the Tories targeted Labour’s enthusiasm for high taxes. During the campaign itself, Major found himself returning to his roots in Brixton and mounting a ‘soap-box’, from which he addressed raucous crowds through a megaphone. John Simpson, the BBC correspondent, was given the task of covering Major’s own campaign, and on 15 March he travelled to Sawley, in the PM’s constituency of Huntingdon, where Major was due to Meet the People. I have written elsewhere about the details of this, and of his soap-box campaign, as reported by Simpson. Although Simpson described the soap-box as ‘a wooden construction of some kind’, Andrew Marr claims it was ‘a plastic container’. Either way, it has gone down in political history, together with the megaphone, as the prop that won Major the election. The stark visual contrast with the carefully stage-managed Labour campaign struck a chord with the media, and he kept up an act that his father would have been proud of, playing the underdog to Neil Kinnock’s government-in-waiting. Right at the end, at an eve-of-poll rally in Sheffield, Kinnock’s self-control finally gave way and he began punching the air and crying “y’awl right!” as if he were an American presidential candidate. It was totally ‘cringe-worthy’ TV viewing, alienating if not repulsing swathes of the very middle-England voters he needed to attract.

On 9 April 1992, Major’s Conservatives won fourteen million votes, more than any party in British political history. It was a great personal victory for the ‘new’ Prime Minister, but one which was also based on people’s fears of higher taxes under a Labour government. It was also one of the biggest victories in percentage terms since 1945, though the vagaries of the electoral system gave the Tories a majority of just twenty-one seats in parliament. Neil Kinnock was even more devastated than he had been in 1987, when he had not been expected to defeat Thatcher. The only organ of the entire British press which had called the election correctly was the Spectator. Its editor, Dominic Lawson, headlined the article which John Simpson wrote for him The Curious Confidence of Mr Major, so that the magazine seemed to suggest that the Conservatives might pull off a surprise win. Simpson himself admitted to not having the slightest idea who would win, though it seemed more likely to him that Labour would. Yet he felt that John Major’s own apparent certainty was worth mentioning. When the results started to become clear on that Friday morning, 10 April, the Spectator stood out favourably on the newsagents’ shelves, surrounded by the late (or early) editions of newspapers and magazines which had all been predicting a Labour victory.


The only politician possibly more disappointed than Neil Kinnock, who immediately left front-line politics, was Chris Patten, who had been the real magician behind Major’s remarkable victory. He lost his seat to the Liberal Democrats in marginal Bath and went off to become the final governor of Hong Kong ahead of the long-agreed handover of the colony in 1997. Kinnock, a former long-term opponent of Britain’s membership of the EEC/EC, went off to Brussels to become a European Commissioner. Despite the triumph in the popular vote, never has such a famous victory produced so rotten an outcome for the victors. The smallness of Major’s majority meant that his authority could easily be eaten away in the Commons. As a consequence, he would not go down as a great leader in parliamentary posterity, though he remained popular in the country as a whole for some time, if not with the Thatcherites and Eurosceptic “bastards” in his own party. Even Margaret Thatcher could not have carried through her revolutionary reforms after the 1979 and 1983 elections with the kind of parliamentary arithmetic which was dealt to her successor. In rugby terms, although the opposition’s three-quarters had been foiled by this artful dodger of a full-back, he had been dealt a ‘hospital pass’ by his own side. For the moment, he had control of the slippery ball, but he was soon to be forced back into a series of crushing rucks and mauls among his own twenty-stone forwards.

 John Smith – Labour’s lost leader and his legacy:

002 (2)

After Neil Kinnock gave up the Labour leadership following his second electoral defeat in 1992, he was replaced by John Smith (pictured above), a placid, secure, self-confident Scottish lawyer. As Shadow Chancellor, he had been an effective cross-examiner of Nigel Lawson, John Major and Norman Lamont and had he not died of a heart attack in 1994, three years ahead of the next election, most political pundits agreed that, following the tarnishing of the Major administration in the mid-nineties, he would have become Prime Minister at that election. Had he done so, Britain would have had a traditional social democratic government, much like those of continental Europe. He came from a family of herring fishermen on the West Coast of Scotland, the son of a headmaster. Labour-supporting from early youth, bright and self-assured, he got his real political education at Glasgow University, part of a generation of brilliant student debaters from all parties who would go on to dominate Scottish and UK politics including, in due succession, Donald Dewar, Gordon Brown, Alistair Darling and Douglas Alexander. Back in the early sixties, Glasgow University Labour Club was a hotbed not of radicals, but of Gaitskell-supporting moderates. This was a position that Smith never wavered from, as he rose as one of the brightest stars of the Scottish party, and then through government under Wilson and Callaghan as a junior minister dealing with the oil industry and devolution before entering cabinet as President of the Board of Trade, its youngest member at just forty. In opposition, John Smith managed to steer clear of the worst in-fighting, eventually becoming Kinnock’s shadow chancellor. In Thatcher’s England, however, he was spotted as a tax-raising corporatist of the old school. One xenophobic letter he received brusquely informed him:

You’ll not get my BT shares yet, you bald, owl-looking Scottish bastard. Go back to Scotland and let that other twit Kinnock go back to Wales.

Smith came from an old-fashioned Christian egalitarian background which put him naturally out of sympathy with the hedonistic culture of southern England.  Just before he became Labour leader he told a newspaper he believed above all in education, because…

 … it opens the doors of the imagination, breaks down class barriers and frees people. In our family … money was looked down on and education was revered. I am still slightly contemptuous of money.

Smith was never personally close to Kinnock but was scrupulously loyal to him as leader; he nevertheless succeeded him by a huge margin in 1992. By then he had already survived a serious cardiac arrest and had taken up hill-walking. Though Smith swiftly advanced the careers of his bright young lieutenants, Tony Blair and Gordon Brown, they soon became disappointed by his view that the Labour party needed simply to be improved, not radically transformed. In particular, he was reluctant to take on the party hierarchy and the unions over issues of internal democracy, such as the introduction of a one-member, one-vote system for future leadership elections. He was sure that Labour could regain power with a revival of its traditional spirit. At one point, Tony Blair was so dispirited by Smith’s leadership style that he considered leaving politics altogether and going back to practising law. Smith died of a second heart attack on 12 May 1994. After the initial shock and grief subsided, Labour moved rapidly away from his policy of ‘gradualism’ towards ‘Blairite’ transformation. One part of his legacy still remains, however, shaping modern Britain today. As the minister who had struggled to achieve devolution for Scotland in 1978-9, he remained a passionate supporter of the ‘unfinished business’ of re-establishing the Holyrood Parliament and setting up a Welsh Assembly. With his friend Donald Dewar he had committed Labour so utterly to the idea in Opposition, despite Kinnock’s original strong anti-nationalist stance, that Blair, no great fan of devolution himself, found that he had to implement Smith’s unwelcome bequest to him.

Black Wednesday and the Maastricht Treaty:

The crisis that soon engulfed the Major government back in the early autumn of 1992 was a complicated economic one. From August 1992 to July 1996 I was mainly resident in Hungary, and so, although an economic historian, never really understood the immediate series of events that led to it or the effects that followed. This was still in pre-internet days, so I had little access to English language sources, except via my short-wave radio and intermittent newspapers bought during brief visits to Budapest. I had also spent most of 1990 and half of 1991 in Hungary, so there were also longer-term gaps in my understanding of these matters. I have written about them in earlier articles in this series, dealing with the end of the Thatcher years. Hungary itself was still using an unconvertible currency throughout the nineties, which only became seriously devalued in 1994-96, and when my income from my UK employers also fell in value, as a family we decided to move back to Britain to seek full-time sterling salaries. Back in Britain, the first thing that happened was that the Major government lost its economic policy in a single day, when the pound fell out of the ERM (European Exchange Rate Mechanism). In his memoirs, John Major described the effect of this event in stark terms:

Black Wednesday – 16 September 1992, the day the pound toppled out of the ERM – was a political and economic calamity. It unleashed havoc in the Conservative Party and it changed the political landscape of Britain.

For Major and his government, the point was that as the German central bank had a deserved reputation for anti-inflationary rigour, having to follow or ‘shadow’ the mark meant that Britain had purchased a respected off-the-shelf policy. Sticking to the mighty mark was a useful signal to the rest of the world that this government, following all the inflationary booms of the seventies and eighties, was serious about controlling inflation. On the continent, however, the point of the ERM was entirely different, intended to lead to a strong new single currency that the countries of central Europe would want to join as members of an enlarged EC/EU. So a policy which Margaret Thatcher had earlier agreed to, in order to bring down British inflation, was now a policy she and her followers abhorred since it drew Britain ever closer towards a European superstate in the ‘Delors Plan’. This was a confused and conflicted state of affairs for most of the Tories, never mind British subjects at home and abroad.

The catalyst for sterling’s fall was the fall in the value of the dollar, which pulled the pound down with it. Worse still, the money flowed into the Deutschmark, which duly rose; so the British government raised interest rates to an eye-watering ten per cent in order to lift the pound. When this failed to work, the next obvious step would have been for the German central bank to cut its interest rates, lowering the value of the mark and keeping the ERM formation intact. This would have helped the Italian lira and other weak currencies as well as the pound. But since Germany had just reunited after the ‘fall of the wall’, the whole cost of bringing the poorer East Germans into line with their richer compatriots in the West led to a real fear of renewed inflation, as well as to memories of the Berlin Crisis of 1948-49 and the hyperinflation of the Weimar period. So the Germans, regardless of the pain being experienced by Britain, Italy and the rest, wanted to keep their high-value mark and their high interest rates. Major put considerable and concerted pressure on Chancellor Kohl, warning of the danger of the Maastricht treaty failing completely, since the Danes had just rejected it in a referendum and the French were also holding a plebiscite. None of this had any effect on Kohl who, like a previous German Chancellor, would not move.

002 (62)

In public, the British government stuck to the line that the pound would stay in the ERM at all costs. The ERM was not simply a European ‘joint-venture’ mechanism but had been part of the anti-inflation policy of both the Lawson and Major chancellorships. As Chancellor, the now Prime Minister had told the unemployed workers and repossessed homeowners of Britain that if it isn’t hurting, it isn’t working, so his credibility had been tied to the success of the ERM ever since. Membership had also underpinned his foreign policy, as Foreign Secretary and then as Prime Minister, of placing Britain ‘at the heart of Europe’. It was his big idea for both economic and diplomatic survival in an increasingly ‘globalised’ environment. Norman Lamont, who as Chancellor was as committed as Major, told ‘the markets’ that Britain would neither leave the mechanism nor deviate from it by devaluing the pound: ERM membership was at the centre of our policy and there should not be one scintilla of doubt that it would continue. Major went even further, telling a Scottish audience that with inflation down to 3.7 per cent and falling, it would be madness to leave the ERM. He added that:

“The soft option, the devaluer’s option, the inflationary option, would be a betrayal of our future.”

Then, however, the crisis deepened, with the lira crashing out of the ERM formation. International money traders, such as the Hungarian-born György Soros, began to turn their attention to the weak pound and carried on selling. They were betting that Major and Lamont would not keep interest rates so high that the pound could remain up there with the mark – an easy, one-way bet. In the real world, British interest rates were already painfully high. On the morning of ‘Black Wednesday’, at 11 a.m., the Bank of England raised them by another two points. This was agonising for home-owners and businesses alike, but Lamont said he would take whatever measures were necessary to keep the pound in the mechanism. Panic mounted and the selling continued: a shaken Lamont rushed round to tell Major that the interest rate hike had not worked, but Major and his key ministers decided to stay in the ERM. The Bank of England announced that interest rates would go up by a further three points, to fifteen per cent. Had it been sustained, this would have caused multiple bankruptcies across the country, but the third rise made no difference either. Eventually, at 4 p.m., Major phoned the Queen to tell her that he was recalling Parliament. At 7.30 p.m., Lamont left the Treasury to announce to the press and media in Whitehall that he was suspending sterling’s membership of the ERM and reversing the day’s rise in interest rates.

Major considered resigning. It was the most humiliating moment in British politics since the IMF crisis of 1976, sixteen years earlier. But if he had done so Lamont would have had to go as well, leaving the country without its two most senior ministers in the midst of a terrible crisis. Major decided to stay on, though he was forever diminished by what had happened. Lamont also stayed at his post and was delighted as the economy began to respond to lower interest rates, and a slow recovery began. While others suffered further unemployment, repossession and bankruptcy, he was forever spotting the ‘green shoots’ of recovery. In the following months, Lamont created a new unified budget system and took tough decisions to repair the public finances. But as the country wearied of recession, he became an increasingly easy ‘butt’ of media derision. To Lamont’s complete surprise, Major sacked him as Chancellor a little over six months after Black Wednesday. Lamont retaliated in a Commons statement in which he said: We give the impression of being in office, but not in power. Major appointed Kenneth Clarke, one of the great characters of modern Conservatism, to replace him.

In the Commons, the struggle to ratify the Maastricht Treaty, hailed as a great success for Major before the election, became a long and bloody one. Major’s small majority was more than wiped out by the number of ‘Maastricht rebels’, egged on by Lady Thatcher and Lord Tebbit. Black Wednesday had emboldened those who saw the ERM and every aspect of European federalism as disastrous for Britain. Major himself wrote in his memoirs that it turned …

… a quarter of a century of unease into a flat rejection of any wider involvement in Europe … emotional rivers burst their banks.

Most of the newspapers which had welcomed Maastricht were now just as vehemently against it. The most powerful Conservative voices in the media were hostile both to the treaty and to Major. His often leaden use of English and lack of ‘panache’ led many of England’s snobbish ‘High Tories’ to brand him shockingly ill-educated and third-rate as a national leader. A constantly shifting group of between forty and sixty Tory MPs regularly worked with the Labour opposition to defeat key parts of the Maastricht bill, so that Major’s day-to-day survival was always in doubt. Whenever he called a vote of confidence and threatened his rebellious MPs with an election, he won. Whenever John Smith’s Labour Party and the Tory rebels could find some common cause, however thin, he was in danger of losing. In the end, Major got his legislation and Britain ratified the Maastricht Treaty, but it came at an appalling personal and political cost. Talking in the general direction of an eavesdropping microphone, he spoke of three anti-European ‘bastards’ in his own cabinet, an obvious reference to Michael Portillo, Peter Lilley and John Redwood. The country watched a divided party tearing itself apart and was not impressed.

By the autumn of 1993, Norman Lamont was speaking openly about the possibility that Britain might have to leave the European Union altogether, and there were moves to force a national referendum. The next row was over the voting system to be used when the EU expanded. Forced to choose between a deal which weakened Britain’s hand and stopping the enlargement from happening at all by vetoing it, Foreign Secretary Douglas Hurd went for a compromise. All hell broke loose, as Tory MPs began talking of a leadership challenge to Major. This subsided, but battle broke out again over the European budget and fisheries policy. Eight MPs had the Tory whip withdrawn. By this point, John Smith’s sudden death had brought Tony Blair to the fore as leader of the Opposition. When Major readmitted the Tory rebels, Blair jibed: I lead my party, you follow yours. Unlike Lamont’s remark in the Commons, Blair’s comment struck a chord with the country.

The concluding chapter of the Thatcher Revolution:

While the central story of British politics in the seven years between the fall of Thatcher and the arrival in power of Blair was taken up by Europe, on the ‘home front’ the government tried to maintain the momentum of the Thatcher revolution. After many years of dithering, British Rail was divided up and privatised, as was the remaining coal industry. After the 1992 election, it was decided that over half the remaining coal mining jobs must go, in a closure programme of thirty-one pits to prepare the industry for privatisation. This angered many Tory MPs who felt that the strike-breaking effect of the Nottinghamshire-based Union of Democratic Mineworkers in the previous decade deserved a better reward, and it aroused public protest as far afield as Cheltenham. Nevertheless, with power companies moving towards gas and oil, and the industrial muscle of the miners long since broken, the closures and sales went ahead within the next two years, 1992-4. The economic effect on local communities was devastating, as the 1996 film Brassed Off shows vividly, with its memorable depiction of the social impact of the 1992 closure programme on the Yorkshire village of Grimethorpe and its famous brass band. Effectively, the only coalfields left working after this were those of North Warwickshire and South Derbyshire.

Interfering in the railway system became, and remained, a favourite ‘boys with toys’ hobby, and a dangerous obsession, of governments of different colours. Margaret Thatcher, not being a boy, knew that the railways were much too much a part of the working life of millions to be lightly broken up or sold off. When Nicholas Ridley, as Transport Secretary, had suggested this, Thatcher is said to have replied:

“Railway privatisation will be the Waterloo of this government. Please never mention the railways to me again.”

It was taken up again enthusiastically by John Major. British Rail had become a national joke: loss-making, accident-prone, with elderly tracks and rolling stock, and serving curled-up sandwiches. But the challenge of selling off a system on which millions of people depended was obvious. Making it profitable would require significant and unpopular fare rises and cuts in services. Moreover, different train companies could hardly compete with each other directly, racing up and down the same rails. There was, therefore, a binary choice: British Rail could be cut up geographically, with both trains and track sold off region by region, so that the system would look much the way it had in the thirties; or the railway could be split ‘vertically’, so that the State would continue to own the track while the stations and the trains would be owned by private companies. The latter solution was the one chosen by the government, and a vast, complicated new system of subsidies, contracts, bids, pricing, cross-ticketing and regulation was created; but rather than keeping the track under public control, it too was to be sold off, to a single private monopoly to be called Railtrack. Getting across the country would become a complicated proposition and transaction, involving two or three separate rail companies. A Franchise Director was to be given powers over the profits, timetables and ticket-pricing of the new companies, and a Rail Regulator would oversee the track. Both would report directly to the Secretary of State, so that any public dissatisfaction, commercial problem or safety issue would ultimately be the responsibility of the government. This was a strange and pointless form of privatisation which ended up costing the taxpayer far more than British Rail had. The journalist Simon Jenkins concluded:

The Treasury’s treatment of the railway in the 1990s was probably the worst instance of Whitehall industrial management since the Second World War.

006 (2)

005

One success story on the rail network was the completion of the Channel Tunnel link to France in 1994 (the Folkestone terminal is pictured above), providing a good example of the inter-relationship between transport links and general economic development. The Kent town of Ashford had a relationship with the railways going back to 1842, and the closure of the town’s railway works between 1981 and 1993 did not undermine the local economy. Instead, Ashford benefited from the Channel Tunnel rail link, which made use of railway lines running through the town, and its population actually grew by ten per cent in the 1990s. The completion of the ‘Chunnel’ gave the town an international catchment area of eighty-five million people within a single day’s journey. The opening of the Ashford International railway station, the main terminal for the rail link to Europe, attracted a range of engineering, financial, distribution and manufacturing companies. In addition to the fourteen business parks that were opened in and around the town itself, four greenfield sites were opened on the outskirts, including a science park owned by Trinity College, Cambridge. As the map above shows, Ashford is now closer to Paris and Brussels in travelling time than it is to Manchester and Liverpool. By the end of the century, the town, with its position at the hub of a huge motorway network as well as its international rail link, was ready to become part of a truly international economy.

006

Many of the improvements in transport infrastructure on both islands of Britain and Ireland were the result of EU funding, especially in Northern Ireland, and the EU was also having an impact on transport planning in Britain, with projects in the Highlands and Islands. In 1993 the EU decided to create a Europe-wide transport network. Of the fourteen priority projects associated with this aim, three were based in Britain and Ireland: a rail link from Cork to Northern Ireland and the ferry route to Scotland; a road link from the Low Countries across England and Wales to Ireland; and the West Coast rail route in Britain.

As a Brixton man, Major had experienced unemployment and was well prepared to take on the arrogant and inefficient quality of much so-called public service. But under the iron grip of the Treasury, there was little prospect of a revival of local democracy taking charge of local services again. This left highly bureaucratic centralism as the only option, one which had gained momentum in the Thatcher years. Under Major, the centralised Funding Agency for Schools was formed and schools in England and Wales were ranked by crude league tables, depending on how well their pupils did in exams. The university system was vastly expanded by simply allowing colleges and polytechnics to rename themselves as universities. The hospital system was further centralised and given a host of new targets. The police, faced with a review of their pay and demands by the Home Secretary, Kenneth Clarke, for their forces to be amalgamated, were given their own performance league tables. The Tories had spent seventy-four per cent more, in real terms, on law and order since 1979, yet crime was at an all-time high. Clarke’s contempt for many of the forces as ‘vested interests’ was not calculated to win them round to reform. Across England and Wales, elected councillors were turfed off police boards and replaced by businessmen. In 1993 Clarke, the old Tory dog who had clearly learned new tricks during his time at the Department of Health, where he was said to have castrated the regional health authority chairmen, defended his new police league tables in the ‘newspeak’ of governments yet to come:

The new accountability that we seek from our public services will not be achieved simply because men of good will and reasonableness wish that it be so. The new accountability is the new radicalism.

Across Britain, from the auditing of local government to the running of courts and the working hours of nurses, an army of civil servants, accountants, auditors and inspectors marched into workplaces. From time to time, ministers would weakly blame Brussels for the imposition of this cult of central control and measurement. But it was mostly a home-grown ‘superstate’. Major called this centralising policy the ‘Citizen’s Charter’, ignoring the fact that Britons are ‘subjects’ rather than citizens. He himself did not like the ‘headline’ very much because of its unconscious echoes of Revolutionary France. Every part of the government dealing with public service was ordered to come up with proposals for improvement at ‘grass-roots level’, to be pursued from the centre by questionnaires, league tables and a system of awards, called ‘Charter Marks’, for organisations that achieved the required standards. He spoke of ’empowering’, ‘helping the customer’ and ‘devolving’ and thought that regulation from the centre would not last long, rather like a Marxist-Leninist anticipating the ‘withering away’ of the state. In his case, though, this would come about as the effects of growing competition were felt. In practice, of course, the regulators grew more powerful, not less so. Despite the rhetoric, public servants were not being given real freedom to manage. Elected office-holders were being sacked. Major’s ‘withering away’ of the state was no more successful than Lenin’s.

Britain and Ireland – first steps on the road to peace:

009

Above: US President Bill Clinton addressing a peace rally in Belfast during his visit in 1995. Clinton played a significant role as a ‘peace broker’ in negotiations leading up to ‘the Good Friday Agreement’.

In December 1993, John Major stood outside the steel-armoured door of Number Ten Downing Street with the ‘Taoiseach’ of the Irish Republic, Albert Reynolds. He declared a new principle which offended many traditional Conservatives and Unionists. If both parts of Ireland voted to be reunited, Britain would not stand in the way. She had, said Major, no selfish strategic or economic interest in Northern Ireland. He also stated that if the Provisional IRA, which had lately bombed the very building Major was standing in front of and murdered two young boys in Cheshire, renounced violence, Sinn Fein could be recognised as a legitimate political party. In the run-up to this Downing Street Declaration, which some saw as a betrayal of the Tory Party’s long-held dedication to the Union of Great Britain and Northern Ireland, the government had been conducting ‘back channel’ negotiations with the terrorist organisation. In August 1994 the IRA finally declared a complete cessation of military operations which, though it stopped a long way short of renouncing the use of violence altogether, was widely welcomed and was followed a month later by a Loyalist ceasefire. A complicated choreography of three-strand talks, framework documents and discussions about the decommissioning of weapons followed, while on the streets, extortion, knee-capping and occasional ‘executions’ continued. But whereas the number of those killed in sectarian violence and bombings in 1993 had been eighty-four, the toll fell to sixty-one the following year, and in 1995 it was in single figures, at just nine deaths.

Long negotiations between London and Dublin led to cross-border arrangements. These negotiations had also involved the United States, where an influential pro-Irish lobby had helped to sustain the IRA campaign into the nineties through finance provided via ‘Noraid’. In the mid-nineties, President Clinton acted as a peace-broker, visiting Belfast in 1995 and helping to maintain the fragile cease-fire in the North. The contradictory demands of Irish Republicanism and Ulster Unionism meant that Major failed to get a final agreement, which was left to Tony Blair, with the ongoing help of the American ex-senator George Mitchell. The fact that in 1992 both countries had signed the Maastricht Treaty for closer political and economic unity in Europe set a broader context for a bilateral agreement. However, while Irish political leaders eagerly embraced the idea of European integration, their British counterparts, as we have seen, remained deeply divided over it.

Economic decline/ growth & political resuscitation:

008

The closure of the Swan Hunter shipyard on the Tyne in May 1993 is an illuminating example of the impact of de-industrialisation. Swan Hunter was the last working shipyard in the region but had failed to secure a warship contract. An old, established firm, it was suffering from the same long-term decline that decimated shipbuilding employment nationally, to 26,000 by the end of the century. The closure devastated the local economy, especially as a bitter legal wrangle over redundancy payments left many former workers with no compensation whatever for the loss of what they had believed was employment for life. But the effects of de-industrialisation could spread much further than local communities. As the map above shows, the failure of the firm also had a ‘knock-on’ effect, as suppliers as far afield as London and Glasgow lost valuable orders and, as a result, jobs.

004

By 1994, employment in manufacturing in Britain had fallen to four million from the nine million it had reached at its peak in 1966. The resulting mass unemployment hurt the older industries of the Northwest worst, but the losses were proportionately as high in the Southeast, reflecting the decline in newer manufacturing industry. Across most of Britain and Ireland, there was also a decline in the number of manufacturing jobs continuing into and throughout the 1990s. The service sector, however, expanded, and general levels of unemployment, especially in Britain, fell dramatically in the nineties. Financial services showed strong growth, particularly in such places as London’s Docklands, with its new ‘light railway’, and Edinburgh. By the late nineties, the financial industry was the largest employer in northern manufacturing towns and cities like Leeds, which grew rapidly throughout the decade, aided by its ability to offer a range of cultural facilities that helped to attract an array of UK company headquarters. Manchester, similarly, enjoyed a renaissance, particularly in the spheres of music, the media and sport.

In July 1995, tormented by yet more rumours of right-wing conspiracies against him, Major riposted with a theatrical gesture of his own, resigning as leader of the Conservative Party and inviting all-comers to take him on. He told journalists gathered in the Number Ten garden that it was “put up or shut up time”. If he lost he would resign as Prime Minister. If he won, he would expect the party to rally around him. This was a gamble, since other potential leaders were available, not least Michael Heseltine, who had become Deputy Prime Minister, and Michael Portillo, then the pin-up boy of the Thatcherites, whose supporters prepared a campaign headquarters for him, only for him to then decide against standing. In the event, the challenger was John Redwood, the Secretary of State for Wales and a highly intelligent anti-EU right-winger. Major won his fight, though 109 Tory MPs refused to back him.

Fighting the return of Fascism in Europe:

Major was also having to deal with the inter-ethnic wars breaking out in the former Yugoslavia, following the recognition of Slovenia, Croatia and Bosnia as independent states in the early nineties. The worst violence occurred during the Serbian assault on Bosnia (I have written about the bloody 1992-94 Siege of Sarajevo, its capital, in an article elsewhere on this site based on John Simpson’s reporting). The term ‘ethnic cleansing’ was used for the first time as woeful columns of refugees fled in different directions. A nightmare which Europeans thought was over in 1945 was returning, only a couple of days’ drive away from London and half a day’s drive from where I was living on the southern borders of Hungary with Serbia and Croatia.

Six years after the siege, during a school visit to The Hague, I sat in the courtroom of the International War Crimes Tribunal for the former Yugoslavia and listened, in horror, to the testimonies of those who had been imprisoned and tortured in concentration camps during the Bosnian War. I couldn’t believe that what I was hearing had happened in the final decade of the twentieth century in Europe. Those on trial at that time were the prison camp guards who had carried out the atrocities, claiming what had become known as the Nuremberg Defence. Later on, those who had given the orders, Ratko Mladic and Radovan Karadzic (pictured below with John Simpson in 1993), the military and political leaders of the Bosnian Serbs, went on trial in the same courtroom, were convicted of war crimes and duly locked away; the former Serbian President, Slobodan Milosevic, was also put on trial there, but died in custody before a verdict was reached. Major had asked how many troops it would take to keep the three warring sides apart and was told the number was four hundred thousand, three times the total size of the British Army at the time. He sent 1,800 men to protect the humanitarian convoys that were rumbling south from the UN bases in Hungary.

001

Although many British people sent food parcels, warm clothes, medicine and blankets, loaded onto trucks and driven across the Croatian border and into Bosnia, many in the government were reluctant for Britain to become further involved. But the evening news bulletins showed pictures of starving refugees, the uncovered mass graves of civilians shot dead by death squads, and children with appalling injuries. There was a frenzied campaign for Western intervention, but President Clinton was determined not to risk the lives of American soldiers on the ground. Instead, he considered less costly alternatives, such as air strikes. This would have put others who were on the ground, including the British and other nationalities involved in the UN operation, directly into the line of retaliatory fire of the Serbian troops. When the NATO air-strikes began, the Serbs took the UN troops hostage, including British soldiers, who were then used as human shields. When the Serbs captured the town of Srebrenica and carried out a mass slaughter of its Muslim citizens, there were renewed calls for ‘boots on the ground’, but they never came.

Following three years of fighting, sanctions on Serbia and the success of the Croat Army in fighting back, a peace agreement was finally made in Dayton, Ohio. The UN convoys and troops left Hungary. Major became the first British Prime Minister of the post-war world to grapple with the question of what the proper role of the West should be in ‘regional’ conflicts such as the Balkan wars. They showed quite clearly both the dangers and the limitations of intervention. When a civil conflict is relayed in all its horror to tens of millions of voters every night by television, the pressure to ‘do something’ is intense. But mostly this requires not air strikes but a full-scale ground force, which will then be drawn into the war itself, and which must then be followed by years of neo-colonial aid and rebuilding. Major and his colleagues were accused of moral cowardice and cynicism in allowing the revival of fascist behaviour in one corner of Europe. Yet, especially with the benefit of hindsight about what happened subsequently in Iraq and Afghanistan, perhaps Western leaders were right to be wary of full-scale intervention.

Back to basics?

For many British voters, the Major years were associated with the sad, petty and lurid personal scandals that attended so many of his ministers, after he made an unwise speech calling for the return of old-style morality. In fact, ‘back to basics’ referred to almost everything except personal sexual morality: he spoke of public service, industry, sound money, free trade, traditional teaching, respect for the family and the law, and the defeat of crime. It gave the press, however, a fail-safe headline charge of hypocrisy whenever ministers were caught out. A series of infidelities were exposed: children born out of wedlock, a death from a sex stunt which went wrong, and rumours about Major’s own affairs (which later turned out to be truer than was realised at the time). More seriously, there was also an inquiry into whether Parliament had been misled over the sale of arms to Iraq, but these were all knitted together into a single pattern of misbehaviour, referred to as ‘sleaze’.

In 1996, a three-year inquiry into whether the government had allowed a trial to go ahead against directors of an arms company, Matrix Churchill, knowing that they were, in fact, acting inside privately accepted guidelines, resulted in two ministers being publicly criticised. It showed that the government had allowed a more relaxed régime of military-related exports to Saddam Hussein even after the horrific gassing of five thousand Kurds at Halabja, also revealing a culture of secrecy and double standards in the process. Neil Hamilton MP was accused of accepting cash from Mohammed al-Fayed, the owner of Harrods, for asking questions in the Commons. One of the most dramatic episodes of the 1997 election was the overwhelming defeat he suffered in his Tatton constituency at the hands of the former BBC war reporter Martin Bell, who had been badly injured in Sarajevo and who became Britain’s first independent MP for nearly fifty years. Jonathan Aitken, a Treasury minister, was accused of accepting improper hospitality from an Arab business contact. He resigned to fight the Guardian over the claims, armed, as he put it, with ‘the simple sword of truth and the trusty shield of fair play’. He was found guilty of perjury and sentenced to eighteen months in prison.

002

By the end of Major’s government, it seemed that the Tories might have learned the lesson that disagreements over the EU were capable of splitting their party. However, there was a general mood of contempt for politicians, and the press, in particular, had lost any sense of deference. The reforms of the health service, police and schools had produced few significant improvements. The post-Cold War world was turning out to be nastier and less predictable than the early-nineties days of the ‘peace dividend’ had promised. The Labour Opposition would, in due course, consider how the country might be better governed and reformed, as well as what the right British approach to peace-keeping and intervention should be now that the United States was the last superpower left standing. But in the early months of 1997, Tony Blair and his fresh young ‘New Labour’ team, including Alastair Campbell (pictured above), were oiling their effective election-winning machine and moving in to roll over a tired-looking John Major and his tarnished old Tories.

Sources:

Andrew Marr (2008), A History of Modern Britain. Basingstoke: Pan-Macmillan.

Simon Schama (2018), A History of Britain, 1776-2000; The Fate of Empire. London: BBC Worldwide.

John Simpson (1999), Strange Places, Questionable People. Basingstoke: Pan-Macmillan.

Peter Catterall, Roger Middleton & John Swift (2001), The Penguin Atlas of British & Irish History. London: Penguin Books.

Posted October 17, 2018 by TeamBritanniaHu.

