
Cold Shoulder or Warm Handshake?
On 29 March 2019, the United Kingdom of Great Britain and Northern Ireland will leave the European Union after forty-six years of membership, having joined the European Economic Community on 1 January 1973, on the same day and at the same hour as the Republic of Ireland. Yet in 1999, it looked as if the long-standing debate over Britain’s membership had been resolved. The Maastricht Treaty establishing the European Union had been signed by all the member states of the preceding European Community in February 1992 and was succeeded by a further treaty, signed in Amsterdam in 1997, which came into force in 1999. What, then, has happened in the space of twenty years to so fundamentally change the ‘settled’ view of the British Parliament and people, bearing in mind that both Scotland and Northern Ireland voted to remain in the EU, while England and Wales both voted to leave? At the time of writing, the manner of our going has not yet been determined, but the invocation of ‘article fifty’ by the UK government, with the approval of the Westminster Parliament, means that the date has been set. So either we will have to leave without a deal, turning a cold shoulder to our erstwhile friends and allies on the continent, or we will finally ratify the deal agreed between the UK government and the EU Commission, acting on behalf of the twenty-seven remaining member states, and leave with a warm handshake and most of our trading and cultural relations intact.

As yet, the possibility of a second referendum – or third, if we take into account the 1975 referendum called by Harold Wilson (above), which was also a binary leave/remain decision – seems remote. In any event, it is quite likely that the result would be the same and would kill off any opportunity of the UK returning to EU membership for at least another generation. As Ian Fleming’s James Bond tells us, ‘you only live twice’. That certainly seems to be the mood in Brussels too. I was too young to vote in 1975 by just five days, and another membership referendum would be unlikely to occur in my lifetime. So much has been said about following ‘the will of the people’, or at least 52% of them, that it would be a foolish government, in an age of rampant populism, that chose to revoke article fifty, even if Westminster voted for this. At the same time, and in that same populist age, we know from recent experience that in politics and international relations, nothing is inevitable…
![referendum-ballot-box[1]](https://chandlerozconsultants.files.wordpress.com/2016/06/referendum-ballot-box11.jpg?w=328&h=185)
One of the major factors in the 2016 Referendum Campaign was the country’s public spending priorities, compared with those of the European Union. The ‘Leave’ campaign sent a double-decker bus around England stating that by ending the UK’s payments into the EU, more than 350 million pounds per week could be redirected to the National Health Service (NHS).
A British Icon Revived – The NHS under New Labour:
To understand the power of this statement, it is important to recognise that the NHS is unusual in Europe in that it is funded overwhelmingly from general taxation, rather than through the social insurance schemes found in many other European countries. As a service created in 1948 to be ‘free at the point of delivery’, it is seen as a ‘British icon’ and funding has been a central issue in national election campaigns since 2001, when Tony Blair was confronted by an irate voter, Sharon Storer, outside a hospital. In its first election manifesto of 1997, ‘New Labour’ promised to ‘safeguard the basic principles of the NHS, which we founded’. The ‘we’ here was the post-war Labour government, whose socialist Health Minister, Aneurin Bevan, had established the service in the teeth of considerable opposition from within both parliament and the medical profession. ‘New Labour’ protested that under the Tories there had been fifty thousand fewer nurses but a rise of no fewer than twenty thousand managers – red tape which Labour would pull away and burn. Though critical of the internal markets the Tories had introduced, Blair promised to retain the split between those who commissioned health services and those who provided them.

Under Frank Dobson, Labour’s new Health Secretary, there was little reform of the NHS but there was, year by year, just enough extra money to stave off the winter crises. But then a series of tragic individual cases hit the headlines, and one of the fiercest criticisms came from a Labour peer and well-known medical scientist and fertility expert, Professor Robert Winston, who was greatly admired by Tony Blair. He launched a furious denunciation of the government over the treatment of his elderly mother. Far from upholding the NHS’s iconic status, Winston said that Britain’s health service was the worst in Europe and was getting worse under the New Labour government, which was being deceitful about the true picture. Labour’s polling on the issue showed that Winston’s assessment, in general terms, matched the view of the country as a whole. In January 2000, therefore, Blair announced directly to the country that he would bring Britain’s health spending up to the European average within five years. That was a huge promise because it meant spending a third as much again in real terms, and his ‘prudent’ Chancellor of the Exchequer, Gordon Brown, was unhappy that Blair had not spoken enough on television about the need for health service reform to accompany the money, and had also ‘stolen’ his budget announcements. On Budget day itself, Brown announced that until 2004 health spending would rise at above six per cent beyond inflation every year, …
… by far the largest sustained increase in NHS funding in any period in its fifty-year history … half as much again for health care for every family in this country.
The tilt away from Brown’s sharp spending controls during the first three years of the New Labour government had begun by the first spring of the new millennium, and there was more to come. With a general election looming in 2001, Brown also announced a review of the NHS and its future by a former banker, Derek Wanless. As soon as the election was over, broad hints about necessary tax rises were dropped. When the Wanless Report was finally published, it confirmed much that the winter crisis of 1999-2000 had exposed. The NHS was not, whatever Britons fondly believed, better than health systems in other developed countries, and it needed a lot more money. ‘Wanless’ also rejected a radical change in funding, such as a switch to insurance-based or semi-private health care. Brown immediately used this as objective proof that taxes had to rise in order to save the NHS. In his next budget, of 2002, Brown broke with a political convention that had reigned since the mid-eighties: that direct taxes would not be raised again. He introduced a special one per cent national insurance levy, equivalent to a penny on income tax, to fund the huge reinvestment in Britain’s health.
Public spending shot up with this commitment and, in some ways, it paid off, since by 2006 there were around 300,000 extra NHS staff compared to 1997. That included more than ten thousand extra senior hospital doctors (about a quarter more) and 85,000 more nurses. But there were also nearly forty thousand managers, twice as many as Blair and Brown had ridiculed the Tory government for hiring. An ambitious computer project for the whole NHS became an expensive catastrophe. Meanwhile, the health service budget rose from thirty-seven billion to more than ninety-two billion a year. But the investment produced results, with waiting lists, a source of great public anger from the mid-nineties, falling by 200,000. By 2005, Blair was able to talk of the best waiting list figures since 1988. Hardly anyone was left waiting for an inpatient appointment for more than six months. Death rates from cancer for people under the age of seventy-five fell by 15.7 per cent between 1996 and 2006 and death rates from heart disease fell by just under thirty-six per cent. Meanwhile, the private finance initiative meant that new hospitals were being built around the country. But, unfortunately for New Labour, that was not the whole story of the Health Service under their stewardship. As Andrew Marr has attested,
…’Czars’, quangos, agencies, commissions, access teams and planners hunched over the NHS as Whitehall, having promised to devolve power, now imposed a new round of mind-dazing control.
By the autumn of 2004 hospitals were subject to more than a hundred inspections. War broke out between Brown’s Treasury and the ‘Blairite’ Health Secretary, Alan Milburn, over the basic principles of running the hospitals. Milburn wanted more competition between them, but Brown didn’t see how this was possible when most people had only one major local hospital. Polling suggested that Brown was making a popular point: most people simply wanted better hospitals, not more choice. A truce was eventually declared with the establishment of a small number of independent, ‘foundation’ hospitals. By the 2005 general election, Michael Howard’s Conservatives were attacking Labour for wasting money and allowing people’s lives to be put at risk in dirty, badly run hospitals. Just as Labour once had, they were promising to cut bureaucracy and the number of organisations within the NHS. By the summer of 2006, despite the huge injection of funds, the Service was facing a cash crisis. Although the shortfall was not huge as a percentage of the total budget, trusts in some of the most vulnerable parts of the country were on the edge of bankruptcy, from Hartlepool to Cornwall and across to London. Throughout Britain, seven thousand jobs had gone and the Royal College of Nursing, the professional association to which most nurses belonged, was predicting thirteen thousand more would go soon. Many newly and expensively qualified doctors and even specialist consultants could not find work. It seemed that wage costs, expensive new drugs, poor management and the money poured into endless bureaucratic reforms had resulted in a still inadequate service. Bupa, the leading private operator, had been covering some 2.3 million people in 1999. Six years later, the figure was more than eight million. This partly reflected greater affluence, but it was also hardly a resounding vote of confidence in Labour’s management of the NHS.
Public Spending, Declining Regions & Economic Development:
As public spending had begun to flow during the second Blair administration, vast amounts of money had gone in pay rises, new bureaucracies and on bills for outside consultants. Ministries had been unused to spending again, after the initial period of ‘prudence’, and did not always do it well. Brown and his Treasury team resorted to double and triple counting of early spending increases in order to give the impression they were doing more for hospitals, schools and transport than they actually could. As Marr has pointed out, …
… In trying to achieve better policing, more effective planning, healthier school food, prettier town centres and a hundred other hopes, the centre of government ordered and cajoled, hassled and harangued, always high-minded, always speaking for ‘the people’.
The railways, after yet another disaster, were shaken up again. In very controversial circumstances Railtrack, the once-profitable monopoly company operating the lines, was driven to bankruptcy and a new system of Whitehall control was imposed. At one point, Tony Blair boasted of having five hundred targets for the public sector. Parish councils, small businesses and charities found that they were loaded with directives. Schools and hospitals had many more. Marr has commented, …
The interference was always well-meant but it clogged up the arteries of free decision-taking and frustrated responsible public life.

Throughout the New Labour years, with steady growth and low inflation, most of the country grew richer. Growth since 1997, at 2.8 per cent per year, was above the post-war average, GDP per head was above that of France and Germany and the country had the second lowest jobless figures in the EU. The number of people in work increased by 2.4 million. Incomes grew, in real terms, by about a fifth. Pensions were in trouble, but house price inflation soared, so that owners found their properties more than doubling in value and came to think of themselves as prosperous. By 2006 analysts were assessing the disposable wealth of the British at forty thousand pounds per household. However, the wealth was not spread evenly across the country, averaging sixty-eight thousand pounds in the south-east of England but a little over thirty thousand in Wales and north-east England (see map above). Even in the historically poorer parts of the UK, though, house prices had risen so fast that government plans to bulldoze worthless northern terraces had to be abandoned when they started to regain value. Cheap mortgages, easy borrowing and high property prices meant that millions of people felt far better off, despite the overall rise in the tax burden. Cheap air travel gave the British opportunities for easy travel both to traditional resorts and to every part of the European continent. British expatriates were able to buy properties across the French countryside and in southern Spain. Some even began to commute weekly to jobs in London or Manchester from Mediterranean villas, and regional airports boomed as a result.
The internet, or more precisely the ‘World Wide Web’ that runs on top of it, ‘invented’ by the British computer scientist Tim Berners-Lee at the end of 1989 (pictured right in 2014), was advancing from the colleges and institutions into everyday life by the mid-‘noughties’. It first began to attract popular interest in the mid-nineties: Britain’s first internet café and magazine, reviewing a few hundred early websites, were both launched in 1994. The following year saw the beginning of internet shopping as a major pastime, with both ‘eBay’ and ‘Amazon’ arriving, though to begin with they attracted only tiny numbers of people.
But the introduction of new forms of mail-order and ‘click and collect’ shopping quickly attracted significant adherents from different ‘demographics’. The growth of the internet led to a feeling of optimism, despite warnings, taken seriously at the time, that the whole digital world would collapse because computers which stored years as just two digits would be unable to cope with the rollover to the year 2000. In fact, the ‘dot-com’ bubble was burst by its own excessive expansion, as with any bubble, and following a pause and a lot of ruined dreams, the ‘new economy’ roared on again. By 2000, according to the Office for National Statistics (ONS), around forty per cent of Britons had accessed the internet at some time. Three years later, nearly half of British homes were ‘online’. By 2004, the spread of ‘broadband’ connections had brought a new mass market in ‘downloading’ music and video. By 2006, three-quarters of British children had internet access at home.

Simultaneously, the rich of America, Europe and Russia began buying up parts of London, and then other ‘attractive’ parts of the country, including Edinburgh, the Scottish Highlands, Yorkshire and Cornwall. ‘Executive housing’, with pebbled driveways, brick facing and dormer windows, was spreading across farmland and along rivers with no thought of flood-plain constraints. Parts of the country far from London, such as the English south-west and Yorkshire, enjoyed a ripple of wealth that pushed their house prices to unheard-of levels. From Leith to Gateshead, Belfast to Cardiff Bay, once-derelict shorefront areas were transformed. The nineteenth-century buildings of the Albert Dock in Liverpool (above) now house a maritime museum, an art gallery, a shopping centre and a television studio, and the dock has become a tourist attraction in its own right. For all the problems and disappointments, and the longer-term problems with their financing, new schools and public buildings sprang up – new museums, galleries, vast shopping complexes (see below), corporate headquarters in a biomorphic architecture of glass and steel, more imaginative and better-looking than their predecessors from the dreary age of concrete.

Supermarket chains exercised huge market power, bringing cheap meat and dairy products within almost everyone’s budget. Factory-made ready-meals were imported via the new global air-freight market and carried by refrigerated trucks and lorries moving freely across a Europe shorn of internal barriers. Out-of-season fruit and vegetables, fish from the Pacific, exotic foods of all kinds and freshly cut flowers appeared in superstores everywhere. Hardly anyone was out of reach of a ‘Tesco’, a ‘Morrison’s’, a ‘Sainsbury’s’ or an ‘Asda’. By the mid-noughties, the four supermarket giants owned more than 1,500 superstores throughout the UK. They spread the consumption of goods that in the eighties and nineties had seemed like luxuries. Students had to take out loans in order to go to university, but were far more likely to do so than previous generations, as well as to travel more widely on a ‘gap’ year, not just to study or work abroad.
Those ‘Left Behind’ – Poverty, Pensions & Public Order:
Materially, for the majority of people, this was, to use Marr’s term, a ‘golden age’, which perhaps helps to explain why real anger about earlier pension decisions and stealth taxes did not translate into anti-Labour voting in successive general elections. The irony is that in pleasing ‘Middle Englanders’, the Blair-Brown government lost contact with traditional Labour voters, especially in the North of Britain, who did not benefit from these ‘golden years’ to the same extent. Gordon Brown, from the first, made much of New Labour’s anti-poverty agenda, and especially child poverty. Since the launch of the Child Poverty Action Group, this latter problem had become particularly emotive. Labour policies took a million children out of relative poverty between 1997 and 2004, though the numbers rose again later. Brown’s emphasis was on the working poor and the virtue of work. So his major innovations were the national minimum wage, the ‘New Deal’ for the young unemployed, and the working families’ tax credit, as well as tax credits aimed at children. There was also a minimum income guarantee and, later, a pension credit for poorer pensioners.
The minimum wage was first set at three pounds sixty an hour, rising year by year; by 2006 it was five pounds thirty-five. Because the figures were low, it did not destroy two million jobs, as the Tories had claimed it would. Neither did it produce higher inflation; employment continued to grow while inflation remained low. It even seemed to have cut red tape. By the mid-noughties, the minimum wage covered two million people, the majority of them women. Because it was updated ahead of rises in inflation, the wages of the poor also rose faster. It was so successful that even the Tories were forced to embrace it ahead of the 2005 election. The New Deal was funded by a windfall tax on privatised utility companies, and by 2000 Blair said it had helped a quarter of a million young people back into work; it was still being claimed as a major factor in lower rates of unemployment as late as 2005. But the National Audit Office, looking back on its effect in the first parliament, reckoned the number of under-twenty-five-year-olds helped into real jobs was as low as 25,000, at a cost per person of eight thousand pounds. A second initiative was targeted at the babies and toddlers of the most deprived families. ‘Sure Start’ was meant to bring mothers together in family centres across Britain – 3,500 were planned for 2010, ten years after the scheme had been launched – and to help them to become more effective parents. However, some of the most deprived families failed to show up. As Andrew Marr wrote, back in 2007:
Poverty is hard to define, easy to smell. In a country like Britain, it is mostly relative. Though there are a few thousand people living rough or who genuinely do not have enough to keep them decently alive, and many more pensioners frightened of how they will pay for heating, the greater number of poor are those left behind the general material improvement in life. This is measured by income compared to the average and by this yardstick in 1997 there were three to four million children living in households of relative poverty, triple the number in 1979. This does not mean they were physically worse off than the children of the late seventies, since the country generally became much richer. But human happiness relates to how we see ourselves relative to those around us, so it was certainly real.
The Tories, now under new management in the shape of a media-marketing executive and old Etonian, David Cameron, also declared that they believed in this concept of relative poverty. After all, it was on their watch, during the Thatcher and Major governments, that it had tripled; perhaps that is why it was only towards the end of the New Labour years that they could accept the definition used by the left-of-centre Guardian columnist, Polly Toynbee. A world of ‘black economy’ work also remained below the minimum wage, in private care homes, where migrant servants were exploited, and in other nooks and crannies. Some 336,000 jobs remained on ‘poverty pay’ rates. Yet ‘redistribution of wealth’, a socialist phrase which had become unfashionable under New Labour lest it scare away middle Englanders, was stronger in Brown’s Britain than in other major industrialised nations. Despite the growth of the super-rich, many of whom were immigrants anyway, overall equality increased in these years. One factor in this was the return to the means-testing of benefits, particularly for pensioners and through the working families’ tax credit, subsequently divided into a child tax credit and a working tax credit. This was a U-turn by Gordon Brown, who had opposed means-testing when in Opposition. As Chancellor, he concluded that if he was to direct scarce resources at those in real poverty, he had little choice.
Apart from the demoralising effect it had on pensioners, the other drawback to means-testing was that a huge bureaucracy was needed to track people’s earnings and to try to establish exactly what they should be getting in benefits. Billions were overpaid, and as people did better and earned more from more stable employment, they then found themselves facing huge demands to hand back money they had already spent. Thousands of extra civil servants were needed to deal with the subsequent complaints, and the scheme became extremely expensive to administer. There were also controversial drives to oblige more disabled people back to work, and the ‘socially excluded’ were confronted by a range of initiatives designed to make them more middle class. Compared with Mrs Thatcher’s ‘Victorian Values’ and Mr Major’s ‘Back to Basics’ campaigns, Labour was supposed to be non-judgemental about individual behaviour. But a form of moralism did begin to reassert itself. Parenting classes were sometimes mandated through the courts and, for the minority who made life hell for their neighbours on housing estates, Labour introduced the Anti-Social Behaviour Order (‘Asbo’). These were first issued in 1998, granted by magistrates on application from either the police or the local council. It became a criminal offence to break the curfew or any other condition of an order, which could be highly specific. Asbos could be given out for swearing at others in the street, harassing passers-by, vandalism, making too much noise, graffiti, organising ‘raves’, flyposting, taking drugs, sniffing glue, joyriding, prostitution, hitting people and drinking in public.

Although they served a useful purpose in many cases, there were fears that for the really rough elements in society and their tough children they became a badge of honour. Since breaking an Asbo could result in an automatic prison sentence, people were sent to jail for offences that had not previously warranted imprisonment. But as their use was refined and strengthened, they became more effective and routine. By 2007, seven and a half thousand had been given out in England and Wales alone, and Scotland had introduced its own version in 2004. Some civil liberties campaigners saw this development as part of a wider authoritarian and surveillance agenda which also led to the widespread use of CCTV (closed-circuit television) cameras by the police and private security guards, especially in town centres (see above). Also in 2007, it was estimated that the British were being observed and recorded by 4.2 million such cameras. That amounted to one camera for every fourteen people, a higher ratio than in any other country in the world, with the possible exception of China. In addition, the number of mobile phones was already equivalent to the number of people in Britain. With Global Positioning System (GPS) chips, these could show exactly where their users were, and the use of such systems in cars and even out on the moors meant that Britons were losing their age-old prowess for map-reading.


The ‘Seven Seven’ Bombings – The Home-grown ‘Jihadis’:
Despite these increasing means of mass surveillance, Britain’s cities have remained vulnerable to terrorist attacks, more recently by so-called ‘Islamic terrorists’ rather than by the Provisional IRA, who abandoned their bombing campaign in 1998. On 7 July 2005, at rush-hour, four young Muslim men from West Yorkshire and Buckinghamshire murdered fifty-two people and injured 770 others by blowing themselves up on London Underground trains and on a London bus. The report into this worst such attack in Britain later concluded that they were not part of an al Qaeda cell, though two of them had visited camps in Pakistan, and that the rucksack bombs had been constructed at a cost of a few hundred pounds. Despite the government’s insistence that the war in Iraq had not made Britain more of a target for terrorism, the Home Office investigation asserted that the four had been motivated, in part at least, by ‘British foreign policy’.
They had picked up the information they needed for the attack from the internet. It was a particularly grotesque attack, because of the terrifying and bloody conditions in the underground tunnels, and it vividly reminded the country that it was as much a target as the United States or Spain. Indeed, the long-standing and intimate relationship between Great Britain and Pakistan, with constant and heavy air traffic between them, provoked fears that the British would prove uniquely vulnerable. Tony Blair heard of the attack at the most poignant time, just following London’s great success in winning the bid to host the 2012 Olympic Games (see above). The ‘Seven Seven’ bombings are unlikely to have been stopped by CCTV surveillance, of which there was plenty at the tube stations, nor by ID cards (which had recently been under discussion), since the killers were British subjects, nor by financial surveillance, since little money was involved and the materials were paid for in cash. Even better intelligence might have helped, but the Security Services, ‘MI5’ and ‘MI6’ as they are known, were already in receipt of huge increases in their budgets as they tracked down other murderous cells. In August 2006, police arrested suspects in Birmingham, High Wycombe and Walthamstow, in east London, believing there was a plot to blow up as many as ten passenger aircraft over the Atlantic.
After many years of allowing dissident clerics and activists from the Middle East asylum in London, Britain had more than its share of inflammatory and dangerous extremists, who admired al Qaeda and preached violent jihad. Once 11 September 2001 had changed the climate, new laws were introduced to allow the detention without trial of foreigners suspected of being involved in supporting or fomenting terrorism. They could not be deported because human rights legislation forbade sending back anyone to countries where they might face torture. Seventeen were picked up and held at Belmarsh high-security prison. But in December 2004, the House of Lords ruled that these detentions were discriminatory and disproportionate, and therefore illegal. Five weeks later, the Home Secretary Charles Clarke hit back with ‘control orders’ to limit the movement of men he could not prosecute or deport. These orders would also be used against home-grown terror suspects. A month later, in February 2005, sixty Labour MPs rebelled against these powers too, and the government only narrowly survived the vote. In April 2006 a judge ruled that the control orders were an affront to justice because they gave the Home Secretary, a politician, too much power. Two months later, the same judge ruled that curfew orders of eighteen hours per day on six Iraqis were a deprivation of liberty and also illegal. The new Home Secretary, John Reid, lost his appeal and had to loosen the orders.

Britain found itself in a struggle between its old laws and liberties and a new, borderless world in which the hallowed principles of ‘habeas corpus’, free speech, the presumption of innocence, asylum, the right of British subjects to travel freely in their own country without identifying papers, and the sanctity of the law-abiding home were all placed in increasing jeopardy. The new political powers seemed to government ministers the least they needed to deal with a threat that might last for another thirty years, in order, paradoxically, to secure Britain’s liberties in the long term beyond it. They were sure that most British people agreed, and that the judiciary, media, civil rights campaigners and elected politicians who protested were an ultra-liberal minority. Tony Blair, John Reid and Jack Straw were emphatic about this, and it was left to liberal Conservatives and the Liberal Democrats to mount the barricades in defence of civil liberties. Andrew Marr conceded at the time that the New Labour ministers were ‘probably right’. With the benefit of hindsight, others will probably agree. As Gordon Brown eyed the premiership, his rhetoric was similarly tough, but as Blair was forced to turn to the ‘war on terror’ and Iraq, he failed to concentrate enough on domestic policy. By 2005, neither of them could be bothered to disguise their mutual enmity, as pictured above. A gap seemed to open up between Blair’s enthusiasm for market ideas in the reform of health and schools and Brown’s determination to deliver better lives for the working poor. Brown was also keen on bringing private capital into public services, but there was a difference in emphasis which both men played up. Blair claimed that the New Labour government was ‘best when we are at our boldest’. But Brown retorted that it was ‘best when we are Labour’.

Tony Blair’s legacy continued to be paraded on the streets of Britain,
here blaming him and George Bush for the rise of ‘Islamic State’ in Iraq.
Asylum Seekers, EU ‘Guest’ Workers & Immigrants:
One result of the long Iraqi conflict, in which President Bush declared major combat operations over on 1 May 2003, was the arrival of many Iraqi asylum-seekers in Britain: Kurds, as well as Shiites and Sunnis. This attracted little comment at the time, because there had been both Iraqi and Iranian refugees in Britain since the 1970s, especially as students, and the fresh influx was only a small part of a much larger migration into the country which changed it fundamentally during the Blair years. This was a multi-lingual migration, including many Poles, some Hungarians and other Eastern Europeans whose countries had joined the EU and its single market in 2004. When the EU expanded, Britain decided that, unlike France or Germany, it would not try to delay opening the country to migrant workers. The accession treaties gave nationals from these countries the right to freedom of movement and settlement, and with average earnings three times higher in the UK, this was a benefit which Eastern Europeans were keen to take advantage of. Some member states, however, exercised their right to ‘derogation’ from the treaties, whereby they would only permit migrant workers to be employed if employers were unable to find a local candidate. In European Union legislation, a derogation means that a member state has opted not to enforce a specific provision of a treaty, owing to internal circumstances (typically a state of emergency), or to delay its full implementation for a transitional period. The UK decided not to exercise this option.
There were also sizeable inflows of western Europeans, though these were mostly students, who (somewhat controversially) were also counted in the immigration statistics, and young professionals with multi-national companies. At the same time, there was continued immigration from Africa, the Middle East and Afghanistan, as well as from Russia, Australia, South Africa and North America. In 2005, according to the Office for National Statistics, ‘immigrants’ were arriving to live in Britain at the rate of 1,500 a day. Since Tony Blair had been in power, more than 1.3 million had arrived. By the mid-2000s, English was no longer the first language of half the primary school children in London, and the capital had more than 350 different first languages. Five years later, the same could be said of many towns in Kent and other Eastern counties of England.
The poorer of the new migrant groups were almost entirely unrepresented in politics, but radically changed the sights, sounds and scents of urban Britain, and even some of its market towns. The veiled women of the Muslim world or its more traditionalist Arab, Afghan and Pakistani quarters became common sights on the streets, from Kent to Scotland and across to South Wales. Polish tradesmen, fruit-pickers and factory workers were soon followed by shops owned by Poles or stocking Polish and East European delicacies and selling Polish newspapers and magazines. Even road signs appeared in Polish, though in Kent these were mainly put in place along trucking routes used by Polish drivers, where for many years signs had been in French and German, a recognition of the employment changes in the long-distance haulage industry. Even as far north as Cheshire (see below), these were put in place to help monolingual truckers using trunk roads, rather than local Polish residents, most of whom had enough English to understand such signs either upon arrival or shortly afterwards. Although specialist classes in English had to be laid on in schools and community centres, there was little evidence that multi-lingual migrants had a negative long-term impact on local children and wider communities. In fact, schools were soon reporting a positive impact on attitudes toward learning and on general educational standards.

Problems were posed, however, by the operations of people smugglers and criminal gangs. Chinese villagers were involved in a particular tragedy when nineteen of them, picking cockles in Morecambe Bay, were caught by the notorious tides and drowned. Many more were working for ‘gang-masters’ as virtual, in some cases actual, ‘slaves’. Russian voices became common on the London Underground, and among prostitutes on the streets. The British Isles found themselves to be ‘islands in the stream’ of international migration, the chosen ‘sceptred isle’ destinations of millions of newcomers. Unlike Germany, Britain was no longer a dominant manufacturing country but had rather become, by the late twentieth century, a popular place to develop digital and financial products and services. Together with the United States and against the Soviet Union, it had been determined to preserve a system of representative democracy and the free market. Within the EU, Britain maintained its earlier determination to resist the Franco-German federalist model, with its ‘social chapter’ involving ever tighter controls over international corporations and ever closer political union. Britain had always gone out into the world. Now, increasingly, the world came to Britain, whether poor immigrants, rich corporations or Chinese manufacturers.

Multilingual & Multicultural Britain:
Immigration had always been a constant factor in British life; now it was also a fact of life which Europe and the whole world had to come to terms with. Earlier post-war migrations to Britain had provoked a racialist backlash, riots, the rise of extreme right-wing organisations, and a series of new laws aimed at controlling both immigration from the Commonwealth and the backlash to it. The later migrations were controversial in different ways. The ‘Windrush’ arrivals from the Caribbean and those from the Indian subcontinent were people who looked different but who spoke the same language and in many ways had had a similar education to that of the ‘native’ British. Many of the later migrants from Eastern Europe looked similar to the white British but shared little by way of a common linguistic and cultural background. However, it’s not entirely true to suggest, as Andrew Marr seems to, that they did not have a shared history. Certainly, through no fault of their own, the Eastern Europeans had been cut off from their western counterparts by their absorption into the Soviet Russian Empire after the Second World War, but in the first half of the century, Poland had helped the British Empire to subdue its greatest rival, Germany, as had most of the peoples of the former Yugoslavia. Even during the Soviet ‘occupation’ of these countries, many of their citizens had found refuge in Britain.
Moreover, by the early 1990s, Britain had already become both a multilingual and a multicultural nation. In 1991, Safder Alladina and Viv Edwards published a book for the Longman Linguistics Library which detailed the Hungarian, Lithuanian, Polish, Ukrainian and Yiddish speech communities of previous generations. Growing up in Birmingham, I certainly heard many Polish, Yiddish, Yugoslav and Greek accents among my neighbours and parents of school friends, at least as often as I heard Welsh, Irish, Caribbean, Indian and Pakistani accents. The Longman book begins with a foreword by Debi Prasanna Pattanayak in which she stated that the Language Census of 1987 had shown that there were 172 different languages spoken by children in the schools of the Inner London Education Authority. In an interesting precursor of the controversy to come, she related how the reaction in many quarters was stunned disbelief, and how one British educationalist had told her that England had become a third world country. She commented:
After believing in the supremacy of English as the universal language, it was difficult to acknowledge that the UK was now one of the greatest immigrant nations of the modern world. It was also hard to see that the current plurality is based on a continuity of heritage. … Britain is on the crossroads. It can take an isolationist stance in relation to its internal cultural environment. It can create a resilient society by trusting its citizens to be British not only in political but in cultural terms. The first road will mean severing dialogue with the many heritages which have made the country fertile. The second road would be working together with cultural harmony for the betterment of the country. Sharing and participation would ensure not only political but cultural democracy. The choice is between mediocrity and creativity.

Language and dialect in the British Isles, showing the linguistic diversity in many English cities by 1991 as a result of Commonwealth immigration as well as the survival and revival of many of the older Celtic languages and dialects of English.
Such ‘liberal’, ‘multi-cultural’ views may be unfashionable now, more than a quarter of a century later, but it is perhaps worth stopping to look back on that cultural crossroads, and to ask whether we are now back at that same crossroads, or have arrived at another one. By the 1990s, the multilingual setting in which new Englishes evolved had become far more diverse than it had been in the 1940s, due to immigration from the Indian subcontinent, the Caribbean, the Far East, and West and East Africa. Commonwealth immigration had introduced a new level of linguistic and cultural diversity. The largest of the ‘community languages’ was Punjabi, with over half a million speakers, but there were also substantial communities of Gujarati speakers (perhaps a third of a million) and a hundred thousand Bengali speakers. In some areas, such as East London, public signs and notices recognised this (see below). Bengali-speaking children formed the most recent and largest linguistic minority within the ILEA and, because the majority of them had been born in Bangladesh, they were inevitably in the greatest need of language support within the schools.


Birmingham’s booming postwar economy attracted West Indian settlers from Jamaica, Barbados and St Kitts in the 1950s. By 1971, the South Asian and West Indian populations were equal in size and concentrated in the inner city wards of North and Central Birmingham (see the map above). After the hostility towards New Commonwealth immigrants in some sections of the local white population in the 1960s and ’70s, they had become more established in cities like Birmingham, where places of worship, ethnic groceries, butchers and, perhaps most significantly, ‘balti’ restaurants began to proliferate in the 1980s and ’90s. The settlers materially changed the cultural and social life of the city, and most of the ‘white’ population believed that these changes were for the better. By 1991, Pakistanis had overtaken West Indians and Indians to become the largest single ethnic minority in Birmingham. The concentration of West Indian and South Asian British people in the inner city areas changed little by the end of the century, though there was an evident flight to the suburbs by Indians. As well as being poorly paid, the factory work available to South Asian immigrants, like the man in a Bradford textile factory below, was unskilled. By the early nineties, the decline of the textile industry over the previous two decades had led to high long-term unemployment in the immigrant communities in the Northern towns, leading to serious social problems.

Nor is it entirely true to suggest that, as referred to above, Caribbean arrivals in Britain faced few linguistic obstacles in integrating themselves into British life from the late 1940s to the late 1980s. By the end of these forty years, the British West Indian community had developed its own “patois”, which had a special place as a token of identity. One Jamaican schoolgirl living in London in the late eighties explained the social pressures that frowned on Jamaican English in Jamaica, but which made it almost obligatory in London. She had not been allowed to speak Jamaican Creole in front of her parents in Jamaica. When she arrived in Britain and went to school, she naturally tried to fit in by speaking the same patois, but some of her British Caribbean classmates told her that, as a “foreigner”, she should not try to be like them, and should speak only English. Yet she persevered with the patois, lost her British accent after a year, and was accepted by her classmates. For many Caribbean visitors to Britain, however, the patois of Brixton and Notting Hill was a stylized form that was not truly Jamaican, not least because British West Indians had come from all parts of the Caribbean. When another British West Indian girl, born in Britain, was taken to visit Jamaica, she found herself being teased about her London patois and told to speak English.

The predicament that still faced the ‘Black British’ in the late eighties and into the nineties was that, for all the rhetoric, they were still not fully accepted by the established ‘White community’. Racism was still an everyday reality for large numbers of British people. There was plenty of evidence of the ways in which Black people were systematically denied access to employment in all sections of the job market. The fact that a racist calamity like the murder in London of the black teenager Stephen Lawrence could happen in 1993 was testimony to how little British society’s willingness to face up to racism had changed since the 1950s. As a result, the British-Caribbean population could still not feel itself to be fully British. This was the poignant outcome of what the British Black writer Caryl Phillips has called “The Final Passage”, the title of his novel, which is narrated in Standard English with the direct speech of the characters rendered in Creole. Phillips migrated to Britain as a baby with his parents in the 1950s, and sums up his linguistic and cultural experience as follows:
“The paradox of my situation is that where most immigrants have to learn a new language, Caribbean immigrants have to learn a new form of the same language. It induces linguistic schizophrenia – you have an identity that mirrors the larger cultural confusion.”

One of his older characters in The Final Passage characterises “England” as a “college for the West Indian”, and, as Phillips himself put it, that is “symptomatic of the colonial situation; the language is divided as well”. As the “Windrush Scandal”, involving the deportation of British West Indians from the UK, has recently shown, this post-colonial “cultural confusion” still ‘colours’ political and institutional attitudes twenty-five years after the death of Stephen Lawrence, leading to discriminatory judgements by officials. This example shows how difficult it is to arrive at any simple chronological classification of migrations to Britain: the economic expansion of the 1950s and 1960s; the asylum-seekers of the 1970s and 1980s; and EU expansion and integration in the 1990s and the first decades of the 2000s. Such an approach assumes stereotypical patterns of settlement for the different groups, whereas the reality was much more diverse. Most South Asians, for example, arrived in Britain in the post-war period, but they were joining a migration ‘chain’ which had been established at the beginning of the twentieth century. Similarly, most Eastern European migrants arrived in Britain in several quite distinct waves of population movement. This led the authors of the Longman Linguistics book to organise it into geolinguistic areas, as shown in the figure below:

The Poles and Ukrainians of the immediate post-war period, the Hungarians in the 1950s, the Vietnamese refugees in the 1970s and the Tamils in the 1980s sought asylum in Britain as refugees. In contrast, settlers from India, Pakistan, Bangladesh and the Caribbean had, in the main, come from areas of high unemployment and/or low wages, for economic reasons. It was not possible, even then, to make a simple split between political and economic migrants since, even within the same group, motivations differed through time. The Eastern Europeans who had arrived in Britain since the Second World War had come for a variety of reasons; in many cases, they were joining earlier settlers trying either to escape poverty in the home country or to better their lot. A further important factor in the discussion about the various minority communities in Britain was the pattern of settlement. Some groups were concentrated into a relatively small geographical area, which made it possible to develop and maintain strong social networks; others were more dispersed and so found it more difficult to maintain a sense of community. Most Spaniards, Turks and Greeks were found in London, whereas Ukrainians and Poles were scattered throughout the country. In the case of the Poles, the communities outside London were sufficiently large to be able to sustain an active community life; in the case of the Ukrainians, however, the small numbers and the dispersed nature of the community made the task of forging a separate linguistic and cultural identity a great deal more difficult.
Groups who had little contact with the home country also faced very real difficulties in retaining their distinct identities. Until 1992, Lithuanians, Latvians, Ukrainians and Estonians were unable to travel freely to their countries of origin; neither could they receive visits from family members left behind; until the mid-noughties, there was no possibility of new immigration which would have the effect of revitalizing these communities in Britain. Nonetheless, they showed great resilience in maintaining their ethnic identity, not only through community involvement in the UK but by building links with similar groups in Europe and even in North America. The inevitable consequence of settlement in Britain was a shift from the mother tongue to English. The extent of this shift varied according to individual factors such as the degree of identification with the mother tongue culture; it also depended on group factors such as the size of the community, its degree of self-organisation and the length of time it had been established in Britain. For more recently arrived communities such as the Bangladeshis, the acquisition of English was clearly a more urgent priority than the maintenance of the mother tongue, whereas, for the settled Eastern Europeans, the shift to English was so complete that mother tongue teaching was often a more urgent community priority. There were reports of British-born Ukrainians and Yiddish-speaking Jews, brought up in predominantly English-speaking homes, who were striving to produce an environment in which their children could acquire their ‘heritage’ language.
Blair’s Open Door Policy & EU Freedom of Movement:
During the 1980s and ’90s, under the ‘rubric’ of multiculturalism, a steady stream of immigration into Britain continued, especially from the Indian subcontinent. But an unspoken consensus existed whereby immigration, while gradually increasing, was controlled. What happened after the Labour Party’s landslide victory in 1997 was a breaking of that consensus, according to Douglas Murray, the author of the recent (2017) book, The Strange Death of Europe. He argues that once in power, Tony Blair’s government oversaw an opening of the borders on a scale unparalleled even in the post-war decades. His government abolished the ‘primary purpose rule’, which had been used to filter out bogus marriage applications. The borders were opened to anyone deemed essential to the British economy, a definition so broad that it included restaurant workers as ‘skilled labourers’. And as well as opening the door to the rest of the world, the government opened the door to the new EU member states after 2004. It was the effects of all of this, and more, that created the picture of the country which was eventually revealed in the 2011 Census, published at the end of 2012.

The number of non-EU nationals moving to settle in Britain was expected to increase only from 100,000 a year in 1997 to 170,000 in 2004. In fact, the government’s predictions for the number of new arrivals over the five years 1999-2004 were out by almost a million people. It also failed to anticipate that the UK might be an attractive destination for people from countries with significantly lower average income levels or without a minimum wage. For these reasons, the number of Eastern European migrants living in Britain rose from 170,000 in 2004 to 1.24 million in 2013. Whether the surge in migration went unnoticed or was officially approved, successive governments did not attempt to restrict it until after the 2015 election, by which time it was too late.
(to be continued)
Posted January 15, 2019 by AngloMagyarMedia in Affluence, Africa, Arabs, Assimilation, asylum seekers, Belfast, Birmingham, Black Market, Britain, British history, Britons, Bulgaria, Calais, Caribbean, Celtic, Celts, Child Welfare, Cold War, Colonisation, Commonwealth, Communism, Compromise, Conservative Party, decolonisation, democracy, Demography, Discourse Analysis, Domesticity, Economics, Education, Empire, English Language, Europe, European Economic Community, European Union, Factories, History, Home Counties, Humanism, Humanitarianism, Hungary, Immigration, Imperialism, India, Integration, Iraq, Ireland, Journalism, Labour Party, liberal democracy, liberalism, Linguistics, manufacturing, Margaret Thatcher, Midlands, Migration, Militancy, multiculturalism, multilingualism, Music, Mythology, Narrative, National Health Service (NHS), New Labour, Old English, Population, Poverty, privatization, Racism, Refugees, Respectability, Scotland, Socialist, south Wales, terror, terrorism, Thatcherism, Unemployment, United Kingdom, United Nations, Victorian, Wales, Welsh language, xenophobia, Yugoslavia
Tagged with 'Middle Englanders', al Qaeda, Alan Milburn, Albert Dock, Amazon, Amsterdam, Andrew Marr, Aneurin Bevan, Anti-Social Behaviour Order ('Asbo'), Bangladesh, Bengali, Blair-Brown government, Bradford, Brixton, broadband, Bupa, Business, Caribbean, Caryl Philipps, CCTV, Charles Clarke, Cheshire, Child Poverty Action Group, Chinese, civil liberties, Creole, culture, current-events, derogation, diversity, Douglas Murray, downloading, eBay, Education, England, Ethnic minorities, EU, European Community, federalism, France, Freedom of Movement, GDP, Germany, Gordon Brown, Gujurati, habeas corpus, History, Iraq, Ireland, Islam, Jack Straw, Jamaica, Jihad, John Reid, Kurds, Liverpool, Maastricht Treaty, Michael Howard, minimum wage, Morecambe Bay, New Deal, Notting Hill, Office of National Statistics (ONS), Olympic Games 2012, Open Door Policy, Pakistan, patois, Poland, politics, Polly Toynbee, Punjabi, Queen Elizabeth II, Railtrack, redistribution of wealth, referendum, Robert Winston, Royal College of Nursing, security services, society, South Asian, Spice Girls, Sure Start, surveillance, The Final Passage, the Guardian, third world, Tim Berners-Lee, Tony Blair, Treasury, UN, Victorian values, Wanless Report, West Indian, Westminster, Whitehall, Windrush, working families' tax credit
Chapter Four: Those Two Impostors: Triumph and Disaster
In 1978 the House of Lords held a special debate on the state of the English language. Due to rapid social and economic transformation, thanks mainly to the technology of mass communication, fears for the future of British English had become one of the staples of newspaper columns and television chat shows. Now it was the turn of the peers of the realm to have their say. The record of the debate, The English language: Deterioration and Usage, makes very interesting reading. All but one of the speakers in it accepted, without question, that the language was deteriorating. They unrolled a catalogue of familiar complaints. One peer remarked,
It seems to me virtually impossible for a modern poet to write ‘the choir of gay companions’. What has happened is that a word has been used for propaganda purposes which have destroyed its useful meaning in English.
Pronunciation was also considered to be slipping, and here the BBC came in for a substantial amount of criticism for failing in its clear duty to uphold the standards of English. There was praise for the Plain English Campaign, which had begun a series of successful battles against Civil Service gobbledygook, and complaints about the prevalence of jargon in official documents. There were also laments over the latest translations of the Bible and the recent revisions of the Book of Common Prayer. And, of course, more than one noble speaker blamed the Americans. Lord Somers observed:
If there is a more hideous language on the face of the earth than the American form of English, I should like to know what it is!
In fact, the noble peers blamed just about every institution in society – the schools, the universities, and the mass media. Children were no longer educated in grammar or the classics. Newspapers, radio and television were familiarising the public with a language that depends on generalisations which are usually imprecise and often deliberately ambiguous… a language that makes unblushing use of jargon whenever that can assist evasion. They also displayed more than a touch of xenophobia, one of them arguing rather perversely that a major cause of deterioration in the use of the English language is very simply the enormous increase in the number of people who are using it. The most revealing comment of all was perhaps the one made by Lord Davies of Leek, who remarked,
Am I right in assuming that in an age tortured by uncertainty with respect to religion, God, family, self, money and property, there is a worldwide collapse of not only the values of the past but of our language which, more and more, tends to be vague, indecisive, careless and often callous?
Certainly, as with sexual intercourse, the moral relativist revolution of the sixties and seventies had also encouraged a more permissive approach to social intercourse. Tongues were loosened and noses unblocked. However, Lord Davies’ remark was using language-change as a means of complaining about deeper changes in society. Against this, we might point out that speakers of Standard British Mercian English have often taken second place to other users, whether Scots, Irish or Welsh, the East Anglian Founding Fathers, Cockneys, Jews, Caribbeans or Indians. Influential changes and diversifications have usually occurred at the cultural centre of the language, in Britain itself, rather than at its fringes. From this perspective, Standard British English remains as radical a tool as it was in the sixteenth and seventeenth centuries. Just as in the ninth century the fusion of Norse and Saxon languages was happening far from the main centres of trade and administration in the South of England, so in the late twentieth century the dominant forms, accents and voices in British English as it was used and taught were not those of the Establishment, speaking in the House of Lords, but those of Brixton, the East End and Coventry.
The Celtic countries and provinces also have their own brands of English, each of which can be subdivided into further localised varieties. For example, Welsh English, or Anglo-Welsh, has differing northern and southern varieties, also spoken in some of the border areas of England. There is also the traditional Northumbrian Saxon dialect, sometimes referred to as the Scots language, as well as Lallans, another Lowland Scots dialect; both have literary traditions. In Northern Ireland, Ulster Scots remains the dialect of those who migrated from south-west Scotland. While some traditional features of these varieties fall out of use, other innovations, both regional and national, continue to be made to British English, so that the idea that there will one day be a uniform standard spoken English throughout the British Isles is unlikely ever to become a reality. In addition, there are still (officially) half a million Welsh-speakers, about one in five of the resident population of Wales. In Scotland, the Gaelic speech community is just over one per cent of the population, sparsely distributed through the western islands and highlands. In the Republic of Ireland, about forty per cent of the population have some level of Irish, but the number of habitual speakers is far lower. There are few monoglot speakers of either Irish or Welsh, but both languages are taught to school-leaving age to all students, thus ensuring continuing bilingualism. Both languages have strongly influenced the forms, vocabulary and pronunciation of Anglo-Welsh and Irish English, sometimes deliberately recorded by poets and writers.

Above: Factory workers strike over low pay
In 1978-79 there was an explosion of resentment, largely by poorly paid public employees, against a minority Labour government incomes policy which they felt was discriminatory. It began earlier in 1978, but got far worse with a series of strikes going into the winter, resulting in rubbish being left piled up in the streets throughout the country. This became known as the ‘Winter of Discontent’, after the opening soliloquy of Shakespeare’s history play Richard III, spoken by Richard, Duke of Gloucester. The scenes provided convincing propaganda for the Conservatives in the subsequent General Election of May 1979. Using the slogan ‘Labour isn’t working’, which appeared on huge hoardings showing long dole queues, they came back to power with a clear majority, led by Margaret Thatcher, who promised a return to the values which had made Victorian Britain great. However, what the British people got was more of a return to the hard-nosed Toryism of the interwar years, as the Thatcher government set about the task of deliberately lengthening those dole queues. As wage-rises were believed to be the main source of inflation, heavy unemployment, it was often openly argued, would weaken trade union bargaining power and was a price worth paying. At the same time, an economic squeeze was introduced, involving heavy tax increases and a reduction in public borrowing to deflate the economy, thus reducing both demand and employment. In the 1980s, two million manufacturing jobs disappeared, most of them by 1982.

Above: Rubbish is left piled up in London’s Leicester Square in February 1979
In Coventry, nearly sixty thousand jobs were lost in this period of recession. The Conservative policy of high interest rates tended to overvalue the pound, making Coventry’s specialist cars more expensive in the USA, their major market, and leading to a rapid decline in demand. Also, the Leyland management embarked on a new rationalisation plan. The company’s production was to be concentrated at its Cowley and Longbridge plants. Triumph production was transferred to Cowley, and Rover models were to be produced at the new Solihull plant. The Coventry engine plant at Courthouse Green was closed and Alvis, Climax and Jaguar were sold off to private buyers. In these first three years of the Thatcher government, the number of Leyland employees in the city fell from twenty-seven thousand to eight thousand. One writer summarised the effects of Conservative policy on Coventry in these years as turning a process of gentle decline into quickening collapse. Overall, the city’s top manufacturing firms shed thirty-one thousand workers between 1979 and 1982. Well-known pillars of Coventry’s economic base such as Herbert’s, Triumph Motors and Renold’s all disappeared. Unemployment had stood at just five per cent in 1979, the same level as in 1971. By 1982 it had risen to sixteen per cent.

None of this had been expected locally when the Thatcher government came to power. After all, Coventry had prospered reasonably well during the previous Tory administrations. The last real boom in the local economy had been stimulated by the policies of Ted Heath’s Chancellor, Anthony Barber. However, the brakes were applied rather than released by the new government, and monetarist policy was quick to bite into local industry. Redundancy lists and closure notices in the local press became as depressingly regular as the obituary column. The biggest surprise was the lack of resistance from the local Labour movement, given Coventry’s still formidable trade union presence. An atmosphere of bewilderment and an element of resignation characterised the responses of many trades-union officials. It was as if the message of decades of anti-union editorials in the Coventry Evening Telegraph had finally sunk in. There were signs of resistance at Longbridge, but the BL boss, Michael Edwardes, had introduced a tough new industrial relations programme which had seen the removal from the plant of ‘Red Robbo’, Britain’s strongest motor factory trade union leader. He had also closed the Speke factory on Merseyside, demonstrating that he could and would close plants in the face of trade union opposition. Coventry’s car workers and their union leaders had plenty of experience of local wage bargaining in boom times, but lacked strategies to resist factory closures in times of recession. Factory occupation, imitating its successful use on the continent, had been tried at the Meriden Triumph Motorcycle factory, but with disastrous results. The opposition from workers was undoubtedly diminished by redundancy payments, which in many cases promised to cushion families for a year or two from the still unrealised effects of the recession.
Above: Employment levels in Coventry
Young people were the real victims of these redundancies, as there were now no places for them to fill. The most depressing feature of Coventry’s unemployment was that the most severely affected were the teenagers leaving the city’s newly-completed network of Community Comprehensives. As the recession hit the city, large numbers of them joined the job market only to find that the expected opportunities in the numerous factories had evaporated. By June 1980, forty-six per cent of the city’s sixteen to eighteen year-olds were seeking employment, and over half of the fourteen thousand who had left school the previous year were still unemployed. Much-prized craft apprenticeships all but vanished, and only ninety-five apprentices commenced training in 1981. The Local Education Authority was pioneering in its attempts to provide even basic employment and training for youngsters in cooperation with central government schemes and with major firms such as GEC and Courtaulds. It established a city-wide Careers Service, with full-time officers attached to individual schools but working from a centralised service for employers and school leavers. In 1981-82, some 5,270 youths were found posts in training courses, work experience and community projects, but with limited long-term effects. The early 1980s were barren years for Coventry youngsters, despite the emergence of their own pop group, The Specials, and their own theme song, Ghost Town, which also gave vent to what was becoming a national phenomenon. The lyric’s sombre comparison of boom time and bust was felt much more sharply in Coventry than elsewhere.
Coventry paid a very heavy price in the 1980s for its over-commitment to the car industry, suffering more than other comparable Midland towns such as Leicester and Nottingham, both of which had broader-based economies. Its peculiar dependence on manufacturing and its historically weak tertiary sector meant that it was a poor location for the so-called sunrise industries. These were high-tech enterprises, based largely along the axial belt running from London to Slough, Reading and Swindon, so they had little initial impact on unemployment in Coventry and other Midland and Northern industrial centres. The growth in service industries was also, initially at least, mainly to the benefit of the traditional administrative centres, such as Birmingham, rather than their West Midland neighbours. Little development work took place in local industry; meanwhile, Nissan recruited hundreds of foremen from Coventry for its new plant in Sunderland, announced before the Thatcher government came to power, and Talbot removed its Whitley research and development facility to Paris in 1983, along with its French-speaking Coventrians. Only at Leyland's Canley site did research provide a service for plants outside the city. For the first time in a hundred years, Coventry had become a net exporter of labour. By the time of the 1981 Census, the city had already lost 7.5 per cent of its 1971 population. The main losses were among the young skilled and technical management sectors, people whom any town or city can ill afford to lose. Summing up the city's position at this time, Lancaster and Mason emphasised the dramatic transition in its fortunes from boomtown, a magnet for labour from the depressed areas, to a depressed district itself:
Coventry in the mid 1980s displays none of the confidence in the future that was so apparent in the immediate post-war years. The city, which for four decades was the natural habitat of the affluent industrial worker, is finding it difficult to adjust to a situation where the local authority and university rank amongst the largest employers. Coventry's self-image of progressiveness and modernity has all but vanished. The citizens now largely identify themselves and their environment as part of depressed Britain.
Above: A 1982 cartoon: Britain was at war with Argentina over the Falkland Islands. The inhabitants of the islands, a dependent territory of the United Kingdom, wanted to remain under British rule, but Argentina invaded.
Thatcher was victorious, but it was a costly war for the British.
Below: The Royal Marines march towards Port Stanley during the Falklands War, June 1982

The government had promised in 1979 that a restructuring of the economy would be followed by increased investment and employment opportunities, but three years later, in the spring of 1982, there was no sign of this promise being kept. There had already been serious rioting by the disaffected of Brixton in 1981. After this, the Tories had looked destined for defeat in the 1983 General Election, but following the Falklands War, the Iron Lady, also variously characterised as Boadicea and Britannia, swept back to power on a tidal wave of revived jingoistic imperialism. Even in Labour heartlands, such as south Wales, the Tories made major gains. The government then took a more confrontational approach at home. As in the 1920s, brutal rationalisation through the closure or selling off of uneconomic enterprises, or through wage or job reductions, was met by determined opposition, never tougher than in the confrontation of 1984-85 with the National Union of Mineworkers, led by Arthur Scargill. The National Coal Board, supported by the government, put forward a massive programme of pit closures. The bitter, year-long miners' strike which followed was roundly defeated, amid scenes of mass picketing and some violence from both miners and police. Ultimately the government proved too determined even for the miners and had, in any case, built up the coal stocks and other resources needed to withstand the anticipated strike.


Above: Miners’ leader, Arthur Scargill/ Striking Yorkshire miners
However, the strike and the colliery closures left a legacy of bitterness and division in Britain which was only too apparent at the time of Margaret Thatcher's recent ceremonial funeral, and which is the subject or background of many recent films, some of which have distorted or trivialised our recollection of the reality. Among the better representations is Billy Elliot. Under the thirty-year rule, the government documents from 1984 have only just become available, so we can now look forward to more rounded perspectives from historians on these events. Already, politicians have called for government apologies to be given to the miners and their families.

Above: In the Durham Coalfield, pits were often the only real source of employment in local communities,
so the economic and social impact of closures could be devastating.
The 1984-5 Strike was an attempt to force a reversal of the decline.
The pit closures went ahead and the severe contraction of the mining industry continued: it vanished altogether in Kent, while in Durham two-thirds of the pits were closed. The government had little interest in ensuring the survival of the industry, determined to break its militant and well-organised union. The social cost of the closures, especially in places in which mining was the single major employer, as in many of the pit villages of Durham and the valleys of south Wales, was devastating. The entire local economy was crippled. On Tyneside and Merseyside a more general deindustrialisation occurred. Whole sections of industry, including coal, steel and shipbuilding, simply vanished from their traditional areas. Of all the areas of the United Kingdom, however, it was Northern Ireland that suffered the highest levels of unemployment. This was largely because the continuing sectarian violence discouraged inward investment in the six counties of the Province.
Nationally, in February 1986 there were over 3.4 million unemployed, although statistics were manipulated for political reasons and the real figure is therefore a matter of speculation. The socially corrosive effects of the return of widespread mass unemployment, not seen since the early thirties, were felt throughout the country, manifesting themselves in the further bouts of inner-city rioting that broke out in 1985. This was more serious for the government than the rioting against the Means Test of half a century before, because it occurred in cities throughout the country, rather than in depressed mining areas. London was just as vulnerable as Liverpool, and a crucial contributory factor was the number of young men of Asian and Caribbean origin who saw no hope of ever entering employment: opportunities were minimal and they felt particularly discriminated against. The term underclass was increasingly used to describe those who felt themselves to be completely excluded from the benefits of prosperity.
The only sizeable addition to the immigrant population during the recession of the early eighties was among the Polish community. After the Polish government's clampdown on the shipyard-led Solidarity movement in 1981, about two thousand refugees entered Britain. It was hard for researchers at the time to assess the extent to which these new arrivals influenced the already well-established Polish communities and organisations throughout Britain. The only reported figures, taken from a Language Census conducted by ILEA between 1981 and 1987, show nearly six hundred Polish pupils in London schools. Assuming that these were pupils with Polish as their strong first language (L1), requiring English as an Additional Language (EAL) tuition support, rather than established Polish bilingual children with English as a strong L1 or L2, we might conclude that the majority of these new immigrants settled in London, probably using already-established kinship networks and institutions. However much Polish was the language used at home, second-generation Polish children showed a strong preference for switching to English in conversations involving the expression of abstract concepts, even within the home context.
The Linguistic Minorities Project (LMP) Survey, conducted in Coventry and Bradford in 1985, showed that the Polish language skills of the adult respondents were, perhaps predictably, very high. However, the reported levels of fluency in Polish for members of respondents' households as a whole, likely to include a high proportion of British-born children, were significantly lower. Ninety-one per cent of the respondents in Coventry reported that their children used only English between themselves, and third-generation children in Polish Saturday schools used Polish only with the teachers and assistants. The influx of younger first-generation Poles in the 1980s helped to create new relationships in which second and third generations could use Polish in more realistic ways. The Survey also showed that in Coventry and Bradford, although almost half of Polish workers were in a workplace where at least one fellow-worker was a Polish-speaker, more than sixty per cent of them used only English with their workmates. Nevertheless, the Poles maintained a network of friends with whom they could use their mother tongue. They also had a wide range of opportunities to use the language in the community:
The Pole can buy Polish food from Polish shops, eat in Polish restaurants, sleep in Polish hotels or digs, with a Polish landlady, entertain friends in Polish clubs, attend a Polish doctor (over five hundred are practising in Britain) or dentist (eighty Polish dental surgeries), have a Polish priest and be buried by a Polish undertaker.
In the 1980s, Polish was not taught in mainstream schools, though some unsuccessful attempts were made in this direction in Stepney in 1981. Some years later, ILEA approached the Polish Educational Society Abroad with a similar suggestion, which also failed, partly because Poles insisted that mother tongue teaching must include Polish cultural content. In 1982 a section of Polish Studies was added to the School of Slavonic and Eastern European Studies at the University of London. For L1 or bilingual speakers of Polish, the degree lasted three years and included language, literature and history as compulsory elements. Additional options included economics, politics, geography and planning. The Polish Section also organised conferences for Polish teachers and pupils. Otherwise, only Oxford and Cambridge held lectures on Polish as a Slavonic language. These developments encouraged a note of optimism for the Polish community in Britain at a time when other immigrant groups were struggling to integrate, or felt alienated from the host country, particularly in the second and third generations. Together with the arrival of the Solidarity generation, there was a revival of awareness of linguistic and cultural roots in Britain in this decade. This helped the Poles to integrate into British society while resisting linguistic and cultural assimilation: becoming British did not necessarily involve losing their Polish identity.
By 1987, service industries were offering an alternative means of employment in Britain. Between 1983 and 1987 about one and a half million new jobs were created. Most of these were for women, many of whom were entering employment for the first time, and many of the jobs available were part-time and, of course, lower paid than the jobs lost in primary and secondary industries. By contrast, the total number of men in full-time employment fell still further. Many who had left mining or manufacturing for the service sector also earned far less. By the end of the century there were more people employed in Indian restaurants than in the coal and steel industries combined, but for much lower pay. The economic recovery that led to the growth of this new employment was based mainly on finance, banking and credit. Little was invested in home-grown manufacturing, but far more was invested overseas, with British foreign investments rising from 2.7 billion pounds in 1975 to 90 billion in 1985. At the same time, there was also a degree of re-industrialisation, especially in the Southeast, where new industries employing the most advanced technology were growing. In fact, many industries shed a large proportion of their workforce but, using new technology, maintained or improved their output. These new industries were certainly not confined to the M4 Corridor by the late eighties. By then, Nissan’s car plant in Sunderland had become the most productive in Europe, while Siemens established a microchip plant at Wallsend. However, such companies did not employ large numbers of local workers. Nissan recruited its foremen in Coventry, while Siemens invested more than a billion pounds, but only employed a workforce of about 1,800.
Regionally based industries suffered a dramatic decline during this period. Coal-mining, for example, was decimated in the decade following the 1984-85 miners' strike, not least because of the shift of the electricity generating industry to alternative energy sources, especially gas. During the period 1984-87 the coal industry shed a hundred and seventy thousand miners, and there was a further net loss of employment in the coalfields, with the exception of north Warwickshire and south Derbyshire, in the early 1990s. The economic effect upon local communities could be devastating, as the 1996 film Brassed Off accurately shows, with its memorable depiction of the social impact of the 1992 closure programme on the Yorkshire pit village of Grimethorpe.
The trouble with the economic strategy followed by the Thatcher governments was that South Wales, Lancashire, the West Riding of Yorkshire, Tyneside and Clydeside were precisely those regions that had risen to extraordinary prosperity as part of the British imperial enterprise. Now they were being written off as disposable assets, so what interest did the Scots in particular, but also the Welsh, have in remaining part of that enterprise, albeit a new corporation in the making? The understandable euphoria over Thatcher and her party winning three successive general elections disguised the fact that the last of these victories was gained at the price of perpetuating a deep rift in Britain's social geography. Without the Falklands factor to help revive the Union flag, a triumphalist English conservatism was increasingly imposing its rule over the other nations of an increasingly disunited Kingdom. Thatcher's constituency was, overwhelmingly, the well-off middle and professional classes in the south of England, whilst the distressed northern zones of derelict factories, pits, ports and terraced streets were left to rot and rust. People living in these latter areas were expected to lift themselves up by their own bootstraps, retrain for work in the up-and-coming industries of the future and, if need be, get on their bikes, as Tory Chairman Norman Tebbit advised, and move to one of the areas of strong economic growth, such as Cambridge, Milton Keynes or Slough, where those opportunities were clustered. However, little publicly funded retraining was provided and, where it was available, there was no guarantee of a job at the end of it. The point of the computer revolution in industry was to save labour, not to expand it.
In the late 1980s, the north-south divide seemed as intractable as it had all century, with high unemployment continuing to be concentrated in the declining manufacturing areas of the North and West of the British Isles. That the north-south divide increasingly had a political dimension as well as an economic one was borne out by the 1987 General Election in the UK. Margaret Thatcher's third majority was this time largely based on the votes of the South and East of England. North of a line running from the Severn estuary through Coventry to the Humber estuary, the long decline of Toryism, especially in Scotland, where the party was reduced to only ten seats, was apparent to all observers. At the same time, the national two-party system seemed to be breaking down, so that south of that line the Liberal-SDP Alliance were the main challengers to the Conservatives in many constituencies.
Culturally, the Thatcher counter-revolution ran into something of a cul-de-sac, or rather the cobbled streets of Salford, typified in the long-running TV soap opera, Coronation Street. Millions in the old British industrial economy had a deeply ingrained loyalty to the place where they had grown up, gone to school, got married and had their kids; to the pub, their park, their football team. In that sense at least the Social Revolution of the fifties and sixties had recreated cities and towns that, for all their ups and downs, their poverty and pain, were real communities. Fewer people were willing to give up on Liverpool and Leeds, Nottingham and Derby than the pure laws of the employment market-place demanded. For many working-class British people, it was their home which determined their quality of life, not the width of their wage-packet.
Not everything that the Thatcher governments did was out of tune with social reality. The sale of council houses created an owner-occupier class which, as Simon Schama has written, corresponded to the long passion of the British to be kings and queens of their own little castles. Sales of the remaining state-owned industries, such as the public utility companies, were less successful, since the concept of shareholding was much less deeply rooted in British traditions, and the mixed fortunes of both the privatised companies and their stocks did nothing to help change customs. Most misguided of all was the decision to call a flat-rate poll tax on every adult resident a community charge, and then to impose it first, as a trial run, in Scotland, where the Tories already had little support. The grocer's daughter from Grantham believed that it would be a good way of creating a property-owning, tax-paying democracy, in which people paid according to the size of their household. This was another mistaken assumption. Soon after, the Iron Lady was challenged for the leadership of her Party, and therefore of the country, and was forced to withdraw from the contest. She was replaced as PM by one of her loyal deputies, John Major, another middle-class anti-patrician, the son of a garden-gnome salesman, apparently committed to family values and a return to basics. Although it won the 1992 General Election, the Major government ended up being overwhelmed by an avalanche of sexual and financial scandals and blunders, as well as by the back-bench right wing in the House of Commons who wanted Britain to withdraw from the European Union.


The old north-south divide in Britain seemed to be eroding during the recession of the early 1990s, which hit southeast England relatively hard, but it soon reasserted itself with a vengeance later in the decade as young people moved south in search of jobs and property prices rose. Even though the shift towards service industries was reducing regional economic diversity, the geographical distribution of regions eligible for European structural funds for economic improvement confirmed the continuing north-south divide. The administrative structure of Britain also underwent major changes by the end of the nineties. The relative indifference of the Conservative ascendancy to the plight of industrial Scotland and Wales had transformed the prospects of the nationalist parties in both countries. In the 1987 election, Scottish and Welsh nationalists, previously confined mainly to middle-class, rural and intellectual constituencies, now made huge inroads into Conservative areas and even into the Labour heartlands of industrial south Wales and Clydeside.
In a 1992 poll in Scotland, half of those asked said that they were in favour of independence within the European Union. In the General Election of the same year, however, with Mrs Thatcher and her poll tax having departed the political scene, there was a minor Tory recovery. Five years later this was wiped out by the Labour landslide of 1997, when all the Conservative seats in both Scotland and Wales were lost. Only one Scottish seat was regained by the Tories in 2001. The Tories became labelled as a centralising, purely English party. Nationalist political sentiment grew in Scotland and, to a lesser extent, in Wales. The devolution promised and instituted by Tony Blair's new landslide Labour government did seem to take some of the momentum out of the nationalist fervour, but apparently at the price of stoking the fires of English nationalism among Westminster Tories, resentful at the Scots and Welsh having representatives in their own assemblies as well as in the UK Parliament. In 1999, twenty years after the first campaigns for devolution, a devolved Parliament was set up in Edinburgh, Wales got an Assembly in Cardiff, and Northern Ireland had a power-sharing Assembly again at Stormont, near Belfast. In 2000, an elected regional assembly was established for Greater London, the area covered by the inner and outer boroughs of the capital, with a directly elected Mayor. This new authority, which replaced the Greater London Council abolished by the Thatcher Government in 1986, was given responsibility for local planning and transport.

The process of deindustrialisation continued into the nineties with the closure of the Swan Hunter shipyard on the Tyne in May 1993. The last working shipyard in the region, it had failed to secure a warship contract, suffering the same long-term decline that reduced shipbuilding from an employer of two hundred thousand in 1914 to a mere twenty-six thousand by the end of the century. The closure devastated the local economy, especially as a bitter legal wrangle over redundancy payments left many former workers without any compensation at all for the loss of what they had believed was employment for life. As the map above shows, the closure's effects spread far beyond Tyneside and the Northeast, which were certainly badly hit, with two hundred and forty suppliers losing their contracts. According to Keynesian economics, the results of rising unemployment are multiplied as the demand for goods and services declines. The closure of Swan Hunter certainly had a widespread impact on suppliers as far afield as Southampton and Glasgow, as well as in the West Midlands and the Southeast, who lost valuable orders and therefore also had to make redundancies. Forty-five suppliers in Greater London also lost business. Thus, from the closure of one single, large-scale engineering concern, unemployment resulted even in the most prosperous parts of the country. In the opposite economic direction, the growing North Sea oil industry, with its demands for drilling platforms and support ships, helped to spread employment more widely throughout the Northeast and the eastern side of Scotland, a benefit that was also felt nationally throughout the UK. However, this did little in the short term to soften the blow of the Swan Hunter closure.


Overall, however, the 1990s were years of general and long-sustained economic expansion. The continued social impact of the decline in coal, steel and shipbuilding was to some extent mitigated by inward investment initiatives. Across most of the British Isles, there was also a continuing decline in the number of manufacturing jobs throughout the nineties. Although there was an overall recovery in the car industry, aided by the high pound in the export market, much of this was due to the new technology of robotics which made the industry far less labour-intensive and therefore more productive. The service sector, however, expanded, and general levels of unemployment, especially in Britain, fell dramatically in the 1990s. Financial services saw strong growth, particularly in places such as the London Docklands and Edinburgh. Indeed, by the end of the decade, the financial industry was the largest employer in northern manufacturing towns like Leeds, which grew rapidly, aided by its ability to offer a range of cultural facilities that helped to attract an array of UK company headquarters. Manchester, similarly, enjoyed a renaissance, particularly in music and football. Manchester United’s commercial success led it to become the world’s largest sports franchise.

Other areas of the country were helped by their ability to attract high technology industry. Silicon Glen in central Scotland was, by the end of the decade, the largest producer of computer equipment in Europe. Computing and software design was also one of the main engines of growth along the silicon highway of the M4 Corridor west of London. But areas of vigorous expansion were not necessarily dominated by new technologies. The economy of East Anglia, especially Cambridgeshire, had grown rapidly in the 1980s and continued to do so throughout the 1990s. While Cambridge itself, aided by the university-related science parks, fostered high-tech companies, especially in biotechnology and pharmaceuticals, expansion in Peterborough, for instance, was largely in low-tech areas of business services and distribution.

Getting around Britain was, at least, getting easier. By 1980 there were nearly one and a half thousand miles of motorway in Britain. In the last twenty years of the century, the congested motorway network was stretched to just over two thousand miles, mostly by linking existing sections. Motorway building and airport development were delayed by lengthy public enquiries and well-organised public protest. Improving transport links was seen as an important means of stimulating regional development as well as combating local congestion. Major road developments in the 1990s included the completion of the M25 orbital motorway around London, the Skye bridge and the M40 link between London and Birmingham. However, despite this construction programme, congestion remained a problem: the M25 was labelled the largest car park on the planet, while average traffic speeds in central London fell to only ten miles per hour in 2001, with a famous poster on the underground pointing out that this was the same speed as in 1901. Improvements to public transport networks tended to be concentrated in urban centres, such as the light rail networks in Manchester, Sheffield and Croydon. At the same time, the migration of some financial services and much of the Fleet Street national press to major new developments in London's Docklands prompted the development of the Docklands Light Railway and the Jubilee line extension, as well as some of the most expensive urban motorway in Europe. Undoubtedly, the most important transport development was the Channel Tunnel from Folkestone to Calais, completed in 1994. By the beginning of the new millennium, millions of people had travelled by rail from London to Paris in only three hours.

The development of Ashford in Kent, following the opening of the Channel Tunnel rail link, provides a good example of the relationship between transport links and general economic development. The railway had come to Ashford in 1842 and a railway works was established in the town. This was eventually run down and closed between 1981 and 1993, but this did not undermine the local economy. Instead, Ashford benefited from the Channel Tunnel rail link, which made use of the old railway lines running through the town, and its population actually grew by ten per cent in the 1990s. The completion of the Tunnel combined with the M25 London orbital motorway, with its M20 spur, to give the town an international catchment area of some eighty-five million people within a single day’s journey. This, together with the opening of Ashford International railway station as a main terminal for the rail link to Europe, attracted a range of engineering, financial, distribution and manufacturing companies. Fourteen business parks were opened in and around the town, together with a science park owned by Trinity College, Cambridge, and a popular outlet retail park on the outskirts of the town. By the beginning of the new millennium, the Channel Tunnel had transformed the economy of Kent. Ashford is closer to Paris and Brussels than it is to Manchester and Sheffield, both in time and distance. By the beginning of this century, it was in a position to be part of a truly international economy.


Transport policy was only one of the ways in which the EU increasingly came to shape the geography of the British Isles in the 1990s. It was a key factor in the creation of the new administrative regions of Britain in 1999. At the same time, a number of British local authorities opened offices in Brussels for lobbying purposes. The enthusiasm the Scottish National Party discovered in the late 1980s for the supposed benefits of independence in Europe may help to explain its subsequent revival. The European connection has proved less welcome in other quarters. Fishermen, particularly in Cornwall and on the East coast of England, have felt themselves the victims of the Common Fisheries Policy quota system. A strong sense of Euroscepticism developed in England in particular, fuelled by a mixture of concerns about sovereignty and economic policy. Nevertheless, links with Europe have been growing, whether via the Channel Tunnel, the connections between the French and British electricity grids, or airline policy, as has the number of policy decisions shaped by the EU. The pace of change quickened as a result of the 1987 Single European Act, as it became clear that the UK was becoming increasingly integrated with the European continent.

By the late 1990s, another indispensable marker of British identity, the monarchy, began to look tired, under the strain of being simultaneously a ceremonial and a familial institution. Ever since the abdication of Edward VIII in 1936, which suddenly propelled the ten-year-old Princess Elizabeth into the spotlight as heir apparent, membership of this institution was thought to require standards of personal behaviour well above the norm of late twentieth-century expectations. Just as the monarchy had gained from its marriages, especially the fairy-tale romance of the Prince of Wales and Lady Diana Spencer, whose wedding at St Paul's in 1981 had a worldwide audience of at least eight hundred million viewers, so it lost commensurately from the failure of those unions. The year 1992, referred to by the Queen as her annus horribilis, saw not just the separations of Charles and Diana (the Waleses) and of Andrew and Sarah (the Yorks), but also a major fire at Windsor Castle in November. When it was announced that the Crown would only pay for the replacement and repair of items in the royal private collection, and that repairs to the fabric would therefore fall to the tax-paying public, a serious debate began about the state of the monarchy's finances. In a poll, eight out of ten people asked thought the Queen should pay taxes on her private income, hitherto exempt. A year later, Buckingham Palace was opened to public tours for the first time and the Crown did agree to pay taxes. In 1997 the royal yacht Britannia, the emblem of the Queen's global presence, was decommissioned.

Above: A sea of flowers laid in tribute to Diana, Princess of Wales, outside Kensington Palace, London, August 1997
The most difficult moment came in August 1997, when Princess Diana was killed in a car accident in Paris. Royal protocol dictates that the royal standard is flown above Buckingham Palace only when the Queen is in residence, while the Union Flag flies above the royal palaces and other government and public buildings only on certain special days, such as the Princess Royal's birthday, 15 August. Flags are flown at half-mast only from the announcement of the death of a monarch until after the funeral, and, for other members of the royal family, only on the day of the funeral. Since it was holiday time and the Royal family were away from London, no flags were flying. The Queen, as the only person who could authorise an exception to these age-old customs, was criticised for not flying the Union flag at half-mast to meet the deep need of a grief-stricken public. Although Her Majesty meant no disrespect to her former daughter-in-law, the Crown lives and dies by such symbolic moments. The immense outpouring of public emotion in the days and weeks that followed was very different from the more conventional but no less heartfelt mourning of the Queen and her immediate family. The crisis was defused by a television address in which the Queen spoke informally and sincerely of her personal sorrow, joining the tidal wave of grief that swept over the whole country for England's rose, the People's Princess.


The monarchy was fully restored to popularity by the Millennium festivities, at which the Queen watched dancers from the Notting Hill carnival under the ill-fated Dome, and especially by the Golden Jubilee celebrations of 2002, which continued the newly struck royal mood of greater informality. Brian May, the lead guitarist of the rock band Queen, began the pop concert at Buckingham Palace by playing his instrumental version of God Save the Queen from the roof-top overlooking the Mall. Modern Britannia seemed at last to be at ease with its identity within a multi-national, multi-ethnic United Kingdom, in all its mongrel glory.

Above: Her Majesty Queen Elizabeth II in 2001, aged 75. She has already (in 2014) reigned for another thirteen years,
and celebrated her Diamond Jubilee in 2012.
Sources:
Bill Lancaster & Tony Mason (eds.) (n.d.), Life and Labour in a Twentieth Century City: The Experience of Coventry. Coventry: University of Warwick Cryfield Press.
Simon Schama (2002), A History of Britain: The Fate of Empire, 1776-2000. London: BBC Worldwide.
Robert McCrum, William Cran & Robert MacNeil (1987), The Story of English. London: Penguin Books.
John Haywood & Simon Hall et al. (2001), The Penguin Atlas of British and Irish History. London: Penguin Books.
Safder Alladina, Viv Edwards & Elizabeth Muir (1991), Multilingualism in the British Isles. Harlow: Longman (Linguistics).