Archive for the ‘Birmingham’ Category

Documenting Dialogues: The Roots & Growth of Modern Islam – Part One.   1 comment

Inter-Religious Literacy & Inter-Cultural Education:

In September 1982, I began training to become a teacher of Religious Education at Trinity College Carmarthen, then part of the University of Wales, now part of the University of Wales Trinity Saint David. I gained my PGCE (teaching qualification) the following summer and began teaching History and RE in a Church of England High School in Lancashire in September 1983. Before 1982, all I knew about Islam had been picked up from my Muslim friends at the inner-city school I attended in Birmingham, where the compulsory RE curriculum had been focused entirely on Christianity, taught by the choirmaster of St. Martin’s in the Bull Ring, and the GCE ‘O’ Level and ‘A’ Level syllabuses involved simple textual studies of the Old and New Testaments. My History courses at school and university only ever referred to Muslims as medieval Saracens and early modern invaders of Europe, with the Ottoman Empire ‘knocking at the gates of Vienna’. It was only later, when teaching the Schools Council’s Medicine Through Time History syllabus in Lancashire, that I discovered the extent to which Islamic scholars had kept classical scientific and medical knowledge alive throughout the ‘Dark Ages’ and the period of ‘the Inquisition’. For my pupils in a semi-rural part of Lancashire which had seen little immigration, ‘Muslims’ were people who lived in the old mill towns which, though only fifteen miles away, might just as well have been on a parallel planet. They were seen as objects of fun by the children, though there were one or two ‘National Front’ supporters among the staff. However, because we were a progressive Church school, all forms of racism were challenged, and we deliberately developed a multi-faith syllabus for ages 11-14, which involved a detailed understanding of both Judaism and Islam, including visits to places of worship. I remember one Pakistani boy joining my class, but when I left in 1986, there were no more than a handful of similar pupils in the school. In Coventry, where I went next, there were many more Muslims, Sikhs and Hindus in our classrooms, and the syllabus reflected this.

Fig. 1: Inter-Connecting Aspects of Islamic Culture (SHAP Working Party Handbook)

The diagram above is taken from a book, first published in 1977 by the Commission for Racial Equality, lent to me by a friend from teacher-training college (I was never very good at returning books!). It was put together by the SHAP Working Party on World Religions in Education and edited by the RE ‘guru’, W. Owen Cole. Entitled World Religions: A Handbook for Teachers, it was already in its fourth edition by 1982 and was full of resource lists and activities for teachers with varying knowledge of world religions and limited know-how when it came to teaching about them in primary and secondary schools. Since the Birmingham Agreed Syllabus and Handbook were published in 1975, there had been some polarisation among RE teachers, some of it caused by unsatisfactory reporting in the UK press. Although the need for the internationalisation of syllabuses was underlined by immigration to countries like Britain from Africa and Asia, where Christians were in a minority, in other countries like Sweden, where similar changes were made, there were at that time very few Hindu, Buddhist or Muslim pupils. The motivation for change came rather from the pupils themselves, whose home backgrounds were increasingly ‘secularised’ or ‘unchurched’. They and their parents asked for more equality in the treatment of religions in school. They accepted the need for studies of existing religions but denied that there should be a bias towards Christianity. Television, increasingly delivered by satellite from around the world in the 1980s, showed pupils that Europe, with its inherited Christian religion, was only a small part of the world, so that they, together with parents and teachers, saw the need to move from an ethnocentric to a more international, intercontinental and inter-faith curriculum in all school subjects.

Map 1: Muslims in the World in the Year 2000

However, in tackling problems of immigration and integration, it also seemed desirable that the teaching of religion in school should be fundamentally inter-religious. The model adopted for the teaching of religion in school was one which both immigrants of various faiths and indigenous people, of mainly Christian beliefs or of none, were able to accept: teaching about and of religions. Adherents of non-Christian beliefs were not always able to accept this model, however. Many Muslims maintained that the whole content of religious teaching had to be based on the Qur’an. In Britain, they continued to teach Muslim children separately, in mosques and Islamic centres, after school hours, though they did not withdraw them from statutory RE in schools. There were also ‘conservative’ Christian groups which opposed this development. In Sweden, for example, such groups looked back to a time when Luther’s catechism dominated the teaching of religion.

While teaching in Coventry, in 1987 I was invited by the Religious Society of Friends (Quakers) in the West Midlands of England to organise a ‘Peace Education Project’ based in the Selly Oak Colleges in Birmingham. That brought me into contact with many Jewish, Muslim, Christian and Humanist forums involved in inter-faith dialogue and Religious Education/ Global Education initiatives in schools and colleges throughout the region. As teams of teachers, we developed inter-cultural programmes in parallel in both Northern Ireland and West Midlands secondary schools, bridging (in the former) the sectarian divide and (in the latter) the ‘inner-city immigrant’/ suburban ‘white flight’ divide. The teachers involved in the programme in the West Midlands were concerned that their students should be given opportunities to explore the multicultural nature of society in the West Midlands constructively and creatively. There were large numbers of people of Asian and Afro-Caribbean heritage throughout the region, but the concentration of these ethnic minorities in particular areas meant that the schools involved were based in suburban communities containing a predominantly ‘white’, upwardly mobile working-class population, much of which had migrated from the multi-cultural ‘inner-city’ areas. It was thus considered important to enable the students to come to terms with their own fears and prejudices in this context. The module we developed had four main objectives:

(1) to raise awareness of conflict at the inter-personal and community levels;

(2) to raise awareness of the factors which generate/ escalate conflict in society e.g. prejudice, labelling, injustice, structural violence;

(3) to develop skills and attitudes in the handling of conflict situations e.g. assertiveness, affirmation, tolerance, mutual respect, co-operation;

(4) to enable pupils to develop for themselves their own creative responses to conflict.

The module used a pack of photographs, The World in Birmingham, which elicited responses to various images of contemporary Birmingham. When the photographs were collected in at the end of one session, some of those with images of Muslims had been marked with red felt-tip pen in ‘bullet points’. The materials encouraged the students to look at labelling, stereotyping and prejudice in a variety of ways, giving them a real sense of discrimination through role-playing. This was then linked to an examination of the way in which groups of people and whole communities were often labelled and stereotyped. In particular, the students were given opportunities to explore and criticise popularly held images of Handsworth in Birmingham, the scene of serious rioting in 1985 involving the Afro-Caribbean community. Later, and for the publication of the resulting module pack, Conflict and Reconciliation, by the Christian Education Movement in 1991, the ‘community’ study was replaced by materials showing Muslim life in Derby. At a primary level, we also worked in schools with large Muslim and Sikh majorities on ‘peacemaking skills’.

For the following five years I continued to be engaged with these projects, both in the UK and in Hungary, where, due to its forty years as a ‘People’s Republic’, there was no RE curriculum, though there was a growing interest in Peace and Global Education, including international exchange projects funded through the EU’s TEMPUS programme. Teachers were, then, necessarily engaged in the task of working out, in detail and in practice, the values appropriate for a multi-cultural society seeking to reconcile the maintenance of social harmony with the continuance of cultural diversity. In the 1980s and ’90s, it was obvious that this would not be achieved in the short term or by good intentions alone. Thought, experience and judgment were required. In helping children to formulate and clarify their own values, teachers needed continually to re-examine their own. A great deal depended on the primary school teacher, whose many responsibilities included guiding the child’s first steps into the world of organised social existence.

The Satanic Verses Affair, 1989:


As Douglas Murray has pointed out recently (2017), almost nobody would have predicted in the 1980s that the first decades of the twenty-first century in Europe would be riven by discussions about religion. The increasingly secular continent had expected to be able to leave faith behind it, or had at least recognised that after many centuries the place of religion in the modern state had been pretty much settled. If anybody in the latter part of the twentieth century had said that the early years of the next century in Europe would be rife with discussions about blasphemy, and that death for blasphemers would once again have to be accepted in Europe, any audience would have scorned the prediction and doubted the sanity of the claimant. It was not that the ‘early warning sirens’ went unheard; the problem was that they were so consistently ignored by so many outside the faith organisations. Britain had one of the earliest warnings, on Valentine’s Day 1989, when the Supreme Leader of the Islamic Republic of Iran, Ayatollah Khomeini, issued a document calling on ‘all zealous Muslims of the World’ to know that:

…the author of the book entitled ‘The Satanic Verses’ – which had been compiled, printed and published in opposition to Islam, the Prophet and the Qur’an – and all those involved in its publication who were aware of its contents, are sentenced to death. … I call on all zealous Muslims to execute them quickly, wherever they may be found, so that no one else will dare to insult the Muslim sanctities.

The head of a Tehran ‘charitable foundation’ followed this up with a $3 million reward for the British novelist’s murder (the bounty to be reduced to $2 million if the murderer were a non-Muslim). Britain and the rest of Europe learned the word fatwa for the first time. In less than twenty-four hours, Rushdie was in hiding, with protection provided by the British State. Soon thousands of British Muslims were demonstrating on the streets for the imposition of Islamic blasphemy laws in Britain. In Bradford, in the north of England, the novel was nailed to a piece of wood and burnt in front of thousands. Across the cultural and political worlds, people debated the reawakening question of blasphemy. On both sides of the political spectrum, there were those who believed that the novelist had transgressed the rules of courtesy. Lord Dacre (the historian Hugh Trevor-Roper) told a newspaper that he “would not shed a tear if some British Muslims, deploring his manners, should… seek to improve them.” The Foreign Secretary, Sir Geoffrey Howe, also went on television to condemn the author, and even the Prince of Wales was said to have remarked in private that Rushdie had deserved this condemnation. Those of us involved in delicate inter-faith relations in Birmingham at the time were certainly irate at what the novelist had written. The Archbishop of Canterbury, Robert Runcie, said that he “understood the Muslims’ feelings.” The Chief Rabbi, Immanuel Jakobovits, said that “both Mr Rushdie and the Ayatollah have abused the freedom of speech.” There were similar pronouncements from the leadership of the Catholic Church and the other denominations. The author John le Carré declared that “there is no law in life or nature that says great religions may be insulted with impunity”.

Undoubtedly, some of the reaction on both sides of the argument was ‘over the top’, but it did demonstrate that there were many in Britain who were prepared to value the right to religious faith more highly than the dubious rights of those who were determined to attack and ridicule it. Thanks to the protection measures put around Rushdie, he survived the fatwa, but there were many in the publishing industry and more widely in British society who ‘internalized’ it. Things that were published before 1989 would not be published again, and it became generally accepted that the founder of Islam was not a subject to be written or spoken of lightly or offensively. But the Rushdie affair also had the negative effect of making British society internalise the threat of violence from the radical Islam of the Iranian state. More positively, it ensured that British Muslims were better represented through the creation of the UK Action Committee on Islamic Affairs (UKACIA) and, later, of the Muslim Council of Britain (MCB), now the largest umbrella group representing British Muslims. The group was financially supported by Saudi Arabia, then vying with Iran to be the dominant Muslim power. In the short term, the creation of such groups benefited community relations as more liberal elements within the Islamic community, including some of those who had engaged with us in Christian-Muslim relations in Birmingham, came to the fore. We succeeded in establishing the first initial training course for Muslim teachers in 1991.

The creation of these representative groups also appeared useful for the government. Michael Howard, the Conservative Home Secretary, encouraged the creation of the MCB and made it the interlocutory group for the government. The success of the model led to it being exported to other Western countries, including France, where – despite its secular traditions – Nicolas Sarkozy encouraged the formation of representative bodies for French Muslims, most notably the Conseil Français du Culte Musulman (CFCM). But in the longer term, the model favoured those in ‘the Muslim Community’ who were already politically active and engaged, while disadvantaging those too busy with their businesses to bother with community politics. This meant that the Pakistani Islamist group, Jamaat-e-Islami, became the dominant group within these councils, and that their brand of sectarian politics, often unpopular in their country of origin, became the mainstream voice for Muslims in Europe, to the exclusion of more moderate ones. The Satanic Verses affair was, according to Rushdie himself, and in the opinion of many others, the prelude to the ‘main event’ which was to come twelve years later, on 11 September 2001, with the advent and impact of ‘Islamic’ terrorism.

The Five Pillars & Ten Forms of Religious Action:

For centuries Christian and Muslim writers composed imaginary dialogues between members of different faiths to explore, present and refute points of theology. St John of Damascus (d. 748) composed a dialogue on the divinity of Christ (which Muslims reject, along with the doctrine of the Trinity) and on the problem of free will, intending that it should be used as a manual for the guidance of Christians engaged in debate with learned Muslims. The differences over these questions are perhaps more apparent to Muslims than to Christians. Islam tends to be thought of by Muslims as a correction of Judaism and Christianity. For this reason, the differences tend to lie more in what Islam rejects as false than in what it asserts as true. For example, Muslims accept the doctrines of a Day of Judgment, the forgiveness of sins and the resurrection of the body. But although Islam shares the same dramatic emphasis on the Day of Judgment with Judaism and Christianity, it also stresses that mankind lives in the here and now, and that the mutual obligations between fellow humans should discourage ascetic withdrawal. The concepts of obligation and ‘right action’ can be traced out in terms of family, community and state. The essential point is that a spiritual dimension is an integral part of ‘the good life’. Besides their common historical roots, therefore, all three faiths of the ‘One God’ share fundamental doctrinal beliefs.

The problem of implicit value judgements has already been mentioned, but it is compounded by the tendency of Christian scholars to apply Christian concepts to the analysis of phenomena within Islam. Thus one eminent authority observes that Islam has a defective conception of sin. While this may well be so from the point of view of Christian dogma, ‘sin’ does not occupy the same place in the thought of Islam as it does in the Judeo-Christian tradition. We may wish to understand why this may be so, but we will not attain that understanding by labelling a particular belief or practice as ‘defective’ or ‘distorted’. Understanding can better be reached by accepting the methodological criteria for ‘Comparative Religion’ advocated by Michael Pye:

(1) a temporary suspension of presuppositions and conclusions about the truth, falsity, value or otherwise of a given set of concepts and actions, and…

(2) the attempt to elucidate as fully as possible what the concepts, actions, social associations and states of mind mean for the persons involved in them.


We could usefully adopt Pye’s categorising framework of ‘Religious Action; Groups; States of Mind; Concepts’ as an approach in our own dialogues with Islam. The category of ‘Religious Action’ can be sub-divided as follows:

1. Special Places, Times and Objects;

Places – e.g. Ka’aba stone, Mosques, Tombs;

Times – e.g. Friday prayers, Ramadan, Dhul-Hajj;

Objects – e.g. Qur’an;

2. The Use of the Body (e.g. prayer rituals, asceticism and fasting);

3. Separation and Ritual Cleansing (e.g. ablutions, diet, pollution & purification);

4. Sacrifice, Offering & Worship (atonement, thanksgiving, celebration);

5. Rehearsal of Significant Past or Myth (especially important for Shia);

6. Meditation & Prayer;

Thanksgiving – for the revelation given to the prophet;

Adoration – of God & his works;

Pledges – to uphold ethical standards;

7. Seeking Specific Benefits (rain, victory, wealth, health or exorcism);

8. Occasional Rites (e.g. rites of passage, work, hunting, building, harvesting);

9. Ethics & Society (Islam lays great emphasis on the link between religion & correct social relations and thus clearly defines roles and approved patterns of behaviour);

10. Propagation (the organised missionary method of the Christian churches, cf. the more informal process of proselytising in Islam).

In the 1980s, British teachers of RE were becoming increasingly familiar with such phrases as ‘pluralism’, ‘multi-cultural society’ and ‘mutual respect’. They were occurring, with ever-growing frequency, in speeches made by politicians and pundits, newspaper editorials and the reports of numberless committees and working parties. By the turn of the century, they had fossilised into meaningless clichés because they failed to acquire a more precise and comprehensive usage with clear implications for action and practice. ‘Respect’ came to mean merely avoiding open disrespect for the beliefs and customs of others, whereas Mutual Understanding, the phrase used in Northern Ireland’s schools, meant making a positive effort to achieve a genuine empathy for different cultural values and actively seeking accommodation between those values and our own. This was a more inter-cultural approach, more about integration than assimilation, and one which also suited our needs in the West Midlands. But while the two Christian traditions in Northern Ireland traced their conflict back to the sixteenth century, it was not so clear how such ‘accommodation’ could be truly mutual in a European society whose overall framework bore with it the marks of fifteen hundred years of dominant Christian values, assumptions, taboos, customs and prejudices. The following ‘approach’ to teaching and learning ‘about Islam’ suggests potential paths towards ‘mutual understanding’ and inter-faith dialogue:

Fig. 2: An approach to teaching and learning ‘about Islam’

This heritage, whatever we might have thought about the contemporary state of institutionalised Christianity, was still a living heritage and an active component of our daily lives and thoughts and actions. Not so with Islam, at least not in Britain. This Christian heritage may have been all but invisible to most people in Britain at the turn of the century, but those brought up in a different tradition would have had less difficulty in perceiving it. As a people, the late twentieth-century British did not seem to care too much for abstractions. Some teachers were therefore either suspicious of or uncomfortable with the idea that they should be teaching ‘values’. But values were commonplace in the school playground; ‘fairness’, ‘trust’ and ‘sharing’ were simple, integrating concepts with a wealth of childhood experience defining them. Islam as a faith is synonymous with ‘sharing’ and can be related both to the overtly religious experiences which bind the believer to God and to the ethical prescriptions which bind that believer to his/ her fellows. The Five Pillars of the Faith – summarised below – can be presented in the light of this concept of ‘sharing’:

(1) Shahada: The Profession of Faith, according to the formula: ‘There is no god but God. Muhammad is the Messenger of God.’ To this, the Shi’i minority add: ‘Ali is the Friend of God.’ It must be made in the presence of other believing Muslims. It is whispered in the ear of a new-born baby. These acts represent the sharing of knowledge of God’s truth revealed to Mankind;


(2) Salat: Prayer – may be individual, but is more often communal and on Fridays at noon is congregational, when all adult male members of the community are gathered. Males and females are usually separated, with women worshipping behind the men or in a screened-off section of the mosque. It takes the form of a ritual prostration in which the precise bodily movements are as important as the accompanying mental activity. Sunni Muslims are required to perform salat five times daily – at dawn, noon, mid-afternoon, sunset and evening. Worshippers must be in a state of ritual purity achieved by performing major or minor ablutions, depending on the degree of pollution brought about by bodily secretions, sexual activity, contact with animals and so on. Salat may be performed virtually anywhere, provided the worshipper faces the qibla, the direction of the Ka’aba in Makka. Muslims share the experience of worship on Fridays when a sermon is usually delivered by the Imam or prayer-leader;

(3) Sawm: Fasting during Ramadan. The fast, which takes place during daylight hours in the holy month of Ramadan, the ninth month of the lunar calendar, applies to eating, drinking, smoking, and sexual activity. The fast begins at dawn and ends at sunset. In Muslim countries such as Egypt, the breaking of the fast at sundown is an occasion for joyful celebration, with tables laid out in the streets and feasting that carries on well into the night. A pre-fast meal is usually served before dawn. Ramadan is traditionally an occasion for both family get-togethers and religious reflection. It is considered especially meritorious to recite the whole of the Qur’an during the sacred month. According to tradition, the Qur’an ‘came down’ on 27 Ramadan, the ‘Night of Power’. During fasting, the individual feels the pangs of hunger but does so as a member of an entire community which is fasting. The experience enables him to share the sufferings of the poor and hungry. The ending of the fast is marked by a great communal festival (Eid al-Fitr);

(4) Zakat: Alms-giving/ Compulsory Charity. This tax, payable once a year by all adult Muslims, is assessed at 2.5 per cent of capital assets over and above a minimum known as the nisab (by way of illustration, a 2.5 per cent levy on £10,000 of qualifying assets would amount to £250). For example, the nisab for livestock consists of five camels, thirty cows, or forty sheep or goats. It is also payable on bank deposits, precious metals, merchandise used in trade (but not personal possessions) and crops from tilled land. The recipients should be the poor and needy. In the past, zakat was collected by Muslim governments and distributed according to prescribed patterns, but in modern times it has usually been a matter for the believer’s conscience. Thus, the giving of alms exemplifies very clearly the Islamic obligation to share one’s property with others. Numerous quotations can be shared from the Qur’an relating to the duty to care for widows, orphans, etc. The institution of waqf should also be mentioned here, an endowment made by a Muslim to a religious, educational, or charitable cause;


(5) Hajj: The pilgrimage to Makka is a great spiritual experience in which the individual and collective aspects cannot be separated. It is an intense and demanding religious obligation, required of every adult Muslim at least once in his or her lifetime. The annual pilgrimage takes place during the first ten days of the twelfth lunar month (Dhu’l-Hijja), reaching its climax with the Feast of Sacrifice (Eid al-Adha), a festival honoured throughout the Muslim world with the slaughter of a specially fattened sheep, cow or camel in commemoration of the Sacrifice of Abraham. The ‘minor pilgrimage’, or ‘Umra, may be performed at any time of the year. In the past, Muslims from far-flung regions would spend the best part of a lifetime on the journey, working their way across Africa or Asia to reach the Holy City. On their return they enjoyed the honoured status of Hajji – one who has made the pilgrimage. By meeting together at Makka, Muslims can share their sense of belonging to a worldwide community, the umma, which embraces all believers. Sharing the hazards and expenses of a long journey also reinforces this experience. The diagram below (fig. 3) shows how the Islamic metaphor of an ‘inner journey’, or Haqiqah, can be explored in relation to the physical pilgrimage, or Hajj.

FIG. 3; ALTERNATIVE METAPHORS (II) – THE ‘INNER JOURNEY’


The concept of sharing can be explored further by examining how, in the Middle Ages, Muslims shared useful knowledge with the peoples they came into contact with, e.g. new crops, irrigation systems, medical and architectural techniques, Arabic numerals, etc. It is also important to emphasise that Muslims do not simply share experiences, beliefs and goods with each other, but they also share the following beliefs and values with both Christians and Jews:

(a) a belief in the One God as Creator and Guide;

(b) a concern for weaker members of the community;

(c) a duty to deal justly and kindly with fellow human beings;

(d) delight in the beauty of the natural world.

Images of Islam & Muslim Identities:

In 2000, I received, with my copy of The Times Higher Education Supplement, a small book by Malise Ruthven called Islam: A Very Short Introduction. At the time, I regarded it as a useful addition to my collection of small reference books, but I have since lent it to several friends who have engaged me in discussions of the role of Islam in the modern world, especially since the attacks on Washington and New York of 11th September 2001. Even before what has become known as 9/11, when I was teaching international students, including Muslims, at a Quaker school in Britain, Islam was seen by many as a hostile force, a possible replacement for communism as the main ideological challenge to post-Enlightenment liberalism. When we opened any newspaper or turned on the radio or television (in a time before social media added another dimension to political and educational discourse), there were stories about Islam. Many of these were accompanied by images of violence, whether from Kashmir, Bosnia, Algeria or Palestine. These images of Islam were usually of a hard, uncompromising faith whose adherents would resort to violence in defence of their principles or in order to impose their will on others. Yet for those of us more familiar with Muslims and their traditions over the previous quarter century or more, the image of ‘militant Islam’ was at odds with the faith that most of its adherents would regard as no less pacific than Buddhism or Christianity. The word ‘Islam’ in Arabic means ‘self-surrender’ and is closely related, etymologically, to Salaam, the word for ‘peace’. The universal greeting with which Muslims address each other, and foreigners, is as-Salaam ‘Alaikum – ‘Peace be upon you’.

In the eyes of many Muslims, this was a distorted image created by the Western media. In an age of sound-bites and newspaper headlines driven by tabloid sales, the lives and values of peace-loving majorities were inevitably obscured by the attention-seeking acts of noisy minorities. The news media acted like a fairground’s distorting mirror, exaggerating the militancy of the few while minimising the quietism or indifference of the many. Samuel Huntington, a Harvard professor, stated that Islam ‘has bloody borders’ and predicted that there would be a clash of civilisations between Islam, ‘the West’ and China after the collapse of Marxism-Leninism. Fred Halliday, a perceptive observer of world affairs, wrote that:

… the myth of confrontation is sustained from two apparently contradictory sides – from the camp of those, mainly in the West, seeking to turn the Muslim world into another enemy, and from those within the Islamic countries who advocate confrontation with the non-Muslim, particularly Western, world.

Defining Islam is far from a simple matter. Using Western categories that may be alien to Muslim perceptions, we may define Islam as both a religion and a political ideology; it is also, in some contexts, a mark of personal and group identity. These three definitions are neither mutually exclusive nor co-extensive. As already noted above, ‘Islam’ in Arabic is a verbal noun or gerund, meaning ‘surrendering to God’ as revealed through the message and the life of the prophet, Muhammad. In its primary meaning, as employed in the Qur’an and other foundational texts, the word ‘Muslim’ refers to one who so surrenders himself or herself, from the active participle of the verb aslama, ‘to surrender oneself’. It also has a secondary meaning, referring to one who takes on their parent’s confessional identity without necessarily subscribing to the beliefs and practices of the faith, just as a Jew may define herself as ‘Jewish’ without observing the Halacha. In non-Muslim societies, these Muslims may subscribe to, and be vested with, secular identities. The Muslim population of Bosnia, descendants of Slavs who converted to Islam under Ottoman rule, were not always noted for their attendance at prayers, abstention from alcohol, seclusion of women, and other social practices associated with believing Muslims in other parts of the world. They were officially designated as ‘Muslims’ in order to distinguish them from the mainly ‘Orthodox’ Serbs and ‘Catholic’ Croats under the former Yugoslav Republic. The ‘label’ therefore applied to their ethnic identity, rather than to their faithfulness to the religion.

In this limited context, which could also be applied to many of the second and third generations of immigrants from ‘Muslim societies’, there was no contradiction between being ‘culturally’ Muslim and simultaneously an atheist or agnostic. This is also the case with the word ‘Jewish’, but the adjective ‘Christian’ can only strictly be applied to a confessional identity. However, the secular definition of ‘Muslim’ has been rejected by modern Muslim scholars, who have tended to redraw the boundaries between themselves and ‘nominal’ Muslims, even going so far as to describe the latter as ‘infidels’ (i.e. ‘outside the faith’). Similarly, ‘evangelicals’ among Christians have reappropriated the word ‘Christian’ to apply solely to those who accept Jesus as Messiah, rather than accepting its use as a means of nominal reference to Western culture as predominantly Christian. Generally, there has been little consistency in the way such nomenclature has been used. Where ‘Muslims’, however secular or ‘cultural’, were beleaguered, as in Bosnia, rhetorical generosity would include them among the believers.

Identifying Islam as a Faith without Leadership:

No less than other successful modern religions, Islam contains a rich repertoire of concepts, symbols and spiritual disciplines through which believers maintain their identities and sense of being in the world, their sense of being in contact with God. The crisis many Muslims were facing at the turn of the millennium was not the result of some inherent lack of flexibility in the realm of ideas. Historically, Islam has shown enormous flexibility in adjusting to the complexities of the contemporary world and in accommodating different cultural systems within its overarching framework: the Abrahamic ‘family’ of western Asian monotheism which includes Judaism and Christianity as well as Islam, as one of three world faiths with a common familial ancestor and origin. The crisis of modern Islam (and few denied that such a crisis existed two decades ago, and still does) was not so much a ‘spiritual crisis’ as a crisis of authority – political, intellectual and legal as well as spiritual. The best community or umma ordained by God for enjoining what is right and forbidding what is wrong – a community that successfully conducted its affairs for fifteen centuries without external interference – demanded leadership. Yet outside the ‘Shi’ite’ minority tradition in Iran, a leadership commanding universal support among Muslims of all traditions was conspicuously absent.

Map: The Rise and Spread of Islam

There is no ‘church’ in Islam, no formally instituted body empowered to supervise or dictate the religious agenda, to articulate an ‘official’ Islamic view comparable to that of the Papacy, Bishops, Synods and Moderators, nor even Chief Rabbis. With the collapse of the Islamic superstate that lasted barely two centuries after the death of the Prophet Muhammad (see the map above), religious authority was entrusted to the ‘ulama (‘learned men’), a class of scholars whose role as guardians and interpreters of the tradition is much closer to that of the Pharisaic rabbis in Judaism than to a Christian ‘apostolic succession’. They did not exercise political power but acted as a brake on the power of the rulers, the sultans (‘authorities’) and amirs (‘commanders’), most of whom came to power by force of arms, interpreting and administering the divine law according to complex rules developed in the academies. The most prestigious of these academies, Al-Azhar in Cairo, was founded in AD 971 and claims to be the oldest university in the world. Though its rector enjoys a pre-eminent position in ‘Sunni’ Islam, his decisions are not binding on his peers. Similarly, although all Muslim governments appointed an official mufti from among the ‘ulama, his opinions were purely consultative unless supported in court by a judge, placing religious law under the law of the state. Mass education policies were undertaken by most post-colonial governments, thus short-circuiting the traditional body of scholarship surrounding the interpretation of the sacred texts and leading to a crisis of intellectual authority and a failure by the ‘ulama to incorporate reformist thinking into their discourse.

‘Islamism’ as a Political Ideology:

The word ‘fundamentalist’ had passed into English usage as a term of abuse, whether applied to faithful Christians or Muslims, but by the end of the century it was also being applied to Muslims who sought to establish an ‘Islamic state’. According to this view, it was the task of the Islamic state to enforce obedience to the revealed law of Islam – the Shari’a. The term ‘fundamentalist’ is problematic because of its Christian origins. Fundamentalism was originally a theological movement directed against liberal or modernist theology, in particular against those teachings that questioned literal understandings of ‘supernatural’ events such as the six-day creation, the virgin birth and the physical resurrection of Christ. Muslim writers and scholars described as ‘fundamentalist’ have all adopted some modernistic and allegorical interpretations of the Qur’an which, as demonstrated above, is full of metaphor anyway. At the same time, all believing Muslims, not just those described as ‘fundamentalists’, have continued to see the Qur’an as the eternal, unmediated word of God. As Ruthven has pointed out:

The focus for those seeking to defend Islam against what they see as the corrupting effects of modern secularism and the ‘West’ is action rather than belief. This agenda, however novel its methods of application (including the adoption of terrorist methods), generally accords with long-established historical patterns. Throughout history Islamic rectitude has tended to be defined in relation to practice rather than doctrine. … It is in enforcing behavioural conformity (orthopraxy) rather than in doctrinal conformity (orthodoxy) that Muslim radicals or activists look to a ‘restoration’ of Islamic law backed by the power of the state.

The means adopted towards achieving this end, however, varied greatly according to the political and institutional contexts of the countries in which it took shape and form. In Jordan, Muslim radicals sat as parliamentary representatives, but the democratic system was adopted purely as a means to an end, after which it would be rejected; in Algeria, and to a lesser extent in Egypt, they were involved in armed conflict with the state; in Pakistan and more recently in Sudan, they exercised power on the backs of military dictatorships; in Iran they operated under a hybrid system, sitting as parliamentary representatives chosen from a restricted list of like-minded candidates. Most ‘militant’ Muslims challenged the fundamentals of the international order. They aimed to replace the sovereignty of the people, expressed through parliamentary law-makers, with the ‘sovereignty of God’ as revealed, in its perfection and finality, through the Shari’a law. The many critics of this approach directed their fire at two of its arguments. Firstly, they pointed out that, historically, no Islamic society, even during the high tide of the Ottoman Empire, was governed exclusively according to Shari’a law. There was always a gap between the theoretical formulations of the jurists and the de facto exercise of political power. Moreover, there was always an enormous diversity among Muslim societies, so that everywhere Islamic law was supplemented by local customary laws. Secondly, those who insisted on politicising Islam were charged with misrepresentation of the faith. Far from drawing exclusively upon Quranic teaching, the ideologies being advanced were hybrids, mixing Islamic ideas with modern totalitarian ones.

Therefore, to refer to modern political Islam as ‘radical’ or ‘fundamentalist’ is not only misleading but also makes a gratuitous concession to its advocates by implying that the defence of the ‘roots’ or ‘fundamentals’ of Islam invariably demands political action. Muslims who contest this view argue that as long as a government does not prevent the believer from carrying out his or her religious duties, it cannot be described as anti-Islamic.

The Growth of Islam & Islamophobia in Europe to 2015:

When the 2001 Census for England and Wales was published the following year, a Times journalist made comments about likely future immigration which were denounced in the House of Commons by the Home Secretary, David Blunkett, as bordering on fascism. The next census, taken in 2011 and published at the end of 2012, showed, however, that very major ethnic changes had taken place over the decade. But there were equally striking findings about the changing religious make-up of Britain. For instance, it revealed that almost every belief was on the rise except Christianity. Since the previous census, the number of people identifying themselves as Christian had fallen from seventy-two per cent to fifty-nine per cent. The number of Christians in England and Wales had dropped by nearly four million, from thirty-seven million to thirty-three million. But while Christianity witnessed this huge collapse in its ‘professing’ followers, one which was only expected to continue, mass migration had led to a dramatic increase in the Muslim population. Between 2001 and 2011 the number of Muslims in England and Wales rose from 1.5 million to 2.7 million. Moreover, the beliefs and values of these recent immigrants were more socially conservative than those of the majority of the population. A Gallup survey conducted in 2009 found that none of the five hundred British Muslims interviewed thought that homosexuality was morally acceptable. Seven years later, another survey found that more than half (52%) of British Muslims thought that homosexuality should be made illegal. The common response was that these had been the attitudes of many indigenous British people a generation or two previously.

More serious threats to community cohesion were posed by the attitudes of some Muslim communities towards women and teenage girls. From the early 2000s onwards, stories and evidence emerged of organised grooming of often underage girls by gangs of men of Pakistani ‘heritage’ in towns in the north of England. A 2004 television documentary on social services in Bradford had its screening postponed after self-proclaimed ‘anti-fascists’ and local police chiefs appealed to Channel Four to drop it. The sections that dealt with the sexual exploitation of ‘white’ girls by ‘Asian’ gangs were thought to be potentially inflammatory, especially ahead of local elections in which the ‘ultra-right’ British National Party was standing. But everything about this case provided a microcosm of a problem, and a reaction, which would shortly spread across Europe. Campaigning on, or even mentioning, the issue of grooming during those years brought with it terrible animosity towards those who did so. When the northern Labour MP Ann Cryer took up the issue of the rape of underage girls in her own constituency, she was swiftly and widely denounced as an ‘Islamophobe’ and a ‘racist’, and at one stage had to receive police protection. It took years for the central government, the police, local authorities and the Crown Prosecution Service to face up to the issue. When they finally did so, an official enquiry into abuse in the town of Rotherham alone revealed the sexual exploitation of 1,400 children over the period 1997-2014. The victims were all non-Muslim girls from the local community, the youngest of whom was eleven. The enquiry found that because almost all the men were of Pakistani ‘heritage’, the staff at the local council had described their…

… nervousness about identifying the ethnic origins of perpetrators for fear of being thought racist; others remembered clear direction from their managers not to do so.

To make matters worse, the communities from which the men came, by then well-established in the town, showed no willingness to confront the problem and every desire to cover it up. Even at the courts, after sentencing, families of the accused claimed that the whole thing was a government ‘stitch-up’ of some kind. Those Muslims who did speak out against the abuse carried out by members of their own communities received death threats from fellow British Muslims for doing so. The judges who eventually presided over the trials summed up the evidence by stating that the girls were chosen because they were from different communities, non-Muslim and therefore regarded as ‘easy meat’. Many of these men had brought ideas about women, and especially about unaccompanied or ‘unprotected’ women, with them from Pakistan and other patriarchal Muslim cultures. However, in the face of such attitudes being expressed towards women in the United Kingdom, the British state in all its agencies was clearly culpable in failing to uphold the law of the land and the norms of British society. The British police remained scarred by the Macpherson Report of 1999, which had charged them with ‘institutional racism’, and feared any repeat of such findings.

At the same time, over the course of the 2000s, criticisms of extreme examples of ‘multiculturalism’ in Britain and ‘political correctness’ came from politicians on the left as well as the right. These ‘breakages’, as Douglas Murray has described them, also came from those of ‘ethnic’ backgrounds, like Trevor Phillips, a former National Union of Students colleague of mine, who opened up territory that others had not dared to walk in. His realisation that the race-relations industry was part of the problem, and that partly as a result of talking up ‘diversity’ the country was ‘sleepwalking to segregation’, was an insight that others began to share, not just in Britain, but also across the continent. The emergence of Ahmed Aboutaleb and Ayaan Hirsi Ali in Holland, Nyamko Sabuni in Sweden, Naser Khader in Denmark and Magdi Allam in Italy had a palpably liberating effect. All spoke from within their communities, with varying degrees of success, to countries that needed people to do so. In each country, the issues of ‘honour’ killings and female genital mutilation received massive attention. The era of multiculturalism quietly transformed itself into the era of ‘multifaithism’. Ethnic identity began to recede and faith identity, which to many people outside the faith communities seemed to have come from nowhere, instead became the crucial issue. What had been a question of blacks or Caribbeans, North Africans and Pakistanis now became a question of relations between Christian, Jewish and Muslim ‘cultures’.

Everywhere in Europe concerns over the integration of faith-based immigrant cultures were growing. During these decades in which European governments allowed immigration to run at the levels they did, few if any expected that they would spend the foreseeable future trying to balance Islamic laws and demands with European culture and traditions. Yet, as immigrant populations grew, everywhere the same problems erupted. Sometimes this happened because of the discovery of what was going on within the new immigrant communities. In the United Kingdom, for example, the police were forced to admit that they had failed to investigate scores of suspicious deaths of young Muslim women because they had thought these potential ‘honour killings’ were community matters. In 2006 the British Medical Association reported that at least 74,000 women in Britain had been subjected to genital mutilation.

Nobody flinched in 2015 at a passing mention in a piece in The Atlantic magazine of Europe’s endless, debilitating blasphemy wars. Despite a couple of decades of warnings, from the Rushdie affair onwards, no one in any position of authority or power had prepared for the possibility of the wave of events that followed. Before that affair, no one had thought of blasphemy as a Muslim issue. No one in Britain had thought that those arriving might not only prove much harder to integrate than the Pakistani Muslims and East African ‘Asians’ of the sixties and seventies, but that they would also bring many socially conservative attitudes with them, or that other religious and ethnic minorities, such as the Jews, might be the first victims of such a lack of foresight. No one in a position of authority had ever predicted that an upsurge in immigration would lead to an increase in anti-Semitism and homophobia. No-one in the post-Christian West, even the religiously literate, had foreseen that ‘blasphemy’ would again become one of the major cultural and security issues of early twenty-first-century Europe. Those who had warned about it in public had been ignored, defamed, dismissed, prosecuted or physically attacked. What mainstream politicians and much of the media had done, from the 1990s to the 2010s, was to encourage a sense that the people in Europe who were shouting ‘fire’ were the actual arsonists, fanning the flames of Islamophobia rather than seeking to extinguish them. Three decades after the Rushdie affair changed the world, there was almost no one in Europe who would dare write a novel, compose a piece of music or even draw a mildly satirical image that might risk Muslim anger. We went out of our way to show how much we admired Islam but did not apply to it the same rigorous standards of criticism that secular society had applied to Christianity decades earlier.

(to be continued… )

Primary Sources:

W. Owen Cole (ed.) (1982), World Religions: A Handbook for Teachers. London: The Commission for Racial Equality.

Luc Heymans (ed.) (1989), Trans Europe Peace: Linking bulletin for Peace Education movements among the EEC State members, no. 3, February 1989. Namur: Université de Paix.


Posted April 1, 2019 by TeamBritanniaHu in Africa, Anti-racism, anti-Semitism, Arabs, Asia, Assimilation, Belfast, Birmingham, Britain, British history, Brussels, Caribbean, Christian Faith, Christianity, Church, Civil Rights, Civilization, Cold War, Commemoration, Commonwealth, Communism, Conquest, Coventry, decolonisation, Discourse Analysis, Egypt, Empire, eschatology, Ethnicity, Europe, France, History, homosexuality, Hungary, hygeine, Immigration, Integration, Jews, Marxism, Middle East, Migration, monotheism, multiculturalism, Narrative, Population, Racism, Respectability, Social Service, Statehood, Syria, terror, terrorism, theology, Turkey, Uncategorized, United Kingdom, Warfare, West Midlands, xenophobia


You Only Live Twice – Cool Britannia to Cold Brexit: The United Kingdom, 1999-2019. Part One: Economics, Culture & Society.   Leave a comment

Map: Europe without the UK

Cold Shoulder or Warm Handshake?

On 29 March 2019, the United Kingdom of Great Britain and Northern Ireland will leave the European Union after forty-six years of membership, since it joined the European Economic Community on 1 January 1973, on the same day and at the same hour as the Republic of Ireland. Yet in 1999, it looked as if the long-standing debate over Britain’s membership had been resolved. The Maastricht Treaty establishing the European Union had been signed by all the member states of the preceding European Community in February 1992 and was succeeded by a further treaty, signed in Amsterdam in 1997, which came into force in 1999. What, then, has happened in the space of twenty years to so fundamentally change the ‘settled’ view of the British Parliament and people, bearing in mind that both Scotland and Northern Ireland voted to remain in the EU, while England and Wales both voted to leave? At the time of writing, the manner of our going has not yet been determined, but the invocation of ‘article fifty’ by the Westminster Parliament and the UK government means that the date has been set. So either we will have to leave without a deal, turning a cold shoulder to our erstwhile friends and allies on the continent, or we will finally ratify the deal agreed with the EU Commission, on behalf of the twenty-seven remaining member states, and leave with a warm handshake and most of our trading and cultural relations intact.

As yet, the possibility of a second referendum – or third, if we take into account the 1975 referendum, called by Harold Wilson, which was also a binary leave/ remain decision – seems remote. In any event, it is quite likely that the result would be the same and would kill off any opportunity of the UK returning to EU membership for at least another generation. As Ian Fleming’s James Bond tells us, ‘you only live twice’. That certainly seems to be the mood in Brussels too. I was too young to vote in 1975 by just five days, and another membership referendum would be unlikely to occur in my lifetime. So much has been said about following ‘the will of the people’, or at least 52% of them, that it would be a foolish government, in an age of rampant populism, that chose to revoke article fifty, even if Westminster voted for this. At the same time, and in that same populist age, we know from recent experience that in politics and international relations, nothing is inevitable…

Image: A referendum ballot box

One of the major factors in the 2016 Referendum Campaign was the country’s public spending priorities, compared with those of the European Union. The ‘Leave’ campaign sent a double-decker bus around England stating that by ending the UK’s payments into the EU, more than 350 million pounds per week could be redirected to the National Health Service (NHS).

A British Icon Revived – The NHS under New Labour:

To understand the power of this statement, it is important to recognise that the NHS is unique in Europe in that it is wholly funded from direct taxation, and not via National Insurance, as in many other European countries. As a service created in 1948 to be ‘free at the point of delivery’, it is seen as a ‘British icon’, and funding has been a central issue in national election campaigns since 2001, when Tony Blair was confronted by an irate voter, Sharon Storer, outside a hospital. In its first election manifesto of 1997, ‘New Labour’ promised to ‘safeguard the basic principles of the NHS, which we founded’. The ‘we’ here was the post-war Labour government, whose socialist Health Minister, Aneurin Bevan, had established the service in the teeth of considerable opposition from within both parliament and the medical profession. ‘New Labour’ protested that under the Tories there had been fifty thousand fewer nurses but a rise of no fewer than twenty thousand managers – ‘red tape’ which Labour would pull away and burn. Though critical of the internal markets the Tories had introduced, Blair promised to keep a split between those who commissioned health services and those who provided them.


Under Frank Dobson, Labour’s new Health Secretary, there was little reform of the NHS, but there was, year by year, just enough extra money to stave off the winter crises. But then a series of tragic individual cases hit the headlines, and one of them came from a Labour peer and well-known medical scientist and fertility expert, Professor Robert Winston, who was greatly admired by Tony Blair. He launched a furious denunciation of the government over the treatment of his elderly mother. Far from upholding the NHS’s iconic status, Winston said that Britain’s health service was the worst in Europe and was getting worse under the New Labour government, which was being deceitful about the true picture. Labour’s polling on the issue showed that, in the view of the country as a whole, Winston was, in general terms, correct in his assessment. In January 2000, therefore, Blair announced directly to the country that he would bring Britain’s health spending up to the European average within five years. That was a huge promise because it meant spending a third as much again in real terms, and his ‘prudent’ Chancellor of the Exchequer, Gordon Brown, was unhappy that Blair had not spoken enough on television about the need for health service reform to accompany the money, and had also ‘stolen’ his budget announcements. On Budget day itself, Brown announced that until 2004 health spending would rise at above six per cent beyond inflation every year, …

… by far the largest sustained increase in NHS funding in any period in its fifty-year history … half as much again for health care for every family in this country.       

The tilt away from Brown’s sharp spending controls during the first three years of the New Labour government had begun by the first spring of the new millennium, and there was more to come. With a general election looming in 2001, Brown also announced a review of the NHS and its future by a former banker, Derek Wanless. As soon as the election was over, broad hints about necessary tax rises were dropped. When the Wanless Report was finally published, it confirmed much that the winter crisis of 1999-2000 had exposed. The NHS was not, whatever Britons fondly believed, better than health systems in other developed countries, and it needed a lot more money. ‘Wanless’ also rejected a radical change in funding, such as a switch to insurance-based or semi-private health care. Brown immediately used this as objective proof that taxes had to rise in order to save the NHS. In his next budget of 2002, Brown broke with a political convention which had reigned since the mid-eighties: that direct taxes would not be raised again. He raised a special one per cent national insurance levy, equivalent to a penny on income tax, to fund the huge reinvestment in Britain’s health.

Public spending shot up with this commitment and, in some ways, it paid off, since by 2006 there were around 300,000 extra NHS staff compared to 1997. That included more than ten thousand extra senior hospital doctors (about a quarter more) and 85,000 more nurses. But there were also nearly forty thousand extra managers, twice the increase for which Blair and Brown had ridiculed the Tory government. An ambitious computer project for the whole NHS became an expensive catastrophe. Meanwhile, the health service budget rose from thirty-seven billion to more than ninety-two billion pounds a year. The investment produced results, with waiting lists, a source of great public anger from the mid-nineties, falling by 200,000. By 2005, Blair was able to talk of the best waiting list figures since 1988. Hardly anyone was left waiting more than six months for an inpatient appointment. Death rates from cancer for people under the age of seventy-five fell by 15.7 per cent between 1996 and 2006, and death rates from heart disease fell by just under thirty-six per cent. Meanwhile, the private finance initiative meant that new hospitals were being built around the country. But, unfortunately for New Labour, that was not the whole story of the Health Service under their stewardship. As Andrew Marr has attested,

…’Czars’, quangos, agencies, commissions, access teams and planners hunched over the NHS as Whitehall, having promised to devolve power, now imposed a new round of mind-dazing control.

By the autumn of 2004, hospitals were subject to more than a hundred inspections. War broke out between Brown’s Treasury and the ‘Blairite’ Health Secretary, Alan Milburn, over the basic principles of running the hospitals. Milburn wanted more competition between them, but Brown did not see how this was possible when most people had only one major local hospital. Polling suggested that Brown was making the more popular point: most people simply wanted better hospitals, not more choice. A truce was eventually declared with the establishment of a small number of independent ‘foundation’ hospitals. By the 2005 general election, Michael Howard’s Conservatives were attacking Labour for wasting money and allowing people’s lives to be put at risk in dirty, badly run hospitals; just as Labour once had, they were promising to cut bureaucracy and the number of organisations within the NHS. By the summer of 2006, despite the huge injection of funds, the Service was facing a cash crisis. Although the shortfall was not huge as a percentage of the total budget, trusts in some of the most vulnerable parts of the country, from Hartlepool to Cornwall and across to London, were on the edge of bankruptcy. Throughout Britain, seven thousand jobs had gone, and the Royal College of Nursing, the professional association to which most nurses belonged, was predicting that thirteen thousand more would go soon. Many newly and expensively qualified doctors, and even specialist consultants, could not find work. It seemed that wage costs, expensive new drugs, poor management and the money poured into endless bureaucratic reforms had resulted in a still inadequate service. Bupa, the leading private operator, had covered some 2.3 million people in 1999; six years later, the figure was more than eight million. This partly reflected greater affluence, but it was hardly a resounding vote of confidence in Labour’s management of the NHS.

Public Spending, Declining Regions & Economic Development:

As public spending began to flow during the second Blair administration, vast amounts of money went on pay rises, new bureaucracies and bills for outside consultants. Ministries, unused to spending again after the initial period of ‘prudence’, did not always do it well. Brown and his Treasury team resorted to double and triple counting of early spending increases in order to give the impression they were doing more for hospitals, schools and transport than they actually could. As Marr has pointed out, …

… In trying to achieve better policing, more effective planning, healthier school food, prettier town centres and a hundred other hopes, the centre of government ordered and cajoled, hassled and harangued, always high-minded, always speaking for ‘the people’.  

The railways, after yet another disaster, were shaken up again. In very controversial circumstances, Railtrack, the once-profitable monopoly company operating the lines, was driven into bankruptcy and a new system of Whitehall control was imposed. At one point, Tony Blair boasted of having five hundred targets for the public sector. Parish councils, small businesses and charities found themselves loaded with directives; schools and hospitals had many more. Marr has commented, …

The interference was always well-meant but it clogged up the arteries of free decision-taking and frustrated responsible public life. 

002

Throughout the New Labour years, with steady growth and low inflation, most of the country grew richer. Growth since 1997, at 2.8 per cent per year, was above the post-war average; GDP per head was above that of France and Germany, and the country had the second lowest jobless figures in the EU. The number of people in work increased by 2.4 million. Incomes grew, in real terms, by about a fifth. Pensions were in trouble, but house prices soared, so homeowners found their properties more than doubling in value and came to think of themselves as prosperous. By 2006, analysts were assessing the disposable wealth of the British at forty thousand pounds per household. However, the wealth was not spread evenly across the country, averaging sixty-eight thousand in the south-east of England but a little over thirty thousand in Wales and north-east England (see map above). Even in the historically poorer parts of the UK, though, house prices had risen so fast that government plans to bulldoze worthless northern terraces had to be abandoned when they started to regain value. Cheap mortgages, easy borrowing and high property prices meant that millions of people felt far better off, despite the overall rise in the tax burden. Cheap air travel gave the British opportunities for easy travel both to traditional resorts and to every part of the European continent. British expatriates were able to buy properties across the French countryside and in southern Spain. Some even began to commute weekly to jobs in London or Manchester from Mediterranean villas, and regional airports boomed as a result.

Sir Tim Berners-Lee arriving at the Guildhall to receive the Honorary Freedom of the City of London.

The World-Wide Web, ‘invented’ by the British computer scientist Tim Berners-Lee at the end of 1989 (pictured right in 2014), was advancing from the colleges and institutions into everyday life by the mid-‘noughties’. It first began to attract popular interest in the mid-nineties: Britain’s first internet café and magazine, reviewing a few hundred early websites, were both launched in 1994. The following year saw the beginning of internet shopping as a major pastime, with both ‘eBay’ and ‘Amazon’ arriving, though to begin with they attracted only tiny numbers of people.

But the introduction of new forms of mail-order and ‘click and collect’ shopping quickly attracted significant numbers of adherents across different ‘demographics’. The growth of the internet bred a feeling of optimism, despite warnings, taken seriously at the time, that the whole digital world would collapse because computers could not cope with the last two digits of the year 2000. In fact, the ‘dot-com’ bubble was burst by its own excessive expansion, as with any bubble, and, following a pause and a lot of ruined dreams, the ‘new economy’ roared on again. By 2000, according to the Office for National Statistics (ONS), around forty per cent of Britons had accessed the internet at some time. Three years later, nearly half of British homes were ‘online’. By 2004, the spread of ‘broadband’ connections had brought a new mass market in ‘downloading’ music and video. By 2006, three-quarters of British children had internet access at home.

001

Simultaneously, the rich of America, Europe and Russia began buying up parts of London, and then other ‘attractive’ parts of the country, including Edinburgh, the Scottish Highlands, Yorkshire and Cornwall. ‘Executive housing’ with pebbled driveways, brick facing and dormer windows grew across farmland and by rivers with no thought of flood-plain constraints. Parts of the country far from London, such as the English south-west and Yorkshire, enjoyed a ripple of wealth that pushed their house prices to unheard-of levels. From Leith to Gateshead, Belfast to Cardiff Bay, once-derelict shorefront areas were transformed. The nineteenth-century buildings of the Albert Dock in Liverpool (above) now house a maritime museum, an art gallery, a shopping centre and a television studio, and the dock has become a tourist attraction in its own right. For all the problems and disappointments, and the longer-term difficulties with their financing, new schools and public buildings sprang up – new museums, galleries, vast shopping complexes (see below), corporate headquarters in a biomorphic architecture of glass and steel, more imaginative and better-looking than their predecessors from the dreary age of concrete.

002

Supermarket chains exercised huge market power, bringing cheap meat and dairy products within almost everyone’s budget. Factory-made ready-meals were flown in via the new global air-freight market and carried by refrigerated lorries moving freely across a Europe shorn of internal barriers. Out-of-season fruit and vegetables, fish from the Pacific, exotic foods of all kinds and freshly cut flowers appeared in superstores everywhere. Hardly anyone was out of reach of a ‘Tesco’, a ‘Morrisons’, a ‘Sainsbury’s’ or an ‘Asda’. By the mid-noughties, the four supermarket giants owned more than 1,500 superstores throughout the UK, and they spread the consumption of goods that in the eighties and nineties had seemed like luxuries. Students now had to take out loans in order to go to university, but they were far more likely to do so than previous generations, as well as to travel more widely on a ‘gap’ year, not just to study or work abroad.

Those ‘Left Behind’ – Poverty, Pensions & Public Order:

Materially, for the majority of people, this was, to use Marr’s term, a ‘golden age’, which perhaps helps to explain why real anger about earlier pension decisions and stealth taxes did not translate into anti-Labour voting in successive general elections. The irony is that in pleasing ‘Middle Englanders’, the Blair-Brown government lost contact with traditional Labour voters, especially in the North of Britain, who did not benefit from these ‘golden years’ to the same extent. Gordon Brown, from the first, made much of New Labour’s anti-poverty agenda, especially on child poverty, which, since the launch of the Child Poverty Action Group, had become a particularly emotive issue. Labour policies took a million children out of relative poverty between 1997 and 2004, though the numbers rose again later. Brown’s emphasis was on the working poor and the virtue of work, so his major innovations were the national minimum wage, the ‘New Deal’ for the young unemployed, and the working families’ tax credit, as well as tax credits aimed at children. There was also a minimum income guarantee, and later a pension credit, for poorer pensioners.

The minimum wage was first set at three pounds sixty an hour, rising year by year; by 2006 it was £5.35 an hour. Because the figures were low, it did not destroy two million jobs, as the Tories had claimed it would. Neither did it produce higher inflation: employment continued to grow while inflation remained low. It even seemed to have cut red tape. By the mid-noughties, the minimum wage covered two million people, the majority of them women. Because it was updated ahead of rises in inflation, the wages of the poor also rose faster. It was so successful that even the Tories were forced to embrace it ahead of the 2005 election. The New Deal was funded by a windfall tax on the privatised utility companies; by 2000, Blair said it had helped a quarter of a million young people back into work, and it was still being claimed as a major factor in lower rates of unemployment as late as 2005. But the National Audit Office, looking back on its effect in the first parliament, reckoned the number of under twenty-five-year-olds helped into real jobs was as low as 25,000, at a cost per person of eight thousand pounds. A second initiative was targeted at the babies and toddlers of the most deprived families. ‘Sure Start’ was meant to bring mothers together in family centres across Britain – 3,500 were planned for 2010, ten years after the scheme’s launch – and to help them become more effective parents. However, some of the most deprived families failed to show up. As Andrew Marr wrote, back in 2007:

Poverty is hard to define, easy to smell. In a country like Britain, it is mostly relative. Though there are a few thousand people living rough or who genuinely do not have enough to keep them decently alive, and many more pensioners frightened of how they will pay for heating, the greater number of poor are those left behind the general material improvement in life. This is measured by income compared to the average and by this yardstick in 1997 there were three to four million children living in households of relative poverty, triple the number in 1979. This does not mean they were physically worse off than the children of the late seventies, since the country generally became much richer. But human happiness relates to how we see ourselves relative to those around us, so it was certainly real. 

The Tories, now under new management in the shape of David Cameron, a media-marketing executive and Old Etonian, also declared that they believed in this concept of relative poverty. After all, it was on their watch, during the Thatcher and Major governments, that it had tripled, which is perhaps why it was only towards the end of the New Labour governments that they could accept the definition used by the left-of-centre Guardian columnist, Polly Toynbee. A world of ‘black economy’ work also remained below the minimum wage, in private care homes, where migrant workers were exploited, and in other nooks and crannies; some 336,000 jobs remained on ‘poverty pay’ rates. Yet ‘redistribution of wealth’, a socialist phrase which had become unfashionable under New Labour lest it scare away middle Englanders, was stronger in Brown’s Britain than in other major industrialised nations. Despite the growth of the super-rich, many of whom were immigrants anyway, overall equality increased in these years. One factor in this was the return to means-testing of benefits, particularly for pensioners and through the working families’ tax credit, subsequently divided into a child tax credit and a working tax credit. This was a U-turn by Gordon Brown, who had opposed means-testing when in Opposition; as Chancellor, he concluded that if he was to direct scarce resources at those in real poverty, he had little choice.

Apart from its demoralising effect on pensioners, the other drawback of means-testing was that a huge bureaucracy was needed to track people’s earnings and to establish exactly what they should be getting in benefits. Billions were overpaid, and as people did better and earned more from more stable employment, they found themselves facing huge demands to hand back money they had already spent. Thousands of extra civil servants were needed to deal with the subsequent complaints, and the scheme became extremely expensive to administer. There were also controversial drives to oblige more disabled people back to work, and the ‘socially excluded’ were confronted by a range of initiatives designed to make them more middle class. Compared with Mrs Thatcher’s ‘Victorian Values’ and Mr Major’s ‘Back to Basics’ campaigns, Labour was supposed to be non-judgemental about individual behaviour, but a form of moralism did begin to reassert itself. Parenting classes were sometimes mandated through the courts and, for the minority who made life hell for their neighbours on housing estates, Labour introduced the Anti-Social Behaviour Order (‘Asbo’). These were first issued in 1998, granted by magistrates on application from either the police or the local council, and it became a criminal offence to break the curfew or other sanction imposed, which could be highly specific. Asbos could be given out for swearing at others in the street, harassing passers-by, vandalism, making too much noise, graffiti, organising ‘raves’, flyposting, taking drugs, sniffing glue, joyriding, prostitution, hitting people and drinking in public.

001 (2)

Although they served a useful purpose in many cases, there were fears that for the really rough elements in society, and their tough children, Asbos became a badge of honour. Since breaking an Asbo could result in an automatic prison sentence, people were sent to jail for behaviour that had not previously warranted it. But as they were refined in use and strengthened, they became more effective and routine. By 2007, seven and a half thousand had been issued in England and Wales alone; Scotland had introduced its own version in 2004. Some civil liberties campaigners saw this development as part of a wider authoritarian and surveillance agenda, which also led to the widespread use of CCTV (closed-circuit television) cameras by the police and private security guards, especially in town centres (see above). In 2007, it was estimated that the British were being observed and recorded by 4.2 million such cameras: one camera for every fourteen people, a higher ratio than in any other country in the world, with the possible exception of China. In addition, the number of mobile phones was already equivalent to the number of people in Britain. With global-positioning (GPS) chips, these could show exactly where their users were, and the use of such systems in cars, and even out on the moors, meant that Britons were losing their age-old prowess for map-reading.

002

003

The ‘Seven Seven’ Bombings – The Home-grown ‘Jihadis’:

Despite these increasing means of mass surveillance, Britain’s cities have remained vulnerable to terrorist attacks, more recently by so-called ‘Islamic terrorists’ rather than by the Provisional IRA, who abandoned their bombing campaign in 1998. On 7 July 2005, at rush-hour, four young Muslim men from West Yorkshire and Buckinghamshire murdered fifty-two people and injured 770 others by blowing themselves up on London Underground trains and on a London bus. The official report into this, the worst such attack on British soil, later concluded that the bombers were not part of an al Qaeda cell, though two of them had visited camps in Pakistan, and that the rucksack bombs had been constructed at a cost of a few hundred pounds. Despite the government’s insistence that the war in Iraq had not made Britain more of a target for terrorism, the Home Office investigation asserted that the four had been motivated, in part at least, by ‘British foreign policy’.

They had picked up the information they needed for the attack from the internet. It was a particularly grotesque attack because of the terrifying and bloody conditions in the underground tunnels, and it vividly reminded the country that it was as much a target as the United States or Spain. Indeed, the long-standing and intimate relationship between Great Britain and Pakistan, with constant and heavy air traffic between them, provoked fears that the British would prove uniquely vulnerable. Tony Blair heard of the attack at the most poignant time, just after London’s great success in winning the bid to host the 2012 Olympic Games (see above). The ‘Seven Seven’ bombings are unlikely to have been stopped by CCTV surveillance, of which there was plenty at the tube stations, nor by ID cards (which had recently been under discussion), since the killers were British subjects, nor by financial surveillance, since little money was involved and the materials were paid for in cash. Even better intelligence might have helped, but the Security Services, ‘MI5’ and ‘MI6’ as they are known, were already in receipt of huge increases in their budgets as they tracked down other murderous cells. In August 2006, police arrested suspects in Birmingham, High Wycombe and Walthamstow, in east London, believing there was a plot to blow up as many as ten passenger aircraft over the Atlantic.

After many years of granting asylum in London to dissident clerics and activists from the Middle East, Britain had more than its share of inflammatory and dangerous extremists who admired al Qaeda and preached violent jihad. Once 11 September 2001 had changed the climate, new laws were introduced to allow the detention without trial of foreigners suspected of supporting or fomenting terrorism; they could not be deported because human rights legislation forbade sending anyone back to a country where they might face torture. Seventeen were picked up and held at Belmarsh high-security prison. But in December 2004, the House of Lords ruled that these detentions were discriminatory and disproportionate, and therefore illegal. Five weeks later, the Home Secretary, Charles Clarke, hit back with ‘control orders’ to limit the movement of men he could not prosecute or deport; these orders would also be used against home-grown terror suspects. A month later, in February 2005, sixty Labour MPs rebelled against these powers too, and the government only narrowly survived the vote. In April 2006, a judge ruled that the control orders were an affront to justice because they gave the Home Secretary, a politician, too much power. Two months later, the same judge ruled that curfew orders of eighteen hours per day imposed on six Iraqis amounted to a deprivation of liberty and were also illegal. The new Home Secretary, John Reid, lost his appeal and had to loosen the orders.

006

Britain found itself in a struggle between its old laws and liberties and a new, borderless world in which the hallowed principles of habeas corpus, free speech, the presumption of innocence, asylum, the right of British subjects to travel freely in their own country without identifying papers, and the sanctity of the homes of the law-abiding were all placed in increasing jeopardy. To government ministers, the new powers seemed the least they needed to deal with a threat that might last another thirty years, in order, paradoxically, to secure Britain’s liberties for the longer term. They were sure that most British people agreed, and that the judiciary, media, civil rights campaigners and elected politicians who protested were an ultra-liberal minority. Tony Blair, John Reid and Jack Straw were emphatic about this, and it was left to liberal Conservatives and the Liberal Democrats to mount the barricades in defence of civil liberties. Andrew Marr conceded at the time that the New Labour ministers were ‘probably right’, and with the benefit of hindsight others will probably agree. As Gordon Brown eyed the premiership, his rhetoric was similarly tough; but as Blair was forced to turn to the ‘war on terror’ and Iraq, he failed to concentrate enough on domestic policy. By 2005, neither man could be bothered to disguise their mutual enmity, as pictured above. A gap seemed to open up between Blair’s enthusiasm for market ideas in the reform of health and schools and Brown’s determination to deliver better lives for the working poor. Brown was also keen on bringing private capital into public services, but there was a difference in emphasis which both men played up. Blair claimed that the New Labour government was best ‘when we are at our boldest’; Brown retorted that it was best ‘when we are Labour’.

002 (2)

Tony Blair’s legacy continued to be paraded on the streets of Britain, here blaming him and George Bush for the rise of ‘Islamic State’ in Iraq.

Asylum Seekers, EU ‘Guest’ Workers & Immigrants:

One result of the long Iraqi conflict, in which President Bush declared ‘major combat operations’ over on 1 May 2003, was the arrival of many Iraqi asylum-seekers in Britain: Kurds as well as Shiites and Sunnis. This attracted little comment at the time, because there had been both Iraqi and Iranian refugees in Britain since the 1970s, especially as students, and the fresh influx was only a small part of a much larger migration into the country which changed it fundamentally during the Blair years. This was a multi-lingual migration, including many Poles, some Hungarians and other Eastern Europeans whose countries had joined the EU and its single market in 2004. When the EU expanded, Britain decided that, unlike France or Germany, it would not try to delay opening the country to migrant workers. The accession treaties gave nationals from these countries the right to freedom of movement and settlement, and with average earnings three times higher in the UK, this was a benefit which Eastern Europeans were keen to take advantage of. Some member states, however, exercised their right to ‘derogation’ from the treaties, whereby they would only permit migrant workers to be employed if employers were unable to find a local candidate. In European Union legislation, a derogation means that a member state has opted not to enforce a specific provision of a treaty due to internal circumstances; here, it allowed full implementation of the free-movement provisions to be delayed for five years. The UK decided not to exercise this option.

There were also sizeable inflows of western Europeans, though these were mostly students, who (somewhat controversially) were also counted in the immigration statistics, and young professionals with multi-national companies. At the same time, there was continued immigration from Africa, the Middle East and Afghanistan, as well as from Russia, Australia, South Africa and North America. In 2005, according to the Office for National Statistics, ‘immigrants’ were arriving to live in Britain at the rate of 1,500 a day. Since Tony Blair had been in power, more than 1.3 million had arrived. By the mid-2000s, English was no longer the first language of half the primary school children in London, and the capital had more than 350 different first languages. Five years later, the same could be said of many towns in Kent and other Eastern counties of England.

The poorer of the new migrant groups were almost entirely unrepresented in politics, but they radically changed the sights, sounds and scents of urban Britain, and even some of its market towns. Veiled women from the more traditionalist Arab, Afghan and Pakistani communities became common sights on the streets, from Kent to Scotland and across to South Wales. Polish tradesmen, fruit-pickers and factory workers were soon followed by shops owned by Poles, or stocking Polish and East European delicacies and selling Polish newspapers and magazines. Road signs even appeared in Polish, though in Kent these were mainly put in place along trucking routes used by Polish drivers, where for many years signs had been in French and German, a recognition of the employment changes in the long-distance haulage industry. Even as far north as Cheshire (see below), such signs were put in place to help monolingual truckers using the trunk roads, rather than local Polish residents, most of whom had enough English to understand them either upon arrival or shortly afterwards. Although specialist classes in English had to be laid on in schools and community centres, there was little evidence that the presence of multi-lingual migrant children had a damaging long-term impact on local children and wider communities. In fact, schools were soon reporting a positive impact on attitudes toward learning and on general educational standards.

001

Problems were posed, however, by the operations of people-smugglers and criminal gangs. Chinese villagers were involved in one particular tragedy, when nineteen of them were caught by the notorious tides while cockle-picking in Morecambe Bay and drowned. Many more were working for ‘gang-masters’ as virtual, and in some cases actual, ‘slaves’. Russian voices became common on the London Underground, and among prostitutes on the streets. The British Isles found themselves to be ‘islands in the stream’ of international migration, the chosen ‘sceptred isle’ destinations of millions of newcomers. Unlike Germany, Britain was no longer a dominant manufacturing country; by the late twentieth century it had become, rather, a popular place to develop digital and financial products and services. Together with the United States, it had stood against the Soviet Union in defence of representative democracy and the free market. Within the EU, Britain maintained its earlier determination to resist the Franco-German federalist model, with its ‘social chapter’ involving ever tighter controls over international corporations and ever closer political union. Britain had always gone out into the world. Now, increasingly, the world came to Britain, whether as poor immigrants, rich corporations or Chinese manufacturers.

005

Multilingual & Multicultural Britain:

Immigration had always been a constant factor in British life; now it was also a fact of life which Europe and the whole world had to come to terms with. Earlier post-war migrations to Britain had provoked a racialist backlash, riots and the rise of extreme right-wing organisations, and new laws had been passed to control both immigration from the Commonwealth and the backlash to it. The later migrations were controversial in different ways. The ‘Windrush’ arrivals from the Caribbean and those from the Indian subcontinent were people who looked different but who spoke the same language and, in many ways, had had a similar education to that of the ‘native’ British. Many of the later migrants from Eastern Europe looked similar to the white British but shared little by way of a common linguistic and cultural background. However, it is not entirely true to suggest, as Andrew Marr seems to, that they did not have a shared history. Certainly, through no fault of their own, the Eastern Europeans had been cut off from their western counterparts by their absorption into the Soviet Russian Empire after the Second World War, but in the first half of the century Poland had helped the British Empire to subdue its greatest rival, Germany, as had most of the peoples of the former Yugoslavia. Even during the Soviet ‘occupation’ of these countries, many of their citizens had found refuge in Britain.

Moreover, by the early 1990s, Britain had already become a multilingual nation. In 1991, Safder Alladina and Viv Edwards published a book for the Longman Linguistics Library which detailed the Hungarian, Lithuanian, Polish, Ukrainian and Yiddish speech communities of previous generations. Growing up in Birmingham, I certainly heard many Polish, Yiddish, Yugoslav and Greek accents among my neighbours and the parents of school friends, at least as often as I heard Welsh, Irish, Caribbean, Indian and Pakistani accents. The Longman book begins with a foreword by Debi Prasanna Pattanayak, in which she stated that the Language Census of 1987 had shown that there were 172 different languages spoken by children in the schools of the Inner London Education Authority. In an interesting precursor of the controversy to come, she related how the reaction in many quarters was stunned disbelief, and how one British educationalist had told her that England had become ‘a third world country’. She commented:

After believing in the supremacy of English as the universal language, it was difficult to acknowledge that the UK was now one of the greatest immigrant nations of the modern world. It was also hard to see that the current plurality is based on a continuity of heritage. … Britain is on the crossroads. It can take an isolationist stance in relation to its internal cultural environment. It can create a resilient society by trusting its citizens to be British not only in political but in cultural terms. The first road will mean severing dialogue with the many heritages which have made the country fertile. The second road would be working together with cultural harmony for the betterment of the country. Sharing and participation would ensure not only political but cultural democracy. The choice is between mediocrity and creativity.

002

Language and dialect in the British Isles, showing the linguistic diversity in many English cities by 1991 as a result of Commonwealth immigration as well as the survival and revival of many of the older Celtic languages and dialects of English.

Such ‘liberal’, ‘multi-cultural’ views may be unfashionable now, more than a quarter of a century later, but it is perhaps worth stopping to look back on that cultural crossroads, and on whether we are now back at that same crossroads or have arrived at another one. By the 1990s, the multilingual setting in which new Englishes evolved had become far more diverse than it had been in the 1940s, due to immigration from the Indian subcontinent, the Caribbean, the Far East, and West and East Africa. The largest of the ‘community languages’ was Punjabi, with over half a million speakers, but there were also substantial communities of Gujarati speakers (perhaps a third of a million) and a hundred thousand Bengali speakers. In some areas, such as East London, public signs and notices recognise this (see below). Bengali-speaking children formed the most recent and largest linguistic minority within the ILEA and, because the majority of them had been born in Bangladesh, they were inevitably in the greatest need of language support within the schools.

003

007

Birmingham’s booming post-war economy attracted West Indian settlers from Jamaica, Barbados and St Kitts in the 1950s. By 1971, the South Asian and West Indian populations were equal in size and concentrated in the inner-city wards of North and Central Birmingham (see the map above). After the hostility towards New Commonwealth immigrants among some sections of the local white population in the 1960s and ’70s, the settlers had become more established in cities like Birmingham, where places of worship, ethnic groceries, butchers and, perhaps most significantly, ‘balti’ restaurants began to proliferate in the 1980s and ’90s. They materially changed the cultural and social life of the city, and most of the ‘white’ population believed that these changes were for the better. By 1991, Pakistanis had overtaken West Indians and Indians to become the largest single ethnic minority in Birmingham. The concentration of West Indian and South Asian British people in the inner-city areas changed little by the end of the century, though there was an evident flight to the suburbs by Indians. As well as being poorly paid, the factory work available to South Asian immigrants, like the man in a Bradford textile factory pictured below, was unskilled. By the early nineties, the decline of the textile industry over the previous two decades had led to high long-term unemployment in the immigrant communities of the Northern towns, leading to serious social problems.

006

Nor is it entirely true to suggest, as referred to above, that Caribbean arrivals in Britain faced few linguistic obstacles in integrating themselves into British life from the late 1940s to the late 1980s. By the end of those forty years, the British West Indian community had developed its own “patois”, which had a special place as a token of identity. One Jamaican schoolgirl living in London in the late eighties explained the social pressures that frowned on Jamaican English in Jamaica but made it almost obligatory in London. She had not been allowed to speak Jamaican Creole in front of her parents in Jamaica. When she arrived in Britain and went to school, she naturally tried to fit in by speaking the same patois, but some of her British Caribbean classmates told her that, as a “foreigner”, she should not try to be like them, and should speak only English. But she persevered with the patois, lost her British accent within a year, and was accepted by her classmates. Yet for many Caribbean visitors to Britain, the patois of Brixton and Notting Hill was a stylized form that was not truly Jamaican, not least because British West Indians had come from all parts of the Caribbean. When another British West Indian girl, born in Britain, was taken to visit Jamaica, she found herself being teased about her London patois and told to speak English.

003

The predicament that still faced the ‘Black British’ in the late eighties and into the nineties was that, for all the rhetoric, they were still not fully accepted by the established ‘White community’. Racism remained an everyday reality for large numbers of British people, and there was plenty of evidence of the ways in which Black people were systematically denied access to employment in all sections of the job market. The fact that a racist calamity like the murder in London of the black teenager Stephen Lawrence could happen in 1993 was testimony to how little had changed since the 1950s in British society’s ability to face up to racism. As a result, the British-Caribbean population could still not feel itself to be fully British. This was the poignant outcome of what the British Black writer Caryl Phillips has called “The Final Passage”, the title of his novel, which is narrated in Standard English with the direct speech of the characters rendered in Creole. Phillips migrated to Britain as a baby with his parents in the 1950s, and sums up his linguistic and cultural experience as follows:

“The paradox of my situation is that where most immigrants have to learn a new language, Caribbean immigrants have to learn a new form of the same language. It induces linguistic schizophrenia – you have an identity that mirrors the larger cultural confusion.”

One of his older characters in The Final Passage characterises “England” as a “college for the West Indian”, and, as Phillips himself put it, that is “symptomatic of the colonial situation; the language is divided as well”. As the “Windrush Scandal”, involving the deportation of British West Indians from the UK, has recently shown, this post-colonial “cultural confusion” still ‘colours’ political and institutional attitudes twenty-five years after the death of Stephen Lawrence, leading to discriminatory judgements by officials. This example shows how difficult it is to classify migrations to Britain chronologically: into the period of economic expansion of the 1950s and 1960s; the asylum-seekers of the 1970s and 1980s; and the EU expansion and integration of the 1990s and the first decades of the 2000s. Such an approach assumes stereotypical patterns of settlement for the different groups, whereas the reality was much more diverse. Most South Asians, for example, arrived in Britain in the post-war period, but they were joining a migration ‘chain’ which had been established at the beginning of the twentieth century. Similarly, most Eastern European migrants arrived in Britain in several quite distinct waves of population movement. This led the authors of the Longman Linguistics book to organise it into geolinguistic areas, as shown in the figure below:

001

The Poles and Ukrainians of the immediate post-war period, the Hungarians in the 1950s, the Vietnamese refugees in the 1970s and the Tamils in the 1980s sought asylum in Britain as refugees. In contrast, settlers from India, Pakistan, Bangladesh and the Caribbean had, in the main, come from areas of high unemployment and/or low wages, for economic reasons. It was not possible, even then, to make a simple split between political and economic migrants since, even within the same group, motivations differed through time. The Eastern Europeans who had arrived in Britain since the Second World War had come for a variety of reasons; in many cases, they were joining earlier settlers, trying either to escape poverty in the home country or to better their lot. A further important factor in the discussion about the various minority communities in Britain was the pattern of settlement. Some groups were concentrated in a relatively small geographical area, which made it possible to develop and maintain strong social networks; others were more dispersed and so found it more difficult to maintain a sense of community. Most Spaniards, Turks and Greeks were found in London, whereas Ukrainians and Poles were scattered throughout the country. In the case of the Poles, the communities outside London were sufficiently large to sustain an active community life; in the case of the Ukrainians, however, the small numbers and the dispersed nature of the community made the task of forging a separate linguistic and cultural identity a great deal more difficult.

Groups who had little contact with the home country also faced very real difficulties in retaining their distinct identities. Until 1992, Lithuanians, Latvians, Ukrainians and Estonians were unable to travel freely to their countries of origin; neither could they receive visits from family members left behind; and until the mid-noughties there was no possibility of new immigration which might have revitalised these communities in Britain. Nonetheless, they showed great resilience in maintaining their ethnic identities, not only through community involvement in the UK but by building links with similar groups in Europe and even in North America. The inevitable consequence of settlement in Britain was a shift from the mother tongue to English. The extent of this shift varied according to individual factors, such as the degree of identification with the mother-tongue culture; it also depended on group factors, such as the size of the community, its degree of self-organisation and the length of time it had been established in Britain. For more recently arrived communities such as the Bangladeshis, the acquisition of English was clearly a more urgent priority than the maintenance of the mother tongue, whereas, for the settled Eastern Europeans, the shift to English was so complete that mother-tongue teaching was often a more urgent community priority. There were reports of British-born Ukrainians and Yiddish-speaking Jews, brought up in predominantly English-speaking homes, who were striving to produce an environment in which their children could acquire their ‘heritage’ language.

Blair’s Open Door Policy & EU Freedom of Movement:

During the 1980s and ’90s, under the ‘rubric’ of multiculturalism, a steady stream of immigration into Britain continued, especially from the Indian subcontinent. But an unspoken consensus existed whereby immigration, while always gradually increasing, was controlled. What happened after the Labour Party’s landslide victory in 1997 was a breaking of that consensus, according to Douglas Murray, the author of the recent (2017) book The Strange Death of Europe. He argues that, once in power, Tony Blair’s government oversaw an opening of the borders on a scale unparalleled even in the post-war decades. His government abolished the ‘primary purpose rule’, which had been used to filter out bogus marriage applications. The borders were opened to anyone deemed essential to the British economy, a definition so broad that it included restaurant workers as ‘skilled labourers’. And as well as opening the door to the rest of the world, the government opened the door to the new EU member states after 2004. It was the effects of all of this, and more, that created the picture of the country eventually revealed in the 2011 Census, published at the end of 2012.

004

The numbers of non-EU nationals moving to settle in Britain had been expected to increase only from 100,000 a year in 1997 to 170,000 in 2004. In fact, the government’s predictions for the number of new arrivals over the five years 1999-2004 were out by almost a million people. It also failed to anticipate that the UK might be an attractive destination for people from countries with significantly lower average incomes or no minimum wage. For these reasons, the number of Eastern European migrants living in Britain rose from 170,000 in 2004 to 1.24 million in 2013. Whether the surge in migration went unnoticed or was officially approved, successive governments did not attempt to restrict it until after the 2015 election, by which time it was too late.

(to be continued)

Posted January 15, 2019 by TeamBritanniaHu


Borderlines: Remembering Sojourns in Ireland.

001

Edited by Sam Burnside, published by Holiday Projects West, Londonderry, 1988.

The recent ‘Brexit’ negotiations over the issue of the land border between Northern Ireland and the Republic of Ireland have made me think about my two visits to the island as an adult, in 1988 and 1990, a decade before the Belfast talks led to the ‘Good Friday Agreement’. I had been to Dublin with my family in the early sixties, but recalled little of that experience, except that it must have been before 1966, as we climbed Nelson’s Pillar in the city centre before the IRA blew it up to ‘commemorate’ the fiftieth anniversary of the Easter Rising. I had never visited Northern Ireland, however.


Nelson’s Pillar in the centre of Dublin in 1961.

A Journey to Derry & Corrymeela, June 1988:

In June 1988, while working for the Quakers in Selly Oak, Birmingham, I drove a group of students from Westhill College to Corrymeela, a retreat and reconciliation centre in the North. We drove to Belfast, being stopped at army checkpoints, and visited the Shankill and the Falls Road, witnessing the murals and the coloured kerb-stones. Political violence in Belfast had largely been confined to the confrontation lines where working-class unionist districts, such as the Shankill, and working-class nationalist areas, such as the Falls, Ardoyne and New Lodge, border directly on one another (see the map below). We also visited Derry/Londonderry, with its wall proclaiming ‘You are now entering Free Derry’ and its garrisons protected by barbed wire and soldiers on patrol with automatic rifles. Then we crossed the western border into Donegal, gazing upon its green fields and small hills.

My Birmingham colleague, a Presbyterian minister and the son of a ‘B Special’ police officer, was from a small village on the shores of Lough Neagh, north of Belfast. So, while he visited his family home there, I was deputed to drive the students around, guided by Jerry Tyrrell of the Ulster Quaker Peace Education Project. He described himself as a ‘full-time peace worker’ and a ‘part-time navigator’. I had already met him in Birmingham, where I was running a similar Peace Education Project for the Quakers in the West Midlands. Born in London, Jerry had come to live in Derry in 1972, where he had worked on holiday projects for mixed groups of Catholic and Protestant students, providing opportunities for them to meet and learn together during organised holidays, work camps and other activities. He had left this work in April 1988 to take up a post running the Peace Education Project at Magee College.


Magee College, Londonderry.

Jerry gave me a copy of a slim volume entitled Borderlines: A Collection of New Writing from the North West, containing prose and poems by members of the Writers’ Workshop based at Magee College, including some of his own poetry. The Workshop promoted and encouraged new writing in the North-west, and acted as a forum for a large number of local writers. In his preface, Frank McGuinness wrote of how …

… freedom is full of contradictions, arguments, the joy of diversity, the recognition and celebration of differences.

After reading it, I agreed with him: the collection contained that diversity and stood testimony to the writers’ experiences and histories, their fantasies and dreams. Its contributors came from both sides of the Derry-Donegal border we had driven over, and from both sides of the Foyle, a river of considerable beauty which, in its meandering journey from the Sperrins to the Atlantic, assumes on its path through Derry a socio-political importance in symbolising the differences within the city. However, in his introduction to the collection, Sam Burnside, an award-winning poet born in County Antrim but living in Derry, wrote of how …

… the borders which give definition to the heart of this collection are not geographical, nor are they overtly social or political; while … embedded in time and place, they are concerned to explore emotional and moral states, and the barriers they articulate are … those internal to the individual, and no less detrimental to freedom for that.

If borders indicate actual lines of demarcation between places and … powers, they suggest also the possibility of those barriers being crossed, of change, of development, from one state to another. And a border, while it is the mark which distinguishes and maintains a division, is also the point at which the essence of real or assumed differences are made to reveal themselves; the point at which they may be forced to examine their own natures, for good or ill.

001

A page from an Oxford Bookworms’ Reader for EFL students.

In the short story ‘Blitzed’ by Tessa Johnston, a native of Derry, where she worked as a teacher, Kevin has moved from Derry to Manchester, in a fictional future (1998), to escape from the troubles. But the report of a car-bombing by the Provisional IRA in Manchester brings back memories of his encounter, as a fifteen-year-old schoolboy, with a soldier in Derry. On his way from his home in Donegal to the grammar school in Derry, in the week before Christmas, he had been blinded by the snow, so that he didn’t see the soldier on patrol until he collided with him:

Over the years Kevin had grown accustomed to being stopped regularly on his way to and from school; to being stopped, questioned and searched, but never until that day had he experienced real hostility, been aware of such hatred. Spread-eagled against the wall he had been viciously and thoroughly searched. His school-bag had been ripped from his back and its contents strewn on the pavement; then, triumphantly, the soldier held aloft his bible, taunting him:

“So, you’re a Christian, are you? You believe in all that rubbish? You wanna convert me? Wanna convert the heathen, Fenian scum? No?”

On and on he ranted and raved until Kevin wondered how much more of this treatment he could endure. Finally, his anger exhausted, he tossed the offending book into the gutter and in a last act of vandalism stamped heavily upon it with his sturdy Army boots, before turning up Bishop Street to continue his patrol.

With trembling hands Kevin began to gather up his scattered possessions. Then, like one sleep-walking, he continued his journey down Bishop Street. He had only gone a few steps when a shot rang out. Instinctively, he threw himself to the ground. Two more shots followed in quick succession, and then silence.

He struggled to his feet and there, not fifty yards away, his tormentor lay spread-eagled in the snow. Rooted to the spot, Kevin viewed the soldier dispassionately. A child’s toy, he thought, that’s what he looks like. Motionless and quiet;

a broken toy …

Then the realisation dawned as he watched the ever-increasing pool of blood stain the new snow.

What haunted Kevin from that day, however, was not so much the picture of the dead soldier as the sense that he himself had crossed an internal border. He had been glad when the soldier was shot and died, and he had been unable to come to terms with the knowledge that he could feel like that. He had been unable to forgive not just the young soldier but, perhaps worse, himself. The shadow of that day would never leave him, even after his family moved to Manchester. The move had worked for a while: he had married, had a child, and coped. But in the instant of the TV news report, all of that was wiped out. The ‘troubles’ had found him again. They knew no borders.

Fortunately, this was a piece of fiction. Though there were thousands of deaths like the soldier’s in Northern Ireland throughout the troubles, and bombings by the ‘Real IRA’ even after the PIRA cease-fire, there was no renewal of the bombing campaigns on the British mainland. But it could easily have been a real future for someone, had it not been for the Good Friday Agreement.

An Easter ‘Pilgrimage’ to Dublin & Belfast, 1990:

Britain, Ireland and Europe, 1994-99: Peace, Devolution & Development.

LSF (1947) Nobel Peace Prize obv

Unionists & Nationalists – The Shape of Things to Come:

In Northern Ireland, optimism was the only real force behind the peace process. Too often, the process is remembered through one of Blair’s greatest soundbites as the talks reached their climax: ‘This is no time for soundbites … I feel the hand of history upon our shoulders.’ Despite the comic nature of the remark, it would be churlish not to acknowledge the agreement as one of his greatest achievements. The tenacious efforts of John Major to bring Republicans and Unionists to the table had resulted in stalemate, but Tony Blair had already decided in Opposition that an Irish peace settlement would be one of his top priorities in government. He made the province his first visit after winning power and focused Number Ten on the negotiations as soon as the IRA, sensing a fresh opportunity, announced a further ceasefire. In Mo Mowlam, Blair’s brave new Northern Ireland Secretary, he had someone who was prepared to be tough in negotiations with the Unionists and encouraging towards Sinn Feiners in order to secure a deal. Not surprisingly, the Ulster Unionist politicians soon found her to be too much of a ‘Green’. She concentrated her charm and bullying on the Republicans, while a Number Ten team dealt with the Unionists, and Blair emphasised his familial links with Unionism in order to win their trust.

004

There were also direct talks between the Northern Irish political parties, aimed at producing a return of power-sharing in the form of an assembly in which they could all sit. These were chaired by the former US Senator George Mitchell and were the toughest part. There were also talks between the Northern Irish parties and the British and Irish governments about the border and the constitutional position of Northern Ireland in the future. Finally, there were direct talks between London and Dublin on the wider constitutional and security settlement. This tripartite process was long and intensely difficult for all concerned; it appeared to have broken down at numerous points and was kept going mainly thanks to Blair himself. He took big personal risks, such as when he invited Gerry Adams and Martin McGuinness of Sinn Fein-IRA to Downing Street. Some in the Northern Ireland Office still believe that Blair gave too much away to the Republicans, particularly over the release of terrorist prisoners and the amnesty which indemnified known terrorists, like those responsible for the Birmingham bombings of 1974, from prosecution. At one point, when talks had broken down again over these issues, Mo Mowlam made the astonishing personal decision to go into the notorious Maze prison herself and talk to both Republican and Loyalist terrorist prisoners; hiding behind their politicians, the hard men still saw themselves as being in charge of their ‘sides’ in the sectarian conflict. But Blair spent most of his time trying to keep the constitutional Unionists ‘on board’, having moved Labour policy away from support for Irish unification. In Washington, indeed, Blair was seen as being too Unionist.

005

Given a deadline of Easter 1998, a deal was finally struck, just in time, on Good Friday, hence the alternative name of 'the Belfast Agreement'. Northern Ireland would stay part of the United Kingdom for as long as the majority in the province wished it so. The Republic of Ireland would give up its territorial claim to the North, amending its constitution to this effect. The parties would combine in a power-sharing executive, based on a newly elected assembly. There would also be a North-South body knitting the two political parts of the island together for various practical purposes and mundane matters. The paramilitary organisations would surrender or destroy their weapons, monitored by an independent body. Prisoners would be released, and the policing of Northern Ireland would be made non-sectarian by the setting up of a new police force to replace the Royal Ulster Constabulary (RUC), whose bias towards the Unionist community had long been a sore point for Nationalists. The deal involved a great deal of pain, particularly for the Unionists. It was only the start of a true peace and would be threatened frequently afterwards, such as when, only a few months after its signing, the centre of Omagh was bombed by a renegade splinter group of the IRA calling itself 'the Real IRA' (see the photo below). The bomb murdered twenty-nine people and injured two hundred. Yet this time the violent extremists were unable to stop the rest from talking.

004 (2)

Once the agreement had been ratified on both sides of the border, the decommissioning of arms proved a seemingly endless and wearisome game of bluff. Though the two leaders of the moderate parties in Northern Ireland, David Trimble of the Ulster Unionists and John Hume of the Nationalist SDLP, won the Nobel Prize for Peace, both these parties were soon replaced in elections by the harder-line Democratic Unionist Party led by Rev. Dr Ian Paisley, and by Sinn Fein, under Adams and McGuinness. Initially, this made it harder to set up an effective power-sharing executive at Stormont (pictured below). Yet to almost everyone’s surprise, Paisley and McGuinness sat down together and formed a good working relationship. The thuggery and crime attendant on years of paramilitary activity took another decade to disappear. Yet because of the agreement hundreds more people are still alive who would have died had the ‘troubles’ continued. They are living in relatively peaceful times. Investment has returned and Belfast has been transformed into a busier, more confident city. Large businesses increasingly work on an all-Ireland basis, despite the continued existence of two currencies and a border. The fact that both territories are within the European Union enables this to happen without friction at present, though this may change when the UK leaves the EU and the Republic becomes a ‘foreign country’ to it for the first time since the Norman Conquest. Tony Blair can take a sizeable slice of credit for this agreement. As one of his biographers has written:

He was exploring his own ability to take a deep-seated problem and deal with it. It was a life-changing experience for him.

003

If the Good Friday Agreement changed the future relationship of the UK and Ireland, Scottish and Welsh devolution changed the future political shape of Great Britain. The relative indifference of the eighteen-year Tory ascendancy to the plight of the industrial areas of Scotland and Wales had transformed the prospects of the nationalist parties in both countries. Through the years of Tory rule, the case for a Scottish parliament had been bubbling under north of the border. Margaret Thatcher had been viewed as a conspicuously English figure imposing harsh economic penalties on Scotland, which had always considered itself to be inherently more egalitarian and democratic. The Tories, who had successfully played the Scottish card against centralising Labour in 1951, had themselves become labelled as a centralising and purely English party. Local government had already been reorganised in Britain and Northern Ireland in the early 1990s with the introduction of ‘unitary’ authorities.

002

Scotland had a public culture further to the left than that of southern England, and the initiatives on devolution came from the respectable middle classes. A group of pro-devolution activists, including SNP, Labour and Liberal supporters, churchmen, former civil servants and trade unionists, came together to found the Campaign for a Scottish Assembly. In due course, this produced a Constitutional Convention meant to bring a wider cross-section of Scottish life in behind their 'Claim of Right'. It argued that if Scots were to stand on their own two feet, as Mrs Thatcher had insisted, they needed control over their own affairs. Momentum increased when the Scottish Tories lost half their remaining seats in the 1987 election, and, following the poll tax rebellion, the Convention got going in March 1989, after Donald Dewar, Labour's leader in Scotland, decided to work with other parties. The Convention brought together the vast majority of Scottish MPs, all but two of Scotland's regional, district and island councils, the trade unions, churches, charities and many other organisations – in fact, almost everyone except the Conservatives, who were sticking with the original Union, and the SNP, who wanted full independence.

Scottish Tories, finding themselves increasingly isolated, fought back vainly. They pointed out that if a Tory government, based on English votes, was regarded as illegitimate by the Scots, then in future a Labour government based on Scottish votes might be regarded as illegitimate by the English. In a 1992 poll in Scotland, fifty per cent of those asked said they were in favour of independence within the European Union. In the 1992 election, John Major had made an impassioned appeal for the survival of the Union. Had the four countries never come together, he argued, their joint history would never have been as great: Are we, in our generation, to throw all that away?  He won back a single Scottish seat. Various minor sops were offered to the Scots during his years in office, including the return of the Stone of Destiny, with much ceremony. However, the minor Tory recovery of 1992 was wiped out in the Labour landslide of 1997, when all the Conservative seats north of the border, where the party had once held a majority of them, were lost, as they were in Wales. Formerly just contestants in middle-class, rural and intellectual constituencies, Scottish and Welsh nationalists made huge inroads in 1997 into former Conservative areas, and even into the Labour heartlands, the latter despite the Labour leadership having been held consecutively by a Welshman and a Scot.

By the time Tony Blair became the party leader, Labour’s commitment to devolution was long-standing. Unlike his predecessor, he was not much interested in devolution or impressed by it, particularly not for Wales, where support had been far more muted. The only thing he could do by this stage was to insist that a Scottish Parliament and Welsh Assembly would only be set up after referenda in the two countries, which in Scotland’s case would include a second question as to whether the parliament should be given the power to vary the rate of income tax by 3p in the pound. In September 1997, Scotland voted by three to one for the new Parliament, and by nearly two to one to give it tax-varying powers. The vote for the Welsh Assembly was far closer, with a wafer-thin majority secured by the final constituency to declare, that of Carmarthen. The Edinburgh parliament would have clearly defined authority over a wide range of public services – education, health, welfare, local government, transport and housing – while Westminster kept control over taxation, defence, foreign affairs and some lesser matters. The Welsh assembly in Cardiff would have fewer powers and no tax-raising powers. The Republic of Ireland was similarly divided between two regional assemblies but unlike the assemblies in the UK, these were not elected.

In 1999, therefore, devolved governments, with varying powers, were introduced in Scotland, Wales and, following the ratification referendum on the Belfast Agreement, in Northern Ireland. After nearly three hundred years, Scotland got its parliament with 129 MSPs, and Wales got its assembly with sixty members. Both were elected by proportional representation, making coalition governments almost inevitable. In Scotland, Labour provided the first 'first minister' in Donald Dewar, a much-loved intellectual, who took charge of a small group of Labour and Liberal Democrat ministers. To begin with, Scotland was governed from the Church of Scotland's general assembly buildings. The devolution promised by John Smith and instituted by Tony Blair's new Labour government in the late 1990s did, initially, seem to take some of the momentum out of the nationalist fervour, but apparently at the expense of stoking the fires of English nationalism, resentful that the Scots and the Welsh were now represented in their own assemblies as well as at Westminster. But there was no early crisis at Westminster over the unfairness of Scottish and Welsh MPs being able to vote on England-only business, the so-called West Lothian Question, even though the cabinet was so dominated by Scots. Despite these unresolved issues, the historic constitutional changes brought about by devolution and the Irish peace process reshaped both Britain and Ireland, producing irrevocable results. In his television series A History of Britain, first broadcast on the BBC in 2000, Simon Schama argued that…

Histories of Modern Britain these days come not to praise it but to bury it, celebrating the denationalization of Britain, urging on the dissolution of 'Ukania' into the constituent European nationalities of Scotland, Wales and England (which would probably tell the Ulster Irish either to absorb themselves into a single European Ireland or to find a home somewhere else – say the Isle of Man). If the colossal asset of the empire allowed Britain, in the nineteenth and early twentieth century, to exist as a genuine national community ruled by Welsh, Irish and (astonishingly often) Scots, both in Downing Street and in the remote corners of the empire, the end of that imperial enterprise, the theory goes, ought also to mean the decent, orderly liquidation of Britannia Inc. The old thing never meant anything anyway, it is argued; it was just a spurious invention designed to seduce the Celts into swallowing English domination where once they had been coerced into it, and to persuade the English themselves that they would be as deeply adored on the grouse moors of the Trossachs as in the apple orchards of the Weald. The virtue of Britain's fall from imperial grace, the necessity of its European membership if only to avoid servility to the United States, is that it forces 'the isles' to face the truth: that they are many nations, not one.

However, in such a reduction of false British national consciousness to the 'true' identities and entities of Scotland, Wales and England, he argued, self-determination could go beyond the 'sub-nations', each of which was just as much an invention, or a re-invention, as Britain was. An independent Scotland would therefore not be able to resist the rights to autonomy of the Orkney and Shetland islands, with their Nordic heritage, or of the remaining Gaelic-speaking isles of the Outer Hebrides. Similarly, the still primarily Anglophone urban south-Walians and the inhabitants of the Welsh borders and the south coast of Pembrokeshire might in future wish to assert their linguistic and cultural differences from the Welsh-speakers of rural West and North Wales. With the revival of their Celtic culture, the Cornish might also wish to seek devolution from a country from which all the other Celts had retreated into their ethnolinguistic heartlands. Why shouldn't post-imperial Britain undergo a process of 'balkanization' like that of the former Yugoslavia?


Well, many like Schama seemed to answer at that time, and still do today: precisely because of what ethnonationalism had done in the Balkans, especially in Bosnia and Kosovo, where the conflicts were only just, in 1999, being brought to an end by air strikes, having created tides of refugees escaping brutal ethnic cleansing. The breaking up of Britain into ever smaller and purer units of white ethnicity was to be resisted. Instead, a multi-national, multi-ethnic and multi-cultural Britain was coming into being through a gradual and peaceful process of devolution of power to the various national, ethnic and regional groups, and a more equal re-integration of them into a 'mongrel' British nation within a renewed United Kingdom.

Economic Development, the Regions of Britain & Ireland and the Impact of the EU:

004

The late twentieth century saw the transformation of the former docklands of London into offices and fashionable modern residential developments, with a new focus on the huge Canary Wharf scheme (pictured above) to the east of the city. The migration of some financial services and much of the national press to the major new developments in London's Docklands prompted the building of the Docklands Light Railway and the Jubilee line extension. The accompanying modernisation of the London Underground was hugely expensive in legal fees and complex in its contracts. Outside London, improvements in public transport networks were largely confined to urban and suburban centres, with light railway networks developed in Manchester, Sheffield and Croydon.

Beyond Canary Wharf to the east, the Millennium Dome, which Blair's government inherited from the Tories, was a billion-pound gamble which Peter Mandelson and 'Tony's cronies' decided to push ahead with, despite cabinet opposition. Architecturally, the Dome was striking and elegant, a landmark for London which can be seen by air passengers arriving in the capital. The millennium was certainly worth celebrating, but the conundrum ministers and their advisers faced was what to put in their 'pleasure Dome'. It would be magnificent, unique, a tribute to British daring and 'can-do'. Blair himself said that it would provide the first paragraph of his next election manifesto. But this did not answer the question of what, exactly, it was for. When the Dome finally opened at New Year, the Queen, the Prime Minister and assembled celebrities were treated to a mish-mash of a show which embarrassed many of them. When it opened to the public, the range of mildly interesting exhibits was greeted as a huge disappointment. Optimism and daring, it seemed, were not enough to meet the people's expectations. Later that year, Londoners were given a greater gift in the form of a mayor and regional assembly with powers over local planning and transport. This new authority in part replaced the Greater London Council, abolished by the Thatcher government in 1986.

However, there were no signs that the other conurbations in the regions of England wanted regionalisation, except for some stirrings in the Northeast and Cornwall. The creation of nine Regional Development Agencies in England in 1998-99 did not seek to meet a regionalist agenda; in fact, these new bodies to a large extent matched the existing structures set up since the 1960s for administrative convenience and to encourage inward investment. Improved transport links were seen as an important means of stimulating regional development and combating congestion. Major road developments in the 1990s included the completion of the M25 orbital motorway around London and the M40 link between London and Birmingham. Despite this construction programme, however, congestion remained a problem: the M25, for example, became the butt of jokes labelling it the largest car park on the planet, while traffic speeds in central London continued to fall, reaching fifteen kilometres per hour by 1997, about the same as they had been in 1907. Congestion was not the only difficulty: environmental protests led to much of the road-building programme begun by the Tory governments being shelved after 1997. The late nineties also saw the construction of some of the most expensive urban motorways in Europe.

In the Scottish Highlands and Islands, the new Skye road bridge connected the Isle of Skye to the mainland. A group led by the Bank of America built and ran the new bridge, one of the first projects built under the 'Private Finance Initiative', or PFI, which had started life under the Tory Chancellor Norman Lamont five years before Labour came to power, when he experimented with privatising public projects and allowing private companies to run them, keeping the revenue. Although the basic idea was simple enough, it represented a major change in how government schemes were financed, big enough to arouse worry even outside the tribes of political obsessives. There were outraged protests from some islanders about paying tolls to a private consortium, and eventually the Scottish Executive bought the bridge back. At the opposite corner of the country, the Queen Elizabeth II road bridge was built joining Kent and Essex across the Thames at Dartford, easing congestion on both sides of the Dartford tunnel. It was the first bridge across the river in a new place for more than half a century and was run by a company called 'Le Crossing', successfully taking tolls from motorists.

006

Undoubtedly the most important transport development was the Channel Tunnel rail link to France, completed in 1994. It was highly symbolic of Britain's commitment to European integration, and millions of people and vehicles had travelled from London to Paris in under three hours by the end of the century. The town of Ashford in Kent was one of the major beneficiaries of the 'Chunnel' rail link, making use of the railway lines already running through the town. Its population grew by over ten per cent in the 1990s. By the end of that decade, the town had an international catchment area of some eighty-five million people within a single day's journey. This, and the opening of Ashford International railway station as the main terminal on the rail link to the continent, attracted a range of engineering, financial, distribution and manufacturing companies to the town. In addition to the fourteen business parks that were established in the town, new retail parks were opened. Four green-field sites were also opened on the outskirts of the town, including a science park owned by Trinity College, Cambridge. Ashford became closer, in journey time, to Paris and Brussels than to Manchester and Liverpool, as can be seen on the map below. In addition to its international rail link, the town's position at the hub of a huge motorway network made it an integral part of a truly international transport system.

005

002

Modern-day affluence at the turn of the century was reflected in the variety of goods and services concentrated in shopping malls, by then often built on major roads outside towns and cities to make them accessible to the maximum number of people in a region.

Economic change was most dramatic in the Irish Republic, which enjoyed the highest growth rates in Europe in the 1990s. The so-called 'Celtic Tiger' economy boomed, aided by inward investment, so that by the end of the decade GDP per capita had surpassed that of the UK. Dublin, which remained, if anything, even more dominant a capital city than London, flourished as a result of strong growth in the service industries. Growth rates for the 'new economy' industries such as information and communications technology were among the highest in the world. Generous tax arrangements and the city's growing reputation as a cultural centre meanwhile helped to encourage the development of Dublin's 'rockbroker belt'. Even agriculture in the Irish Republic, in decline in the early 1990s, still contributed nine per cent of Ireland's GDP, three times the European average. In the west of Ireland, it was increasingly supplemented by the growth of tourism.

Nevertheless, while the expansion of Ireland's prosperity lessened the traditional east-west divide, it did not eliminate it. Low population density and a dispersed pattern of settlement were felt to make rail developments unsuitable. Consequently, Ireland's first integrated transport programme, the Operational Programme for Peripherality, concentrated on improving the routes from the west of Ireland to the ferry port of Rosslare; the routes from Belfast to Cork, Dublin and the southwest; and east-west routes across the Republic. Many of these improvements benefited from EU funding. The EU also aided, through its 'peace programme', the development of transport planning in Britain, with infrastructure projects in, for example, the Highlands and Islands of Scotland. In 1993, the EU had decided to create a combined European transport network. Of the fourteen projects associated with this aim, three were based in Britain and Ireland: a rail link from Cork to Larne, the Northern Irish ferry port for Scotland; a road link from the Low Countries across England and Wales to Ireland; and the West Coast main line railway route in Britain.

The old north-south divide in Britain reasserted itself with a vengeance in the late 1990s as people moved south in search of jobs and prosperity, and as prices and wages rose. Even though the shift towards service industries was reducing regional economic diversity, the geographical distribution of regions eligible for European structural funds for economic development reflected the continuing north-south divide. Transport was only one way in which the EU increasingly came to shape the geography of the British Isles in the nineties. It was, for example, a key factor in the creation of the new administrative regions of Britain and Ireland in 1999. At the same time, a number of British local authorities opened offices in Brussels for lobbying purposes, and attempts to maximise receipts from European structural funds also encouraged the articulation of regionalism. Cornwall, for instance, 'closed' its 'border' with Devon briefly in 1998 in protest at not receiving its EU social funds, while enthusiasm for the supposed economic benefits that would result from 'independence in Europe' helped to explain the revival of the Scottish National Party following devolution. 'Silicon Glen' in central Scotland was, by the end of the decade, the largest producer of computing equipment in Europe.

The European connection was less welcome in other quarters, however. Fishermen, particularly in Devon and Cornwall and on the North Sea coast of England, felt themselves the victims of the Common Fisheries Policy quota system. There was also a continuing strong sense of 'Euroscepticism' in England, fuelled at this stage by a mixture of concerns about 'sovereignty' and economic policy, which I will deal with in a separate article. Here, it is worth noting that even the most enthusiastic Europhiles, the Irish, voted to reject an EU initiative which they felt was not in their interests in their 2001 referendum on the Treaty of Nice. Nevertheless, the growth of physical links with Europe, like the Channel Tunnel, the connections between the British and French electricity grids, and the development of 'budget' airlines, made it clear that both of the main 'offshore' islands, Britain and Ireland, were, at the turn of the century, becoming increasingly integrated, both in economic and administrative terms, with the continent of Europe.

006 (2)

At the beginning of 1999, however, a debate began over British membership of the euro, the single currency which was finally taking shape within the EU. Though Blair was never a fanatic on the subject, his pro-European instincts and his desire to be a leading figure inside the EU predisposed him to announce that Britain would join, not in the first wave, but soon afterwards. He briefed that this would happen. British business seemed generally in favour, but the briefing and guesswork in the press were completely baffling. For Gordon Brown, stability came first, and he concluded that it was not likely that Britain could safely join the euro within the first Parliament. When he told Blair this, the two argued and then eventually agreed on a compromise. Britain would probably stay out during the first Parliament, but the door should be left slightly ajar. Pro-European business people and those Tories who had lent Blair and Brown their conditional support, as well as Blair's continental partners, should be kept on board, as should the anti-euro press. The terms of the delicate compromise were meant to be revealed in an interview given by Brown to The Times. Since Brown was more hostile to entry than Blair, and was talking to an anti-euro newspaper, his team briefed more strongly than Blair would have liked. By the time the story was written, the pound had been saved from extinction for the lifetime of the Parliament. Blair was aghast at this.

Featured Image -- 33924

The chaos surrounding this important matter was ended, and the accord with Blair patched up, by Brown and his adviser Ed Balls, who quickly produced five economic tests which would have to be met before Britain entered the euro. They required more detailed work by the Treasury; the underlying point was that the British and continental economies must be properly aligned before Britain would join. Brown then told the Commons that though it was likely that, for economic reasons, Britain would not join the euro until after the next election, there was no constitutional or political reason not to join. Preparations for British entry would therefore begin. This gave the impression that once the tests were met there would be a post-election referendum, followed by the demise of sterling.

In 1999, with a full-scale launch at a London cinema, Blair was joined by the Liberal Democrat leader Charles Kennedy and the two former Tory cabinet ministers Ken Clarke and Michael Heseltine to launch ‘Britain in Europe’ as a counter-blast to the anti-Euro campaign of ‘Business for Sterling’. Blair promised that together they would demolish the arguments against the euro, and there was alarmist media coverage about the loss of eight million jobs if Britain pulled out of the EU. But the real outcome of this conflict was that the power to decide over membership of the euro passed decisively from Blair to Brown, whose Treasury fortress became the guardian of the economic tests. Brown would keep Britain out on purely economic grounds, something which won him great personal credit among Conservative ‘press barons’. There was to be no referendum on the pound versus euro, however much the Prime Minister wanted one.

Very little of what New Labour had achieved up to 1999 was what it was really about, however, and most of its achievements had been in dealing with problems and challenges inherited from previous governments or with 'events' to which it had to react. Its intended purpose was to deliver a more secure economy, radically better public services and a new deal for those at the bottom of British society. Much of this was the responsibility of Gordon Brown, as agreed in the leadership contest accord between the two men. The Chancellor would become a controversial figure later in government, but in his early period at the Treasury, he imposed a new way of governing. He had run his time in Opposition with a tight team of his own, dominated by Ed Balls, later an MP and Treasury minister before becoming shadow chancellor under Ed Miliband following the 2010 general election. Relations between Team Brown and the Treasury officials began badly and remained difficult for a long time. Brown's handing of control over interest rates to the Bank of England was theatrical, planned secretly in Opposition and unleashed to widespread astonishment immediately New Labour won. Other countries, including Germany and the US, had run monetary policy independently of politicians, but this was an unexpected step for a left-of-centre British Chancellor. It turned out to be particularly helpful to Labour ministers, since it removed at a stroke the old suspicion that they would favour high employment over low inflation. As one of Brown's biographers commented, he…

 …could only give expression to his socialist instincts after playing the role of uber-guardian of the capitalist system.

The bank move has gone down as one of the clearest achievements of the New Labour era. Like the Irish peace process and the devolution referenda, it was an action which followed on immediately after Labour won power, though, unlike those achievements, it was not something referred to in the party's election manifesto. Brown also stripped the Bank of England of its old job of overseeing the rest of the banking sector; otherwise, it would have faced a conflict of interest in having to concern itself with the health of commercial banks at the same time as managing interest rates. As a result of these early actions, New Labour won a reputation for being economically trustworthy, and its Chancellor was identified with 'prudent' management of the nation's finances. Income tax rates did not increase, which reassured the middle classes. Even when Brown found what has more recently been referred to as 'the magic money tree', he did not automatically harvest it. And when the 'dot-com bubble' was at its most swollen, he sold off licences for the next generation of mobile phones for £22.5 billion, vastly more than they were soon worth. The proceeds went not into new public spending but into repaying the national debt, £37 billion of it. By 2002, government interest payments on this debt were, as a proportion of its revenue, at their lowest since 1914.

Despite his growing reputation for prudence, Brown's introduction of 'stealth taxes' proved controversial, however. These included the freezing of income tax thresholds, so that an extra 1.5 million people found themselves paying the top rate; the freezing of personal allowances; rises in stamp duties on houses; and a hike in national insurance. In addition, some central government costs were palmed off onto the devolved administrations or local government, so that council tax rose sharply, and tax credits for share dividends were removed. Sold at the time as a 'prudent' technical reform, enabling companies to reinvest in their core businesses, this latter measure had a devastating effect on the portfolios of pension funds, wiping a hundred billion pounds off the value of retirement pensions. This was a staggering sum, amounting to more than twice the combined pension deficits of Britain's top 350 companies. Pensioners and older workers were angered when faced with great holes in their pension funds. They were even more outraged when Treasury papers released in 2007 showed that Brown had been warned about the effect this measure would have. The destruction of a once-proud pension industry had more complex causes than Brown's decision alone; Britain's fast-ageing population, for one, was also a major factor. But the pension fund hit produced more anger than any other single act by the New Labour Chancellor.

Perhaps the most striking long-term effect of Brown's careful running of the economy was the stark, dramatic shape of public spending. For his first two years, he stuck fiercely to the promise he had made to continue the Major government's spending levels. These were so tight that even the man who had set them, Kenneth Clarke, said that he would not actually have kept to them had the Tories been re-elected and had he been reappointed as Chancellor. Brown brought down the State's share of public spending from nearly 41% of GDP to 37.4% by 1999-2000, the lowest percentage since 1960 and far below anything achieved under Thatcher. He was doing the opposite of what previous Labour Chancellors had done. On arriving in office, they had immediately started spending in order to stimulate the economy in classical Keynesian terms; when they had reached their limits, they had then had to raise taxes. Brown began by putting a squeeze on spending and loosened up later, when there was an abrupt and dramatic surge in public spending, particularly on health, back up to 43%. The lean years were immediately followed by the fat ones, famine by the feast. But the consequence of the squeeze was that the first New Labour government of 1997-2001 achieved far less in public services than it had promised. For example, John Prescott had promised a vast boost in public transport, telling the Commons in 1997:

I will have failed if in five years’ time there are not many more people using public transport and far fewer journeys by car. It’s a tall order, but I urge you to hold me to it.

Because of 'Prudence', and Blair's worries about being seen as anti-car, Prescott had nothing like the investment needed to follow through, and failed completely. Prudence also meant that Brown ploughed ahead with cuts in benefit for lone-parent families, angering Labour MPs and resulting in a Scottish Labour conference which labelled their Westminster government and their own Scots Chancellor as economically inept, morally repugnant and spiritually bereft. Reform costs money, and without money it barely happened in the first term, except in isolated policy areas where Blair and Brown put their heads down and concentrated. The most dramatic programme was in raising literacy and numeracy among younger children, where Number Ten worked closely with the Education Secretary, David Blunkett, and scored real successes. But unequivocally successful public service reforms were rare.

At first, Labour hated the idea of PFIs, which were a mixture of two things associated with Thatcherite economic policies: the privatisation of capital projects, with the government paying a fee to private companies over many years, and the contracting-out of services – waste collection, school meals, cleaning – which had been imposed on unwilling socialist councils from the eighties. Once in power, however, Labour ministers began to realise that those three little letters were political magic, because they allowed them to announce and oversee exciting new projects and take the credit for them in the full knowledge that the bill would be left for the taxpayers of twenty to fifty years hence. In this way, the spending and funding of new hospitals or schools would be a problem for a future health or education minister.

PFIs were particularly attractive when other kinds of spending were tightly controlled by 'Prudence'. Large amounts of capital for public buildings were declared to be 'investment', not spending, and put to one side of the public accounts. The justification was that private companies would construct and run this infrastructure so much more efficiently than the State that the profits paid to them by taxpayers would be more than compensated for. Ministers replied to criticisms of these schemes by pointing out that, without them, Britain would not get the hundreds of new school buildings, hospitals, health centres, fire stations, army barracks, helicopter training schools, prisons, government offices, roads and bridges that it so obviously needed by the nineties. Significantly, the peak year for PFIs was 1999-2000, just as the early Treasury prudence in conventional spending had bitten hardest and was being brought to an end.

Sources:

Andrew Marr (2008), A History of Modern Britain. Basingstoke: Pan Macmillan.

Simon Schama (2000), A History of Britain: The Fate of Empire, 1776-2000. London: BBC Worldwide.

Peter Catterall et al. (2001), The Penguin Atlas of British & Irish History. London: Penguin Books.


‘Celebrity’ Britain: The Arrival of ‘New Labour’ & Diana’s Demise, 1994-99.   Leave a comment

The Advent of Brown & Blair:


Tony Blair was far more of an establishment figure than his mentor John Smith, or his great political 'friend' and future rival, Gordon Brown. He was the son of a Tory lawyer and went to preparatory school in Durham and then to a fee-paying boarding school in Edinburgh. He then went 'up' to Oxford, becoming a barrister and joining the Labour Party before he fell in love with a young Liverpudlian socialist called Cherie Booth, who sharpened his left-wing credentials before he became an MP at the 1983 General Election, winning a safe Labour seat in the north-east of England. Once in the Commons, he quickly fell in with Gordon Brown, another new MP, who was much that Blair was not. Brown was a tribal Labour Party man from a family which was strongly political and had rarely glimpsed the English Establishment, even the middle ranks from which Blair sprang. He had been Scotland's best-known student politician and a player in Scottish Labour politics from the age of twenty-three, followed by a stint in television. Yet the two men had their Christian beliefs in common, Anglo-Catholic in Blair's case and Presbyterian in Brown's. Most importantly, they were both deeply impatient with the state of the Labour Party. For seven or eight years they seemed inseparable, sharing a small office together. Brown tutored Blair in some of the darker arts of politics, while Blair explained the thinking of the English metropolitan and suburban middle classes to Brown. Together they made friends with Westminster journalists, both maturing as performers in the Commons, and together they worked their way up the ranks of the shadow cabinet.

After the 1992 defeat, Blair made a bleak public judgement about why Labour had lost so badly. The reason was simple: Labour has not been trusted to fulfil the aspirations of the majority of people in a modern world. As shadow home secretary he began to put that right, promising to be tough on crime and tough on the causes of crime. He was determined to return his party to the common-sense values of Christian Socialism, also influenced by the mixture of socially conservative and economically liberal messages used by Bill Clinton and his ‘New Democrats’. So too was Gordon Brown but as shadow chancellor, his job was to demolish the cherished spending plans of his colleagues. Also, his support for the ERM made him ineffective when Major and Lamont suffered their great defeat. By 1994, the Brown-Blair relationship was less strong than it had been, but they visited the States together to learn the new political style of the Democrats which, to the advantage of Blair, relied heavily on charismatic leadership. Back home, Blair pushed Smith to reform the party rulebook, falling out badly with him in the process. Media commentators began to tip Blair as the next leader, and slowly but surely, the Brown-Blair relationship was turning into a Blair-Brown one.

002 (2)

In the days following the sudden death of the Labour leader, John Smith (pictured right), Tony Blair decided almost immediately to run as his replacement, while Gordon Brown hesitated, perhaps more grief-stricken. But he had assumed he would eventually inherit the leadership, and was aghast when he heard of Blair’s early declaration. There were at least ten face-to-face confrontations between the two men, in London and Edinburgh. In the opinion polls, Blair was shown to be more popular, and he had the backing of more MPs as well as that of the press. Crucial to Blair’s case was his use of received pronunciation which, after Neil Kinnock and John Smith’s heavily accented English, would reassure those more prejudiced parts of the Kingdom which were the main battlegrounds for Labour, and in which Celtic tones were not perhaps as appreciated as they might be nowadays. They were alright when heard from actors and BBC presenters, but they made politicians seem more ‘peripheral’. Brown had a deeper knowledge of the Labour movement and broader support among the trade unions, however, and had also thought through his policy agenda for change in greater detail. Given the vagaries of Labour’s electoral college system, it is impossible to judge, even now, what might have happened had the ‘young English hart’ locked horns with the ‘tough Scottish stag’, but they agreed at the time that it would be disastrous for them to fight each other as two ‘modernisers’ since Brown would have to attack Blair from the left and the unions would then demand their tribute from him if he won.

So the two men came to a deal, culminating in a dinner at a 'chic' Islington restaurant. The outcome is still a matter of some dispute, but we know that Blair acknowledged that Brown, as Chancellor in a Labour government, would have complete authority over a wide range of policy which he would direct through the Treasury, including the 'social justice' agenda. But it is unlikely that he would have been so arrogant as to agree, as some have suggested, that he would hand over the premiership to Brown after seven years. After all, at that time Labour was still three years away from winning its first term, and not even the sharpest crystal ball could have projected a second term at that juncture. The most significant result of their dinner-table deal was that, following all the battles between Tory premiers and chancellors of the then recent and current Conservative governments, Brown's Treasury would become a bastion for British home affairs, while Blair was left to concentrate on foreign policy largely unimpeded, with all the tragic consequences that are now familiar with the benefit of twenty years' hindsight.

Team Tony & ‘Blair’s Babes’:

When it arrived, the 1997 General Election demonstrated just what a stunningly efficient and effective election-winning team Tony Blair led, comprising those deadly masters of spin, Alastair Campbell and Peter Mandelson. 'New Labour', as it was now officially known, won 419 seats, the largest number ever for the party and comparable only with the seats won by the National Government of 1935. Its Commons majority was also a modern record, 179 seats, thirty-three more than Attlee's landslide majority of 1945. The swing of ten per cent from the Conservatives was another post-war record, roughly double that which the 1979 Thatcher victory had produced in the opposite direction. But the turn-out was very low: at seventy-one per cent, it was the lowest since 1935. Labour had won a famous victory but nothing like as many actual votes as John Major had won five years earlier. Yet Blair's party also won heavily across the south and in London, in parts of Britain which it had been unable to reach or represent in recent times.

As the sun came up on a jubilant, celebrating Labour Party returning to power after an eighteen-year absence, there was a great deal of Bohemian rhapsodizing about a new dawn for Britain. Alastair Campbell had assembled crowds of party workers and supporters to stand along Downing Street waving union flags as the Blairs strode up to claim their victory spoils. Briefly, at least, it appeared that the whole country had turned out to cheer the champions. In deepest Lib-Con 'marginal' Somerset, many of us had been up all night, secretly sipping our Cava in front of the incredible scenes unfolding before our disbelieving eyes, and when the results came in from Basildon and Birmingham Edgbaston (my first constituency at the age of eighteen, when it had already been a safe seat for the Tory matron Jill Knight for at least a decade), we were sure that this would indeed be a landslide victory, even if we had had to vote for the Liberal Democrats in the West Country just to make sure that there was no way back for the Tories. The victory was due to a small group of self-styled modernisers who had seized the Labour Party and made it a party of the 'left and centre-left', at least for the time being, though by the end of the following thirteen years, and after two more elections, they had taken it further to the right than anyone expected on that balmy early summer morning. At the time, there was no room for cynicism amid all the euphoria: Labour was rejuvenated, and that was all that mattered.

A record number of women were elected to Parliament, 119, of whom 101 were Labour MPs, the so-called 'Blair's babes'. Despite Britain having been one of the first countries in the world to have a female prime minister, in 1987 women had made up just 6.3% of MPs in the UK, compared with 10% in Germany and about a third in Norway and Sweden. Only France came below the UK, with 5.7%.

Before the large group of women MPs joined her in 1997, Margaret Hodge (pictured below, c.1992, and right, c.2015) had already become MP for Barking in a 1994 by-election, following the death of the Labour MP Jo Richardson. While still a new MP, Hodge endorsed the candidature of Tony Blair, a former Islington neighbour, for the Labour Party leadership, and was appointed Junior Minister for Disabled People in 1998. Before entering the Commons, she had been Leader of Islington Council and had not been short of invitations from constituencies to stand in the 1992 General Election. Given that she is now referred to as a 'veteran MP', it is interesting to note that she turned these offers down, citing her family commitments:

002

“It’s been a hard decision; the next logical step is from local to national politics and I would love to be part of a Labour government influencing change. But it’s simply inconsistent with family life, and I have four children who mean a lot to me. 

“It does make me angry that the only way up the political ladder is to work at it twenty-four hours a day, seven days a week. That’s not just inappropriate for a woman who has to look after children or relatives, it’s inappropriate for any normal person.

“The way Parliament functions doesn’t attract me very much. MPs can seem terribly self-obsessed, more interested in their latest media appearance than in creating change.” 

003


Patricia Hewitt (pictured above, in 1992, and more recently, right) had first begun looking for a seat in the 1970s, when she was general secretary of the National Council for Civil Liberties (NCCL). She later commented that …looking for a seat takes an enormous amount of time, and money too, if you're travelling a lot. Eventually, she was chosen to fight Leicester East in 1983, a contest which she lost by only nine hundred votes to the Conservative in what was then a relatively safe Tory seat. She later recalled driving up to Leicester on two evenings every week:

“I was planning to have a child after the elections – looking back I don’t know I imagined I was going to cope if Labour had won the seat… Even without children, I was leading such a pressured life – and my partner was doing the same as a Labour councillor – that it did put a strain on our relationship.”

She then became Neil Kinnock's press and broadcasting secretary. In this role, she was a key player in the first stages of the 'modernisation' of the Labour Party and, along with Clive Hollick, helped set up the Institute for Public Policy Research, serving as its deputy director from 1989 to 1994. By the time of the 1992 General Election she had two small children, so she decided not to look for a seat. Following Labour's defeat in 1992, Hewitt was asked by the new Labour Leader, John Smith, to help establish the Commission on Social Justice, of which she became deputy chair. She then became head of research with Andersen Consulting, remaining in the post from 1994 to 1997. Hewitt was finally elected to the House of Commons as the first female MP for Leicester West at the 1997 General Election, following the retirement of Greville Janner. She was elected with a majority of 12,864 and remained the constituency MP until stepping down in 2010.

001

Mary Kaldor (pictured right in the 1980s, and below in 2000), by contrast, never became an MP: one of the 'loves' Labour lost. A British academic, currently Professor of Global Governance at the London School of Economics, where she is also the Director of the Civil Society and Human Security Research Unit, she was the daughter of Nicholas Kaldor, the exiled Hungarian economist who became an adviser to Harold Wilson in the 1960s. In the nineties, she was a senior research fellow at the Science Policy Research Unit at Sussex and a former foreign policy adviser to the Labour Party. She was shortlisted for Hackney and Dulwich in 1981, attending masses of meetings, many of them boring, at which she was endlessly having to be nice to people. Her youngest child was two years old at the time, and she was therefore ambivalent about the idea of becoming an MP:

“I was very well-equipped with baby minders and a nice understanding husband, but what on earth is the point of having children if you’re not going to see them?

“Building links with eastern Europe through the peace movement was more exciting than anything I could ever have done as an MP … (which seemed) entirely about competitiveness and being in the limelight, giving you no time to think honestly about your political views.”


In 1999, Kaldor supported international military intervention over Kosovo on humanitarian grounds, calling for NATO ground forces to follow aerial bombardment in an article for The Guardian. I have written about the war in Kosovo in a separate article in this series. Significantly, however, by the end of the next decade Kaldor had lost faith in the principle and practice of humanitarian intervention, telling the same paper:

The international community makes a terrible mess wherever it goes…

It is hard to find a single example of humanitarian intervention during the 1990s that can be unequivocally declared a success. Especially after Kosovo, the debate about whether human rights can be enforced through military means is ever more intense.

Moreover, the wars in Afghanistan and Iraq, which have been justified in humanitarian terms, have further called into question the case for intervention.

002

Blair needed the support and encouragement of admirers and friends who would coax and goad him. There was Mandelson, the brilliant but temperamental former media boss, who had now become an MP. Although adored by Blair, he was so mistrusted by other members of the team that Blair's inner circle gave him the codename 'Bobby' (as in Bobby Kennedy). Alastair Campbell, Blair's press officer and attack-dog, is pictured above in a characteristic 'pose'. A former journalist and natural propagandist, he had helped orchestrate the campaign of mockery against Major. Then there was Anji Hunter, the contralto charmer who had known Blair as a young rock-singer and was his best hotline to middle England. Derry Irvine was a brilliant Highlands lawyer who had first found a place in his chambers for Blair and Booth; he advised on constitutional change and became Lord Chancellor in due course. These people, with the Brown team working in parallel, formed the inner core. The young David Miliband, son of a well-known Marxist philosopher, provided research support. Among the MPs who were initially close were Marjorie 'Mo' Mowlam and Jack Straw, but the most striking aspect of 'Tony's team' was how few elected politicians it included.

The small group of people who put together the New Labour ‘project’ wanted to find a way of governing which helped the worse off, particularly by giving them better chances in education and to find jobs, while not alienating the mass of middle-class voters. They were extraordinarily worried by the press and media, bruised by what had happened to Kinnock, whom they had all worked with, and ruthlessly focused on winning over anyone who could be won. But they were ignorant of what governing would be like. They were able to take power at a golden moment when it would have been possible to fulfil all the pledges they had made. Blair had the wind at his back as the Conservatives would pose no serious threat to him for many years to come. Far from inheriting a weak or crisis-ridden economy, he was actually taking over at the best possible time when the country was recovering strongly but had not yet quite noticed that this was the case. Blair had won by being ruthless, and never forgot it, but he also seemed not to realise quite what an opportunity ‘providence’ had handed him.

Cool Britannia and the Celebrity Princess:

001

Above: a page from a recent school text.

Tony Blair arrived in power in a country with a revived fashion for celebrity, offering a few politicians new opportunities but at a high cost. It was not until 1988 that the full shape of modern celebrity culture became apparent. That year saw the launch of Hello!, the first of the truly modern glossy glamour magazines. Its successful formula was soon copied by OK! from 1993, and many other magazines followed suit, to the point where yards of coloured 'glossies' filled the newsagents' shelves in every town and village in the country. Celebrities were paid handsomely for being interviewed and photographed in return for coverage which was always fawningly respectful and never hostile. The rich and famous, no matter how flawed in real life, were able to shun the mean-minded sniping of the 'gutter press', the tabloid newspapers. In the real world, the sunny, airbrushed world of Hello! was inevitably followed by divorces, drunken rows, accidents and ordinary scandals. But people were happy to read good news about these beautiful people, even if they knew that there was more to their personalities and relationships than met the eye. In the same year that Hello! went into publication, ITV also launched the most successful of the daytime television shows, This Morning, hosted from Liverpool by Richard Madeley and Judy Finnigan, providing television's celebrity breakthrough moment.

This celebrity fantasy world, which continued to open up in all directions throughout the nineties, served to re-emphasise to alert politicians, broadcasting executives and advertisers the considerable power of optimism. The mainstream media in the nineties was giving the British an unending stream of bleakness and disaster, so millions tuned in and turned over to celebrity. That they did so in huge numbers did not mean that they thought that celebrities had universally happy lives. And in the eighties and nineties, no celebrity gleamed more brightly than the beautiful yet troubled Princess Diana. For fifteen years she was an ever-present figure: an aristocratic girl whose childhood had been blighted by her parents' divorce, and whose fairytale marriage in 1981 found her pledging her life to a much older man who shared few of her interests and did not even seem to be truly in love with her. Just as the monarchy had gained from its marriages, especially the filmed-for-television romance, engagement and wedding of Charles and Diana, the latter attracting a worldwide audience of at least eight hundred million, so it lost commensurately from the failure of those unions.

050

Above: Hello! looks back on the 1981 Royal Wedding at the time of the 2011 one.

Diana quickly learned how to work the crowds and to seduce the cameras like Marilyn Monroe. By the end of the eighties, she had become a living fashion icon. Her eating disorder, bulimia, was one suffered by a growing number of young women and teenage girls from less privileged homes. When AIDS was in the news, she hugged its victims to show that it was safe to do so, and she went on to campaign for a ban on the use of land-mines. The slow disintegration of her marriage transfixed Britain, as Diana moved from china-doll debutante to painfully thin young mother, to an increasingly charismatic and confident public figure, surprising her husband, who had always assumed she would remain in his shadow. After the birth of their second son, Harry, in 1984, Charles and Diana's marriage was visibly failing.

When rumours spread of her affairs, they no longer had the moral impact that they might have had in previous decades. By the nineties, Britain was a divorce-prone country, in which ‘what’s best for the kids’ and ‘I deserve to be happy’ were phrases regularly heard in suburban kitchen-diners. Diana was not simply a pretty woman married to a king-in-waiting but someone people felt, largely erroneously, would understand them. There was an obsessive aspect to the admiration of her, something the Royal Family had not seen before, and its leading members found it very uncomfortable and even, at times, alarming. They were being challenged as living symbols of Britain’s ‘family values’ and found wanting, just as John Major’s government would be hoist by its own petard when its ‘Back to Basics’ campaign was overwhelmed by an avalanche of sexual and financial scandals.

By the mid-1990s, the monarchy was looking shaky, perhaps even mortal. The strain of being at once a ceremonial and a familial institution was proving too great. The year 1992, referred to by the Queen as her ‘annus horribilis’ in her Guildhall speech that November, first saw the separation of the other royal couple, Andrew and Sarah, followed by a major fire at Windsor Castle. The journalist Andrew Morton claimed to tell Diana’s True Story in a book which described suicide attempts, blazing rows, her bulimia and her growing certainty that Prince Charles had resumed an affair with his old love, Camilla Parker-Bowles, something he later confirmed in a television interview with Jonathan Dimbleby. In December, John Major announced the separation of Charles and Diana to the House of Commons. There was a further blow to the Royal Family’s prestige in 1994, when it was announced that the royal yacht Britannia, the floating emblem of the monarch’s global presence, would be decommissioned.

046

 Above: Prince William with his mother, c. 1994.

Then came the revelatory 1995 interview on BBC TV’s Panorama programme between Diana and Martin Bashir. Breaking every taboo left in Royal circles, she freely discussed the breakup of her marriage, claiming that there were ‘three of us’ in it, attacked the Windsors for their cruelty and promised to be ‘a queen of people’s hearts’. Finally divorced in 1996, she continued her charity work around the world and began a relationship with Dodi al-Fayed, the son of the owner of Harrods, Mohammed al-Fayed. To many in the establishment, she was a selfish, unhinged woman who was endangering the monarchy. To many millions more, however, she was more valuable than the formal monarchy, her readiness to share her pain in public making her even more fashionable. She was followed all around the world, her face and name selling many papers and magazines. By the late summer of 1997, Britain had two super-celebrities, Tony Blair and Princess Diana.

It was therefore grimly fitting that Tony Blair’s most resonant words as Prime Minister, the words which brought him to the height of his popularity, came on the morning when Diana was killed in a car crash, together with Dodi, in a Paris underpass. Blair was woken from a deep sleep at his constituency home, first to be told about the accident, and then to be told that Diana had died. Deeply shocked and worried about what his proper role should be, Blair spoke first to Campbell and then to the Queen, who told him that neither she nor any other senior member of the Royal Family would be making a statement. He decided, therefore, that he had to say something himself. Later that Sunday morning, standing in front of his local parish church, he spoke words which were transmitted live around the world:

“I feel, like everyone else in this country today, utterly devastated. Our thoughts and prayers are with Princess Diana’s family, in particular her two sons, her two boys – our hearts go out to them. We are today a nation in a state of shock…

“Her own life was often sadly touched by tragedy; she touched the lives of so many others in Britain and throughout the world with joy and with comfort. How many times shall we remember her, in how many different ways, with the sick, the dying, with children, with the needy? With just a look or a gesture that spoke so much more than words, she would reveal to all of us the depth of her compassion and her humanity.

“People everywhere, not just here in Britain, kept faith with Princess Diana. They liked her, they loved her, they regarded her as one of the people. She was the People’s Princess, and that is how she will stay, how she will remain in our hearts and our memories for ever.”

Although these words seem, more than twenty years on, reminiscent of past tributes paid to religious leaders, at the time they were widely welcomed and assented to. They were the sentiments of one natural charismatic public figure to another. Blair regarded himself as the people’s Prime Minister, leading the people’s party, beyond left and right, beyond faction or ideology, with a direct line to the people’s instincts. After his impromptu eulogy, his approval rating rose to over ninety per cent, a figure not normally witnessed in democracies. Blair and Campbell then paid their greatest service to the ancient institution of the monarchy itself. The Queen, still angry and upset about Diana’s conduct and concerned for the welfare of her grandchildren, wanted a quiet funeral and to remain at Balmoral, away from the scenes of public mourning in London. However, this was potentially disastrous for her public image. There was a strange mood in the country deriving from Diana’s charisma, which Blair had referenced in his words at Trimdon. His words had seemed to suggest that Diana was a saint, and a sub-religious hysteria responded to the thought. People queued to sign a book of condolence at St James’ Palace, rather than signing it online on the website of the Prince of Wales. Those queuing even reported supernatural appearances of the dead Princess’ image. By contrast, the lack of any act of public mourning by the Windsors and the suggestion of a quiet funeral seemed to confirm Diana’s televised criticisms of the Royal Family as cold, if not cruel, towards her.

001

In particular, the Queen was criticised for following protocol, which prohibited the flying of flags at Buckingham Palace when she was not in residence, rather than fulfilling the deep need of a grief-stricken public to see the Union flag flying there at half-mast. According to another protocol, flags were only flown at half-mast on the deaths of the monarch or their immediate blood relatives. But the crown lives or dies by such symbolic moments, and the Queen relented. Also, with Prince Charles’ full agreement, Blair and his aides pressed the Palace into accepting, first, that there would have to be a huge public funeral so that the public could express their grief, and second, that the Queen should return to London. She did, just in time to quieten the genuine and growing anger about her perceived attitude towards Diana. This was a generational problem as well as a class one. The Queen had been brought up in a land of buttoned lips, stoicism and private grieving. She now reigned over a country which expected and almost required exhibitionism. For some years, the deaths of children or the scenes of fatal accidents had been marked by little shrines of cellophane-wrapped flowers, soft toys and cards. In the run-up to Diana’s funeral, parts of central London seemed almost Mediterranean in their public grieving. There were vast mounds of flowers, people sleeping out, holding up placards and weeping in the streets, strangers hugging each other.

The immense outpouring of public emotion in the weeks that followed seemed both to overwhelm the more traditional devotion to the Queen herself and to her immediate family, and to stand apart from it. The crisis was defused by a live, televised speech she made from the Palace, striking in its informality and its obviously sincere expression of personal sorrow. As Simon Schama has put it,

The tidal wave of feeling that swept over the country testified to the sustained need of the public to come together in a recognizable community of sentiment, and to do so as the people of a democratic monarchy.

003

The funeral itself was like no other before, bringing the capital to a standstill. In Westminster Abbey, campaigners stood alongside aristocrats, entertainers with politicians and rock musicians with charity workers. Elton John performed a hastily rewritten version of ‘Candle in the Wind’, originally his lament for Marilyn Monroe, now dedicated to ‘England’s Rose’, and Princess Diana’s brother, Earl Spencer, made a half-coded attack from the pulpit on the Windsors’ treatment of his sister. This was applauded when it was relayed outside, and clapping was heard in the Abbey itself. Diana’s body was driven to her last resting place at the Spencers’ ancestral home of Althorp in Northamptonshire. Nearly a decade later, and following many wild theories circulated through cyberspace which reappeared regularly in the press, an inquiry headed by a former Metropolitan Police commissioner concluded that she had died because the driver of her car was drunk and was speeding in order to throw off pursuing ‘paparazzi’ photographers. The Queen recovered her standing after her live broadcast about her wayward former daughter-in-law. She would later rise again in public esteem, coming to be seen as one of the most successful monarchs for centuries and the longest-serving ever. A popular film about her, including a sympathetic portrayal of these events, sealed this verdict.

012

HM Queen Elizabeth II in 2001.

Tony Blair never again quite captured the mood of the country as he did in those sad late summer days. It may be that his advice and assistance to the Queen in 1997 were as vital to her as they were, in the view of Palace officials, thoroughly impertinent. His instinct for popular culture when he arrived in power was certainly uncanny. The New Age spiritualism which came out into the open when Diana died was echoed among Blair’s Downing Street circle. What other politicians failed to grasp, and what he did, was the power of optimism expressed in the glossy world of celebrity, and the willingness of people to forgive their favourites not just once, but again and again. One of the negative longer-term consequences of all this was that charismatic celebrities discovered that, if they apologised and bared a little of their souls in public, they could get away with most things short of murder. For politicians, even charismatic ones like Blair, life would prove a little tougher, and the electorate would be less forgiving of oft-repeated mistakes.

(to be continued).

Posted October 22, 2018 by TeamBritanniaHu in Affluence, Agriculture, BBC, Belfast Agreement, Belgium, Birmingham, Britain, Brussels, Christian Faith, Christianity, Church, Conquest, Conservative Party, devolution, Europe, European Economic Community, European Union, France, History, Integration, Ireland, Irish history & folklore, Journalism, Labour Party, Margaret Thatcher, Migration, Millenarianism, Monarchy, Narrative, nationalism, Nationality, New Labour, Population, Respectability, Scotland, Uncategorized, West Midlands


The Other Side of the Eighties in Britain, 1983-1988: The Miners and The Militants.   Leave a comment

Labour – Dropping the Donkey Jacket:

From 1980 to 1983, Michael Foot’s leadership had saved the Labour Party from splitting into two, but in all other respects, it was a disaster. He was too old, too decent, too gentle to take on the hard left or to modernise his party. Foot’s policies were those of a would-be parliamentary revolutionary detained in the second-hand bookshops in Hay-on-Wye. I enjoyed this experience myself in 1982, with a minibus full of bookish ‘revolutionaries’ from Cardiff, who went up there, as it happened, via Foot’s constituency. When roused, which was often, his Cromwellian hair would flap across a face contorted with passion, his hands would whip around excitedly and denunciations would pour forth from him with a fluency ‘old Noll’ would have envied. During his time as leader, he was in his late sixties, and would have been PM at seventy, had he won the 1983 General Election, which, of course, was never a remote possibility. Unlike Thatcher, he was contemptuous of the shallow presentational tricks demanded by television, and he could look dishevelled, being famously denounced for wearing a ‘donkey jacket’, in reality, a Burberry-style woollen coat, at the Remembrance Service at the Cenotaph. But he was more skilled than anyone I saw then or have seen since, in whipping up the socialist faithful in public meetings, or in finger-stabbing attacks on the Tory government in the House of Commons, both in open debates and questions to the PM. He would have been happier communing with Jonathan Swift and my Gulliver forebears in Banbury than attempting to operate in a political system which depended on television performances, ruthless organisation and managerial discipline. He was a political poet in an age of prose.

Nobody in the early eighties could have reined in the party’s wilder members; Foot did his best but led Labour to its worst defeat in modern times, on the basis of a hard-left, anti-Europe, anti-nuclear, pro-nationalisation manifesto famously described by Gerald Kaufman as the longest suicide note in history. Kaufman had also urged Foot to stand down before the election. It was a measure of the affection felt for him that his ‘swift’ retirement after the defeat was greeted with little recrimination. Yet it also meant that when Neil Kinnock won the subsequent leadership election, he had a mandate for change no previous Labour leader had enjoyed. He won with seventy-one per cent of the electoral college votes, against nineteen per cent for Roy Hattersley. Tony Benn was out of Parliament, having lost his Bristol seat, and so could not stand as the standard-bearer of the hard left. Kinnock, a Tribunite left-winger elected after a series of blistering campaign speeches, had, like Foot, advocated the unilateral abandonment of all Britain’s nuclear weapons, believed in nationalisation and planning and wanted Britain to withdraw from the European Community. A South Wales MP from the same Bevanite stock as Foot, he also supported the abolition of private medicine and the repeal of the Tory trade union reforms. To begin with, the only fights he picked with the Bennites were over the campaign to force Labour MPs to undergo mandatory reselection, which had handed a noose to local Militant activists. Yet after the chaos of the 1983 campaign, he was also sure that the party was in need of radical remedies.

003

To win power, Labour needed to present itself better in the age of the modern mass media. Patricia Hewitt (pictured above), known for her campaigning on civil liberties, joined Kinnock’s new team. She had been chosen to fight Leicester East in the 1983 Election but was unsuccessful. In her new role, she began trying to control interviews and to place the leader in more flattering settings than those Foot had found himself in. Kinnock knew how unsightly ‘old’ Labour had looked to the rest of the country and was prepared to be groomed. He gathered around him a ‘Pontypool front row’ of tough, aggressive heavyweights, including Charles Clarke, the former communist NUS leader, and John Reid, another former communist and Glaswegian backbench bruiser. Hewitt herself and Peter Mandelson, grandson of Herbert Morrison and Labour’s side-stepping future director of communications, led the three-quarter line, with Kinnock himself as the able scrum-half. Kinnock was the first to flirt with the once-abhorred world of advertising and to seek out the support of pro-Labour pop artists such as Tracey Ullman and Billy Bragg. In this, he was drawing on a long tradition on the Welsh left, from Paul Robeson to the Hennessys. He smartened up his own style, curtailing the informal mateyness which had made him popular among the ‘boyos’, and introduced a new code of discipline in the shadow cabinet.

004

Neil Kinnock attacking the Militant Tendency at the party conference in 1985.

In the Commons, he tried hard to discomfit Thatcher at her awesome best, which was difficult and mostly unsuccessful. The mutual loathing between them was clear for all to see, and as Thatcher’s popularity began to decline in 1984, Labour’s poll ratings slowly began to improve. But the party harboured a vocal minority of revolutionaries of one kind or another. They included not only the long-term supporters of Tony Benn, like Jeremy Corbyn, but also Arthur Scargill and his brand of insurrectionary syndicalism; the Trotskyist Militant Tendency, a front for the Revolutionary Socialist League, which had been steadily infiltrating the party since the sixties; and assorted hard-left local councillors, like the Militant member Derek Hatton in Liverpool, who were determined to defy Thatcher’s government, no matter how big its democratic mandate, by various ‘ultra vires’ and illegal stratagems. Kinnock dealt with them all. Had he not done so, New Labour would never have happened; yet he himself was a passionate democratic socialist whose own politics were well to the left of the country.

Neil Kinnock was beginning a tough journey towards the centre-ground of British politics, which meant leaving behind people who sounded much like his younger self. On this journey, much of his natural wit and rhetoric would be silenced. He had created his leadership team as if it were a rugby team, involved in a confrontational contact sport against opponents who were fellow enthusiasts, but with their own alternative strategy. He found that political leadership was more serious, drearier and nastier than rugby. And week after week, he was confronting in Thatcher someone whose principles had been set firm long before and whose politics clearly and consistently expressed those principles on the field of play. Yet, like a Welsh scrum-half, he was always on the move, always having to shadow and shade, to side-step and shimmy, playing the ball back into the scrum or sideways to his three-quarters rather than kicking it forward. The press soon dubbed him ‘the Welsh windbag’, due to his long, discursive answers in interviews.

001 (3)

The first and toughest example of what he was up against came with the miners’ strike. Neil Kinnock and Arthur Scargill (above) had already shown their loathing for each other during the mainstream leadership’s battles with the Bennites. The NUM President was probably the only person on the planet that Kinnock hated more than Thatcher. He distrusted Scargill’s aims, despised his tactics and realised early on that he was certain to fail. In this, he shared the views of the South Wales NUM, who had already forced a U-turn on closures from an unprepared Thatcher in 1981. Yet they, and he, had to remain true to their own traditions and heritage. They both found themselves in an embarrassing situation, but more importantly, they realised that, like it or not, they were in an existential struggle. As the violence spread, the Conservatives in the Commons and their press continually goaded and hounded him to denounce the use of ‘flying pickets’ and to praise the police. He simply could not do so, as so many on his own side had experienced the violence of the police, or heard about it from those who had. For him to attack the embattled trade union would be seen as the ultimate betrayal by a Labour leader. He was caught between the rock of Thatcher and the hard place of Scargill. In the coalfields, even in South Wales, he was shunned on the picket lines as the miner’s son too “frit”, in Thatcher’s favourite phrase, to come to the support of the miners in their hour of need. Secretly, however, there was some sympathy for his impossible situation among the leadership of the South Wales NUM. Kinnock at least managed to avoid fusing Labour and the NUM in the minds of many Labour voters, ensuring that Scargill’s ultimate, utter defeat was his alone. But this lost year destroyed his early momentum and stole his hwyl, his Welsh well-spring of ‘evangelical’ socialist spirit.

The Enemy Within?:

002

Above: Striking Yorkshire miners barrack moderate union leaders in Sheffield.

The first Thatcher government had been dominated by the Falklands War; the second was dominated by the miners’ strike. Spurred on by ‘the spirit of the Falklands’, the government took a more confrontational attitude towards the trade unions after the 1983 General Election. This year-long battle, 1984-5, was the longest strike in British history, the most bitter, bloody and tragic industrial dispute since the General Strike and the six-month Miners’ Lock-out of 1926. The strike was fought out amid scenes of mass picketing and running battles between police and miners, and it ended in the total defeat of the miners, followed by the end of deep coal-mining in Britain. In reality, the strike simply accelerated the continuing severe contraction in the industry which had begun in the early eighties and which the South Wales NUM had successfully resisted in what turned out, however, to be a Pyrrhic victory. By 1984, the government had the resources, the popular mandate and the dogged determination to withstand the miners’ demands. The industry had all but vanished from Kent, while in Durham two-thirds of the pits were closed. The pits were often the only real source of employment for local communities, so the social impact of closures proved devastating. In the Durham pit villages, the entire local economy was crippled and the miners’ housing estates gradually became the ghost areas they remain today.

001

The government had little interest in ensuring the survival of the industry, with its troublesome and well-organised union which had already won a national strike against the Heath government a decade earlier. For the Thatcher government, the closures resulting from the defeat of the strike were a price it was willing to pay in order to teach bigger lessons. Later, the Prime Minister of the time reflected on these:

What the strike’s defeat established was that Britain could not be made ungovernable by the Fascist Left. Marxists wanted to defy the law of the land in order to defy the laws of economics. They failed and in doing so demonstrated just how mutually dependent the free economy and a free society really are.

It was a confrontation soaked in history on all sides. For the Tories, it was essential revenge for Heath’s humiliation, a score they had long been eager to settle; Margaret Thatcher spoke of Arthur Scargill and the miners’ leaders as ‘the enemy within’, as compared with Galtieri, the enemy without. For thousands of traditionally ‘militant’ miners, it was their last chance to end decades of pit closures and save their communities, which were under mortal threat. For their leader Arthur Scargill, it was an attempt to follow Mick McGahey in pulling down the government and winning a class war. Unlike the former, more moderate leaders, he was no more interested than the government in the details of pay packets, or in a pit-by-pit review to determine which pits were truly uneconomic. He was determined to force the government, in Thatcher’s contemptuous phrase, to pay for mud to be mined rather than see a single job lost.

The Thatcher government had prepared more carefully than Scargill. Following the settlement with the South Wales NUM, the National Coal Board (NCB) had spent the intervening two years working with the Energy Secretary, Nigel Lawson, to pile up supplies of coal at the power stations; stocks had steadily grown, while consumption and production both fell. Following the riots in Toxteth and Brixton, the police had been retrained and equipped with full riot gear without which, ministers later confessed, they would have been unable to beat the pickets. Meanwhile, Thatcher had appointed a Scottish-born Australian, Ian MacGregor, to run the NCB. He had a fierce reputation as a union-buster in the US and had been brought back to Britain to run British Steel where closures and 65,000 job cuts had won him the title ‘Mac the Knife’. Margaret Thatcher admired him as a tough, no-nonsense man, a refreshing change from her cabinet, though she later turned against him for his lack of political nous. His plan was to cut the workforce of 202,000 by 44,000 in two years, then take another twenty thousand jobs out. Twenty pits would be closed, to begin with. When he turned up to visit mines, he was abused, pelted with flour bombs and, on one occasion, knocked to the ground.

Arthur Scargill was now relishing the coming fight as much as Thatcher. In the miners’ confrontation with Heath, Scargill had led the flying pickets at the gates of the Saltley coke depot outside Birmingham. Some sense of his revolutionary ‘purity’, combined with his characteristic Yorkshire bluntness, comes from an exchange he had with Dai Francis, the Welsh miners’ leader at that time. Scargill had called Francis to ask for Welsh pickets to go to Birmingham and help at the depot. Francis asked when they were needed and Scargill replied:

“Tomorrow, Saturday.”

“But Wales are playing Scotland at Cardiff Arms Park.”

“But Dai, the working class are playing the ruling class at Saltley.”

009

Many found Scargill inspiring; many others found him scary. Like Francis, he had been a Communist, but unlike Dai (pictured above, behind the poster, during the 1972 strike), he retained hard-line Marxist views and a penchant for denouncing anyone who disagreed with him. Kim Howells, also a former Communist and an officer of the South Wales NUM, gained a sense of Scargill’s megalomania when, just prior to the 1984-5 strike, he visited his HQ in Barnsley, already known as ‘Arthur’s Castle’. Howells, a historian of the Welsh Labour movement who later became an MP and a New Labour minister, was taken aback to find him sitting at a Mussolini-style desk with a great space in front of it. Behind him was a huge painting of himself on the back of a lorry, posed like Lenin, urging picketing workers in London to overthrow the ruling class. Howells thought anyone who could put up a painting like that was nuts, and returned to Pontypridd to express his fears to the Welsh miners:

And of course the South Wales executive almost to a man agreed with me. But then they said, “He’s the only one we’ve got, see, boy.  The Left has decided.”

Scargill had indeed been elected by a huge margin and had set about turning the NUM’s once-moderate executive, formerly led by Joe Gormley, into a militant group. The Scottish Miners’ leader, Mick McGahey, although older and wiser than his President, was his Vice-President. Scargill had been ramping up the rhetoric for some time. He had told the NUM Conference in 1982, …

If we do not save our pits from closure then all our other struggles will become meaningless … Protection of the industry is my first priority because without jobs all our other claims lack substance and become mere shadows. Without jobs, our members are nothing …

Given what was about to happen to his members’ jobs as a result of his uncompromising position in the strike, there is a black irony in those words. He insisted that no pits should be closed on economic grounds, even if the coal was exhausted, and that more investment would always find more coal; from his point of view, the losses were irrelevant. He made sure that confrontation would not be avoided. An alternative strategy put forward by researchers for the South Wales NUM was to expose the NCB’s economic arguments, along with the fact that it was using the Miners’ Pension Fund to invest in the production of cheap coal in Poland and South Africa. Its definition of what was ‘economic’ in Britain rested on the comparative cost of importing this coal from overseas. If the NCB had invested these funds at home, the British pits would not have appeared as ‘uneconomic’ as it claimed. But Scargill was either not clever enough to deploy these arguments or too determined to pursue the purity of his brand of revolutionary syndicalism, or both.

The NUM votes which allowed the strike to start covered both pay and closures, but from the start Scargill emphasised the closures. To strike to protect jobs, particularly other people’s jobs, in other people’s villages and other countries’ pits, gave the confrontation an air of nobility and sacrifice which a mere wages dispute would not have enjoyed. But national wage disputes had, for more than sixty years, been about arguments over the ‘price of coal’ and the relative difficulties of extracting it from a variety of seams in very different depths across the various coalfields. Neil Kinnock, the son and grandson of Welsh miners, found it impossible to condemn Scargill’s strategy without alienating support for Labour in its heartlands. He did his best to argue the economics of the miners’ case, and to condemn the harshness of the Tory attitude towards them, but these simply ran parallel to polarised arguments which were soon dividing the nation.

Moreover, like Kinnock, Scargill was a formidable organiser and conference-hall speaker, though there was little economic analysis to back up his rhetoric. Yet not even he would be able to persuade every part of the industry to strike. Earlier ballots had shown consistent majorities against striking. In Nottinghamshire, seventy-two per cent of the area’s 32,000 miners voted against striking. The small coalfields of South Derbyshire and Leicestershire were also against. Even in South Wales, half of the NUM lodges failed to vote for a strike. Overall, of the seventy thousand miners who were balloted in the run-up to the dispute, fifty thousand had voted to keep working. Scargill knew he could not win a national ballot, so he decided on a rolling series of locally called strikes, coalfield by coalfield, beginning in Yorkshire, then Scotland, followed by Derbyshire and South Wales. These strikes would merely be approved by the national union. It was a domino strategy; the regional strikes would add up to a national strike, but without a national ballot.

But Scargill needed to be sure the dominoes would fall. He used the famous flying pickets from militant areas to shut down less militant ones. Angry miners were sent in coaches and convoys of cars to close working pits and the coke depots, vital hubs of the coal economy. Without the pickets, who to begin with rarely needed to use violence to achieve their end, far fewer pits would have come out. But after scenes of physical confrontation around Britain, by April 1984 four miners in five were on strike. There were huge set-piece confrontations with riot-equipped police bused up from London or down from Scotland, from Yorkshire to Kent and from Wales to Yorkshire, generally deployed outside their own areas in order to avoid mixed loyalties. As Andrew Marr has written, …

It was as if the country had been taken over by historical re-enactments of civil war battles, the Sealed Knot Society run rampant. Aggressive picketing was built into the fabric of the strike. Old country and regional rivalries flared up, Lancashire men against Yorkshire men, South Wales miners in Nottinghamshire.

The Nottinghamshire miners turned out to be critical, since without them the power stations, even with the mix of nuclear and oil, supplemented by careful stockpiling, might have begun to run short and the government would have been in deep trouble. To Scargill’s dismay, however, other unions refused to come out in sympathy, robbing him of the prospect of a General Strike, and it soon became clear that the NUM had made other errors in its historical re-enactments. Many miners were baffled from the beginning as to why Scargill had opted to strike in the spring, when the demand for energy was relatively low and the stocks at the power stations were not running down at anything like the rate the NUM needed to make its action effective. This was confirmed by confidential briefings from the power workers; it seemed that the government just had to sit out the strike.

In this civil war, the police had the cavalry, while the miners were limited to the late twentieth-century equivalent of Okey’s dragoons at Naseby: their flying pickets, supporting their poor bloody infantry, albeit well-drilled and organised. Using horses, baton charges and techniques learned in the aftermath of the street battles at Toxteth and Brixton, the police defended working miners with a determination which delighted the Tories and alarmed many others, not just the agitators for civil rights. An event which soon became known as the Battle of Orgreave (in South Yorkshire) was particularly brutal, involving ‘Ironside’ charges by mounted police in lobster-pot style helmets into thousands of miners with home-made pikes and pick-axe handles.

The NUM could count on almost fanatical loyalty in coalfield towns and villages across Britain. Miners gave up their cars, sold their furniture, saw their wives and children suffer and lost all they had in the cause of solidarity. Food parcels arrived from other parts of Britain, from France and, most famously, from Soviet Russia. There was a gritty courage and selflessness in mining communities which, even after more than seventy years of struggle, most of the rest of Britain could barely understand. But an uglier side to this particularly desperate struggle also emerged when a taxi-driver was killed taking a working miner to work in Wales. A block of concrete was dropped from a pedestrian bridge onto his cab, an act swiftly condemned by the South Wales NUM.

In Durham, the buses taking other ‘scabs’ to work in the pits were barraged with rocks and stones, as later portrayed in the film Billy Elliot. The windows had to be protected with metal grills. There were murderous threats made to strike-breaking miners and their families, and even trade union ‘allies’ were abused. Norman Willis, the amiable general secretary of the TUC, had a noose dangled over his head when he spoke at one miners’ meeting. This violence was relayed to the rest of the country on the nightly news at a time when the whole nation still watched together. I remember the sense of helplessness I felt watching the desperation of the Welsh miners from my ‘exile’ in Lancashire, having failed to find a teaching post in the depressed Rhondda in 1983. My Lancastrian colleagues were as divided as the rest of the country over the strike, often within themselves as well as from others. In the end, we found it impossible to talk about the news, no matter how much it affected us.

Eventually, threatened by legal action on the part of the Yorkshire miners claiming they had been denied a ballot, the NUM was forced onto the back foot. The South Wales NUM led the calls from within for a national ballot to decide on whether the strike should continue. Scargill’s decision to accept a donation from Colonel Gaddafi of Libya found him slithering from any moral ground he had once occupied. As with Galtieri, Thatcher was lucky in the enemies ‘chosen’ for her. Slowly, month by month, the strike began to crumble and miners began to trail back to work. A vote to strike by pit safety officers and overseers, which would have shut down the working pits, was narrowly avoided by the government. By January 1985, ten months after they had first been brought out, strikers were returning to work at the rate of 2,500 a week, and by the end of February, more than half the NUM’s membership was back at work. In some cases, especially in South Wales, they marched back proudly behind brass bands.

001 (2)

Above: ‘No way out!’ – picketing miners caught and handcuffed to a lamp-post by police.

Scargill’s gamble had gone catastrophically wrong. He has been compared to a First World War general, a donkey sending lions to the slaughter, though at Orgreave and elsewhere he had stood with them too. But the political forces engaged against the miners in 1984 were entirely superior in strength to those at the disposal of the ill-prepared Heath administration of ten years earlier. A shrewder, non-revolutionary leader would not have chosen to take on Thatcher’s government at the time Scargill did, or, having done so, would have found a compromise after the first months of the dispute. Today, there are only a few thousand miners left of the two hundred thousand who went on strike. An industry which had once made Britain into a great industrial power, but was always dangerous, disease-causing, dirty and polluting, finally lay down and died. For the Conservatives, and perhaps, by the end of the strike, for the majority of moderate British people, Scargill and his lieutenants were fighting parliamentary democracy and were, therefore, an enemy which had to be defeated. But the miners of Durham, Derbyshire, Kent, Fife, Yorkshire, Wales and Lancashire were nobody’s enemy. They were abnormally hard-working, traditional people, justifiably worried about losing their jobs and loyal to their union, if not to the stubborn syndicalists in its national leadership.

Out with the Old Industries; in with the New:

On Tyneside and Merseyside, a more general deindustrialisation accompanied the colliery closures. Whole sections of industry, not only coal but also steel and shipbuilding, virtually vanished from many of their traditional areas. Of all the areas of Britain, Northern Ireland suffered the highest level of unemployment, partly because the continuing sectarian violence discouraged investment. In February 1986, there were officially over 3.4 million unemployed, although statistics were manipulated for political reasons and the real figure is a matter of speculation. The socially corrosive effects were felt nationally, manifested in further inner-city rioting in 1985. Inner London was just as vulnerable as Liverpool, a crucial contributory factor being the number of young men of Asian and Caribbean origin who saw no hope of ever entering employment: opportunities were minimal and they felt particularly discriminated against. The term ‘underclass’ was increasingly used to describe those who felt themselves to be completely excluded from the benefits of prosperity.

Prosperity there certainly was, for those who found alternative employment in the service industries. Between 1983 and 1987, about 1.5 million new jobs were created. Most of these were for women, and part-time vacancies predominated. The total number of men in full-time employment fell still further, and many who left manufacturing for the service sector earned much-reduced incomes. The economic recovery that led to the growth of this new employment was based mainly on finance, banking and credit. Little was invested in British manufacturing. Far more was invested overseas; British foreign investments rose from £2.7 billion in 1975 to a staggering £90 billion in 1985. At the same time, there was a certain amount of re-industrialisation in the South East, where new industries employing the most advanced technology grew. In fact, many industries shed a large proportion of their workforce but, using new technology, maintained or improved their output.

These new industries were not confined to the South East of England: Nissan built the most productive car plant in Europe at Sunderland. After an extensive review, Sunderland was chosen for its skilled workforce and its location near major ports. The plant was completed in 1986 as the subsidiary Nissan Motor Manufacturing (UK) Ltd. Siemens established a microchip plant at Wallsend on Tyneside, in which it invested £1.1 billion. But such industries tended not to be large-scale employers of local workers: Siemens employed only about 1,800. Traditional regionally-based industries continued to suffer a dramatic decline during this period. Coal-mining, for example, was decimated in the years following the 1984-5 strike, not least because of the shift of the electricity-generating industry towards alternative energy sources, especially gas. During 1984-7, the coal industry shed 170,000 workers.

The North-South Divide – a Political Complex?:

By the late 1980s, the north-south divide in Britain seemed as intractable as it had been all century, with high unemployment particularly concentrated in the declining extractive and manufacturing industries of the North of England, Scotland and Wales. That the north-south divide increasingly had a political as well as an economic complexion was borne out by the outcome of the 1987 General Election. While Margaret Thatcher was swept back to power for the third time, her healthy Conservative majority was largely based on the voters of the South and East of England. North of a line roughly between the Severn and the Humber, the long decline of the Tories, especially in Scotland, where they were reduced to ten seats, was increasingly apparent. At the same time, the national two-party system seemed to be breaking down. South of the Severn-Humber line, where Labour seats were now very rare outside London, the Liberal-SDP Alliance was the main challenger to the Conservatives in many constituencies.

The Labour Party continued to pay a heavy price for its internal divisions, as well as for the bitterness engendered by the miners’ strike. It is hardly Neil Kinnock’s fault that he is remembered for his imprecise long-windedness, the product of self-critical and painful political readjustment. His admirers recall his great platform speeches, the saw-edged wit and air-punching passion. There was one occasion, however, when Kinnock spoke so well that he united most of the political world in admiration. This happened at the Labour conference in Bournemouth in October 1985. A few days before the conference, Liverpool City Council, formally Labour-run but in fact controlled by the Revolutionary Socialist League, had sent out redundancy notices to its thirty-one thousand staff. The revolutionaries, known by the name of their newspaper, Militant, were a party-within-a-party, a parasitic body within Labour. They had some five thousand members who paid a proportion of their incomes to the RSL so that the Militant Tendency had a hundred and forty full-time workers, more than the staff of the Social Democrats and Liberals combined. They had a presence all around Britain, but Liverpool was their great stronghold. There they practised Trotsky’s politics of the transitional demand, the tactic of making impossible demands for more spending and higher wages so that when the ‘capitalist lackeys’ refused these demands, they could push on to the next stage, leading to collapse and revolution.

In Liverpool, where they were building thousands of new council houses, this strategy meant setting an illegal council budget and cheerfully bankrupting the city. Sending out the redundancy notices to the council’s entire staff was supposed to show Thatcher they would not back down, or shrink from the resulting chaos. Like Scargill, Militant’s leaders thought they could destroy the Tories on the streets. Kinnock had thought of taking them on a year earlier but had decided that the miners’ strike made that impossible. The Liverpool mayhem gave him his chance, so in the middle of his speech at Bournemouth, he struck. It was time, he said, for Labour to show the public that it was serious. Implausible promises would not bring political victory:

I’ll tell you what happens with impossible promises. You start with far-fetched resolutions. They are then pickled into a rigid dogma, a code, and you go through the years sticking to that, outdated, misplaced, irrelevant to the real needs, and you end in the grotesque chaos of a Labour council – a Labour council – hiring taxis to scuttle round a city handing out redundancy notices to its own workers.

By now he had whipped himself into real anger, a peak of righteous indignation, but he remained in control. His enemies were in front of him, and all the pent-up frustrations of the past year were being released. The hall came alive. Militant leaders like Derek Hatton stood up and yelled ‘lies!’ Boos came from the hard left, and some of their MPs walked out, but Kinnock was applauded by the majority in the hall, including his mainstream left supporters. Kinnock went on with a defiant glare at his opponents:

I’m telling you, and you’ll listen, you can’t play politics with people’s jobs and with people’s services, or with their homes. … The people will not, cannot abide posturing. They cannot respect the gesture-generals or the tendency tacticians.

Most of those interviewed in the hall, and many watching live on television, claimed it was the most courageous speech they had ever heard from a Labour leader, though the hard left remained venomously hostile. By the end of the following month, Liverpool District Labour Party, from which Militant drew its power, was suspended and an inquiry was set up. By the spring of 1986, the leaders of Militant had been identified and charged with behaving in a way which was incompatible with Labour membership. The process of expelling them was noisy, legally fraught and time-consuming, though more than a hundred of them were eventually expelled. There was a strong tide towards Kinnock across the rest of the party, with many left-wingers cutting their ties to the Militant Tendency. There were many battles with the hard left to come, and several pro-Militant MPs were elected in the 1987 Election. These included two Coventry MPs, Dave Nellist and John Hughes, ‘representing’ my own constituency, whose sole significant, though memorable, ‘contribution’ in the House of Commons was to interrupt prayers. Yet by standing up openly to the Trotskyist menace, as Wilson, Callaghan and Foot had patently failed to do, Kinnock gave his party a fresh start. It began to draw away from the SDP-Liberal Alliance in the polls and did better in local elections. It was the moment when the New Labour project became possible.

A Third Victory and a Turning of the Tide:

Yet neither this internal victory nor the sharper management that Kinnock introduced would bring the party much good against Thatcher in the following general election. Labour was still behind the public mood. Despite mass unemployment, Thatcher’s free-market optimism was winning through, and Labour was still committed to re-nationalisation, planning, a National Investment Bank and unilateral nuclear disarmament, a personal cause of both Neil and his wife, Glenys, over the previous twenty years. The Cold War was thawing and it was not a time for the old certainties, but for the Kinnocks, support for CND was fundamental to their political make-up. So he stuck to the policy, even as he came to realise how damaging it was to Labour’s image among swing voters. Under Labour, all the British and US nuclear bases would be closed, the Trident nuclear submarine force cancelled, all existing missiles scrapped and the UK would no longer expect any nuclear protection from the US in time of war. Instead, more money would be spent on tanks and conventional warships. All of this did him a lot of good among many traditional Labour supporters; Glenys turned up at the women’s protest camp at Greenham Common. But it was derided in the press and helped the SDP to garner support from the ‘middle England’ people Labour needed to win back. In the 1987 General Election campaign, Kinnock’s explanation of why Britain would not simply surrender if threatened by a Soviet nuclear attack sounded as if he was advocating some kind of Home Guard guerrilla campaign once the Russians had arrived. With policies like this, he was unlikely to put Thatcher under serious pressure.

When the 1987 election campaign began, Thatcher had a clear idea about what her third administration would do. She wanted more choice for the users of state services. There would be independent state schools outside the control of local councillors, called grant-maintained schools.  In the health services, though it was barely mentioned in the manifesto, she wanted money to follow the patient. Tenants would be given more rights. The basic rate of income tax would be cut and she would finally sort out local government, ending the ‘rates’ and bringing in a new tax. On paper, the programme seemed coherent, which was more than could be said for the Tory campaign itself. Just as Kinnock’s team had achieved a rare harmony and discipline, Conservative Central Office was riven by conflict between politicians and ad-men. The Labour Party closed the gap to just four points and Mrs Thatcher’s personal ratings also fell as Kinnock’s climbed. He was seen surrounded by admiring crowds, young people, nurses, waving and smiling, little worried by the hostile press. In the event, the Conservatives didn’t need to worry. Despite a last-minute poll suggesting a hung parliament, and the late surge in Labour’s self-confidence, the Tories romped home with an overall majority of 101 seats, almost exactly the share, forty-two per cent, they had won in 1983. Labour made just twenty net gains, and Kinnock, at home in Bedwellty, was inconsolable. Not even the plaudits his team had won from the press for the brilliance, verve and professionalism of their campaign would lift his mood.

The SDP-Liberal Alliance had been floundering in the polls for some time, caught between Labour’s modest revival and Thatcher’s basic and continuing popularity with a large section of voters. The rumours of the death of Labour had been greatly exaggerated, and the ‘beauty contest’ between the two Davids, Steel and Owen, had been the butt of much media mockery. Owen’s SDP had its parliamentary presence cut from eight MPs to five, losing Roy Jenkins in the process. While most of the party merged with the Liberals, an Owenite rump limped on for a while. Good PR, packaging and labelling were not good enough for either Labour or the SDP. In 1987, Thatcher had not yet created the country she dreamed of, but she could argue that she had won a third consecutive victory, not on the strength of military triumph, but on the basis of her ideas for transforming Britain. She also wanted to transform the European Community into a free-trade area extending to the Baltic, the Carpathians and the Balkans. In that, she was opposed from just across the Channel and from within her own cabinet.

In the late eighties, Thatcher’s economic revolution overreached itself. The inflationary boom happened due to the expansion of credit and a belief among ministers that, somehow, the old laws of economics had been abolished; Britain was now supposed to be on a continual upward spiral of prosperity. But then, on 27 October 1986, the London Stock Exchange ceased to exist in its old form. Its physical floor, once heaving with life, was replaced by dealing done by computer and phone. The volume of trading was fifteen times greater than it had been in the early eighties. This became known as ‘the Big Bang’, and a country which had previously exported two billion pounds-worth of financial services per year was soon exporting twelve times that amount. The effect of this on ordinary Britons was to take the brake off mortgage lending, turning traditional building societies into banks which started to thrust credit at the British public. Borrowing suddenly became a good thing to do, and mortgages were extended rather than being paid off. The old rules about the maximum multiple of income began to dissolve: from two and a half times the homeowner’s annual salary, four times became acceptable in many cases. House prices began to rise accordingly, and a more general High Street splurge was fuelled by the extra credit now freely available. During 1986-88, a borrowing frenzy gripped the country, egged on by swaggering speeches about Britain’s ‘economic miracle’ from the Chancellor, Nigel Lawson, and the Prime Minister. Lawson later acknowledged:

My real mistake as Chancellor was to create a climate of optimism that, in the end, encouraged borrowers to borrow more than they should.

In politics, the freeing up and deregulation of the City of London gave Margaret Thatcher and her ministers an entirely loyal and secure base of rich, articulate supporters who helped see her through some tough battles. The banks spread the get-rich-quick prospect to millions of British people through privatisation share issues and the country, for a time, came closer to the share-owning democracy that Thatcher dreamed of.

The year after the election, 1988, was the real year of hubris. The Thatcher government began attacking independent institutions and bullying the professions. Senior judges came under tighter political control and university lecturers lost the academic tenure they had enjoyed since the Middle Ages. In Kenneth Baker’s Great Education Reform Bill (‘Gerbil’) of that year, Whitehall grabbed direct control over the running of the school curriculum, creating a vast new state bureaucracy to dictate what should be taught, when and how, and then to monitor the results. Teachers could do nothing. The cabinet debated the detail of maths courses; Mrs Thatcher spent much of her time worrying about the teaching of history. Working with history teachers, I well remember the frustration they felt at being forced to return to issues of factual content rather than being able to continue to enthuse young people with a love for exploring sources and discovering evidence for themselves. Mrs Thatcher preferred arbitrary rules of knowledge to the development of know-how. She was at her happiest when dividing up the past into packages of ‘history’ and ‘current affairs’. For example, the 1956 Hungarian Revolution was, she said, part of history, whereas the 1968 Prague Spring was, twenty years on, still part of ‘current affairs’ and so should not appear in the history curriculum, despite the obvious connections between the two events. This happened at a time when education ministers were complaining bitterly about the lack of talent, not among teachers, but among civil servants, the same people they were handing more power to. A Hungarian history teacher, visiting our advisory service in Birmingham, expressed his discomfort after visiting a secondary school in London where no-one in a Humanities class could tell him where, geographically, his country was.

At that time, my mother was coming to the end of a long career in NHS administration as Secretary of the Community Health Council (‘The Patients’ Friend’) in Coventry, which, as elsewhere, had brought together local elected councillors, health service practitioners and managers, and patients’ groups to oversee the local hospitals and clinics and to deal with complaints. But the government did not trust local representatives and professionals to work together to improve the health service, so the Treasury seized control of budgets and contracts. To administer the new system, five hundred NHS ‘trusts’ were formed, and any involvement by elected local representatives was brutally terminated. As with Thatcher’s education reforms, the effect was to create a new bureaucracy overseeing a regiment of quangos (quasi-autonomous non-governmental organisations). She later wrote:

We wanted all hospitals to have greater responsibility for their affairs.  … the self-governing hospitals to be virtually independent.

In reality, ‘deregulation’ of care and ‘privatisation’ of services were the orders of the day. Every detail of the ‘internal market’ contracts was set down from the centre, from pay to borrowing to staffing. The rhetoric of choice in practice meant an incompetent dictatorship of bills, contracts and instructions. Those who were able to vote with their chequebooks did so. Between 1980 and 1990, the number of people covered by the private health insurer Bupa nearly doubled, from 3.5 million to a little under seven million. Hubris about what the State could and could not do was to be found everywhere. In housing, 1988 saw the establishment of unelected Housing Action Trusts to take over the old responsibility of local authorities for providing what is now known as ‘affordable housing’. Mrs Thatcher claimed that she was trying to pull the State off people’s backs. In her memoirs, she wrote of her third government,

… the root cause of our contemporary social problems … was that the State had been doing too much.

Yet her government was intervening in public services more and more. The more self-assured she became, the less she trusted others to make the necessary changes, and that meant accruing more power to the central state. The institutions hurt most in this process were local councils and authorities. Under the British constitution, local government is defenceless against a ‘Big Sister’ PM with a secure parliamentary majority and a loyal cabinet. So it could easily be hacked away, but sooner or later alternative centres of power, at both local and national level, would be required to replace it and, in so doing, to overthrow the overbearing leader.

Sources:

Andrew Marr (2008), A History of Modern Britain. Basingstoke: Pan Macmillan.

Peter Catterall, Roger Middleton & John Swift (2001), The Penguin Atlas of British & Irish History. London: Penguin Books.


Posted October 1, 2018 by TeamBritanniaHu in Affluence, Birmingham, Britain, British history, Britons, Caribbean, Coalfields, Cold War, Communism, Conservative Party, Coventry, democracy, Europe, European Economic Community, France, guerilla warfare, History, Humanities, Hungary, Ireland, Journalism, Labour Party, Marxism, Midlands, Migration, Militancy, Narrative, National Health Service (NHS), nationalisation, Population, Remembrance, Revolution, Russia, Social Service, south Wales, Thatcherism, Uncategorized, Unemployment, USA, USSR, Victorian, Wales, Welfare State


Britain, 1974-79: The Three-Day Week to the Winter of Discontent: Part Two.   Leave a comment

001

The Decade of Extremes – Punks, Skinheads & Hooligans:

The 1970s was an extreme decade; the extreme left and the extreme right were reflected even in its music. Much of what happened in British music and fashion during the seventies was driven by the straightforward need to adopt and then outpace what had happened the day before. The ‘Mods’ and ‘Hippies’ of the sixties and early seventies were replaced by the first ‘skinheads’, though in due course there were ‘Ziggy Stardust’ followers of David Bowie who would bring androgyny and excess to the pavements and even to the playground. Leather-bound punks found a way of offending the older rockers; New Romantics with eye-liner and quiffs challenged the ‘Goths’. Flared jeans and then baggy trousers were suddenly ‘in’ and then just as quickly disappeared. Shoes, shirts and haircuts mutated and competed. For much of this time, the game didn’t mean anything outside its own rhetoric. One minute it was there, the next it had gone. Exactly the same can be said of musical fads: the way that Soul was picked up in Northern clubs from Wigan to Blackpool to Manchester, the struggle between the concept albums of the art-house bands and the arrival of punkier noises from New York in the mid-seventies, the dance crazes that came and went. Like fashion, musical styles began to break up and head in many directions in the period, coexisting as rival subcultures across the country. Rock and roll was not dead, as Don McLean suggested in American Pie, when heavy metal and punk-rock arrived; nor was Motown dead when reggae and ska arrived. The Rolling Stones and Yes carried on oblivious to the arrivals of the Sex Pistols and the Clash.

In this stylistic and musical chaos, running from the early seventies to the ‘noughties’, there were moments and themes which stuck out. Yet from 1974 until the end of 1978, living standards, which had doubled since the fifties, actually went into decline. The long boom for the working-classes was over. British pop had been invented during the optimistic years of 1958-68, when the economy was buoyant most of the time and the music was evolving at its fastest and in its most creative spirit. The mood had turned in the years 1968-73, towards fantasy and escapism, as unemployment arrived and the world seemed bleaker and more confusing. This second phase involved the sci-fi glamour of David Bowie and the gothic mysticism of the ‘heavy metal’ bad-boy bands like Black Sabbath and Led Zeppelin. The picture below shows Robert Plant and Jimmy Page on stage in Chicago during their 1977 North American tour (Page is playing the double-neck Gibson used for their classic song, Stairway to Heaven).


The years 1974-79 were a period of deep political disillusion, with strains that seemed to tear at the unity of the United Kingdom. First there was Irish terrorism on the mainland, when in October 1974 two IRA bombs exploded in Guildford, followed in November by two more in Birmingham. Like many others, I will never forget the horrendous scenes in England’s second city the day after the Tavern in the Town was blasted. This was followed by a rise in racial tension and widespread industrial mayhem. The optimism which had helped to fuel the flowering of popular culture in the sixties was suddenly exhausted, so it is perhaps not a coincidence that this period was a darker time in music and fashion, a nightmare inversion of the sixties dream. In sport, the mid-seventies saw the invention of the ‘football hooligan’.

002

This led on to serious problems for football grounds around the country, as the government introduced the 1975 Safety of Sports Grounds Act. The home of Wolverhampton Wanderers, ‘Molineux’, had remained virtually unchanged since 1939, apart from the Molineux Street Stand, which had been made all-seater. But this distinctive seven-gabled stand (seen in the picture above) was deemed unsafe according to the Act’s regulations and therefore had to be replaced. Architects were commissioned to design a new stand to replace the old one, with its unique shape. To do this, the club had to purchase the remaining late Victorian terraced houses in Molineux Street and North Street which pre-dated the football ground, and all seventy-one of them were demolished to clear space for the new two-million-pound stand to be built at the rear of the old one. The ‘new’ stand, with its 9,348 seats and forty-two executive boxes, was officially opened on 25 August 1979. Once the debris of the old stand was moved away, the front row of seats was almost a hundred feet from the pitch. From the back row, the game was so far away that it had to be reported by rumour! Also, throughout this period, the team needed strengthening.

001

006

In the 1974-75 season, Wolves won the League Cup, beating star-studded Manchester City 2-1 at Wembley, and nearly reversed a 4-1 deficit against FC Porto in the UEFA Cup with an exciting 3-1 home victory. Wolves finished in a respectable twelfth place in the League. But at the end of the season, the team’s talisman centre-forward, Belfast-born Derek Dougan, decided to retire. He had joined the club in 1967, becoming an instant hit with the Wolves fans when he scored a hat-trick on his home debut, and netting nine times in eleven games to help Wolves win promotion that season. He was a charismatic man, a thrilling player and one of the best headers of the ball ever seen. He also held the office of Chairman of the PFA (Professional Footballers’ Association) and in 1971/72 forged a highly successful striking partnership with John Richards. Their first season together produced forty League and UEFA Cup goals, twenty-four for the Doog and sixteen for Richards. In 1972/73, they shared fifty-three goals in all competitions, Richards getting thirty-six and Dougan seventeen. In two and a half seasons of their partnership, the duo scored a total of 125 goals in 127 games. Derek Dougan signed off at Molineux on Saturday, 26th April 1975. In his nine years at Wolves, Dougan made 323 appearances and scored 123 goals, including five hat-tricks. He also won 43 caps for Northern Ireland, many of them alongside the great George Best, who himself had been a Wolves fan as a teenager.

005

Above: Derek Dougan in 1974/75, the season he retired.

004

Wolves had always been considered ‘too good to go down’ after their 1967 promotion, but following the departure of ‘the Doog’ they embarked on a slide towards obscurity, finishing twentieth at the end of the 1975/76 season, resulting in their relegation to the second tier of English football. Worse still, early in 1976, Wolves’ fabulously speedy left-winger, Dave Wagstaffe, was transferred to Blackburn Rovers. In his twelve years at Molineux, ‘Waggy’ had scored thirty-one goals, including a ‘screamer’ in a 5-1 defeat of Arsenal, in over four hundred appearances. In time-honoured fashion, the majority of fans wanted money to be spent on new players, not on a stand of such huge proportions. Although Wolves returned to the League’s top flight at the end of the next season, they were still not good enough to finish in the top half of the division. More departures of longstanding stalwarts followed, including those of captain Mike Bailey, Frank Munro and goalkeeper Phil Parkes. The East Midlands clubs took over the spotlight, first Derby County and then Nottingham Forest, who won the European Cup in 1979, making Brian Clough’s dream a reality. Before the 1979-80 season kicked off, Wolves’ manager John Barnwell produced a stroke of genius by signing Emlyn Hughes from Liverpool to be his captain. Then he sold Steve Daley to Manchester City for close to 1.5 million pounds, and three days later signed Andy Gray from Aston Villa for a similar amount. Daley (pictured below in action against FC Porto) was a versatile, attacking midfielder who played in 218 senior games for Wolves, scoring a total of forty-three goals. Andy Gray scored on his debut for Wolves and went on to get another eleven League goals, one behind John Richards. He also scored in the League Cup Final in March to give Wolves a 1-0 victory over Nottingham Forest, and a place in the next season’s UEFA Cup.

003

John Richards continued to play on into the 1980s for Wolves. According to John Shipley, he was a true Wolves legend, a player who would have graced any of Wolves’ Championship-winning teams. He was also a true gentleman, in the Billy Wright mould. He had signed for Wolves in 1967, turning professional two years later. I remember seeing him make his first-team debut at the Hawthorns against West Bromwich Albion on 28 February 1970, scoring alongside Derek Dougan in a 3-3 draw. They both played and scored in the 3-1 away victory against Fiorentina the following May. Richards went on to score 194 goals in 486 appearances, a goalscoring record which stood for ten years. He won only one full England cap, due mainly to injury.

007

Like me, the entertainer Frank Skinner grew up on the fictional comic-strip hero, Roy of the Rovers. Of course, when – as in his case – you support a real-life team that never wins anything, like West Bromwich Albion, it’s nice to follow a fictional team that scoops the lot. Melchester Rovers were his mythical alternative, and following them came with none of the attendant guilt that comes with slyly supporting another club, say Liverpool in the seventies. They were his ‘dream team’, with a cabinet of silverware and a true superstar-striker as player-manager. The 1970s were a time when both life and the beautiful game seemed far less complicated for teenagers. Watching it on TV, we would frequently hear a commentator say “this is real Roy of the Rovers stuff”. What they usually meant was that one player on the pitch was doing something remarkable, unbelievable or against all odds. But even in the fictional pages, Roy had to confront the dark reality of hooliganism among his own fans, and do battle with it in his own way, as the following frames show:

003

001

002

Vivienne Westwood and Malcolm McLaren turned from creating beatnik jumpers to the ripped T-shirts and bondage gear of punk: the Sex Pistols portrayed themselves as a kind of anti-Beatles. Westwood was in many ways the perfect inheritor of Quant’s role of a dozen years earlier. Like Quant, she was brought up to make her own clothes and came through art college. She was similarly interested in the liberating power of clothes, setting herself up in a King’s Road shop which first needed to be braved before it could be patronised. Yet she was also very different from Quant, in that she had first mixed and matched to create a style of her own at the Manchester branch of C&A and claimed that her work was rooted in English tailoring. Her vision of fashion was anything but simple and uncluttered. According to Andrew Marr, it was a magpie, rip-it-up and make-it-new assault on the history of couture, postmodern by contrast with the straightforward, thoroughly modern designs of Quant. The latter’s vision had been essentially optimistic – easy to wear, clean-looking clothes for free and liberated women. Westwood’s vision was darker and more pessimistic. Her clothes were to be worn like armour in a street battle with authority and repression, in an England of flashers and perverts. Malcolm McLaren formed the Sex Pistols in December 1975, with Steve Jones, Paul Cook, John Lydon and Glen Matlock making up a foursome which was anything but ‘fab’. Pockmarked, sneering, spitting, spiky-haired and exuding violence, they dutifully performed the essential duty of shocking a nation which was still too easily shocked. The handful of good songs they recorded have a leaping energy which did take the rock establishment by storm, but their juvenile antics soon became embarrassing. They played a series of increasingly wild gigs and made crude political attacks in songs such as ‘Anarchy in the UK’ and, in the year of the Silver Jubilee (1977), ‘God Save the Queen’. Jim Callaghan could be accused of many things, but presiding over a ‘fascist régime’ was surely not one of them.

On the other side of the political divide was an eruption of racist, skinhead rock, and an interest in the far right. Among the rock stars who seemed to flirt with these ideas was Eric Clapton. On 5th August 1976, I went, with a group of friends, to his concert at the Odeon in Birmingham. He came on stage an hour late, obviously stoned and drunk, and stated, to a mixed audience, that Enoch Powell was the only bloke who’s telling the truth, for the good of the country. In his autobiography, Clapton apologised for his behaviour and his outburst. He was not alone in his ‘flirting’ with racist views. David Bowie spoke of Hitler as being the first superstar, musing that he might make a good Hitler himself. Though the Sex Pistols liked to see themselves as vaguely on the anarchist left, their enthusiasm for shocking, nihilistic and amoral lyrics left room for ambiguity, particularly after ‘Sid Vicious’ joined them. McLaren and Westwood produced clothing with swastikas and other Nazi emblems if only to outrage people, while Vicious’s dubious contribution to political discourse can be summed up by his lyrics,

Belsen was a gas, I read the other day, about the open graves, where the Jews all lay …

Reacting to the surrounding mood, Rock Against Racism was formed in August 1976. My diary for 1976 records that I attended four anti-Fascist and anti-racist meetings in Birmingham that summer. These concerts and meetings led to the creation of the Anti-Nazi League a year later. Punk bands were at the forefront of the RAR movement, above all the Clash, whose lead singer Joe Strummer became more influential and admired than Johnny Rotten and the rest of the Sex Pistols, and bands such as the Jam. Black music – reggae, ska and soul – was popular enough among white youth like my friends for it to have a real influence in turning the fashion in street culture decisively against racism. Ska revival bands such as the Specials followed, as did the reggae-influenced Police and UB40. The latter lived in the same terraced street as my brother in Moseley, Birmingham, and came together as unemployed men whose name was drawn from the unemployment benefit claim form. They had an effect which went beyond the odd memorable song. In the middle of the seventies’ visions of social breakdown, this musical revival produced a more upbeat atmosphere, especially on the Liberal-Left, as well as on the Hard-Left. The racist skinhead bands soon found themselves in a violent and uncomfortable ghetto. As one cultural critic of the time put it, …

A lifestyle – urban, mixed, music-loving, modern and creative – had survived, despite being under threat from the NF.

The NF had been founded in 1967 after the original British National Party and the old League of Empire Loyalists joined together. Electorally it was struggling, though its candidate Martin Webster polled sixteen per cent in the West Bromwich by-election of May 1973, and in the two 1974 general elections the NF put up first fifty-four and then ninety candidates, entitling them to a television broadcast. More important to their strategy were the street confrontations, engineered by marching through Bangladeshi or Pakistani areas in Leeds, Birmingham and London with Union Jacks and anti-immigrant slogans. A more extreme offshoot of the original skinheads attached themselves to the NF’s racialist politics and by the mid-seventies, they too were on the march. Throughout the summer of 1976, broad-based anti-Fascist meetings took place in Dudley and Birmingham, involving Young Liberals, Labour Party members and more left-wing socialists. There were also national anti-racist conferences in London. The Trotskyist Socialist Workers’ Party determined to organise street politics of their own to bring things to a halt, forming the Anti-Nazi League in 1977. The ANL brought in tens of thousands of young people who had no interest in Leninism or Trotskyism, but who saw the NF as a genuine threat to immigrants. They flooded to the ANL rallies, marches and confrontations, during which there were two deaths as police weighed in to protect the NF’s right to march.

This was a youth lifestyle which also provided an alternative to the drift to the right more generally in British society and the establishment of ‘Thatcherism’ as the dominant ideology of the late seventies and eighties. But to understand what this ideology was, and how it was able to gain its hold on society, we need first to examine the parliamentary politics of the mid to late seventies.

The Callaghan Years:

Jim Callaghan (right) was the Home Secretary who sent British troops into Northern Ireland, for which, at the time, he was hailed as a hero. He was not such a hero among reformers in the Labour Party, however, when he scuppered the chances of Wilson and Castle of finally curbing the power of the trade union ‘barons’. In the spring of 1976, he finally entered Number Ten after a series of votes by Labour MPs saw off his rivals – Denis Healey, Tony Crosland and Roy Jenkins on the right, and Michael Foot and Tony Benn on the left. After three ballots, he defeated Foot by 176 votes to 137 and replaced Wilson as Prime Minister. For the next three turbulent years, he ran a government with no overall majority in Parliament, kept going by a series of deals and pacts, and in an atmosphere of almost constant crisis. He was already, on becoming PM, in Andrew Marr’s description,

… a familiar and reassuring figure in Britain, tall, ruddy, no-nonsense, robust and, by comparison with Wilson, straightforward.

He had held all three great offices of state and, at sixty-four, he was one of the most experienced politicians to become Prime Minister. After Heath and Wilson, he was the third and last of the centrist consensus-seekers between hard left and hard right, though he was instinctively looking to the right in the ethos of the mid to late seventies. Churchill apart, all his post-war predecessors had been Oxbridge men, whereas Callaghan had never been to university at all. He was the son of a Royal Navy chief petty officer who had died young, and of a devout Baptist mother from Portsmouth. He had known real poverty and had clawed his way up as a young clerk working for the Inland Revenue, then becoming a union official before wartime and national service. As one of the 1945 generation of MPs, he was a young rebel who had drifted to the right as he mellowed and matured, though he always held firm to his pro-trade union instincts. He was a social conservative, uneasy about divorce and homosexuality, and vehemently pro-police, pro-monarchy and pro-armed forces, though he was anti-hanging and strongly anti-racialist. As Home Secretary, he had announced that the ‘Permissive Society’ of the sixties had gone too far. As PM, he initiated a debate on ‘trendy teaching’ in schools, calling for an inquiry into teaching methods, standards, discipline and the case for a national curriculum.

Callaghan’s first few days as Prime Minister in April 1976 must have brought back some grim memories. A dozen years earlier, as Chancellor, he had been confronted with awful economic news which nearly crushed him and ended in the forced devaluation of the pound. Now, on the first day of his premiership, he was told that the pound was falling fast, no longer ‘floating’, the euphemism used since the Heath years. A sell-off by sterling holders was likely. The Chancellor, Denis Healey, had negotiated a six-pound pay limit, and this would feed through to much lower wage increases and eventually to lower inflation. Cash limits on public spending brought in by Healey under Wilson would also radically cut public expenditure. But in the spring of 1976 inflation was still rampant and unemployment was rising fast. Healey now told Callaghan that, due to the billions spent by the Bank of England supporting sterling in the first few months of the year, a loan from the International Monetary Fund (IMF) looked essential. In June, standby credits were arranged with the IMF and countries such as the US, Germany, Japan and Switzerland.

002 (2)

Healey had imposed tough cuts in the summer but by its end, the pound was under immense pressure again. On 27th September, Healey was meant to fly out to a Commonwealth finance ministers’ conference in Hong Kong with the Governor of the Bank of England. But the crisis was so great and the markets so panicked that he decided he could not afford to be out of touch for the seventeen hours’ flying time. In full view of the television cameras, he turned around at Heathrow airport and went back to the Treasury. There he decided to apply to the IMF for a conditional loan, one which gave authority to the international banking officials above Britain’s elected leaders. With exquisite timing, the Ford workers began a major strike. Healey, for the first and last time in his life, he later said, was close to demoralisation. Against Callaghan’s initial advice, Healey decided to dash to the Labour conference in Blackpool and made his case to an anguished and angry party. At the time, there was a powerful mood for a siege economy, telling the IMF to ‘get lost’, cutting imports and nationalising swathes of industry. Given just five minutes to speak from the conference floor due to the absurdities of Labour Party rules, the Chancellor warned the party that this would mean a trade war, mass unemployment and the return of a Tory government. But, he shouted against a rising hubbub, just as his younger self, Major Healey, had spoken at the 1945 conference in full battle dress, he was speaking to them from the battlefront again. He would negotiate with the IMF and that would mean…

… things we do not like as well as things we do like. It means sticking to the very painful cuts in public expenditure … it means sticking to the pay policy.

As Healey ruefully recorded in his autobiography, he had begun with a background of modest cheers against a rumble of booing. When he sat down, both the cheering and the booing were a lot louder. Benn called the speech vulgar and abusive, but Healey was one of British politics’ greatest showmen. Meanwhile, Callaghan had become steadily more convinced, during the crisis, by the monetarists on his right. He told the stunned 1976 Labour conference that the Keynesian doctrines of governments spending their way out of recession, cutting taxes and boosting investment, had had their day …

I tell you in all candour that that option no longer exists and that insofar as it ever did exist, it worked by injecting inflation into the economy … Higher inflation, followed by higher unemployment. That is the history of the last twenty years.

So, with the cabinet nervously watching, the negotiations with the IMF started. Callaghan and Healey tried to limit as far as possible the cuts being imposed on them. The IMF, with the US Treasury standing behind it, was under pressure to squeeze ever harder. The British side was in a horribly weak position. The government was riven by argument and threats of resignation, including from Healey himself. In secret talks, Callaghan warned the IMF’s chief negotiator bitterly that British democracy itself would be imperilled by mass unemployment. When the tense haggling came to an end, the IMF was still calling for an extra billion pounds’ worth of cuts, and it was only when Healey, without telling Callaghan, threatened the international bankers with yet another ‘Who runs Britain?’ election that they gave way. The final package of cuts was announced in Healey’s budget, severe but not as grim as had been feared, and greeted with headlines about Britain’s shame. But the whole package was unnecessary from the start, since the cash limits Healey had already imposed on Whitehall would cut spending far more effectively than anyone realised. Moreover, the public spending statistics, on which the cuts were based, were wrong. Public finances were stronger than they had appeared to be. The Treasury estimate for public borrowing in 1974-5 had been too low by four thousand million pounds, a mistake greater than any tax changes ever made by a British Chancellor; but the 1976 estimate was twice as high as it should have been. The IMF-directed cuts were, therefore, more savage than they needed to have been.

When Britain’s spending was defined in the same way as other countries’, and at market prices, the figure was forty-six per cent of national wealth, not the sixty per cent mistakenly stated in a government white paper of early 1976. By the time Labour left office, it was forty-two per cent, about the same as West Germany’s and well below that of the social democratic Scandinavian countries. Britain’s balance of payments came back into balance long before the IMF cuts could take effect and Healey reflected later that if he had been given accurate forecasts in 1976, he would never have needed to go to the IMF at all. In the end, only half the loan was used, all of which was repaid by the time Labour left office. Only half the standby credit was used and it was untouched from August 1977 onwards. Healey had talked about ‘Sod Off Day’ when he and Britain would finally be free from outside control. That day came far sooner than he had expected, but at the time nobody knew that Britain’s finances were far stronger than they had seemed.

Yet in the national memory, the Callaghan administration soon became associated with failure and remained in that category throughout the Thatcher years, used repeatedly as clinching evidence of Labour’s bankruptcy. All of this could have been avoided if only the Tories had been in power, it was argued. The initial drama of the crisis imprinted itself on Britain’s memory – the rush back from Heathrow, the dramatic scenes at the Labour conference, the humiliating arrival of the IMF hard men, backed by Wall Street – a political thriller which destroyed Labour’s self-confidence for more than a decade. But that was only the start of Labour’s woes. It was the prospect of ever greater cuts in public spending, inflation out of control, and the economy in the hands of outsiders that helped break the Labour Party into warring factions and gave the hard left its first great opportunity. Healey and the Treasury were operating in a new economic world of ‘floating’ exchange rates, huge capital flows and speculation, still little understood. The experience made him highly critical of monetarism, however, and of all academic theories which depended on accurate measurement and forecasting of the money supply. Healey was bitter, though, about the Treasury’s mistakes over the true scale of public spending, which so hobbled his hopes of becoming a successful Chancellor. He said later that he could not forgive them for this ‘sin’:

I cannot help suspecting that Treasury officials deliberately overstated public spending in order to put pressure on the governments which were reluctant to cut it. Such dishonesty for political purposes is contrary to all the proclaimed traditions of the British civil service.

After the humiliating, cap-in-hand begging for help from the International Monetary Fund, there was the soaring inflation and high interest rates, and finally the piled-up rubbish, strike meetings and unburied dead of the 1978-79 Winter of Discontent. But the true narrative of the Callaghan-Healey years, for the two must be seen together, is also a story of comparative success before its Shakespearean tragic final act. His defenders point out that Callaghan actually presided over a relatively popular and successful government for more than half of his time in power, some twenty out of thirty-seven months. Following the IMF affair, the pound recovered strongly, the markets recovered, inflation fell, eventually to single figures, and unemployment fell too. By the middle of 1977, the Silver Jubilee year, North Sea Oil was coming ashore to the extent of more than half a million barrels a day, a third of the country’s needs. Britain would be self-sufficient in oil by 1980 and was already so in gas. The pay restraint agreed earlier with Healey was still holding, though only just. Besides their success in getting inflation down, they also got the best deals that could be done with the international bankers.

Callaghan also succeeded in purging the left from his cabinet, sidelining Michael Foot, sacking Barbara Castle, and constructing the most right-wing Labour cabinet since the war, including Bill Rodgers, David Owen and Shirley Williams. All three would later join Roy Jenkins, by then President of the European Commission in Brussels, in forming the breakaway Social Democratic Party. Callaghan’s newly found faith in monetarism and his increasingly aggressive attitude to high wage demands also put him to the right of Wilson and Healey. In the late seventies, Callaghan was, for the first time, getting a good press while the Tory opposition under Margaret Thatcher seemed to be struggling. After having to rely on an odd mixture of nationalist MPs for its precarious Commons majority, Labour entered a deal with David Steel’s Liberals from March 1977 to August of the following year, giving Callaghan a secure parliamentary position for the first time. The Lib-Lab Pact gave the smaller party, with only thirteen MPs, rights only to be consulted, plus vague promises on possible changes to the voting system; it was far more helpful to Labour, who gained a modest majority over the Tories in the opinion polls and the prospect of Callaghan being returned to rule well into the eighties. It did not look like a dying government, much less the end of an era.

The Labour left believed that Callaghan and Healey had been captured by international capitalism, as had many Labour MPs. Their answer was to make the MPs accountable to ‘ordinary people’, as the obsessive activists of Labour politics innocently believed themselves to be. So the siege economy (known by 1978 as the Alternative Economic Strategy, following the publication of a book by the Marxist economist Sam Aaronovitch) and the mandatory reselection of MPs became the two main planks of the left. The AES was soon abandoned by many on the broad left, however, who, following the fall of the Callaghan government, tired of Keynesian solutions involving Labour governments spending their way out of crises. But Tony Benn (pictured below) persisted in his enthusiasm for workers’ cooperatives and nationalisation. He became increasingly detached from his cabinet colleagues in the Callaghan government, including the remaining left-wingers, like Michael Foot. He came close to resigning over his opposition to Labour’s deal with the Liberals. His general attitude to the party is well expressed in his diary entry for 15 January 1978:

The whole Labour leadership now is totally demoralised and all the growth on the left is going to come up from the outside and underneath. This is the death of the Labour Party. It believes in nothing any more, except staying in power.


Benn was still a senior member of the government when he wrote this, attending intimate meetings at Chequers, hearing deep military and security secrets, while at the same time becoming an ‘inside-outsider’.

The Winter of Their Discontent:

The ‘winter of discontent’, a Shakespearean phrase, was used by James Callaghan himself to describe the industrial and social chaos of 1978-9. It has stuck in the popular memory as few events since have, because schools were closed, ports were blockaded, rubbish was rotting in the streets and the dead were unburied. Left-wing union leaders and activists whipped up the disputes for their own purposes. Right-wing newspapers, desperate to see the end of Labour, exaggerated the effects and rammed home the picture of a country which had become ungovernable.

002

It came as an explosion of resentment, largely by poorly paid public employees, against a public incomes policy they felt was discriminatory. In the picture above, rubbish is left piled up in London’s Leicester Square in February 1979. Such scenes provided convincing propaganda for the Conservatives in the subsequent general election. Callaghan himself had been part of the problem, since his failure to understand the threat posed by the union challenge to elected power, and his earlier lack of interest in radical economic ideas, came home to haunt him as the incumbent of Number Ten. But it was not just that he had opposed the legal restrictions on union power pleaded for by Wilson and Castle, and then fought for vainly by Heath. Nor was it even that he and Healey, acting in good faith, had imposed a more drastic squeeze on public funding, and thus on the poorest families, than was economically necessary. It was also that by trying to impose an unreasonably tough new pay limit on the country, and then dithering about the date of the election, he destroyed the fragile calm he had so greatly enjoyed.

Most people, including most of the cabinet, had assumed that Callaghan would call a general election in the autumn of 1978. The economic news was still good and Labour was ahead in the polls. Two dates in October had been pencilled in, though 13th October had been ruled out because it was Margaret Thatcher’s birthday. But Callaghan did not trust the polls, and during the summer he decided that he would ‘soldier on’ until the spring. But he didn’t tell anyone until, at the TUC conference in September, he sang a verse from an old music hall song:

There was I waiting at the church, waiting at the church,

When I found he’d left me in the lurch, Lor’ how it did upset me.

All at once he sent me round a note, here’s the very note, this is what he wrote,

Can’t get away to marry you today: My wife won’t let me!

While it was a popular song in its day, fondly remembered by many in his audience, it was hardly a clear message to Britain as a whole. Was the jilted bride supposed to be Mrs Thatcher? The trade union movement? Callaghan’s intention was to suggest that he was delaying the election, but many trade union leaders, journalists and even cabinet ministers were confused. When he finally told the cabinet, they were genuinely shocked. The decision to delay might not have mattered so much had Callaghan not also promised a new five per cent pay limit to bring inflation down further. Because of the 1974-5 cash limit on pay rises at a time of high inflation, take-home pay for most people had been falling. Public sector workers, in particular, were having a tough time. The union leaders and many ministers thought that a further period of pay limits would be impossible to sell, while a five per cent limit, which seemed arbitrary on Callaghan’s part, was considered to be ridiculously tough. But had Callaghan gone to the country in October then the promise of further pay restraint might have helped boost Labour’s popularity still further, while the trade union leaders could believe that the five per cent ceiling was designed to appease rightward-drifting middle-class voters. By not going to the country in the autumn, Callaghan ensured that his five per cent ceiling would, instead, be tested in Britain’s increasingly impatient and dangerous industrial relations market.

Almost as soon as Callaghan had finished his music-hall turn, the Transport & General Workers’ Union smashed his pay policy by calling for the 57,000 car workers employed by Ford, the US giant, to receive a thirty per cent wage increase, citing the huge profits being made by the company and the eighty per cent pay rise just awarded to Ford’s chairman. Callaghan was sorely embarrassed, not least because his son worked for the company. After five weeks of lost production, Ford eventually settled for seventeen per cent, convincing Callaghan that he would now lose the coming election. Oil tanker drivers, also in the T&GWU, came out for forty per cent, followed by road haulage drivers, then workers at the nationalised British Leyland. They were followed by public sector workers in water and sewerage. BBC electricians threatened a black-out of Christmas television. The docks were picketed and closed down; blazing braziers, surrounded by huddled figures with snow whirling around them, were shown nightly on the television news. Hull, virtually cut off by the action, became known as the ‘second Stalingrad’. In the middle of all this, Callaghan went off for an international summit in the Caribbean, staying on for a sightseeing holiday in Barbados. Pictures of him swimming and sunning himself did not improve the national mood. When he returned to Heathrow, confronted by news reporters asking about the industrial crisis, he replied blandly:

I don’t think other people in the world will share the view that there is mounting chaos.

This was famously translated by the Daily Mail and the Sun into the headline, Crisis? What Crisis? As the railwaymen prepared to join the strikes, the worst blow for the government came when the public sector union NUPE called out more than a million school caretakers, cooks, ambulance men and refuse collectors on ‘random stoppages’ for a sixty-pound guaranteed minimum wage. Now the public was being hit directly, and the most vulnerable were being hit the hardest. Children’s hospitals, old people’s homes and schools were all plunged into turmoil. The most notorious action was taken by the Liverpool Parks and Cemeteries Branch of the General & Municipal Workers’ Union, which refused to bury dead bodies, leaving more than three hundred to pile up in a cold storage depot and a disused factory. Liverpool Council discussed emergency plans to dispose of some of the corpses at sea. Funeral cortèges were met at some cemeteries by pickets and forced to turn back. Strikers were confronted with violence in local pubs. Of course, most of those striking were woefully badly paid and living in relative poverty. Moreover, many had no history of industrial militancy. Nor was the crisis quite as bad as some of the papers and politicians represented it. As with Heath’s three-day week, many people enjoyed the enforced holiday from their poorly paid jobs and tough working conditions. Contrary to rumour, no-one was proved to have died in hospital as a result of union action; there were no food shortages; and, beyond the odd punch-up in the pubs, there was no violence and troops were never used. If it was a ‘revolt’, it was a very British one. It was chaos and a direct, coordinated challenge to the authority of the government, but it was not an attempt to overthrow it, as the 1974 Miners’ Strike had been. This was not a revolution.

002

Nevertheless, in London (above) and other cities, rotting rubbish piled up, overrun by rats and posing a serious health hazard. The effects of isolated incidents and images were revolutionary, ushering in not socialism, but Thatcherism. Inside government, ordinary work had almost ground to a halt. Eventually, a St Valentine’s Day concordat was reached between the government and the TUC, talking of annual assessments and guidance, targeting long-term inflation and virtually admitting, on the government’s part, that the five per cent wage ceiling had been a mistake. By March most of the industrial action had ended and various generous settlements had been reached, or inquiries had been set up which would lead to them. But in the Commons, the government was running out of allies, spirit and hope.

Spring ‘Awakening’:

The failure of the referenda on Scottish and Welsh devolution gave the nationalists no reason to continue supporting Labour. A bizarre amendment to the Bill had meant that, although the Scots voted in favour, the ‘absences’ of dead people and of those who had moved away but were still registered were counted against it, so the Act had to be repealed. In Wales, the measure was in any case defeated by four to one of those voting, in a tidal-wave shift to the right across North Wales and an anti-Nationalist and anti-establishment surge in the valleys. This was led by Neil Kinnock and the Labour left against the leaders of their own party, including Callaghan, himself a Cardiff MP, the Wales TUC and the allegedly corrupt Labour leaders of local authorities. The political division of Wales was confirmed soon after the St David’s Day ‘massacre’ when, as broad-left student leaders, we witnessed with horror the Young Conservatives take control of three of the six University College unions in Wales (Bangor, Aberystwyth and UWIST in Cardiff), a sure sign of a sea-change which was soon confirmed at the general election. After the devolution débâcle, the nationalists, especially in Scotland, would never trust Labour again.

The Liberals, facing the highly embarrassing trial of Jeremy Thorpe for conspiracy to murder, had their own reasons for wanting a spring election. In the frenetic atmosphere of an exhausted Parliament, in which dying MPs had been carried through the lobbies to vote in order to keep the government afloat, final attempts were made by Michael Foot and the Labour whips to find some kind of majority with the help of whatever support they could muster from a motley crew of Ulster Unionists, Irish Nationalists (SDLP) and renegade Scots. But by now, Callaghan himself was in a calmly fatalistic mood. He did not want to struggle on through another chaotic summer and early autumn. His famous and much-quoted remark to an aide, just as Labour was losing power in 1979, that the country was going through a once-in-thirty-years sea change, suggested that he half-accepted that the years of consensus had failed:

There is a shift in what the public wants and what it approves of. I suspect there is now such a sea-change – and it is for Mrs Thatcher.

Margaret Thatcher during the 1979 General Election campaign.

Finally, on 28th March 1979, the game ended when the government was defeated by a single vote, brought down at last by a ragged coalition of Tories, Liberals, Scottish Nationalists and Ulster Unionists. Callaghan was the first Prime Minister since 1924 to have to go to Buckingham Palace and ask for a dissolution of Parliament because he had lost a vote in the House of Commons. The five-week election campaign started with the INLA’s assassination of Mrs Thatcher’s campaign manager, Airey Neave, as he drove out of the underground car-park at Westminster. On the Labour side, it was dominated by Callaghan, still more popular than his party, emphasising stable prices and his ‘deal’ with the unions. On the Tory side, Thatcher showed a clever use of the media, working with television news teams and taking advice from her advertising ‘gurus’, the Saatchis. Callaghan was soundly beaten, as he himself had suspected he would be, with the Conservatives taking sixty-one seats directly from Labour, gaining nearly forty-four per cent of the vote and a substantial majority with 339 seats.

Sources:

Andrew Marr (2008), A History of Modern Britain. London: Macmillan.

Peter Catterall, Roger Middleton & John Swift (2001), The Penguin Atlas of British & Irish History. London: Penguin Books.

John Shipley (2003), Wolves Against the World: European Nights, 1953-80. Stroud: Tempus Publishing.

Frank Skinner (Foreword) (2009), Roy of the Rovers: The 1970s. London: Titan Books.

Posted September 16, 2018 by TeamBritanniaHu in Anti-racism, Baptists, BBC, Birmingham, Black Country, Britain, British history, Caribbean, Christian Faith, Christian Socialism, Christianity, Church, Commonwealth, Communism, Europe, European Economic Community, Factories, Family, Germany, History, homosexuality, hygeine, Immigration, Integration, Japan, Journalism, manufacturing, Marxism, Midlands, Militancy, morality, Narrative, National Health Service (NHS), nationalisation, nationalism, Poverty, Racism, Revolution, Scotland, Shakespeare, south Wales, Thatcherism, Trade Unionism, Uncategorized, Unemployment, USA, Wales, West Midlands

