
What, When & Where Was Socialism?: The Contrasting Cases of Britain & Hungary

Thirty Years After the Fall: Is Socialism Dead?

Júlia Tar’s recent piece on the Hungarian government’s online media outlet, Hungary Today, points out that 2019 marks the anniversary of not one but three remarkable events of the twentieth century: the 70th anniversary of NATO’s founding; the 20th anniversary of Hungary, Poland and the Czech Republic joining NATO; and the thirtieth anniversary of the dismantlement of the Iron Curtain and the fall of the Berlin Wall. According to Eugene Megyesy, former Senior Advisor to the Prime Minister of Hungary and a Member of the Board of Trustees of the Friends of Hungary Foundation, publisher of Hungary Today, we may not have learned from these historical events. 1956 was a significant year for Hungary because of its revolt against the Soviet Union and dictatorial communism. The revolt was followed by the Prague Spring in 1968 and the Polish Solidarity movement in the early 1980s. Then,

Hungary opened the Iron Curtain toward Austria, allowing East Germans to flee the oppression of the Utopian socialist system, thereby rendering the Berlin Wall obsolete.


This was on 11 September 1989 (not June, as stated), when a courageous decision was taken at the urging of leading Socialist reformers in the government like Imre Pozsgay, and in spite of threats of invasion from Berlin. By November, the Berlin Wall had itself been destroyed. In summarising Megyesy’s ‘view’, Tar claims that…

… socialism was always built on the promises of a Utopian system, equality and the ability to solve all social problems (“heaven on earth”).

Eugene Megyesy warns that this is happening again in some countries:

Sadly, there are politicians and bureaucrats in Washington and Brussels, supported by ivory tower academics, media pundits and Hollywood luminaries, who believe socialism is viable.

Megyesy urges today’s generation to look back and think about whether socialism was ever successful. It may have been, but only for a limited period of time. He cites the unsustainability of the capitalism-backed socialistic systems in the Scandinavian countries as an example. In Cuba, North Korea and Venezuela, it is even worse and only serves to highlight the gap between the poor and the leaders living in luxury, Megyesy explains. Before socialism, Venezuela was one of the richest countries; now it’s one of the poorest. According to Megyesy, socialism means…

… control over all means of production and the redistribution of wealth by the government.

Definitions and Debates:

But not every ‘socialist’ today would agree with this definition, especially the idea that public control means control by the central or federal government. Neither does this interpretation match those of the multifarious strands of socialism which developed in western Europe from the middle of the nineteenth century. To define socialism and understand its roots, a longer and broader view is necessary: not one which draws conclusions solely from events since the spread of Stalinism across eastern Europe, or which points to recent events in North Korea or Venezuela as evidence of the failings of the Utopian Socialist system. Many of the twentieth century’s ‘dystopias’ may have had their origins among the nineteenth-century ‘isms’, just as in previous centuries they were often the product of misguided Christian millenarianism, like anti-Semitism; but that does not mean we should simply discard the thinking of the philosophers and political economists who developed detailed critiques of capitalism, any more than we should reject two millennia of Christian theology. After all, as Marx himself noted, philosophers have only interpreted the world: the point is to change it.

In seeking to change its own world, each new generation must produce its own reinterpretation of the ideas handed down to it from past generations and come up with its own solutions to its own moral dilemmas and social problems. That is, in essence, what socialism means to me. We should neither rely uncritically on theories handed down from the past nor reject them out of hand as if all who came before us were thieves and robbers. We can only learn from the past by giving it a fair hearing, remembering, as the novelist L. P. Hartley famously wrote, that the past is a foreign country; they do things differently there. We are solely responsible for our own ‘country’,

the ‘present’, and for failing to learn from our own mistakes in its past. In this context, and according to the eminent ‘man of letters’ of the twentieth century, Raymond Williams (1983), ‘Socialist’ emerged as a philosophical description in the early nineteenth century. In that century and beyond, it could be used in two ways, which have had profound effects on the use of the term by radically different political tendencies. Of course, social was the merely descriptive term for a society in its now predominant sense of the system of common life; a social reformer wished to reform this system. But ‘social’ was also …

… an emphatic and distinguishing term, explicitly contrasted with ‘individual’ and ‘individualist’ theories of society.

Naturally, there has always been a great deal of interaction and overlap between these two meanings, but their varying effect can be seen from the beginning in the formation of the term. In the first sense, socialism was viewed as an extension of ‘liberalism’: it referred to radical political reform of the social order, in order to develop, extend and secure the main liberal values for all members of society: political freedom, the ending of privileges and formal inequalities, and social justice (conceived as ‘equity’ between different individuals and groups). In the second sense, it was seen as the ‘enemy’ of competitive, individualist forms of society, specifically industrial capitalism with its system of wage-labour. Truly social forms depended on practical co-operation and mutuality, which in turn could not be achieved while there was still private (individual) ownership of the means of production. Real freedom could not be achieved, basic inequalities could not be ended, and social justice (conceived as a just social order rather than simply ‘equity’ between individuals) could not be established unless a society based on private property was replaced by one based on social ownership and control.


H. G. Wells, writing his well-known book in 1922, expressed the dichotomy in the following terms:

On the one hand are the individualists, who would protect and enlarge our present freedoms with what we possess, and on the other hand the socialists, who would pool our ownerships and restrain our proprietary acts. In practice one will find every gradation between the extreme individualist, who will scarcely tolerate a tax of any sort to support a government, and the communist, who would deny any possessions at all. The ordinary socialist of today is what is called a collectivist; he would allow a considerable amount of private property, but put such affairs as education, transport, mines, land-owning, most mass production of staple articles, and the like, into the hands of a highly organised state. Nowadays there does seem to be a gradual convergence of reasonable men towards a scientifically studied and planned socialism.

The resulting controversy among the many groups and tendencies all calling themselves ‘socialist’ has been long, intricate and frequently bitter. Each main tendency has developed alternative, often derogatory terms for the others. But until circa 1850, the word was too new and too general to have any predominant use. The earliest known use in English is in Hazlitt’s On Persons One Would Wish to Have Seen (1826), in which he recalls a conversation from 1809, writing of those profound and redoubted socialists, Thomas Aquinas and Duns Scotus. There is also a contemporary use in the 1827 Owenite Co-operative Magazine. Its first recorded political use in French dates from 1833, though ‘socialisme’ had been used in 1831 in a more generic sense, and Owen’s New Moral World also contains a similar use. Given the intense political climate in both France and England in the 1820s and ’30s, these references give a sense of the period in which the word came into ‘common coinage’. It could not have been known at the time which meaning of the word would emerge as dominant. It was a period of very rapid development in political discourse, and until well into the 1840s there were a number of alternative words for ‘socialist’, some of them in more common usage: co-operative, mutualist, associationist, societarian, phalansterian, agrarian, radical. As late as 1848, Webster’s (AmE) Dictionary defined ‘socialism’ as ‘a new term for agrarianism’. By that time in Europe, especially in France and Germany, and to a lesser extent in Britain, both ‘socialist’ and ‘socialism’ were common terms.

One alternative term, Communist, had begun to be used in France and England by the 1840s, but the sense of the word varied according to particular national contexts. In England in the 1840s, communist had strong religious associations, dating back to the Puritan sects of the seventeenth century. Thus its use was distinct from the secular word ‘socialist’ as used by Robert Owen, which was sometimes avoided for that reason. ‘Communism’ before Marx meant the primitive form practised in the early church, when the followers of Jesus ‘held all things in common’. The ‘True Levellers’ or ‘Diggers’ of the English Commonwealth similarly wanted to abolish private property and social distinctions altogether. In the nineteenth century, their ideological ‘descendants’ believed this could only happen if a democratic state were to own all property. The French ‘anarchist’ philosopher Proudhon wrote that all property is theft. But the development of political ideas in France and Germany was different; so much so that Engels, in his Preface of 1888, looking back to the Communist Manifesto which he and Marx had written in 1848, observed:

We could not have called it a ‘Socialist’ manifesto. In 1847, Socialism was a middle-class movement. Socialism was, on the continent at least, respectable; Communism was the very opposite.

For a time, the stresses between employers and employees led to the worldwide dissemination of the very harsh and elementary form of communism which is associated with Karl Marx in particular. However, we need to view Marx’s political economy in its proper context, as part of an integral shift in thinking about how to interpret the new industrial world which had grown up ‘like Topsy’ around the common man. It was only as the nineteenth century developed, according to H. G. Wells, that:

… men began to realise that property was not one simple thing but a great complex of ownerships of different values and consequences … that there is a very great range of things, railways, machinery of various sorts, homes, cultivated gardens, pleasure-boats, for example, which need each to be considered very particularly to determine how far and under what limitations it may come under private ownership, and how far it falls into the public domain and may be administered and let out by the state in the collective interest.


The Growth of Democratic Socialism in Britain & Ireland, 1880-1918:

‘Communist’ retained its French and German senses of a militant movement, while in Britain it was at times preferred to ‘socialist’ because it did not involve atheism. Modern usage began to settle from the 1860s, and in spite of the earlier variations and distinctions, it was ‘socialist’ and ‘socialism’ which became established as the predominant words. Communist, in spite of the distinction originally made, was much less used, and parties in the Marxian tradition took some variant of ‘social’ and ‘socialist’ as titles; usually Social Democratic, which meant adherence to socialism. Even in the renewed and bitter internal disputes of the period 1880 to 1914 in Europe, these titles held. Communism was in this period most often used either as a description of an earlier form of society – primitive communism – or as a description of an ultimate form, a utopia, which would be achieved after passing through socialism. Yet, also in this period, movements describing themselves as ‘socialist’, for example the English Fabians, powerfully revived what was really a variant sense, in which ‘socialism’ was seen as necessary to complete liberalism rather than as an alternative theory of society. To George Bernard Shaw and others in Britain and Ireland, socialism was the economic side of the democratic ideal (Fabian Essays, 33) and its achievement was an inevitable prolongation of the earlier tendencies which Liberalism had represented. Opposing this view, and emphasising the resistance of the capitalist economic system to such ‘inevitable’ development, William Morris used the word communism.


Morris was a well-established writer, artist and craftsman and an honorary fellow of Exeter College, Oxford, and was one of the middle-class Socialists who joined the Social Democratic Federation just as the working-class radicals left it. The Federation’s intransigent opposition to the Liberal Party was unpalatable to many of its promoters and early members, and its denunciation of ‘capitalist radicalism’ led to the defection of nearly all the Radical clubs. As Socialism began to spread in Britain, it became possible for the Federation’s leader, H. M. Hyndman, to convert it into an openly Socialist body, which he did at its annual conference in 1883. It had begun to concentrate on issues such as Housing and the Eight Hours Working Day, which showed that the emphasis was no longer on purely political radicalism. Hyndman wrote to Henry George that same year that Socialist ideas are growing rapidly among the educated class… It was notable that many of these middle-class Socialists found their way to Socialism by way of the land reform movement: this was true of Henry George, whose views were published by the Land Reform Union and (in 1883) the Christian Socialist (I have written about ‘Christian Socialism’ elsewhere on this website). Morris, however, had not taken part in the land agitation: Ruskin, rather than George, seems to have been the means of Morris’s introduction to Socialism. He gives an account of his political development in a collection of testimonies edited by Hyndman, How I Became a Socialist (n.d.). The Federation accepted Hyndman’s declaration of principles, Socialism Made Plain.


In short, British Socialists were a sort of ‘stage army’ in the 1880s: there were plenty of leaders, but a limited number of followers. Yet these leaders succeeded in creating a much greater impression than would be expected from such a small body of opinion. Although it was in the interest of the working classes to follow their lead, there was a very high proportion of middle-class people among the converts of this period, and what the societies lacked in numbers they made up for in the comparative energy, ability and financial generosity of their members. This alone can account for the flood of Socialist periodicals and pamphlets which were already pouring from the presses. There were, first of all, the weekly papers of the SDF and the Socialist League, which enjoyed circulations considerably larger than their immediate memberships. The Commonweal, the League’s paper, issued fifty-two numbers and sold 152,186 copies. The Christian Socialist, nominally an organ of the land reformers but edited by Socialists, gave the cause a great deal of publicity over a long period. Annie Besant, the early trade union leader and editor of the journal of the Law and Liberty League, ensured that Fabian meetings were well reported in it. The Fabians also issued tracts, and the Socialist League published pamphlets and its own reports of debates.

The SDF’s paper, Justice, simply represented the views of the Hyndman group, or ‘clique’, who greeted with scorn and vituperation the slightest sign of deviation from an attitude of uncompromising hostility to all other parties and to alternative views of how to achieve socialism within the Federation itself. In 1895, George Lansbury, who stood for Walworth as an SDF Parliamentary candidate, ventured to write in his manifesto of ‘the transformation of society by peaceful means’, and was severely taken to task by Hyndman for his abandonment of the true revolutionary attitude. Yet in spite of all its defects, the SDF continued to provide a serious challenge to the other early socialist society, the ILP (Independent Labour Party). In 1898, it claimed a total of 137 branches, which was twice as many as it had had in 1893, and roughly two-thirds of the ILP figure. The SDF was, much more obviously than the ILP, a Socialist party; and those who were converted to Socialism by Hyndman and other leaders might well feel that there was an element of compromise about a party which failed to call itself ‘Socialist’ in its title. Members of the SDF were expected to make a real attempt to master Marx’s theories, and even Lansbury’s Bow and Bromley Socialists wearily struggled with ‘Das Kapital’ and Engels’s ‘Socialism, Utopian and Scientific’; this was much more than the ILP branches were usually prepared to do.


Without a programme, Engels realised, there could not be a united Socialist Party on a permanent basis, and every attempt to found one would fail. Indeed, the political independence of the nascent Labour Party from the Liberal Party was always in doubt until in 1918 it accepted a Socialist constitution. In addition, British Socialists possessed a ‘faith’ in the righteousness and ultimate victory of their cause which acted as a powerful driving force. This faith owed as much to Methodism as to Marxism, being based both on Christian principles and the analysis of contemporary society first presented by Marx and Engels. Much of this analysis was modified, however, by Hyndman and the Fabians, by Morris and Blatchford, though it still had a comprehensive reality for those who accepted it. To its working-class adherents, like my own grandparents who founded and campaigned for it in Coventry, it gave a sense of purpose and pride in class consciousness; to middle-class philanthropists, it afforded the consolation that they were working in solidarity with a range of tendencies of social change and progress. As Pelling concluded in his seminal work, the history of the world had often shown the dynamic qualities of a faith devoutly held, like that of the early Christians, the Calvinist reformers and the millenarian sects of the seventeenth century. Faith may feed on illusions, but it is capable of conquering reality:

Socialism had this quality for the early members of the SDF, the Socialist League and the ILP. It led them at times into foolish misstatements, such as that of ‘Justice’ in 1885:

‘If Socialism were the law in England every worker would get at least four times his present wages for half his present work. Don’t you call that practical politics?’

… or such as Blatchford’s declaration in ‘Merrie England’ that…

‘ … this country is capable of feeding more than treble her present population.’

But the faith did not stand or fall by the accuracy of facts and figures: it depended much less for its sources and strength upon reason than upon deeper and simpler forces in human nature. ‘Socialism’, said Shaw in 1897, ‘wins its disciples by presenting civilization as a popular melodrama, or as a Pilgrim’s Progress through suffering, trial, and combat against the powers of evil to the bar of poetic justice with paradise beyond …’. The Socialists made up in energy and enthusiasm for their lack of numbers; in spite of their eccentricities and discords, they formed, in a real sense, a political ‘élite’.

The fact was that the British working class as a whole had no use for the conception of violent revolution. Any leader who failed to recognise this could not expect to win widespread support. Economic grievances could temporarily arouse bitter discontent, as they had done in the early years of the industrial revolution. But dislocations of this type were for the most part transitory: a permanent political organisation of the working class needed to disavow the use of violence. Only those who recognised this could effectively set in motion the movement to form a Labour Party. By the time Keir Hardie retired from the chairmanship of the ILP in 1900, it had captured trade-union support, with the ultimate objective of tapping trade union funds for the attainment of political power.

But soon the ILP was deeply in debt, and was only saved from bankruptcy by the generosity of wealthy supporters such as George Cadbury, who, as a Quaker, appreciated its stance against the Boer War. With Hardie’s re-election to Parliament, and the reaction against imperialism, the ILP’s position steadily improved, and it began to build itself up again and gain fresh recruits, though by 1906 it had not yet felt the full force of the Socialist revival of that time. The Labour Representation Committee was a pressure group founded in 1900 as an alliance of socialist organisations and trade unions, aimed at increasing representation for labour interests in Parliament. The Socialists were a minority force within it, and even after the formation of the Labour Party and its adoption of Socialism as its political creed in 1918, there were many within the party who were hostile to it as an ideology. There is little doubt that most of the non-Socialist trade-union leaders would have been happy to stay in the Liberal Party, to which most of them had belonged in the past, had the Liberals made arrangements for a larger representation of the working classes among their Parliamentary candidates. So the early components of the Labour Party formed a curious mixture of political idealists and hard-headed trade unionists: of convinced Socialists and loyal but disheartened Gladstonian Liberals. Despite the persistence of this mixture of ideas, Pelling concluded:

The association of Socialist faith and trade-union interest, of hope for an ideal future and fear for an endangered present, seemed on the point of disruption at times: yet it survived, for a variety of reasons … because in the years before the party’s birth there had been men and women who believed that the unity of the working-class movement, both in industry and politics, was an object to be striven for, just as now most of their successors regard it as an achievement to be maintained.


Socialism and Communism in Europe, 1871-1918:

Across the continent, the relative militancy associated with the word communist was further strengthened by the vivid impression left by the Paris Commune of 1871 (depicted below), though there was a significant argument as to whether the correct term to be derived from the event was Communist or Communard. For a period of at least ten years, the word Syndicalist became at least as important across Europe as a whole. It described the development of industrial trades unionism as a revolutionary force which would overthrow the capitalist system through the use of the General Strike and revolutionary violence in general. The word appeared in French in 1904 and in English in 1907; but it went through varying combinations with anarchism (in its stress on mutuality) and socialism, especially with Guild Socialism and the Co-operative movements, emphasising the important interests of the consumer in economic models for the future.

The Commune as Seen by Jacques Tardi (“Le cri du peuple”), 2002.

The decisive distinction between ‘socialist’ and ‘communist’ came with the renaming, in 1918, of the Russian Social-Democratic Labour Party (the ‘majority’ or Bolsheviks) as the All-Russian Communist Party. From that time on, a distinction of ‘socialist’ from ‘communist’, often with supporting terms and adjectives such as ‘social democrat’ or ‘democratic socialist’, came into common currency, although it is significant that all ‘communist’ parties, especially in the Union of Soviet Socialist Republics and its ‘satellite’ states, continued to describe themselves as ‘socialist’ and dedicated to ‘socialism’. This is one reason why, in central-eastern Europe, socialism is still viewed by many as synonymous with communism, in contrast to the use of the word throughout the rest of Europe. That does not mean, however, that the history of socialist and social democratic parties in southern, western and northern Europe can simply be tarred with the same brush of the ‘Stalinist’ past, as Megyesy and other politicians have attempted to do in the run-up to this year’s European Parliament elections. Even Jean-Claude Juncker, President of the European Commission and a member of the conservative European People’s Party, has been characterised as a ‘socialist’ in the Hungarian press and media.

The First Hungarian Republic, the ‘Dictatorship of the Proletariat’ & the Horthy Era, 1918-44:


The Proclamation of Mihály Károlyi as President of the new Republic of Hungary.

Elsewhere on this site, I have written about the roots and development of liberal democracy in Hungary, and of how both have been fractured by various forms of authoritarianism and dictatorship, more recently of a populist variety. Yet even in Hungary, we can trace the origins of socialist movements back to 1907, when a series of strikes and disturbances broke out among both the urban and rural workers. But the promise of electoral reform, for which a crowd of a hundred thousand demonstrated for a second time on ‘Red Thursday’, 10th October 1907, came to nothing when Andrássy’s modest bill expanding the suffrage was rejected by the Hungarian parliament. Seven years later, the Social Democrats, as elsewhere in Europe, supported the patriotic war effort, perhaps hoping for democratic concessions in return. Following the Revolution of November 1918, with the establishment of a republic ruled by a National Council, the Károlyi government embarked on the programme of social and political reforms it had announced. These were badly needed, given the explosive atmosphere in the country. There was no political force in Hungary at the time that would have been able to satisfy all of the conflicting interests and expectations of these turbulent times. Although the elections to the new national assembly were conducted on the basis of a franchise including half the population, second only to those in Scandinavia at that time, the effects of progressive social legislation, including the introduction of unemployment benefit and the eight-hour working day, the abolition of child labour and the extension of insurance schemes, could not yet be felt. The political scene became polarised, with radical movements appearing on both the Right and the Left.

The streets, for the time being, belonged to the political Left. Appeals of moderate Social Democratic ministers to order and patience evoked the contrary effect and served to alienate the disaffected masses from them. Their new heroes were the Communists, organised as a party on 24 November 1918 and led by Béla Kun. He was a former journalist and trades unionist, who had recently returned from captivity in Russia, where he had become convinced of the superiority of the system of Soviets to parliamentary democracy.  Communist propaganda also promised an end to all exploitation through the nationalisation of property, as well as international stability through the fraternity of Soviet republics which were prophesied to arise all over Europe. Within a few weeks, this attractive utopia, underpinned by well-designed social demagogy, had earned the Communists a membership of about forty thousand. Their supporters, several times that number, mobilised among the marginalised masses and the younger members of the intelligentsia, susceptible to revolutionary romanticism. By January 1919, a wave of strikes had swept across the country, in the course of which factories, transport and communication installations were occupied; in addition, land seizures and attempts to introduce collective agriculture marked the communist initiative, which also included the demand not only to eradicate all remnants of feudalism, but also the proclamation of a Hungarian Soviet Republic, and a foreign policy seeking the friendship of Soviet Russia instead of the Entente powers.

While the radicals on both the Right and the Left openly challenged the fundamental tenets of the Károlyi government, his Independence Party evaporated around him. Unhappy with the reform projects which Károlyi embraced, and which seemed too radical to them, most of the Independence Party ministers left the government, leaving the Social Democrats as the main government party. But they were struggling helplessly to tame their own radical left wing, which effectively constituted an internal opposition to the government and gravitated towards the Communists. On 21 March 1919, the Social Democrats accepted the invitation to take sole responsibility for the government, but only to accelerate and conclude negotiations with the imprisoned Communist leaders about forming a united workers’ party. A new government, the Revolutionary General Council, presided over by a Social Democrat but in effect led by Béla Kun, was formed on the same day, with the declared aim of establishing a Leninist ‘dictatorship of the proletariat’.


Certainly, the measures introduced by the Revolutionary government went beyond anything attempted in Soviet Russia at that time. The counterpart of these measures in the administrative and political reorganisation of the country was the replacement of the old local, municipal and county bureaucracies with soviets of workers, peasants and soldiers. A ‘Committee of Public Safety’ was organised to put pressure on the civilian population where it was needed in order to maintain the dictatorship of the proletariat; its head, Tibor Szamuely, travelled in his ‘death train’ to trouble spots to preside over summary courts, assisted by the notorious ‘Lenin Boys’, created to supplement the ‘Red Guard’, which took over the ordinary functions of the police and gendarmerie. Besides common murders of actual or alleged enemies by these ‘élite detachments’, some 120 death sentences were meted out by the tribunals for political reasons.

The great momentum of the changes was partly intended to convince people that the realisation of the ‘socialist utopia’ was imminent. Social policy measures, the expected alleviation of housing shortages through public ownership of accommodation in a country flooded by refugees, the nationalisation of large firms, improved educational opportunities, and the more effective supply of food and consumer goods through rationing and supervised distribution met with widespread approval, especially among the urban population. The intellectual élite, who had applauded the democratic reforms of the autumn of 1918, were initially also allured by the attractive goals of the Soviet Republic. They included not only known Marxists like György Lukács, the writer, who became People’s Commissar for Education, but also members of the Nyugat (West) circle, who held positions in the Directorate for Literature, and Bartók and Kodály, who became members of the one for music. Gradually, however, these figures became disaffected, as did the intelligentsia and middle classes in general and the leaders of the October 1918 democratic revolution, some of whom emigrated the following summer. By then, the historian Gyula Szekfű, who had been appointed professor at the University of Budapest, was already at work on his highly influential Three Generations (1920), in which he was hostile not only towards the communist revolution but also towards democracy and liberalism, which he blamed for paving the way for Kun’s régime.

The revolution and the village were unable to come to terms with each other. Despite the steady urbanisation of the previous half-century, Hungary still remained a largely agricultural country, especially after many of its towns were lost to occupation even before the Treaty of Trianon of 1920. Besides being economically unsound amidst a shortage of the raw materials and fuel needed to supply the machinery of the supposedly more efficient large-scale co-operatives, the nationalisation scheme embittered not only the smallholders themselves, who actually lost land, but also the landless peasants, domestic servants and agricultural labourers whose dreams of becoming independent farmers were thwarted by the same urban revolutionaries who had formerly encouraged land seizures. Decrees on the compulsory delivery of agricultural surpluses and on requisitioning further undermined whatever popularity the government still enjoyed in the countryside. The government blamed the food shortages on the peasantry, which exacerbated the already existing rift between town and country and served as a pretext for further central control of the economy. Its anti-clerical measures also annoyed the traditionally devout peasants, concerned as they were about the security of ‘the family hearth’.

001

All of this made the countryside more receptive to counter-revolutionary propaganda, which did not fail to emphasise the foreign (that is, Jewish) character of the revolution (over half of the commissars were indeed of Jewish origin). An ‘Anti-Bolshevik Committee’ was set up in Vienna in April by representatives of nearly all the old parties, led by Count István Bethlen, and a counter-revolutionary government was formed at Arad on 5 May, later moving to Szeged. Paradoxically, the Soviet Republic maintained itself in power for over four months, despite the increasingly dictatorial means it employed, mainly through the temporary successes it scored on the nationalities issue; it collapsed not in the face of internal counter-revolution but when its military position against the Entente’s allies in the region became untenable. The Entente powers, gathered at the Paris Peace Conference, sent General Smuts, the South African statesman, to Budapest in April 1919, mainly to obtain reliable first-hand information about the situation there. Smuts concluded that Hungary truly had a government of Bolshevik character, which gave weight to the French Prime Minister Clemenceau’s proposal to contain German revanchist designs, as well as the spread of Soviet communism into Western Europe, by a cordon sanitaire formed out of the new states of Central Europe. Harold Nicolson, the young British diplomat who accompanied Smuts on the train leaving Paris on April Fools’ Day, wrote about these fears of the Germans turning to Bolshevism in a letter to his wife Vita (pictured below, together in Paris):

They have always got the trump card, i.e. Bolshevism – and they will go the moment they feel it is hopeless for them to get good terms. 

006

Small wonder, therefore, that Béla Kun’s strike for communism caused many anxious moments for the Supreme Council. The negotiations were conducted from the wagon-lit of Smuts’ train at the Eastern Station in Budapest, encircled by Red Guards with ‘fixed bayonets and scarlet brassards’, so as not to imply recognition of the régime. They centred on whether or not the Hungarian Bolsheviks would accept the Allies’ armistice terms, which would commit them to considerable territorial losses. As they hesitated, Harold decided to explore Budapest, a city he had grown up in before the war. He was alarmed and saddened by what he saw:

‘The whole place is wretched – sad – unkempt.’ He took tea at the Hungaria, Budapest’s leading hotel. Although it had been ‘communised’, it flew ‘a huge Union Jack and Tricoleur’, a gesture of good intent. Red Guards with bayonets patrolled the hall, but in the foyer what remained of Budapest society ‘huddled sadly together with anxious eyes and a complete, ghastly silence’, sipping their lemonade ‘while the band played’. ‘I shudder and feel cold,’ Harold remarked. ‘We leave as soon as possible. Silent eyes search out at us as we go.’

Kun desperately needed Allied recognition of his government, but he inserted into Smuts’ draft agreement a clause requiring the Romanian forces to withdraw to a line east of the neutral zone established by the 1918 Armistice, in effect to evacuate Transylvania. Smuts would not countenance this, however, and the Bolsheviks were ‘silent and sullen’. Nicolson wrote that they looked like convicts standing before the Director of the Prison. Smuts concluded that ‘Béla Kun is just an incident and not worth taking seriously’. This proved to be only too true: on 10 April, only a day after Harold’s account to Vita, a provisional government was set up in Budapest seeking to reinstate the old ruling Hungarian cliques, and on 1 August Kun fled the capital in the face of the invading Romanian armies. He ended his days in Russia, ironically as a victim of one of Stalin’s innumerable purges. The world revolution that was expected to sweep away the corrupt bourgeois politicians of the peace conference and their allies spluttered to a halt. The Bavarian Soviet Republic, proclaimed on 7 April, hardly survived into May, and the communist putsch planned by Kun’s agents in Vienna for 15 June also failed. Meanwhile, General Denikin’s counter-revolutionary offensive in Russia thwarted hopes of help from across the Carpathians.

Facing an ever more turbulent domestic situation, marked by widespread peasant unrest and an uprising of the students of the military academy in Budapest, the Revolutionary government, after heated debates, decided to give in to the demands of the Peace Conference, withdrawing Hungarian forces from Slovakia behind the demarcation line at the end of June. Aurél Stromfeld, who as Chief of the General Staff had led the Red Army into Slovakia, where a short-lived Soviet Republic was proclaimed on 16 June, resigned in protest against the ‘capitulation’. Some of his generals now started to join the National Army, organised by the counter-revolutionary government in Szeged under the command of Admiral Miklós Horthy, the last commander-in-chief of the Austro-Hungarian navy. When the Romanians refused to retreat behind the neutral zone as envisaged, the Red Army launched a surprise offensive along the River Tisza. The initial advance was soon aborted, however, and ended in a disorderly flight of the Red Army. On 1 August, with the Romanian forces threatening to occupy the Hungarian capital, the commissars handed back power to the Social Democrats on the advice of trade union leaders, who argued that the creation of a government acceptable to the Entente powers was the only way to avoid complete foreign occupation. The next day, a government led by the trade unionist Gyula Peidl, who had refused to accept the creation of a united workers’ party, took office.

Although it promised to end the ‘dictatorship of the proletariat’ while at the same time resisting a conservative restoration, the new government was still regarded as crypto-Bolshevik not only by conservatives but also by Liberals, peasant democrats and Christian Socialists. It also failed to gain support from the Entente. A coup assisted by the Romanian army, which was occupying Budapest, forced the government to resign on 6 August. The government headed by István Friedrich immediately set about annulling all the measures associated with the Soviet Republic, especially the nationalisation process. It also dismantled all the major social reforms of the democratic revolution, including those associated with individual civil liberties. Revolutionary tribunals were replaced by counter-revolutionary ones, which packed the prisons with workers, poor peasants and intellectuals, and by the beginning of 1920 they had passed roughly as many death sentences as had the lackeys of the ‘red terror’, the ‘Lenin Boys’. The intellectual élite of the country suffered a serious blow: Bartók and Kodály were prosecuted, Móricz was imprisoned, and several dozen left the country, including Lukács, Mannheim and Korda. Horthy’s ‘National Army’, now transferred to Transdanubia, controlled and gave orders to local authorities, and its most notorious detachments were instruments of naked terror. In three months, they may have killed as many as two thousand suspected former Soviet members, Red Army soldiers, and ordinary Jews who were in no way associated with the proletarian dictatorship. Besides the executions and lynchings, about seventy thousand people were imprisoned or sent to internment camps during these few months.

002

Despite the protests of the Social Democrats and other left-wing forces, the occupying Romanian forces were replaced in Budapest by Horthy’s National Army. His speech before the notables of the capital stigmatised Budapest as ‘the sinful city’ that had rejected its glorious past, its Holy Crown and its national colours for the sake of red rags. This suited an atmosphere in which most of the remaining adherents of the democratic revolution, as well as of the communist one, were neutralised in one way or another. The returning conservatives promised to heal the country’s war-wounds by returning it to order, authority and the mythical ‘Christian-national system of values’. Sir George Clerk, the leader of the Peace Conference’s mission to Budapest in October 1919, abandoned his initial insistence that the Social Democrats and the Liberals should have an important role in a coalition government. As Horthy commanded the only troops capable of maintaining order and was ready to subordinate them to government control, any government had to be acceptable to Horthy personally and to the military in general. As a result, the cabinet formed by Károly Huszár on 24 November 1919 was one in which the Christian National Unity Party and other conservative-agrarian groups prevailed over the Independent Smallholder Party, the Social Democrats and the Liberals. Even though the great powers insisted that voting should take place by universal and secret ballot, the circumstances were unfavourable to any illusion of a democratic outcome. Terrorist actions by detachments of the National Army and the recovering extreme right-wing organisations, designed to intimidate the candidates and voters of the Social Democrats, Smallholders and Liberals, led the Social Democrats to boycott the elections of January 1920 and to withdraw from the political arena until mid-1922.

On 1 March 1920, the army occupied the square in front of the Parliament building, and Horthy, accompanied by his officers, entered and, according to medieval precedent, was ‘elected’ Regent, with strong presidential powers. This signalled the end of Hungary’s short experiment with democratic socialism, following its even briefer experience of home-grown communism. Count Pál Teleki and Count István Bethlen, the dominant political figures of inter-war Hungary, both from Transylvanian aristocratic families, argued that the immediate post-war events had shown that the country was not yet ready for the grafting of full democracy onto the parliamentary system. They advocated a limited ‘conservative democracy’, guided by the landed gentry and the aristocracy, as the proper response of the region to the challenges of the democratic age. They opposed all endeavours aimed at the radical extension of the liberal rights enshrined in the parliamentarism of the Dualist era. Liberal democracy seemed to them a mechanical application of the majority principle, undermining political responsibility and stability. They despised communism and were suspicious of social democracy because of its antipathy to private property. But they also opposed the right-wing radical and fascist trends epitomised by Gyula Gömbös and the other ‘protectors of the race’, who thought that the parliamentary system had outlived its usefulness and ought to be replaced by an authoritarian rule which would facilitate a redistribution of economic functions in favour of the Hungarian Christian middle classes and away from the ‘foreign’ bourgeoisie (in other words, the Jews).

The fundamental character which the political system of the country retained until the German occupation of 1944 had emerged by 1922 as a result of the Bethlenite consolidation. Hungary became a parliamentary state with strong elements of authoritarianism and a hegemonic party structure, in which the institutions inherited from the liberal era were operated in an anti-democratic fashion. The government acknowledged a lawful political opposition, consisting on the left of the Social Democrats, bourgeois liberals and, after 1930, a rejuvenated Independent Smallholder Party; and on the right of various groups of Christian Socialists as well as right radicals. One of the most important developments in the intellectual life of the Horthy era was the emergence of the ‘populist’ writers, predominantly young and of peasant origin, who wrote ethnographically-based pieces revealing the economic and intellectual poverty of life in rural Hungary and drawing the attention of the ruling classes to the need for change. In ideological terms, some of them, most notably László Németh, advocated a ‘third way’ for Hungary between East and West, or between Soviet collectivism and capitalist individualism. Others, including Gyula Illyés and Ferenc Erdei, sympathised with socialism. Their top priority was the improvement of the lot of the poor peasantry through a genuine redistribution of land. But their willingness to engage with both the extreme Left and the extreme Right, as well as their emphasis on the ‘village’ as the root of ‘Hungarianness’, with its anti-Semitic overtones, brought them into conflict with more cosmopolitan democrats and ‘urbanist’ intellectuals. This was symptomatic of a broader and longer-term division among Hungarian progressives which survived both the attempts of the Soviet communists to homogenise Hungarian society and the post-1989 transition to democracy, and which is resurgent in the propaganda of the current right-wing populist era.

003

The Second Hungarian Republic & The Eras of Rákosi & Kádár, 1945-1989:

The second Republic of 1945 was just as brittle as that which followed the First World War, ending in a Soviet-style government which lasted more than forty years. By the time of the elections of November 1945, the communist vanguard, which had numbered only three thousand a year before, had managed to create a mass party of half a million members through an unscrupulous recruiting campaign. Unlike the Social Democrats, they did not mention socialism even as their strategic goal, and their rhetoric concentrated mainly on the pressing tasks of reconstruction combined with reform. Their avowed programme was essentially the same as that of the Independence Front, though they did not refrain from occasionally playing nationalist tunes. Workers and smallholding peasants joined out of conviction, intellectuals out of idealism, civil servants out of fear and opportunism; the surviving Jews of Budapest joined out of gratitude to their liberators and in search of a new experience of community. Besides boasting an ever-growing influence of its own, the Communist Party was also able to manipulate the other parties of the Left. The Social Democratic Party, whose 350,000-strong membership possessed a powerful working-class consciousness, found it increasingly difficult to resist the call of the Communists for working-class unity. Together with the National Peasant Party, the Social Democrats chose to join the Communists in the Left-Wing Bloc on 5 March 1946, following the elections of the previous November, which had been won by the Smallholder Party with fifty-seven per cent of the votes; the Social Democrats and the Communists polled seventeen per cent each, and the National Peasant Party a mere seven per cent.

002

‘Forward to Peace & Socialism!’ The Young Pioneers’ Congress.

The elections themselves, by secret ballot and without a property or educational franchise qualification, were the freest ever held in Hungary until 1990. Cardinal Mindszenty, the head of the Hungarian Catholic hierarchy, had condemned the ‘Marxist evil’ in a pastoral letter and called upon the faithful to support the Smallholders. Whatever the voters made of this intervention, the verdict of 4.8 million of them, over ninety per cent of the enfranchised, clearly showed their preference for the return of parliamentary democracy, based on private property and the market economy, over socialism with state management and central economic planning. But then the Smallholders gave in to Soviet pressure for the formation of a ‘grand coalition’, in which the communists were able to preserve the gains they had already secured and to build a firm base from which they gradually bullied their way to power by 1949. After the tribulations of the Rákosi dictatorship, it was not surprising that in 1956 what began as a struggle between ‘reform’ communists and the orthodox within the party, set off by and adjusting to changes in Moscow, and itself triggering a growing ferment among the intelligentsia, became a national anti-Soviet uprising. The events which began, from 20 October onwards, with meetings and demonstrations at the universities in Budapest and the provinces culminated in a peaceful demonstration on 23 October in support of Gomulka’s reforms in Poland; they became a ‘revolution’ when the crowd successfully laid siege to the radio station and fighting began the next day between Soviet tanks and young working-class ‘guerillas’, whom even the restored Prime Minister referred to as ‘counter-revolutionaries’ at this stage.

014

All that the insurgents agreed on was their desire to restore national sovereignty and to put an end to arbitrary rule. They did not call for a reversal of nationalisation or a return to the pre-1945 order. As fighting continued, by 28 October Nagy had dropped the label ‘counter-revolution’ and started to talk about a ‘national democratic movement’, acknowledging the revolutionary bodies created during the previous days. The Hungarian Workers’ (Communist) Party was reformed as the Hungarian Socialist Workers’ Party (MSZMP), and the old coalition parties became active again, including the Social Democrats. After his initial uncertainty, the Prime Minister kept pace with developments on the streets, closing the gap between himself and the insurgents step by step. These changes culminated in the formation of a new multi-party cabinet on 2 November, including reform Communist, Social Democrat (Anna Kéthly, below), Smallholder and Peasant Party members.

006

However, this consolidation of power by a now avowedly ‘Revolutionary Government’ involved the collapse of the whole system of party-state institutions on which the cohesion of the Soviet bloc rested, and this was unacceptable to the Moscow leadership, Khrushchev included. It could not afford to lose a country of Hungary’s strategic location and mineral wealth from among its satellite states. But it was the radicalisation of the revolution in Budapest which made it impossible for a compromise deal to be struck. After announcing the formation of the MSZMP, and declaring himself in favour of neutrality and willing to fight in the streets, János Kádár left Parliament on 1 November for the Soviet Embassy. He quickly found himself in Moscow, where he became the latest figure selected by the politburo to steer Hungary on a course acceptable to them. Having accepted this assignment, he entered Budapest with his cabinet in Soviet tanks on 7 November.

Although the pockets of armed resistance had been mopped up by 11 November, the most peculiar creations of the revolution, the workers’ councils, started to exert their true impact after 4 November, with an attempt to organise a nationwide network. Initially set up as strike committees, their basic idea was self-management in the factory, owned principally by the workers. On the initiative of the workers’ councils, a massive wave of strikes lasted into January 1957. The intellectuals, rallying mainly in the Writers’ Association, the students’ committees and the Journalists’ Association, founded the Revolutionary Council of the Hungarian Intelligentsia, chaired by the composer Zoltán Kodály, which demanded the restoration of the country’s sovereignty and representative government. These movements marked out the Revolution as more than simply a defeated national uprising: they were clearly socialist in their aims and membership. Kádár, on the other hand, did not have a clear policy for coping with this situation. The government programme which he drafted while still in Moscow included promises of welfare measures, workers’ self-management and policies to aid the peasantry and small-scale enterprises. But these were clearly not the reasons for his ‘appointment’ by his Moscow patrons. To begin with, he was too busy organising special police forces for the purposes of retaliation and repression to spend time setting out policies. Although he negotiated with the leaders of the Budapest Workers’ Council on 22 November, on the previous day his special police squads had prevented the creation of a National Workers’ Council, and in early December two hundred members of the movement were arrested on the same day that saw the abduction of Nagy and his associates.

006

The revolutionary committees which had been set up were dissolved, and the police shot dead nearly a hundred demonstrators in Salgótarján, Miskolc and Eger. The ideological justification for these actions, for the continuing repression and for the impending campaign of retaliation was created at a party conference which identified the causes of the October Uprising as, on the one hand, the mistakes of the Rákosi-Gerő faction and, on the other, the undermining of the party by the ‘Nagy circle’, leading to a capitalist-feudal counter-revolution of Horthyite fascism… supported by international imperialism. Given the trauma created by the revolution, its repression and the retaliation which followed in 1956-58, it is not surprising that Hungarian society was in the mood for Kádár’s Realsozialismus, based on his personalised creed that the ‘little man’ was interested simply in a decent living rather than in the great political issues of the day. Building his power on the ruins of the revolt, he used the scope it gave him to buy the complicity of Hungarians by unorthodox methods. In November 1962, Kádár somewhat pompously announced that the foundations of socialism in Hungary had been laid and that the construction of socialism was an all-national task, dependent on co-operation between Communists and non-party members, irrespective of personal convictions. There was to be no ‘class war’: this became known as the ‘Kádár doctrine’. These were the foundations of the ‘Hungarian model’, often referred to as ‘Gulyás communism’ in the 1970s, which was a far cry from utopian models. With characteristic persistence, Kádár managed to earn legitimacy, retaining it until it became apparent in the 1980s that Realsozialismus was not a functioning system, but merely ‘the longest path from capitalism to capitalism’.

Conclusion: The End of ‘Class-War’ Socialism?

In late 1946, a group of historians, friends and members of the Communist Party, began meeting regularly at Marx House in London, pictured here.

Marx House (Memorial Library) in London.

Marx (before ‘Marxism’) based his theories on the belief that men’s minds are limited by their economic circumstances and that there is a necessary conflict of interests in our present civilisation between the prosperous and employing classes and the employed masses. With the advance in education necessitated by the mechanical revolution, this great employed majority would become more and more class-conscious and more and more solid in its antagonism to the ruling minority. In some way, he prophesied, the class-conscious workers would seize power and inaugurate a new social state. The antagonism, the insurrection, the possible revolution are understandable enough, but it did not follow that a new social state, rather than a merely destructive process, would ensue. Marx sought to replace national antagonisms with class antagonisms, yet it is interesting to see how two lines of thought so diverse in spirit and so different in substance as the class-war socialism of the Marxists and the individualistic and socialist theories of western Europe have continued to be part of a common search for more spacious social and political ideas and interpretations. In the long history of socialism in western Europe, as contrasted with the seventy years of Soviet-style Communism, the logic of reality has usually triumphed over the logic of theory.

Sources:

Raymond Williams (1983), Keywords: A vocabulary of culture and society. London: Harper Collins.

Henry Pelling (1965), Origins of the Labour Party (second edition). Oxford: Oxford University Press.

László Kontler (2001), A History of Hungary. Budapest: Atlantisz Publishing.

H. G. Wells (1922, 1946), A Short History of the World. Harmondsworth: Penguin Books.

  

Posted April 19, 2019 by TeamBritanniaHu


Paul of Tarsus: Endnotes & Evaluations on his Legacy to the Early Church.

001 (3)

Archaeological Insights:

002 (2)

The first missions to the Gentiles, as presented in the Acts of the Apostles, offer a fruitful field for archaeological study. Different kinds of detail interlock. For example, Paul met the Christian couple Priscilla and Aquila in Corinth after the Emperor Claudius had expelled the Jews from Rome (Acts 18: 2). This expulsion is mentioned in pagan literature and dated to AD 49 by a later writer. During Paul’s long stay in Corinth, Gallio became governor (Acts 18: 12); he is known elsewhere from the writings of his more famous brother Seneca, and his governorship can be dated to AD 51-2 by an inscription found at Delphi. This evidence helps to build a consistent and fairly precise outline of this part of Paul’s life and helps to relate Acts to Paul’s letters.

001 (2)

Many details of the names of people and officials, places and customs in the book can be exactly illustrated from inscriptions. This does not prove its account to be historically accurate, but it does rule out any view which holds that the writer, probably Luke (Paul’s early travelling companion and the author of the synoptic gospel which bears his name), was careless about such details. It also makes it hard to believe that the book was written long after the events it describes. A test case of the relationship between Acts, the Epistles and the archaeology is Paul’s letter to the Galatians. Sir William Ramsay used the evidence of inscriptions to establish clearly the extent of Galatia, and then argued that the letter was sent to the southern cities, such as Pisidian Antioch in Phrygia (above), which Paul had visited on his first journey (Acts 13-14). This, in turn, fits the very early dating of the letter. Thus the details of Paul’s life contained in the letter may be linked directly to those in Acts.

The Greek Writer and Theologian:

Paul’s surviving letters are found in the New Testament. Galatians was probably written before the Council of Jerusalem in about AD 50. The two letters to the Thessalonians date from his first journey into Greece; Romans and I & II Corinthians come from his last spell in Greece before his arrest at Jerusalem. Philippians, Colossians and Ephesians were probably written from Rome during Paul’s first imprisonment there, and Philemon may have been written during his earlier house arrest in Ephesus. The two letters to Timothy and the letter to Titus, the ‘pastoral’ letters, were probably written after Paul’s first stay in Rome. In his letters Paul showed his mastery of Greek, and some of them can be counted among the classics of Greek literature. The letters were highly valued during Paul’s lifetime and were collected together soon after his death. By AD 95 they were accepted on an equal basis with other Scripture, and they were in their present form by AD 140. Paul’s theology was not well understood in the period immediately after his death. This was partly because the heretic Marcion rejected the Old Testament and much that was Jewish in the emerging canon of the New Testament. He considered that Matthew, Mark, Acts and Hebrews favoured Jewish readers exclusively. He also cut out the pastoral letters to Timothy and Titus, which left him with only a mutilated version of Luke’s Gospel and ten of Paul’s letters. He believed that Paul was the only apostle who did not corrupt the gospel of Jesus. As long as Marcion’s heresy was a threat, mainstream Christian teachers did not stress many of Paul’s more distinctive doctrines, such as those regarding the law of Christ and God’s grace. It was not until the time of Augustine that full weight was given to Paul’s theology.

004 (2)

The Missionary’s Achievements:

002 (5)

Paul’s achievements as a missionary were immense. The years between his Damascene conversion in AD 35 and his Antiochene preparations and initial discussions with the church in Jerusalem from AD 45 remain somewhat obscure, but during the next ten or twelve years his activity was astounding. Between AD 47/48, when he set sail with Barnabas on his first missionary journey, and AD 57, when he returned to Jerusalem for the last time, he established flourishing churches in the major cities of the Roman provinces of Galatia, Asia, Macedonia and Achaia. His decisive role in the early Christian mission to the Gentiles was due principally to his championing of it before the first churches in Jerusalem and Antioch in Syria.

He then developed the theological defence of the Gentile mission which is clearly set out in Romans 1-11, while working hard to hold together and reconcile Jewish and Gentile Christians in the Diaspora. With this purpose in view, he kept in constant touch with the ‘mother church’ in Jerusalem, collecting a considerable sum of money among the Gentile converts for the needs of the Christians in Judea, and regularly underlined the importance of Christian unity in his letters. Finally, Paul’s principle of being ‘all things to all people’ helped him to move with relative ease between the synagogues, halls and house-churches of Graeco-Roman society, where ultimately the gospel received its greatest response. Moreover, his personal example as a self-supporting travelling missionary and his ‘settlements’ in significant cities provided a pattern of ministry for others to follow. His preference for the single life was based not on the kind of celibacy which Jesus advocated for some in Matthew 19, but on his initial sense that Christ’s return might come very soon. He certainly recognised the practical advantages for missionaries of remaining unmarried. However, like Jesus, he did not advocate a life of asceticism and self-denial as the norm for ministry and attacked the teaching that it was wrong to marry.

The origin and meaning of the word ‘apostle’ are hard to establish, and it obviously means very different things to different New Testament writers. For Luke, an apostle is one who accompanied us during all the time that the Lord Jesus went in and out among us (Acts 1: 21), thus excluding Paul. But for Paul himself, apostleship was something to be proud of, and he is very anxious to defend his own (I Cor. 9: 1). For him, the apostles are those who have been commissioned by an appearance of the risen Lord, as he had been on the road to Damascus. Later, in his Pastoral letters, Paul is the Apostle, the guardian of the faith. The one point of agreement is that apostleship is not something that can be passed on. A famous passage, I Cor. 12: 28, mentions in succession apostles, prophets and teachers, and Eph. 4: 11 has a similar list. It is doubtful, however, whether these can be regarded as different classes of ministry. Rather, they are different activities, more than one of which might be practised by a single individual:

  • Deacon is usually a general term, describing any form of ministry or service. In two passages, the deacon seems to be a particular minister, subordinate to the bishop (Phil. 1: 1; I Tim. 3: 8-13). If the two terms are used technically in Phil 1: 1, this is the only evidence we have of such a formal ministry from the Pauline letters so the terms may be general even there.

  • Elders are not mentioned at all by Paul but are to be found as ministers throughout Acts, appointed by Paul and Barnabas in every church (Acts 14: 23; cf. 15: 12 ff.; 16: 4; 20: 17; 21: 18). Here Jewish practice is followed. Villages and towns had their groups of Jewish elders, seven in each village, twenty-three in each town and seventy in Jerusalem. When a place fell vacant, it was filled by the laying on of hands, the pattern found in Acts.

  • Bishop is a term which occurs in a technical sense in Acts 20: 28, but as in Phil. 1: 1 the word may be used generally as ‘overseer’. Bishop is a definite office in I Tim. 3: 1-7; Titus 1: 7-9. The relationship between elders and bishops is a classic problem, as at times the two terms could be synonyms. By the end of the second century, each bishop was in charge of a particular area; all bishops were elders, but not all elders were bishops.

We have even less evidence about the ministry at this time than about other important matters, and what is said in the ‘Apostolic Fathers’ does little to help. Clearly, the pattern varied from place to place, and development was by no means uniform.

How would Paul have assessed the significance of his work?

From differing angles, more can be said about the reasons for the surprising long-term success of Paul’s work. Tom Wright tells us that Paul’s particular vocation was to found and maintain Jew-plus-Gentile churches on Gentile soil. He realised early on that it was his job not just to teach people what to think and believe, but to teach them how; how to think clearly, scripturally, prayerfully. The One God had already built his new Temple, his new microcosmos; the Jew-plus-Gentile church was the place where the divine spirit already revealed his glory as a sign of what would happen one day throughout the whole world. Of course, Paul would not have expected all this to happen smoothly or easily. He was a realist and would never have assumed that the transformation of small and often confused communities into a much larger body, forming a majority in the Roman world, would come about without terrible suffering and horrible pitfalls. He would also have been saddened by the mistakes and heresies of the following centuries and the battles that would have to be fought. But he would also have pointed out that something had happened in Jesus which was of cosmic significance. The success of the ‘Jesus Movement’ wasn’t simply the accidental product of energetic work meeting historical opportunity. God was at work in the midst of his people to produce both the will and the energy for it to succeed. This divine design and Spirit-led motivation were bound to have their larger effect, sooner or later, and by whatever means they could find.

Paul was also very much alive to all the factors that the historian, as opposed to the theologian, might want to study. He would have been well aware of the need for historians to demythologise scriptural narratives; in his own day, Greek scholars were doing the same kind of thing with the stories of Homer. Paul would not, himself, have wanted to ascribe the whole happening of Jesus to divine or angelic power operating without human agency, since he believed that when grace was at work, human agents were themselves regularly called upon to work hard as a result, not least in prayer. He said this of himself (I Cor. 15: 10; Col. 1: 29). The Creator may work in a thousand ways, but one central way is, for Paul, through people who think freely, pray, make difficult decisions and work hard. Since heaven and earth had come together in the persons of Jesus and his Spirit, we should expect different layers of explanation to sit together and reinforce each other. Paul was one of the most successful public intellectuals of all time precisely because he was able to take advantage of the circumstances of his time – a common language, freedom of travel and citizenship of the Roman Empire – to establish an international movement, not only for the course of his own lifetime but for an indeterminate historical future.

Paul’s Personal Attributes:

Tom Wright highlights a number of personal attributes which enabled Paul to develop the early Christian church throughout the Eastern Mediterranean and in Rome itself. First of all, he points to the sheer energy of the missionary, which can be found not only in the narratives of Acts but also pulsing through his letters. He responded to violence in one city by going straight on to the next, saying and doing the same things there. He worked all hours, making tents when not preaching, teaching or dictating letters to a scribe. He was also ready at every moment for the visitor with a question or a local official worried about his status. He was ready to put down his tools and leave his workbench for an hour or two in order to go from house to house making pastoral visits: to encourage the faithful, to comfort the bereaved, downhearted and distressed, to warn and to pray. In between his house calls, he was thinking about what he would say in his afternoon address in the house of Titus Justus in Corinth or the hall of Tyrannus in Ephesus. In the evening, he would pause to say prayers with his close friends and travelling companions, before working long into the night, praying for those he had met that day, for the city officials and for the Christians in other cities, for the next day’s work and the next phase of his mission.

His second attribute was his direct, up-front habit of telling it as he saw it, no matter who was confronting him. From his early days in Damascus, getting into trouble, to his arguments with the apostles in Jerusalem and his confrontation with Peter in Antioch, he didn’t hold back from controversy or seek to avoid conflict if he thought that confronting and resolving it would advance the church’s mission. Wright suggests that the only reason he didn’t say more at the Jerusalem Conference was that Barnabas was there to act as a moderating influence. His debating style might have proved effective, but it might also have alienated many more sensitive souls. He also confronted the magistrates at Philippi and relished speaking truth to the vast crowd in Ephesus; he was fearless in trying to explain himself to the lynching mob in Jerusalem and was not afraid to rebuke the High Priest. He was an astute politician who knew how to turn the various factions of the Sanhedrin against each other. He also lectured the Roman governor himself about justice, self-control, and the coming judgement. As a travelling companion, he must have been exhilarating and exasperating in equal measure, depending on whether things were going well or badly. He must have been a formidable opponent, since he seems to have driven some people to contemplate murder as their only means of ridding themselves of this troublesome missionary.

Yet there must have been something quite disarming about Paul’s vulnerable side, which helps to explain why people wanted to work alongside him. He was the sort of person whose affection for his fellow Christians knew no limits. His honesty shines through in the pages of his letters. He would do anything he could for the churches, since God had done everything for him through the Messiah. Neither would he have asked anyone to face anything he himself had not faced, including terrible suffering and hardship. The Corinthians would have immediately recognised a self-portrait in his poem about divine love, and when he told the Philippians to rejoice and celebrate, they knew that, given half a chance, Paul would have been at the party in spirit, the life and soul of it. He modelled what he taught, and what he taught was the utter, exuberant, self-giving love of the Messiah and the joy that accompanied it. His associates were fiercely loyal to him, and there was mutual love between them. He was the sort of person who enabled others to change and grow, so that they would take the same missionary work forward with as much of that energy as they could muster.

Paul’s Writing:

But within two or three generations the memory of this personal relationship had faded, so that it was his letters which kept his influence alive. The flow of words from his daily teaching, arguing, praying and pastoral work was captured for future generations in these short, challenging epistles. It isn’t just their content, strikingly original and authentic as it is. He wasn’t synthesising the worlds of Israel, Greece and Rome; his was a firmly Jewish picture, rooted in Israel’s ancient narrative, with its Messiah occupying centre stage and the nations of the world and their best ideas brought into new coherence around him. Nor was he simply teaching a ‘religion’ or ‘theology’, but drawing together wisdom learnt from many different ancient disciplines, which we would class under economics, history and philosophy. Yet within a generation people were grumbling that Paul was sometimes too difficult to understand and that some were misinterpreting him. But it is no accident that many of the great moments of church history and Christian thought, involving Augustine, Luther and Barth, have come about through fresh engagement with Paul’s work. Paul had insisted that what mattered was not just what you thought but how you thought. He modelled what he advocated, and generation after generation has since learned to think in this new way. Thus his legacy has continued to generate fresh dividends.

Culture, Politics & Society:

Paul himself would claim that all this was the doing of the One God and his Messiah, whereas ‘sceptics’ might retort that the movement owed much to the spread of the Greek language and culture, combined with the increasing ease of travel throughout the Roman Empire. This meant that conditions were ripe for the spread of new ideas and movements throughout the known world and even into South Asia. Paul would perhaps have countered that if the Messiah was sent when the fullness of time arrived (Gal. 4: 4), then perhaps Greece and Rome were part of the plan and the preparation, as well as part of the problem. Tom Wright does not agree, however, with those who have claimed that people were getting tired of the old philosophies and pagan religions and were ready for something new. The problem in Ephesus, for example, was not that people had stopped worshipping Artemis and so were ready for Paul’s message, but that Paul’s message about the One God had burst on the scene and stopped the worship of Artemis. Social and cultural conditions can help to explain the way things worked out, but they cannot explain it away. Paul emphasised, in letter after letter, the family life of believers in what he begins to call ‘the church’, the ekklesia. He continually emphasises the unity and the holiness of the church, as well as highlighting and ‘celebrating’ the suffering that he and others would and did endure as a result of their loyalty to Jesus. This was not about pagans experimenting with new ideas, but about a new kind of spiritual community and even a new kind of ‘politics’.

Politics is concerned with the polis – the city, the community – and how it works and runs. Sophisticated theories had been advanced in Paul’s day, often by theoreticians like Cicero and Seneca who were themselves members of the ruling élite. The main feature of Paul’s political landscape was Rome, which had united the world, or so it claimed. But that top-down uniformity, in which diversity was tolerated only as long as it didn’t threaten the absolute sovereignty of Caesar, was often ugly. ‘Diversity’ was still seen in strictly hierarchical terms: men over women, free over slaves, Romans over everyone else. Rebels were ruthlessly suppressed. They make a wilderness, sighed the Briton Calgacus, and they call it ‘peace’ (Tacitus, Agricola 30.6). What Paul had been doing was undoubtedly building a different kind of community, offering a different vision of unity and hosting a different kind of diversity, based on churches of Gentiles and Jews. He was founding and maintaining an interrelated network of communities for which the only analogies were the synagogue communities, on the one hand, and the Roman army and civil service on the other. But Paul’s communities were very different from either. They had the deepest roots and were not simply a freestanding innovation: Rome traced its story back nearly a thousand years, while the synagogue told the still longer story which went back to Abraham. Paul told that story too and regularly explained to his communities that they had been grafted into that great tradition. In Paul’s work, this was as much a social and communal strength as it was a theological one.

Morality & Marriage:

When the new communities spoke of a different kind of kyrios, one whose sovereignty was gained through humility and suffering rather than wealth and conquest, many must have found that attractive, not simply for what we would call ‘religious’ reasons, but precisely for what they might call ‘political’ ones. Paul did not, of course, have time to develop his picture of the differentiated unity of the body of Christ into a larger exposition of the church as a whole. He had not articulated a political theory to match that of Aristotle or his successors. But it was that kind of social experiment, of developing a new way of living together, that the churches of the second and third centuries sought to develop. Their inspiration for this went back to Paul’s theological vision and was not pure pragmatism. It had the power to generate an alternative social and cultural reality, to announce to the world that Jesus was Lord and Caesar wasn’t. What Paul had articulated in his letters, often in haste and to meet particular crises, was reused to encourage Christians to develop a refreshingly new kind of human society. In particular, the Christian message provided a much better prospect for women than the pagan religions, which routinely practised infanticide for unwanted children in general and girls in particular. The Christians followed the Jews in renouncing such behaviour. The consequent shortage of marriageable girls among pagans and the surplus among Christians led to an increase in inter-cultural marriages, with many of the offspring being brought up as Christians. The fresh evaluation of the role of women, begun by Jesus himself, was developed by Paul, who listed several women among his colleagues and fellow workers. For example, Phoebe was entrusted with the responsibility of delivering and expounding his letter to the Romans.

With sexual excesses all around them, it is likely that some Christians reacted against sexual indulgence from a fairly early period. However, this was not formally set out or made a matter of special praise. In fact, special vows by younger women to abstain from marriage were discouraged by Paul. During the period which followed, abstinence from marriage was left as a matter of personal choice, although in most ‘Gnostic’ sects marriage was actively discouraged on the grounds that it entangled the spiritual soul with the evil physical world. Some Jewish and Christian traditions blamed sexual differences on ‘the fall’ and believed that salvation included a return to a ‘unisex’ or asexual life. In the mainstream churches, leaders such as Melito of Sardis became known for their austere personal lives; abstinence from marriage was part of this. In many churches, too, Christian women had difficulty in finding suitable husbands. Those who remained unmarried had more time for prayer and devotion. In the same way, men who were free from family ties had more time to devote to church affairs and were often obvious choices as leaders. By the third century, celibacy was beginning to be valued as a mark of holiness. Even so, extremes were frowned upon, and Origen earned considerable disapproval because he made himself a eunuch, believing that this was commended in the Gospels. As martyrdom declined, asceticism began to become the measure of spirituality; the leaders regarded as more spiritual in the churches tended to be those who practised an ascetic way of life, though the clergy was not generally obliged to be celibate.

Poverty & Social Action:

Within a few generations, the early Christian communities set up hospitals, caring for all those within reach, and they were also enthusiastic about education, teaching their converts to read the scriptures of ancient Israel and thereby giving them literacy skills that previously no more than thirty per cent of the population, almost exclusively male, had acquired. Some of the older Greek cities and islands had a tradition of elementary education for citizens, but for many people this would have been minimal, and women and slaves were excluded. Converts to Christianity, therefore, gained basic reading skills that they had hitherto lacked. Christians were also technological pioneers in making books, abandoning scrolls with their natural limitations and developing the ‘codex’, the ancestor of the modern bound book. The earliest Christian congregations quickly appreciated the value of the letters written by the apostles. Some of them were obviously intended for public reading, perhaps in place of, or alongside, a sermon on the Old Testament, and for circulating among the churches. But they clearly wanted more and more people to be able to read the books the community was producing. This insistence on education, and especially on reading, can be traced back directly to Paul, who told his churches to be ‘grown-up’ in their thinking, to be transformed by the renewal of their minds as well as their hearts. He wanted the early Christians not only to think the right things but also to think in the right way. Though he did not himself found what we would today call ‘schools’, when such things did come about they had him to thank for the underlying impetus.

Paul’s collection for the poor of Jerusalem was followed up in each local Jesus community in its work among the poor around it. Paul congratulated the Thessalonians on their practical ‘loving-kindness’ or agape and urged them to work at it more and more. “Do good to everyone,” he wrote to the Galatians, “and particularly to the household of the faith.” He encouraged them to… Celebrate with those who are celebrating, mourn with the mourners… Shine like lights in the world. The gospel itself was designed to generate a new kind of people, a people who would be eager for good works; in fact, the new kind of humanity that was brought to birth through the gospel was created for the specific purpose of ‘good works’ (Gal. 2: 10; I Thess. 4: 9-10; Gal. 6: 10; Rom. 12: 15; Phil. 2: 15; Titus 2: 14; Eph. 2: 10). This phrase means more than ‘the performance of moral rules’, especially when played off against Paul’s doctrine of justification by faith alone. Morals matter, faith matters, but that isn’t the point here. Paul’s emphasis is all about communities through whose regular practice the surrounding world is made a better place. Through Christ’s faithfulness and their own loving-kindness, these communities would find the right way to live. Good morals and good works would follow. In Corinth, there was a tendency to divide into factions centred on the personalities of human leaders, rather than just over doctrines. A prominent member of the community was living in immorality and individual Christians were taking each other to the law-courts over minor disputes. There were also misunderstandings about the meaning of Christian liberty. Paul’s letters, as well as those of John, reveal controversies and power-struggles in the midst of encouragement and growth.

The Spread of Christian Communities:

But the church history of the second and third centuries is enough to confirm that all these things, taken together, offer good explanations for the spread of the Christian communities. These early Christians, strange though their views and lives might have seemed to those around them, antisocial though some might have supposed them to be, were doing things that really did transform the wider society. By the end of the second century, Roman officials were not particularly aware of the nuances of Christian teaching, but they did know what the term ‘bishop’ meant – someone who agitated about the needs of the poor. This too was the result of a seed that Paul had planted, and when all of these seeds began to sprout, a community came into being that challenged the ancient world with a fresh vision of a society in which each worked for all and all for each. This enabled that world to escape from the older paganism and its social, cultural and political practices and to find refuge in a new kind of community, the koinonia, the ‘fellowship’, the extended family of the One God. On the cross, Jesus had won the victory over all the other powers, or gods. This was the basic belief of these communities, which existed because all the old gods had been overthrown. Mammon, Mars and Aphrodite had been shown to be imposters, and Caesar was no longer the ultimate Lord. This was a theological, historical and political reality which the followers of Jesus demonstrated on the streets and in the market places, as well as in their homes.

Paul’s thinking broke through in Graeco-Roman society not because the other philosophies of the ancient world had ‘run out of steam’. The Stoics, Epicureans and Platonists had serious, articulate and even ‘charismatic’ spokespeople, and they were all, in the final analysis, ways of understanding the world and of finding a coherent path for humanity within it. When later generations of Christians wanted to articulate the gospel version of the same thing, they turned to Paul for help, though other sources remained vital; the prologue to the Gospel of John is an obvious example. But it was Paul’s engagement with the triple traditions of Israel, Greece and Rome, and his transformation of them by the person and Spirit of Jesus, that offered a platform for the great Christian thinkers of subsequent generations and centuries. Without this firm theological foundation, the church would not have survived the persecutions it was forced to endure in those centuries. Paul knew only too well what learning how to think would cost those who were ‘to follow’, but he believed that this new way was the only way for them, a way that would win out over the others because of its genuine humanity.

The Wright Verdict:

Tom Wright completes his answer to his own question by summarising the several paths of explanation which converged on Paul himself in his mapping out of this ‘new Way’:

His was the vision of the united, holy, and outward-facing church. He pioneered the idea of a suffering apostleship through which the message of the crucified Jesus would not only be displayed, but be effective in the world. He could not have foreseen the ways in which these communities would develop. He might well not have approved of all that was done. But the historian and biographer can look back and discern, in Paul’s hasty and often contested work, the deep roots of a movement that changed the world…

… Paul’s vision of a united and holy community, prayerful, rooted in the scriptural story of ancient Israel, facing social and political hostility but insisting on doing good to all people, especially the poor, would always be central. His relentless personal energy, his clarity and vulnerability, and his way with words provided the motor to drive this vision, and each generation will need a few who can imitate him. His towering intellectual achievement, a theological vision of the One God reshaped around Jesus and the spirit and taking on the wider world of philosophy, would provide the robust, necessary framework for it all. When the church abandons the theological task… we should not be surprised if unity, holiness, and the care for the poor are sidelined as well.  

Paul’s contribution to the Nature & Worship of the Early Church:

The church brought together ideas and people from many backgrounds. It had to cope with people who had become Christians in such disreputable seaports as Corinth, notorious for its immorality. It had to resolve the pressures to revert to pagan or Judaic practices, to sort out its attitudes towards contemporary customs and cultures, and to thrash out beliefs and opinions about issues on which there were no precedents to guide its thinking. Many Christians in the third century were willing to suffer as martyrs rather than betray their Lord by acknowledging false gods. Some, however, renounced their faith under torture or the pressure of imprisonment. Others got pagan neighbours to make the required sacrifice on their behalf, or obtained false certificates from sympathetic officials. At the opposite extreme, some Christians eagerly sought out martyrdom, even when it was not forced upon them, though this was strongly discouraged by Christian leaders. Following each wave of persecution, the church was faced with the problem of what to do with those who repented after lapsing under pressure. Some Christian leaders claimed that offences such as idolatry after baptism were unpardonable on earth, but others allowed one such occasion of forgiveness subsequent to baptism. Callistus, bishop of Rome (217-22), was among the more moderate and appealed to Paul’s letters and the parables of the lost sheep and the prodigal son for proof that no sin is unforgivable if the sinner truly turns from their sins. His referral back to Paul reveals the continuing influence of the apostle.

In Paul’s time, and for at least a century afterwards, Christianity was largely an urban movement; Paul tended to preach in big cities, and small Christian groups could more easily spring up in the anonymity of large towns. Deep penetration of the countryside only began in the third century, though the methods used in that ‘outreach’ are unclear. Nearly every known Christian congregation started by meeting in someone’s house; one example of this was Philemon’s house-church, perhaps at Laodicea. The home formed an important starting-point, although by the mid-third century congregations were beginning to have their own special buildings, because they had grown too large to meet even in the courtyard of a large Roman house. Most Christian writers were increasingly rationalistic, and Eusebius mentions only a very few miracles in his history of the church during this period. They also tried to discredit contemporary pagan superstition, focusing on ‘good living’ rather than supernatural ‘signs’. In the late third century came the first deliberate attempts to follow Paul’s earlier example of absorbing features of pagan religions into Christianity. Churches took over from temples, martyrs replaced the old gods in popular devotion, and the festivals of the Christian year took the place of the high-days and holy days of paganism.

When Irenaeus, a third-generation Christian leader who became bishop of Lyons, wrote about the church in Rome, he described it as the very great, very ancient and universally known church, founded and organised at Rome by the two most glorious apostles, Peter and Paul. Because Christians from all parts were found there, it was a microcosm of the whole Christian world. His statement hints at some of the reasons why Rome acquired a leading position among the churches. All roads led to Rome, the capital of the Empire, not least the well-engineered roads on which the Christian missionaries travelled. A remarkable number of prominent Christians made their way to the Imperial City: Ignatius, Polycarp, Marcion, Valentinus, Tatian, Justin, Hegesippus, Irenaeus, Tertullian, Praxeas and Origen all followed the path Peter and Paul had taken in the sixties. Rome was the only Western church to receive a letter from an apostle, and Luke’s long account of Paul’s miraculous journey to the city reflects the importance attached to his reaching the capital. Nothing boosted the prestige of Christian Rome so much as the fact that the two chief apostles were martyred there under Nero. By the mid-second century, memorial shrines to Paul and Peter had been erected in Rome, on the Appian Way and the Vatican Hill respectively. Remains of the latter have been uncovered in modern excavations.

The Fall of Jerusalem in AD 70 enhanced the standing of the Roman church in the long-term since it became almost impossible to evangelise the Jewish settlements on the eastern shores of the Mediterranean. Christianity’s centre of gravity shifted west, where Rome was well-placed to play a central role. However, the letter to the Corinthian church known as I Clement did not imply any claim to superiority by the church of Rome. Second-century Christianity there appears to have been very varied. It included independent schools like Justin’s and immigrant groups such as the Asians who followed their traditional observance of the Pascha (Passover). Not until the last decade of the century did a strong bishop emerge – Victor, an African and the first Latin speaker. Meanwhile, the shrines of Peter and Paul bolstered a growing self-confidence.

The first bishop to claim a special authority derived from Peter by appealing to Matthew 16: 18-19 was Stephen, in his dispute with Cyprian. Paul’s position alongside Peter in the earliest church now began to be lost sight of. Cyprian regarded every bishop’s seat as ‘the see of Peter’, although he agreed that the Roman church had special importance because it had been founded so early. The Roman church already possessed considerable wealth, including the underground burial-chambers (catacombs) outside the city and several large houses whose upper floors were adapted for use as churches (tituli).

Centuries later, the Roman church criticised the British church for its great lack of martyrs compared with Rome’s own record. The British leaders replied that they lived to preach and teach the Gospel, not to die for it unnecessarily. As noted already, there were many in the Roman church who viewed martyrdom as a noble, worthwhile gesture, to such an extent that some became fanatics, seeking martyrdom before they had achieved anything else worthwhile. The most popular claimant to the honour of being the first Christian martyr in Britain, identified with the church of St. Alban’s, was the Christianised Roman soldier named Alban. During the Diocletian persecution in Britain, he aided a hunted British priest to escape by wearing his robe, drawing the pursuit to himself. When Alban was recognised, the Roman officer ordered a soldier standing nearby to execute the culprit. The soldier refused, admitting that he too was a Christian, with the result that both soldiers were immediately beheaded. Tradition claims they were buried together on the spot where they were killed, and a church erected on the site was named St. Alban’s. However, the early British historian, Bishop Alford, wrote of an earlier martyr, Aristobulus, who was apparently known to Peter, Barnabas and Paul, and who was away in Britain before Paul arrived in Rome. In the Martyrologies of the Greek church, we read:

Aristobulus was one of the seventy disciples and a follower of St. Paul the Apostle, along with whom he preached the Gospel to the whole world, and ministered to them. He was chosen by St. Paul to be the missionary bishop to the land of Britain. He was there martyred after he had built churches and ordained deacons and priests on the island.

002 (2)

Dorotheus, Bishop of Tyre, recorded in AD 303 that Aristobulus, who is mentioned by the Apostle in his Epistle to the Romans, was made Bishop in Britain. Haleca, Bishop of Augusta, confirms that he was one of many martyrs whose memory was celebrated by the Britons, and the Adonis Martyrologia also contains a record which confirms his mission to Britain, where he founded a church before his martyrdom on 15 March, in circa AD 59 or 60. There is a legend suggesting that Paul himself may have paid a brief visit to Britain during his time in Rome, but though we know that he intended to travel to Spain, there is little evidence to suggest that he did so, or that he went further north. Apparently, in Merton College, Oxford, there is an ancient manuscript known as the 'Paulian MS' which purports to contain a series of letters between Paul and Seneca, making allusions to the former's residence in Siluria. Clement of Rome, who died in about AD 100, wrote of the martyrdoms of both Peter and Paul, whom he probably knew personally. He sums up the magnitude of Paul's achievement in the following terms:

Paul, also, having seven times worn chains, and been hunted and stoned, received the prize of such endurance. For he was the herald of the Gospel in the West as well as in the East, and enjoyed the illustrious reputation of the faith in teaching the whole world to be righteous. And after he had been in the extremity of the West, he suffered martyrdom before the sovereigns of mankind; and thus delivered from this world, he went to his holy place, the most brilliant example of steadfastness that we possess. 

In referring to ‘the extremity of the West’, Clement could be referring to Gaul or Britain, but he is more likely to be referring, in this context, to the western Mediterranean. I Clement is an open letter from one of the early bishops or presbyters of Rome to the church at Corinth, probably written at the very end of the first century, shortly after the persecution of Emperor Domitian. It is probably the earliest surviving Christian writing outside of the New Testament. It was written to counter the disruption and disturbance in the church at Corinth, where some of the older leaders had been deposed by a younger clique. It sheds interesting light on the nature and conduct of church life soon after the age of the apostles. It puts great stress on good order, and on Christian faith being accompanied by good works, claiming that Abraham was saved by faith and hospitality. The book quotes extensively from the Old Testament, Jewish books outside the canon and writings of the apostles. Like Paul’s own letter to the Corinthians, written earlier, Clement exhorts his readers to Christian humility and love, and it was probably read out in Corinth and other churches.

005

In I Corinthians, which gives the earliest description of worship in the Christian church, Paul constantly draws on the Old Testament. This letter, written in about AD 55, pictures the church as the new Israel, living a pattern of the Christian life that is based on the new exodus. Paul uses ideas drawn from the Jewish Passover, which celebrated God’s saving favour and strength in calling Israel to be his people, and rescuing them from tyranny in Egypt. According to Paul, the church succeeded the old Jewish community and combined both Jews and Greeks within God’s one family of converted men and women. This fellowship of believers in Jesus stood at the dawn of a new age of grace and power. All this was possible through the gift of the Holy Spirit, which followed the resurrection and ascension of Jesus. This one fact of experience stamps New Testament worship as unique, however much the church owed to its Jewish inheritance. Paul used the framework of the Passover meal to interpret the Lord’s Supper. But other elements were intertwined, such as the fellowship meal, called the agape or love-feast, which had its counterpart in Jewish table-customs. This had become an occasion for an ‘orgy’ of gluttony and drunkenness in Corinth, and Paul pointed out that this was a breakdown in the fellowship which both the Lord’s Supper and the agape were designed to promote. Paul believed that the Lord’s Supper served both to unite Christians with the Lord in his death and risen life, and to join believers in a bond of union as ‘one body’ in Christ, receiving him by faith and in love.

005 (2)

The setting for worship was ‘the first day of the week’, referring to the day of Christ’s resurrection, as in the Gospels, and is distinct from the Jewish Sabbath. The Christian Sunday was not made a ‘day of rest’ until Constantine decreed it in AD 321. Paul also wrote about baptism, a rite of initiation with its roots in the Jewish washings for ceremonial purposes, and especially in the service of tebilah, the ‘bath’ necessary for all converts to Judaism. The practice of baptism was also being misused at Corinth, and Paul objected to their misunderstanding or abuse. Baptism, he told them, should be in the name of Jesus, not in the name of leaders in the fellowship, as if these were apostolic cult figures. ‘In the name of Jesus’ meant that new converts passed under his authority, and confessed him as Lord. The enthusiasm of the Corinthian Christians also led them to misuse ‘ecstatic tongues’ and other gifts of the Spirit. Paul tried to curb this by insisting that worship must promote the healthy growth of the entire community of Christians. Personal indulgence in the gifts of the Spirit was to be brought firmly under control. Not all the features of early Christian worship at Corinth are clear. It is not known what ‘baptism for the dead’ implied. Paul did not attach great importance to it but used it simply to illustrate another matter. He also mentioned the ‘kiss of peace’ without explanation.

008 (2)

Prayers also played an important part in worship at Corinth. At public prayer, the response of amen (a Hebrew word of confirmation) was the natural way to show agreement. Problems arose over women who attempted to pray with uncovered heads. Paul resisted this practice, though he freely granted the right of women believers to act as prophets and leaders of prayer in the assembled church. Both prophesying and praying were seen as gifts of the Spirit. The freedom that the Corinthians were exercising to the full was to be held in check. Paul crisply summed up: Let all things be done decently and in order. ‘Singing’ with the mind and the Spirit indicates a musical side to the meeting, but references to musical instruments do not make it clear whether they were used in worship. Exactly what these hymns were, and whether snatches of them have survived, is unclear. Passages in Philippians 2: 6-11; Colossians 1: 15-20 and 1 Timothy 3: 16 contain what may be early hymns, offered, as later among Christians in Bithynia about AD 112, to Christ as God. Ephesians 5: 14 is the most likely example of a hymn from the churches instructed by Paul. The setting of that three-line invocation is clearly a service of baptism.

Evidence about Christian worship from writers who lived between the time of Paul and the middle of the second century is scarce and difficult to piece together. In his letters, Pliny gives an outsider’s view of Christian worship from this time:

They were in the habit of meeting on a certain fixed day before it was light, when they sang an anthem to Christ as God, and bound themselves by a solemn oath (‘sacramentum’) not to commit any wicked deed, but to abstain from all fraud, theft and adultery, never to break their word, or deny a trust when called upon to honour it; after which it was their custom to separate, and then meet again to partake of food, but food of the ordinary and innocent kind.

(Pliny, Letters x. 96; AD 112).

003 (2)

Pliny’s correspondence with Emperor Trajan reveals that the early Christians shared ‘holy meals’ and that by this time the agape had been separated from the Lord’s Supper. In fact, continuing abuse of the ‘love-feast’ led to its gradual disappearance in its original form. The solemn meal of ‘holy communion’ was given more and more prominence as a sacrament. Ignatius describes it as a medicine of immortality, the antidote that we should not die, but live forever in Jesus Christ. Worship gradually became more standardised, formal and stereotyped in the period following Paul’s death, with the ‘Lord’s Supper’ becoming the focal point of the liturgy. Bishops and deacons possibly helped in this trend. New converts (catechumens) were given instruction in preparation for baptism. Worship forms connected with this are referred to in the letters of I Peter and I John. Short snatches of an elementary creed are found in such verses as Jesus is Lord (Romans 10: 9), lengthened and developed in I Timothy 3: 16 and I Peter 3: 18-22.

At first, when a person was baptised they affirmed a creed which was concerned mainly with statements about Christ’s person, as in the addition to the text in Acts 8: 37. Examples of more formal creeds, stating the belief in the three persons of the Godhead, the Trinity, occur in descriptions of baptismal services reported by Irenaeus and Hippolytus of Rome. The Apostles’ Creed, shown below, derives from the late second-century baptismal creed used in Rome, which in turn derives from Paul’s theology. Perhaps the most lasting and visible legacy of the self-proclaimed apostle is, therefore, to be found in the liturgy of the sacraments, which is still shared in most Christian churches, more than nineteen hundred and fifty years after his death.

010 (2)

Sources:

Tom Wright (2018), Paul: A Biography. London: SPCK.

Robert C Walton (ed.) (1970), A Source Book of the Bible for Teachers. London: SCM Press.

Tim Dowley (ed.) (1977), The History of Christianity. Berkhamsted: Lion Publishing.

George F Jowett (1961), The Drama of the Lost Disciples. London: Covenant Publishing.

Posted March 18, 2019 by TeamBritanniaHu in Archaeology, Asia Minor, Assimilation, baptism, Bible, Britain, British history, Britons, Celtic, Celts, Christian Faith, Christianity, Church, Civilization, Colonisation, Commemoration, Compromise, Conquest, Crucifixion, Education, eschatology, Ethnicity, Europe, Family, Fertility, Gentiles, Graeco-Roman, History, Imperialism, India, Israel, Jerusalem, Jesus Christ, Jesus of Nazareth, Jews, John's Gospel, Josephus, Literature, Marriage, Mediterranean, Memorial, Messiah, Middle East, Midlands, morality, multiculturalism, Music, Narrative, Nationality, New Testament, Old Testament, Palestine, Paul (Saint), Poverty, Reconciliation, Remembrance, Romans, Sacraments, Simon Peter, Synoptic Gospels, Syria, The Law, theology, tyranny, Women in the Bible


You Only Live Twice – Cool Britannia to Cold Brexit: The United Kingdom, 1999-2019. Part Two: Identity, Immigration & Islam.   Leave a comment

 

002

British Identity at the Beginning of the New Millennium:

As Simon Schama pointed out in 2002, it was a fact that even though only half of the British-Caribbean population and a third of the British-Asian population were born in Britain, they continued to constitute only a small proportion of the total population. It was also true that any honest reckoning of the post-imperial account needed to take account of the appeal of separatist fundamentalism in Muslim communities. At the end of the last century, an opinion poll found that fifty per cent of British-born Caribbean men and twenty per cent of British-born Asian men had, or once had, white partners. In 2000, Yasmin Alibhai-Brown found that, when polled, eighty-eight per cent of white Britons between the ages of eighteen and thirty had no objection to inter-racial marriage; eighty-four per cent of West Indians and East Asians and fifty per cent of those from Indian, Pakistani or Bangladeshi backgrounds felt the same way. Schama commented:

The colouring of Britain exposes the disintegrationalist argument for the pallid, defensive thing that it is. British history has not just been some sort of brutal mistake or conspiracy that has meant the steamrollering of Englishness over subject nations. It has been the shaking loose of peoples from their roots. A Jewish intellectual expressing impatience with the harping on ‘roots’ once told me that “trees have roots; Jews have legs”. The same could be said of Britons who have shared the fate of empire, whether in Bombay or Bolton, who have encountered each other in streets, front rooms, kitchens and bedrooms.

001

Britain, the European Union, NATO & the Commonwealth, 2000

Until the Summer of 2001, this ‘integrationist’ view of British history and contemporary society was the broadly accepted orthodoxy among intellectuals and politicians, if not more popularly. At that point, however, partly as a result of riots in the north of England involving ethnic minorities, including young Muslim men, and partly because of events in New York and Washington, the existence of parallel communities began to be discussed more widely and the concept of ‘multiculturalism’ began to become subject to fundamental criticism on both the right and left of the political spectrum. In the ‘noughties’, the dissenters from the multicultural consensus began to be found everywhere along the continuum. In the eighties and nineties, there were critics who warned that the emphasis on mutual tolerance and equality between cultures ran the risk of encouraging separate development, rather than fostering a deeper sense of mutual understanding through interaction and integration between cultures. The ‘live and let live’ outlook which dominated ‘race relations’ quangos in the 1960s and ’70s had already begun to be replaced by a more active interculturalism, particularly in communities where that outlook had proven to be ineffective in countering the internecine conflicts of the 1980s. Good examples of this development can be found in the ‘Education for Mutual Understanding’ and ‘Inter-Cultural’ Educational projects in Northern Ireland and the North and West Midlands of England in which this author was involved and has written about elsewhere on this site.

Politicians also began to break with the multicultural consensus, and their views began to have an impact because while commentators on the right were expected to have ‘nativist’ if not ‘racist’ tendencies in the ‘Powellite’ tradition, those from the left could generally be seen as having less easily assailable motives.

Trevor Phillips (pictured left), whom I had known as the first black President of the National Union of Students in 1979, and who in 2003 became Chair of the Commission for Racial Equality, opened up territory in discussion and debate that others had not dared to ‘trespass’ into. His realisation that the race-relations ‘industry’ was part of the problem, and that partly as a result of talking up diversity the country was ‘sleepwalking to segregation’, was an insight that others began to share.

Simon Schama also argued that Britain should not have to choose between its own multi-cultural, global identity and its place in Europe. Interestingly, he put the blame for this pressure at least partly on the EU bureaucracy in Brussels, suggesting that…

 … the increasing compulsion to make the choice that General de Gaulle imposed on us between our European and our extra-European identity seems to order an impoverishment of our culture. It is precisely the roving, unstable, complicated, migratory character of our history that ought to be seen as a gift for Europe. It is a past, after all, that uniquely in European history combines a passion for social justice with a tenacious attachment to bloody-minded liberty, a past designed to subvert, not reinforce, the streamlined authority of global bureaucracies and corporations. Our place at the European table ought to make room for that peculiarity or we should not bother showing up for dinner. What, after all, is the alternative? To surrender that ungainly, eccentric thing, British history, with all its warts and disfigurements, to the economic beauty parlour that is Brussels will mean a loss. But properly smartened up, we will of course be fully entitled to the gold-card benefits of the inward-looking club… Nor should Britain rush towards a re-branded future that presupposes the shame-faced repudiation of the past. For our history is not the captivity of our future; it is, in fact, the condition of our maturity.  

Featured Image -- 20189

‘Globalisation’

Fourteen years later, this was exactly the choice facing the British people, though now it was not De Gaulle or even the Brussels ‘Eurocrats’ who were asking the question, but the British Prime Minister, David Cameron, and the ‘Brexiteer’ Conservatives in his cabinet and on the back benches. The people themselves had not asked to be asked, but when they answered at the 2016 Referendum, they decided, by a very narrow majority, that they preferred the vision (some would say ‘unicorn’) of a ‘global’ Britain to the ‘gold-card benefits’ available at the European table it was already sitting at. Their ‘tenacious attachment’ to ‘bloody-minded liberty’ led to them expressing their desire to detach themselves from the European Union, though it is still not clear whether they want to remain semi-detached or move to a detached property at the very end of the street, one which has not yet been planned, let alone built. All we have is a glossy prospectus which may or may not be delivered or even deliverable.

An internet poster from the 2016 Referendum Campaign

009

Looking back to 2002, the same year in which Simon Schama published his BBC series book, The Fate of Empire, the latest census for England and Wales was published. Enumerated and compiled the previous year, it showed the extent to which the countries had changed in the decade since the last census was taken. Douglas Murray, in the first chapter of his recent book, The Strange Death of Europe, first published in 2017, challenges us to imagine ourselves back in 2002 speculating about what England and Wales might look like in the 2011 Census. Imagine, he asks us, that someone in our company had projected:

“White Britons will become a minority in their own capital city by the end of this decade and the Muslim population will double in the next ten years.”

How would we have reacted in 2002? Would we have used words like ‘alarmist’, ‘scaremongering’, ‘racist’, ‘Islamophobic’? In 2002, a Times journalist made far less startling statements about likely future immigration, which were denounced by David Blunkett, then Home Secretary (using parliamentary privilege) as bordering on fascism. Yet, however much abuse they received for saying or writing it, anyone offering this analysis would have been proved absolutely right at the end of 2012, when the 2011 Census was published. It proved that only 44.9 per cent of London residents identified themselves as ‘white British’. It also revealed far more significant changes, showing that the number of people living in England and Wales who had been born ‘overseas’ had risen by nearly three million since 2001. In addition, nearly three million people in England and Wales were living in households where not one adult spoke English or Welsh as their main language.

DSCN0105

These were very major ethnic and linguistic changes, but there were equally striking findings of changing religious beliefs. The Census statistics showed that adherence to every faith except Christianity was on the rise. Since the previous census, the number of people identifying themselves as Christian had declined from seventy-two per cent to fifty-nine. The number of Christians in England and Wales dropped by more than four million, from thirty-seven million to thirty-three. While the Churches witnessed this collapse in their members and attendees, mass migration assisted a near doubling of worshippers of Islam. Between 2001 and 2011 the number of Muslims in England and Wales rose from 1.5 million to 2.7 million. While these were the official figures, it is possible that they are an underestimate, because many newly-arrived immigrants might not have filled in the forms at the beginning of April 2011 when the Census was taken, not yet having a registered permanent residence. The two local authorities whose populations were growing fastest in England, by twenty per cent in the previous ten years, were Tower Hamlets and Newham in London, and these were also among the areas with the largest non-response to the census, with around one in five households failing to return the forms.

002 (2)

Yet the results of the census clearly revealed that mass migration was in the process of altering England completely. In twenty-three of London’s thirty-three boroughs (see map above) ‘white Britons’ were now in a minority. A spokesman for the Office for National Statistics regarded this as demonstrating ‘diversity’, which it certainly did, but by no means all commentators regarded this as something positive or even neutral. When politicians of all the main parties addressed the census results, they greeted them in positive terms. This had been the ‘orthodox’ political view since 2007, when the then Mayor of London, Ken Livingstone, had spoken with pride about the fact that thirty-five per cent of the people working in London had been born in a foreign country. For years a sense of excitement and optimism about these changes in London and the wider country seemed the only appropriate tone to strike. This was bolstered by the sense that what had happened in the first decade of the twenty-first century was simply a continuation of what had worked well for Britain in the previous three decades. This soon turned out to be a politically-correct pretence, though what was new in this decade was not so much growth in immigration from Commonwealth countries and the Middle East, or from wartorn former Yugoslavia, but the impact of white European migrants from the new EU countries, under the terms of the accession treaties and the ‘freedom of movement’ regulations of the single market. As I noted in the previous article, the British government could have delayed the implementation of these provisions but chose not to.

Questions about the Quality & Quantity of Migration:

004

Besides the linguistic and cultural factors already dealt with, there were important economic differences between the earlier and the more recent migrations of Eastern Europeans. After 2004, young, educated Polish, Czech and Hungarian people had moved to Britain to earn money to send home, or to take home with them in order to acquire good homes, marry and have children in their rapidly developing countries. And for Britain, as the host country, the economic growth of the 2000s was fuelled by the influx of energetic and talented people who, in the process, were also denying their own country their skills for a period. But the UK government had seriously underestimated the number of these workers who wanted to come to Britain. Ministers suggested that the number arriving would be around 26,000 over the first two years. This turned out to be wildly wrong, and in 2006 a Home Office minister was forced to admit that since EU expansion in 2004, 427,000 people from Poland and seven other new EU nations had applied to work in Britain. If the self-employed were included, he added, then the number might be as high as 600,000. There were also at least an additional 36,000 spouses and children who had arrived, and 27,000 child benefit applications had been received. These were very large numbers indeed, even if most of these turned out to be temporary migrants.

It has to be remembered, of course, that inward migration was partially offset by the outflow of around sixty thousand British people each year, mainly permanent emigrants to Australia, the United States, France and Spain. By the winter of 2006-07, one policy institute reckoned that there were 5.5 million British people living permanently overseas, nearly ten per cent of Britons, or more than the population of Scotland. In addition, another half a million were living abroad for a significant part of the year. Aside from Europe, the Middle East and Asia were seeing rising ‘colonies’ of expatriate British. A worrying proportion of them were graduates; Britain was believed to be losing one in six of its graduates to emigration. Many others were retired or better-off people looking for a life in the sun, just as many of the newcomers to Britain were young, ambitious and keen to work. Government ministers tended to emphasise these benign effects of immigration, but their critics looked around and asked where all the extra people would go, where they would live, and where their children would go to school, not to mention where the extra hospital beds, road space and local services would come from, and how these would be paid for.

Members of the campaign group Citizens UK hold a ‘refugees welcome’ event outside Lunar House in Croydon. Photograph: John Stillwell/PA

A secondary issue to that of ‘numbers’ was the system for asylum seekers. In 2000, there were thirty thousand failed asylum seekers in the United Kingdom, a third of those who had applied in 1999, when only 7,645 had been removed from the country. It was decided that it was impossible to remove more, and that to try to do so would prove divisive politically and financially costly. Added to this was the extent of illegal immigration, which had caught the ‘eye’ of the British public. There were already criminal gangs of Albanians and Kosovars, operating from outside the EU, who were undermining the legal migration streams from Central-Eastern Europe in the eyes of many. The social service bill for these ‘illegal’ migrants became a serious burden for the Department of Social Security. Towns like Slough protested to the national government about the extra cost in housing, education and other services.

In addition, there was the sheer scale of the migration and the inability of the Home Office’s immigration and nationality department to regulate what was happening, to prevent illegal migrants from entering Britain, to spot those abusing the asylum system in order to settle in Britain and the failure to apprehend and deport people. Large articulated lorries filled with migrants, who had paid over their life savings to be taken to Britain, rumbled through the Channel Tunnel and the ferry ports. A Red Cross camp at Sangatte, near the French entrance to the ‘Chunnel’ (the photo below shows the Folkestone entrance), was blamed by Britain for exacerbating the problem. By the end of 2002, an estimated 67,000 had passed through the camp to Britain. The then Home Secretary, David Blunkett finally agreed on a deal with the French to close the camp down, but by then many African, Asian and Balkan migrants, believing the British immigration and benefits systems to be easier than those of other EU countries, had simply moved across the continent and waited patiently for their chance to board a lorry to Britain.

006 (2)

Successive Home Secretaries from Blunkett to Reid tried to deal with the trade, the latter confessing that his department was “not fit for purpose”. He promised to clear a backlog of 280,000 failed asylum claims, whose seekers were still in the country after five years. The historic Home Office was split up, creating a separate immigration and nationality service. Meanwhile, many illegal immigrants had succeeded in bypassing the asylum system entirely. In July 2005, the Home Office produced its own estimate of what their number had been four years earlier. It reckoned that this was between 310,000 and 570,000, or up to one per cent of the total population. A year later, unofficial estimates pushed this number up to 800,000. The truth was that no-one really knew, but official figures showed that the number applying for asylum was now falling, with the former Yugoslavia returning to relative peace.  Thousands of refugees were also being returned to Iraq, though the signs were already apparent that further wars in the Middle East and the impact of global warming on sub-Saharan Africa would soon send more disparate groups across the continents.

Britain’s Toxic Politics of Immigration:

010

To begin with, the arrival of workers from the ten countries who joined the EU in 2004 was a different issue, though it involved an influx of roughly the same size. By the government’s own figures, annual net inward migration had reached 185,000 and had averaged 166,000 over the previous seven years. This was significantly more than the average net inflow of fifty thousand New Commonwealth immigrants which Enoch Powell (pictured above) had referred to as ‘literally mad’ in his 1968 Rivers of Blood speech, though he had been criticising the immigration of East African Asians, of course. But although Powell’s speech was partly about race, colour and identity, it was also about numbers of immigrants and the practical concerns of his Wolverhampton constituents in finding hospital and school places in an overstretched public sector. It seems not unreasonable, and not at all racist, to suggest that it is a duty of central government to predict and provide for the number of newcomers it permits to settle in the country. In 2006, projections based on many different assumptions suggested that the UK population would grow by more than seven million by 2031. Of that, eighty per cent would be due to immigration. The organisation Migration Watch UK, set up to campaign for tighter immigration controls, said this was equivalent to requiring the building of a new town the size of Cambridge each year, or five new cities the size of Birmingham over the predicted quarter century.

But such characterisations were surely caricatures of the situation, since many of these new Eastern European migrants did not intend to settle permanently in the UK and could be expected to return to their countries of origin in due course. However, the massive underestimations of the scale of the inward migration were, of course, predictable to anybody with any knowledge of the history of post-war migration, replete with vast underestimates of the numbers expected. But it did also demonstrate that immigration control was simply not a priority for New Labour, especially in its early manifestations. It gave the impression that it regarded all immigration control, and even discussion of it, as inherently ‘racist’ (even the restriction of white European migration), which made any internal or external opposition hard to voice. The public response to the massive upsurge in immigration, and to the swift transformation of parts of Britain it had not really reached before, was exceptionally tolerant. There were no significant or sustained outbreaks of racist abuse or violence before 2016, and the only racist political party, the British National Party (BNP), was subsequently destroyed, especially in London.

In April 2006, Margaret Hodge, the Labour MP for Barking since 1996 (pictured right), commented in an interview with The Sunday Telegraph that eight out of ten white working-class voters in her constituency might be tempted to vote for the British National Party (BNP) in the local elections on 4 May 2006 because “no one else is listening to them” about their concerns over unemployment, high house prices and the housing of asylum seekers in the area. She said the Labour Party must promote…

“… very, very strongly the benefits of the new, rich multi-racial society which is part of this part of London for me”.

There was widespread media coverage of her remarks, and Hodge was strongly criticised for giving the BNP publicity. The BNP went on to gain 11 seats in the local election out of a total of 51, making them the second-largest party on the local council. It was reported that Labour activists accused Hodge of generating hundreds of extra votes for the BNP and that local members began privately to discuss the possibility of a move to deselect her. The GMB union wrote to Hodge in May 2006, demanding her resignation. The Mayor of London, Ken Livingstone, later accused Hodge of “magnifying the propaganda of the BNP” after she said that British residents should get priority in council house allocations. In November 2009, the Leader of the BNP, Nick Griffin, announced that he intended to contest Barking at the 2010 general election. In spite of the unions' position, Hodge was returned as Member for Barking in 2010, doubling her majority to over 16,000, whilst Griffin came third behind the Conservatives. The BNP also lost all of its seats on Barking and Dagenham Council at that same general election, which saw New Labour defeated nationally under Gordon Brown's leadership.

Opinion polls and the simple, anecdotal evidence of living in the country showed that most people continued to feel zero personal animosity towards immigrants or people of different ethnic backgrounds. But poll after poll did show that a majority were deeply worried about what ‘all this’ migration meant for the country and its future. But even the mildest attempts to put these issues on the political agenda, such as the concerns raised by Margaret Hodge (and the 2005 Conservative election campaign poster suggesting ‘limits’ on immigration) were often met with condemnation by the ruling political class, with the result that there was still no serious public discussion of them. Perhaps successive governments of all hues had spent decades putting off any real debate on immigration because they suspected that the public disagreed with them and that it was a matter they had lost control over anyway.

Perhaps it was because of this lack of control that the principal reaction to the developing reality began to be to turn on those who expressed any concern about it, even when they reflected the views of the general public. This was done through charges of 'racism' and 'bigotry', as in the accidental 'caught-on-mic' remark made by Gordon Brown while getting into his car during the 2010 election campaign, after being confronted by a lifelong Labour supporter in a northern English town about the sheer numbers of migrants. It is said to have represented a major turning point in the campaign. A series of deflecting tactics became a replacement for action in the wake of the 2011 census, including the demand that the public should 'just get over it', a demand which came back to haunt David Cameron's ministers after the 2016 Referendum. In his Daily Telegraph column of December 2012, titled Let's not dwell on immigration but sow the seeds of integration, Boris Johnson, then Mayor of London, responded to the census results by writing…

We need to stop moaning about the dam-burst. It’s happened. There is nothing we can now do except make the process of absorption as eupeptic as possible … 

The Mayor, who as an MP and member of David Cameron's front-bench team later became a key leader of the 'Leave' campaign and an ardent Brexiteer, may well have been right in making this statement, saying what any practical politician in charge of a multi-cultural metropolis would have to say. But there is something cold about the tone of his remark, not least the absence of any sense that there were other people out there in the capital city not willing simply to 'get over it', who disliked the alteration of their society and had never asked for it. It did not seem to have occurred to Johnson that there were those who might be nursing a sense of righteous indignation about the fact that for years all the main parties had taken decisions so at variance with the opinions of their electors, or that there was something profoundly disenfranchising about such decisions, especially when addressed to a majority of the voting public.

In the same month as Johnson’s admonition, a poll by YouGov found two-thirds of the British public believed that immigration over the previous decade had been ‘a bad thing for Britain’. Only eleven per cent thought it had been ‘a good thing’. This included majorities among voters for every one of the three main parties. Poll after poll conducted over the next five years showed the same result. As well as routinely prioritising immigration as their top concern, a majority of voters in Britain regularly described immigration as having a negative impact on their public services and housing through overcrowding, as well as harming the nation’s identity. By 2012 the leaders of every one of the major parties in Britain had conceded that immigration was too high, but even whilst doing so all had also insisted that the public should ‘get over it’. None had any clear or successful policy on how to change course. Public opinion surveys suggest that a failure to do anything about immigration even while talking about it is one of the key areas of the breakdown in trust between the electorate and their political representatives.

At the same time, the coalition government of 2010-15 was fearful of the attribution of base motives if it got ‘tough on immigrants’. The Conservative leadership was trying to reposition itself as more socially ‘liberal’ under David Cameron. Nevertheless, at the election, they had promised to cut immigration from hundreds of thousands to tens of thousands per year, but they never succeeded in getting near that target. To show that she meant ‘business’, however, in 2013, Theresa May’s Home Office organised a number of vans with advertising hoardings to drive around six London boroughs where many illegal immigrants and asylum seekers lived. The posters on the hoardings read, In the UK illegally? Go home or face arrest, followed by a government helpline number. The posters became politically toxic immediately. The Labour Shadow Home Secretary, Yvette Cooper, described them as “divisive and disgraceful” and the campaign group Liberty branded them “racist and illegal”.

After some months it was revealed that the pilot scheme had successfully persuaded only eleven illegal immigrants to leave the country voluntarily. Theresa May admitted that the scheme had been a mistake and too “blunt”. Indeed, it was a ‘stunt’ designed to reassure the ‘native’ population that their government was getting tough, and it was not repeated, but the overall ‘hostile environment’ policy it was part of continued into the next majority Conservative government, leading to the illegal deportation of hundreds of ‘Windrush generation’ migrants from the Caribbean who had settled in Britain before 1968 and therefore lacked passports and papers identifying them as British subjects. The Tories repeated their promise on immigration more recently, in both David Cameron’s majority government of 2015 and Theresa May’s minority one of 2017, but are still failing to get levels down to tens of thousands. In fact, under Cameron, net immigration reached a record level of 330,000 per year, numbers which would fill a city the size of Coventry.

The movement of people, even before the European migration crisis of 2015, was of an entirely different quantity, quality and consistency from anything that the British Isles had experienced before, even in the postwar period. Yet the ‘nation of immigrants’ myth continued to be used to cover over the vast changes in recent years to pretend that history can be used to provide precedents for what has happened since the turn of the millennium. The 2011 Census could have provided an opportunity to address the recent transformation of British society but like other opportunities in the second half of the twentieth century to discuss immigration, it was missed. If the fact that ‘white Britons’ now comprised a minority of the London population was seen as a demonstration of ‘diversity’ then the census had shown that some London boroughs were already lacking in ‘diversity’, not because there weren’t enough people of immigrant origin but because there weren’t enough ‘white Britons’ still around to make those boroughs diverse.

Brexit – The Death of Diversity:

Since the 2011 Census, net migration into Britain has continued to be far in excess of three hundred thousand per year. The rising population of the United Kingdom is now almost entirely due to inward migration, and to higher birthrates among the predominantly young migrant population. In 2014 women who were born overseas accounted for twenty-seven per cent of all live births in England and Wales, and a third of all newborn babies had at least one overseas-born parent, a figure that had doubled since the 1990s. However, since the 2016 Brexit vote, statistics have shown that many recent migrants to Britain from the EU have been returning to their home countries, so it is difficult to know, as yet, how many of these children will grow up in Britain, or for how long. On the basis of current population trends, and without any further rise in net inward migration, the most modest estimate by the ONS of the future British population is that it will rise from its current level of sixty-five million to seventy million within a decade, to seventy-seven million by 2050 and to more than eighty million by 2060. But if the post-2011 levels were to continue, the UK population would go above eighty million as early as 2040 and reach ninety million by 2060. In this context, Douglas Murray asks the following rhetorical questions of the leaders of the mainstream political parties:

All these years on, despite the name-calling and the insults and the ignoring of their concerns, were your derided average white voters not correct when they said that they were losing their country? Irrespective of whether you think that they should have thought this, let alone whether they should have said this, said it differently or accepted the change more readily, it should at some stage cause people to pause and reflect that the voices almost everybody wanted to demonise and dismiss were in the final analysis the voices whose predictions were nearest to being right.

An Ipsos poll published in July 2016 surveyed public attitudes towards immigration across Europe. It revealed just how few people thought that immigration had had a beneficial impact on their societies. To the question, Would you say that immigration has generally had a positive or negative impact on your country? very low percentages of people in each country answered positively. Britain had a comparatively positive attitude, with thirty-six per cent of people saying that they thought immigration had had a very or fairly positive impact. Meanwhile, only twenty-four per cent of Swedes felt the same way, and just eighteen per cent of Germans. In Italy, France and Belgium, only ten to eleven per cent of the population thought that it had made even a fairly positive impact on their countries. Despite the Referendum result, the British figure may well have been higher because Britain had not experienced the same level of immigration from outside the EU as continental countries had during the inter-continental migration crisis of the previous summer.


Indeed, the issue of immigration as it affected the 2016 Referendum in Britain was largely about the numbers of Eastern European migrants arriving in the country, rather than about illegal immigrants from outside the EU, or asylum seekers. Inevitably, all three issues became confused in the public mind, something that UKIP (the United Kingdom Independence Party) used to good effect in its campaigning posters. The original version of the poster above, featuring UKIP leader Nigel Farage, caused considerable controversy by using pictures from the 2015 crisis in Central-Eastern Europe to suggest that Europe was at 'Breaking Point' and that, once in the EU, refugees and migrants would be able to enter Britain and settle there. This was untrue, as the UK is not in the 'Schengen' area. Campaigners against 'Brexit' pointed out the facts of the situation in the adapted internet poster. In addition, during the campaign, Eastern European leaders, including the Poles and the Hungarians, complained about the misrepresentation of their citizens as 'immigrants' like many of those who had recently crossed the EU's Balkan borders in order to get to Germany or Sweden. As far as they were concerned, their citizens were temporary internal migrants within the EU's arrangements for 'freedom of movement' between member states. Because this was largely a one-way movement in numerical terms, however, the distinction was lost on many voters, and 'immigration' became the dominant factor in their backing of Brexit by a margin of 52% to 48%.

In Britain, the issue of Calais remained the foremost one in discussion in the autumn of 2016. The British government announced that it was going to have to build a further security wall near to the large migrant camp there. The one-kilometre wall was designed to further protect the entry point to Britain, and specifically to prevent migrants from trying to climb onto passing lorries on their way to the UK. Given that there were fewer than 6,500 people in the camp most of the time, a solution to Calais always seemed straightforward. All that was needed, argued activists and politicians, was a one-time generous offer and the camp could be cleared. But the reality was that once the camp was cleared it would simply be filled again. For 6,500 was an average day’s migration to Italy alone.

Blue: Schengen Area; Green: countries with open borders; Ochre: legally obliged to join

In the meantime, while the British and French governments argued over who was responsible for the situation at Calais, both day and night migrants threw missiles at cars, trucks and lorries heading to Britain in the hope that the vehicles would stop and they could climb aboard as stowaways for the journey across the Channel. The migrants who ended up in Calais had already broken all the EU’s rules on asylum in order to get there. They had not applied for asylum in their first country of entry, Greece, nor even in Hungary. Instead, they had pushed on through the national borders of the ‘Schengen’ free passage area (see map above right) until they reached the north of France. If they were cold, poor or just worse off, they were seen as having the right to come into a Europe which could no longer be bothered to turn anyone away.


Migrants/ Asylum Seekers arriving on the shores of the Greek island of Lesbos.

The Disintegration of Multiculturalism, ‘Parallel Development’ & the Populist Reaction in Britain:

After the 9/11 attacks on the USA, the wars in Iraq and Afghanistan and the 7/7 London bombings, there was no bigger cultural challenge to the British sense of proportion and fairness than the threat of ‘militant Islam’. There were plenty of angry young Muslim men prepared to listen to fanatical ‘imams’ and to act on their narrow-minded and bloodthirsty interpretations of ‘Jihad’. Their views, at odds with those of the well-established South Asian Muslim communities referred to above, were those of the ultra-conservative ‘Wahhabi’ Arabs and Iranian mullahs who insisted, for example, on women being fully veiled. But some English politicians, like Norman Tebbit, felt justified in asking whether Muslim communities throughout Britain really wanted to fully integrate. Would they, in Tebbit’s notorious ‘test’, support the English Cricket team when it played against Pakistan?

Britain did not have as high a proportion of Muslims as France, and not many, outside London and parts of the South East, of Arab and North African origin. But the large urban centres of the Home Counties, the English Midlands and the North of England had third generation Muslim communities of hundreds of thousands. They felt like they were being watched in a new way and were perhaps right to feel more than a little uneasy. In the old industrial towns on either side of the Pennines and in areas of West London there were such strong concentrations of Muslims that the word ‘ghetto’ was being used by ministers and civil servants, not just, as in the seventies and eighties, by rightwing organisations and politicians. White working-class people had long been moving, quietly, to more semi-rural commuter towns in the Home Counties and on the South Coast.

But those involved in this ‘white flight’, as it became known, were a minority if polling was an accurate guide. Only a quarter of Britons said that they would prefer to live in white-only areas. Yet even this measure of ‘multiculturalism’, defined as ‘live and let live’, was being questioned. How much should the new Britons ‘integrate’ or ‘assimilate’, and how much was the retention of traditions a matter of their rights to a distinctive cultural identity? After all, Britain had a long heritage of allowing newcomers to integrate on their own terms, retaining and contributing elements of their own culture. Speaking in December 2006, Blair cited forced marriages, the importation of ‘sharia’ law and the ban on women entering certain mosques as being on the wrong side of this line. In the same speech he used new, harder language. He claimed that, after the London bombings, …

“… for the first time in a generation there is an unease, an anxiety, even at points a resentment that our very openness, our willingness to welcome difference, our pride in being home to many cultures, is being used against us … Our tolerance is part of what makes Britain, Britain. So conform to it; or don’t come here. We don’t want the hate-mongers … If you come here lawfully, we welcome you. If you are permitted to stay here permanently, you become an equal member of our community and become one of us.”

His speech was not just about security and the struggle against terrorism. He was defining the duty to integrate. Britain’s strong economic growth over the previous two decades, despite its weaker manufacturing base, was partly the product of its long tradition of hospitality. The question now was whether the country was becoming so overcrowded that this tradition of tolerance was finally eroding. England, in particular, had the highest population density of any major country in the Western world. It would require wisdom and frankness from politicians, together with watchfulness and efficiency from Whitehall, to keep the ship on an even keel. Without these qualities, and trust from the people, how can we hope for meaningful reconciliation between Muslim, Christian, Jew and Humanist; between newcomers, sojourners, old-timers and exiles; between white Europeans, black Africans, South Asians and West Indians?

Map showing the location of Rotherham in South Yorkshire

In January 2011, a gang of nine Muslim men, seven of Pakistani heritage and two from North Africa, were convicted and sentenced at the Old Bailey in London for the sex trafficking of children between the ages of eleven and fifteen. One of the victims sold into a form of modern-day slavery was a girl of eleven who was branded with the initial of her ‘owner’ and abuser: ‘M’ for Mohammed. The court heard that he had branded her to make her his property and to ensure others knew about it. This did not happen in a Saudi or Pakistani backwater, nor even in one of the northern English towns that so much of the country had forgotten about until similar crimes involving men of Pakistani heritage were brought to light. This happened in Oxfordshire between 2004 and 2012. Nobody could argue that gang rape and child abuse are the preserve of immigrants, but these court cases and the official investigations into particular types of child-rape gangs, especially in the case of Rotherham, have identified specific cultural attitudes towards women, especially non-Muslim women, that are similar to those held by men in parts of Pakistan. These have sometimes been extended into intolerant attitudes toward other religions, ethnic groups and sexual minorities. They are cultural attitudes which are anathema to the teachings of the Qur’an and mainstream Imams, but fears of being accused of ‘racism’ for pointing out such factual connections had been at least partly responsible for these cases taking years to come to light.

British Muslims and members of the British-Pakistani community condemned both the abuse and that it had been covered up. Nazir Afzal (pictured right), Chief Crown Prosecutor of the Crown Prosecution Service (CPS) for North West England from 2011–2015, himself a Muslim, made the decision in 2011 to prosecute the Rochdale child sex abuse ring after the CPS had turned the case down. Responding to the Jay report, he argued that the abuse had no basis in Islam:

“Islam says that alcohol, drugs, rape and abuse are all forbidden, yet these men were surrounded by all of these things. … It is not the abusers’ race that defines them. It is their attitude toward women that defines them.” 

Below left: The front page of The Times, 24 September 2012.

Even then, however, in the Oxfordshire case, the gangs were described as ‘Asian’ by the media, rather than as men of Pakistani and Arabic origin. In addition, the fact that their victims were chosen because they were not Muslim was rarely mentioned in court or dwelt upon by the press. But despite sections of the media beginning to focus on Pakistani men preying on young white girls, a 2013 report by the UK Muslim Women’s Network found that British Asian girls were also being abused across the country in situations that mirrored the abuse in Rotherham. The unfunded, small-scale report found 35 cases of young Muslim girls of Pakistani heritage being raped and passed around for sex by multiple men. In the report, one local Pakistani women’s group described how Pakistani-heritage girls were targeted by taxi drivers and, on occasion, by older men lying in wait outside school gates at dinner times and after school. They also cited cases in Rotherham where Pakistani landlords had befriended Pakistani women and girls on their own for purposes of sex, then passed on their names to other men who had then contacted them for sex. The Jay Report, published in 2014, acknowledged that the 2013 report of abuse of Asian girls was ‘virtually identical’ to the abuse that occurred in Rotherham, and also acknowledged that British Asian girls were unlikely to report their abuse due to the repercussions on their families. Asian girls were ‘too afraid to go to the law’ and were being blackmailed into having sex with different men, while others were forced at knife-point to perform sexual acts on men. Support workers described how one teenage girl had been gang-raped at a party:

“When she got there, there was no party, there were no other female members present. What she found was that there were five adults, their ages ranging between their mid-twenties going on to the late-forties and the five men systematically, routinely, raped her. And the young man who was supposed to be her boyfriend stood back and watched”.

Groups would photograph the abuse and threaten to publish it to their fathers, brothers, and in the mosques, if their victims went to the police.

In June 2013, the polling company ComRes carried out a poll for BBC Radio 1 asking a thousand young British people about their attitudes towards the world’s major religions. The results were released three months later and showed that of those polled, twenty-seven per cent said that they did not trust Muslims (compared with 15% saying the same of Jews, 13% of Buddhists, and 12% of Christians). More significantly, perhaps, forty-four per cent said that they thought Muslims did not share the same views or values as the rest of the population. The BBC and other media in Britain then set to work to try to discover how Britain could address the fact that so many young people thought this way. Part of the answer may have had something to do with the timing of the poll, the fieldwork being carried out between 7-17 June. It had only been a few weeks before this that Drummer Lee Rigby, a young soldier on leave from Afghanistan, had been hit by a car in broad daylight outside an army barracks in South London, dragged into the middle of the road and hacked to death with machetes. The two murderers, Michael Adebolajo and Michael Adebowale, were Muslims of African origin who were carrying letters claiming justification for killing “Allah’s enemies”. It’s therefore reasonable to suppose that, rather than making assumptions about a religious minority without any evidence, those who were asked their opinions connected Muslims with a difference in basic values because they had been very recently associated with an act of extreme violence on the streets of London.

Unfortunately, attempts to provide a more balanced view and to separate these acts of terrorism from Islam have been dwarfed by the growing public perception of a problem which will not simply go away through the repetition of ‘mantras’. The internet has provided multiple and diverse sources of information, but the simple passage of the various events related above, and the many other available examples, have meant that the public have been able to make their own judgements about Islam, and these are certainly not as favourable as they were at the start of the current century. By 2015, one poll showed that only thirty per cent of the general public in Britain thought that the values of Islam were ‘compatible’ with the values of British society. The succession of terrorist events on the streets of Europe continued through 2016 and 2017. On 22 March 2017, a 52-year-old British-born convert to Islam, Khalid Masood, ploughed his car across Westminster Bridge, killing two tourists, one American and the other Romanian, and two British nationals. Dozens more were injured as they scattered, some falling into the River Thames below. Crashing into the railings at the side of Parliament, Masood then ran out of the hired vehicle and through the gates of the palace, where he stabbed the duty policeman, PC Keith Palmer, who died a few minutes later. Masood was then shot dead by armed police, his last phone messages revealing that he believed he was “waging jihad.” Two weeks later, at an inter-faith ‘Service of Hope’ at Westminster Abbey, its Dean, the Very Reverend John Hall, spoke for a nation he described as ‘bewildered’:

What could possibly motivate a man to hire a car and take it from Birmingham to Brighton to London, and then drive it fast at people he had never met, couldn’t possibly know, against whom he had no personal grudge, no reason to hate them and then run at the gates of the Palace of Westminster to cause another death? It seems that we shall never know.

Then on 22 May thousands of young women and girls were leaving a concert by the US pop singer Ariana Grande at Manchester Arena. Waiting for them as they streamed out was Salman Abedi, a twenty-two-year-old British-born man, whose Libyan parents had arrived in the UK in the early nineties after fleeing from the Gadaffi régime. In the underground foyer, Abedi detonated a bomb he was carrying which was packed with nuts, bolts and other shrapnel. Twenty-two people, children and parents who had arrived to pick them up, were killed instantly. Hundreds more were injured, many of them suffering life-changing wounds. Then, in what began to seem like a remorseless series of events, on 3 June three men drove a van into pedestrians crossing London Bridge. They leapt out of it and began slashing at the throats of pedestrians, appearing to be targeting women in particular. They then ran through Borough Market area shouting “this is for Allah”. Eight people were murdered and many more seriously injured before armed police shot the three men dead. Two of the three, all of whom were aged twenty to thirty, were born in Morocco. The oldest of them, Rachid Redouane, had entered Britain using a false name, claiming to be a Libyan and was actually five years older than he had pretended. He had been refused asylum and absconded. Khurram Butt had been born in Pakistan and had arrived in the UK as a ‘child refugee’ in 1998, his family having moved to the UK to claim asylum from ‘political oppression’, although Pakistan was not on the UNHCR list.

On the evening of 19 June, at the end of the Muslim sabbath, in what appeared to be a ‘reprisal’, a forty-seven-year-old father of four from Cardiff drove a van into crowds of worshippers outside Finsbury Park mosque who were crossing the road to go to the nearby Muslim Welfare House. One man, who had collapsed on the road and was being given emergency aid, was run over and died at the scene. Almost a dozen more were injured. Up to this point, all the Islamist terror attacks, from 7/7/2005 onwards, had been planned and carried out by ‘home-grown’ terrorists. Even the asylum seekers involved in the June attack in London had been in the country since well before the 2015 migration crisis. But in mid-September, an eighteen-year-old Iraqi who had arrived in the UK illegally in 2015, and had been living with British foster parents ever since, left a crudely-manufactured bomb on a London Underground District line train during the rush hour, when the carriages were also crowded with schoolchildren. The detonator exploded but failed to ignite the home-made device itself, causing flash burns to the dozens of people in the carriage. A more serious blast would have led to those dozens being taken away in body bags, and many more would have been injured in the stampede which would have followed at the station exit with its steep steps. As it was, the passengers remained calm during their evacuation, and the subsequent emphasis fell on the ubiquitous Blitz slogan: ‘Keep Calm and Carry On!’

Conclusion: Brexit at its ‘Best’.


Of course, it would have been difficult to predict and prevent these attacks, either by erecting physical barriers or by identifying individuals who might be at risk of ‘radicalisation’, much of which takes place online. Most of the attackers had been born and radicalised in the UK, so no reinforcements at the borders, whether in Calais or Kent, would have kept them from enacting their atrocities. But the need for secure borders is not simply a symbolic or psychological reinforcement for the British people if it is combined with a workable and efficient asylum policy. We are repeatedly told that one of the two main reasons for the 2016 referendum decision for Britain to leave the EU was to take back control of its borders and immigration policy, though it was never demonstrated how exactly it had lost control of these, or at least how its EU membership had made it lose control over them.


There are already signs that, as much due to the fall in the value of the pound since Brexit as to Brexit itself, many Eastern European migrants are returning to their home countries, but the vast majority of them had already declared that they did not intend to settle permanently in the UK. The fact that so many came from 2004 onwards was entirely down to the decision of the British government not to delay or derogate the operation of the accession treaties. But the reality remains that, even if they were to be replaced by other European ‘immigrants’ in future, the UK would still need to control, as ever, the immigration of people from outside the EU, including asylum seekers, and that returning failed or bogus applicants would become more difficult. So, too, would the sharing of intelligence information about the potential threats of terrorists attempting to enter Britain as bogus refugees. Other than these considerations, the home-grown threat from Islamist terrorists is likely to be unaffected by Brexit one way or another, and can only be dealt with by anti-radicalisation strategies, especially through education and more active inter-cultural community relations aimed at full integration, not ‘parallel’ development.

‘Populism’

Since the Brexit referendum in 2016 and the election of Donald Trump, it seems that journalists just cannot get enough of Populism. In 1998, the Guardian published about three hundred articles that contained the term. In 2015, it was used in about a thousand articles, and one year later this number had doubled to almost two thousand. Populist parties have tripled their vote across Europe over the past twenty years, and more than a quarter of Europeans voted populist in their last elections. So, in deciding to leave the EU, the British are, ironically, becoming more like their continental cousins in supporting populist causes and parties. In a recent article in The Guardian Weekly (30 November 2018), Fintan O’Toole, a columnist for The Irish Times, points out that for many pro-Brexit journalists and politicians Brexit takes the form of a populist ‘Britain alone’ crusade (see the picture and text below) which has been endemic in Britain’s political discourse about Europe since it joined ‘the common market’ in 1973:

Europe’s role in this weird psychodrama is entirely pre-scripted. It doesn’t greatly matter what the European Union is or what it is doing – its function in the plot is to be a more insidious form of Nazism. This is important to grasp, because one of the key arguments in mainstream pro-Brexit political and journalistic discourse would be that Britain had to leave because the Europe it had joined was not the Europe it found itself part of in 2016…

… The idea of Europe as a soft-Nazi superstate was vividly present in 1975, even when the still-emerging EU had a much weaker, less evolved and less intrusive form…

Yet what brings these disparate modes together is the lure of self-pity, the weird need to dream England into a state of awful oppression… Hostility to the EU thus opens the way to a bizarre logic in which a Nazi invasion would have been, relatively speaking, welcome…

It was a masochistic rhetoric that would return in full force as the Brexit negotiations failed to produce the promised miracles.

002

Certainly, the rejection of Mrs May’s deal in the House of Commons by large numbers of ‘Brexiteer’ MPs from her own Conservative Party was largely, by their own admission, because they felt they could not trust the assurances given by the Presidents of the Council and Commission of the European Union, who were, some MPs stated, trying to trick them into accepting provisions which would tie the UK indefinitely to EU regulations. It is undoubtedly true that the British people mostly don’t want to spend any more time arguing about Brexit. But when ‘leavers’ and ‘remainers’ are united only in disliking Mrs May’s solution, that offers no way forward. The Brexiteers can only offer a “managed no deal” as an alternative, which means just strapping on seat belts as your car heads for the cliff edge. Brexit has turned out to be an economic and political disaster already, fuelling, not healing, the divisions in British society which have opened up over the last twenty years and have widened into a chasm in the six years since the triumph of the London Olympics and the Diamond Jubilee celebrations. The extent of this folly has grown clearer with each turn of the page. But the ending is not fully written.

Sources (for both parts):

The Guardian Weekly,  30 November 2018. London.

Douglas Murray (2018), The Strange Death of Europe: Immigration, Identity, Islam. London: Bloomsbury.

Simon Schama (2002), A History of Britain III: 1776-2000, The Fate of Empire. London: BBC Worldwide.

Andrew Marr (2009), A History of Modern Britain. London: Pan Macmillan.

John Morrill (ed.), (2001), The Penguin Atlas of British and Irish History. Harmondsworth: Penguin Books.

 

Posted January 16, 2019 by TeamBritanniaHu in Affluence, Africa, Arabs, Assimilation, asylum seekers, Australia, Balkan Crises, BBC, Brexit, Britain, British history, Britons, Brussels, Caribbean, Cartoons, Christian Faith, Christianity, Church, Colonisation, Commonwealth, Compromise, decolonisation, democracy, Demography, devolution, Discourse Analysis, Education, Empire, English Language, Europe, European Economic Community, European Union, Factories, Germany, History, Home Counties, Humanitarianism, Hungary, Immigration, India, Integration, Iraq, Ireland, Jews, Journalism, Labour Party, liberalism, Midlands, Migration, multiculturalism, multilingualism, Mythology, New Labour, Population, populism, Reconciliation, Refugees, Respectability, Satire, Second World War, terror, terrorism, United Kingdom, United Nations, West Midlands, World War Two, xenophobia


You Only Live Twice – Cool Britannia to Cold Brexit: The United Kingdom, 1999-2019. Part One: Economics, Culture & Society.   Leave a comment

Europe-map-without-UK-012

Cold Shoulder or Warm Handshake?

On 29 March 2019, the United Kingdom of Great Britain and Northern Ireland will leave the European Union after forty-six years of membership, having joined the European Economic Community on 1 January 1973 on the same day and hour as the Republic of Ireland. Yet in 1999, it looked as if the long-standing debate over Britain’s membership had been resolved. The Maastricht Treaty establishing the European Union had been signed by all the member states of the preceding European Community in February 1992 and was succeeded by a further treaty, signed in Amsterdam in 1997 and in force from 1999. What, then, has happened in the space of twenty years to so fundamentally change the ‘settled’ view of the British Parliament and people, bearing in mind that both Scotland and Northern Ireland voted to remain in the EU, while England and Wales both voted to leave? At the time of writing, the manner of our going has not yet been determined, but the invocation of ‘article fifty’ by the Westminster Parliament and the UK government means that the date has been set. So either we will have to leave without a deal, turning a cold shoulder to our erstwhile friends and allies on the continent, or we will finally ratify the deal agreed between the UK government and the EU Commission, negotiating on behalf of the twenty-seven remaining member states, and leave with a warm handshake and most of our trading and cultural relations intact.

As yet, the possibility of a second referendum – or third, if we take into account the 1975 referendum, called by Harold Wilson (above) which was also a binary leave/ remain decision – seems remote. In any event, it is quite likely that the result would be the same and would kill off any opportunity of the UK returning to EU membership for at least another generation. As Ian Fleming’s James Bond tells us, ‘you only live twice’. That certainly seems to be the mood in Brussels too. I was too young to vote in 1975 by just five days, and another membership referendum would be unlikely to occur in my lifetime. So much has been said about following ‘the will of the people’, or at least 52% of them, that it would be a foolish government, in an age of rampant populism, that chose to revoke article fifty, even if Westminster voted for this. At the same time, and in that same populist age, we know from recent experience that in politics and international relations, nothing is inevitable…

referendum-ballot-box[1]

One of the major factors in the 2016 Referendum Campaign was the country’s public spending priorities, compared with those of the European Union. The ‘Leave’ campaign sent a double-decker bus around England claiming that, by ending the UK’s payments into the EU, some 350 million pounds per week could be redirected to the National Health Service (NHS).

A British Icon Revived – The NHS under New Labour:

To understand the power of this statement, it is important to recognise that the NHS is unique in Europe in that it is wholly funded from direct taxation, and not via National Insurance, as in many other European countries. As a service created in 1948 to be ‘free at the point of delivery’, it is seen as a ‘British icon’, and funding has been a central issue in national election campaigns since 2001, when Tony Blair was confronted by an irate voter, Sharon Storer, outside a hospital. In its first election manifesto of 1997, ‘New Labour’ promised to ‘safeguard the basic principles of the NHS, which we founded’. The ‘we’ here was the post-war Labour government, whose socialist Health Minister, Aneurin Bevan, had established the service in the teeth of considerable opposition from within both parliament and the medical profession. ‘New Labour’ protested that under the Tories there had been fifty thousand fewer nurses but a rise of no fewer than twenty thousand managers – red tape which Labour would pull away and burn. Though critical of the internal markets the Tories had introduced, Blair promised to keep a split between those who commissioned health services and those who provided them.

001

Under Frank Dobson, Labour’s new Health Secretary, there was little reform of the NHS, but there was, year by year, just enough extra money to stave off the winter crises. But then a series of tragic individual cases hit the headlines, and one of them came from a Labour peer and well-known medical scientist and fertility expert, Professor Robert Winston, who was greatly admired by Tony Blair. He launched a furious denunciation of the government over the treatment of his elderly mother. Far from upholding the NHS’s iconic status, Winston said that Britain’s health service was the worst in Europe and was getting worse under the New Labour government, which was being deceitful about the true picture. Labour’s own polling on the issue showed that the country as a whole broadly shared Winston’s assessment. In January 2000, therefore, Blair announced directly to the country that he would bring Britain’s health spending up to the European average within five years. That was a huge promise, because it meant spending a third as much again in real terms, and his ‘prudent’ Chancellor of the Exchequer, Gordon Brown, was unhappy that Blair had not spoken enough on television about the need for health service reform to accompany the money, and had also ‘stolen’ his budget announcements. On Budget day itself, Brown announced that until 2004 health spending would rise at above six per cent beyond inflation every year, …

… by far the largest sustained increase in NHS funding in any period in its fifty-year history … half as much again for health care for every family in this country.       

The tilt away from Brown’s sharp spending controls during the first three years of the New Labour government had begun by the first spring of the new millennium, and there was more to come. With a general election looming in 2001, Brown also announced a review of the NHS and its future by a former banker. As soon as the election was over, broad hints about necessary tax rises were dropped. When the Wanless Report was finally published, it confirmed much that the winter crisis of 1999-2000 had exposed. The NHS was not, whatever Britons fondly believed, better than health systems in other developed countries, and it needed a lot more money. ‘Wanless’ also rejected a radical change in funding, such as a switch to insurance-based or semi-private health care. Brown immediately used this as objective proof that taxes had to rise in order to save the NHS. In his next budget, of 2002, Brown broke with a political convention which had reigned since the mid-eighties, that direct taxes would not be raised again. He raised a special one per cent national insurance levy, equivalent to a penny on income tax, to fund the huge reinvestment in Britain’s health.

Public spending shot up with this commitment and, in some ways, it paid off, since by 2006 there were around 300,000 extra NHS staff compared to 1997. That included more than ten thousand extra senior hospital doctors (about a quarter more) and 85,000 more nurses. But there were also nearly forty thousand managers, twice as many as Blair and Brown had ridiculed the Tory government for hiring. An ambitious computer project for the whole NHS became an expensive catastrophe. Meanwhile, the health service budget rose from thirty-seven billion to more than ninety-two billion a year. But the investment produced results, with waiting lists, a source of great public anger from the mid-nineties, falling by 200,000. By 2005, Blair was able to talk of the best waiting list figures since 1988. Hardly anyone was left waiting for an inpatient appointment for more than six months. Death rates from cancer for people under the age of seventy-five fell by 15.7 per cent between 1996 and 2006, and death rates from heart disease fell by just under thirty-six per cent. Meanwhile, the private finance initiative meant that new hospitals were being built around the country. But, unfortunately for New Labour, that was not the whole story of the Health Service under their stewardship. As Andrew Marr has attested,

…’Czars’, quangos, agencies, commissions, access teams and planners hunched over the NHS as Whitehall, having promised to devolve power, now imposed a new round of mind-dazing control.

By the autumn of 2004 hospitals were subject to more than a hundred inspections. War broke out between Brown’s Treasury and the ‘Blairite’ Health Secretary, Alan Milburn, over the basic principles of running the hospitals. Milburn wanted more competition between them, but Brown didn’t see how this was possible when most people had only one major local hospital. Polling suggested that he was making a popular point. Most people simply wanted better hospitals, not more choice. A truce was eventually declared with the establishment of a small number of independent, ‘foundation’ hospitals. By the 2005 general election, Michael Howard’s Conservatives were attacking Labour for wasting money and allowing people’s lives to be put at risk in dirty, badly run hospitals. Just as Labour once had, they were promising to cut bureaucracy and the number of organisations within the NHS. By the summer of 2006, despite the huge injection of funds, the Service was facing a cash crisis. Although the shortfall was not huge as a percentage of the total budget, trusts in some of the most vulnerable parts of the country were on the edge of bankruptcy, from Hartlepool to Cornwall and across to London. Throughout Britain, seven thousand jobs had gone, and the Royal College of Nursing, the professional association to which most nurses belonged, was predicting thirteen thousand more would go soon. Many newly and expensively qualified doctors and even specialist consultants could not find work. It seemed that wage costs, expensive new drugs, poor management and the money poured into endless bureaucratic reforms had resulted in a still inadequate service. Bupa, the leading private operator, had been covering some 2.3 million people in 1999. Six years later, the figure was more than eight million. This partly reflected greater affluence, but it was also hardly a resounding vote of confidence in Labour’s management of the NHS.

Public Spending, Declining Regions & Economic Development:

As public spending had begun to flow during the second Blair administration, vast amounts of money had gone in pay rises, new bureaucracies and on bills for outside consultants. Ministries had been unused to spending again, after the initial period of ‘prudence’, and did not always do it well. Brown and his Treasury team resorted to double and triple counting of early spending increases in order to give the impression they were doing more for hospitals, schools and transport than they actually could. As Marr has pointed out, …

… In trying to achieve better policing, more effective planning, healthier school food, prettier town centres and a hundred other hopes, the centre of government ordered and cajoled, hassled and harangued, always high-minded, always speaking for ‘the people’.  

The railways, after yet another disaster, were shaken up again. In very controversial circumstances Railtrack, the once-profitable monopoly company operating the lines, was driven to bankruptcy and a new system of Whitehall control was imposed. At one point, Tony Blair boasted of having five hundred targets for the public sector. Parish councils, small businesses and charities found that they were loaded with directives. Schools and hospitals had many more. Marr has commented, …

The interference was always well-meant but it clogged up the arteries of free decision-taking and frustrated responsible public life. 

002

Throughout the New Labour years, with steady growth and low inflation, most of the country grew richer. Growth since 1997, at 2.8 per cent per year, was above the post-war average, GDP per head was above that of France and Germany, and the country had the second lowest jobless figures in the EU. The number of people in work increased by 2.4 million. Incomes grew, in real terms, by about a fifth. Pensions were in trouble, but house price inflation soared, so that owners found their properties more than doubling in value and came to think of themselves as prosperous. By 2006 analysts were assessing the disposable wealth of the British at forty thousand pounds per household. However, the wealth was not spread geographically, averaging sixty-eight thousand in the south-east of England but a little over thirty thousand in Wales and north-east England (see map above). But even in the historically poorer parts of the UK house prices had risen fast, so much so that government plans to bulldoze worthless northern terraces had to be abandoned when they started to regain value. Cheap mortgages, easy borrowing and high property prices meant that millions of people felt far better off, despite the overall rise in the tax burden. Cheap air travel gave the British opportunities for easy travel both to traditional resorts and to every part of the European continent. British expatriates were able to buy properties across the French countryside and in southern Spain. Some even began to commute weekly to jobs in London or Manchester from Mediterranean villas, and regional airports boomed as a result.

The ‘World-Wide Web’, invented by the British computer scientist Tim Berners-Lee in 1989 (pictured right in 2014, arriving at the Guildhall to receive the Honorary Freedom of the City of London), was advancing from the colleges and institutions into everyday life by the mid-‘noughties’. It first began to attract popular interest in the mid-nineties: Britain’s first internet café and magazine, reviewing a few hundred early websites, were both launched in 1994. The following year saw the beginning of internet shopping as a major pastime, with both ‘eBay’ and ‘Amazon’ arriving, though to begin with they only attracted tiny numbers of people.

But the introduction of new forms of mail-order and ‘click and collect’ shopping quickly attracted significant adherents from different ‘demographics’. The growth of the internet led to a feeling of optimism, despite warnings, taken seriously at the time, that the whole digital world would collapse because computers could not cope with the date rollover to the year 2000. In fact, the ‘dot-com’ bubble was burst by its own excessive expansion, as with any bubble, and following a pause and a lot of ruined dreams, the ‘new economy’ roared on again. By 2000, according to the Office for National Statistics (ONS), around forty per cent of Britons had accessed the internet at some time. Three years later, nearly half of British homes were ‘online’. By 2004, the spread of ‘broadband’ connections had brought a new mass market in ‘downloading’ music and video. By 2006, three-quarters of British children had internet access at home.

001

Simultaneously, the rich of America, Europe and Russia began buying up parts of London, and then other ‘attractive’ parts of the country, including Edinburgh, the Scottish Highlands, Yorkshire and Cornwall. ‘Executive housing’ with pebbled driveways, brick facing and dormer windows, was growing across farmland and by rivers with no thought of flood-plain constraints. Parts of the country far from London, such as the English south-west and Yorkshire, enjoyed a ripple of wealth that pushed their house prices to unheard-of levels. From Leith to Gateshead, Belfast to Cardiff Bay, once-derelict shorefront areas were transformed. The nineteenth-century buildings in the Albert Dock in Liverpool (above) now house a maritime museum, an art gallery, shopping centre and television studio. It has also become a tourist attraction. For all the problems and disappointments, and the longer-term problems with their financing, new schools and public buildings sprang up – new museums, galleries, vast shopping complexes (see below), corporate headquarters in a biomorphic architecture of glass and steel, more imaginative and better-looking than their predecessors from the dreary age of concrete.

002

Supermarket chains exercised huge market power, bringing cheap meat and dairy products within almost everyone’s budget. Factory-made ready-meals were imported via the new global air-freight market and carried by refrigerated lorries moving freely across a Europe shorn of internal barriers. Out-of-season fruit and vegetables, fish from the Pacific, exotic foods of all kinds and freshly cut flowers appeared in superstores everywhere. Hardly anyone was out of reach of a ‘Tesco’, a ‘Morrison’s’, a ‘Sainsbury’s’ or an ‘Asda’. By the mid-noughties, the four supermarket giants owned more than 1,500 superstores throughout the UK. They spread the consumption of goods that in the eighties and nineties had seemed like luxuries. Students had to take out loans in order to go to university, but they were far more likely to do so than previous generations, as well as to travel more widely on a ‘gap’ year, not just to study or work abroad.

Those ‘Left Behind’ – Poverty, Pensions & Public Order:

Materially, for the majority of people, this was, to use Marr’s term, a ‘golden age’, which perhaps helps to explain why real anger about earlier pension decisions and stealth taxes did not translate into anti-Labour voting in successive general elections. The irony is that in pleasing ‘Middle Englanders’, the Blair-Brown government lost contact with traditional Labour voters, especially in the North of Britain, who did not benefit from these ‘golden years’ to the same extent. Gordon Brown, from the first, made much of New Labour’s anti-poverty agenda, and especially child poverty. Since the launch of the Child Poverty Action Group, this latter problem had become particularly emotive. Labour policies took a million children out of relative poverty between 1997 and 2004, though the numbers rose again later. Brown’s emphasis was on the working poor and the virtue of work. So his major innovations were the national minimum wage, the ‘New Deal’ for the young unemployed, and the working families’ tax credit, as well as tax credits aimed at children. There was also a minimum income guarantee and, later, a pension credit for poorer pensioners.

The minimum wage was first set at three pounds sixty an hour, rising year by year; in 2006 it was £5.35 an hour. Because the figures were low, it did not destroy two million jobs, as the Tories had claimed it would. Neither did it produce higher inflation; employment continued to grow while inflation remained low. It even seemed to have cut red tape. By the mid-noughties, the minimum wage covered two million people, the majority of them women. Because it was updated ahead of rises in inflation rates, the wages of the poor also rose faster. It was so successful that even the Tories were forced to embrace it ahead of the 2005 election. The New Deal was funded by a windfall tax on privatised utility companies, and by 2000 Blair said it had helped a quarter of a million young people back into work; it was still being claimed as a major factor in lower rates of unemployment as late as 2005. But the National Audit Office, looking back on its effect in the first parliament, reckoned the number of under twenty-five-year-olds helped into real jobs was as low as 25,000, at a cost per person of eight thousand pounds. A second initiative was targeted at the babies and toddlers of the most deprived families. ‘Sure Start’ was meant to bring mothers together in family centres across Britain – 3,500 were planned for 2010, ten years after the scheme had been launched – and to help them to become more effective parents. However, some of the most deprived families failed to show up. As Andrew Marr wrote, back in 2007:

Poverty is hard to define, easy to smell. In a country like Britain, it is mostly relative. Though there are a few thousand people living rough or who genuinely do not have enough to keep them decently alive, and many more pensioners frightened of how they will pay for heating, the greater number of poor are those left behind the general material improvement in life. This is measured by income compared to the average and by this yardstick in 1997 there were three to four million children living in households of relative poverty, triple the number in 1979. This does not mean they were physically worse off than the children of the late seventies, since the country generally became much richer. But human happiness relates to how we see ourselves relative to those around us, so it was certainly real. 

The Tories, now under new management in the shape of a media-marketing executive and old Etonian, David Cameron, also declared that they believed in this concept of relative poverty. After all, it was on their watch, during the Thatcher and Major governments, that it had tripled, which is why it was only towards the end of the New Labour governments that they could accept the definition of the left-of-centre Guardian columnist, Polly Toynbee. A world of ‘black economy’ work also remained below the minimum wage, in private care homes, where migrant servants were exploited, and in other nooks and crannies. Some 336,000 jobs remained on ‘poverty pay’ rates. Yet ‘redistribution of wealth’, a socialist phrase which had become unfashionable under New Labour lest it should scare away middle Englanders, was stronger in Brown’s Britain than in other major industrialised nations. Despite the growth of the super-rich, many of whom were immigrants anyway, overall equality increased in these years. One factor in this was the return to the means-testing of benefits, particularly for pensioners and through the working families’ tax credit, subsequently divided into a child tax credit and a working tax credit. This was a U-turn by Gordon Brown, who had opposed means-testing when in Opposition. As Chancellor, he concluded that if he was to direct scarce resources at those in real poverty, he had little choice.

Apart from the demoralising effect it had on pensioners, the other drawback to means-testing was that a huge bureaucracy was needed to track people’s earnings and to try to establish exactly what they should be getting in benefits. Billions were overpaid and, as people did better and earned more from more stable employment, they then found themselves facing huge demands to hand back money they had already spent. Thousands of extra civil servants were needed to deal with the subsequent complaints, and the scheme became extremely expensive to administer. There were also controversial drives to oblige more disabled people back into work, and the ‘socially excluded’ were confronted by a range of initiatives designed to make them more middle class. Compared with Mrs Thatcher’s ‘Victorian Values’ and Mr Major’s ‘Back to Basics’ campaigns, Labour was supposed to be non-judgemental about individual behaviour. But a form of moralism did begin to reassert itself. Parenting classes were sometimes mandated through the courts and, for the minority who made life hell for their neighbours on housing estates, Labour introduced the Anti-Social Behaviour Order (‘Asbo’). These were first given out in 1998, granted by magistrates to either the police or the local council. It became a criminal offence to break the curfew or other sanction, which could be highly specific. Asbos could be given out for swearing at others in the street, harassing passers-by, vandalism, making too much noise, graffiti, organising ‘raves’, flyposting, taking drugs, sniffing glue, joyriding, prostitution, hitting people and drinking in public.

001 (2)

Although they served a useful purpose in many cases, there were fears that, for the really rough elements in society and their tough children, they became a badge of honour. Since breaking an Asbo could result in an automatic prison sentence, people were sent to jail for crimes that had not warranted this before. But as they were refined in use and strengthened, they became more effective and routine. By 2007, seven and a half thousand had been given out in England and Wales alone, and Scotland had introduced its own version in 2004. Some civil liberties campaigners saw this development as part of a wider authoritarian and surveillance agenda which also led to the widespread use of CCTV (closed-circuit television) cameras by the police and private security guards, especially in town centres (see above). Also in 2007, it was estimated that the British were being observed and recorded by 4.2 million such cameras. That amounted to one camera for every fourteen people, a higher ratio than for any other country in the world, with the possible exception of China. In addition, the number of mobile phones was already equivalent to the number of people in Britain. With global satellite positioning (GPS) chips, these could show exactly where their users were, and the use of such systems in cars and even out on the moors meant that Britons were losing their age-old prowess for map-reading.

002

003

The ‘Seven Seven’ Bombings – The Home-grown ‘Jihadis’:

Despite these increasing means of mass surveillance, Britain’s cities have remained vulnerable to terrorist attacks, more recently by so-called ‘Islamic terrorists’ rather than by the Provisional IRA, who abandoned their bombing campaign in 1998. On 7 July 2005, at rush-hour, four young Muslim men from West Yorkshire and Buckinghamshire, murdered fifty-two people and injured 770 others by blowing themselves up on London Underground trains and on a London bus. The report into this worst such attack in Britain later concluded that they were not part of an al Qaeda cell, though two of them had visited camps in Pakistan, and that the rucksack bombs had been constructed at the cost of a few hundred pounds. Despite the government’s insistence that the war in Iraq had not made Britain more of a target for terrorism, the Home Office investigation asserted that the four had been motivated, in part at least, by ‘British foreign policy’.

They had picked up the information they needed for the attack from the internet. It was a particularly grotesque attack, because of the terrifying and bloody conditions in the underground tunnels, and it vividly reminded the country that it was as much a target as the United States or Spain. Indeed, the long-standing and intimate relationship between Great Britain and Pakistan, with constant and heavy air traffic between them, provoked fears that the British would prove uniquely vulnerable. Tony Blair heard of the attack at the most poignant time, just following London’s great success in winning the bid to host the 2012 Olympic Games (see above). The ‘Seven Seven’ bombings are unlikely to have been stopped by CCTV surveillance, of which there was plenty at the tube stations, nor by ID cards (which had recently been under discussion), since the killers were British subjects, nor by financial surveillance, since little money was involved and the materials were paid for in cash. Even better intelligence might have helped, but the Security Services, both ‘MI5’ and ‘MI6’ as they are known, were already in receipt of huge increases in their budgets, as they were in the process of tracking down other murderous cells. In August 2006, police arrested suspects in Birmingham, High Wycombe and Walthamstow, in east London, believing there was a plot to blow up as many as ten passenger aircraft over the Atlantic.

After many years of allowing dissident clerics and activists from the Middle East asylum in London, Britain had more than its share of inflammatory and dangerous extremists, who admired al Qaeda and preached violent jihad. Once 11 September 2001 had changed the climate, new laws were introduced to allow the detention without trial of foreigners suspected of being involved in supporting or fomenting terrorism. They could not be deported because human rights legislation forbade sending back anyone to countries where they might face torture. Seventeen were picked up and held at Belmarsh high-security prison. But in December 2004, the House of Lords ruled that these detentions were discriminatory and disproportionate, and therefore illegal. Five weeks later, the Home Secretary Charles Clarke hit back with ‘control orders’ to limit the movement of men he could not prosecute or deport. These orders would also be used against home-grown terror suspects. A month later, in February 2005, sixty Labour MPs rebelled against these powers too, and the government only narrowly survived the vote. In April 2006 a judge ruled that the control orders were an affront to justice because they gave the Home Secretary, a politician, too much power. Two months later, the same judge ruled that curfew orders of eighteen hours per day on six Iraqis were a deprivation of liberty and also illegal. The new Home Secretary, John Reid, lost his appeal and had to loosen the orders.

006

Britain found itself caught between its old laws and liberties and a new, borderless world in which the hallowed principles of 'habeas corpus', free speech, the presumption of innocence, asylum, the right of British subjects to travel freely in their own country without identifying papers, and the sanctity of the homes in which the law-abiding lived were all placed in increasing jeopardy. To government ministers, the new political powers seemed the least they needed to deal with a threat that might last for another thirty years, in order, paradoxically, to secure Britain's liberties for the long term. They were sure that most British people agreed, and that the judiciary, media, civil rights campaigners and elected politicians who protested were an ultra-liberal minority. Tony Blair, John Reid and Jack Straw were emphatic about this, and it was left to liberal Conservatives and the Liberal Democrats to mount the barricades in defence of civil liberties. Andrew Marr conceded at the time that the New Labour ministers were 'probably right'. With the benefit of hindsight, others will probably agree.

As Gordon Brown eyed the premiership, his rhetoric was similarly tough, but as Blair was forced to turn to the 'war on terror' and Iraq, he failed to concentrate enough on domestic policy. By 2005, neither man could be bothered to disguise their mutual enmity, as pictured above. A gap seemed to open up between Blair's enthusiasm for market ideas in the reform of health and schools, and Brown's determination to deliver better lives for the working poor. Brown was also keen on bringing private capital into public services, but there was a difference in emphasis which both men played up. Blair claimed that the New Labour government was at its best 'when we are at our boldest'; Brown retorted that it was best 'when we are Labour'.

002 (2)

Tony Blair’s legacy continued to be paraded on the streets of Britain,

here blaming him and George Bush for the rise of ‘Islamic State’ in Iraq.

Asylum Seekers, EU ‘Guest’ Workers & Immigrants:

One result of the long Iraqi conflict, which President Bush finally declared to be over on 1 May 2003, was the arrival of many Iraqi asylum-seekers in Britain: Kurds as well as Shiites and Sunnis. This attracted little comment at the time, partly because there had been both Iraqi and Iranian refugees in Britain since the 1970s, especially as students, and partly because the fresh influx was only a small part of a much larger migration into the country which changed it fundamentally during the Blair years. This was a multi-lingual migration, including many Poles, some Hungarians and other Eastern Europeans whose countries had joined the EU and its single market in 2004. When the EU expanded, Britain decided that, unlike France or Germany, it would not try to delay opening the country to migrant workers. The accession treaties gave nationals from these countries the right to freedom of movement and settlement, and with average earnings three times higher in the UK, this was a benefit which the Eastern Europeans were keen to take advantage of. Some member states, however, exercised their right to 'derogation' from the treaties, whereby they would only permit migrant workers to be employed if employers were unable to find a local candidate. In terms of European Union legislation, a derogation means that a member state has opted not to enforce a specific provision of a treaty due to internal circumstances (typically a state of emergency), and to delay full implementation of the treaty for five years. The UK decided not to exercise this option.

There were also sizeable inflows of western Europeans, though these were mostly students, who (somewhat controversially) were also counted in the immigration statistics, and young professionals with multi-national companies. At the same time, there was continued immigration from Africa, the Middle East and Afghanistan, as well as from Russia, Australia, South Africa and North America. In 2005, according to the Office for National Statistics, ‘immigrants’ were arriving to live in Britain at the rate of 1,500 a day. Since Tony Blair had been in power, more than 1.3 million had arrived. By the mid-2000s, English was no longer the first language of half the primary school children in London, and the capital had more than 350 different first languages. Five years later, the same could be said of many towns in Kent and other Eastern counties of England.

The poorer of the new migrant groups were almost entirely unrepresented in politics, but radically changed the sights, sounds and scents of urban Britain, and even some of its market towns. The veiled women of the Muslim world or its more traditionalist Arab, Afghan and Pakistani quarters became common sights on the streets, from Kent to Scotland and across to South Wales. Polish tradesmen, fruit-pickers and factory workers were soon followed by shops owned by Poles or stocking Polish and East European delicacies and selling Polish newspapers and magazines. Even road signs appeared in Polish, though in Kent these were mainly put in place along trucking routes used by Polish drivers, where for many years signs had been in French and German, a recognition of the employment changes in the long-distance haulage industry. Even as far north as Cheshire (see below), these were put in place to help monolingual truckers using trunk roads, rather than local Polish residents, most of whom had enough English to understand such signs either upon arrival or shortly afterwards. Although specialist classes in English had to be laid on in schools and community centres, there was little evidence that multi-lingual migrants had any negative long-term impact on local children and wider communities. In fact, schools were soon reporting a positive impact in terms of their attitudes toward learning and in improving general educational standards.

001

Problems were posed, however, by the operations of people-smugglers and criminal gangs. Chinese villagers were involved in a particular tragedy when nineteen of them, cockle-picking in Morecambe Bay, were caught by its notorious tides and drowned. Many more were working for 'gang-masters' as virtual, in some cases actual, 'slaves'. Russian voices became common on the London Underground, and among prostitutes on the streets. The British Isles found themselves to be 'islands in the stream' of international migration, the chosen 'sceptred isle' destinations of millions of newcomers. Unlike Germany, Britain was no longer a dominant manufacturing country but had instead become, by the late twentieth century, a popular place to develop digital and financial products and services. Together with the United States, it had stood against the Soviet Union, determined to preserve a system of representative democracy and the free market. Within the EU, Britain maintained its earlier determination to resist the Franco-German federalist model, with its 'social chapter' involving ever tighter controls over international corporations and ever closer political union. Britain had always gone out into the world. Now, increasingly, the world came to Britain, whether poor immigrants, rich corporations or Chinese manufacturers.

005

Multilingual & Multicultural Britain:

Immigration had always been a constant factor in British life; now it was also a fact of life which Europe and the whole world had to come to terms with. Earlier post-war migrations to Britain had provoked a racialist backlash, riots and the rise of extreme right-wing organisations, followed by a series of new laws aimed at controlling both immigration from the Commonwealth and the backlash to it. The later migrations were controversial in different ways. The 'Windrush' arrivals from the Caribbean and those from the Indian subcontinent were people who looked different but who spoke the same language and in many ways had had a similar education to that of the 'native' British. Many of the later migrants from Eastern Europe looked similar to the white British but shared little by way of a common linguistic and cultural background. However, it's not entirely true to suggest, as Andrew Marr seems to, that they did not have a shared history. Certainly, through no fault of their own, the Eastern Europeans had been cut off from their western counterparts by their absorption into the Soviet Russian Empire after the Second World War, but in the first half of the century Poland had helped the British Empire to subdue its greatest rival, Germany, as had most of the peoples of the former Yugoslavia. Even during the Soviet 'occupation' of these countries, many of their citizens had found refuge in Britain.

Moreover, by the early 1990s, Britain had already become both a multilingual and a multicultural nation. In 1991, Safder Alladina and Viv Edwards published a book for the Longman Linguistics Library which detailed the Hungarian, Lithuanian, Polish, Ukrainian and Yiddish speech communities of previous generations. Growing up in Birmingham, I certainly heard many Polish, Yiddish, Yugoslav and Greek accents among my neighbours and parents of school friends, at least as often as I heard Welsh, Irish, Caribbean, Indian and Pakistani accents. The Longman book begins with a foreword by Debi Prasanna Pattanayak in which she stated that the Language Census of 1987 had shown that there were 172 different languages spoken by children in the schools of the Inner London Education Authority. In an interesting precursor of the controversy to come, she related how the reaction in many quarters was stunned disbelief, and how one British educationalist had told her that England had become a third world country. She commented:

After believing in the supremacy of English as the universal language, it was difficult to acknowledge that the UK was now one of the greatest immigrant nations of the modern world. It was also hard to see that the current plurality is based on a continuity of heritage. … Britain is on the crossroads. It can take an isolationist stance in relation to its internal cultural environment. It can create a resilient society by trusting its citizens to be British not only in political but in cultural terms. The first road will mean severing dialogue with the many heritages which have made the country fertile. The second road would be working together with cultural harmony for the betterment of the country. Sharing and participation would ensure not only political but cultural democracy. The choice is between mediocrity and creativity.

002

Language and dialect in the British Isles, showing the linguistic diversity in many English cities by 1991 as a result of Commonwealth immigration as well as the survival and revival of many of the older Celtic languages and dialects of English.

Such ‘liberal’, ‘multi-cultural’ views may be unfashionable now, more than a quarter of a century later, but it is perhaps worth stopping to look back on that cultural crossroads, and on whether we are now back at that same crossroads, or have arrived at another one. By the 1990s, the multilingual setting in which new Englishes evolved had become far more diverse than it had been in the 1940s, due to immigration from the Indian subcontinent, the Caribbean, the Far East, and West and East Africa. The largest of the ‘community languages’ was Punjabi, with over half a million speakers, but there were also substantial communities of Gujarati speakers (perhaps a third of a million) and a hundred thousand Bengali speakers. In some areas, such as East London, public signs and notices recognise this (see below). Bengali-speaking children formed the most recent and largest linguistic minority within the ILEA and, because the majority of them had been born in Bangladesh, they were inevitably in the greatest need of language support within the schools. Commonwealth immigration had thus introduced a new level of linguistic and cultural diversity.

003

007

Birmingham’s booming postwar economy attracted West Indian settlers from Jamaica, Barbados and St Kitts in the 1950s. By 1971, the South Asian and West Indian populations were equal in size and concentrated in the inner city wards of North and Central Birmingham (see the map above). After the hostility towards New Commonwealth immigrants in some sections of the local White populations in the 1960s and ’70s, they had become more established in cities like Birmingham, where places of worship, ethnic groceries, butchers and, perhaps most significantly, ‘balti’ restaurants began to proliferate in the 1980s and ’90s. The settlers materially changed the cultural and social life of the city, most of the ‘white’ population believing that these changes were for the better. By 1991, Pakistanis had overtaken West Indians and Indians to become the largest single ethnic minority in Birmingham. The concentration of West Indian and South Asian British people in the inner city areas changed little by the end of the century, though there was an evident flight to the suburbs by Indians. As well as being poorly paid, the factory work available to South Asian immigrants, like the man in a Bradford textile factory pictured below, was unskilled. By the early nineties, the decline of the textile industry over the previous two decades had led to high long-term unemployment in the immigrant communities in the Northern towns, leading to serious social problems.

006

Nor is it entirely true to suggest, as referred to above, that Caribbean arrivals in Britain faced few linguistic obstacles in integrating themselves into British life from the late 1940s to the late 1980s. By the end of these forty years, the British West Indian community had developed its own “patois”, which had a special place as a token of identity. One Jamaican schoolgirl living in London in the late eighties explained the social pressures that frowned on Jamaican English in Jamaica, but which made it almost obligatory in London. She had not been allowed to speak Jamaican Creole in front of her parents in Jamaica. When she arrived in Britain and went to school, she naturally tried to fit in by speaking the same patois, but some of her British Caribbean classmates told her that, as a “foreigner”, she should not try to be like them, and should speak only English. She persevered with the patois, however, lost her British accent after a year, and was accepted by her classmates. But for many Caribbean visitors to Britain, the patois of Brixton and Notting Hill was a stylized form that was not truly Jamaican, not least because British West Indians had come from all parts of the Caribbean. When another British West Indian girl, born in Britain, was taken to visit Jamaica, she found herself being teased about her London patois and told to speak English.

003

The predicament that still faced the ‘Black British’ in the late eighties and into the nineties was that, for all the rhetoric, they were still not fully accepted by the established ‘White community’. Racism was still an everyday reality for large numbers of British people. There was plenty of evidence of the ways in which Black people were systematically denied access to employment in all sections of the job market. The fact that a racist calamity like the murder in London of the black teenager Stephen Lawrence could happen in 1993 was testimony to how little had changed since the 1950s in British society’s ability to face up to racism. As a result, the British-Caribbean population could still not feel itself to be fully British. This was the poignant outcome of what the British Black writer Caryl Phillips has called “The Final Passage”, the title of his novel, which is narrated in Standard English with the direct speech of the characters rendered in Creole. Phillips migrated to Britain as a baby with his parents in the 1950s, and sums up his linguistic and cultural experience as follows:

“The paradox of my situation is that where most immigrants have to learn a new language, Caribbean immigrants have to learn a new form of the same language. It induces linguistic schizophrenia – you have an identity that mirrors the larger cultural confusion.”

One of his older characters in The Final Passage characterises “England” as a “college for the West Indian”, and, as Phillips himself put it, that is “symptomatic of the colonial situation; the language is divided as well”. As the “Windrush Scandal”, involving the deportation of British West Indians from the UK, has recently shown, this post-colonial “cultural confusion” still ‘colours’ political and institutional attitudes twenty-five years after the death of Stephen Lawrence, leading to discriminatory judgements by officials. This example shows how difficult it is to arrive at any neat chronological classification of migrations to Britain: the economic migrants of the expansion of the 1950s and 1960s; the asylum-seekers of the 1970s and 1980s; and the arrivals under EU expansion and integration in the 1990s and the first decades of the 2000s. This approach assumed stereotypical patterns of settlement for the different groups, whereas the reality was much more diverse. Most South Asians, for example, arrived in Britain in the post-war period, but they were joining a migration ‘chain’ which had been established at the beginning of the twentieth century. Similarly, most Eastern European migrants arrived in Britain in several quite distinct waves of population movement. This led the authors of the Longman Linguistics book to organise it into geolinguistic areas, as shown in the figure below:

001

The Poles and Ukrainians of the immediate post-war period, the Hungarians in the 1950s, the Vietnamese refugees in the 1970s and the Tamils in the 1980s sought asylum in Britain as refugees. In contrast, settlers from India, Pakistan, Bangladesh and the Caribbean had, in the main, come from areas of high unemployment and/or low wages, for economic reasons. It was not possible, even then, to make a simple split between political and economic migrants since, even within the same group, motivations differed through time. The Eastern Europeans who had arrived in Britain since the Second World War had come for a variety of reasons; in many cases, they were joining earlier settlers, trying either to escape poverty in the home country or to better their lot. A further important factor in the discussion about the various minority communities in Britain was the pattern of settlement. Some groups were concentrated into a relatively small geographical area which made it possible to develop and maintain strong social networks; others were more dispersed and so found it more difficult to maintain a sense of community. Most Spaniards, Turks and Greeks were found in London, whereas Ukrainians and Poles were scattered throughout the country. In the case of the Poles, the communities outside London were sufficiently large to be able to sustain an active community life; in the case of the Ukrainians, however, the small numbers and the dispersed nature of the community made the task of forging a separate linguistic and cultural identity a great deal more difficult.

Groups who had little contact with the home country also faced very real difficulties in retaining their distinct identities. Until 1992, Lithuanians, Latvians, Ukrainians and Estonians were unable to travel freely to their country of origin; neither could they receive visits from family members left behind; until the mid-noughties, there was no possibility of new immigration which would have the effect of revitalizing these communities in Britain. Nonetheless, they showed great resilience in maintaining their ethnic identity, not only through community involvement in the UK but by building links with similar groups in Europe and even in North America. The inevitable consequence of settlement in Britain was a shift from the mother tongue to English. The extent of this shift varied according to individual factors such as the degree of identification with the mother tongue culture; it also depended on group factors such as the size of the community, its degree of self-organisation and the length of time it had been established in Britain. For more recently arrived communities such as the Bangladeshis, the acquisition of English was clearly a more urgent priority than the maintenance of the mother tongue, whereas, for the settled Eastern Europeans, the shift to English was so complete that mother tongue teaching was often a more urgent community priority. There were reports of British-born Ukrainians and Yiddish-speaking Jews, brought up in predominantly English-speaking homes, who were striving to produce an environment in which their children could acquire their ‘heritage’ language.

Blair’s Open Door Policy & EU Freedom of Movement:

During the 1980s and ’90s, under the ‘rubric’ of multiculturalism, a steady stream of immigration into Britain continued, especially from the Indian subcontinent. But an unspoken consensus existed whereby immigration, while always gradually increasing, was controlled. What happened after the Labour Party’s landslide victory in 1997 was a breaking of that consensus, according to Douglas Murray, the author of the recent (2017) book, The Strange Death of Europe. He argues that once in power, Tony Blair’s government oversaw an opening of the borders on a scale unparalleled even in the post-war decades. His government abolished the ‘primary purpose rule’, which had been used to filter out bogus marriage applications. The borders were opened to anyone deemed essential to the British economy, a definition so broad that it included restaurant workers as ‘skilled labourers’. And as well as opening the door to the rest of the world, they opened the door to the new EU member states after 2004. It was the effects of all of this, and more, that created the picture of the country which was eventually revealed in the 2011 Census, published at the end of 2012.

004

The numbers of non-EU nationals moving to settle in Britain were expected to increase only modestly, from 100,000 a year in 1997 to 170,000 in 2004. In fact, the government’s predictions for the number of new arrivals over the five years 1999-2004 were out by almost a million people. It also failed to anticipate that the UK might be an attractive destination for people from countries with significantly lower average income levels or without a minimum wage. For these reasons, the number of Eastern European migrants living in Britain rose from 170,000 in 2004 to 1.24 million in 2013. Whether the surge in migration went unnoticed or was officially approved, successive governments did not attempt to restrict it until after the 2015 election, by which time it was too late.

(to be continued)

Posted January 15, 2019 by TeamBritanniaHu in Affluence, Africa, Arabs, Assimilation, asylum seekers, Belfast, Birmingham, Black Market, Britain, British history, Britons, Bulgaria, Calais, Caribbean, Celtic, Celts, Child Welfare, Cold War, Colonisation, Commonwealth, Communism, Compromise, Conservative Party, decolonisation, democracy, Demography, Discourse Analysis, Domesticity, Economics, Education, Empire, English Language, Europe, European Economic Community, European Union, Factories, History, Home Counties, Humanism, Humanitarianism, Hungary, Immigration, Imperialism, India, Integration, Iraq, Ireland, Journalism, Labour Party, liberal democracy, liberalism, Linguistics, manufacturing, Margaret Thatcher, Midlands, Migration, Militancy, multiculturalism, multilingualism, Music, Mythology, Narrative, National Health Service (NHS), New Labour, Old English, Population, Poverty, privatization, Racism, Refugees, Respectability, Scotland, Socialist, south Wales, terror, terrorism, Thatcherism, Unemployment, United Kingdom, United Nations, Victorian, Wales, Welsh language, xenophobia, Yugoslavia


The ‘Other England’ of the Sixties and Seventies: The Changing Fortunes of East Anglia.   Leave a comment

007

Looking across the River Deben towards Woodbridge from Sutton Hoo.

East of England; the Country from the Stour to the Wash:

001

After the far West of England, East Anglia was one of the most neglected regions of England until the sixties. In the fashionable division of the nation into North and South, it has tended to get lumped in with the South. The South-east Study of 1964 was less vague, however, drawing an arbitrary line from the Wash to the Dorset Coast at Bournemouth and defining the area to the east of this boundary as ‘South-east England’. In the same year, Geoffrey Moorhouse (pictured below), a well-known contemporary Guardian correspondent, wrote that, in time, if policies to encourage a counter-drift of the population from the South were not adopted, the whole of the vast area delineated might well become one in character, in relative wealth and in disfigurement. As far as he was concerned, the ‘carving out’ of this area encroached upon the traditional regions of the West Country, beginning at Alfred’s ancient capital of Winchester in Hampshire, and East Anglia, incorporating Norfolk, Suffolk and Essex, or at least that part of it lying to the north of Colchester. To the south, most of Essex was already part of the ‘Golden Circle’ commuter area for the metropolis, stretching from Shoeburyness at the end of the Thames estuary, around the edge of ‘Greater London’ and up the Hertfordshire border to the north of Harlow. Suffolk and Norfolk, however, still remained well ‘beyond the pale’ between the Stour Valley and the Wash, occupying most of the elliptical ‘knob’ sticking out into the North Sea. It was an ‘East Country’ which still seemed as remote from the metropolitan south-east of England as that other extremity in the far south-west peninsula.

003

In the fifties, as the wartime airfields were abandoned and the Defence Ministry personnel went back to London, East Anglia went back to its old ways of underemployment, rural depopulation, and low land and property values. By the mid-fifties, the people of East Anglia were not yet having it as good as many parts of the Home Counties that Macmillan probably had in mind when he made his famous remark. Urban growth continued, however, into the early sixties. For the most part, development was unimaginative, as council estates were built to replace war-time damage and cater for the growing town populations. Whereas in 1959 the Norfolk County Council was getting four thousand applications a year for planning permission, by 1964 the figure had risen to ten thousand. Issues of planned town growth became urgent. Old properties, particularly thatched cottages and timber-framed farmhouses, were eagerly sought. For all the talk of imminent development, with all the benefits and drawbacks that this implied, East Anglia did not look as if it had changed much by the early sixties. The most noticeable signs of the times were the great number of abandoned railway stations. Railway traffic had declined throughout England as British road transport had eclipsed railways as the dominant carrier of freight. Several branch lines, such as the Long Melford to Bury St Edmunds line and sections of the Waveney Valley, had already closed before the celebrated ‘Beeching Axe’ was wielded in 1963. Neither Suffolk nor Norfolk enjoyed a share in the slow growth of national prosperity in the fifties, but then the boom came suddenly, and Suffolk became the fastest growing county by the end of the following decade. It began in the early sixties when many new industries came to the East Anglian towns and cities.

Photo0300

The abandoned railway station at Needham Market, Suffolk.

The ‘neglected’ Suffolk of the fifties was ready to be rediscovered in the sixties. Companies escaping from the high overheads in London and the Home Counties realised that they could find what they were looking for in Ipswich, Bury, Sudbury and Haverhill. Executives discovered that they could live in an area of great peace and beauty and yet be within commuting distance of their City desks. Moreover, the shift in the balance of international trade focused attention once more on the eastern approaches. When the bulk of Britain’s trade was with the empire and North America, it was logical that London, Southampton and Liverpool should have been the main ports. The railway network had been constructed in the nineteenth century in such a way as to convey manufactured goods to these ports. But the Empire had been all but disbanded and Britain was being drawn, inexorably if sometimes reluctantly, into the European Common Market. More and more industrial traffic took to the road; heavy lorries at first, then containers. Now producers were looking for the shortest routes to the continent, and many of them lay through Suffolk, shown below in Wilson’s 1977 map of the county.

002

One of the benefits of East Anglia’s poor communications was that, at the height of summer, it was the only region south of the Bristol-Wash line which was not crammed with holidaymakers and their traffic. The seaboard caught it a little, as of course did the Norfolk Broads. Norfolk reckoned, for instance, that caravans were worth two million pounds a year to it one way or another and, like Cornwall, saw this as a mixed blessing; as Moorhouse was writing his book (in 1964), the County Council was in the process of spending fifty thousand pounds on buying up caravan sites which had been placed with an eye more to income than to landscape. But inland and away from the waterways, crowds of people and cars were hard to find; out of the holiday season, East Anglia was scarcely visited by any ‘outsiders’ apart from occasional commercial travellers. Local difficulties, small by comparison with those of the North, were lost from sight. As the sixties progressed, more and more British people and continental visitors discovered the attractions the two counties had to offer. As Derek Wilson wrote at the end of the following decade,

They realised that a century or more of economic stagnation had preserved from thoughtless development one of the loveliest corners of England. They came in increasing numbers by their, now ubiquitous, motor-cars to spend quiet family holidays at the coast, to tour the unspoilt villages, to admire the half-timbering, the thatch, the pargetting and the great wool churches. Some decided to stake a claim by buying up old cottages for ‘week-ending’ or retirement.

DSC09565

So great was the demand for even derelict old properties that prices trebled in the period 1969-73. Village communities were no longer so tight-knit, so the arrival of these ‘strangers’ cannot be said to have disrupted a traditional culture. Only in those areas where the newcomers congregated in large numbers, buying up properties at inflated prices which ‘locals’ could no longer afford, was any real and lasting cultural damage inflicted. At first, the seaside towns, having been ignored for so long, found it difficult to come to terms with the expansion in tourism. Even the established Suffolk holiday resorts – Aldeburgh, Southwold, Dunwich, even Felixstowe – were ‘genteel’ places; compared with Clacton on the Essex coast, which was far closer in time and space for day-trippers from London, they did not bristle with amusement arcades, Wimpy bars, holiday camps and the assorted paraphernalia that urban man seems to expect at the seaside. Derek Wilson commented that Suffolk was more like a coy maiden prepared to be discovered than an accomplished seductress thrusting her charms at every passer-by.

dscn08091.jpg

Three centuries of properties in Aldeburgh, Suffolk.

A Metropolitan ‘Refugee’ in Dunwich:

001

Greyfriars, the Simpsons’ coastal ‘pile’ in Dunwich.

One of the earliest of these ‘refugees’ from the metropolis was John Simpson (who was to become the BBC’s World Affairs Editor). When he was fifteen, in 1959, his family moved from Putney to Dunwich. His holidays had already been taken up with following his father’s genealogical enthusiasms, and they went from village church to county archive to cathedral vault searching for records of births, marriages and deaths, and transcribing inscriptions on gravestones. Having discovered the full extent of the Simpsons’ Suffolk roots, Roy Simpson insisted that they should look for a country house there. John recalled,

We spent a wintry week driving from one depressing place to another and talking to lonely farmers’ wives whose ideal in life was to leave their fourteenth-century thatched manor-houses and move to a semi near the shops. We had almost given up one evening and were setting out on the road to London when I spotted a brief mention at the end of an estate agent’s list of a rambling place on a clifftop overlooking the sea at Dunwich. …

From the moment I saw it I knew I would never be happy until I lived there. No one could call ‘Greyfriars’ handsome. It was the left hand end of an enormous 1884 mock-Elizabethan pile which had been split up into three separate sections at the end of the war. Our part had around eight bedrooms and five bathrooms. … It was always absurdly unsuitable … four hours’ drive from London, and nowhere near the shops or anything else. Its eleven acres of land were slowly being swallowed up by the ravenous North Sea, and it cost a small fortune to keep warm and habitable. … 

The village of Dunwich immediately formed another element of that sense of the past, faded glory which had haunted so much of my life. In the early Middle Ages it had been the greatest port in England, sending ships and men and hundreds of barrels of herrings to the Kings of England, and possessing a bishopric and forty churches and monasteries. But it was built on cliffs of sand, and the storms of each winter undermined it and silted up the port. In the twelfth century, and again in the thirteenth, large parts of the town collapsed into the sea. … Our land ran down to the cliff edge, and we watched it shrink as the years went by. 

The stories about hearing bells under the sea were always just fantasy, but Dunwich was certainly a place of ghosts. A headless horseman was said to drive a phantom coach and four along one of the roads nearby. … In the grounds of our house two Bronze Age long-barrows stood among the later trees, and when the moon shone hard and silver down onto the house, and the thin clouds spread across the sky, and a single owl shrieked from the bare branches of the dead holm-oak outside my bedroom window, it was more than I could do to get out of bed and look at them. I would think of those cold bones and the savage gold ornaments around them, and shiver myself to sleep.

The winter of 1962 was the worst since 1947, and that was the worst since the 1660s, people said. The snow fell in early December and dug in like an invading army, its huge drifts slowly turning the colour and general consistency of rusty scrap iron. In our vast, uneconomic house at Dunwich the wind came off the North Sea with the ferocity of a guillotine blade and the exposed pipes duly froze hard. The Aga stood in the corner of the kitchen like an icy coffin. … We wandered round the house in overcoats, with scarves tied round our heads like the old women at Saxmundham market. None of the lavatories worked.

In October 1963, Roy Simpson drove his son ‘up’ to Cambridge from the Suffolk coast in his old Triumph. John Simpson set down his cases, as had many Suffolk boys before him, outside the porter’s lodge in the gateway of Magdalene College. For the next three years, his life revolved around the University city in the Fens until he joined the BBC in 1966.

Coast, Cathedral City & Inland Industrial Development:

2b54e4b900000578-3196566-image-m-9_1439473593698

The curvature of the eastern coastline had been responsible for the lack of metropolitan infiltration hitherto. Norfolk and Suffolk were in a cul-de-sac; even today, apart from the ports of Felixstowe and Harwich, on opposite sides of the mouth of the River Stour, they do not lie on transport routes to anywhere else, and their lines of communication with other parts of the country, except with London, were still poor in the early sixties and remain relatively underdeveloped half a century later, despite the widening of the A12 and the extension of the A14. The disadvantages of remoteness could be severe, but at the same time, this saved the two counties from the exploitation that had occurred in places with comparable potential. Had there been better communications, Norwich might have been as badly ravaged by the Industrial Revolution as Bradford, but the great East Anglian woollen trade and cloth-making industry were drawn to Yorkshire as much by the promise of easier transport as by the establishment of the power-loom on faster-flowing water sources. Instead, Norwich still retained the air of a medieval city in its centre, with its cathedral, its castle, and its drunken-looking lollipop-coloured shops around Elm Hill, Magdalen Street, and St. Benedict’s. Its industries, like the Colman’s mustard factory, were already discreetly tucked away on its flanks, where they did not intrude.

005 (61)

Norwich itself was poised to move forward by the sixties, and though its hopes had received a setback as a result of Britain’s early failures to get into the Common Market, it still saw itself as playing an important part in the development of trade between this country and the Continent. European connections were already strong in East Anglia. From the obvious Dutch gables widespread throughout the region (see the example below from a farmhouse near Woodbridge, Suffolk) and concentrated in places like Kings Lynn, to the names beginning with the prefix ‘Van’ in the telephone directories, Flemish influences could, and still can, be found everywhere. Dutch farmers had been settling in the two counties since the late seventeenth century. There were two Swiss-owned boatyards on the Norfolk Broads and one of Norwich’s biggest manufacturers, Bata Shoes, was Swiss in origin. In the early sixties, two Danish firms had set themselves up near the city.

DSCN0672

For Suffolk, the sixties and seventies saw a most astonishing growth in population, which had been decreasing for over a century. The population of Suffolk showed a comparatively modest but significant growth from 475,000 in 1951 to 560,000 in 1961. Most of this increase was in West Suffolk, where the growth of Haverhill, Bury and Sudbury accounted for most of the extra population. These towns were designated in the mid-fifties as London overspill areas. In Haverhill, the notion of town expansion had been pioneered in 1955; by the time Geoffrey Moorhouse published his survey in 1964, there was already a plan for a further massive transfusion of people to the town from London. Thetford, Bury St Edmunds, and Kings Lynn were to be transformed within the next two decades. Between the censuses of 1961 and 1971, the population of Suffolk jumped by over eighteen per cent (the national average was 5.8 per cent). There were many reasons for this unprecedented growth, which brought Suffolk a prosperity it had not known since the great days of the cloth trade.

Photo0304

A variety of restored properties in Needham Market today.

But the hinterland towns of central East Anglia presented a bigger problem for the local planners and county authorities. They had grown up as market-places for the sale of agricultural produce, like those in other parts of rural England, and by the mid-sixties they had held on to this function much longer than most. But the markets, and particularly the cattle markets, had recently become more and more concentrated in the biggest towns – Norwich, King’s Lynn, Bury and Cambridge – and the justification for places like Stowmarket, Diss, Eye, Downham Market and Needham Market (pictured above), in their traditional form, was rapidly disappearing. Their populations were in need of new industries to take the place of old commerce and, in part, they got them. As early as the sixties, a new town at Diss, on the Norfolk-Suffolk border, was already talked of. Carefully planned industrial and housing estates were built, and a variety of service industries and light engineering concerns moved their machines and desks to spacious premises from whose windows the workers could actually see trees and green fields. Writing in the late seventies, Derek Wilson concluded that, while such examples of economic planning and ‘social engineering’ could only be described as revolutionary, they were still too recent to invite accurate assessment.

DSC09732

Above: The Centre of Ipswich is now undergoing an extensive renovation, including that of its historic Corn Exchange area, complete with a statue to one of its more famous sons, Giles, the Daily Express cartoonist, popular in the sixties and seventies, when rapid development engulfed many earlier buildings in concrete.

Paradoxically, Suffolk’s depressed isolation gave a boost to the new development. Some of Suffolk’s most beautiful countryside was no further from the metropolis than the ‘stockbroker belt’ of Sussex, Hampshire, Wiltshire, Berkshire and Buckinghamshire, yet land and property prices in Suffolk were less than half of what they were in the desirable areas of those counties. Most of the county was within eighty miles of London, served by still reasonable rail connections and improving road connections from the capital. The population was now more mobile, and light industry less tied to traditional centres. But development in the sixties and seventies was not restricted to the eastern side of the two counties. Ipswich, the other town in the two counties which was relatively industrialised, had been, like Norwich, comparatively unscathed by that industrialisation. Its growth occurred largely as a result of migration within Suffolk. Even so, its population increased from a hundred thousand to a hundred and twenty-two thousand between 1961 and 1971. It became the only urban centre in the county to suffer the same fate as many large towns and cities across England in that period – haphazard and largely unplanned development over many years. In the late seventies, farmers could still remember when the county town was just that, a large market town, where they could hail one another across the street. By then, however, dual carriageways and one-way systems had been built in an attempt to relieve its congested centre, while old and new buildings jostled each other in what Derek Wilson called irredeemable incongruity.

East Anglia as Archetypal Agricultural England:

003 (64)

Life on the land had already begun to change more generally in the sixties. East Anglia is an important area to focus on in this respect, because it was, and still is, agricultural England. In the sixties and seventies, agriculture was revitalised: farmers bought new equipment and cultivated their land far more intensively than ever before. Industry here remained tied to the main purpose of life, which was to grow food and raise stock. Many of the industries in the two counties were secondary, and complementary, to this purpose. Of the thirty-nine major industrial firms in East Suffolk, for example, twelve were concerned with food processing, milling, or making fertilisers, and most of the five engineering shops were turning out farm equipment among other things. These industries varied from the firm in Brandon which employed three people to make and export gun-flints to China and Africa, to the extensive Forestry Commission holding at Thetford, where it was calculated that the trees grew at the rate of seventeen tons an hour, or four hundred tons a day. But a quarter of the total workforce in Norfolk and Suffolk was employed in the primary industry of farming; there were more regular farm-workers in Norfolk than in any other English county. The county produced two of the founders of modern British agriculture, Coke of Holkham and Townshend of Raynham, and it had kept its place at the head of the field, quite literally.

DSCN0671

East Anglia was easily the biggest grain-producing region of the country and the biggest producer of sugar-beet. During the First World War, farmers had been encouraged to grow sugar beet in order to reduce the country’s dependence on imported cane sugar. This had been so successful that in 1924 the government offered a subsidy to beet producers. The crop was ideally suited to the heavy soil of central Suffolk and without delay, a number of farmers formed a co-operative and persuaded a Hungarian company to build a sugar factory near Bury St Edmunds. Five thousand acres were planted immediately and the acreage grew steadily over the next half-century. In 1973, the factory was considerably enlarged by the building of two huge new silos, which came to dominate the skyline along the A14 trunk road. The factory became the largest plant of its kind in Europe and by the late seventies was playing an important part in bringing Britain closer to its goal of self-sufficiency in sugar.

50

Local ingenuity and skill had devised and built many agricultural machines during the nineteenth century, like this threshing and grain-crushing machine from the Richard Garrett works at Leiston, which made various farming machines, including tractors.

Of all the English counties, Norfolk had the biggest acreage of vegetables and the heaviest yield per acre of main crop potatoes. It was also the second biggest small fruit producer and the second highest breeder of poultry. Suffolk came close behind Norfolk in barley crops, while it had the biggest acreage of asparagus and more pigs than any other county. The region’s importance to agriculture was symbolised by the Royal Agricultural Society having its headquarters in Norfolk, and the region also played host to the British-Canadian Holstein-Friesian Association, the Poll Friesian Cattle Society, the British Goat Society, and the British Waterfowl Association. No other county had as many farms over three hundred acres as Norfolk, and most of the really enormous farms of a thousand acres or more were to be found in the two easternmost counties. The biggest farm in England, excluding those owned by the Crown, was to be found on the boundary of Bury St Edmunds: the ten-thousand-acre Iveagh estate, covering thirteen farmsteads and including a piggery, three gamekeepers’ lodgings and homes for its cowmen, foresters and its works department foreman.

DSCN0666

The most significant change taking place on the land throughout England was in the size of farms. The big ones were getting bigger and the small ones were slowly dwindling and going out of business. Mechanisation was reducing the number of jobs available to agricultural workers, and from this followed the steady decline of rural communities. By the end of the sixties, however, the employment position in Norfolk was beginning to stabilise, as the old farm hands who had been reared as teams-men and field-workers, and kept on by benevolent employers, retired and were not replaced. Although it employed fewer people than ever before, farming was still Suffolk’s largest single industry in the mid-seventies. After Britain joined the Common Market in 1973, access to European markets led to a certain amount of diversification. There were numerous farmers specialising in poultry, pigs and dairying. Yet persistently high world grain prices led to the intensive production of what the heavy soils of central Suffolk were best suited to – cereal crops. The tendency for large estates to be split up and fields to remain unploughed had been dramatically reversed. The larger the unit, the more productive and efficient the farm, with every producer determined to get the maximum yield from their acres.

71

The field patterns between Leiston and Sizewell (from the model detailed below).

As the big farms grew bigger and farming became more highly mechanised, farmers were tending to re-organise the shapes and sizes of their fields, making them as large as possible so that the tractor and the combine harvester could work them with greater ease and maximum efficiency. They uprooted trees and whole copses, which were awkward to plough and drill around, cut out hedges which for centuries had bounded small parcels of land, and filled in ditches. To the farmer, this meant the promise of greater productivity, but to the ecologist, it meant the balance of nature was being upset in a way that the farmer and the general countryside population, animals as well as people, would have to pay for, later if not sooner. The practical answer to this problem was the increasing use of chemicals to control pests which, as soon became obvious, proved a double-edged sword. In addition, the poorer land was treated with chemical fertilisers. East Anglia provided a classic example of what could happen as a result of the indiscriminate chemical warfare being conducted in the English countryside. As reported in the New Statesman (20 March 1964), …

… a Norfolk fruit-grower was persuaded by a pesticide salesman that the best way of keeping birds off his six acres of blackcurrants was to use an insecticide spray. Two days after he did so the area was littered with the silent corpses of dozens of species of insects, birds and mammals.

This was very far removed, of course, from the idealised conception of the rural life that most people carried around in their imaginations, and perhaps many of us still do today, especially when we look back on childhood visits to the countryside and relatives living in rural villages.  Moorhouse characterised this contrast as follows:

Smocked labourers, creaking hay carts, farmyard smells, and dew-lapped beasts by the duck-pond – these are still much more to the forefront of our consciousness than DDT, aldrin, dieldrin, and fluoroacetemide. In most of us, however completely we may be urbanised, there lurks some little lust for the land and a chance to work it.  

Rustic Life; Yeomen Farmers and Yokels:

Farmers had to become hard-nosed professional businessmen. The profits from their labour had to be extracted while they were there, for it was never certain what might be around the next bend. This emphasis on business sense, both in himself and in others, and his passion for getting the maximum work out of his men and machines, was what made Moorhouse’s Norfolk farmer sound indistinguishable from any high-powered industrialist in the Midlands. In a sense, he was one. He was prepared to try any method which would increase his productivity. In the early sixties, something very odd had been happening in his part of the world. Traditionally, ‘big’ Norfolk farmers like him had tended to be isolated neighbours, seeing each other at the market but otherwise scarcely at all. But he and three other men had taken to sharing their equipment for harvesting quick-freeze peas; this work had to be done particularly fast, on a day appointed by the food factory, and ‘Farmer Giles’ and his neighbours had decided that it could be done most efficiently and cheaply by pooling their men and machines and having this unit move from property to property in the course of one day. In 1964, they also clubbed together to contract a helicopter to spray their crops. He and his friends, being staunch Tories, might not have accepted that they were putting co-operative principles into farming practice, but that was precisely what they were doing, just as the Suffolk sugar-beet growers had done forty years earlier.

For all his business acumen, however, ‘Farmer Giles’ measured up to the popular stereotypical image of a yeoman farmer. He was a warden at his local church, had a couple of horses in his stables and during ‘the season’ he went shooting four days a week. He cared about the appearance of his patch of countryside and spent an impressive amount of time doing up the tied cottages of his men rather than selling them, as some of the men would have liked. This was not simply because, in the long run, it resulted in a contented workforce, but because he could control what the cottages looked like on the outside: as pretty as an antique picture, thatched and whitewashed. Fundamentally, he belonged as completely to the land as he possessed it. Though he no longer had any real need to, he did some manual work himself, as well as prowling around the farm to make sure everything was going to his overall plan. He was organic, like his 1,200 acres, which nonetheless produced a profit of sixteen thousand pounds a year. As he himself commented, overlooking his fields, there is something good about all this! A cynic might have responded by suggesting that any life which could produce such a profit was indeed a good life.

17

Above & Below: Cattle grazing on the Deben meadows near Woodbridge, Suffolk.

But how had the tied agricultural workers, the eternal rustics, fared in this changing pattern of agriculture? The farm labourer interviewed by Moorhouse worked on the Norfolk-Suffolk border. He left school at fourteen, the son of a mid-Norfolk cowman of thirty-five years’ standing. He first worked on a poultry farm for a couple of years, had four years as assistant cowman to his father, five years as a stock feeder, then two years ‘on the land’ working with tractors and horses. He had come to the farm on which Moorhouse found him fifteen years previously, just after getting married, as a relief man. At the age of forty-two, with a teenage daughter, he was head cowman for a ‘gaffer’ with 450 arable acres and a hundred acres of pasture which carried fifty Friesian milking cows, forty-six calves, and a bull. His farmer was nearing seventy and didn’t hold with too many of the new ways. It was only in that year, 1964, that the modern method of milking – straight from the cow through a pipeline to a common container – had been adopted by his gaffer. Farmer Giles had been doing it this way ever since it was proved to be the quickest and easiest way. ‘Hodge’ got up at 5.30 a.m. to milk the cows and feed the calves. After breakfast until mid-day, he was busy about the yards, mixing meal, washing up and sterilizing equipment. From 1.30 p.m. he was out again, feeding the calves and doing various seasonal jobs until milking, which generally finished by 5 o’clock. Very often he went out again before bed-time, to check on the cows and the calves. He worked a six-and-a-half-day week, for which he was paid twenty-two per cent more than the basic farm worker’s wage for a forty-six-hour week.

16

When he first came to the farm, ‘Hodge’ was given, rent-free, a cottage which was in rather worse shape than the shelters which housed the cows in winter. It had one of the tin-can lavatories described below and was lit with paraffin lamps. He had to tramp eighty yards to a well for water. There was one room downstairs plus a tiny kitchen, and two bedrooms, one of which was so small you couldn’t fit a full-size bed in it. After a while, the farmer modernised it at a cost of a thousand pounds, knocking it together with the next-door cottage. The renovated place, though still cramped, had all the basic necessities, and Hodge paid twelve shillings a week for it. He accepted his situation, though the National Union of Agricultural Workers (NUAW) did not: it had been trying for forty years to abolish tied cottages, on principle, because of the threat of eviction they carried. Although a socialist and chairman of his local union branch, Hodge argued that tied cottages were necessary because the farm worker had to be near his job so that, as in his case, he could hop across the road before bedtime to check on the cows. Other changes had taken place in his lifetime on Norfolk land. The drift to the towns had fragmented the old society, and traditions had been quietly petering out. The parish church was generally full for the harvest festival, but otherwise ill-attended; the rector had three parishes to cope with.

Rural Poverty & Village Life:

DSC09763

A former labourer’s cottage in Saxmundham marketplace.

The poverty of the inland, rural villages was the result of far more basic concerns than the pressures on property prices created by newcomers, or the changes in agriculture, which did little to improve the lives of villagers. Their cottages may have looked attractive enough on the outside, but too often they offered their home-grown dwellers little encouragement to remain, and if they got the chance to move out they did, while there was no help at all for those who might be interested in trying their hand at rural life. Moorhouse found one village within ten miles of Ipswich which, apart from its electricity and piped water supplies, had not changed at all since the Middle Ages. Some of its cottages were without drains, and in these the housewife had to put a bucket under the plughole every time she wanted to empty the sink; she then carried it out and emptied it onto the garden. Sewerage was unknown in the community of 586 people, none of whom had a flush toilet. They used tins, lacing them with disinfectant to keep down the smell and risk of infection. In some cases, these were housed in cubicles within the kitchens, from where they had to be carried out, usually full to the brim, through the front door. Every Wednesday night, as darkness fell, the Rural District Council bumble cart, as the villagers called it, arrived in the village street to remove the tins from the doorsteps. Moorhouse commented that this was…

… for nearly six hundred people … a regular feature of life in 1964 and the joke must long since have worn thin. There are villages in the remoter parts of the North-west Highlands of Scotland which are better equipped than this.

001

This was not by any means an isolated example. While in both counties the coverage of electricity and water supplies was almost complete, drainage and sewerage were far from being so. In the Clare rural district of Suffolk, villages were expected to put up with the humiliating visitations of the ‘night cart’ for another five years; in the whole of West Suffolk there were twenty-four villages which could not expect sewerage until sometime between 1968 and 1981, and both county councils accepted that there were some villages which would never get these basic amenities. In East Suffolk, only those places within the narrow commuting belts around the biggest towns could be sure that they would one day soon become fully civilised. In Norfolk, it was estimated that as many as a hundred would never be so. Again, this was the price that East Anglia was paying for being off the beaten track. It was not the indolence of the county councils which ensured the continuance of this residue of highly photogenic rural slums, as Moorhouse put it, so much as cold economics. Both counties had, acre for acre, among the smallest population densities in England, and in neither was there very much industry. Therefore, under the rating system of that time, based on property values and businesses, they were unable to raise sufficient funds to provide even these basic services, as we would see them now. Norfolk claimed to have the lowest rateable value among the English counties, and Suffolk was not much better off. They simply did not have the ‘wherewithal’ to make these small communities fit for human habitation. But this simple fact was little ‘comfort’ to those who had to live in them.

img_9755

County Hall, Norwich.

For a survey which it undertook for its 1951 development plan, East Suffolk County Council had decided that basic communal necessities consisted of at least a food shop, a non-food shop, a post office, a school, a doctor’s surgery and/or clinic, a village hall, and a church. When it took a long, hard look at its villages, it found that only forty-seven had all of these things, that ninety-three had the three basic requirements (food shop, school, village hall), that 133 had only one or two of them and that thirty-one had none. A similar survey by the West Suffolk County Council showed that only sixteen per cent of its 168 parishes had all the facilities and that about the same proportion had none. When the county authorities made a follow-up survey in 1962, using the same criteria, they found that the position of these rural communities had hardly changed in a decade. There were many more surgeries, due to the growing provision of the NHS, but the number of village schools had dropped from 103 to 92 and of non-food shops from fifty to twenty-seven.

001

 Suffolk County flag.

In 1964, a regional South-East Plan was being considered, which included both Suffolk and Norfolk. Moorhouse considered that it might transform the whole of East Anglia into something more closely approximating Hertfordshire or Essex in terms of economic development. But he also felt that unless there was a change of national direction, the East Country could not stay as it was, virtually inviolate, its people so conscious of their inaccessibility that they frequently referred to the rest of England as ‘The Shires’, with so many of them eking out a living in small rural communities as their forefathers had done for generations. It was scarcely surprising, wrote Moorhouse, that the young were leaving, looking for something better. The appeal of bigger towns and cities, with their exciting anonymity, was great enough for many whose childhood and adolescence had been spent wholly in the confining atmosphere of the village. Combined with the lack of basic amenities and work opportunities, this left young people with few reasons to stay.

Power, Ports & Progress:

74

A lonely stretch of coast near Leiston, still enjoyed by caravanners and campers, was the site of another important development. There, at Sizewell, Britain’s second nuclear power station was built in the early 1960s (the first was built at Windscale in Cumbria in the late fifties). In 1966, power began surging out from the grey, cuboid plant (a model of which – pictured above – can be seen at the Richard Garrett museum in Leiston) into the national grid. By the late seventies, Sizewell’s 580,000 kilowatts were going a long way towards meeting eastern England’s electricity needs.

DSC09797

Sizewell Nuclear Power Station (2014)

The docks also began to be modernised, with ports like Tilbury and Felixstowe hastening the decline of London, which could not handle containerised freight. In addition, most of the Suffolk ports were no further from London than those of Kent and they were a great deal closer to the industrial Midlands and North. In 1955 the Felixstowe Dock and Railway Company had on its hands a dilapidated dock that needed dredging, and warehouses, quays and sea walls all showing signs of storm damage. The total labour force was nine men. By the mid-seventies, the dock area covered hundreds of acres, many reclaimed, made up of spacious wharves, warehouses and storage areas equipped with the latest cargo handling machinery. The transformation began in 1956 as the direct result of foresight and careful planning. The Company launched a three million pound project to create a new deepwater berth geared to the latest bulk transportation technique – containerisation. It calculated that changing trading patterns and Felixstowe’s proximity to Rotterdam and Antwerp provided exciting prospects for an efficient, well-equipped port. Having accomplished that, it set aside another eight million for an oil jetty and bulk liquid storage facilities. In addition, a passenger terminal was opened in 1975. The dock soon acquired a reputation for fast, efficient handling of all types of cargo, and consignments could easily reach the major industrial centres by faster road and rail networks.

DSC09982

Looking across the estuary from Harwich to the Felixstowe container port today.

DSC09983

Increasing trade crammed Suffolk’s main roads with lorries and forced an expansion and improvement of port facilities. The development of new industries and the growth of the east coast ports necessitated a considerable programme of trunk road improvement. From the opening of the first stretches of motorway in the winter of 1958/59, including the M1, there was a major improvement in the road network. By 1967 motorways totalled 525 miles in length, at a cost of considerable damage to the environment. This continued into the mid-seventies at a time when economic stringency was forcing the curtailment of other road building schemes. East Anglia’s new roads were being given priority treatment for the first time. Most of the A12, the London-Ipswich road, was made into a dual carriageway. The A45, the artery linking Ipswich and Felixstowe with the Midlands and the major motorways, had been considerably improved. Stowmarket, Bury St Edmunds and Newmarket had been bypassed. By the end of the decade, the A11/M11 London-Norwich road was completed, bringing to an end the isolation of central Norfolk and Suffolk.

021 (10)

DSC09844


Above Left: An old milestone in the centre of Woodbridge, Suffolk; Right: The M1 at Luton Spur, opened 1959.

Culture, Landscape & Heritage; Continuity & Conflict:

 

DSCN0790

Suffolk remained a haven for artists, writers and musicians. Indeed, if the county had any need to justify its existence it would be sufficient to read the roll call of those who have found their spiritual home within its borders. Among them, and above them, towers Benjamin Britten, who lived in Aldeburgh and drew inspiration from the land and people of Suffolk for his opera Peter Grimes. The composer moved to the seaside town in 1947 on his return from the USA and almost at once conceived the idea of holding a festival of arts there. It began quietly the following year but grew rapidly thereafter as the activities multiplied – concerts, recitals, operas and exhibitions – and every suitable local building was made use of. Many great artists came to perform and the public came, from all over the world, to listen. Britten had long felt the need for a large concert hall with good acoustics but he did not want to move the festival away from Aldeburgh and the cost of building a new hall was prohibitive.

DSCN0792

In October 1965, the lease of part of a disused ‘maltings’ at nearby Snape became available. It was in a beauty spot at a bridge over the River Alde (pictured above), and architects and builders were soon drafted in to transform the site into a concert hall and other facilities for making music. Queen Elizabeth II opened the buildings in June 1967, but almost exactly two years later disaster struck when the Maltings was burnt out. Only the smoke-blackened walls were left standing, but there was an almost immediate determination that the concert hall would be rebuilt. Donations poured in from all over the world and in less than forty-two weeks the hall had been reconstructed to the original design, and the complex was extended by adding rehearsal rooms, a music library, an art gallery, an exhibition hall and other facilities.

003

The Suffolk shore or, to be more accurate, ‘off-shore’ also made a crucial contribution to the breakthrough of popular or ‘pop’ music in Britain. At Easter 1964 the first illegal ‘pirate’ radio station, Radio Caroline, began broadcasting from a ship just off the Suffolk coast (see map, right). Within months, millions of young people were listening to Radio Caroline North and Radio Caroline South, Radio London and other pirate stations that sprang up. Not only did they broadcast popular music records, but they also reminded their listeners that any attempt to silence them would constitute a direct ‘attack on youth’.

007 (25)

With the advent of these radio stations, the BBC monopoly on airtime was broken, and bands were able to get heard beyond their concerts. Eventually, the Government acted to bring an end to its ‘cold war’ with the British record industry: in August 1967 the Marine Offences Act outlawed the pirate ships, and the BBC set up Radio One to broadcast popular records.

Back on dry land, there were areas of conflict, then as now, in which the interests of farmers, businessmen, holidaymakers and country residents clashed. When a farmer rooted out hedges, sprayed insecticides indiscriminately and ploughed up footpaths, he soon had conservationists and countryside agencies on his back. When schedule-conscious truck drivers thundered their way through villages, there were angry protests.

019 (17)

002

Saxtead Green’s post mill (see OS map above for location near Framlingham) as it looked in the 1970s when it was maintained by the Department of the Environment; it is now managed (2018) by English Heritage.

w290 (1)

There were also, still, many for whom the images of Constable’s rolling landscapes were set in their mind’s eye. For them, this was, above all, his inviolable country. It was also dotted with windmills, another echo of earlier continental associations, many of them still working. Every new building project was examined in great detail by environmentalists.

Many local organisations were formed to raise awareness about and resist specific threats to rural heritage, such as the Suffolk Preservation Society and Suffolk Historic Churches Trust.

001

DSC09864

Most of the churches, like the very early example at Rendlesham (right), were built of flint, both in Suffolk and in Norfolk, where a great number of them have round towers, a feature for which that county is especially known. The farming people of Barsham in the Waveney Valley added their church to the Norman round tower in the fourteenth century (pictured above). After that, they could not afford elaborate additions. When the nave needed re-roofing, modest thatch seemed to offer the best solution. Suffolk, in particular, had an incredibly rich and well-preserved heritage which gave it its distinct county identity.

DSC09863

Almost every church had a superb timber roof, described by Moorhouse as a complex of rafters, kingposts, and hammerbeams which look, as you crane your neck at them, like the inverted hold of a ship (the one pictured left is, again, from Rendlesham). Very often these medieval churches are miles from any kind of community, emphasising the peculiarly lonely feeling of most of the area. Most are the remains of the Black Death villages, where the plague killed off the entire population and no one ever came back.

Around its magnificent ‘wool church’ (pictured below), the half-timbered ‘perfection’ of Lavenham might not have survived quite so completely had it been located in the South of England. This was one of the hidden benefits of the county’s relative isolation which had, nevertheless, come to an end by the late seventies.

023

On the other hand, Wilson has reminded us that the wool-rich men of the town rebuilt their church almost entirely between 1485 and 1530 in the magnificent, new Perpendicular style, yet it remains today and is widely viewed as the crowning glory of ecclesiastical architecture in Suffolk. 

DSC09666

Many more of the county’s churches are not as medieval as they look (see the fifteenth-century additions to the transepts of St Michael’s, Framlingham, above), which may challenge our contemporary view of the balance between preservation and progress. In 1974 the Department of the Environment produced a report called Strategic Choice for East Anglia. It forecast a population of over eight hundred thousand in Suffolk alone by the end of the century. It saw the major towns growing much larger and suggested that the counties would inevitably lose some of their individuality:

We know … that the change and the growth … will make East Anglia more like other places. For some, this will mean the growth should be resisted, and the opportunities which it brings should be foregone. Whether or not we sympathise with this point of view, we do not think it is practicable. Much of the change and growth that is coming cannot be prevented by any of the means that is likely to be available. The only realistic approach is to recognize this, and take firm, positive steps to maintain and even enhance the environment of the region, using the extra resources that growth will bring …

By the time the report was published, the people of East Anglia had already begun, as they had always done in earlier times, to face up to many of the problems which change and development brought their way.

 

Sources:

Joanna Bourke, et al. (2001), The Penguin Atlas of British & Irish History. London: Penguin Books.

John Simpson (1999), Strange Places, Questionable People. Basingstoke: Macmillan Pan.

Derek Wilson (1977), A Short History of Suffolk. London: Batsford.

Geoffrey Moorhouse (1964),… Harmondsworth: Penguin Books.

004

Posted November 1, 2018 by TeamBritanniaHu in Affluence, Agriculture, Assimilation, BBC, Britain, British history, Christian Faith, Christian Socialism, Christianity, Church, Civilization, cleanliness, Co-operativism, Cold War, Commemoration, Conservative Party, Demography, Domesticity, East Anglia, Education, Elementary School, Europe, European Economic Community, Factories, Family, Great War, History, Home Counties, Hungary, Immigration, Integration, Journalism, Labour Party, manufacturing, Medieval, Midlands, Migration, Music, Mythology, Narrative, National Health Service (NHS), Norfolk, Population, Poverty, Refugees, Respectability, Scotland, Second World War, Suffolk, Tudor times, Uncategorized, Welfare State, World War One, World War Two


The Other Side of the Eighties in Britain, 1983-1988: The Miners and The Militants.

Labour – Dropping the Donkey Jacket:

From 1980 to 1983, Michael Foot’s leadership had saved the Labour Party from splitting into two, but in all other respects, it was a disaster. He was too old, too decent, too gentle to take on the hard left or to modernise his party. Foot’s policies were those of a would-be parliamentary revolutionary detained in the second-hand bookshops in Hay-on-Wye. I enjoyed this experience myself in 1982, with a minibus full of bookish ‘revolutionaries’ from Cardiff, who went up there, as it happened, via Foot’s constituency. When roused, which was often, his Cromwellian hair would flap across a face contorted with passion, his hands would whip around excitedly and denunciations would pour forth from him with a fluency ‘old Noll’ would have envied. During his time as leader, he was in his late sixties, and would have been PM at seventy had he won the 1983 General Election, which, of course, was never a remote possibility. Unlike Thatcher, he was contemptuous of the shallow presentational tricks demanded by television, and he could look dishevelled, being famously denounced for wearing a ‘donkey jacket’ (in reality, a Burberry-style woollen coat) at the Remembrance Service at the Cenotaph. But he was more skilled than anyone I saw then or have seen since in whipping up the socialist faithful in public meetings, or in finger-stabbing attacks on the Tory government in the House of Commons, both in open debates and questions to the PM. He would have been happier communing with Jonathan Swift and my Gulliver forebears in Banbury than attempting to operate in a political system which depended on television performances, ruthless organisation and managerial discipline. He was a political poet in an age of prose.

Nobody in the early eighties could have reined in the party’s wilder members; Foot did his best but led the party to its worst defeat in modern times, on the basis of a hard-left, anti-Europe, anti-nuclear, pro-nationalisation manifesto famously described by Gerald Kaufman as the longest suicide note in history. Kaufman had also urged Foot to stand down before the election. It was a measure of the affection felt for him that his ‘swift’ retirement after the defeat was greeted with little recrimination. Yet it also meant that when Neil Kinnock won the subsequent leadership election he had a mandate for change no previous Labour leader had enjoyed. He won with seventy-one per cent of the electoral college votes, against nineteen per cent for Roy Hattersley. Tony Benn was out of Parliament, having lost his Bristol seat, and so could not stand as the standard-bearer of the hard left. Kinnock had been elected after a series of blistering campaign speeches, a Tribunite left-winger who, like Foot, advocated the unilateral abandonment of all Britain’s nuclear weapons, believed in nationalisation and planning and wanted Britain to withdraw from the European Community. A South Wales MP from the same Bevanite stock as Foot, he also supported the abolition of private medicine and the repeal of the Tory trade union reforms. To begin with, the only fights he picked with the Bennites were over the campaign to force Labour MPs to undergo mandatory reselection, which handed a noose to local Militant activists. Yet after the chaos of the 1983 campaign, he was also sure that the party was in need of radical remedies.

003

To win power, Labour needed to present itself better in the age of the modern mass media. Patricia Hewitt (pictured above), known for her campaigning on civil liberties, joined Kinnock’s new team. She had been chosen to fight Leicester East in the 1983 Election but was unsuccessful. In her new role, she began trying to control interviews and placing the leader in more flattering settings than those Foot had found himself in. Kinnock knew how unsightly ‘old’ Labour had looked to the rest of the country and was prepared to be groomed. He gathered around him a ‘Pontypool front row’ of tough, aggressive heavyweights, including Charles Clarke, the former communist NUS leader, and John Reid, another former communist and Glaswegian backbench bruiser; Hewitt herself and Peter Mandelson, grandson of Herbert Morrison and Labour’s side-stepping future director of communications, led the three-quarter line, with Kinnock himself as the able scrum-half. Kinnock was the first to flirt with the once-abhorred world of advertising and to seek out the support of pro-Labour pop artists such as Tracey Ullman and Billy Bragg. In this, he was drawing on a long tradition on the Welsh left, from Paul Robeson to the Hennesseys. He smartened up his own style, curtailing the informal mateyness which had made him popular among the ‘boyos’, and introduced a new code of discipline in the shadow cabinet.

004

Neil Kinnock attacking the Militant Tendency at the party conference in 1985.

In the Commons, he tried hard to discomfit Thatcher at her awesome best, which was difficult and mostly unsuccessful. The mutual loathing between them was clear for all to see, and as Thatcher’s popularity began to decline in 1984, Labour’s poll ratings slowly began to improve. But the party harboured a vocal minority of revolutionaries of one kind or another. They included not only the long-term supporters of Tony Benn, like Jeremy Corbyn, but also Arthur Scargill and his brand of insurrectionary syndicalism; the Trotskyist Militant Tendency, a front for the Revolutionary Socialist League, which had been steadily infiltrating the party since the sixties; and assorted hard-left local councillors, like Derek Hatton in Liverpool, a Militant member, who were determined to defy Thatcher’s government, no matter how big its democratic mandate, by various ‘ultra vires’ and illegal stratagems. Kinnock dealt with them all. Had he not done so, New Labour would never have happened, yet he himself was a passionate democratic socialist whose own politics were well to the left of the country.

Neil Kinnock was beginning a tough journey towards the centre-ground of British politics, which meant leaving behind people who sounded much like his younger self. On this journey, much of his natural wit and rhetoric would be silenced. He had created his leadership team as if it were a rugby team, involved in a confrontational contact sport against opponents who were fellow enthusiasts, but with their own alternative strategy. He found that political leadership was more serious, drearier and nastier than rugby. And week after week, he was also confronting, in Thatcher, someone whose principles had been set firm long before and whose politics clearly and consistently expressed those principles on the field of play. Yet, like a Welsh scrum-half, he was always on the move, always having to shadow and shade, to side-step and shimmy, playing the ball back into the scrum or sideways to his three-quarters rather than kicking it forward. The press soon dubbed him ‘the Welsh windbag’, due to his long, discursive answers in interviews.

001 (3)

The first and toughest example of what he was up against came with the miners’ strike. Neil Kinnock and Arthur Scargill (above) had already shown their loathing for each other over the mainstream leadership’s battles with the Bennites. The NUM President was probably the only person on the planet that Kinnock hated more than Thatcher. He distrusted Scargill’s aims, despised his tactics and realised early on that he was certain to fail. In this, he was sharing the views of the South Wales NUM, who had already forced a U-turn on closures from an unprepared Thatcher in 1981. Yet they, and he, had to remain true to their own traditions and heritage. They both found themselves in an embarrassing situation, but more importantly, they realised that, like it or not, they were in an existential struggle. As the violence spread, the Conservatives in the Commons and their press continually goaded and hounded him to denounce the use of ‘flying pickets’ and to praise the police. He simply could not do so, as so many on his own side had experienced the violence of the police, or heard about it from those who had. For him to attack the embattled trade union would be seen as the ultimate betrayal by a Labour leader. He was caught between the rock of Thatcher and the hard place of Scargill. In the coalfields, even in South Wales, he was shunned on the picket lines as the miner’s son too ‘frit’, in Thatcher’s favourite phrase, to come to the support of the miners in their hour of need. Secretly, however, there was some sympathy for his impossible situation among the leadership of the South Wales NUM. Kinnock at least managed to avoid fusing Labour and the NUM in the minds of many Labour voters, ensuring that Scargill’s ultimate, utter defeat was his alone. But this lost year destroyed his early momentum and stole his hwyl, his Welsh well-spring of ‘evangelical’ socialist spirit.

The Enemy Within?:

002

Above: Striking Yorkshire miners barrack moderate union leaders in Sheffield.

The first Thatcher government had been dominated by the Falklands War; the second was dominated by the miners’ strike. Spurred on by ‘the spirit of the Falklands’, the government took a more confrontational attitude towards the trade unions after the 1983 General Election. This year-long battle, 1984-5, was the longest strike in British history, the most bitter, bloody and tragic industrial dispute since the General Strike and six-month Miners’ Lock-out of 1926. Fought amid scenes of mass picketing and running battles between the police and the miners, it ended in the total defeat of the miners, followed by the end of deep coal-mining in Britain. In reality, the strike simply accelerated the continuing severe contraction in the industry which had begun in the early eighties and which the South Wales NUM had successfully resisted in what turned out, however, to be a Pyrrhic victory. By 1984, the government had the resources, the popular mandate and the dogged determination to withstand the miners’ demands. The industry had all but vanished from Kent, while in Durham two-thirds of the pits were closed. They were often the only real source of employment in local communities, so the social impact of closures proved devastating. In the Durham pit villages, the entire local economy was crippled and the miners’ housing estates gradually became the ghost areas they remain today.

001

The government had little interest in ensuring the survival of the industry, with its troublesome and well-organised union which had already won a national strike against the Heath government a decade earlier. For the Thatcher government, the closures resulting from the defeat of the strike were a price it was willing to pay in order to teach bigger lessons. Later, the Prime Minister of the time reflected on these:

What the strike’s defeat established was that Britain could not be made ungovernable by the Fascist Left. Marxists wanted to defy the law of the land in order to defy the laws of economics. They failed and in doing so demonstrated just how mutually dependent the free economy and a free society really are.

It was a confrontation which was soaked in history on all sides. For the Tories, it was essential revenge for Heath’s humiliation, a score they had long been eager to settle; Margaret Thatcher spoke of Arthur Scargill and the miners’ leaders as ‘the enemy within’, as compared to Galtieri, the enemy without. For thousands of traditionally ‘militant’ miners, it was their last chance to end decades of pit closures and save their communities, which were under mortal threat. For their leader Arthur Scargill, it was an attempt to follow Mick McGahey in pulling down the government and winning a class war. Unlike the union’s former, more moderate leaders, he was no more interested than the government in the details of pay packets, or in a pit-by-pit review to determine which pits were truly uneconomic. He was determined to force the government, in Thatcher’s contemptuous phrase, to pay for mud to be mined rather than see a single job lost.

The Thatcher government had prepared more carefully than Scargill. Following the settlement with the South Wales NUM, the National Coal Board (NCB) had spent the intervening two years working with the Energy Secretary, Nigel Lawson, to pile up supplies of coal at the power stations; stocks had steadily grown, while consumption and production both fell. Following the riots in Toxteth and Brixton, the police had been retrained and equipped with full riot gear without which, ministers later confessed, they would have been unable to beat the pickets. Meanwhile, Thatcher had appointed a Scottish-born Australian, Ian MacGregor, to run the NCB. He had a fierce reputation as a union-buster in the US and had been brought back to Britain to run British Steel where closures and 65,000 job cuts had won him the title ‘Mac the Knife’. Margaret Thatcher admired him as a tough, no-nonsense man, a refreshing change from her cabinet, though she later turned against him for his lack of political nous. His plan was to cut the workforce of 202,000 by 44,000 in two years, then take another twenty thousand jobs out. Twenty pits would be closed, to begin with. When he turned up to visit mines, he was abused, pelted with flour bombs and, on one occasion, knocked to the ground.

Arthur Scargill was now relishing the coming fight as much as Thatcher. In the miners’ confrontation with Heath, Scargill had led the flying pickets at the gates of the Saltley coke depot outside Birmingham. Some sense of both his revolutionary ‘purity’, combined with characteristic Yorkshire bluntness, comes from an exchange he had with Dai Francis, the Welsh Miners’ leader at that time. He had called Francis to ask for Welsh pickets to go to Birmingham and help at the depot. Francis asked when they were needed and Scargill replied:

“Tomorrow, Saturday.”

“But Wales are playing Scotland at Cardiff Arms Park.”

“But Dai, the working class are playing the ruling class at Saltley.”

009

Many found Scargill inspiring; many others found him scary. Like Francis, he had been a Communist, but unlike Dai (pictured above, behind the poster, during the 1972 strike), he retained hard-line Marxist views and a penchant for denouncing anyone who disagreed with him. Kim Howells, also a former Communist and an officer of the South Wales NUM, gained a sense of Scargill’s megalomania when, just prior to the 1984-5 strike, he visited his HQ in Barnsley, already known as ‘Arthur’s Castle’. Howells, a historian of the Welsh Labour movement who later became an MP and New Labour minister, was taken aback to find him sitting at a vast, Mussolini-style desk with a great space in front of it. Behind him was a huge painting of himself on the back of a lorry, posed like Lenin, urging picketing workers in London to overthrow the ruling class. Howells thought anyone who could put up a painting like that was nuts and returned to Pontypridd to express his fears to the Welsh miners:

And of course the South Wales executive almost to a man agreed with me. But then they said, “He’s the only one we’ve got, see, boy.  The Left has decided.”

Scargill had indeed been elected by a huge margin and had set about turning the NUM’s once moderate executive, led by Joe Gormley, into a militant group. The Scottish Miners’ leader, Mick McGahey, although older and wiser than his President, was his Vice-President. Scargill had been ramping up the rhetoric for some time. He had told the NUM Conference in 1982, …

If we do not save our pits from closure then all our other struggles will become meaningless … Protection of the industry is my first priority because without jobs all our other claims lack substance and become mere shadows. Without jobs, our members are nothing …

Given what was about to happen to his members’ jobs as a result of his uncompromising position in the strike, there is a black irony in those words. By insisting that no pits should be closed on economic grounds, even if the coal was exhausted, and that more investment would always find more coal, he made the losses irrelevant from his point of view. He made sure that confrontation would not be avoided. An alternative strategy put forward by researchers for the South Wales NUM was that it was the NCB’s economic arguments that needed to be exposed, along with the fact that it was using the Miners’ Pension Fund to invest in the production of cheap coal in Poland and South Africa. Its definition of what was ‘economic’ in Britain rested on the comparative cost of importing this coal from overseas. If the NCB had invested these funds at home, the pits in Britain would not have been viewed as being as ‘uneconomic’ as it claimed. But Scargill was either not clever enough to deploy these arguments or too determined to pursue the purity of his brand of revolutionary syndicalism, or both.

The NUM votes which allowed the strike to start covered both pay and closures, but from the start Scargill emphasised the closures. To strike to protect jobs, particularly other people’s jobs, in other people’s villages and other countries’ pits, gave the confrontation an air of nobility and sacrifice which a mere wages dispute would not have enjoyed. But national wage disputes had, for more than sixty years, been about arguments over the ‘price of coal’ and the relative difficulties of extracting it from a variety of seams in very different depths across the various coalfields. Neil Kinnock, the son and grandson of Welsh miners, found it impossible to condemn Scargill’s strategy without alienating support for Labour in its heartlands. He did his best to argue the economics of the miners’ case, and to condemn the harshness of the Tory attitude towards them, but these simply ran parallel to polarised arguments which were soon dividing the nation.

Moreover, like Kinnock, Scargill was a formidable organiser and conference-hall speaker, though there was little economic analysis to back up his rhetoric. Yet not even he would be able to persuade every part of the industry to strike. Earlier ballots had shown consistent majorities against striking. In Nottinghamshire, seventy-two per cent of the area’s 32,000 miners voted against striking. The small coalfields of South Derbyshire and Leicestershire were also against. Even in South Wales, half of the NUM lodges failed to vote for a strike. Overall, of the seventy thousand miners who were balloted in the run-up to the dispute, fifty thousand had voted to keep working. Scargill knew he could not win a national ballot, so he decided on a rolling series of locally called strikes, coalfield by coalfield, beginning in Yorkshire, then Scotland, followed by Derbyshire and South Wales. These strikes would merely be approved by the national union. It was a domino strategy; the regional strikes would add up to a national strike, but without a national ballot.

But Scargill needed to be sure the dominoes would fall. He used the famous flying pickets from militant areas to shut down less militant ones. Angry miners were sent in coaches and convoys of cars to close working pits and the coke depots, vital hubs of the coal economy. Without the pickets, who to begin with rarely needed to use violence to achieve their end, far fewer pits would have come out. But after scenes of physical confrontation around Britain, by April 1984 four miners in five were on strike. There were huge set-piece confrontations with riot-equipped police bused up from London or down from Scotland, Yorkshire to Kent and Wales to Yorkshire, generally used outside their own areas in order to avoid mixed loyalties. As Andrew Marr has written, …

It was as if the country had been taken over by historical re-enactments of civil war battles, the Sealed Knot Society run rampant. Aggressive picketing was built into the fabric of the strike. Old country and regional rivalries flared up, Lancashire men against Yorkshire men, South Wales miners in Nottinghamshire.

The Nottinghamshire miners turned out to be critical since without them the power stations, even with the mix of nuclear and oil, supplemented by careful stockpiling, might have begun to run short and the government would have been in deep trouble. To Scargill’s dismay, however, other unions also refused to come out in sympathy, thus robbing him of the prospect of a General Strike, and it soon became clear that the NUM had made other errors in their historical re-enactments. Many miners were baffled from the beginning as to why Scargill had opted to strike in the spring, when the demand for energy was relatively low and the stocks at the power stations were not running down at anything like the rate which the NUM needed in order to make their action effective. This was confirmed by confidential briefings from the power workers, and it seemed that the government just had to sit out the strike.

In this civil war, the police had the cavalry, while the miners were limited to the late twentieth-century equivalent of Okey’s dragoons at Naseby: their flying pickets, supporting their poor bloody infantry, albeit well-drilled and organised. Using horses, baton charges and techniques learned in the aftermath of the street battles at Toxteth and Brixton, the police defended working miners with a determination which delighted the Tories and alarmed many others, not just the agitators for civil rights. An event which soon became known as the Battle of Orgreave (in South Yorkshire) was particularly brutal, involving ‘Ironside’ charges by mounted police in lobster-pot style helmets into thousands of miners with home-made pikes and pick-axe handles.

The NUM could count on almost fanatical loyalty in coalfield towns and villages across Britain. Miners gave up their cars, sold their furniture, saw their wives and children suffer and lost all they had in the cause of solidarity. Food parcels arrived from other parts of Britain, from France and most famously, from Soviet Russia. But there was a gritty courage and selflessness in mining communities which, even after more than seventy years of struggle, most of the rest of Britain could barely understand. But an uglier side to this particularly desperate struggle also emerged when a taxi-driver was killed taking a working miner to work in Wales. A block of concrete was dropped from a pedestrian bridge onto his cab, an act swiftly condemned by the South Wales NUM.

In Durham, the buses taking other ‘scabs’ to work in the pits were barraged with rocks and stones, as later portrayed in the film Billy Elliot. The windows had to be protected with metal grilles. There were murderous threats made to strike-breaking miners and their families, and even trade union ‘allies’ were abused. Norman Willis, the amiable general secretary of the TUC, had a noose dangled over his head when he spoke at one miners’ meeting. This violence was relayed to the rest of the country on the nightly news at a time when the whole nation still watched together. I remember the sense of helplessness I felt watching the desperation of the Welsh miners from my ‘exile’ in Lancashire, having failed to find a teaching post in the depressed Rhondda in 1983. My Lancastrian colleagues were as divided as the rest of the country over the strike, often within themselves as well as from others. In the end, we found it impossible to talk about the news, no matter how much it affected us.

Eventually, threatened by legal action on the part of the Yorkshire miners claiming they had been denied a ballot, the NUM was forced onto the back foot. The South Wales NUM led the calls from within for a national ballot to decide on whether the strike should continue. Scargill’s decision to accept a donation from Colonel Gaddafi of Libya stripped him of whatever moral ground he had once occupied. As with Galtieri, Thatcher was lucky in the enemies ‘chosen’ for her. Slowly, month by month, the strike began to crumble and miners began to trail back to work. A vote to strike by pit safety officers and overseers, which would have shut down the working pits, was narrowly avoided by the government. By January 1985, ten months after they had first been brought out, strikers were returning to work at the rate of 2,500 a week, and by the end of February, more than half the NUM’s membership was back at work. In some cases, especially in South Wales, they marched back proudly behind brass bands.


Above: ‘No way out!’ – picketing miners caught and handcuffed to a lamp-post by police.

Scargill’s gamble had gone catastrophically wrong. He has been compared to a First World War general, a donkey sending lions to the slaughter, though at Orgreave and elsewhere, he had stood with them too. But the political forces engaged against the miners in 1984 were entirely superior in strength to those at the disposal of the ill-prepared Heath administration of ten years earlier. A shrewder, non-revolutionary leader would not have chosen to take on Thatcher’s government at the time Scargill did, or having done so, would have found a compromise after the first months of the dispute. Today, there are only a few thousand miners left of the two hundred thousand who went on strike. An industry which had once made Britain into a great industrial power, but was always dangerous, disease-causing, dirty and polluting, finally lay down and died. For the Conservatives, and perhaps, by the end of the strike, for the majority of moderate British people, Scargill and his lieutenants were fighting parliamentary democracy and were, therefore, an enemy which had to be defeated. But the miners of Durham, Derbyshire, Kent, Fife, Yorkshire, Wales and Lancashire were nobody’s enemy. They were abnormally hard-working, traditional people justifiably worried about losing their jobs and loyal to their union, if not to the stubborn syndicalists in its national leadership.

Out with the Old Industries; in with the New:

In Tyneside and Merseyside, a more general deindustrialisation accompanied the colliery closures. Whole sections of industry, not only coal but also steel and shipbuilding, virtually vanished from many of their traditional areas.  Of all the areas of Britain, Northern Ireland suffered the highest level of unemployment, partly because the continuing sectarian violence discouraged investment. In February 1986, there were officially over 3.4 million unemployed, although statistics were manipulated for political reasons and the real figure is a matter of speculation. The socially corrosive effects were felt nationally, manifested in further inner-city rioting in 1985. Inner London was just as vulnerable as Liverpool, a crucial contributory factor being the number of young men of Asian and Caribbean origin who saw no hope of ever entering employment: opportunities were minimal and they felt particularly discriminated against. The term ‘underclass’ was increasingly used to describe those who felt themselves to be completely excluded from the benefits of prosperity.

Prosperity there certainly was, for those who found alternative employment in the service industries. Between 1983 and 1987, about 1.5 million new jobs were created. Most of these were for women, and part-time vacancies predominated. The total number of men in full-time employment fell still further, and many who left manufacturing for the service sector earned much-reduced incomes. The economic recovery that led to the growth of this new employment was based mainly on finance, banking and credit. Little was invested in British manufacturing. Far more was invested overseas; British foreign investments rose from £2.7 billion in 1975 to a staggering £90 billion in 1985. At the same time, there was a certain amount of re-industrialisation in the South East, where new industries employing the most advanced technology grew. In fact, many industries shed a large proportion of their workforce but, using new technology, maintained or improved their output.

These new industries were not confined to the South East of England: Nissan built the most productive car plant in Europe at Sunderland. After an extensive review, Sunderland was chosen for its skilled workforce and its location near major ports. The plant was completed in 1986 as the subsidiary Nissan Motor Manufacturing (UK) Ltd. Siemens established a microchip plant at Wallsend on Tyneside in which it invested £1.1 billion. But such industries tended not to be large-scale employers of local workers. Siemens only employed about 1,800. Traditional regionally-based industries continued to suffer a dramatic decline during this period. Coal-mining, for example, was decimated in the years following the 1984-5 strike, not least because of the electricity generation industry’s shift towards alternative energy sources, especially gas. During 1984-7 the coal industry shed 170,000 workers.

The North-South Divide – a Political Complex?:

By the late 1980s, the north-south divide in Britain seemed as intractable as it had been all century, with high unemployment particularly concentrated in the declining extractive and manufacturing industries of the North of England, Scotland and Wales. That the north-south divide increasingly had a political as well as an economic complexion was borne out by the outcome of the 1987 General Election. While Margaret Thatcher was swept back to power for the third time, her healthy Conservative majority was largely based on the voters of the South and East of England. North of a line roughly between the Severn and the Humber, the long decline of the Tories, especially in Scotland, where they were reduced to ten seats, was increasingly apparent. At the same time, the national two-party system seemed to be breaking down. South of the Severn-Humber line, where Labour seats were now very rare outside London, the Liberal-SDP Alliance were the main challengers to the Conservatives in many constituencies.

The Labour Party continued to pay a heavy price for its internal divisions, as well as for the bitterness engendered by the miners’ strike. It is hardly Neil Kinnock’s fault that he is remembered for his imprecise long-windedness, the product of self-critical and painful political readjustment. His admirers recall his great platform speeches, the saw-edged wit and air-punching passion. There was one occasion, however, when Kinnock spoke so well that he united most of the political world in admiration. This happened at the Labour conference in Bournemouth in October 1985. A few days before the conference, Liverpool City Council, formally Labour-run but in fact controlled by the Revolutionary Socialist League, had sent out redundancy notices to its thirty-one thousand staff. The revolutionaries, known by the name of their newspaper, Militant, were a party-within-a-party, a parasitic body within Labour. They had some five thousand members who paid a proportion of their incomes to the RSL so that the Militant Tendency had a hundred and forty full-time workers, more than the staff of the Social Democrats and Liberals combined. They had a presence all around Britain, but Liverpool was their great stronghold. There they practised Trotsky’s politics of the transitional demand, the tactic of making impossible demands for more spending and higher wages so that when the ‘capitalist lackeys’ refused these demands, they could push on to the next stage, leading to collapse and revolution.

In Liverpool, where they were building thousands of new council houses, this strategy meant setting an illegal council budget and cheerfully bankrupting the city. Sending out the redundancy notices to the council’s entire staff was supposed to show Thatcher they would not back down, or shrink from the resulting chaos. Like Scargill, Militant’s leaders thought they could destroy the Tories on the streets. Kinnock had thought of taking them on a year earlier but had decided that the miners’ strike made that impossible. The Liverpool mayhem gave him his chance, so in the middle of his speech at Bournemouth, he struck. It was time, he said, for Labour to show the public that it was serious. Implausible promises would not bring political victory:

I’ll tell you what happens with impossible promises. You start with far-fetched resolutions. They are then pickled into a rigid dogma, a code, and you go through the years sticking to that, outdated, misplaced, irrelevant to the real needs, and you end in the grotesque chaos of a Labour council – a Labour council – hiring taxis to scuttle round a city handing out redundancy notices to its own workers.

By now he had whipped himself into real anger, a peak of righteous indignation, but he remained in control. His enemies were in front of him, and all the pent-up frustrations of the past year were being released. The hall came alive. Militant leaders like Derek Hatton stood up and yelled ‘lies!’ Boos came from the hard left, and some of their MPs walked out, but Kinnock was applauded by the majority in the hall, including his mainstream left supporters. Kinnock went on with a defiant glare at his opponents:

I’m telling you, and you’ll listen, you can’t play politics with people’s jobs and with people’s services, or with their homes. … The people will not, cannot abide posturing. They cannot respect the gesture-generals or the tendency tacticians.

Most of those interviewed in the hall and many watching live on television, claimed it was the most courageous speech they had ever heard from a Labour leader, though the hard left remained venomously hostile. By the end of the following month, Liverpool District Labour Party, from which Militant drew its power, was suspended and an inquiry was set up. By the spring of 1986, the leaders of Militant had been identified and charged with behaving in a way which was incompatible with Labour membership. The process of expelling them was noisy, legally fraught and time-consuming, though more than a hundred of them were eventually expelled. There was a strong tide towards Kinnock across the rest of the party, with many left-wingers cutting their ties to the Militant Tendency. There were many battles with the hard left to come, and several pro-Militant MPs were elected in the 1987 Election. These included two Coventry MPs, Dave Nellist and John Hughes, ‘representing’ my own constituency, whose sole significant, though memorable ‘contribution’ in the House of Commons was to interrupt prayers. Yet by standing up openly to the Trotskyist menace, as Wilson, Callaghan and Foot had patently failed to do, Kinnock gave his party a fresh start. It began to draw away from the SDP-Liberal Alliance in the polls and did better in local elections. It was the moment when the New Labour project became possible.

A Third Victory and a Turning of the Tide:

Yet neither this internal victory nor the sharper management that Kinnock introduced, would bring the party much good against Thatcher in the following general election. Labour was still behind the public mood. Despite mass unemployment, Thatcher’s free-market optimism was winning through, and Labour was still committed to re-nationalisation, planning, a National Investment Bank and unilateral nuclear disarmament, a personal cause of both Neil and his wife, Glenys, over the previous twenty years. The Cold War was thawing and it was not a time for the old certainties, but for the Kinnocks support for CND was fundamental to their political make-up. So he stuck to the policy, even as he came to realise how damaging it was to Labour’s image among swing voters. Under Labour, all the British and US nuclear bases would be closed, the Trident nuclear submarine force cancelled, all existing missiles scrapped and the UK would no longer expect any nuclear protection from the US in time of war. Instead, more money would be spent on tanks and conventional warships. All of this did them a lot of good among many traditional Labour supporters; Glenys turned up at the women’s protest camp at Greenham Common. But it was derided in the press and helped the SDP to garner support from the ‘middle England’ people Labour needed to win back. In the 1987 General Election campaign, Kinnock’s explanation about why Britain would not simply surrender if threatened by a Soviet nuclear attack sounded as if he was advocating some kind of Home Guard guerrilla campaign once the Russians had arrived. With policies like this, he was unlikely to put Thatcher under serious pressure.

When the 1987 election campaign began, Thatcher had a clear idea about what her third administration would do. She wanted more choice for the users of state services. There would be independent state schools outside the control of local councillors, called grant-maintained schools.  In the health services, though it was barely mentioned in the manifesto, she wanted money to follow the patient. Tenants would be given more rights. The basic rate of income tax would be cut and she would finally sort out local government, ending the ‘rates’ and bringing in a new tax. On paper, the programme seemed coherent, which was more than could be said for the Tory campaign itself. Just as Kinnock’s team had achieved a rare harmony and discipline, Conservative Central Office was riven by conflict between politicians and ad-men. The Labour Party closed the gap to just four points and Mrs Thatcher’s personal ratings also fell as Kinnock’s climbed. He was seen surrounded by admiring crowds, young people, nurses, waving and smiling, little worried by the hostile press. In the event, the Conservatives didn’t need to worry. Despite a last-minute poll suggesting a hung parliament, and the late surge in Labour’s self-confidence, the Tories romped home with an overall majority of 101 seats and forty-two per cent of the vote, almost exactly the share they had won in 1983. Labour made just twenty net gains, and Kinnock, at home in Bedwellty, was inconsolable. Not even the plaudits his team had won from the press for the brilliance, verve and professionalism of their campaign would lift his mood.

The SDP-Liberal Alliance had been floundering in the polls for some time, caught between Labour’s modest revival and Thatcher’s basic and continuing popularity with a large section of voters. The rumours of the death of Labour had been greatly exaggerated, and the ‘beauty contest’ between the two Davids, Steel and Owen, had been the butt of much media mockery. Owen’s SDP had its parliamentary presence cut from eight MPs to five, losing Roy Jenkins in the process. While most of the party merged with the Liberals, an Owenite rump limped on for a while. Good PR, packaging and labelling were not good enough for either Labour or the SDP. In 1987, Thatcher had not yet created the country she dreamed of, but she could argue that she had won a third consecutive victory, not on the strength of military triumph, but on the basis of her ideas for transforming Britain. She also wanted to transform the European Community into a free-trade area extending to the Baltic, the Carpathians and the Balkans. In that, she was opposed from just across the Channel and from within her own cabinet.

In the late eighties, Thatcher’s economic revolution overreached itself. The inflationary boom was the product of the expansion of credit and a belief among ministers that, somehow, the old laws of economics had been abolished; Britain was now supposed to be on a continual upward spiral of prosperity. But then, on 27 October 1986, the London Stock Exchange ceased to operate as the institution had formerly done. Its physical floor, once heaving with life, was replaced by dealing done by computer and phone. The volume of trading was fifteen times greater than it had been in the early eighties. This became known as ‘the Big Bang’, and a country which had previously exported £2 billion-worth of financial services per year was soon exporting twelve times that amount. The effect of this on ordinary Britons was to take the brake off mortgage lending, turning traditional building societies into banks which started to thrust credit at the British public. Borrowing suddenly became a good thing to do and mortgages were extended rather than being paid off. The old rules about the maximum multiple of income began to dissolve. From being two and a half times the homeowner’s annual salary, four times became acceptable in many cases. House prices began to rise accordingly and a more general High Street splurge was fuelled by the extra credit now freely available. During 1986-88 a borrowing frenzy gripped the country, egged on by swaggering speeches about Britain’s ‘economic miracle’ from the Chancellor, Nigel Lawson, and the Prime Minister. Lawson later acknowledged:

My real mistake as Chancellor was to create a climate of optimism that, in the end, encouraged borrowers to borrow more than they should.

In politics, the freeing up and deregulation of the City of London gave Margaret Thatcher and her ministers an entirely loyal and secure base of rich, articulate supporters who helped see her through some tough battles. The banks spread the get-rich-quick prospect to millions of British people through privatisation share issues and the country, for a time, came closer to the share-owning democracy that Thatcher dreamed of.

The year after the election, 1988, was the real year of hubris. The Thatcher government began attacking independent institutions and bullying the professions. Senior judges came under tighter political control and University lecturers lost the academic tenure they had enjoyed since the Middle Ages. In Kenneth Baker’s Great Education Reform Bill (‘Gerbil’) of that year, Whitehall grabbed direct control over the running of the school curriculum, creating a vast new state bureaucracy to dictate what should be taught, when and how, and then to monitor the results. Teachers could do nothing. The cabinet debated the detail of maths courses; Mrs Thatcher spent much of her time worrying about the teaching of history. Working with history teachers, I well remember the frustration they felt at being forced to return to issues of factual content rather than being able to continue to enthuse young people with a love for exploring sources and discovering evidence for themselves. Mrs Thatcher preferred arbitrary rules of knowledge to the development of know-how. She was at her happiest when dividing up the past into packages of ‘history’ and ‘current affairs’. For example, the 1956 Hungarian Revolution was, she said, part of history, whereas the 1968 Prague Spring was, twenty years on, still part of ‘current affairs’ and so should not appear in the history curriculum, despite the obvious connections between the two events. It happened at a time when education ministers were complaining bitterly about the lack of talent, not among teachers, but among civil servants, the same people they were handing more power to. A Hungarian history teacher, visiting our advisory service in Birmingham, expressed his discomfort, having visited a secondary school in London where no-one in a Humanities class could tell him where, geographically, his country was.

At that time, my mother was coming to the end of a long career in NHS administration as Secretary of the Community Health Council (‘The Patients’ Friend’) in Coventry which, as elsewhere, had brought together local elected councillors, health service practitioners and managers, and patients’ groups to oversee the local hospitals and clinics and to deal with complaints. But the government did not trust local representatives and professionals to work together to improve the health service, so the Treasury seized control of budgets and contracts. To administer the new system, five hundred NHS ‘trusts’ were formed, and any involvement by elected local representatives was brutally terminated. As with Thatcher’s education reforms, the effect of these reforms was to create a new bureaucracy overseeing a regiment of quangos (quasi-autonomous non-governmental organisations). She later wrote:

We wanted all hospitals to have greater responsibility for their affairs.  … the self-governing hospitals to be virtually independent.

In reality, ‘deregulation’ of care and ‘privatisation’ of services were the orders of the day. Every detail of the ‘internal market’ contracts was set down from the centre, from pay to borrowing to staffing. The rhetoric of choice in practice meant an incompetent dictatorship of bills, contracts and instructions. Those who were able to vote with their chequebooks did so. Between 1980 and 1990, the number of people covered by the private health insurance Bupa nearly doubled, from 3.5 million to a little under seven million. Hubris about what the State could and could not do was to be found everywhere. In housing, 1988 saw the establishment of unelected Housing Action Trusts to take over the old responsibility of local authorities for providing what is now known as ‘affordable housing’. Mrs Thatcher claimed that she was trying to pull the State off people’s backs. In her memoirs, she wrote of her third government,

… the root cause of our contemporary social problems … was that the State had been doing too much.

Yet her government was intervening in public services more and more. The more self-assured she became, the less she trusted others to make the necessary changes to these. That meant accruing more power to the central state. The institutions most hurt in this process were local councils and authorities. Under the British constitution, local government is defenceless against a ‘Big Sister’ PM with a secure parliamentary majority and a loyal cabinet. So it could easily be hacked away, but sooner or later alternative centres of power, both at a local and national level, would be required to replace it and, in so doing, overthrow the overbearing leader.

Sources:

Andrew Marr (2008), A History of Modern Britain. Basingstoke: Pan Macmillan.

Peter Catterall, Roger Middleton & John Swift (2001), The Penguin Atlas of British & Irish History. London: Penguin Books.


Posted October 1, 2018 by TeamBritanniaHu in Midlands


The Rise of Thatcherism in Britain, 1979-83: Part Two.


Above: Denis Healey in combatant mood

Labour’s ‘Civil War’ and the Creation of the SDP:

As a general election loomed, with Labour in visible disarray, Margaret Thatcher moved within a couple of months from being one of the least popular prime ministers ever to being an unassailable national heroine. This was the result of two ‘factors’, the struggle for power within the Labour Party, which (as I wrote about in the first part of this article) began with Callaghan’s decision to step down as its leader in the autumn of 1980, and the Falklands Crisis and War of 1982.

Labour’s Civil War began with constitutional arguments about whether MPs should be able to be sacked by their local constituency parties. It became nasty, personal, occasionally physical, and so disgusted those outside its ranks that the party almost disappeared as an effective organisation. Undoubtedly, there was widespread bitterness on the left of the party about what were considered to be the right-wing policies of the defeated Wilson-Callaghan government, and about the small number of party conference decisions which found their way into Labour’s manifesto at the May 1979 election. In this atmosphere, the left wanted to take power away from right-wing MPs and their leadership and carry out a revolution from below. They believed that if they could control the party manifesto, the leadership election and bring the MPs to heel, they could turn Labour into a radical socialist party which would then destroy Thatcher’s economics at the next general election.

At Labour’s October 1980 Blackpool Conference, the left succeeded in voting through resolutions calling for Britain to withdraw from the European Community, unilateral disarmament, the closing of US bases in Britain, no incomes policy and State control of the whole of British industry, plus the creation of a thousand peers to abolish the House of Lords. Britain would become a kind of North Sea Cuba. The Trotskyite Militant Tendency, which had infiltrated the Labour Party, believed in pushing socialist demands so far that the democratic system would collapse and a full-scale class war would follow. Tony Benn, who thought that their arguments were sensible and that they made perfectly good rational points, saw Militant as no more of a threat than the old Tribune group or the pre-war Independent Labour Party. He thought that the left would bring about a thoroughly decent socialist victory. In fact, thuggish intimidation in many local Labour parties by Militant supporters was driving moderate members away in droves. Many mainstream trade unionists went along with Militant, feeling let down by the Wilson and Callaghan governments. So too did those who were driven by single issues, such as nuclear disarmament.

Shrewd tactics and relentless campaigning enabled a small number of people to control enough local parties and union branches to have a disproportionate effect in Labour conference votes, where the huge, undemocratic block votes of the trades unions no longer backed the leadership. At the 1980 Conference, the left won almost every important vote, utterly undermining Callaghan, who quit as leader two weeks later. Since new leadership election rules would not be in place until a special conference the following January, Labour MPs had one final chance to elect their own leader. Michael Foot, the old radical and intellectual, was persuaded to stand.  Benn would stand no chance against him, especially since he had now allied himself with the Trotskyists who were attacking the MPs. But Foot was a great parliamentarian and was considered to be the only candidate who could beat Denis Healey, by now the villain of the piece for the Labour left.

Healey had already highlighted the fatal flaw in their strategy which was that if they did take over the Labour Party, the country wouldn’t vote for it. Activists, he told them, were different from the vast majority of the British people, for whom politics was something to think about once a year at most. His robust remarks about what would later be called ‘the loony left’ were hardly calculated to maximise his chances, despite his popularity in the country at the time. At any rate, he was eventually beaten by Foot by 139 votes to 129. Many believe that Foot was the man who saved the Labour Party since he was the only leader remotely acceptable to both the old guard and the Bennite insurgents. He took on the job out of a sense of duty, with his old-style platform oratory. He was always an unlikely figure to topple Margaret Thatcher, the ‘Iron Lady’. It was the last blast of romantic intellectual socialism against the free market.

At the special party conference, Labour’s rules were indeed changed to give the unions forty per cent of the votes for future Labour leaders, the activists in the constituencies thirty per cent, and the MPs only thirty per cent. Labour’s struggle now moved to its next and most decisive stage, with the left in an exuberant mood. It was decided that Benn must challenge Healey for the deputy leadership the following year. This would signal an irreversible move. A Foot-Benn Labour Party would be a fundamentally different one from a party in which Healey continued to have a strong voice. Both sides saw it as the final battle and ‘Benn for Deputy’ badges began to appear everywhere. Benn went campaigning around the country with verve and relentless energy. I heard him speak impressively at the Brangwyn Hall in Swansea, though his analysis of the problems in the British economy was far stronger than the solutions he proposed. At public meetings, Healey was booed and heckled and spat at. The intimidation of anyone who would not back Benn was getting worse, though Benn himself was apparently unaware of what was being said and done in his name. Neil Kinnock eventually decided that he would support neither Benn nor Healey, announcing his decision in Tribune. As education spokesman, he had been gradually moving away from the hard left, while continuing to support his neighbouring south Wales and fellow-Bevanite MP and now party leader, Michael Foot. Popular in the party, he was regarded with increasing suspicion by Tony Benn. But this open break with the left’s ‘champion’ shocked many of his friends. At the Brighton conference, Benn was narrowly beaten by Healey, by less than one per cent of the votes. Neil Kinnock and Arthur Scargill clashed angrily on television, and a young Jeremy Corbyn openly called for the mandatory deselection of Tribune MPs who had refused to back Benn.


This next phase was too much for those who were already planning to break away and form a new party. Roy Jenkins had already mooted the idea before the Bennite revolt, as he contemplated the state of the British party system from his offices in Brussels, where he was President of the European Commission. He argued that the Victorian two-party system was out-dated and that coalition government was not such a bad thing. It was time, he said, to strengthen the ‘radical centre’ and find a way through the economic challenges which accepted the free market but also took unemployment seriously. Although he was in touch with David Steel, the Liberal leader, and was close to Liberal thinking, he judged that only a new party would give British politics the new dimension it needed. He began holding lunches for his old friends on the right of the Labour Party, including Bill Rodgers, still a shadow cabinet member, and Shirley Williams, who had lost her seat but who remained one of the best-liked politicians in the country. At this stage, the public reaction from Labour MPs was discouraging. Williams herself had said that a new centre party would have no roots, no principles, no philosophy and no values. David Owen, the young doctor and former Foreign Secretary, who was now fighting against unilateral nuclear disarmament, said Labour moderates must stay in the party and fight even if it took ten or twenty years.

The Bennite revolt changed many minds, however. After the Wembley conference, at which Owen was booed for his views on defence, he, Jenkins, Williams and Rodgers issued the ‘Limehouse Declaration’, describing Wembley as ‘calamitous’ and calling for a new start in British politics. Two months later, in March 1981, this was formalised as the ‘Social Democratic Party’ (SDP). In total thirteen Labour MPs defected to it and many more might have done so had not Roy Hattersley and others fought very hard to persuade them not to. Within two weeks, twenty-four thousand messages of support had flooded in and peers, journalists, students, academics and others were keen to join. Public meetings were packed from Scotland to the south coast of England, and media coverage was extensive and positive. In September an electoral pact was agreed with the Liberal Party, and ‘the Alliance’ was formed.

After running the Labour Party close in the Warrington by-election, the SDP won their first seat when Shirley Williams took Crosby from the Conservatives in November, with nearly half the votes cast, followed by Jenkins winning Glasgow Hillhead from the Tories the following year. The Hillhead victory allowed Jenkins to become the leader of the party in the Commons, but David Owen had always believed that the leadership was more rightly his and feared that Jenkins was leading the SDP towards a merger with the Liberals. Owen saw himself still as a socialist, although of a new kind. By the early eighties, the Liberal Party was led by Steel, ‘the boy David’, who was looking for a route back from the Thorpe scandal to the centre ground. The alliance with the SDP provided this, but Owen was not alone in despising the Liberals, and the eventual merger between the two parties was bitter and difficult. Nevertheless, the initial upsurge in the SDP’s support shook both the Labour Party and the Conservatives, and by the early spring of 1982 the SDP and Liberals could look forward with some confidence to breaking the mould of British politics.

The Falklands ‘Escapade’:

One of the many ironies of the Thatcher story is that she was rescued from the political consequences of her monetarism by the blunders of her hated Foreign Office. In the great economic storms of 1979-81, and in the European budget battle, she had simply charged ahead, ignoring all the flapping around her in pursuit of a single goal. In the South Atlantic, she would do exactly the same and, with her good luck, she was vindicated. Militarily, it could so easily have all gone wrong: the Falklands War could have been a terrible disaster, confirming the Argentinian dictatorship in power in the South Atlantic and ending Margaret Thatcher’s career after just one term as Prime Minister. Of all the gambles in modern British politics, the sending of a task force of ships from the shrunken and underfunded Royal Navy eight thousand miles away to take a group of islands by force was one of the most extreme.

On both sides, the conflict derived from colonial quarrels, dating back to 1833, when the scattering of islands had been declared a British colony. In Buenos Aires, a newly installed ‘junta’ under General Leopoldo Galtieri was heavily dependent on the Argentine navy, itself passionately keen on taking over the islands, known in Argentina as the Malvinas. The following year would see the 150th anniversary of ‘British ownership’, which the Argentines feared would be used to reassert the Falklands’ British future. The junta misread Whitehall’s lack of policy for lack of interest and concluded that an invasion would be easy, popular and impossible to reverse. In March an Argentine ship ‘tested the waters’ by landing on South Georgia, a small dependency south of the Falklands, disembarking scrap-metal dealers. Then on 2 April, the main invasion began, a landing by Argentine troops which had been carefully prepared for by local representatives of the national airline. In three hours it was all over, and the eighty British marines surrendered, having killed five Argentine troops and injured seventeen with no losses of their own. In London, there was mayhem. Thatcher had had a few hours’ warning of what was happening from the Defence Secretary, John Nott. At a hurried meeting in her Commons office, the First Sea Lord, Admiral Sir Henry Leach, gave her clarity and hope when her ministers were as confused as she was. He told her he could assemble a task-force of destroyers, frigates and landing craft, led by Britain’s two remaining aircraft carriers. It could be ready to sail within forty-eight hours and the islands could be retaken by force. She told him to go ahead. Soon after, the Foreign Secretary, Peter Carrington, tendered his resignation, accepting responsibility for the Foreign Office’s failings.

But Margaret Thatcher was confronted by a moral question which she could not duck: many healthy young men were likely to die or be horribly injured in order to defend the ‘sovereignty’ of the Falkland Islanders. In the end, almost a thousand did die, one for every two islanders, and many others were maimed and psychologically wrecked. She argued that the whole structure of national identity and international law was at stake. Michael Foot, who had been bellicose in parliament at first, harking back to the appeasement of fascism in the thirties, later urged her to find a diplomatic answer. She insisted afterwards that she was vividly aware of the blood-price that was waiting and not at all consumed by lust for conflict. Thatcher had believed from the start that to cave in would finish her. The press, like the Conservative Party itself, were seething about the original diplomatic blunders. As it happened, the Argentine junta, even more belligerent, ensured that a serious deal was never properly put. They simply insisted that the British task-force be withdrawn from the entire area, that Argentine representatives should take part in any interim administration, and that if talks failed Britain would simply lose sovereignty. The reality, though, was that their political position was even weaker than hers. She established a small war cabinet, and the task-force, now up to twenty vessels strong, was steadily reinforced. Eventually, it comprised more than a hundred ships and 25,000 men. The world was both transfixed and bemused.


Above: Royal Marines march towards Port Stanley during the Falklands War.

The Empire struck back, and by the end of the month South Georgia was recaptured and a large number of Argentine prisoners taken: Thatcher urged questioning journalists outside Number Ten simply to ‘rejoice, rejoice!’ Then came one of the most controversial episodes in the short war. A British submarine, HMS Conqueror, was following the ageing but heavily armed cruiser, the Belgrano. The British task-force was exposed and feared a pincer movement, although the Belgrano was later found to have been outside an exclusion zone announced in London, and steaming away from the fleet. With her military commanders at Chequers, Thatcher authorised the submarine attack. The Belgrano was sunk, with the loss of 321 sailors. The Sun newspaper carried the headline ‘Gotcha!’ Soon afterwards, a British destroyer, HMS Sheffield, was hit by an Argentine Exocet missile and later sank. Twenty died.


On 18 May 1982, the war cabinet agreed that landings on the Falklands should go ahead, despite the lack of full air cover and worsening weather. By landing at the unexpected bay of San Carlos in low cloud, British troops got ashore in large numbers. Heavy Argentine air attacks, however, took a serious toll. Two frigates were badly damaged, another was sunk, then another, then a destroyer, then a container ship with vital supplies. Nevertheless, three thousand British troops secured a beach-head and began to fight their way inland. Over the next few weeks, they captured the settlements of Goose Green and Darwin, killing 250 Argentine soldiers and capturing 1,400 for the loss of twenty British lives. Colonel ‘H’ Jones became the first celebrated hero of the conflict when he died leading ‘2 Para’ against heavy Argentine fire. The battle then moved to the tiny capital, Port Stanley, or rather to the circle of hills around it where the Argentine army was dug in. Before the final assault, on 8 June, two British landing ships, Sir Tristram and Sir Galahad, were hit by missiles and the Welsh Guards suffered dreadful losses, many of the survivors being badly burned. Simon Weston was one of them. Out of his platoon of 30 men, 22 were killed. The Welsh Guards lost a total of 48 men killed and 97 wounded aboard the Sir Galahad. Weston survived with 46% burns, following which his face was barely recognisable. He later became a well-known spokesman and charity-worker for his fellow injured and disabled veterans. He recalled:

My first encounter with a really low point was when they wheeled me into the transit hospital at RAF Lyneham and I passed my mother in the corridor and she said to my gran, “Oh mam, look at that poor boy” and I cried out “Mam, it’s me!” As she recognised my voice her face turned to stone.


Simon Weston in 2008

The Falklands Factor and the 1983 Election:

The trauma of the Falklands War broke across Britain, nowhere more strongly than in Wales. The impact on Wales was direct, in the disaster to the Welsh Guards at Bluff Cove and in anxieties over the Welsh communities in Patagonia in Argentina. Plaid Cymru was the only mainstream party to oppose the war totally from the beginning, and it evoked a strong response among artists in Wales. Students from the Welsh College of Music and Drama in Cardiff staged a satirical drama on the war which won many plaudits. They portrayed the war as a mere butchery for a meaningless prize. Veteran Labour MP Tam Dalyell hounded the Prime Minister with parliamentary questions as he sought to prove that the sailors on the Belgrano had been killed to keep the war going, not for reasons of military necessity. One of the few memorable moments of the 1983 election campaign came when Mrs Thatcher was challenged on television about the incident by a woman who seemed a match for her. Among the Labour leadership, Denis Healey accused her of glorying in slaughter, and Neil Kinnock got into trouble when, responding to a heckler who said that at least Margaret Thatcher had guts, he replied that it was a pity that other people had had to leave theirs on Goose Green to prove it. But there had also been those on the left who supported the war, together with Michael Foot, because of their opposition to the Argentine dictatorship, and there is little doubt that it gave a similar impetus to British patriotism across the political spectrum. It also bolstered a more narrow nationalism, jingoism and chauvinism both in the Conservative party and in the media.

For millions, the Falklands War seemed a complete anachronism, a Victorian gunboat war in a nuclear age, but for millions more still it served as a wholly unexpected and almost mythic symbol of rebirth. Margaret Thatcher herself lost no time in telling the whole country what she thought the war meant. It was more than simply a triumph of ‘freedom and democracy’ over Argentinian dictatorship. Speaking at Cheltenham racecourse in early July, she said:

We have ceased to be a nation in retreat. We have instead a newfound confidence, born in the economic battles at home and found true eight thousand miles away … Printing money is no more. Rightly this government has abjured it. Increasingly the nation won’t have it … That too is part of the Falklands factor. … Britain found herself again in the South Atlantic and will not look back from the victory she has won. 

Of course, the Falklands War fitted into Margaret Thatcher’s personal narrative and merged into a wider sense that confrontation was required in the country’s public life and politics. The Provisional IRA had assassinated Lord Mountbatten on his boat off the Sligo coast in 1979, and the mainland bombing campaign went on with attacks on the Chelsea barracks, then the Hyde Park bombings, when eight people were killed and fifty-three injured. In Northern Ireland itself, from the spring of 1981, a hideous IRA hunger-strike had been going on, leading to the death of Bobby Sands and nine others. Thatcher called Sands a convicted criminal who chose to take his own life. It was a choice, she added, that the PIRA did not allow to any of its victims. She was utterly determined not to flinch and was as rock-hard as her ruthless Irish republican enemies.


Thatcher was now becoming a vividly divisive figure. On one side were those who felt they, at last, had their Boudicca, a warrior queen for hard times. On the other were those who saw her as a dangerous and bloodthirsty figure, driven by an inhumane worldview. To the cartoonists of the right-wing press, she was the embodiment of Britannia, surrounded by cringing ‘wets’. To others, she was simply mad, with a sharply curved vulture’s beak nose, staring eyes and rivets in her hair. Gender-confusion was rife. France’s President Mitterrand, who in fact had quite a good relationship with her, summed up the paradox better than any British observer when, after meeting her soon after his own election, he told one of his ministers, She has the eyes of Caligula but the mouth of Marilyn Monroe.

The Falklands War confirmed and underlined these opposing and paradoxical views of Thatcher. She encouraged the government’s think tank, the Central Policy Review Staff, to come up with a paper about the future of public spending. They came up with a manifesto which could be characterised as ‘Margaret Thatcher unconstrained’. They suggested ending state funding of higher education, extending student loans to replace grants, breaking the link between benefits and the cost of living, and replacing the National Health Service with a system of private health insurance, including charges for doctors’ visits and prescriptions. In effect, this represented the end of Attlee’s Welfare State. Although some of these ideas would become widely discussed much later, at the time the prospectus was regarded as ‘bonkers’ by most of those around her. The PM supported it, but ministers who regarded it as potentially her worst mistake since coming to power leaked the CPRS report to the press in order to kill it off. In this they were successful, but the whole episode was an early indication of how Thatcher’s charge-ahead politics could produce disasters as well as triumphs.

The electoral consequences of the Falklands War have been argued about ever since. The government had got inflation down and the economy was at last improving but the overall Conservative record in 1983 was not impressive. The most dramatic de-industrialisation of modern times, with hundreds of recently profitable businesses disappearing forever, had been caused in part by a very high pound boosted by Britain’s new status as an oil producer. Up to this point, unemployment had been seen as a price worth paying in order to control inflation, but the extent of de-manning required by 1983 had been underestimated. Howe’s economic squeeze, involving heavy tax increases and a reduction in public borrowing deflated the economy, reducing demand and employment. In the 1980s, two million manufacturing jobs disappeared, most of them by 1982. Given the shrinking of the country’s industrial base and unemployment at three million, a total tax burden of forty per cent of GDP and public spending at forty-four per cent, there were plenty of targets for competent Opposition politicians to take aim at. In an ordinary election, the state of the economy would have had the governing party in serious trouble, but this was no ordinary election.

After the war, the Conservatives shot into a sudden and dramatic lead in the polls over the two Opposition groupings now ranged against them.  In the 1983 general election, the SDP and the Liberals took nearly a quarter of the popular vote, but the electoral system gave them just twenty-three MPs, only six of them from the SDP, a bitter harvest after the advances made in the by-elections of 1981-2. Labour was beaten into third place in the number of votes cast. This meant that the Conservatives won by a landslide, giving Mrs Thatcher a majority of 144 seats, a Tory buffer which kept them in power until 1997. It would be perverse to deny that the Falklands conflict was crucial, giving Thatcher a story to tell about herself and the country which was simple and vivid and made sense to millions. But there were other factors in play, ones which were present in the political undercurrents of 1981-2 and the divisions within the Labour Party in particular. For one thing, the Labour Party’s Manifesto at the 1983 Election, based on the left-wing Conference decisions of 1980-82, was later considered to be the longest suicide note in history.

The Political and Cultural Landscape of Wales:

In Wales, we had expected that the calamitous effect of the monetarist policies would produce a surge in support for Labour and that the effect of the Falklands factor would not weigh so heavily in the Tories’ favour as elsewhere in Britain. We were wrong. Moreover, we believed that the efforts we had made on the left-wing of the national movement in association with Welsh language activists, libertarian socialist groups, ecological, peace and women’s groups would bring dividends in electoral terms. But, in the Wales of 1983, these remained marginal movements as the country remained, for the most part, locked into the British two-party system. The General Election of 1983 exposed the myth that South Wales, in particular, was still some kind of ‘heartland of Labour’ and continued the trend of 1979 in relocating it within the South of the British political landscape. In Wales as a whole, the Labour vote fell by nearly ten per cent, exceeded only in East Anglia and the South-East of England, and level with London again. The Labour vote in Wales fell by over 178,000, the Tories by 24,000 (1.7 per cent), the great ‘victors’ being the Alliance, whose votes rocketed by over two hundred thousand. This surge did not, however, benefit the third parties in terms of seats, which simply transferred directly from Labour to Conservative.

The Conservatives, with a candidate of Ukrainian descent and strong right-wing views, took the Cardiff West seat of George Thomas, the former Speaker, and swept most of Cardiff. They also took the marginal seat of Bridgend and pressed hard throughout the rural west, almost taking Carmarthen. Michael Foot visited the constituency and held a major rally, during which he spoke powerfully but almost fell off the stage. We canvassed hard on the council estates for the Labour MP, Dr Roger Thomas, managing to hold off both the Tories and Plaid Cymru, in what turned out to be Gwynfor Evans’ last election. Nevertheless, the Tories ended up with thirteen seats out of thirty-eight in Wales. Plaid Cymru, disappointed in the valleys, still managed to hold its green line across the north-west, holding Caernarfon and Merioneth and moving into second place, ahead of Labour, on Anglesey. The Alliance more than doubled the former Liberal poll, reaching twenty-three per cent of the popular vote and coming second in nineteen of the thirty-eight seats. But it won only two seats. Labour’s defeat seemed to be slithering into rout even though it retained more than half the seats, twenty in all. It held on by the skin of its teeth not only to Carmarthen but also to Wrexham, its former stronghold in the north-east. In the fourteen seats which covered its traditional base in the south, one fell to the Conservatives and six became three-way marginals. The SDP-Liberal Alliance came second in ten and, in the Rhondda, won eight thousand votes without even campaigning. The remaining seven constituencies gave Labour over half of their votes. Of the old twenty-thousand-majority seats, only three remained: Rhondda, Merthyr Tydfil and Blaenau Gwent (Ebbw Vale). As Gwyn Williams commented:

They stand like Aneurin Bevan’s memorial stones on the Pound above Tredegar and they are beginning to look like the Stonehenge of Welsh politics.   


Two other ‘events’ of cultural significance took place in Wales in 1983. The first demonstrates how the question of culture in Wales had become caught up with the arguments over language. The language became a badge, the possession of which by learners is a sign of good faith: I was one of them, though I never learnt how to write in Welsh. In 1979, however, I had managed, with the help of friends, to write a speech in ‘Cymraeg Byw’ (Colloquial Welsh) as ‘Cadeirydd’ (‘Chair’) of UCMC (NUS Wales), which I delivered at the National Eisteddfod in Caernarfon. I argued for English-speaking and Welsh-speaking students to come back together throughout Wales in order to defend the country, the University and their colleges, paid for by the ‘pennies’ of miners and quarrymen, from the cut-backs in education which the Tories were bringing in. I was not successful in persuading the Welsh-speaking students from Bangor, who had formed their own separate union in 1977, to form a federal union, like the one which existed in Aberystwyth. But what chance did we have when, four years later, the renowned poet R S Thomas, himself a learner of the language, fulminated at the Eisteddfod that the Welshman or woman who did not try to speak Welsh was, in terms of Wales, an ‘un-person’? His fundamentalism, as Dai Smith called it, demanded that reality, the chaos of uncertainty, be fenced in. R S Thomas, for all the brilliant wonder of his own poetry in English, had:

… turned Wales into ‘an analogy for most people’s experience of living in the twentieth century … a special, spare grammar and vocabulary in which certain statements can be made in no other language’. 


Thomas’ conversion to Welsh language fundamentalism had come rather late in life. In the sixties and seventies, he had remarked that he was rather tired of the themes about nationalism and the decay of the rural structure of Wales and that whereas he used to propagandise on behalf of Welsh Country Life and … the Welsh identity, he felt that he’d wrung that dishcloth dry. In May 1983, the Western Mail had welcomed the poet to Cardiff on the occasion of his seventieth birthday, describing him as a man whose genius found expression in the search for the ancient simplicities of rural Wales. R Gerallt Jones, introducing an evening of celebration at the Sherman Theatre in the capital some days later, acclaimed Thomas as the poet who has expressed the national identity of the Welshman. As Tony Bianchi showed in 1986, Thomas’ work has been used – within the context of a wide range of prescriptive notions concerning the “Welsh heritage” – to condemn most of the Welsh to a marginal existence in which they are permitted only a vicarious identity. That’s what makes R S Thomas’ statement at the 1983 National Eisteddfod so surprising and intriguing.

The second cultural ‘event’ was the publication of an impressionistic but learned survey of Welsh history by the distinguished Welsh novelist Emyr Humphreys. The Taliesin Tradition took as its theme the survival of a continuous Welsh tradition in the face of all contrary odds. He ascribed this to a ‘poetic tradition’ which had invested the native language with the power and authority to sustain ‘national being’. In order to explain the unfolding of Welsh history, however, he welcomed the blurring of history and myth:

The manufacture and proliferation of myth must always be a major creative activity among a people with unnaturally high expectations reduced by historic necessity … In Wales history and myth have always mingled and both have been of equal importance in the struggle for survival. 


For ‘organic nationalists’, like R S Thomas and Emyr Humphreys, history must not only mingle with myth but also have its disciplines submitted to the needs of the nation. Dai Smith pointed out that while this provided acceptable politics for some, it was not good history. The verbal dexterity which it requires, Dai Smith claimed, obscures the reality of Welsh life by emphasising the myths of ‘the murder of the Welsh language’ and of kowtowing to ‘Britishness’ at the expense of ‘Welshness’. On this theme, Gwyn Williams wrote:


Ahead, a country which largely lives by the British state, whose input into it is ten per cent of its gross product, faces a major reconstruction of its public sector … a country whose young people are being dumped like in town and country faces the prospect of a large and growing population which will be considered redundant in a state which is already considering a major reduction in the financial burden of welfare.

Small wonder that some, looking ahead, see nothing but a nightmare vision of a depersonalised Wales which has shrivelled up to a Costa Bureaucratica  in the south and a Costa Geriatrica in the north; in between, sheep, holiday homes burning merrily away and fifty folk museums where there used to be communities.

… What seems to be clear is that a majority of the inhabitants of Wales are choosing a British identity which seems to require the elimination of a Welsh one.

As it happened, Dai Smith was right. The idea that ‘Britishness’ and ‘Welshness’ were mutually exclusive was indeed a myth, and both were able to survive as dual identities into the later eighties and beyond.

Ghost Town – The Case of Coventry, 1979-83:

By the late 1970s, the British motor industry had reached an historic crossroads. Entry into the EEC had coincided with an unusually weak range of British products. Models were either outdated or bedevilled by quality and reliability problems. European manufacturers soon captured nearly forty per cent of the home market. The choices facing British manufacturers varied. Those companies owned by American parents integrated their UK operations with their European counterparts; Ford and General Motors were two successful examples of this strategy. Unfortunately for Coventry, the Chrysler Corporation was experiencing problems in many parts of its ‘empire’ and did not possess the resources necessary for the establishment of a high-volume European operation. British-owned Leyland faced a more complex situation. The company produced both high-volume and specialist products. The Cowley and Longbridge plants which produced high-volume products badly needed investment to keep up with the European companies and the American subsidiaries. The specialist producers, Jaguar, Rover and Triumph, also required a large injection of capital in order to meet the growing competition from such companies as Audi, BMW, Alfa Romeo and the Scandinavian manufacturers. The various schemes devised by Ryder and the National Enterprise Board underlined Leyland’s commitment to the large and medium volume plants. The announcement of the collaborative agreement with Honda in 1979 to produce a new Japanese-designed quality saloon at Canley was seen by many as an end to uncertainty over Leyland’s long-term commitment to Coventry.

The change of government in 1979 soon quashed the cautious optimism that had been present in the local car industry. The Conservative economic strategy of high interest rates overvalued the pound, particularly against the dollar in the USA, the major market for Coventry’s specialist cars. Demand for Coventry models declined rapidly and Leyland management embarked upon a new rationalisation plan. The company’s production was to be concentrated into two plants, at Cowley and Longbridge. Triumph production was transferred to Cowley along with the Rover models produced at Solihull. The Courthouse Green engine plant in Coventry was closed and three of the city’s other car-manufacturing concerns – Alvis, Climax and Jaguar – were sold off to private buyers. Only Jaguar survived the recession. In the first three years of the Thatcher government, the number of Leyland employees in Coventry fell from twenty-seven thousand to just eight thousand. One writer described the effects of Conservative policy on manufacturing industry in these years as turning a process of gentle decline into quickening collapse. The city’s top fifteen manufacturing companies shed thirty-one thousand workers between 1979 and 1982. Well-known names at the base of the pyramid of Coventry’s economic life – Herbert’s, Triumph Motors and Renold’s – simply disappeared.

Even in 1979, before the change in government, unemployment in Coventry stood at just five per cent, exactly the same level as in the early seventies. There was a noticeable rise in youth unemployment towards the end of the decade, but this, as we have seen, was part of a national problem caused mainly by demographic factors. Neither was the election of the Tory government seen as a harbinger of hard times to come. Coventry had prospered reasonably well during previous Tory administrations and even enjoyed boom conditions as a result of the policies of Anthony Barber, Heath’s Chancellor of the Exchequer. Heath had ridden to the rescue of Rolls-Royce when it needed government assistance. Unfortunately, the economic brakes were applied too rapidly for the car industry and monetarist policy quickly cut into it. Redundancy lists and closure notices in the local press became as depressingly regular as the obituary column. The biggest surprise, however, was the lack of protest from the local Labour movement. It was as if all the ominous prophecies of the anti-union editorials which had regularly appeared in the Coventry Evening Telegraph during the industrial unrest of the previous decades were finally being fulfilled.

In any case, it was difficult to devise defensive industrial strategies. Michael Edwardes’ new tough industrial relations programme at British Leyland had seen the removal of Derek Robinson, ‘Red Robbo’, the strongest motor factory union leader, from Longbridge. He also demonstrated, at Speke in Liverpool, that he could and would close factories in the face of trade union opposition. Factory occupations, used to such effect by continental trades unionists, had, thanks to the Meriden Triumph Motorcycle fiasco, no chance of local success. The opposition to closures was also undoubtedly diminished by redundancy payments which in many cases cushioned families from the still unrealised effects of the recession. Young people, especially school-leavers, were the real victims. Coventry’s much-prized craft apprenticeships all but vanished, with only ninety-five apprentices commencing training in 1981. In 1982, only sixteen per cent of sixteen-year-old school leavers found employment. The early 1980s were barren years for Coventry’s youth. Even the success of the local pop group The Specials brought little relief, though for a brief moment the band’s song Ghost Town was a national success, giving vent to the plight of young people throughout the manufacturing towns of the Midlands and the North of England, not to mention Wales. The sombre comparison in the lyrics between boom time and recession expressed an experience that was felt more sharply in Coventry than elsewhere.

For the first time in over a century, Coventry became a net exporter of labour, but unemployment levels still remained stubbornly high. The losses were mainly among the young skilled and technical management sectors, people whom the city could ill afford to lose. Little research and development work was taking place in local industry. Talbot’s research department at Whitley, for example, including many key personnel, was removed to Paris in 1983. The Conservatives promised in 1979 that a restructuring of the economy would be followed by increased investment and employment opportunities, but by 1983 there were very few signs of that promise being fulfilled. Coventry’s peculiar dependence on manufacturing and its historically weak tertiary sector meant that the city was, at that time, a poor location for the so-called ‘high tech’ industries. As a local historian concluded at the time:

Coventry in the mid 1980s displays none of the confidence in the future that was so apparent in the immediate post-war years. The city, which for decades was the natural habitat of the affluent industrial worker, is finding it difficult to adjust to a situation where the local authority and university rank among the largest employers. Coventry’s self-image of progressiveness and modernity has all but vanished. The citizens now largely identify themselves and their environment as part of a depressed Britain. 

This was a sad contrast to the vibrant city of full employment in which my mother had grown up in the thirties and forties, and where she had met and married my father in the early fifties. By the time I returned there as a teacher in 1986, from a former mill town in Lancashire which had recovered from its own decline in the sixties and seventies, Coventry was also beginning to recover; but the shiny new comprehensive schools built thirty years before were already beginning to merge and close as a result of those years of recession, unemployment and outward migration.

Revolution or retro-capitalism?

Thatcher’s government of 1979-83 was not the return of ‘Victorian Val’, a revival of Gladstonian liberalism, nor even of the Palmerstonian gunboat imperialism which it sometimes resembled in its rhetoric. It was more of a reversion to the hard-faced empire of the 1920s when war socialism was energetically dismantled, leaving industries that could survive and profit to do so and those which couldn’t to go to the wall. As in the twenties, resistance to brutal rationalisation through closure or sell-off of uneconomic enterprises, or by wage or job reductions, was eventually to be met by determined opposition in the confrontation of 1984-5 between Thatcher and the NUM, led by Arthur Scargill, a battle comprehensively won by the PM.

The trouble with this ‘retro-capitalism’ masquerading as innovation was that, sixty years after the policy had first been implemented, the regions that were the weaker species in this Darwinian competition were not just suffering from influenza but prostrate with pneumonia. They were now being told to drop dead. These included South Wales, Lancashire, the West Riding, Tyneside and Clydeside. Regions which had risen to extraordinary prosperity as part of the British imperial enterprise were now, finally, being written off as disposable assets in a sale. What interest would the Welsh and Scots, in particular, have in remaining part of Great Britain plc? They were now joined by the very manufacturing areas which had provided respite for millions of migrants from the older industrial regions in the thirties, centres such as Coventry. The euphoria felt by the Conservatives following their unexpected second victory in 1983 disguised the fact that their majority was built at the price of perpetuating a deep rift in Britain’s social geography. Not since Edward I in the thirteenth century had a triumphant England so imposed its rule on the other nations of Britain.

Thatcher’s constituency was not, however, to be found among the engineers of ‘Middle England’ or even the Lincolnshire grocers from whom she hailed, who might have voted for Ted Heath’s ‘Third Way’ Tories. It was overwhelmingly to be found among the well-off middle and professional classes in the south of England, in the Home Counties, or the ‘golden circle’ commuter areas. The distressed northern zones of derelict factories, pits, ports and decrepit terraced houses were left to rot and rust. The solution of her governments, in so far as they had one, was to let the employment market and good old Gladstonian principles of ‘bootstrap’ self-help take care of the problem. People living in areas of massive redundancy amidst collapsing industries ought simply to ‘retrain’ for work in the up-and-coming industries of the future or, in Norman Tebbit’s famous phrase, “get on their bikes” like their grandfathers had done and move to places such as Milton Keynes, Basingstoke or Cambridge, where those opportunities were now clustered. But this vision of ex-welders, or even assembly workers, lining up to use computers was not helped by the absence of publicly funded retraining. And even if such retraining had been available, there was no guarantee of a job at the end of it, no apprenticeship system. The whole point of the computer revolution in industry was to save labour, not to expand it. The new jobs it created could, and would, be taken by the sons and daughters of the industrial workers of the early eighties, but not by those workers themselves.

Finally, the kick-up-the-rear-end effect of the eighties’ Thatcher counter-revolution ran into something she could do little about: the Coronation Street syndrome. Like the residents of the fictional TV soap opera, millions in the old British industrial economy had a deeply ingrained loyalty to the place where they had grown up, gone to school, got married and had their kids; to their extended family with its older generations, to their pub, their parks and hills, to their football or rugby club. In that sense, at least, the post-war social revolution and welfare state had helped to maintain and even develop towns and cities that, for all their ups and downs, their poverty and pain, were real communities. Fewer people were willing to give up on these places than had been the case fifty years earlier, and certainly not on cities like Liverpool, Leeds, Nottingham, Derby and Coventry. But not everything the Thatcher government did was out of tune with social ‘harmony’. The sale of council houses created an owner-occupier class which corresponded to the long passion of the British to be kings and queens of their own little castles, while the nationalised industries were failing to take advantage of enterprise and innovation. Many of these more popular reforms, however, were to come after her confrontation with the miners, and especially in her third term.

Sources:

Gwyn A Williams (1985), When Was Wales? A History of the Welsh. Harmondsworth: Penguin Books.

Dai Smith (1984), Wales! Wales?  Hemel Hempstead: George Allen & Unwin.

Bill Lancaster & Tony Mason (1984), Life & Labour in a Twentieth Century City: The Experience of Coventry. Coventry: University of Warwick Cryfield Press.

Simon Schama (2002), A History of Britain III, 1776-2000: The Fate of Empire.  London: BBC Worldwide.

Andrew Marr (2008), A History of Modern Britain. Basingstoke: Macmillan.

Posted September 26, 2018 by TeamBritanniaHu in Midlands