It’s widely presumed that the English language will become entrenched as the world’s lingua franca and that minority languages will continue to die out. But you don’t really buy into this theory and have argued that new technology might allow minority languages to thrive. I wonder if you could expand on this?
I try to look at things from a historical perspective rather than just what’s happening in this decade or century. I look at the progress of languages over centuries and millennia – my book Empires of the Word starts in 3000 BC and ends in modern times. Each of us only lives two or three generations, so it’s quite difficult for us to get that perspective without really striving for it. When it comes to languages, we tend to be familiar only with the one that we use on a daily basis. When we are also conscious that in the last century or two that language has spread out all over the world, it gives us a very foreshortened perspective. What I’m trying to do is to correct that.
There have been many lingua francas and English, although it is the most widespread that we know of, is a relative latecomer. And we still can’t tell the full form of its life history yet. If you look at a really established lingua franca like Latin, it lasted for one and a half millennia. And just when it was thought that it was on its way out with the collapse of the Western Roman Empire, it got a new lease of life through its association with the Catholic Church. So these things are difficult to predict.
As things stand at the moment, the forces that have put English where it is – which mostly had to do with global economic power, first of the British Empire and then the United States – have come to the point where they are being overtaken. The crucial thing is that what put English where it is today is not going to be a feature in the future, and one has to think how the world is going to react to that. There will be other dominant powers which have non-English languages associated with them, notably the so-called BRIC countries – Brazil, Russia, India and China.
At the same time, we have a crisis among all the “little guys” of the world’s languages, which we seem to be losing at a very fast rate – something like two a month. There are a lot of languages around, perhaps 6,000 or 7,000, but if we continue to lose two a month we are going to see about half the world’s languages disappear in the course of a century. But there is still time for things to change.
One of the things which is changing is the way we are using our languages and, in particular, their involvement with technology. One of the more significant developments in technology at the moment is machine translation and other electronic means of getting access to what’s going on in languages other than your own. These technologies are going to become more significant and are already becoming available to people who own handheld devices. If you combine that technical fact with the undying human preference for using one’s own language, you are going to see people using this technology to avoid having to resort to foreign languages such as English.
That’s the basis of my view of the world. I don’t know exactly how it’s going to pan out, but I think it’s very unlikely that English will expand until it takes over the world.
On this question of minority languages – what do we lose from their disappearance?
There are different ways of looking at it. One is that human languages are repositories of culture and that language tradition is a good way of retaining within a group all things which have been built up as worth knowing. When that language ceases to be transmitted, some of those things will be translated, but inevitably a lot of those things will go. This will affect humanity as a whole because this diversity, which is congealed by the different languages, is not something that we could have predicted. The kind of languages that have been spoken is an interesting fact in itself about human development and the human mind. If we lose the range of languages that we currently have, we are not going to have that particular channel into understanding what human beings have been.
In addition to that, we are losing the possibility of a sense of identity, which goes with each of these languages. It’s harder and harder to retain your identity when you haven’t got this clear marker of a language that keeps a boundary for your community. We are in danger of losing the sheer variety that we have in humanity, which is a source of cultural wealth.
Learning languages comes more easily to some people than others. I believe, at the last count, you have a working knowledge of 26 languages.
This is a rumour! I have certainly worked with that number of languages and got interesting things out of them, but living as I do in southwest England, the opportunity to use different languages on a daily basis is fairly limited. “I read, much of the night, and go south in the winter.” [TS Eliot, The Waste Land]
On the question of technology and translation, the British writer Tim Parks, who lives in Italy, said in his Five Books interview that contemporary Italian fiction writers are often adapting their style to make their work easier to translate. So the prospect of easy translation is in itself problematic for language.
Global communication is taking over these major languages. I’m currently reading some Greek detective thrillers by Petros Markaris and it would be fairly easy to translate them into English because they are in a form that is familiar to us ever since Sherlock Holmes.
The more there is global communication – whether direct or through translation – the more there will be a coming together of the kind of things people are saying in their different languages. You could argue that retaining your own language is some way of putting a brake on this process and thus protecting that diversity. But this tendency towards coalescence is certainly there.
Your first book choice is by Nicholas Evans, an Australian academic and one of the leading figures in language documentation. Please tell us more about Dying Words.
This book is a history of world languages which focuses on the small languages that make up about 96% of all spoken languages but are spoken by only about 4% of the world’s population. It’s language history from the point of view of the “little guy”. Evans gives you an idea of how the world seems when you speak a very small language and how you interact with lots of other groups who also speak very small languages. In some sense, a language, rather than being a mark of the people, becomes a mark of a geographical area where you know that particular group lives.
The book also thinks about how other bigger languages have worked over the centuries and millennia and examines their evolution from the point of view of the small languages. It’s this approach that makes this book so interesting and distinctive.
How is the book structured? I know it includes portraits of individual “last speakers” of languages. Is it a mix of theory and anecdote?
It’s in five parts, but theory and anecdote pervade it all. Each part is structured in a different way. I have talked really just about the first part. The second part points out how many languages there are when they are viewed not locally but globally and how your view of your own society is expressed in the grammar of that society. The third examines how languages are a result of the spread of human beings around the world. The fourth looks at language artists and the fifth at the present dangers confronting language diversity. What it is saying is that you get a very universal view of humanity from looking at these small communities and that the artificially large communities that we have built up over the last couple of millennia distort our view of what we are.
I presume that, given he’s Australian, quite a lot of attention is paid to Aboriginal languages.
He is, and there is. Evans is a great expert particularly on the languages of northern Australia. They turn out to be more diverse than those of the central and southern parts, which all belong to a single family.
Please tell us about your second pick.
The author is a French linguist and historian and this book explores the new view of language which was adopted around the time of the Renaissance. The big figure here is the Spaniard Antonio de Nebrija (1444–1522). He did two things that nowadays don’t seem so extraordinary but represented at the time a completely new view of language. First, he wrote a grammar of his own language, which was Spanish. Up to this time, the only languages that had explicit grammars were Greek and Latin. These grammars had been written more than 1,500 years earlier.
The Greeks first worked out what the structure of language was with nouns and verbs and inflections of various sorts and how the syntax of sentences worked. Then the Romans came along and it was done for Latin too. You might have thought that people would have come from all directions thereafter and done the same for their own languages, but it stopped dead there. The only languages thought worthy of having grammar were Greek and Latin. Even languages like Hebrew, which you’d have thought would have some credibility, were not analysed in this way.
What Nebrija said was that you could do this for any language, and argued that, above all, any language which is the language of an empire ought to have a grammar. He did this for Spanish, but later, as a result of his work, the Spanish missionaries who went out to the Americas started writing grammars of all the languages they encountered there. This was an amazing thing: nobody had ever done this with such a wide variety of languages before, and certainly not with languages they considered to be used by savages.
The other thing that Nebrija did was write a grammar of Latin. That had been done before, but he happened to do it at the time the printing press was taking over the production of books, and so effectively he ended up creating one of the first student textbooks. The idea took root that language could now be learnt from books and that there could be a technology or system of learning languages. This began to be applied all over Europe, where all the major powers wanted to analyse the grammar of their language and provide textbooks for students.
Auroux looks at the process of how this development spread across Europe and the world. This is a very important work and I’m just sorry that this book doesn’t seem to be available in English.
You recently debated with David Crystal about the future of English – he is of the view that English “may find itself in the service of the world community forever”.
David Crystal is a friend of mine. Conveniently he has said in print that English may find itself in the service of mankind forever. When I challenged him he said: “I only said it may”, suggesting that he also thinks it may not. For my attempt to show that the world’s linguistic future may be very diverse, he’s a useful straw man to attack. But he’s an extremely estimable linguist and knows he can’t know the future any more than I can.
Please tell us about The Stories of English.
This is an attempt to provide a standard history of English and it does that very well. It starts with the origins of Anglo-Saxon and moves on to the advent of the printing press and so forth. But what he does in addition is look at how English was not one language or one unified system – it was rather a family of systems that were growing up on the same island. That’s why he calls it “stories” of English rather than “story”.
After each major chapter, he has a sort of counter-chapter, which tells you how the standard story is only one dimension and that different things were going on in different minority communities at every stage of the story. For example, before English got established in this country as a result of invasions by Germanic-speaking peoples from the Continent, we were speaking British, which is a direct ancestor of Welsh. Furthermore, we were dominated for 400 years by Roman soldiers who spoke Latin. Yet the funny thing is that English is surprisingly bare of borrowings from Welsh and the Celtic languages, and indeed of early borrowings from Latin, which one might have expected. Crystal draws attention to this black hole in the early part of the history of English, and makes clear to readers what a paradoxical process created the English language.
He also looks at all the variant forms of English, such as dialects and slang, and at attempts to standardise its general usage.
There have been different assaults on the diversity of English at different times. In the ninth century King Alfred attempted to make the particular dialect that was spoken in Wessex into a literary standard for English. It succeeded for a time but then English literature died away for various historical reasons and a new standard was then established after the age of Chaucer. When printing came in there was another reason to have a standard.
Then as British commerce grew and imposed itself more widely around the world, there was stuff coming into English from all over, and another attempt – associated with the activities of Dr Johnson in the 18th century – was made to provide a central core of what was really English and what was peripheral English. It’s a complex story and the good thing about David Crystal’s books is that he does justice to this complexity without losing the thread.
Your next book, Linguistic Diversity, takes a highly interdisciplinary approach to looking at the origins of language. Tell us more.
Daniel Nettle is an anthropologist of language. Although the publishers say this book is a volume of “Oxford linguistics”, this is like no linguistics that I was ever taught when I was studying for my doctorate in the subject.
What he does is go through language in its different general properties as a human phenomenon, trying to work out how the diversity can be modelled as having come about. He uses computer simulations to show what sort of processes might have produced the type of language diversity that we find. That’s his first major way of looking at diversity and the results he comes up with are comparable with the generally rather dry things anthropologists tend to say about humanity as a whole – for instance, the role of language in social selection.
Where his book comes alive is in the later chapters, where he has a map showing the diversity of language. What it shows is that the thickness of languages on the ground – how many languages there are per unit of area – correlates very well with temperature. Just as there are more species per unit of area at the equator, so there are more languages, more thickly spread, in the equatorial areas than in the rest of the world. So equatorial countries – particularly in Africa and south-east Asia – have far more languages per unit area than anywhere else.
He conjectures some explanations as to why this should be. He says it has to do with the need of human beings to make their living: it is easier to live in a small community in an equatorial area, where things grow easily, than in temperate areas, where you need to trade with other communities or create large-scale empires to survive – processes which cause one language to spread across a large area. So this book gives you a completely different explanation of the patterning of languages in the world. He then combines this geographic view of diversity with some interesting discussion of human history and the discovery and spread of different agricultural practices.
A review I read of Le Ton Beau de Marot described it as “quirky and often exasperating”. Please tell us more about your final pick.
Douglas Hofstadter is famous for being a mathematician who is interested in patterns. In Le Ton Beau de Marot he is taking patterns you find in language and showing how weird and wonderful they are. He is trying to get an idea of how the finite resources there are in each language can somehow express a vast diversity of meaning. He does this through interesting examples. One of the most powerful is when he tries to understand the nature of Russian without learning Russian. What he does is read and compare four different English translations of Eugene Onegin, an extremely structured and witty poem written in Russian by Alexander Pushkin. The translations are similarly structured and witty and the idea is that by comparing the different translations of a given verse of this poem, you can understand what the Russian must be like without actually having to learn it. In one sense it’s nonsense, but in another it’s rather profound.
That sort of insight is the kind of thing that animates this extremely long and varied book. The author interweaves a number of conceits like this, and his other encounters with different languages, with a particular short French love poem written in early modern France by Clément Marot. The book’s chapters are interspersed with a vast number of different translations – mainly into English – of this one poem, showing the diversity that comes out of attempts to translate it.
The title of the book is worthy of the whole: Le Ton Beau de Marot literally means “the beautiful tone of Marot”, but when pronounced it is indistinguishable from “le tombeau de Marot” – “the tomb of Marot”. So in some sense the beauties that have come across from translating Marot’s verse have provided his epitaph.
So, in a nutshell, what’s his point?
Well, the book is in praise of the music of language, so I’m not sure what point there would be other than to show you how powerful language is in expressing human thought within a small compass of means.
Later on in the book he talks about artificially constrained fiction. A number of writers have been interested in doing things like writing a whole book without using the letter “E”, such as Ernest Wright’s Gadsby and Georges Perec’s La Disparition. You might think that’s impossible to do because it constrains you too much, but you can actually do it. You can find all sorts of other constraints and what he discusses is how powerful a constraint you can put on your expressive power.
For example, metre in poetry is a constraint on how you can express yourself, and to some extent that constraint helps you. But does it help you more the more you constrain yourself, or does there come a point where you put too heavy a constraint upon yourself and can no longer express anything that you wanted to say? He’s interested in those types of questions.
One criticism that has been levelled at this book is that his approach is too scientific and that he fails to understand stylistic and imaginative subtleties in literature. Is this fair?
No, it’s not. The point of his book is that there is so much more there in literature. The very fact that he hasn’t said all that needs to be said only reinforces the points he has made. As Oscar Wilde once remarked: “Like all those who attempted to exhaust his subject, he succeeded in only exhausting the reader.”