Wednesday, 20 March 2013

About professional writing and translation in African languages – by Charles Tiayon

Background

By all indications, the development of professional writing, transcription, and most especially professional translation in African languages depends very much on how efficient and competitive writing in such languages is. I just read Kwesi Kwaa Prah’s interview published in Next (cf. also https://web.facebook.com/charles.tiayon/posts/140505359333053 and http://nigeriang.com/entertainment/battle-of-ideas-over-mother-tongues/5591/), and found the following questions and answers particularly thought-provoking, don’t you think?

Q: Is there any need for standardisation with languages like Hausa and Yoruba that have a certain prescribed version that has been written down, that has literature written in it for decades? Do we need any further standardisation?

A: They need to be improved. Take Yoruba for example, it’s full of diacritics. It’s like a forest, it detracts from the reading. Take a newspaper that’s written in Yoruba and the first thing you see is not a language; it’s the forest of diacritics. It’s written as if it’s for a foreigner who wants to read Yoruba, not for somebody who is a born Yoruba.

Q: But the diacritics – the accents – it’s because of the tonality of Yoruba.

A: Many African languages are tonal, you don’t mark all tones. If you ask an English person: R.E.A.D – it can be Read (present tense), it can be Read (past tense), it can be Mr. Read. How do you find out? Contextually.

Q: English is not tonal, French is – and they do mark the accents on the French.

A: Yes, but the French have certain habits which are not necessarily scientific. Not everything they mark is based on science, it’s based on habit. We have to write in a way that is user-friendly for our kids. When you’re publishing a dictionary, you can use all these symbols, to clarify differences and so on. But in a newspaper or a children’s book, if you start using diacritics, you’re confusing the child. He or she will never be a good reader, will not like reading and it’s self-defeating. The people who wrote with all those diacritics wrote to enable English people to learn Yoruba.

In tune with Kwesi Kwaa Prah’s arguments, there is reason to say that, on the face of it, notably both segmentally and supra-segmentally, many script proposals for endogenous African languages are questionable. For one thing, the writing systems do not respect the requirements of user-friendliness (see Fig. 1 below); most importantly, they remain very much like laboratory experiments and samples that are yet to be fully translated into something effectively consumable or usable by the mass of speakers of the languages themselves. As a matter of evidence, some of today’s writing systems for endogenous African languages are concerned more with phonological validity (cf. the General Alphabet of Cameroonian Languages) than with actual graphological and discoursal validity. Yet, while there is no doubt nowadays that some of the most coherent writing systems are those based on a scrupulous phonological study of the language, there is evidence too that devising a writing system purely in a phonetics/phonology laboratory does not suffice to make it valid.
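Prah’s “forest of diacritics” can be made concrete: once a fully tone-marked text is decomposed, the density of combining marks among its letters can be measured mechanically. Below is a minimal sketch in Python, using the standard unicodedata module; the short Yoruba phrase is purely an illustrative sample, not drawn from any corpus.

```python
import unicodedata

def diacritic_density(text: str) -> float:
    """Share of letters that carry a combining mark in a marked-up text."""
    decomposed = unicodedata.normalize("NFD", text)  # split base letters from marks
    marks = sum(1 for ch in decomposed if unicodedata.combining(ch))
    letters = sum(1 for ch in decomposed if ch.isalpha())
    return marks / letters if letters else 0.0

# A short fully tone-marked Yoruba phrase (illustrative sample only).
sample = "Báwo ni ọjà ṣe rí lónìí?"
print(f"{diacritic_density(sample):.0%} of letters carry a mark")
```

On this sample, well over 40% of the letters carry a mark; a newspaper page in an orthography that marks every tone will look correspondingly dense, which is precisely what Prah objects to outside dictionary contexts.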

Besides, respect for phonological principles does not necessarily imply the use of phonetic symbols in the writing system. In other words, a writing system can reflect a rigorous phonological study of the language without resorting to phonetic symbols in the orthography (every so often, under the influence of the General Alphabet of Cameroonian Languages for example, phonological transcription is confused with orthographical transcription when it comes to writing endogenous languages). Nor is the use of phonetic symbols in some of the writing proposals always a guarantee of scientific rigour in the phonological study of the language concerned.

Over and beyond the issues of phonological validity, there is need to ensure that the writing systems can pass the acceptability test. This seems to be a fundamental condition for an effective shift from the currently observed “elitisation” to an effective “massification” of written communication and text production, a shift from simple linguistic descriptions to mass production of content, i.e. a shift from deceptive to “genuine literacy” (http://www.scoop.it/t/translation-world/p/4050343284/2015/08/28/solve-literacy-digital-literacy-will-follow?hash=96224097-26ab-4dd5-8a2c-b9124e79386b; http://www.huffingtonpost.com/brij-kothari/solve-literacy-digital-li_b_8052694.html). This shift implies writers’ ability to use all modern text formats as well as all writing tools. It also implies writers’ ability to conveniently penetrate the entire writing sphere, from handwriting, longhand and shorthand to all e-writing interfaces, including the internet, social networks online or offline, text messaging, and interactive multimedia tools such as radio and television (cf. http://rising.globalvoicesonline.org/blog/2011/07/27/languages-tweeting-in-chichewa-in-southern-africa/; http://www.nytimes.com/2011/12/11/magazine/everyone-speaks-text-message.html; http://www.bootheando.com/2012/01/10/los-interpretes-en-las-redes-sociales/).

Relevant questions and foci

Obviously, the issue is no longer just whether endogenous African languages can be written, easily taught or easily learned. Indeed, there is evidence that just about anything (any language, any symbol) can be written, whether by hand or with mechanical or electronic devices. Similarly, if exogenous languages with some of the most cognitively demanding writing systems, like English and French (cf. Kwesi Kwaa Prah’s opening statement above), can be taught, learned and mastered by Africans, then just about any written system can obviously be taught, learned and mastered. Besides, (psychometric) studies have usually shown that African languages, including the marking of tones in these languages, can not only be taught and learned but can also be an advantage to the learner. However, teaching and learning may not always translate into effective use, especially beyond the classroom. In point of fact, far beyond the observation that some of the African language alphabets remain socio-contextually more suitable for tactile than for e-print, it is increasingly evident that the issue is no longer just whether the languages can be written electronically. Rather, the question is whether, compared to what obtains with the official language(s) of the setting, writing such languages is done effectively and efficiently, both in and out of the classroom or laboratory setting. More precisely, the issue is whether the writing process in these languages does (rather than can) scripturally compete everywhere, in terms of effective use, with that of official/dominant (albeit mostly exogenous) languages like Arabic, English and French hic et nunc (here and now, rather than in future). In other words, there is need to find out whether and to what extent the alphabets and writing systems of endogenous African languages, as compared to English and French, have translated or do translate effectively into the production of written content (other than classroom, pedagogical and religious content) in the languages concerned.

From this perspective, it can be argued that an inaccurate or inappropriate translation of the relevant sounds of non-official languages into acceptable alphabets and orthographic symbols may actually contribute to the further endangerment of these languages in the face of the already dominant official language(s). So, over and above the issue of directionality of writing, whereby Arabic is a right-to-left language while the other official languages of the continent are left-to-right languages, there is, strategically, a need for an acceptable degree of script-type convergence and coherence in the way both official and non-official languages of a national setting are written. This seems to be a determining factor in ensuring effective comparative use of both non-official and official languages within the same sociolinguistic setting.

As of now, written endogenous African languages may be classified into two groups. On the one hand, there are languages whose script strictly conforms to that of the official/dominant language of the immediate (national) setting, i.e. the language which tends to drive most of the country’s scriptural technology for the school and administrative systems, the media and all interpersonal communication devices; such a script may be referred to as the convergent script, while the languages themselves may be scripturally described as script convergent. On the other hand, there are languages whose script differs partly or entirely from that of the official/dominant language of the setting/country; this is what may be referred to as the divergent script, and the languages themselves would scripturally qualify as script divergent. This issue of script convergence or divergence is particularly relevant when one bears in mind the requirements of translation. Actually, there is reason to argue that writing in this case is translation, in its functional, communicative or interpretative sense.
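The distinction can be operationalised quite mechanically: a text is script convergent when none of its letters falls outside the character inventory already used for the official language. The sketch below is a hedged illustration in Python; the official-language inventory is a simplified, hypothetical French one, and the sample word is invented rather than taken from any actual orthography.

```python
# Letters available by default for the official language (a simplified,
# hypothetical French inventory; a real one would be more complete).
OFFICIAL_INVENTORY = set("abcdefghijklmnopqrstuvwxyzàâäçéèêëîïôöùûüÿæœ")

def divergent_letters(text: str) -> set:
    """Letters of `text` that fall outside the official-language inventory."""
    return {ch for ch in text.lower() if ch.isalpha() and ch not in OFFICIAL_INVENTORY}

print(divergent_letters("mɛŋgwan"))   # {'ɛ', 'ŋ'} -> script divergent
print(divergent_letters("mengwan"))   # set()      -> script convergent
```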

So, besides the already mentioned user-friendliness and acceptability tests, there is need also to find out whether the writing of African languages can effectively pass the tests of internet visibility and vitality (notably through social networks), interoperability and transferability, and speed and searchability, just to name a few of the parameters which currently prove to be the essential ingredients of competitive writing in general and translation in particular within specific national settings. With regard to internet visibility (cf. Jeff Herring, “The Dos and Dont’s (sic) to Using Article Marketing to Get Online Visibility”, in Mastering the World of Marketing by Erik Taylor and David Riklan, 2011: 55-58), it seems imperative for written African languages to evolve from simple online existence to optimal online presence, from simple online activity and identity to effective online vitality. Similarly, it seems imperative to understand that script usability in electronic media, whether online or offline, may be low or high, just as existence in electronic media is not necessarily a guarantee of interoperability and transferability on the one hand, or of speed and searchability on the other. The present study intends to examine these issues from a generic, macro perspective rather than at a micro level, with the understanding that a further study will investigate the micro-level translation of specific language sounds into orthographic symbols and writing systems.
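Searchability is where heavily marked or divergent scripts fail most quietly: what users type rarely matches what was published. The sketch below (Python; the marked headline and the unmarked query are invented for illustration) shows the standard remedy of folding away combining marks before matching:

```python
import unicodedata

def fold(text: str) -> str:
    """Strip combining marks so search can ignore tone and other diacritics."""
    decomposed = unicodedata.normalize("NFD", text)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch)).lower()

headline = "Ọjà wó lónìí"   # fully marked text (illustrative)
query = "loni"              # what an ordinary user is likely to type

print(query in headline)              # False: the raw query never matches
print(fold(query) in fold(headline))  # True once both sides are folded
```

Unless every search box, database and platform a language passes through performs this kind of folding (and treats precomposed and decomposed character forms consistently), marked text is effectively invisible to its own readers.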

But before taking a closer look at each of these, it may be useful to define what the present paper understands by professional writing and translation. It is hard, however, to discuss the above issues, or professional writing today, without explicit reference to social writing which, as a matter of evidence, increasingly recognises translation as an indispensable means of interpersonal communication in an increasingly globalised world (cf. http://www.danatranslation.com/index.php?option=com_content&view=article&id=122:translation-social-media&catid=29:blog-posts&Itemid=222; http://www.scoop.it/t/translation-world?tag=social+network++tech).

Professional writing in the era of social media

Social writing is a phenomenon that developed mainly with the spread of electronic writing (e-writing). It is related to the use of computer devices in interpersonal communication, including desktop and laptop computers, mobile multimedia telephony and visiophony, etc. Most especially, social writing is generally linked to what has been referred to as social networking and social media, ushered in by internet community networks, including Bebo, Facebook, Google+, Myspace, Tumblr, Twitter, etc. In this paper, however, it is understood to comprise also e-mailing, mail groups, e-forums and e-discussion groups, web blogging, texting or text messaging, as well as crowd-sourcing, to the extent that these are the most common means through which ordinary people (especially young Africans) increasingly socialise and communicate (cf. http://www.nytimes.com/2011/12/11/magazine/everyone-speaks-text-message.html; http://www.rfi.fr/france/20120101-le-milliard-sms-le-nouvel-an-devrait-etre-depasse; http://www.01net.com/editorial/551614/un-record-de-plus-de-un-milliard-de-sms-pour-la-nouvelle-annee/; http://www.lesnumeriques.com/1-milliard-sms-pour-voeux-2012-n22668.html; http://cellphones.lovetoknow.com/Texting_Statistics; http://www.mobiledia.com/news/124688.html). As a matter of observation, the bulk of social writing is done by common rather than professional or expert users of language and can thus be an efficient means of preserving, promoting or reviving endogenous African languages in the face of enormous pressure from exogenous official languages which double as the most widespread languages of the world (cf. http://www.altalang.com/beyond-words/2012/03/26/texting-endangered-languages/; http://www.kasitimes.co.za/opinions/computing-in-mother-tongue/). However, social media are increasingly used by professionals too – notably in collaborative work and the exchange of expert knowledge via intranet or internet channels – thus indicating that such media are simply new platforms for writing, both professional and non-professional.

Through social networking, people can use networks of online friends and group memberships to keep in touch with current friends, reconnect with old friends or create real–life friendships through similar interests or groups. Besides establishing important social relationships, social networking members can share their interests with other like–minded members by joining groups and forums. Some networking can also help members find a job or establish business contacts.

Most social networking websites also offer additional features. In addition to blogs and forums, members can express themselves by designing their profile page to reflect their personality. The most popular extra features include music and video sections. Members can read bios of their favorite music artists from the artist’s profile page as well as listen to their favorite songs and watch music videos. The video section can include everything from member–generated videos from hundreds of subjects to TV clips and movie trailers. (From http://social-networking-websites-review.toptenreviews.com/ 26-04.11)

Actually, resources on social media and networks are fast overtaking those available in professional and/or specialist media, thus obliging practitioners, experts, teachers, students and researchers to increasingly turn to them for information (cf. http://www.bootheando.com/2012/01/10/los-interpretes-en-las-redes-sociales/) and interpersonal communication. LinkedIn is a typical example of a social network which specifically targets professional groups of all kinds. In the meantime, most traditional media, whether or not they are meant for a professional audience, now include social media or social fora interfaces.

Needless to stress that, today, a good many, if not all, translators working into such languages as Arabic, English, French, Portuguese and Spanish increasingly find it more convenient and rewarding to use online and offline electronic sources and references than to resort to hard-copy documents. Anyone translating professionally from/into African languages today would naturally find it far more convenient to use online and offline electronic technology and resources as well, i.e. where relevant content is effectively available in the required format. Thus, the presentation of the internet as a “lifeboat for endangered languages” (http://www.euractiv.com/culture/internet-lifeboat-endangered-lan-news-509285; http://www.endangeredlanguages.com/about/#about_alliance; http://www.paradisec.org.au/blog/2012/06/elar-cracks-a-ton/) is more than mere propaganda today.

With the rise of the Internet, the 21st century could witness a renewal in linguistic diversity, [says] Daniel Prado, a renowned linguist of Franco-Argentine origin. “Some languages can resuscitate, or even be reborn,” […] “There is a new competition between languages,” fostered by a kind of online “prestige”, [says] the renowned linguist. (http://www.euractiv.com/culture/internet-lifeboat-endangered-lan-news-509285)

At this juncture, it may be argued that the penetration rate of the internet and of such networks in African countries is still low. According to world statistics, the June 2010 rate for the entire continent was barely 2.3% on average (see http://www.internetworldstats.com/stats1.htm). Nonetheless, the rate is increasing fast and, in all likelihood, will keep improving, following the same trend as in areas where the 2010 rate of internet penetration (http://www.internetworldstats.com/list4.htm) is estimated at over 90% (e.g. Iceland and Greenland). According to estimates, the continent actually has one of the highest internet growth rates. Until Africans themselves can impose their own technological tools and norms on the rest of the world, any reasonably proactive proposal on the writing of an endogenous African language in the current context needs to factor in this reality upstream, so that the writing of the languages can become effective and efficient with effectively available and widely accessible technology. Otherwise, the proposal is likely to fail, precisely because of the speakers’ inability to use readily available scriptural technology to write their own mother languages. Meanwhile, languages on the margins of the internet will increasingly be overtaken and overshadowed by those with a sustained internet presence.

Compared to social writing, professional writing is understood to mean writing that is done for a living, by professional workers, including creative writers, technical writers and journalists, as well as (monolingual and bilingual) subtitle, caption and teletext specialists, transcriptionists (http://www.transcriptionoutsourcing.org/2012/06/should-you-hire-a-court-reporter-or-a-transcriptionist/) and translators. A professional writer is usually expected to produce written material fast and on demand, irrespective of the devices and operating systems used. For example, as reporters on daily or breaking news, journalists are expected to send in stories almost as the events occur. Similarly, on average, competitive professional translators and translation agencies produce 4,000 to 8,000 words of written text daily (see also http://www.timesherald.com/article/20120102/LIFE01/120109993). For all such writers, nothing can be more distracting, frustrating, discouraging and ultimately demotivating than situations where they cannot use commonly available technology to process and file their work, including word processing, proofreading, editing and file transfer technology. Hence, writing endogenous African languages with the help of computer technology cannot be limited to the effective use of desktop hardware.

As Sridhar (http://www.thehindu.com/news/states/karnataka/article2907796.ece?goback=.gde_1748997_member_96470425) rightly notes with regard to difficulties faced by Indian languages in the era of computer technology and localisation,

The persistence of the digital divide in the era of widespread computing probably poses the biggest challenge to the realisation of the promise that the Internet would offer deliverance to society at large. In simple terms, the problems can be condensed in two sets. The first is the problem of access to computing — not to hardware but in the ability to handle computers in the language you already know. The second set of problems arises from the lack of content, or ‘knowledge’ as stated in more fashionable terms. Basically, access to computers — understood not merely as having a device to use, but as one which people can actually use in the language they know — arises from the ability to make the machine understand what you want it to do, in your own language.

As of now, the massive use of computer and localisation technology in African languages that scripturally depart from the official language(s) of the area is still a long way off. Some of the most commonly used operating systems in the writing of the official language are yet to conveniently factor in (notably through the default keyboard interface) many of the proposals made for divergent script languages. As a result, e-writing a single 250-word page in a script divergent language often takes far longer than any professional writer can sustainably or consistently afford.
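The cost is easy to approximate. The figures in the back-of-envelope sketch below are assumptions chosen for illustration, not measurements: if even a modest share of characters each require a multi-keystroke code or a pick-from-palette operation, the per-page overhead quickly becomes prohibitive for a professional who must file thousands of words a day.

```python
# Back-of-envelope estimate of the typing overhead described above.
# All figures are illustrative assumptions, not measurements.
WORDS_PER_PAGE = 250
CHARS_PER_WORD = 6            # rough average, spaces included
SPECIAL_CHAR_RATE = 0.15      # share of characters absent from the default keyboard
KEYSTROKES_PER_SPECIAL = 4    # e.g. a memorised code or a symbol-palette lookup

base = WORDS_PER_PAGE * CHARS_PER_WORD
extra = base * SPECIAL_CHAR_RATE * (KEYSTROKES_PER_SPECIAL - 1)
print(f"{extra:.0f} extra keystrokes per page ({extra / base:.0%} overhead)")
# -> 675 extra keystrokes per page (45% overhead)
```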

In terms of content creation, some critics may want to cite Bible translation into African languages as a counter-example to this argument. Nevertheless, the translation of the Bible, a text estimated at under one million words (cf. http://www.biblebelievers.com/believers-org/kjv-stats.html and http://blogs.wsj.com/metropolis/2011/12/27/bible-by-hand-copying-king-james-word-for-word/, whereby the King James Version (KJV) is said to comprise 921,820 words), generally takes decades for each project, despite the use of fairly large teams with considerable funding. This would hardly be compatible with the timeline exigencies of professional practice outside Bible translation (cf. http://www.commonenglishbible.com/Connect/Blog/ViewBlog/tabid/209/ArticleId/103/Technology-and-the-Common-English-Bible-translation.aspx; http://news.nationalpost.com/2012/05/13/inuit-language-bible-finished-after-34-year-project/; cf. also Salman Rushdie’s stance on the ‘job’ of writing: http://www.youtube.com/watch?v=yueYFHoYAuY). Meanwhile, basic literacy and life-saving material and news, as well as social and professional writing, remain scarce or nonexistent in these languages (cf. http://onafricajournal.com/2012/04/dying-for-lack-of-knowledge-2/#comment-1523). As a matter of evidence, in many, if not all, of the languages written in a divergent script, the only major written corpus remains the Bible (Old and New Testaments), usually the result of several decades of sustained and painstaking work by a large, adequately funded team. It is unfortunate that, while divergent script African languages into which the Bible has been translated are consistently showcased by some linguists as examples of written endogenous languages to emulate, there is hardly any question as to why real professional translation (with speed, technology and the ability to handle a diversity of relevant subject fields as some of the key factors) remains out of reach in those languages; why many broadcasters and media reporters still have difficulty producing instant written reports in those languages, as professional reporters routinely do in exogenous official languages; why written current news translation (otherwise known in academia as editorial translation) is still hard to come by in those languages; and why even those who have been trained in Bible translation still prefer sight translating homilies and the like.

Translation as (professional) writing and writing as translation

Translation as writing

It is well established through history that translation has usually played a crucial role in the co-development of language writing norms and standards (cf. Woodsworth and Delisle, Translators through History, 1995). For example, the translation of the Bible into English, notably the much-revered King James Version, is widely acknowledged to be a key reference for many English language writing and usage habits since 1611 (see, for example: http://www.thespec.com/opinion/columns/article/858190–the-bible-permeates-our-language; http://www.kansascity.com/2011/04/24/2821651/the-king-james; http://www.oregonlive.com/living/index.ssf; http://www.bible-researcher.com/mcafee4.html; http://www.guardian.co.uk/books). Besides, says Hardy (2012),

The words and phrases are familiar to everyone […]: “Lead us not into temptation but deliver us from evil;” “Seek and you shall find;” “Judge not that you not be judged;” “My brother’s keeper.” If you think these are from the venerable King James Version of the Bible, you are half right. The words are there, but as in so many cases, the KJV translators kept the words of translation previously written by William Tyndale. Tyndale also created the neologisms (well, they were “neo” back then) “Passover,” “beautiful,” and “Jehovah.” He invented “network,” too. In fact, most of the KJV’s words can be traced to Tyndale, and the tone, cadence, and idiom of the book are his, too. In fact, a case could be made that Tyndale was forging English into a language for written literature. (Read more: http://www.cdispatch.com/robhardy/article.asp?aid=15839#ixzz1nwQiPDpk)

Needless to mention the impact of (covertly and overtly) translated classical Greek and Latin literature on the construction and development of literary writing in English, French, Portuguese, Spanish and most other Western languages (cf. http://www.helleniccomserve.com/classical_greek_influence.html). And far beyond shaping the larger units of written language such as vocabulary, syntax and texts, such translations have contributed significantly to developing, operationalising and stabilising the writing systems (Woodsworth and Delisle, ibid.) of all Western European languages, some of which, incidentally, are now used throughout Africa and on all other continents. However, unlike what English gained from the KJV, the story of the impact of Bible translation may be slightly different in Africa because, despite the undeniable impact of Bible literature on target African languages (an impact which certainly deserves further investigation), most of these languages have remained essentially in spoken mode outside Christian gatherings.

Rare exceptions to this trend do exist though, as Noss and Renju (2007: 42) indicate in the following statement:

Mwalimu Nyerere’s long and distinguished political career was framed by his two major literary contributions to the Swahili language that he championed as the language of the nation. These publications were his translations of William Shakespeare and of Holy Scripture into Kiswahili. In 1963, shortly after he became President, Oxford University Press published Juliasi Kaizari, his translation of Shakespeare’s Julius Caesar in Kiswahili. Six years later, in 1969, it published Mabepari wa Venisi, his translation of The Merchant of Venice. In 1996, a decade after he left office, a Benedictine publishing house in Tanzania published five volumes of Swahili poetry under the name of Julius K. Nyerere. (http://www.sil.org/siljot/2007/1/49048/siljot2007-1-04.pdf)

Nyerere’s example is surely evidence that African languages have the same potential as any well-known world language to express the finest literary creation. Yet this example is still to be systematically emulated across the continent.

Writing is translation

What is at stake here is not just interlingual and intralingual translation, as in Bible translation, but also translation as writing and writing as translation, with particular focus on graphological or orthographical transcription (cf. http://www.business2community.com/strategy/transcription-in-the-workforce-0203255; http://www.transcriptionoutsourcing.org/2012/06/should-you-hire-a-court-reporter-or-a-transcriptionist/). Coupled with evidence of increased expectations of transcription competences and skills from professional translators, there is total agreement here with proponents of the argument that all “writing is translation” (cf. http://www.mtexpress.com/). More precisely, it is ultimately thought that hitherto unwritten languages of Africa provide a good testing ground for yet another dimension of Becka Mara McKay’s claim that “all writers should be translators and all translators should be writers” (http://wordswithoutborders.org/dispatches/article/teaching-in-translation-translation-in-the-workshop#ixzz1qKWnEB2K). This applies both to the creation of complete written messages/texts, especially during the earlier stages of fully-fledged text writing in any language, and to the creation of writing symbols and writing systems for a language.

Perhaps with the exception, to an extent, of Catford’s seminal A Linguistic Theory of Translation (1965), which gives some consideration to “graphological translation”, very few translation scholars actually find the topic relevant. True, Catford seemingly has more critics than supporters. In fact, the resurgence of audiovisual modes of interaction notwithstanding, most scholars tend to operate with language and translation theories that are based exclusively on the written tradition (cf. Per Linell 2005, The Written Language Bias in Linguistics: Its Nature, Origins and Transformations, London, where he posits: “Traditionally linguists have been occupied with written language to a much greater extent than with spoken language. Even today, when much lip service is being paid to the need to study spoken language, we approach language with a conceptual apparatus which shows numerous signs of being derived from this tradition.”). But as Tiayon (2001, “Patterns of Phonological Translation and Transliteration from English into Mbafeung”, in Epasa Moto) argues, transliteration into hitherto unwritten African languages cannot just be taken for granted, nor thought of as simple transcoding with predefined rules for moving from source to target language (as is conventionally the case across many Western languages). The writing difficulties faced by users of languages with divergent scripts become even more obvious with the first attempts by institutions like the Advanced School of Translators and Interpreters (ASTI), University of Buea, to introduce professional translation as well as translation studies from/into African languages alongside such purely exogenous language combinations as English-French, French-English, Spanish-French, etc. From such experiences, there is hardly any competitive advantage yet in translating into endogenous African languages with divergent scripts.

Furthermore, translators who also happen to be pioneer writers in a language (e.g. Alfred Saker for the Duala language of Cameroon) often play an essential role in devising the language’s alphabet and writing system (cf. also Woodsworth and Delisle). In such situations, there is often far more phonological and graphological translation than is usually acknowledged, especially in instances of formal borrowing and proper noun transcription/transliteration (cf. http://www.atlantico.fr/atlantico-light/jan-marc-ayrault-embarrasse-chaines-arabophones-362226.html; http://www.montrealgazette.com/matter+Lang+Lang+superlative/6646686/story.html). From this perspective, it should be noted that, in general (i.e. if one observes practice around the world), moving from the identified phonological units of a language to an appropriate orthographic representation of such units is not just a one-to-one correspondence of phonetic and orthographical symbols. Such movement actually implies translating purely phonological symbols into orthographically acceptable and receivable characters. As of now, some of the general alphabet proposals for African languages remain, in many regards, closer to mere phonological transcription or transcoding than to consistent orthographic transcription. Consequently, under the influence of the General Alphabet of Cameroonian Languages, for example, there remains an unfortunate confusion between IPA-based phonetic/phonological transcription and the actual orthographic transcription of endogenous Cameroonian languages. In many cases, the use of purely International Phonetic Alphabet symbols renders the implementation of the General Alphabet extremely difficult in a context where English and French remain the dominant languages of official business, administration, school instruction and even of informal interethnic or intraethnic communication. Conversely, an effective translation (in the real sense) of such symbols into purely Latin scripts has been possible in comparable settings and has made implementation far easier: cf. Kinyarwanda, Kirundi, Kiswahili, Luganda, and all the South African languages mentioned in the subheading below on user-friendliness and internet visibility.
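What such a “translation” looks like in practice can be sketched with a toy phoneme-to-grapheme table. The mapping below is hypothetical, merely in the spirit of Swahili-type orthographies rather than any official standard: IPA-derived phonological symbols are rendered with plain Latin letters and digraphs that sit on a default keyboard.

```python
# A hypothetical phoneme-to-grapheme table: IPA-based phonological symbols
# "translated" into plain Latin graphemes, as Swahili-type orthographies do.
# Illustrative only; not an official orthography.
PHONEME_TO_GRAPHEME = {
    "tʃ": "ch",   # affricate rendered as a digraph, as in Swahili chai
    "ʃ": "sh",    # as in Swahili shamba
    "ŋ": "ng'",   # velar nasal, as in Swahili ng'ombe
    "ɛ": "e",     # open-mid front vowel collapsed onto plain e
    "ɔ": "o",     # open-mid back vowel collapsed onto plain o
}

def to_orthography(phonemic: str) -> str:
    """Translate a phonological transcription into convergent Latin script."""
    # Replace longer symbols first so digraph phonemes are not split up.
    for ipa in sorted(PHONEME_TO_GRAPHEME, key=len, reverse=True):
        phonemic = phonemic.replace(ipa, PHONEME_TO_GRAPHEME[ipa])
    return phonemic

print(to_orthography("ŋɔmbɛ"))   # -> ng'ombe
```

The price, as the next paragraph notes, is that the mapping is not one-to-one: distinct phonemes may collapse onto one letter, and the writing rules or context must carry the difference.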

There is no gainsaying the evidence that adopting the Arabic or Latin script as appropriate for the writing of an endogenous African language will definitely imply that there will hardly be a systematic one-to-one correspondence between the identified phonemes of the language and individual orthographic symbols. But, in the short term, that may well be the price to pay in order to ensure an acceptable degree of scriptural coherence between the dominant, though still mostly exogenous, official languages of Africa and endogenous African languages (most of which are actually endangered: see http://www.kasitimes.co.za/opinions/computing-in-mother-tongue/). In other words, that may be the price to pay to ensure the effective presence of non-official endogenous African languages on ALL writing platforms, simultaneously with and to the same extent as partner official languages. It should be stressed that the translation of phonetic symbols into orthographic signs is a universal practice in phonographic writing systems. This therefore applies not just to the Latin and Arabic orthographic signs which are widely used in Africa but to other signs too: Cyrillic, Bamum, Vai, N’ko, etc. However, in general, any such translation is effective and efficient only if and when it proves to be socio-contextually consistent and therefore practical.

So, while Latin script translations of the relevant phonemes may be convenient in countries with Latin script official languages (e.g. English, French, Spanish, Portuguese), Arabic script translations would be more convenient in countries with Arabic script official languages. On these grounds, ISESCO’s project (http://www.isesco.org.ma/english/publications/Marabic/P2.php; http://www.isesco.org.ma/english/news/news.php?id=1725; http://www.isesco.org.ma/english/news/news.php?id=1756; http://www.isesco.org.ma/francais/publications/AvlngArabe/P2.php; http://www.isesco.org.ma/francais/news/news.php?id=1665) to help standardise the orthographic transcription of African languages in Arabic script may not be unjustified in Northern African countries and other countries where Arabic is the official language. By way of comparison, the endogenous languages of China would be scripturally more realistic, competitive and efficient if written in ideographic characters rather than in Latin, Cyrillic or any other script. And contrary to what may be thought, the use of the same orthographical symbols by different languages would not in any way threaten the internal (structural) identity of each of the languages, especially if the differences are properly identified and codified through consistent writing rules. Put differently, all languages of the world could very conveniently be written in a single script type, be it the Cyrillic script, the Arabic script, or even IPA, provided the rules of writing conform to the internal structure and identity of each language.

Actually, one would have thought that the first adoption of the IPA language sound charts in the late nineteenth century (1888: see http://www.voices.com/articles/languages-accents-and-dialects/international-phonetic-alphabet.html) would have caused hitherto written languages to revise their writing systems in the direction of the newly established conventions of phonological transcription which, presumably, are universally applicable. But well over a century later, the initial habits taken up by users of the oldest written languages remain difficult to change. Rather, generation after generation of users has continued to defend earlier writing systems as part of a cultural convention which they must preserve.

Anyhow, it is often through fully-fledged content creation via professional translation projects (usually one of the best ways of creating content in hitherto unwritten languages) that proposed writing systems are initially tested: whether the writing signs and rules are effectively complete and consistent, and whether they are acceptable to the mass of the target native users themselves. This is especially so when it comes to actual writing and reading practice (cf. Wiesemann’s (1987: 77) recommendation that language standardisation should, as a matter of topmost priority, target native users of the language). African language writing systems may need to be reassessed with these criteria in mind.

Testing user-friendliness and internet visibility and vitality of writing (including translation) in African languages

How do proposals of writing systems for African languages contextually accommodate present-day requirements of social and professional writing, including web blogging, social networking, crowd-sourcing and crowd translation? Obviously, there have been constant efforts to integrate African languages into the e-writing world, as can be observed in such crowd-sourced ventures as Wikipedia, where a 2012 ranking features the following African languages, classified in decreasing order of the number of articles published in them:

This may look impressive. However, it must be realised that Africa has well over 2,000 languages and that most of them are, arguably, both underrepresented (http://rising.globalvoicesonline.org/blog/2011/01/11/rising-voices-seeks-micro-grant-proposals-for-citizen-media-outreach-2011/) and endangered (http://www.kasitimes.co.za/opinions/computing-in-mother-tongue/). Besides, as indicated before, the issue is no longer just whether African languages can be written electronically. It also depends on whether the writing of the languages does (rather than can) effectively compete scripturally with the official language of the setting, throughout the entire writing sphere. Most importantly, it depends also on whether the writing of the language is done by everyday users in formal as well as less formal settings, rather than by a limited group of speakers, language specialists and professional IT geeks who are ready and willing to go through otherwise cumbersome procedures to get their write-ups accepted for publication, as is currently the case with most wikis and localisation sites. Moreover, while wikis in most script-convergent languages are, more often than not, crowdsourced from an unpredictable number of language users, such crowdsourcing is far from effective when it comes to script-divergent African languages.

User friendliness and internet visibility and vitality of script convergent languages

Today, user-friendly writing systems depend on many criteria, including usability which, in itself, is a determining factor for language visibility and vitality. In terms of usability (see http://www.usabilityfirst.com/ for relevant details) and visibility, the writing of African languages should not be contextually constrained by anyone or anything. If Africans had their way, they would like to be able to write their languages to the same extent as they write the country’s official language, without the burden of looking for special characters beyond those available for the official language or having to switch keyboards altogether (e.g. http://www.abeokuta.org/yoruba/?page_id=235). To say the least, it is actually paradoxical and ultimately unfair that, in contemporary Africa, the writing of endogenous African languages should require the use of special characters where the writing of their exogenous counterparts requires no such characters. The result is that on both professional and social writing networks (including texting) run by grassroots speakers of endogenous African languages themselves, with texts in rather than about the language, the vitality of convergent script languages tends to be far greater than that of languages with divergent scripts. The first group would include the following:

Thus, arguments in favour of a reference alphabet for the (phonological) transcription of African languages – including the 1928 proposal for an “Africa Alphabet”, also known as the International African Alphabet or IAI alphabet (see http://maps.thefullwiki.org/Africa_Alphabet), as well as various other proposals since the 1960s (e.g. the African Reference Alphabet at http://maps.thefullwiki.org/African_reference_alphabet; http://www.bisharat.net/Documents/Niamey78annex.htm) – need not be misconstrued as implying that the Latin script may not be adapted to orthographical transcription in those languages. The systematic use of the script in the writing of the above languages has proven rather effective over the years since the adoption of exogenous languages like English, French, Portuguese and Spanish as official languages in Africa. Originally, Swahili (or Kiswahili) was written in Arabic script, in conformity with the dominant script imposed by twelve centuries of contact and trade with the Arabic world (for more information, see http://en.wikipedia.org/wiki/Swahili_language). Since the nineteenth century, the language has conventionally been written in Latin script, under the influence of English, now the dominant official language of Tanzania, the native seat of Kiswahili. The language is currently widely used both as a spoken and a written lingua franca by over 150 million people across East, Central and Southern Africa. The writing systems of most other East and Southern African languages (cf. the languages listed above) similarly use the Latin script, like that of Swahili, and are equally doing relatively well as written languages alongside English and French, the main official languages in these regions. The designers of the relevant writing systems therefore deserve some kudos. Through their wisdom, they have, as a matter of evidence, given these languages a truly competitive advantage in the face of enormous pressure from English or French. In order to better assess this, suffice it to compare the use of Kiswahili, Kinyarwanda/Kirundi (http://www.facebook.com/pages/Kamonyi-District/254985754516540; https://twitter.com/#!/@kamonyi) and other convergent script languages on popular social networks and through SMS with that of divergent script languages, no matter how great or little the divergence. Here, the tendency among ordinary users to fall back on more convergent characters is an indicator which linguists and language planners need to take more seriously. E.g.:

In this regard, Wikipedia’s classification of African languages simply on the basis of the number of articles published therein (http://www.voanews.com/english/news/africa/Wikipedia-Co-Founder-Adding-More-African-Languages-144875665.html) may only give a partial account of the actual vitality of writing in those languages. A full account will have to go to the grassroots users of the languages as well. Needless to stress that most of the symbols of well-known written languages may equally be used in writing just about any other language, provided the orthographic rules are scientifically justified and consistent. It is on the basis of this principle that Chinese, Japanese, Greek or Russian language users can translate their names into English, French, Spanish, Portuguese, etc. Similarly, users of the latter languages can and do translate their names into the former. Diplomats and international negotiators are certainly familiar with such evidence. It is therefore possible for all languages within the same sociolinguistic/communication context to use roughly the same symbols in writing. There are well over 6,000 languages in the world, including over 2,000 in Africa, with countries like Cameroon counting some 286 and Nigeria some 514, according to estimates in Ethnologue (Lewis (ed.) 2009: http://www.ethnologue.com/show_country.asp?name=Cameroon and http://www.ethnologue.com/show_country.asp?name=Nigeria). It is hard to imagine a situation where each of the 286 languages of Cameroon or each of the 514 languages of Nigeria has a script of its own. Observably, it is in the interest of users of the endogenous languages of the same national setting to use roughly the same script and typewriting keyboard as the dominant official language (be it endogenous or exogenous).

User friendliness and internet visibility and vitality of script divergent languages

African languages with writing systems that scripturally depart more or less significantly from the official language(s) of the country (systems with what the present paper calls a “divergent script”) have much lower written usability, including internet and ICT visibility. Examples of divergent script languages (i.e. whose scripts depart partly from the official language script) include:

The effective presence on the internet of languages with such writing systems is evidence beyond reasonable doubt that just about anything can be written electronically and published via the world wide web. Indeed, since its inception in the early 1990s, the Unicode Consortium has recorded very significant progress in making it possible for any language to enjoy e-writing facilities.


As Kahunapule Michael Paul Johnson nicely puts it in a draft paper titled “Unicode and Bible Translation” (2004) available online at http://ebible.org/translation/Unicode+BibleTranslation.pdf (accessed on 23 April 2011; cf. also http://www.thehindu.com/news/states/karnataka/article2907796.ece?goback=.gde_1748997_member_96470425),

Unicode provides a unique number for every character, no matter what the platform, no matter what the program, no matter what the language. Using Unicode, you can more easily mix writing systems, even in one document, like this. Unicode 給每個字元提供了一個唯一的數位。Η κωδικοσελίδα Unicode προτείνει έναν και μοναδικό αριθμό για κάθε χαρακτήρα. יוניקוד מקצה מספר ייחודי לכל תו. यूनिकोड प्रत्येक अक्षर के लिए एक विशेष नम्बर प्रदान करता है। ユニコードは、すべての文字に固有の番号を付与します。Unicode – это уникальный код для любого символа […]
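Indeed, the IPA-derived letters used in African reference alphabets have long had fixed Unicode code points, so any Unicode-aware platform can in principle store, transmit and display them. A quick check in Python confirms the point (these are real, stable Unicode assignments):

```python
import unicodedata

# Each character has one fixed code point, whatever the platform or program.
for ch in "ɛŋɔ":
    print(f"U+{ord(ch):04X}  {ch}  {unicodedata.name(ch)}  "
          f"({len(ch.encode('utf-8'))} bytes in UTF-8)")

# U+025B  ɛ  LATIN SMALL LETTER OPEN E  (2 bytes in UTF-8)
# U+014B  ŋ  LATIN SMALL LETTER ENG  (2 bytes in UTF-8)
# U+0254  ɔ  LATIN SMALL LETTER OPEN O  (2 bytes in UTF-8)
```

The bottleneck is therefore not storage or transmission but input and display: default keyboards, fonts and habits, which is exactly where divergent scripts continue to pay a premium.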

Thanks to Unicode, even African languages whose scripts depart completely from the nation’s official language script can thus be written through electronic media. But writing in this case will generally require scriptural technologies (software and hardware) and techniques which differ considerably from those commonly used in the writing of official languages. Such African languages include the following:

Further advances in technology make the use of such scripts even easier, with virtual keyboard technology and similar (smart) devices which currently allow for instant cross-linguistic script-type switching (e.g. http://gate2home.com/) or handwriting recognition. In this regard, many Egyptologists and Africanists tend to argue that African languages should preferably be written in the Bamum script, Hieroglyphics, Mandombe, N’ko, the Tamazight Tifinagh script, the Vai script, or other “typical” African scripts (cf. Gaddafi’s stand on this issue at http://www.youtube.com/watch?v=nvaAmq7KIl8; http://kouroussaba.com/; http://kanjamadi.com/; etc.). Others suggest that African languages would be more suitably written in ideograms (like Chinese) than in the widespread phonological alphabets in which most are currently written (cf. http://kwanzaamillenium.wordpress.com/2012/01/09/quelques-mots-sur-linstitutionnalisation-des-langues-africaines/). While the ideological passion surrounding such arguments (obviously full of postcolonial undertones) is understandable, there is need to reiterate that it is not the script type per se that confers identity on a language (cf. http://www.metafilter.com/110360/For-many-tiny-endangered-languages-digital-technology-has-become-a-lifeline). Written scripts and other symbols (very much like hand signs and body movement in sign language) are merely particular attempts to conventionally represent objectifiable referents which, otherwise, are universally perceptible by anyone endowed with the relevant senses (including emotion, imagination, hearing, sight, smell, taste, touch) to which the objects appeal. Ultimately, with the proviso that any script type can be used to represent actual human language (whether spoken or based on hand/body signs) or thought, all natural languages may very well be written with a single script type, so long as the writing rules prove to be internally consistent. The use of specific script types, symbols or signs by a language community inevitably builds up into the users’ scriptural habits and culture and may thus give the users a sense of pride, especially if the symbols are unique to their language. However, pride need not overshadow reason.

In point of fact, some African languages have experienced or do experience the use of at least two completely different standard writing systems (e.g. Arabic, Latin, etc.). They include Bamum, the Berber languages, Comorian, Hausa, etc. (http://www.wazu.jp/gallery/Fonts_Arabic.html). Besides, the use of the same script type (including the use of IPA-type symbols in so-called general alphabets) in writing different languages does not confer the same identity on the said languages. Meanwhile, it seems absurd to claim that IPA symbols (most of which are of Latin or Greek origin) are more representative of African language identity than Latin symbols. Script types, symbols and signs are in no way inherently attached to the language which uses them for writing purposes, even when the script type was initially designed for that language. Were it not so, then the use of the Latin alphabet and hand signs by so many distant languages around the whole wide world would either be impossible or would long since have jeopardised linguistic diversity and identity. For example, all the East and Southern African languages (Kiswahili, Kinyarwanda, Kirundi, Luganda, isiXhosa, isiNdebele, isiZulu, etc.) which exclusively use the same characters as those found on the English QWERTY keyboard would long have been swallowed by English. That, fortunately, has not been the case; rather, those languages are doing relatively well in writing, alongside English, as can be seen in daily newspapers and web activities in those languages. Needless to reiterate that English itself, like scores of other languages around the world, uses the Latin alphabet, but that does not in any way jeopardise its identity. Anyhow, even some of the so-called “typical” African scripts were developed from earlier scripts (see http://maps.thefullwiki.org/Writing_systems_of_Africa).

Besides, it remains doubtful whether, in today’s sociolinguistic context largely dominated by Arabic, English, French, Portuguese and Spanish as the official languages of Africa, “typically” African scripts will effectively and efficiently compete with the official language script (essentially Arabic or Latin). It is foreseeable that a “typically” African script for the writing of endogenous African languages may only become effective and efficient if such languages are effectively made to replace exogenous languages as official languages, with all that this entails in terms of scriptural adaptation to the entire range of available technologies which integrate writing (computers, multimedia sets, telephones, television, the film industry, video, internet, GPS, etc., most of which are not African innovations). Obviously, that can hardly be done overnight. And yet, current pressure from dominant exogenous languages is such that it would be suicidal to tie the fate of endogenous languages to the use of characters that do not appear by default on the widely available keyboards and other scriptural technology of the specific social setting.

From the standpoint of the present study, it would seem that the effective and efficient writing of African languages currently (i.e. in a context largely dominated by Arabic, English, French, Portuguese and Spanish as official languages) requires a much subtler strategy, a much subtler revolution, than that envisaged by proponents of divergent scripts. In this respect, ensuring “genuine literacy”, i.e. the effective and efficient writing of African languages by the mass of users of the languages, in line with immediate socio-contextual realities and current target user preferences, is more urgent and imperative than arguing over the “African identity” of the scripts to be used. As Brij Kothari (http://www.scoop.it/t/translation-world/p/4050343284/2015/08/28/solve-literacy-digital-literacy-will-follow?hash=96224097-26ab-4dd5-8a2c-b9124e79386b; http://www.huffingtonpost.com/brij-kothari/solve-literacy-digital-li_b_8052694.html) rightly argues,

Basic literacy is a pre-condition for digital literacy and if we solve the former, the latter generally follows through self-drive or is easy enough to achieve through targeted interventions.

Delivering literacy, globally, is still mostly the responsibility of governments. Imagine where we would be if connectivity, the internet, search, social media were all left to governments. Literacy is at core of each of these capabilities, yet it is not at the center of big global innovation.

We know, at least in developing countries that despite the progress, there is a long way to go in achieving genuine literacy. The word ‘genuine’ is important. A country’s literacy rate is NOT a good measure of a population’s ability to read, even simple texts.

Take India, for example. According to the latest round of the National Sample Survey (2015), 75% Indians above age 7 are literate. Almost everyone immediately makes the assumption that 75% can read. Studies have shown that over half of India’s “literates” cannot read a Grade 2 level text (or a newspaper headline). At the low level of alphabetic knowledge that the government is comfortable calling people “literate,” no parent would ever call his/her own child literate.

Moreover, with reference to the earlier argument about the IPA and the failure to impose it as a standard writing system in the West, there is reason to think that the script type used is far less important than its acceptance and effective use by the target speakers in their preferred spheres of writing. The survival of endogenous African languages (including the spoken forms of these languages) also depends on this. Subsequently, once African languages establish themselves as actual rivals of exogenous official languages (notably in terms of relevant general and specialised text corpora), and provided there is agreement across all or most African countries, a revolutionary decision may then be taken to adopt a “typical” African script (say, the Bamum script, Mandombe, N’ko, Vai, etc.) for all endogenous languages.

In the current disposition, beyond the enthusiasm of many Africans in the face of an avalanche of proposals by computer specialists on the possibility of writing African languages with divergent scripts, there is need to remain lucid and refrain from falling prey to funded initiatives which may not always serve Africans’ ultimate interests. Indeed, thanks partly to significant developments in translation and translation technology, the 21st century has ushered in a paradigm shift whereby linguistic diversity is no longer seen as a threat, as was the case over the years with certain minimalist sociolinguistic planning models. Rather, the in-thing is maximalist models which advocate the preservation of all (endangered) languages (http://www.endangeredlanguages.com/about/#about_alliance; http://www.paradisec.org.au/blog/2012/06/elar-cracks-a-ton/; http://summit2012.globalvoicesonline.org/2012/07/keeping-endangered-languages-alive-online/). In the process, each of the estimated 6,000 or so languages across the world, including well over 2,000 in Africa alone, is now seen as a true investment opportunity, with many computer specialists outside Africa already positioning themselves for exploitation (http://www.lingo24.com/blogs/company/endangered-languages.html; http://www.slate.com/articles/podcasts/lexicon_valley/2012/07/lexicon_valley_why_should_we_care_if_a_language_goes_extinct_.html). Even within religious circles, there is a significant shift from a select few liturgical languages (cf. http://fr.missionerh.com/index.php?option=com_content&task=view&id=4351&Itemid=619) to the realisation of the urgent need to translate the Bible into every single language of the world with the help of technology (http://www.qideas.org/video/the-bible-in-a-technological-age.aspx).

However, while many such investors fundamentally focus on demonstrating that hitherto unwritten endogenous languages of Africa can effectively be written via electronic and internet means, it cannot be overemphasised that the matter goes far beyond the possibility or effectiveness of writing African languages electronically. The issue is actually one of efficiency and competitive advantage. More precisely, the relevant question is whether electronic writing in this case is competitive and efficient enough (especially in terms of the actual involvement of the masses) in the face of writing in the existing official languages. Until now, one fact remains: the actual large-scale implementation of proposals by potential or real investors in divergent script writing across Africa can hardly be obtained in the much-desired short term. There is urgent need not simply to save most endogenous languages from further attrition and death. There is need, most importantly, to ensure effective learning, teaching and “hobbying” in those languages, as well as sustainable content development during and after classroom training, by the grassroots users themselves.

Some may want to argue that writing in languages like English, French, Spanish and German took a very long time, sometimes centuries, to gain relative stability. But then, the science of writing had hardly developed to the level where it is today. With the scientific knowledge currently available in matters of writing, it is possible to develop and stabilise writing systems and writing habits much faster than in the past, with the proviso that the right decisions are taken at the design stage of the writing systems. As of now, the efficient writing of an African language with scripts that diverge from the writing symbols of the official language of the setting remains an uphill task, especially in terms of digital interoperability and transferability.

Writing and translating in African languages: interoperability, file transferability and the unavoidable keyboard issue

It may be useful here to begin with the following premises:

In fact, by today’s standards, exceptions such as multiscript keyboards (e.g. parallel Latin-Arabic or the Intellark Arabic keyboard: http://en.wikipedia.org/wiki/Intellark; cf. also the Cameroon QWERTY Keyboard Chart and the Cameroun AZERTY Keyboard Chart) notwithstanding, the number of characters allowable on each keyboard (including smart technology keyboards) is still normally (i.e. by default) limited to those of a single language writing system at a time, irrespective of the fact that, as was earlier indicated, the characters of that writing system may quite conveniently be used in writing any other language. Thus, although each language may take great pride in having a totally unique script, any language can equally be written with Chinese characters, Latin characters, Cyrillic characters, etc., provided the writing rules are adapted to suit the system requirements of the language concerned.

Moreover, a separate script for each language would hardly be practical in view of current technological developments. In specific national contexts, and irrespective of the number of languages available in the setting, the official language, local servers and internet providers tend to impose one specific default keyboard across the board, with corresponding scripts (cf. http://www.languagegeek.com/typography/diacritics.html). The experience in officially bilingual countries like Cameroon and Canada is no different, as the dominant keyboard observably tends to be either AZERTY or QWERTY, depending on whether the specific area is French- or English-speaking. Thus, for example, despite the enthusiasm raised by the proposals to adapt available keyboards to Cameroun AZERTY and Cameroon QWERTY (see the note on the “Going Kompyuta” initiative at http://www.goethe.de/ins/cm/yao/wis/gko/fr8636553.htm), the proposals have remained mostly in software form, as they are yet to truly become default keyboard hardware across Cameroon. And one can predict that even when these keyboards are widely distributed in Cameroon, the English QWERTY and French AZERTY keys will continue to dominate technological habits and the lives of a good many Cameroonians both at home and abroad, for as long as English and French remain the dominant official languages of the country.

Adopting and democratising a Cameroun AZERTY or Cameroon QWERTY keyboard would work to some extent within the country, especially for all those who are really motivated to write endogenous languages. Such keyboards will nonetheless remain inaccessible to potential learners who, for some reason, still doubt the actual competitive advantage of the languages in the face of dominant exogenous official languages; moreover, the keyboards will be difficult to access freely abroad, notably in the partner French-speaking and English-speaking countries which host the majority of Cameroon’s diaspora. Consequently, Cameroonians abroad will still have to rely mainly on English and French scriptural technology (which extends far beyond personal computer or laptop keyboards) in their interactions with other Cameroonians at home and abroad.

Actually, the all-pervasive use of default official language keyboards throughout the educational and administrative lives of citizens ends up shaping scriptural habits which, in turn, become rooted and fossilised as cultural habits that usually take a very long time to change. As a result, although most people would generally be very excited to see that their ‘heart’ languages can be written too, rather few may be ready and willing to go the extra mile of constantly switching keyboards or typing in special codes, either within or across texts, to obtain markedly divergent scripts for their language. Thus, in many African countries where the official language AZERTY and/or QWERTY keyboards have been imposed over the years as default/dominant hardware, users generally find it rather too demanding to have to switch keyboards or use sets of memorised codes in order to get markedly different scripts for their (endogenous) mother tongues (cf. Cameroon QWERTY Keyboard Chart and Cameroun AZERTY Keyboard Chart). What this simply implies is that even if an African country were to choose an endogenous African language as official language, with any of the typically African scripts (say, Bamum, Hieroglyphics, Mandombe, N’ko, Vai, etc.) as basic writing norm, it would be technologically ill-advised for the non-official endogenous languages of the setting to diverge from that official language script.

Script divergence is particularly compounded by the many interoperability difficulties often involved in transferring and receiving typed texts with non-convergent scripts, not only on personal computers but also on all other devices with integrated keyboards and other scriptural resources. Clearly, keyboard issues cannot be limited to computer (desktop or laptop) keyboards. Most technological devices today comprise multimodal language functions. These include telephones, televisions, video kits, teletext/subtitling suites, caption creation and editing software, as well as radio sets, cameras, and the like. Many of the divergent script proposals hardly take these into consideration. Consequently, in the face of interoperability difficulties, many of those who master the use of divergent scripts tend to develop user apathy with time. Even some of the die-hard specialists and defenders of divergent scripts observably tend to fall back on more convergent ones when the going gets tough, notably during the presentation of linguistic papers at conferences, seminars, etc.

So, despite commendable efforts by Unicode and other technology experts (e.g. ANLoc members and other localisation experts: see http://www.africanlocalisation.net/; http://www.africanlocalisation.net/en/keyboards; http://summit2012.globalvoicesonline.org/2012/07/keeping-endangered-languages-alive-online/), writing systems which resort to character sets other than those available on the dominant official language keyboard of the setting tend to face major problems of system interoperability and file transferability (http://www.thehindu.com/news/states/karnataka/article2907796.ece?goback=.gde_1748997_member_96470425; http://summit2012.globalvoicesonline.org/2012/07/keeping-endangered-languages-alive-online/). Besides the heavy financial burden of interoperability and transferability problems, the written text which the receptor(s) often get(s) is at best a document with questionable face validity compared to documents in a script convergent language, and at worst a document with garbled/encrypted characters (Fig. 2 below) that can hardly be decoded.

[Fig. 2: sample text received with garbled/encrypted characters after a script/encoding mismatch]

This obtains specifically when, for example, one changes operating systems or browsers, or when the device receiving the text is not provided with the required software. Most, if not all, of those who have written or received texts in divergent scripts without the corresponding software installed will definitely be familiar with this picture (cf. Chantal Enguehard & Soumana Kané http://vzauf-dev.refer.org/greenstone/collect/bibauf/index/assoc/HASHfc59/18cbb89d.dir/doc.pdf, pp. 60ff). Such difficulties show that, while it eventually allows just about any script to be used on electronic devices and on the internet, unicodification is not yet a universal guarantee of technological interoperability, especially in situations where non-official/non-dominant languages use script types that differ from those of the official/dominant language. Nor is it a guarantee that all possible script types could be used with identical ease within the same sociolinguistic setting, whereby each endogenous language resorts to characters which differ completely from those of all the others.
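For readers who have never met the phenomenon, the following minimal Python sketch reproduces the kind of garbling described above: a word containing special characters (the word itself is a purely hypothetical example) is saved as UTF-8 by the sender but decoded by the receiver under a legacy Windows encoding assumption.

    # Hypothetical word containing IPA-style letters and a combining tone mark.
    text = "ŋgɔ̀ŋ"

    encoded = text.encode("utf-8")      # what the sender's system stores
    garbled = encoded.decode("cp1252")  # what a legacy-encoding receiver displays

    print(text)     # ŋgɔ̀ŋ
    print(garbled)  # Å‹gÉ”Ì€Å‹ -- the kind of "encrypted" output shown in Fig. 2

Plain ASCII letters survive such round trips untouched, which is precisely why script convergent text travels so much better across mismatched systems.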

In effect, one cannot help asking: is it normal for the writing of some African languages to continue to suffer such a fate, i.e. lack of interoperability and adaptability, on African soil itself, decades after the creation of writing systems for those languages? Can these languages ever be written with the same ease as is the case with partner official languages? Would it not be fairer and more logical for default keyboards in Africa to reflect the writing requirements of endogenous African languages rather than those of their exogenous counterparts? Some linguists or critics may want to shift the blame to political decision makers (cf. http://www.h-net.org/reviews/showrev.php?id=4359; http://en.cnki.com.cn/Article_en/CJFDTOTAL-TRIB201102014.htm; http://news.abidjan.net/h/425813.html; etc.). But this argument would hardly stand in the face of the relative success of African languages with convergent writing systems (cf. Kiswahili and most written endogenous languages of East and Southern Africa). It is unfortunate, however, that in many Central and West African countries, linguist and presumably “Africanist” lobbies in favour of divergent scripts (in the name of general alphabets for endogenous languages which would presumably best preserve the identity of African languages) continue to hold sway. Worse, they tend to discourage, sideline, blackmail and lobby against any attempts to use formally convergent scripts. Anyhow, for an illustration of the successful writing of some endogenous African languages in Latin script, see, for example:

Need it be recalled that, contrary to what may be thought, and in addition to the above examples, virtually all African languages have already experienced some form of writing? Each of these languages may not have a writing system formally designed on the basis of known scientific rules. However, historical records would generally show evidence of the way in which names of persons and places have preferentially been written from the earliest contacts with literate Europeans to date. Observing recurrent trends in such records could very well lead to the identification of some justifiable scriptural preferences, which may subsequently be integrated into a more systematic proposal for a user-friendly writing system. Unfortunately, despite evidence of user preference for convergent scripts, some linguists are adamant in discouraging their use, simply on the grounds that scripts which fail to conform with the ones proposed by the so-called “general alphabets” (e.g. the General Alphabet of Cameroonian Languages, the Pan-Nigerian Alphabet, etc.) are presumably unscientific (cf. innuendos in Yemmene 2001, Voutsa 2003: 10). Consequently, many lay Central and West Africans are still torn between the unfounded belief that the best (and most scientific) way to write an African language is to transcribe it phonologically, on the one hand, and their own, almost natural, preference for a convergent script whenever they attempt to write the language, on the other (see http://nms.sagepub.com/content/13/3/427.refs and http://nms.sagepub.com/content/13/3/427.full.pdf+html: pp. 432ff).

The rapid survey on the writing of the Mbafung language (http://polldaddy.com/poll/4039514/?view=results) looks quite instructive in this regard and may need to be extended to users of other endogenous languages on a larger scale. Meanwhile, it should be observed that such negative reactions to the use of phonetic symbols which are not represented on ordinary keyboards are not peculiar to the Mbafung people.

Ordinary speakers and writers of Kinyarwanda (and, by the same token, Kirundi) have been resisting any attempt to introduce tone marks and other such phonetic description features into the orthography of their language (see, for example, http://www.kimenyi.com/kinyarwanda.php), despite the colossal work done, through a 3000-page, three-volume reference document entitled “Dictionnaire kinyarwanda et kinyarwanda-français” (http://www.irst.ac.rw/spip.php?article205), to show their relevance and advantages. The reaction of users is arguably an indication of the need to keep writing as simple as contextually acceptable (that is, preferably devoid of unnecessary phonological and other linguistic explicitations which would complicate writing within the target sociolinguistic context) while ensuring base script convergence between all non-official and official languages (whether endogenous or not) found within the same national context. In other words, despite the goodwill, ordinary users generally tend to be put off upon realising that, compared with the dominant official language of the setting, writing their own language is often a feat and a prowess rather than something they can do at will, without any technological stress or barrier, be it software or hardware.

From this observation, it can be argued that with divergent scripts, the writing of endogenous African languages is constantly distracted and even frustrated by stressful technological constraints and bottlenecks. This situation constitutes a real boon to computer geeks who have been earning a living by demonstrating that any script can be written electronically. Nonetheless, there is evidence that the issue of script coherence in multilingual settings is carefully ignored and, consequently, divergent African language scripts hardly compete scripturally with the dominant official language scripts, especially in terms of real popular use by the grassroots.

Unfortunately, many proposals of so-called general alphabets for African languages continue to include sets of symbols which are not found on the official language keyboards. By aligning many graphemes with IPA symbols, proponents seem principally interested in demonstrating some hypothetical/relative phonological unity and harmony across different endogenous African languages (absolute harmony is actually deceptive in most cases, in the sense that letter-phonemes are scarcely ever realised in exactly the same way across all the languages to which so-called general alphabets purport to apply). In effect, it is difficult to say whether what is at stake is a “general phonological alphabet” or a “general orthographic alphabet”. Moreover, because they do not always conform with the official language script, the so-called general alphabets are not as general as is claimed. By creating a sort of script or linguistic apartheid in favour of endogenous languages only, the so-called general alphabets somehow, by chance or by design, discriminate against official languages. Otherwise, a country’s general alphabet ought to be designed inclusively, i.e. to integrate both the non-official and official languages of the country, preferably in conformity with known script directionality and other keyboard requirements in the official language. Scripturally, setting non-official languages apart results in scriptural hiatus and incoherence with respect to official language writing. Ironically, this ultimately contributes to the marginalisation of the former languages from most of the writing contexts, which are dominated by the latter.

In addition, script divergence generally requires the development of a completely new technology chain, ranging from more to less traditional typewriting and word processing technology (including letter and Braille keys). Inasmuch as such script divergent proposals offer many new avenues of technological research investment, they are ultimately an obstacle to the fast development and popularisation of endogenous language writing. Similarly, they end up being uneconomical, indeed too costly, for the already poor and suffering people and countries of Africa to afford. As a result, divergent script proposals currently make Africa more dependent on exogenous technology than may be thought. As a matter of evidence, the use of such scripts is still largely dependent on foreign technological investment and decision making. In the process, Africans continue to empower foreign technology innovation, in the areas of both hardware and software development, as is currently the case whenever certain localisation projects need to be carried out using a divergent script. With convergent scripts, Africans would strategically be far less dependent on such technology, with the possibility, in the medium and long terms, of becoming totally independent.
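The keyboard gap is easy to make concrete. The short Python sketch below audits a sample of letters (an illustrative selection inspired by IPA-based general alphabets, not any official inventory) against the characters reachable from an ordinary ASCII keyboard:

    import unicodedata

    # Illustrative letters only; not an official alphabet inventory.
    sample_letters = "b c d e ɛ ə g gb i ɨ k m n ny ŋ o ɔ sh u ʉ ʔ".split()

    for letter in sample_letters:
        special = [c for c in letter if ord(c) > 127]  # outside plain ASCII
        if special:
            names = ", ".join(unicodedata.name(c) for c in special)
            print(f"'{letter}' requires: {names}")

Every line printed corresponds to a character that an ordinary AZERTY/QWERTY user cannot type without switching layouts or memorising codes; digraphs like ny and sh, by contrast, pass silently.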

Besides, in the current context, which is technologically dominated by Arabic, English, French, Portuguese and Spanish as official languages, the effective use of special characters (i.e. characters other than those available through the official language keyboard) on a TV monitor, a mobile phone, a search engine, a translation memory, or a corpus or concordancing tool requires a long and cumbersome procedure of negotiating with the owners of each technological device. In addition, interoperability in this case does not concern only the software and hardware dimensions of electronic and mechanical technology. Divergent scripts would imply further investment in the creation of new Braille and Morse characters, for example; and that, unfortunately, does not yet seem to be a central concern among proponents of such scripts. This actually tends to confirm the historical evidence that apartheid, whether political (as in pre-1990 South Africa) or otherwise, eventually has a boomerang effect on its promoters.

Writing and translating in African languages: search-engine friendliness, speed, translation technology and professionalism

In the current era, search engines constitute a major resource for professional writing, especially translation. They play a central role in information gathering, and even in information processing and production. However, finding relevant information through a search engine depends entirely on whether that information is accessible and retrievable through the search engine in the first place. And this in turn depends on whether the search engine readily accommodates the writing system. In designing writing systems strictly with IPA characters as a reference, it has been customary to overlook the script requirements of the most widely used search engines in specific social settings. Consequently, searching for and finding written text data in languages with such divergent scripts is far from easy. One reason is that even when textual data in such languages is available electronically, it tends to be stored in page-image formats, notably scanned PDFs and the like, many of which are generally less readily accessible through search engines than, say, documents in straight HTML format.

Searchability here is viewed from two perspectives: (a) the user’s ability to query a search engine directly in the language (the actual search phase) and (b) the user’s ability to effectively find and retrieve written text information in the language once a search has been launched (data access and retrieval proper). Also, search engines here should be understood to include both online and offline ones, as well as both the best known and lesser known ones. In most cases, languages with divergent scripts are hardly search-engine friendly. In view of this, there is reason to argue that the closer the writing systems of non-official languages are, symbol/character-wise, to that of the official language of a country, the better they tend to fare in terms of searchability and use by common speakers, notably in electronic media and, most especially, on the internet. Only effective and sustained internet presence in appropriate formats, notably via social networks, blogs and websites, currently facilitates searchability.

The higher the degree of searchability, the more the users of the language – especially translators – will be empowered through easy information access and sharing. In the process, available and traceable information will be used as a web corpus or will facilitate corpus building (including parallel corpora and online translation memories) on general as well as specialised issues. Translation quality and speed will improve considerably and result in better professionalisation of the activity in the direction of African languages. In turn, more translation will feed the language corpus on- or offline and further improve searchability. Clearly, the adoption of the same characters and keyboards as the official languages of the national setting will contribute to an increased popular use of endogenous languages alongside the official languages and, by implication, improve electronic search and information retrieval in the endogenous languages. Unfortunately, most disciples of script divergent systems have tended to neglect searchability requirements. They generally prefer to put the emphasis on the principle of script economy, which they see almost exclusively in terms of the number of characters used in representing a sound.
In this regard, the more characters, the less economical the system; meanwhile, writing speed seems to be taken for granted, as though implied by presumably economical writing systems. Nonetheless, unlike what obtains with script convergent languages, users of script divergent languages generally find it much more difficult to sustain production to cover daily news or instant messages, as is required by social networking, text messaging, etc. Actually, this makes such languages far less searchable, and thus more time-consuming (when it comes to information gathering) and therefore much less economical, than script convergent languages.
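A minimal Python sketch of the searchability point, under the assumption of an indexing engine that does no normalisation of its own: an exact-match query typed on an ordinary keyboard misses tone-marked text, while a normalising index (strip combining marks, then casefold) retrieves it.

    import unicodedata

    def normalise(s: str) -> str:
        # Decompose, drop combining marks (tones, dots), then casefold.
        decomposed = unicodedata.normalize("NFD", s)
        return "".join(c for c in decomposed if not unicodedata.combining(c)).casefold()

    document = "Àwọn ọmọ ilé ìwé"  # tone- and dot-marked sample text
    query = "awon"                 # what an AZERTY/QWERTY user actually types

    print(query in document)                        # False: exact match fails
    print(normalise(query) in normalise(document))  # True: normalised match succeeds

Real engines vary in how much of this folding they apply; the burden of guessing which marks users will or will not type is real, and it grows with every divergent character admitted into the orthography.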

Besides, in regard to the realities of today, English, French, Portuguese and Spanish have been serving as the main official languages of generations and generations of African communities; and there is hardly any indication that this is likely to change overnight or any time soon. Nor is there evidence that replacing exogenous languages with endogenous languages as official languages would result in monolingual and/or monocultural African states. It would seem wise, therefore, while designing writing systems for African languages, to perceive exogenous official languages as partner languages rather than merely colonial and rival ones. This would be even truer from the perspective of translation and related disciplines, where there is clear evidence that the use of the same set of characters in writing different languages does not in any way threaten the distinctive (structural) identity of each of the languages involved. It is on this basis that, on the American continent, “Quechua can be typed on a Spanish keyboard, or Abenaki on a Canadian French keyboard” (Harvey 2009 at http://www.languagegeek.com/typography/diacritics.html). Strategically, there is a need to focus on the essential rather than waste precious time crying over spilt milk, looking for scapegoats or ideologising, as has frequently been the case in some post-colonial theories. With regard to the writing of African languages, it would seem wiser to begin by developing a win-win strategy whereby endogenous and exogenous languages are complementary partners rather than opponents. As a matter of observation, the inventory of all the relevant discrete phonemes of a language may be an unavoidable prerequisite for devising a phonographic writing system for that language. However, the movement from phonology to orthography actually shares a lot in common with professional translation and needs to occupy a more central place in translation studies than ever before. Anyhow, there is evidence that such movement may not be equated a priori with systematic transcoding (in the Interpretative Theory of Translation’s sense), i.e. with one-to-one correspondence of symbols and signs. Instead, the movement from phonology to orthography may need to seriously consider the following pieces of advice, which stress the role of co-text, context, reception and acceptability in such matters:

  1. “Signs make meaning in context”; “the conceptual activity which develops during productive thought and purposeful communication, verbal as well as non-verbal, is the collective outcome not only of the occurrence of significant signs, whether individual or in combination, but also of the enveloping social context that surrounds all human thinkers and word producers (and processors).” (Wendland (2008) Contextual Frames of Reference in Bible Translation. A Coursebook for Bible Translators and Teachers, Manchester: St Jerome Publishing, p. 59)
  2. “…one should adapt to the written language with which the people already are familiar. French, for example, which is the official language of the country…” (Schulze, director of a linguistics department at Ludwig Maximilian University in Munich, in Joelle Verreet, 24 March 2011: “Translating the bible into exotic and rare languages”)

After decades and decades of experimentation, the writing of African languages with divergent scripts has, compared with convergent ones, continuously failed to adapt contextually to scriptural technology. It is strategically imperative to toe the line and invest more in the direction of script convergence with the dominant official languages, so that focus can effectively shift to the mass production of texts in endogenous African languages. In the current state of scientific knowledge, proposals for the writing of African languages will undoubtedly continue to be based on more coherent phonological descriptions and will therefore result in writing systems which are by far more phonographically coherent than those of earlier written languages. However, there will be a need for a considerable shift of focus from purely phonological considerations to morphosyntactic, discourse/textual and cross-linguistic considerations as well (issues concerning the development of writing cannot be left in the hands of specialists in linguistic/phonological description alone (cf. Fishman; Bird, etc.)).

There is still far too much talk and writing about endogenous African languages with divergent scripts, and rather too little writing in those languages themselves. The time for action – preserving African languages and ensuring their effective presence in the entire writing sphere, the time for mass content (especially written content) in endogenous African languages – is now! Tomorrow will be too late, because by then the very exogenous languages which some see as the main problem of Africa today will still be preferred by Africans in the writing sphere. Meanwhile, through absence from the writing sphere, African languages will undergo gradual attrition and ultimately become extinct.

With a view to a more reasonable and better grounded development of the African continent, there is an imperative need for both the African elite and the grassroots populations to mass-produce content in their own languages rather than just content about those languages. Translation (far beyond Bible translation) is a major inspirational tool for creating such content. Comparative evidence on the development of English and French to their current state and status would largely confirm this (cf. Gémar; Woodsworth and Delisle, etc.). Translation in this case will need to cover all aspects of human activity in which Africans themselves are involved, as well as all those models from elsewhere which they may wish to emulate. In the process, Africans themselves should feel concerned with the development of content about their own specialist areas of activity in their respective mother languages. A little contribution from everyone will make African languages more visible in relatively little time. In this way, Africa, with its over 2000 languages, will definitely turn what has too often been considered an obstacle into a veritable development opportunity which will subsequently attract interest and investment from around the world. Already, specialists in content dissemination and storage like Google predict that Africa will be one of the major stakes in the relatively near future of the internet (cf. http://www.itnewsafrica.com/2011/07/qa-google-wants-more-african-content-online/; http://www.multilingualblog.com/?p=1305; http://www.nytimes.com/2011/12/11/magazine/everyone-speaks-text-message.html?_r=3), with all that this implies in terms of the development and sale of technological innovations (both hardware and software). The consistently high growth rate of internet penetration across Africa should be an eye-opener here too. Obviously, there will ultimately be a need to use exogenous as well as endogenous languages in content development and dissemination. But just how many Africans are themselves aware of this potential?

Concluding remarks

All that said, it may be argued, somewhat reasonably, that writing does not necessarily guarantee the existence and survival of a language today. In other words, African languages could even do without writing, especially with current developments which make audio and audiovisual recording (and voice-to-text transcription) ever easier (cf. also http://manenomatamu.wordpress.com/2012/05/17/video-news-linguist-k-david-harrison-launches-talking-dictionaries-for-endangered-languages/ on K. David Harrison’s talking dictionary initiative). This may be true, but only to a certain extent. On the one hand, it can hardly be true of Latin, for example, whose written form largely overshadows its spoken form today and actually constitutes the basis for the formal learning, teaching and assessment of knowledge of the language. Not to mention the fact that, in addition to sign language, writing remains an integral part of any inclusive social development plan which bears deaf users of language in mind. Writing therefore still plays a major role in the formal learning and teaching, as well as the inclusive use, of any language. On the other hand, it may be true inasmuch as the spoken form of a natural language generally exists long before the language ever gets written, if it ever does. Anyhow, in an increasingly internet-dependent world, there seems to be an ever stronger correlation between the written tradition of a language, its internet presence and its vitality, particularly within a widening generation which considers the internet a way of life.

As a matter of observation, besides the increasing integration and interpenetration of the spoken and written modes of discourse in most information and communication tools (including radio, telephone, video cameras and television), contemporary speakers of a language want to communicate via text messaging, subtitling, captioning, teletext and the internet too, especially through writing on social networks, emails, blogs, websites, etc. In other words, multimodality is increasingly the order of the day. Speaking and writing a language are now more complementary than ever before, with speaking modes enriching writing just as writing modes enrich speaking. So much so that, predictably, in today’s highly competitive world economy, African languages which lack truly efficient and competitive writing systems – notably with appropriate multimodal flexibility – are likely to be gradually relegated to the museums, while unwritten languages are likely to gradually lose users and disappear, i.e. die, without a trace. And, of course, it would be far easier to resuscitate a dead language with some written records than one without any.

Actually, one of the underlying arguments of this paper has essentially been that the development of writing habits in African languages cannot be seen as dependent on political will and phonological considerations alone. No matter how good the political will may be, it needs to be backed by linguistic expertise that properly factors in contextual parameters and target user needs and preferences. In the present disposition, there is reason to argue that even when political decision makers across Africa shall have become more favourable to the officialisation of endogenous languages and/or their effective use in administration and the school system (as is timidly becoming the case in many countries), the use of scripts that markedly diverge from those of the existing dominant official language(s) will remain a major stumbling block to the mass production of written content in those very endogenous languages by grassroots native speakers.

In Cameroon, for example, where a number of endogenous languages have been officially re-introduced (on an experimental or trial basis) into the school system, unfortunately with exclusive emphasis on the divergent-script-based General Alphabet of Cameroonian Languages as the basic writing norm, it is doubtful whether, outside classroom environments and beyond the organised settings of language committees and research laboratories, such writing is effectively used either by the students themselves or by the target speech community. This definitely requires further investigation. Meanwhile, it should be remembered that this is actually not the first time the use of such a divergent script for teaching endogenous Cameroonian languages in the school system has been tried. Already throughout the seventies, a similar experiment was carried out (albeit without sufficient backing from the State), with languages like Fe’fe’ standing out as leading examples, with fully-fledged programmes leading to graduation with a First School Leaving Certificate in the language. It suffices to ask some of the graduates of those programmes whether the use of a divergent writing system was an advantage or a disadvantage to its effective use outside the classroom, and whether they have been able to sustainably use the writing system in communicating with fellow Fe’fe’ speakers… The answer is anyone’s guess.

Anyhow, in designing writing systems for African languages today, phonological description and validity is definitely an essential starting point. In this regard, most phonetically based general alphabets that have been proposed for the writing of African languages remain key reference material for phonological description/transcription. However, moving from the identified phonological units to graphological or orthographical units cannot just consist in carrying phonetic symbols over to the graphological sphere or applying one-to-one correspondence blindly. In a similar way as in translation, the choice of graphological correspondents for basic phonological units needs to take into account both the sociolinguistic and the larger discourse contexts and, as much as possible, make proposals that conform to the target users’ scriptural customs and habits and to existing (rather than hypothetical) scriptural technology.

As a transitional measure, writing endogenous African languages with the same script types as the exogenous official language(s) of Africa has far more advantages than disadvantages, with the understanding that there is a sharp difference between phonetic and orthographic characters. In this respect, rather than discourage total convergence with the Latin alphabet, it is strategically advisable to adopt both a general reference phonological alphabet (as is notably already the case in Cameroon and Nigeria) and a translation of that general alphabet into the Latin alphabet, while ensuring consistency in writing rules. The reference phonological alphabet would be used for purposes of phonological transcription, for example in reference dictionaries and grammars. Meanwhile, the translation of the reference alphabet into Latin symbols would be used for orthographical purposes, alongside the existing official languages. Besides, rather than systematically adopt a script divergent approach to the writing of African languages, it is far more advisable to go for a bi-scriptural approach which makes it possible to choose either a conventional divergent script or a conventional convergent script, as strategically desirable. In this case, an endogenous African language may, depending on the situation, be conventionally written in the convergent Latin script or the divergent Mandombe, in the convergent Arabic script or the divergent N’ko, etc. Alternatively, the bi-scriptural strategy may simply mean adopting a general phonological alphabet for phonological transcription purposes on the one hand and an orthographical alphabet for purposes of orthographical writing on the other.
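In computational terms, the “translation” leg of this bi-scriptural approach amounts to little more than a consistent character mapping. The Python sketch below illustrates the idea; every mapping in it is a hypothetical example, not an approved convention for any particular language:

    # Hypothetical phonological-to-orthographic mappings, in the spirit of
    # existing Latin digraph solutions such as ny, sh and gh'.
    PHONOLOGICAL_TO_LATIN = {
        "ŋ": "ng'",
        "ɛ": "e'",
        "ɔ": "o'",
        "ə": "e",
        "ɨ": "i'",
        "ʉ": "u'",
        "ʔ": "'",
    }

    def to_orthography(transcription: str) -> str:
        """Translate a reference phonological transcription into a Latin-only working orthography."""
        return "".join(PHONOLOGICAL_TO_LATIN.get(c, c) for c in transcription)

    print(to_orthography("ŋgɔ"))  # -> "ng'go'": typeable on any AZERTY/QWERTY keyboard

The reference transcription remains available for dictionaries and grammars; the mapped form is what can circulate in newspapers, text messages and search queries without any special keyboard.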

Another key argument in the paper is that – the development of audio-visual recording technology notwithstanding – the ultimate survival of African languages depends on the effective use of writing by both the masses and the linguistic elite. Thus, writing in this case includes not just writing by language committee members and interested linguists but, most importantly, social and professional writing as well. As Lynn Landweer (n.d.: http://www.sil.org/sociolx/ndg-lg-indicators.html#distribution) puts it,

The fact that languages “die” is not new; Koiné Greek and Classical Latin both are “dead” as spoken languages. The only reason we know of them is because of the written record that was left behind. … vernaculars are in fact dying… not because their populations have stopped talking—people after all are incurably gregarious—they are dying because of the effect of language use choices made by the majority of individuals of those speech communities.

Among those choices is the shift to other languages. And the argument here is that socio-contextually ill-adapted scriptural choices in the design of a writing system can also contribute to the shift towards languages whose writing systems are found contextually more user-friendly. Obviously too, a look at writing today shows that internet visibility and interoperability, as well as social and professional writing, play a vital role in the shift from ‘elitisation’ to ‘massification’ of practice.

In this light, and seeing the difficulties in effectively and efficiently writing some endogenous African languages, there is little doubt that the ultimate survival of these languages hinges on a more serious consideration of these issues. Writing African languages need not be a feat or a prowess. This actually implies going back to the drawing board to review some of the basic principles used in designing writing systems, perhaps in consonance with the following advice which UNESCO experts (in Fishman 1968) gave way back in 1951, but which has seemingly been overlooked or ignored altogether:

  • Even where there is uniformity in the system of writing, special problems may exist. Many traditional orthographies date back to the time when the only important technique of graphic representation was handwriting. The number of separate characters was not too important, and they could be interlaced with each other in complicated ways. Modern machine writing, including the typewriter and various typewriting machines for printing, works best with simple writing systems using a limited number of characters which can be arranged in a single line.
  • In cases where the school uses a local mother tongue in the early years and later changes to a second language of regional and national importance as the main instrument of communication, it is advantageous to the students if both languages are written in the same way. This means that they do not have to learn to read and write a second time for the second language, but make a more or less direct transfer of their previously acquired abilities.
  • Where there are several major regional languages in one country or where more than one language has official status, it is of value to have relative uniformity in the way in which they are written. To the extent that they are similar, the learning of the additional language is facilitated. Even within a single country, there may be obstacles towards achieving unification of alphabets because of the attachment people may have developed towards a given form of script or because of strong feeling they may hold against another one. These same attitudes may stand in the way of achieving the simplification of a writing system along lines that may be more suitable for typewriting, machine typesetting, and telegraphy.

Only fools do not make mistakes; and foolish pride hardly pays. Similarly, putting the cart before the horse hardly ever produces the expected results. Promoters of socio-contextually non-convergent writing systems in Africa put the cart before the horse in the sense that they make proposals without ensuring that Africans themselves actually have the technological means and power to implement them. Since the effective implementation of such a divergent model is largely predicated on hypothetical technology over which Africans themselves have little influence, but which instead entails further financial costs, the model has little or no impact on the fight against poverty and the empowerment of the grassroots.

There is a need, therefore, to recognise that after decades and decades of experimentation with IPA-based script models – models which failed for European languages themselves for socio-contextual reasons – the success of the proposals has remained grossly meagre compared to that of socio-contextually more convergent systems. Some readers would probably contend that it is more logical for technology to adapt to script proposals for African languages than the other way around. Besides the fact that such an argument presupposes that every new technology is justified, it would be futile to invent new tools where old ones can actually do the job even more efficiently.

For the record, the General Alphabet of Cameroonian Languages was proposed in 1979, together with a suggestion for new AZERTY and QWERTY keyboards which would take on board hitherto unrepresented characters. Decades later, and as was discussed earlier, there remain insuperable difficulties in implementing the proposals. Concurrently, there is evidence that the writing of endogenous African languages can largely be made to comply with the Latin character sets on the English QWERTY and French AZERTY keyboards, as is the case with many East and Southern African languages. True, the Latin script may not be a perfect or ideal way (if there is such a thing) of writing African languages. But it may be consoling to know that ALL writing systems, including IPA, are merely approximations of real human language, and each has identifiable strengths and weaknesses. In any case, no writing system can purport to be a detailed phonological description of human language, and it is doubtful whether any attempt to provide a writing system which explicitates all the phonological (cf. so-called phonographic writing systems) or semantic (cf. so-called ideographic writing systems) features of a language would be at all feasible or practical.

Consequently, it is high time to stop experimenting with purely phonological theories on African languages when the competitiveness of writing in such languages observably depends on more than just phonological theory and research. The proposal is that these general alphabets be kept for phonological transcription purposes. Meanwhile, serious investigation should be undertaken to translate problem characters (mostly vowels, in the case of the General Alphabet of Cameroonian Languages) into purely Latin symbols, as was done, for example, to arrive at c, gh‘, ny and sh at the consonantal level. In the process, languages that are already popularly written in Latin script (e.g. Duala and Bulu in Cameroon, Kinyarwanda in Rwanda) should be allowed to operate with such scripts (with amendments where necessary) in the school system, rather than insisting on changing people’s habits.

Needless to stress, although it generally develops from spoken language, writing does not always function on exactly the same plane as speaking. Actually, it is often realised that one may not always write the way one speaks (cf. http://llacan.vjf.cnrs.fr/PDF/Mandenkan14-15/14balenghien.pdf). So much so that calquing writing strictly or exclusively on spoken language could be a mistake, and may explain why some of the systems fail to attract enough interest from the native speakers themselves. So, it cannot be overemphasised that designing a writing system is not an exclusively phonological matter or an issue reserved for specialists in phonology. The following statement by Rosemary Osbourne may already provide food for thought in this regard:

Aoccdrnig to a rscheearch at Cmabrigde Uinervtisy, it deosn’t mttaer in waht oredr the ltteers in a wrod are, the olny iprmoetnt tihng is taht the frist and lsat ltteer be at the rghit pclae. The rset can be a total mses and you can sitll raed it wouthit a porbelm. Tihs is bcuseae the huamn mnid deos not raed ervey lteter by istlef, but the wrod as a wlohe. Amzanig huh? (http://finance.ninemsn.com.au/smallbusiness/salesandmarketing/8352535/proofreading-is-serious-business)

That anyone with a sound knowledge of written English should be able to capture the entire message of this text points to the fact that the effective reading of written language (much in the same way as the writing of spoken or signed language) is not an automated exercise, as is the case with unassisted machine translation. It is an intelligent (human) activity which, from observation, often requires more than physiological reflexes in response to specific stimuli of the senses. It generally also involves the use of memory and thinking where, for example, etymology and logic prove to be more relevant. Observably, the successful use of most writing systems requires all three strategies: reflexes, memory and thinking; some (e.g. English and French), however, tend to be more memory-based than others (e.g. Latin, Spanish, Kiswahili). Nonetheless, any successful reading initially requires a well-informed identification of codified segmental signals (based on agreed orthographic norms involving individual signs or combinations thereof) and of what they correspond to in the reality of discourse. Ultimately, however, readers may only succeed in decoding what is written if they can contextually add the relevant “cognitive complements” (as suggested by the disciples of the Interpretative Theory of Translation) to what is explicitly written. The onus is therefore on orthography designers to strike an acceptable balance between what would acceptably (i.e. with due consideration for target users’ cognitive exposure) be made explicit or implicit in the writing system. By today’s standards, all things being equal, designers should strive to devise writing systems which are less memory-dependent. However, no matter how efficient a writing system is in this regard, there will always be a need for cognitive complements when it comes to reading.
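For the curious, the scrambling at work in the passage quoted above is trivial to reproduce. The following toy Python sketch shuffles the interior letters of each purely alphabetic word while keeping the first and last letters in place:

    import random

    def scramble_word(word: str) -> str:
        # Keep first and last letters; shuffle everything in between.
        if len(word) <= 3 or not word.isalpha():
            return word
        middle = list(word[1:-1])
        random.shuffle(middle)
        return word[0] + "".join(middle) + word[-1]

    sentence = "Reading scrambled interior letters is surprisingly easy"
    print(" ".join(scramble_word(w) for w in sentence.split()))

That such mechanically mangled text remains readable is, precisely, an argument that fluent reading leans on context and word shape rather than on letter-by-letter decoding.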

Anyhow, even where there is agreement on script convergence, insistence on phonological tone marking and the use of abundant diacritics in orthography is also questionable in many respects (cf. Lepsius’ 1863 Standard Alphabet for Reducing Unwritten Languages and Foreign Graphic Systems to a Uniform Orthography in European Letters). Technological difficulties aside, this is not only because tones – which may be lexical, grammatical or syntactic – and other prosodic features of language discourse have hardly ever been systematically marked in writing, but, most importantly, because some tones are relatively unstable (cf. Bird in “When Marking Tone Reduces Fluency…”), and so the marking thereof is bound to be unstable too. Or, to put it perhaps more appropriately, the marking of tones is bound to be unpredictable. As would generally be observed, unlike the accented characters on many European language keyboards, no tone-marked orthographic vowel or consonant sign for endogenous African languages appears on the keyboard. The marking is left to the writer, who must be able to ascertain the relevance of each of the conventionally approved tone marks depending on context.

Besides, in a rapid observation of some native Mbafung speakers reading their home language texts with tone marks, it was noticed that the native readers hardly paid attention to the tone marks in their effort to make sense of what they were reading. (A 1968 article titled “Transcription de langues africaines” [“Transcription of African languages”] actually stressed the relative rather than indispensable nature of tone marks in the orthographical representation of most of the African languages studied, even though an examination of the tones was seen as unavoidable from a pedagogical (i.e. learning and teaching) perspective: see pp. 229-233.) Nor were beginners, including native children of nursery school age and average writers, always aware of how such tones should correctly be marked. This may partly explain why African language writings with tone marks are far less visible in electronic media, especially on social writing networks, than those without tone marks. In any case, there is a tendency among ordinary users to drop tone marks, as well as other divergent/special characters, when they resolve to write on social networks (see, e.g., the case of Yoruba at https://www.facebook.com/pages/ORIKI-ILE-YORUBA/140433425980640).
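The behaviour of those users can be modelled quite directly. In the Python sketch below, only combining tone diacritics are stripped, while segmental marks such as the Yoruba dot below are retained; the particular set of tone marks chosen is an illustrative assumption:

    import unicodedata

    # Combining marks commonly used as tone marks: grave, acute,
    # circumflex, macron, caron (an illustrative selection).
    TONE_MARKS = {"\u0300", "\u0301", "\u0302", "\u0304", "\u030C"}

    def drop_tone_marks(text: str) -> str:
        decomposed = unicodedata.normalize("NFD", text)
        kept = "".join(c for c in decomposed if c not in TONE_MARKS)
        return unicodedata.normalize("NFC", kept)

    print(drop_tone_marks("Àwọn ọ̀rọ̀"))  # -> "Awọn ọrọ": tones gone, dots kept

That ordinary writers converge spontaneously on something like this output, with no loss of mutual intelligibility, is itself evidence for the argument being made here.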

Furthermore, there is abundant evidence that, among the non-African linguists who support systematic or partial tone marking in orthography, very few actually achieve native-like fluency and intelligibility when they read their own transcriptions of texts in endogenous African languages, although they may have been studying and describing these languages for decades. Here, as it seems, it is actually not just the entire word that guides reading/writing and sense construction. It is also a matter of whether the reader actually has the right aptitude and attitude for reading written material in the specified language. There is the entire utterance context too, whereby lexical identity and morphology, collocational patterns, morphosyntax, etc. play a determining role in disambiguating otherwise confusable (lexical) items, as would be the case with homographs in all written languages. The following statement is obviously in line with this argument:

“Various people have said that ambiguity is a problem for communication,” says Ted Gibson, an MIT professor of cognitive science and senior author of a paper describing the research to appear in the journal Cognition. “But once we understand that context disambiguates, then ambiguity is not a problem — it’s something you can take advantage of, because you can reuse easy [words] in different contexts over and over again.” (http://www.physorg.com/news/2012-01-cognitive-scientists-problem-human-language.html)

In this regard, the importance of tone marks in the validation of orthographic writing in endogenous African languages has seemingly been blown out of proportion. In the process, there has not been enough emphasis on the realisation that although tone (like most other prosodic elements of discourse) is an indispensable feature of the acquisition, learning and use of most endogenous African languages, it is not always equally indispensable when it comes to putting discourse in writing. Whatever the case, the non-competitive nature of socio-contextually divergent writing systems, due to the use of tone marks and other phonetic symbols that are absent from the ordinary keyboards of the country’s official language(s), will likely continue to reinforce the current marginalisation of African languages in most writing spheres in general and in written translation in particular. What all proponents of tone marks in the writing of African languages fail to say is that tone marking does not eliminate the hundreds of homophones (and, by implication, homographs) in African languages. Yet these homophones do not per se constitute an obstacle to spoken or written discourse interactions in these languages. Needless to stress, most, if not all, of the best known and most widespread languages comprise a large number of homophones and homographs whose distinction depends solely on discourse context.

That said, it is sincerely hoped that the central argument of this paper regarding the competitive advantage (notably sociolinguistic, translational, technological, ergonomic, economic, etc.) of convergent scripts in writing both endogenous and exogenous African languages will soon be defeated by a demonstration of the contrary: i.e. the competitive advantage of divergent scripts within the same sociolinguistic context.

Anyhow, advocacy for the use of the same characters as those available on the dominant official language keyboard(s) to write endogenous African languages should not be mistaken for advocacy of adopting, at all costs, the writing system of the dominant, exogenous, official language(s). English and French, for example, are two of the main official languages of the continent. These languages undoubtedly stand out as two of the most widely used written languages in modern and contemporary history. But their writing systems are reputed to be rather inconsistent with regard to current advances in phonology and orthography (http://www.lexpress.fr/culture/livre/une-reforme-ambitieuse-de-la-langue-francaise-est-necessaire-a-sa-survie_1107592.html). Besides, it is common knowledge that languages may have exactly the same set of basic scriptural symbols (or alphabet, for the case in point) but differ significantly in the use of those symbols in their respective writing systems. In designing Latin-based writing systems for African languages, it would therefore be necessary to use individual items of the Latin alphabet, or combinations thereof, in a way that is both phonologically and orthographically consistent throughout each endogenous African language writing system.

A forthcoming paper will return to these issues, with concrete illustrations of how typically phonetic symbols that are absent from the French AZERTY or English QWERTY keyboards have been, or can reasonably be, translated into typically Latin signs, in strict conformity with the relevant official language script on the one hand and the internal requirements of endogenous African languages on the other.

In any case, it seems fair to end this paper with the following reflection from someone who may very well be seen as a role model in writing and translation in endogenous African languages within the context of dominant exogenous official languages. This is what the great African scholar Ngũgĩ wa Thiong’o has to say:

What African languages need is power sharing with English, French, Afrikaans or any other official languages. It is not too much to ask that demonstration of competence in at least one African language be made a condition for promotion. I don’t see why anybody should be allowed to stand for councils and parliament without showing a certified competence in an African language. Corporations can also help in attaching competence in an African language as an added value to the other conditions for hire and promotion. English, Afrikaans, French newspapers should also lead the way in this, for a reporter who also has one or more languages of the country they serve is surely a much better informed journalist. It should be a national effort. The struggle to right the imbalance of power between languages should be national with belief and passion behind it. The education system should reflect that commitment and I don’t see why a knowledge of one or more African languages should not be a requirement at all levels of graduation from primary to colleges. And finally, we have to stop the madness of promoting African writing on condition that participants write in European languages. Can anybody think of giving money to promote French literature on condition that they write it in isiZulu? African languages are equally legitimate as tools for creative imagination and in South Africa, there is the testimony of the great tradition of Rubisana, Mqhayi, Dhlomo, Vilakazi, Mofolo and Mazisi Kunene. In translation, Mofolo’s Chaka, written in Sesotho, made a big impact on the work of such greats as Senghor and other African writers. (http://bookslive.co.za/blog/2012/06/25/speaking-my-language-ngugi-wa-thiongos-address-at-the-2012-sunday-times-literary-awards/)

This is food for thought for African decision and policy makers, as well as for all Africans who earn a living as professional translators, interpreters or writers. It is time for each African professional translator, interpreter or writer worthy of the name to ensure that they can also work in their very own endogenous African language(s), namely the language(s) they fondly consider their mother tongue. That should be their own contribution to the genuine development of their profession locally. Strategically too, there is a need for translator trainers and African publishers to resolutely invest in African language translation and publishing respectively (cf. http://www.h-net.org/reviews/showrev.php?id=4359; http://www.jahn-bibliothek.ifeas.uni-mainz.de/227_ENG_HTML.php). All should be encouraged to do so with incentives from national and continental governments or organisations, in tune with the Asmara Declaration on African Languages and Literatures (http://www.africa.upenn.edu/Govern_Political/asmrlit.html). Similarly, with respect to the much-desired power sharing between endogenous and exogenous languages of Africa, the time has probably come for bodies like UNESCO, ISO and the World Trade Organisation, together with the African Union and other relevant organs, to ensure the protection of all languages, not only from extinction but, most importantly, from predation by colonial and imperial initiatives. In the process, it is imperative to make sure that keyboard manufacturing norms (e.g. QWERTY and AZERTY) are systematically reviewed to take into account the divergent writing requirements of partner languages in Africa, where such requirements prove to be fully justified. There is a need also for computer operating systems to break with the current tendency to favour major-language monopoly, that is, by factoring in the possibility of adding user languages freely to the initially installed languages, without interfering with the latter (some smart technologies already offer this possibility to users). Without such initiatives, writing in endogenous African languages is likely to remain on the periphery for a very long time yet, with the resulting attrition and death of many languages in the process. There is absolutely no reason to think that any politically correct individual or organisation would want to be an instigator of, or accomplice in, the programmed death or dwarfing of any endogenous African language to the benefit of exogenous ones.

See also Tsat mbäfeung
