By Candice Louisa Daquin
Language. Children learn it before they understand its importance. Adults can struggle to learn additional languages because the brain becomes less plastic as it ages. The formulation of language is a key component of what makes humans, human. Our language (though not our communication) is speculated to be among the most complicated and rule-bound of any living creature's. Even if that is disproved one day, when we are able to translate other animals' languages, we can all agree that the impact of language on us is invaluable.
Why then are our language skills diminishing? Even as each of us benefits from being able to look up information online, and thus know a little bit about a lot of subjects, we have simultaneously reduced our linguistic breadth. We are increasingly tempted to take shortcuts, both in writing (texts and emails) and in how we speak to each other. It may be tempting to blame this on social media, but it's not that simple. This is not new: throughout time, enormous value has been placed on 'banter', 'ridicule', 'sarcasm' and pithy retorts.
Perhaps people who can summon lightning-fast repartee are considered witty, nimble-minded, fashionable. Contemplate those who have been considered 'cool' socially. Those who had the quick response, the short soundbite that cut to the chase or was easily repeatable, were often admired. Just recently, Ukrainian President Volodymyr Zelenskyy, responding to an American offer of a flight to get him and his family out of harm's way, reportedly replied: "I need ammunition, not a ride."
People applauded his response because in many ways it describes the crux of what Zelenskyy and his country are going through, and portrays him as a brave leader. Think of Clint Eastwood, as Dirty Harry, delivering the much-emulated line "Go ahead, make my day", or Arnold Schwarzenegger popularising "hasta la vista, baby" in the Terminator series, alongside other iconic statements by film stars, celebrities, politicians and authors. It's the admiration of cutting to the chase, emulated by millions, as a means of extracting the essence of what we're trying to say, or merely being glib. We utilise quotes now more than ever, through the medium of social media, which makes it easy. The only question is: when we scroll through quotes and demand ever-increasing variety in our lives, are we really absorbing the meaning behind the soundbite, or merely parroting it?
There is a history behind the proverb, the quote and the parable. They were a means of remembering wisdom easily, for those who might have been illiterate, or before books were widely available. Along with songs, this was a method of retaining what was not written. Religion has long employed easy-to-remember choruses and proverbs, and it has long been human nature to reflect on life through such sayings. In the 14th century, the popular proverb "He who sups with the Devil should have a long spoon" and sayings like it were a means of social control, or wisdom, depending. A quote attributed to Aesop (of the famous fables) runs: "We hang the petty thieves and appoint the great ones to public office." We can admire the truism of this, just as we admire the bravado of a film hero saying "make my day." Perhaps Confucius said it best: "They must often change who would be constant in happiness or wisdom." And quotes are a ready means of growth that don't require the commitment of reading, say, The Four Books and Five Classics of Confucianism.
But when do quotes stop being educational and become inadequate shortcuts to knowledge? Where I live, people rarely read whole books anymore. They excuse this by saying they are 'too busy' to read. I work a 60-hour week and I read. I can understand emergency-room doctors and new parents not having time to read, but regular folk? You can tell how much someone reads by asking them what their favourite books are. If they quote more than one high-school book, chances are they haven't read much since high school. Reading is an evolutionary experience, and we grow as we read. Many people have forsaken television (which isn't a bad thing) but also the genre of fiction, believing it has no worth. Are we 'better' for reading less fiction, for reading fewer complete books and more online news channels, texts and memes? Do we lose something? And if so, what do we lose when we absorb language in this different, shortened way?
When was the last time you sat down and had a really in-depth conversation that wasn't about something as serious as a parent's dementia or a breast cancer diagnosis, but was simply an analysis of a book you read, a play you went to see, a film, or a discussion of politics or history or psychology? Granted, some of us may never have sought to do this, and that's okay, but among those who used to analyse, it seems analysis is less mainstream and now confined to one's field of work. Fewer people sit on Sunday afternoons and read the paper from front to back. More scroll quickly, gleaning the basic amuse-bouche but nothing of substance. If you are a marketing analyst, you analyse market trends. If you work in the financial sector, you may analyse financial impact. If you are an economist, you may consider economic development. Because most of us work such long hours, do we really have the time, energy (or desire) to analyse things we don't have to?
It could be a sign of the times, of modernity, cultural shifts, progress even, that we don't need to delve as deeply. We are no longer obliged to pore over piles of textbooks to handwrite a paper. And some of that progress facilitates other knowledge, such as an ability to navigate the web and beyond: to understand HTML, design websites and publish books all by ourselves, things that formerly would have seemed impossible. The scope of what we can do with technology has expanded our choices, recreationally and professionally. Kids are creating entire music albums in their bedrooms with affordable equipment, people are making whole films on their iPhones, others earn a living filming themselves for social media platforms, models are made by Instagram photos; we have all become graphic designers and editors of our own stories.
On the flip side, jobs that once promised a living wage, such as graphic design, photography, editing, translation and music production, are being replaced by cheaper options. On platforms like Fiverr, someone will design your tattoo, a corporate logo, a book cover, anything you desire, for a fraction of the price a professional would charge, because they are borderless, not beholden to the rules of old, and could well be a 16-year-old practising graphic design from their bedroom. On the face of it, there is nothing wrong with that. If someone in Indonesia wants to offer those services for a fraction of the price a US supplier would charge, and undercuts them in the process, isn't that just the consequence of a free marketplace? If that Indonesian supplier wishes to charge what for them is a relatively fair wage, but what in the US market would be considered a pauper's wage, is that exploitative of the Indonesian? Or must we accept an individual's choice to make that decision for themselves?
Should we question what 'choice' means if there is effectively less choice when someone offers services for less? What is equitable? Ethical? With technology, half the time we don't know the country, nationality or details of the people we work with, and it becomes less important. The only reason it matters is if someone is working from Nepal for $3 an hour whilst someone else is working from Australia for $25 an hour. Yet if everyone is aware of that inequity but agrees nonetheless, who are we to complain? Cost of living varies, and perhaps charging commensurate with your cost of living is a more realistic model than across-the-board flat rates. How much has changed since the internet opened borders and countries to a greater freedom of commerce and services than ever before? As with anything, there is exploitation and there is improvement, and there is no one simple answer to ensuring everything is fair, or that nobody is exploited.
How does this relate to language? Or a series of retorts? It comes down to shifting social mores and expectations, and with these comes a modification of language, much like that in science fiction novels of the 1960s, where a homogenised world watered down uniqueness in favour of uniformity and created a melting pot in which language, among other things, was diluted for simplicity's sake. Interestingly, science fiction also created entirely new languages (a conlang is a constructed, artificial language) and worlds, so one could say it added to the future as much as it subtracted from it. A posteriori languages (those borrowing words and grammar from existing languages) have long been commonplace, from Spanglish to Yiddish in America.
In the article 'SMS & Its Negative Effects on Language' (www.itstillworks.com), the authors note the mass use of shortened 'slang' language employed by societal groups, not least teenagers. Such habits have been speculated to carry over into reduced grammatical and spelling abilities, although conversely, it could be argued that if teens are writing more (even badly), it encourages those who formerly may not have written at all. If you consider how much your social skills have deteriorated since the pandemic, because of lockdown and less face-to-face socialising, is it so hard to believe persistent use of abbreviations and icons could replace language fluency? In the article 'Alienating the Audience: How Abbreviations Hamper Scientific Communication' (www.psychologicalscience.org), the authors point to a loss of deeper meaning when constant acronyms are employed.
In The Times of India article 'Shortening language has negative implications', the authors point out that misuse of technology (always being 'on' and responsive to it) ironically reduces efficiency, and that 'infomania' can cause information overload. Being 'dumbed down' by technology and linguistic abbreviation could reduce the sharpness of our knowledge. Conversely, The Atlantic argues in its article 'Teens Aren't Ruining Language' that while 'fad' words may have a different trajectory online, they don't have the power to 'debase' linguistic standards: "How much a person's vernacular changes over time may have as much to do with personality and social standing as it has to do with age. The extent to which teenagers are credited with (or blamed for) driving lasting change to language … is grossly overstated."
Whatever language we speak, we may be aware of this shift away from seeking depth: not only reading fewer complete works but expecting a synopsis instead. How does this affect conversations? Social interactions? What do we value and consider 'worth', and what does the shift from meaning to soundbite imply? It is good to be able to Google everything and understand things we might never have looked up if the internet did not exist. But simultaneously, we're aware that what's online isn't always factual so much as a series of compiled opinions. If history is written by the victor, doesn't it stand to reason that what we assume is 'fact' shifts dramatically? What people in China right now are exposed to as 'incontrovertible truth' is not the same as what people in Russia are being exposed to, or America, or India. With so much variation in what represents pure truth, shouldn't we worry about that? Or are we imagining a past where truth existed in a purer state than it ever did?
Journalists used to have to write objectively about their subjects, unless they were writing opinion pieces or columns. Even with the latter, there was a responsibility to report news from both sides, and not let personal bias taint the reporting or the information being presented. When you read news articles now, if you step back and try to put your own political views aside, what do you think of the reporting? Is it objective? Take any side you want, but what you'll find is often blatant or subtle bias, invested in projecting one perspective as the 'only truth.' Even the most basic Wikipedia definition of journalistic objectivity states: "To maintain objectivity in journalism, journalists should present the facts whether or not they like or agree with those facts. Objective reporting is meant to portray issues and events in a neutral and unbiased manner, regardless of the writer's opinion or personal beliefs." Why then is this not demanded?
I admire journalists who can step outside their own views and write on a subject without that natural bias. Non-journalists are biased and look to journalism as a fair reporter of facts, from which we can make up our own minds. Reporting shouldn't be an opportunity to tear one side down to promote another. In America, the backlash against Donald Trump was perhaps the greatest witnessed in this country, because of the deep divide among voters and the horror many felt towards him. He was considered dangerous for the country, and irrespective of whether that was true, the majority of news outlets were 100 percent against him. When I brought this up, I was told I was obviously a racist who supported Donald Trump, otherwise why would I even care? This missed the point. I cared not because I wanted to defend Trump, but because I felt objective, rational, unbiased news reporting had been completely eroded.
It's more important to me that we retain that objectivity, even in the face of things we may personally revile. A journalist who is unable to be objective forfeits the right to condemn another, because they are not utilising that objectivity in their analysis. Maybe we cannot expect regular, everyday people to be free of bias, but when the moral underpinning of your job requires it, you owe it to your readership not to pander to their outrage and stir the pot, but to present an objective overview. The same is true of social media 'conversation', where a subject is presented and people sound off, often becoming offensive, outrageous and exceeding the bounds expected if we were all sitting in a room together. The anonymity afforded by a screen and physical distance seems to have opened a Pandora's box of horrors.
People can be unrecognisably offensive in their attacks on others, for no discernible reason. It should be possible to discuss any subject without people devolving into personal attack and ad hominem. Has our use of language also been altered by our anonymity online? The oft-disputed Sapir–Whorf hypothesis states that the language we speak influences the way we think. This verges on potential fascism if it judges one language superior to another, but the point is taken. The complexity of language has been linked to higher IQs, which some suggest may explain why Finnish and Japanese students tend to rank among the top ten academically, their languages being among the most complex. Likewise, polyglots, who speak multiple languages, and those who are musical (often considered a language) or mathematical (likewise) tend to possess higher IQs. The problem is that we will never know what other influencing factors play a part, including nature and nurture.
This paradox between cancel culture, which looks to demote those who say things deemed offensive, and the increasingly offensive backlash and gaslighting of others, seems to point to the hypocrisy of 'free speech.' It's only free if the right person is saying the right thing; otherwise you're liable to be cancelled, but watch out, because that's subject to social fashions that vary wildly. Beyond that hypocrisy, what of the actual quality of communication? Have we devolved to the point of only being able to say what we really think anonymously? Why do people write reams online but say little in a real-life group? Are we hiding behind rules and etiquette that only create sub-groups with no tolerance for other groups? Does groupthink ever produce anything other than subjective thinking?
If aliens came to Earth, they might notice humans seem to admire ridicule and socially sanctioned sarcasm, calling it wit, without needing to have a deeper conversation. Twitter represents this phenomenon most acutely: because the length of a post is restricted, you have to encapsulate what you want to convey by honing it to that word count. That's great if you're delivering a highly edited statement, but how naturalistic is it as a form of legitimate communication? Does it replace your grandmother sitting at the kitchen table with you for an hour? Maybe it doesn't have to, but do we still sit at the kitchen table and have those conversations? And if we don't, will that affect what human communication is evolving (or devolving) into?
Sometimes flim-flam is appealing, in its starry simplicity. We embrace digi-fiction, written and read on computers, changing how we process fiction. We can be attracted to a code-switch of language that takes the guesswork out of communication. For neurodiverse populations, and for people increasingly using actual code, such as HTML, as a form of language, it may seem a logical next step: an improvement on the guesswork of complex modes of communication that were challenging for so many. At the same time, when we lose the ability to read a book from beginning to end, we lose the patience and journey of that process, which, if not replaced, may be a genuine loss we cannot even fathom. Then again, in the spirit of all possibilities, it could be that we leave behind what is not necessary and embrace a haiku perspective of saying more with less.
I confess, when I read a 'classic' novel I am aware of how much superfluous information it contains. It is interesting to consider how much language we once used to say one thing, compared to now. The medium of social media means we're busier than ever and take our 'fix' of what attracts us (quickly) before signing off. Long poems therefore hold less attraction than shorter memes. We fixate on the easily presented, the humorous and the immediate. Nuance, subtlety, slow burns: those are almost luxuries we may leave for rare nights in the bath. Novels are changing to adjust to this phenomenon. Graphic novels are gaining further traction, even songs. Our entire social fabric has altered, and in some ways this was inevitable, if you recall that we have always admired the witty, fast retort, going back to Marie Antoinette, Gloria Swanson and beyond.
Does this mean all language must conform to this new rigour? Or will pockets of devotees to other forms of communication endure? When I browse through bookstores, I notice there are many styles of writing, including the long-winded and the easy read. My fear is not that we read 'easier' books but that we stop reading altogether, believing scrolling on our phones compensates for the discipline of reading a book. One may argue: do we need discipline? But learning invariably requires discipline, and part of honing rigorous learning habits is being challenged. We can do crosswords, play chess and sudoku, and go to the gym to maintain a healthy body and mind, but the 'imaginarium' of fiction and the need for creative expression are, for many of us, equally necessary.
Fiction isn't a waste of time simply because it isn't literal. Fiction, as a speculative field, has inspired science, politics, social advancement and a sense of possibility. Fiction can thrill, entice, or simply entertain. Not all forms of entertainment are equal. Whilst I confess I do watch television, I recognise the lasting value of a book compared to a serial, because it requires more of us mentally. For highly disciplined souls, there may be no need to 'indulge' in fiction or television; maybe reading Scientific American or poring over The Financial Times or Anglers Digest will be their choice. But language has a trickle-down effect, and you can guarantee it will eventually permeate all sectors of our lives.
Do we want to completely dilute the value of further explanation, detail and depth in favour of the glamorous soundbite? Or is it possible to harness the value of succinct communication and retain the continued relevance of detail? When I read what passes for scientific news in popular media, it concerns me that we are picking and choosing for 'clickbait' purposes, and this leads to the proliferation of inaccuracy. Case in point, the startling headline: 'The Epstein-Barr virus (EBV) causes MS!' Then, in the meat of the article, we find that some 95 percent of humans carry EBV, whilst only about 35.9 per 100,000 people (95% CI: 35.87–35.95) have MS. We are leaving out the most important connections in favour of scaremongering journalism, which only serves to increase (inaccurate) neuroticism when it should seek to educate and elucidate.
It’s not that too much information is bad for us, it’s that too much incomplete information can distract us from truth, and we may learn to gloss over what matters in favour of what shines brightest. Sometimes it is necessary to finish the chapter.
Candice Louisa Daquin is a psychotherapist and editor who has worked in Europe, Canada and the USA. Daquin's own work is also widely published; she has written five books of poetry, the most recent, Pinch the Lock, published by Finishing Line Press. Her website is www.thefeatheredsleep.com.
PLEASE NOTE: ARTICLES CAN ONLY BE REPRODUCED IN OTHER SITES WITH DUE ACKNOWLEDGEMENT TO BORDERLESS JOURNAL.