The Observant Immigrant

Climate Change: Are You for Real? 

By Candice Louisa Daquin

In childhood I recall getting my coveted membership to Save Our Seas. I loved the sea and marine animals, and this seemed a meaningful way of helping from a child’s perspective. I recall reading Rachel Carson’s famous books Silent Spring and The Sea Around Us at the same age and wondering how books written decades earlier could be so prescient, and why the subject was still under debate. If a ten-year-old could understand Carson’s message, that the indiscriminate application of agricultural chemicals, pesticides, and other modern chemicals was polluting waterways, damaging wildlife populations and causing health problems for humans, then why not adults?

It’s easy for a child’s mind to ask those simple questions, not understanding the intricacies of what’s at play: not least politics, big business and money. These, more than anything, have dictated international policy, and it’s not science that sways policy, it’s influence. Vandana Shiva, an Indian environmentalist, is another example of a prescient activist whose truth has been stifled in the march toward profit. Shiva, both physicist and social activist, founded the Research Foundation for Science, Technology, and Natural Resource Policy, devoted to developing sustainable methods of agriculture. Shiva contends: “Justice and sustainability both demand that we do not use more resources than we need. Uniformity is not nature’s way; diversity is nature’s way. We are either going to have a future where women lead the way to make peace with the Earth or we are not going to have a human future at all.”

Sadly, Shiva’s work is less well known than that of companies like Monsanto, which is responsible for mass destruction after putting profits before conscience in selling GMO[1] seeds, causing widespread bankruptcy, suicides and irreversible environmental damage. In 1995, the United States Environmental Protection Agency (EPA) listed Monsanto among the top five corporations dumping toxic waste, recording it as releasing nearly 37 million tons of toxic waste through air, water, and land. It is unfathomable why such blatant atrocities should be permitted, but our global history is littered with them.

Scientists have warned us since the 1800s, when experiments suggesting that human-produced carbon dioxide (CO2) and other gases could collect in the atmosphere and insulate Earth (or do the reverse) were met with more curiosity than concern. By the late 1950s, CO2 readings would offer some of the first data to corroborate the global warming theory. The conclusion: it’s not if, but when, climate change will alter the way humans experience life on this planet, let alone wildlife and nature.

At one extreme we have the eco-warrior, who has valiantly campaigned and actively fought against human encroachment; in the middle, the skeptic, who points to fluctuating weather patterns going back millennia; and at the other extreme, the climate deniers, who despite having children seem unconcerned about the earth those children will inherit. The debate itself is ancient: “Dating back to the ancient Greeks, many people had proposed that humans could change temperatures and influence rainfall by chopping down trees, plowing fields or irrigating a desert.”

If I sound biased, it is because it’s a generally accepted fact that the earth isn’t just heating up, it is changing. The only issues under debate now are who or what is responsible, if anyone, and how long we have before things get really bad. Twenty years ago, people still mulled over whether climate change was happening, many believing it was just cyclical, and sometimes it was, but there have been enough giant seismic changes in the last 40 years to put that doubt to rest. “Scientists have pieced together a record of the earth’s climate by analysing a number of indirect measures of climate, such as ice cores, tree rings, glacier lengths, pollen remains, and ocean sediments, and by studying changes in the earth’s orbit around the sun. This record shows that the climate varies naturally over a wide range of time scales, but this variability does not explain the observed warming since the 1950s. Rather, it is extremely likely (> 95%) that human activities have been the dominant cause of that warming.”

Now if you turn on the TV, the nightly news is as much about weather as anything else. Weather dominates our lives more than ever. Perhaps it’s ironic that ancient man lived or died by the weather and we are now doing the same. The heyday of calm weather may be slightly exaggerated in memory, but most people over 50 can attest that things weren’t quite as dramatic all the time, every year, as they appear to be now.

The impact of our polluting behaviour has accelerated, something that may have been inevitable but could possibly have been avoided. The hardest part is that ‘developed’ countries such as America and those of Europe asked ‘developing’ countries to reduce their carbon and other emissions without really reflecting that they were as, if not more, guilty, relatively speaking, before they ‘saw the light’. To ask developing countries to leapfrog ahead in their development for the sake of the environment comes from a position of privilege, having already polluted the world first.

On the other hand, developing countries may struggle to reduce emissions because they are gaining traction in improving quality of life for most of their population but do not yet have the luxury of reducing emissions. It takes a lot of money, effort, commitment and determination, and for a country trying to improve its citizens’ lot this isn’t always the first priority, not to mention the patronising tone of developed countries demanding it be done. It is important to see this relationally, which means understanding the differences in countries’ development, and that some of those countries were abused, depleted of resources and kept ‘poor’ by conquering overlords who reaped the benefits. Those countries will struggle to climb out of the post-colonial model, and that should be considered before judging them.

But we don’t have time. Despite knowing this beforehand, we did not do enough. In the late 1800s, the Swedish chemist Svante Arrhenius [1859-1927] wondered if decreasing levels of CO2 in the atmosphere might cool Earth. To explain earth’s ice ages, he considered whether decreased volcanic activity could lower CO2 levels globally. His calculations suggested that if CO2 levels were halved, global temperatures could decrease by about 9 degrees Fahrenheit. From this, Arrhenius investigated whether the reverse was also true: what would occur if CO2 levels doubled? His results suggested global temperatures would increase by the same amount.

By the 1980s, global temperatures were rising rapidly. Many climate experts cite 1988 as a critical turning point, when extreme weather and increased public interest placed global warming in the spotlight. Scientists, the UN and many others warned we were heading toward a point of no return.

Turn on the news today and we seem to be there.

Even if we did everything right as a planet from now on, it would still be too late to repair the biggest climate change consequences. That doesn’t mean we shouldn’t try but it’s alarming to imagine we’ve let it become too late, though not surprising when you consider the apathy of world leaders to come together and make this happen.

UN Secretary-General António Guterres’ remarks to the General Assembly in March 2022 illustrate this: “Just last week, the Intergovernmental Panel on Climate Change issued an alarming report that showed climate impacts are already devastating every region of the world, but particularly developing countries and small island States. The session considered the irreversible impacts of the climate crisis, which could render some parts of the earth uninhabitable.” What does it mean for us, and for future generations, if there are to be any? It means things we took for granted will change. Just as more animals are going extinct than ever before, we must also look to history for an idea of what we might face in the near future.

Consider how many times there have been huge seismic shifts in the earth. One example in particular is quite interesting. The Storegga Slide happened around 6200 BCE and generated the largest known paleo-tsunami to hit Europe. It altered the geography of Europe massively, helping to sever Britain from continental Europe: a huge section of the Norwegian continental shelf broke away, causing giant waves that poured over the fertile, low-lying land bridge that had connected Britain to Scandinavia and the continent, which, together with rising sea levels, was swallowed whole beneath the sea.

Climate deniers use these types of stories to explain away climate change as a natural phenomenon, but that’s inaccurate. Whilst significant and damaging events have occurred throughout history and will continue to, it’s the number of disasters and changes occurring that counts, as scientists warned: not that they happen, but that they happen with such regularity and severity. It’s been this hot before, but has it been this hot consistently, and throughout the world, for as long? I was born in a year with a freak heatwave, but that’s just it: it was a freak heatwave.

Such things are natural in nature, but not if they are progressive or keep happening one after the other. People assume that if there is a cold winter then climate change can’t be real, but that’s the funny thing: it’s the extremes of weather, as much as heat, that are indicative of climate change. For every extraordinarily hot summer and burning Hawaii, there are extreme weather events in winter too, as the planet falls out of a healthy cycle and slowly loses its ability to nurture life as it used to.

Does it mean we will become extinct? Or just that life will become harder and fewer places habitable? And hasn’t that happened before? Well, it has, insofar as Africa was once a grassland without drought and Europe was covered in ice. But when a planet first forms, it is likely to have extreme weather. For as long as humans have been churning out chemicals that pollute the seas and mining the earth for its ore, we’ve accelerated and exacerbated those disasters. And just as it is believed a meteor killed off the dinosaurs and a virus might have killed off the Neanderthals, it’s possible our actions will hasten our demise, or at the very least make life more unbearable.

How? Along with viruses being more prevalent than before and rising antibiotic resistance, UV exposure and radiation have increased. The average human has more chemicals, formaldehyde and plastics in their body than at any other point in history. It affects our health, our reproduction and our longevity. Cancers hit the young more than ever before. We’re either over-medicated or unable to afford medication. If global temperatures rose by 11 or 12 degrees, more than half the world’s population, as distributed today, would die of direct heat. The disparity between the ‘haves’ and the ‘have nots’ has widened, as in the feudal past. The idea that we’re all middle class is a myth borne of ownership of technology rather than quality of life, which for many working two or three jobs to sustain their lifestyle is hardly enviable.

The world is heading for a collision, and we are propagating this with a lifestyle we don’t seem capable of changing. We label those who care about the environment eco-terrorists and pay football players millions whilst leaving nurses and teachers underfunded. Our priorities must be reflected in these things if they are to have a trickledown effect in the future. If we can’t educate our children to understand that saving the planet isn’t just one day a year or a whim but must be a full-time effort, then what hope does the future possess?

ActNow is the United Nations campaign to inspire people to act for the Sustainable Development Goals. Many organisations like it fight misinformation and seek to actuate these goals, but they’re often drowned out by lobbyists for special interests, such as the car, gas, fossil fuel and nuclear industries.

Just as in the fight against cancer, we need science to lead the way: science that is not influenced by special interest groups, which in the case of cancer means big pharma and big business. We need to take profit out of research and make it objective rather than tied to business, so it can do what’s necessary unimpeded. With cancer research, profit has stymied progress and stalled meaningful change; instead, people believe cancer is being cured by pharmaceutical promises, whilst more people than ever are getting cancer. The same applies to climate change: if we don’t do the research into sustainable alternatives and ways to live into the future, there may be no future worth living for.

All hope is not lost, of course. We always find ways; maybe one of them will be to go off-world, whilst another would be to live in Antarctica when it melts, provided the sea doesn’t swallow it. But what of the towns and cities by the coasts? What will they look like in 50 years? Maybe sooner. It is predicted that in my lifetime many of these places will be unliveable, beneath water, and whilst this has happened before, it hasn’t happened to this degree. Yes, Venice has always been sinking, and maybe NYC wasn’t built on the best land, but everywhere else? And what will the displaced do? And how is space travel possible without a healthy earth?

Those old enough can attest that the world seems to be burning, and statistics confirm there are more people than ever on a planet already strained for resources. We seem to be wasting more food, yet more people are hungry in certain pockets of the world. We are growing hotter in some parts, colder in others; heatwaves represent an increasing threat to cities in both the Northern and Southern hemispheres, and this is shifting agricultural production. Heatwaves are affecting colder countries too. A study states: “As illustrated by the example of Quebec, rising temperatures and heatwaves are an increasing hazard in countries of comparably cold climate as well as in warmer climates. According to a report published by UN Climate Change, higher temperatures due to climate change cause heatwaves which affect human health. For example, in Germany alone, the heatwave of 2003 resulted in nearly 7,000 deaths and many heat-related illnesses due to heat stroke, dehydration, and cardiovascular disease.”

Realistically, many places on the planet are harder to live in. Firstly, prices are pushed artificially high by unrelenting inflation while wages increase more slowly; people are often underemployed or expected to work longer hours for less pay, if you compare the cost of living 50 years ago to now against the increase in wages. On another level, people’s standards of living seem to improve in some areas, but again this is hard to gauge when you consider the divide between the very wealthy elite and the rest.

In America at least, displaced people flood through the borders and are hopefully given shelter, housing and opportunities, but are they really better off than in the places they have fled? In some circumstances, invariably; but others may earn more only to see that money swallowed by the higher cost of living, so they’re not really better off. It’s an illusion to consider America the land of the free, or the American Dream, with so many living below the poverty line or just above it. The difference is negligible when you consider that having slightly more money makes you ineligible for social assistance, so you end up as poor as, or poorer than, those who do qualify. This all relates to climate change, because what incentive do people who are struggling to survive have to help save the environment? Can you blame them? If we are to allocate blame, shouldn’t we blame those who perpetuate poverty and turn a blind eye to its outcomes? Like former colonial countries who, having raped the land, decry its poverty even though it is the direct result of such pillage? Haiti is a prime example.

Meanwhile, the war machine grinds on and we pour money into it, to the detriment of climate change. Climate change is left to summits, but little changes. Countries make pledges but few are actuated, and that’s without considering the lies that abound, or the environmental disasters that are hushed up but have caused immeasurable harm. In 2017, the US Air Force used USD$4.9 billion worth of fuel; that same year, the US military was responsible for 59 million tons of CO2, the same as the total emissions of industrialised countries like Switzerland or Sweden.

If we don’t even get the actual truth, how can we know the true extent of the damage and our real part in it? Think of the nuclear disasters. That said, it’s understandable that countries seeking to free themselves from fossil fuels would consider nuclear power, but how tenable is that when it depends upon people to function? What if those people were lost? Would the sites go critical and kill all survivors? Where do we safely store radioactive nuclear waste when it takes thousands of years to degrade even slightly? Just like those toxic super-dumping sites dotted throughout the planet, filling the seas with plastic and debris, we don’t think about the consequences of such dumping, only the immediacy of needing air conditioners.

Eventually fossil fuels will run out, but we haven’t found a tangible replacement. Electric car batteries don’t do well in heat, aren’t as durable over long distances, cost a lot in electricity (which still consumes resources) and are prohibitively expensive; solar and wind energy have their own drawbacks. It seems there are downsides to everything we’ve come up with so far, and whilst some progress is being made with the desalination of water to ensure clean drinking supplies and the terraforming of previously uninhabitable land, is it enough to ward off the inevitable? Or does it mean those who are already rich will be somewhat protected from the first consequences of planet earth’s deteriorating climate, whilst those without will be the first to pay the price?

We’ve had so many canary-in-the-coal-mine warnings long before now that none of this is news, but people en masse still prefer not to think of it. When polled, voters in America usually do not put climate change in their top five concerns. In the last few years this has changed, and that might signal a positive shift toward taking climate change seriously, but it’s late. Things can be done to shore up some of the fragile resources, but it will take a sustained commitment. How can that happen if the majority of politicians are more focused on power and money than on true change, renewable energy that works, and a consensus that if we do nothing, we have only ourselves to blame? We have to change politics, policies and education if we hope to have a meaningful long-term impact.

If we replace jobs with AI and technology, as we are doing, how will people afford to improve their lives and make significant change? Everything is interconnected, and it all matters, but we have to care, and being distracted by technology and superstars isn’t the answer. Why can’t an eco-warrior be a hero as much as a basketball player? We must keep trying. As Dr. Vandana Shiva says: “I do not allow myself to be overcome by hopelessness, no matter how tough the situation. I believe that if you just do your little bit without thinking of the bigness of what you stand against, if you turn to the enlargement of your own capacities, just that itself creates new potential.”

[1] Genetically Modified Organism


Candice Louisa Daquin is a psychotherapist and editor, having worked in Europe, Canada and the USA. Daquin’s own work is published widely; she has written five books of poetry, the latest, Pinch the Lock, published by Finishing Line Press. Her website is www.thefeatheredsleep.com.

.

PLEASE NOTE: ARTICLES CAN ONLY BE REPRODUCED IN OTHER SITES WITH DUE ACKNOWLEDGEMENT TO BORDERLESS JOURNAL


The Observant Immigrant

The Immigrant’s Dilemma

By Candice L. Daquin


I have been an immigrant to a new country three times: from France to England, England to Canada and then, Canada to America. Being an immigrant is often a highly positive experience. We may have greater opportunities, we seek our dreams, we grow them. On the other hand, immigration for those of us who have gone through the process, is not easy. It is expensive, time-consuming, nail-biting and often lonely. It is said that those who immigrate ‘successfully’ do so because of familial support and/or because their children reap the benefits of their sacrifice.

Whilst there are too many stories to condense into any one feature of immigration, we can each only speak of our own experience and, in understanding that, perhaps stay open enough to understand others. We can come together through that collective understanding.

As a psychotherapist, I work with many immigrants. I see clients daily who were born elsewhere and sometimes struggle to acculturate in their new-found country. Where I live, near the border between Mexico and America, we have a multitude of immigrants from Mexico, Central and South America, as well as from around the world, coming through the borders, seeking asylum and a better life.

Consequently, there can be a high degree of racism in reaction to the startling numbers of immigrants passing through our city. I can drive down a road and see people lined along the street, begging and homeless, much as you would in other countries. Our resources are stretched, and one option chosen by the Governor of Texas was to bus immigrants and asylum seekers to other states. Initially this was considered a racist, insensitive act that treated people like cattle. Looked at more closely, it was perhaps these things, but also a desperate plea for other states to understand the overwhelming nature of immigration for border states and share in the expense.

It is easy for a non-border state to believe the border should effectively be kept open and all immigrants allowed in. But when it’s on your doorstep, it can be challenging. Most people in Texas care about immigrants but also experience some of the downsides of too many arriving at once. In El Paso, people froze to death sleeping on the streets, houses were broken into, the situation was dire and extreme, and locals didn’t have enough resources to manage. Shipping immigrants who wish to go to other states to those states might appear cruel, but it also makes sense, if it’s consensual. Whilst many of the Texan Governor’s decisions have been quite possibly racist and prejudicial, this choice was in part to show other states how dire the situation is.

Why are there so many asylum seekers right now? When President Biden announced the lifting of closed borders to asylum seekers, the numbers attempting to come into America increased exponentially. Under the Trump administration, the Department of Homeland Security (DHS) policy known as “Remain in Mexico” (officially, the Migrant Protection Protocols) had somewhat halted immigration. The original reason countries like America accepted asylum seekers goes back to WWII, when the Jews who survived ethnic cleansing had nowhere to live and were essentially stateless. The right to seek asylum was incorporated into international law following the atrocities of World War II. Congress adopted key provisions of the Geneva Refugee Convention (including the international definition of a refugee) into U.S. immigration law when it passed the Refugee Act of 1980.

The laws that exist now were enacted to protect them and ensure stateless people were never again turned away in droves. The creation of Israel was in part a consequence of WWII and the abuse of the Jews; it could be argued that any issues with Israel are directly linked to the ethnic cleansing the Jews experienced and their subsequent statelessness. Laws endeavoring to protect future generations from such experiences are what we now use in our handling of asylum seekers. “When Congress finally eliminated the racial provisions in U.S. immigration and nationality law in the 1940s and 1950s, generations of federal practice and procedure did not instantly disappear without a trace. Over the years, other government agencies had developed their own racial classification systems, often partially borrowed from INS experience, and such systems could take on lives of their own.”

The downside is that the world has changed dramatically since the 1940s (a global population of roughly 2.3 billion then versus over 7 billion today). The population is growing at a heady rate, and thus even if a small percentage of people seek asylum from any one country, the numbers are huge compared to previous eras. Department of Homeland Security statistics show that from Biden’s Inauguration Day through May 2022, just 16 months and change, about 1.05 million migrants were apprehended on the southwestern border and then released into the US. With every year, the world’s population swells, and with it the strain on resources. ‘Affluent’ countries such as America may, literally speaking, have the resources to help asylum seekers, but the reality for many asylum seekers once they are in-country is quite different. According to Census Bureau statistics, immigrants’ share of the U.S. population rose more from 1990 to 2010 than during any other 20-year period since these figures were first recorded in 1850: from 7.9 percent to 12.9 percent.

What constituted poverty in their country of origin may be considerably lower than what they can earn in America, if indeed such earnings can be made at all. The social welfare system protects asylum seekers by giving them somewhere to live and a stipend until they are able to find work, but what of those who do not possess the necessary skills? Not to mention the dearth of certain jobs. Immigrants wishing to live in cities may find work only available in the agricultural parts of America, and without language and education may not earn enough to live on in a city. Likewise, they must contend with crime, safety issues and making the meager money they receive stretch to pay for themselves and their families. What might initially seem like a lot of money compared to their home countries is quickly devoured by America’s more expensive cost of living.

Immigrants who move to America or other developed countries on a visa rather than through asylum may fare better. But note how many PhDs are driving cabs or serving in restaurants. Underemployment is a phenomenon whereby the educated work at a lower level than their education would typically warrant. For their children there may be greater opportunities, but for many first-generation immigrants, the adjustment and opportunities are restricted. Doctors in their own countries find American prohibitions on accepting foreign transcripts and training, despite the low quality of American education compared to many other countries. It is almost, if one were being subjective about it, like having to pay a price for immigration.

When I immigrated to Canada, I found many who possessed PhDs and advanced education were unable to find work. There was some pushback from locals who resented skilled workers and felt all immigrants should ‘know their place’ and take the dregs of the work. This is something you really don’t believe will happen to you when you are well educated and get a skilled-worker visa, but it’s a reality, perhaps less spoken about because it makes the host country look unkind. Go beyond the shiny posters about immigration and speak to the people, and you will find it’s not uncommon.

Immigration is necessary for many reasons, not least the Western world’s ageing and its need for new blood because of declining birth rates. But the Western world wants immigrants to do the work its people don’t want to do just as much as it may appear to want immigrants to ‘succeed’, and for every doctor and PhD who was an immigrant, there are plenty who find themselves no better off through immigration. That’s a sacrifice worth making when you have no other choices, or you hope your children will inherit the American Dream; but if you have no children and you’re sold a false dream, it can be disheartening if not crushing. Best estimates suggest there are 11 million recent immigrants in transition who labour in American fields, construction sites and kitchens, as well as American classrooms, detention centers and immigration courts.

What we hear less about is how many immigrants leave, and how many suffer silently, having fallen between the gaps into anything but the American Dream. What can be done about this? Should we impose immigration restrictions, not out of cruelty but out of an understanding that a host country is ill equipped to deal with mass influxes and that the original reasons for the laws have evolved as our population has grown? Should we insist other states take some responsibility for asylum seekers, and demand other countries pitch in more? And understand that what may look racist is, in fact, a more realistic approach than flinging open the border and allowing everyone in at once?

It is an interesting dilemma and one that won’t be decided any time soon. The racists and extreme economic conservatives will battle the diametrically opposite liberals who believe all should inherit the opportunity a country like America holds. Both sides are too extreme in that they don’t consider the reality. The reality is that racism should not and cannot endure in a country like America, where ‘brown skin’ will soon be the majority and old racist ways are being challenged. But equally, being so ‘woke’ that you don’t see the fallout of idealistic policies isn’t the answer either. In tandem with an identity politics that emphasises the subnational, a too-progressive project may place global concerns above national interests; hence the oft-repeated slogan “global problems require global solutions.”

Speak to the people. Many times, people criticise me for living in Texas. They assume I’m one of the ‘bad guys’ without understanding that Texas is made up of a hugely diverse population. Within that diversity are many Latinos who don’t want mass unchecked immigration any more than the racists do, but for radically different reasons. Things aren’t as simple as they seem in a Twitter comment. There are many complex considerations that must be taken into account to ensure the best outcome, not only for asylum seekers but for those who already live in-country. There are answers, but they won’t come from knee-jerk reactions or entrenched thinking on either side.

What we do know today is that people are literally dying to come into America, and with them perhaps some unchecked terrorists sneak in, just as they did before 9/11. To protect everyone and ensure things are done legally and safely, immigration must have some controls and should be funded accordingly, without any one state taking the majority of the strain. Many Texans are quite the reverse of what you’d imagine if you subscribe to stereotypes. Maybe the problem is that we should get rid of stereotypes and try knowing who people really are before we judge en masse. Houston has one of the largest Indian communities in the world. All cities within Texas have absorbed huge numbers of immigrants from around the world. Let’s think less of ‘them and us’ and more about truly doing what is best for those seeking to come into a country and begin a new life. Immigration is a conundrum, but if we work together, instead of apart, we can find answers.

.

Candice Louisa Daquin is a Psychotherapist and Editor, having worked in Europe, Canada and the USA. Daquin’s own work is also published widely; she has written five books of poetry, the last, Pinch the Lock, published by Finishing Line Press. Her website is www.thefeatheredsleep.com

.

PLEASE NOTE: ARTICLES CAN ONLY BE REPRODUCED IN OTHER SITES WITH DUE ACKNOWLEDGEMENT TO BORDERLESS JOURNAL.



Piano Board Keys

By Candice Louisa Daquin

In 1967, the US Supreme Court decision Loving v. Virginia ruled that black and white Americans had a legal right to intermarry. Between 2000 and 2010, the number of white and black biracial Americans doubled, while the population of adults with a white and Asian background increased by 87% (Pew Research). The term “horizontal hostility” describes black mixed-race people’s experiences of rejection by black society, and how this perception of ‘(in)authenticity’ impacts self-perception and the expression of ethnic identity.

Recently, something happened to me personally that segued into a greater story of biracial identification in America. I have lived in four countries in my life so far and nowhere has racial identity been as contentious as in the USA. When The Queen died, like many others, I did a post saying ‘Rest in Peace’. I am by no means a Monarchist, but serving for 70 years felt like an impressive feat. I was immediately jumped on by a few who felt I was pro-colonialism and a “white privilege oppressor”.

As a psychotherapist, I often bite my tongue and do not express myself when others are insulting or being triggered. I have grown to respect the value of doing this, because saying anything to clarify or defend contentious subjects too often inflames things. However, as this was posted publicly, I had to clarify. My point is not about what happened to me, but about the assumption that individual made in calling me a “white privileged oppressor”, and likewise in assuming I am white.

If people of colour decide the degree of melanin in another’s skin represents their race and culture, this will only end up emulating what was done to people of colour by white-skinned racists. Two wrongs do not make a right. It is something that comes up a lot as we discuss what it means to be a person of colour. African-American presidential candidate Ben Carson accused President Obama of not being able to understand “the experience of black Americans” because he was “raised white”. It is more common for those of mixed race than of a singular race to fail to ‘please’ either side.

Just ask celebrities like singers Mariah Carey or Shakira, who have struggled with this their entire lives. Or singer Lenny Kravitz (Black, Jewish, and Native American), who recalled having to fill out the ‘race’ sections on school forms: “My great-grandmother’s Cherokee Indian. My father’s a Russian Jew. My mom’s Bahamian. [I thought], ‘what the hell do I put on this thing?’ The teachers came over and [said], “Black. That’s what you are.” And so, so many parts of your heritage are just squashed. ‘That’s it.’” (Huffington Post, 2013). Obviously, if you can ‘pass’ then you have that attendant privilege. Where I live, about 70 percent of the population is Hispanic, and only recently has there been talk of ‘white’ Hispanics versus ‘brown’ Hispanics. This goes back to the caste system in countries like Mexico, where historically the darker you were, the more likely you were to be considered serving class, because you were more ‘Indio’; if you were lighter, you were considered more Spanish. Ultimately these sub-categories seek to further divide people rather than describe them.

Fortunately, this racist tide is beginning to turn as people understand that skin colour should never confer privilege, even if historically it was warped to do so. Perhaps, as in any culture, there is a desire to stand out from the average, so anyone lighter or darker than average may be admired more (or less). Ironically, in countries like England, Canada, France and Germany, where everyone is trying to tan and become darker, being darker skinned is often considered more attractive. So, we have two polar opposites: parts of Asia where women may even bleach themselves to be lighter, and parts of Europe (and America) where people may literally die to tan.

In this day and age, so many of us are ‘mutts’, meaning we are so mixed we carry Black, Asian, European, everything. But we’re still striated into colours because of racism and casteism. They are not the only reasons; it’s also about how we identify and how others identify us.

“Individuals who do not fit monoracial categories may be oppressed on systemic and inter-personal levels because of underlying assumptions and beliefs in singular, discrete racial categories” (Johnston, Marc P, and Kevin L Nadal. 2010. “Multiracial Microaggressions: Exposing Mono-racism in Everyday Life and Clinical Practice.”). I was assumed to be Anglo because I look it, so as far as others were concerned, I could not understand the experience of being of colour because I don’t have any colour. Even if I were married to a person of colour with children of colour, and my parents were of colour, it would still be about my individual experience. But the flaw lies in assuming we can have an individual experience. We can’t. We are moulded by our family and our ancestors, and whilst some of us may not know where we come from, DNA testing makes it more possible. This should alleviate some of the worst racism, but it hasn’t. Both sides seem further apart than ever before.

Author and activist James Baldwin defined his stance thus: “he was a Negro by choice and by depth of involvement - by experience, in fact.” Meaning, even if someone did not ‘look’ black, if they were, and identified as such, they were. The one-drop rule is a long-held legal principle of ‘racial classification’, prominent in the 20th-century United States. It asserts that any person with even one ancestor of black ancestry (“one drop” of “black blood”) is considered black. Before the American Civil War, free individuals of mixed race (free people of colour) were considered legally white if they had less than either one-eighth or one-quarter African ancestry (depending on the state). Equally, during slavery in America, a child born to an enslaved mother was automatically enslaved from birth. Racial integrity laws have existed throughout time with different groups and are essentially used to oppress a particular group. In theory they could be easier to enact now, given DNA testing.

Ironically, I have more blood of ‘colour’ than many who, if we were in a photograph together, would be assumed to be of colour whilst I would not. That is understandable; what is not understandable is when people deny mixed-race individuals their identity, seeking to label them or condemn them for being able to ‘pass’. Ethnic groups and racially distinctive groups vary but can also be the same. Respecting someone’s ethnicity and race is necessary in order to avoid becoming as bigoted and discriminatory as the past.

When George Zimmerman fatally shot Trayvon Martin, he was called a ‘White Hispanic’ for three reasons. One, he was light-skinned. Two, his last name was a non-Hispanic name. Three, he shot a black child. It was an example of the media manipulating the truth in order to make Zimmerman seem more racist. Perhaps Zimmerman is simply a racist, or maybe he would have shot a kid no matter their skin colour; we may never know. We know Martin called Zimmerman racist names like ‘Cracker’, but since society says a person of colour cannot be racist, that was not considered. Whilst most of us hopefully want violence against young black men to end, we shouldn’t deny that much violence toward young black men is perpetrated by young black men. Lack of opportunity seems to kill young black men as much as racism does, but maybe the two are the same thing, coming from different directions.

What we can say is that our society hasn’t given young black men chances, and that can lead to increased temptation toward crime or violence. Surely if a young black man is shot for simply walking down a street, nobody should justify it. Just as with Breonna Taylor and so many innocents, killed for the colour of their skin. However, we should be able to make this argument without turning the perpetrator into a white man when he was not. It is a classic example of manipulating the truth in order to make it more about racism than it may have been. Or perhaps it was purely about racism; but if two people of colour cannot be racist, then how can it be? There are so many issues here; what we do know is that two wrongs don’t make a right.

Pew Research has found that most mixed-race Americans identify with one race (61 percent) because they ‘look’ like that race, which points to how we look continuing to be the determinant of racial identity, even if it is inaccurate and often leaves people feeling they have lost half of their identity. The survey also found that the way people describe their personal racial background does not always match the way they think others see them: “Six-in-ten Americans with a white and black background (61%) believe they are seen as black; only 19% say they would be seen as multiracial (an additional 7% say they would be perceived as white only).” The shift is happening; case in point, Rachel Dolezal, who was the head of a local chapter of the NAACP and identified herself as African-American, though her Montana birth certificate said she was born to two Anglo people. Dolezal earned a master’s degree from the historically black Howard University in Washington, D.C., and was a professor of Africana Studies at Eastern Washington University. Her contemporaries assumed she was African-American. It shows that whilst for many years people with black heritage may have sought to deny it, now some Anglos seek to be black.

One of my best friends had a red-haired white mother and a Jamaican father. She was 70 percent ‘Anglo’ because her Jamaican father was not entirely black but mixed race himself. But she ‘looked’ black and identified as black, whilst her brother looked white and identified as white. Which are they? Is identity sufficient to say? Or how others perceive us? I can say I’m mixed race, but if I tell people I’m a black woman or a Latin woman I might be laughed at, because I don’t look like I am. Would it even be right to say so? What is right? It depends upon whom you’re speaking to. These are reductive discussions of identity that parody race and don’t allow individuals to say who they are.

My siblings could look black whilst I could look white. It can leave people feeling they have racial imposter syndrome, where a person feels they are appropriating a culture that is actually not their own! If we feel liminal, as if we drift between cultures but belong to none, isn’t that often because of the stereotyping that goes on within cultures as much as outside them?

I’m Jewish but I do not believe in God, nor do I go to Temple, so when I have tried to join Jewish writing groups, I have been shunned as not being Jewish enough. When I worked for a Jewish organisation, I was considered Jewish, but I was the ‘wrong’ kind of Jew because I was Mizrahi and Sephardi rather than Ashkenazi. In other settings, I wasn’t brown enough to be considered a Mizrahi or Sephardi Jew. The absurdity of all the micro-aggressive ways a person can be catalogued or disqualified wasn’t lost on me. It is worse for some who are more obviously mixed race but don’t possess whatever a group demands for admission, and who are also racially attacked by other groups. For example, what does ‘you act white’ really mean? That you are not speaking with the right accent, or that you should know another language, or wear different clothes? My other friend is constantly told she is not Latina enough because she has no accent, her Spanish is perfect rather than Tex-Mex, and she likes to eat Indian food. Does one group have more of a ‘claim’ to being of colour?

References:

https://www.pbs.org/wgbh/pages/frontline/shows/jefferson/mixed/onedrop.html

https://www.npr.org/sections/codeswitch/2017/06/08/462395722/racial-impostor-syndrome-here-are-your-stories

https://www.npr.org/2010/12/20/132209189/how-multi-ethnic-people-identify-themselves

https://theconversation.com/who-counts-as-black-71443

https://www.cnn.com/2016/10/10/health/biracial-black-identity

https://www.tandfonline.com/doi/full/10.1080/01419870.2019.1642503



We had Joy, We had Fun…

By Candice Louisa Daquin


Heortology (the study of festivals) has expanded beyond its initial Christian focus to embrace all festivals and their enduring appeal and necessity in our human culture. Festivals remind us to celebrate, and celebration is a positive experience. The very idea of festivals is ancient. No existing history book is old enough to document when the first festival took place or what its origins were, but it’s a safe bet they had some kind of worship element attached. Modern festivals often also land on old pagan holidays, whilst others are more obvious in their origins. Many who attend festivals have no idea of their origins but go for entirely celebratory reasons. We have learned a lot about the history of varied festivals but another question to consider is: Why are humans drawn to festivals and what do they provide us?

Imagine the ancient world. As much as we think we know now, the ancients knew a tremendous amount too, considering their lack of modern resources. This may well come down to the ‘necessity is the mother of invention’ paradigm, or to the fact that we severely underestimate our ancient ancestors, in our egocentric belief that the modern world knows best, just as we underestimate the knowledge of animals and their abilities to survive. Perhaps we could even say we have lost the art of survival and wouldn’t know how to survive if our computers went offline, our cars did not work and the supermarkets were empty.

What we do know is that the ancients were able to amass a great deal of knowledge, despite not having the easy access we enjoy today with our modern telescopes and technology. They had to understand mathematics and science at the very core to establish theorems on the universe and our place in it. Whilst many were later corrected, it is surprising how many ancient scientists, mathematicians and philosophers got it right, almost against all odds. It is fair then to say that we dismiss the richness of the ancient world when we imagine everyone lived ignorant lives, which was not the case. When ignorance did reign, it did so deliberately, with the quashing of knowledge by various religious groups and the resulting periods of ‘dark ages’.

The ancient world was in touch with what it means to be human. Being human isn’t knowing how to work your iPhone or microwave. It’s not having a huge house with a swimming pool and driving a Lexus. Nor is it eternal youth, fame and glory. Being human is about surviving, just as it is with any animal. When we add an awareness of our own being, which, it is argued, not all animals possess, we become the modern human we recognise today: a being who has choice, the ability to reflect and learn, and a tendency to seek beyond themselves. In seeking beyond oneself, we find an innate or shaped desire for ‘more’, and that ‘more’ has often come in the guise of a God-head or spirituality of some kind.

Whether we believe humans are prone to worshipping gods or being spiritual because gods actually exist, or because we just have a propensity to create them, is immaterial. The outcome is the same. The God gene hypothesis proposes that human spirituality is influenced by heredity and that a specific gene, called vesicular monoamine transporter 2 (VMAT2), predisposes humans towards spiritual or mystic experiences. Perhaps that is what is at work: in essence, a transmitter in our brain that makes it more likely we will believe in God (which could explain why some people believe so fervently, whilst others do not). Or perhaps we simply find meaning in believing in a spirituality beyond the temporal world. What we do know is that ever since humans evolved from their primate ancestors, they have formed meaning around some kind of spiritual observance, and festivals were tied to this worship.

Why do we do this? We are born part of something (a family) but are also separate (an individual). Perhaps festivals, and what they represent, are the coming together of all things: nature; the seasons; marking time (birth and death); marking passages (fertility, menstruation, maturity, marriage, children, dying). These are the cornerstones of meaning, with or without God. I say without God because, for many, notions of God are tied to nature, so it’s more the world around them than specific deities. For others, it’s the manifold destinies of humanity, or the history of deities. But whatever the reason, there is a sense of coming together in celebration of being alive, and acknowledging that life. A festival in that sense, irrespective of its actual purpose (the harvest, pagan holidays, etc.), is a ‘fest’ of life. Maybe this is why we can have such a happy time being part of it.

Growing up, neither of my parents liked festivals; they thought them silly. I remember a street festival I went to as a child for Fête du Travail (Labour Day) in France. I dressed up as the princess and the frog (taking my toy Kermit with me) and felt an excitement like I had never felt before. The throngs of people and other children, the food, the smells, the magicians, the shows and the things to see. It was like walking through a market of treasures. I couldn’t understand why neither of my parents liked this; to me, it felt like a jewel had opened. But for some, festivities are synonymous with rituals and a degree of adherence to religion, even when they are not, and rather than entering into the spirit of it and enjoying it, they feel what it represents is part of social control.

In France, like many countries, festivals abound. The national Fête du Citron (Menton Lemon Festival) draws crowds from around the world, as do the Festival de Cannes film festival, near where I grew up, and the Fête des Lumières (festival of lights) in Lyon. More traditional festivals include the Défilé du 14 Juillet (Bastille Day). In the Middle Ages in France, on Midsummer’s Day at the end of June, people would celebrate one last party (fête de la Saint-Jean, or St. John’s Day). Bonfires would mark this longest day and young men would jump over the flames. This also happened on the first Sunday of Lent (le Dimanche de la Quadragésime), when fires were lit to dance around before lit torches were carried. Religion dominated many of the autumn and winter festivals historically.

In France, Christmas is marked over twelve days with the Feast of the Innocents, the Feast of the Fools and New Year’s Eve, and culminates in the Feast of the Kings with its traditional galette des Rois. Events include Candlemas (Chandeleur) with its candlelight procession. Likewise, many Christian societies have some celebration connected to Easter (Pâques in French) or its pagan roots. In France (and New Orleans in America) these include Shrove Tuesday (typically Mardi Gras in America), marking the last feast day before Lent, and many others until Pentecost Sunday. My favourite ‘fest’ was Shrove Tuesday (also known as Fat Tuesday or Pancake Day in other countries) because my grandma would make pancakes, despite our being Jewish. The notion was to eat before Christian Lent and a period of fasting, which has much in common with Muslim beliefs too (unsurprisingly, since God is one in the Jewish, Christian and Muslim faiths). In America, they serve fish options every Friday for much the same reason.

Far more impressive and immersive festivals occur in India, with Hinduism celebrating among the highest number of festival days in the world. Over 50 festivals are celebrated throughout India by people of different cultures and religions, and these form an integral part of the rich heritage of the country. One is Holi, the ancient Hindu festival of spring, colours and love. “Holi is considered as one of the most revered and celebrated festivals of India and it is celebrated in almost every part of the country. It is also sometimes called as the ‘festival of love’ as on this day people get to unite together forgetting all resentments and all types of bad feeling towards each other.” Holi is celebrated on the last full moon of the lunar month of Phalgun, the 12th month in the Hindu calendar (which corresponds to February or March in the Gregorian calendar).

With social media, more of the world has been granted access to the visual beauty of Holi: “This ancient tradition marks the end of winter and honors the triumph of good over evil. Celebrants light bonfires, throw colourful powder called gulal, eat sweets, and dance to traditional folk music.” One of the most popular legends in Hindu mythology says the Holi festival marks Lord Vishnu’s triumph over King Hiranyakashyapu, who killed anyone who disobeyed him or worshipped other gods. With coloured powder thrown on people as part of the celebration, many countries now celebrate Holi, just as Indians may celebrate Halloween or Día de Muertos. The crossover effect may seem to dismiss individual cultural value and smack of appropriation but, in reality, it’s more a sign of respecting other cultures, learning about them, and celebrating with them.

Mexico, which I now live near, celebrates over 500 festivals yearly and is consequently one of the most festive cultures in the world. In San Antonio, TX, where I currently live, we celebrate many of these fiestas alongside American ones, the most popular being Día de Muertos, Día de la Virgen de Guadalupe, Cinco de Mayo and Día de la Candelaria (like the French Candlemas, celebrated after Three Kings Day, which is a bigger holiday than Christmas in Mexico). The variables between cultures are fascinating. In San Antonio, we get a huge influx of Mexican tourists over Christmas because they aren’t home celebrating, as they do so a few days later. We have a fiesta in San Antonio much like those in Mexico, owing to our large Mexican population, and it’s heartening to see the merging of the two.

As a child I celebrated the Jewish Pilgrim Festivals—Pesaḥ (Passover), Shavuot (Feast of Weeks, or Pentecost), and Sukkoth (Tabernacles)—and the High Holidays—Rosh Hashana (New Year) and Yom Kippur (Day of Atonement). But I attended a school that celebrated all faiths, so we also observed Ramadan, the Muslim sacred month of fasting, akin to Christian Lent. Growing up, my friends of all faiths celebrated Eid-ul-Fitr, or simply Eid, one of the major religious festivals of the Muslim community, marking the end of Ramadan. This festival is celebrated on the day after the sighting of the night crescent moon, with devotees offering prayers at mosques and then feasting with their near and dear ones.

We would also celebrate Kwanzaa, which is a worldwide celebration of African culture, running from December 26 to January 1, culminating in a communal feast called Karamu. Its creator was a major figure in the black power movement in America, “Maulana Karenga created Kwanzaa in 1966 during the aftermath of the Watts riots as a specifically African-American holiday. Karenga said his goal was to ‘give black people an alternative to the existing holiday of Christmas, and give black people an opportunity to celebrate themselves and their history, rather than simply imitate the practice of the dominant society.’”

Are we socially controlled when we attend festivals? Given that we have a choice, I would say no. Someone who chooses to be part of something isn’t signing up for life; they’re passing through. Since my childhood I have been lucky enough to attend many festivals in many countries. For me it is a reaffirming experience, seeing people from all walks of life come together in happiness. I like nothing better than dressing up, meeting with others and walking through streets thronged with people. Be they carnivals or even political events, there is an energy that you rarely feel anywhere else.

The maypole festival, believed to have started in Roman Britain around 2,000 years ago when soldiers celebrated the arrival of spring by dancing around decorated trees thanking their goddess Flora, is especially interesting because it is still practised almost as in ancient times. The ribbons and floral garlands that adorn the pole represent feminine energy, and the beauty of the ritual remains something to behold.

Likewise, another event, Guy Fawkes Night, is steeped in ritual and British history, with much symbolism in the straw dummies representing Guy Fawkes that are thrown onto bonfires. The act of burning a dummy to represent a person has been performed since the 13th century to drive away evil spirits. What most people seem to take away from Guy Fawkes Night are the abundant fireworks in a beautiful night sky, alongside children and families holding sparklers and eating roasted chestnuts in the cold, wrapped up in mittens. It’s a beloved ritual and a chance to ‘be festive’, even if it’s not a specific festival. As much as anything, it marks time, another year, another November, and gives wonderful memories. If we didn’t mark time or have those memories, we’d still have others, but there is an ease with festivals because they do it for us, unconsciously.

Young collegiates often attend festivals that involve dancing and sometimes drugs. Again, this is not a modern occurrence but has been going on for years, as a rite of entering adulthood. The desire of the young to get out, meet others, dance and enjoy life is primeval, and possibly a part of who we are as humans, marking a potent stage in our lives. Recently I went to a birthday party at a night club. I observed the diverse throngs of party-goers and revelled in that abundant diversity. In just one night I saw Pakistani women in saris, Japanese girls in anime costumes with ears, a pagan woman with huge curled bull horns and a floor-length leather dress, Jamaican families in neon shorts and t-shirts, transgender people wearing spandex dresses and big wigs, Hispanic Westsiders covered in tattoos and gold necklaces, lesbian and gay couples holding hands, and old couples in sensible church clothes, including one old black man with a pork-pie hat and a waistcoat.

I thought of all the diversity that had come to this club to dance the night away: all ages, all genders and backgrounds and ethnicities. How wonderful it was that one place could hold them all. In many ways this is the essence of a festival, especially nowadays, when anyone can attend most festivals. Years previously, they were segregated by subject; only followers of that subject usually attended, and you could be harmed if you tried to attend as an outsider. The advantage we have today is that we are more accepting of outsiders, and when you attend festivals today, you see a wide range of people. Maybe this is the best opportunity we have to put aside our differences and celebrate our similarities.

When I lived in Canada, I loved the homage paid to different seasons in varied outdoor festivals, where shaking off the lethargy of Winter, Canadians would celebrate with fairgrounds, amusements, shows and food among other things. It was like a period of renewal. Likewise, during my time in England, the Notting Hill Carnival, celebrated the Afro Caribbean culture, so essential and entrenched in English culture, with gorgeous street displays and floats, as well as some of the best music around. The idea of welcoming everyone into the fold, helps to remove any tensions between cultures and promote a feeling of unity, whilst not denying the unique properties of those cultures and ensuring they are promoted in their adopted countries. It may be idealistic and not entirely accurate, but it’s a better step than ignoring those myriad cultures exist.

As Halloween and Día de Muertos are fast approaching, I am thinking of how many of my neighbours attend these parties, despite some of them being from very conservative churches. Just last year, we all sat outside in the green spaces and had a mini fireworks display. I sat next to my little 4-year-old neighbour and watched her face as the older kids, dressed in all sorts of costumes, shrieked at the fireworks and ran around with neon bangles, throwing glow powder at each other. I saw how inculcated we are since childhood, but despite this I truly believe festivities are in our hearts, even if we weren’t introduced to them at an early age. Children mark their growing up by the events of their lives, and it’s not just their birthdays they celebrate but the touchstones of their respective culture and, nowadays, many other cultures.

My Egyptian grandfather used to tell me about the Nile festival, which celebrated the flooding of the river and the replenishing of life in Egypt. Without the Nile, Egypt couldn’t exist, and the ancients knew this. They employed methods to enhance the flooding and gave thanks for it. Gratitude like this can be found in many celebrations, including the American Thanksgiving (although this is a double-edged sword, given the history of genocide of the Native Americans by European pilgrims and invaders) and harvest festivals throughout the world. A celebration of life through food and music is at the core of the human ability to endure and overcome hardship. More recently, many of us celebrated healthcare workers by singing from our windows and putting up messages of thanks. We do this because it symbolises essential parts of our lives, without which we would suffer.

Owing to its melting-pot past, Egypt celebrates the Coptic Orthodox Christmas; the more ancient Abu Simbel Sun Festival, linked to the Egyptian Sun God Ra (who in turn was one inspiration for the Christian God many years later); Sham Ennessim, the national festival marking the beginning of spring, which originates from the ancient Egyptian Shemu festival; Ramadan; and the Muslim Eid al-Adha (honouring the willingness of Ibrahim (Abraham) to sacrifice his son Ismail (Ishmael) as an act of obedience to Allah’s command). As a Jew, my grandfather’s family celebrated Passover, the festival celebrating the Jews’ Exodus from Egypt, despite our family still living there! Nowadays it is no longer safe to live in Egypt as a Jew, but the memory of all people’s experiences is preserved through ancient festivals and events, marking our shared history.

Before the advent of mass-produced entertainment, festivals were also a highlight of any village or town, because they were the entertainment. Travelling theatres and shows for children, even booksellers and traders of items not commonly found locally, could be found at such events, goods bartered or purchased; it was almost a spilling out of the market-square economy that kept such villages alive. Perhaps evolving from our natural tendency to barter for things we want, we learned to invite others from outside for specific events to gain greater reach. With this trading and bartering came the accoutrements: eating, drinking, dancing. Not only did this increase diversity and knowledge of foods and drinks from other locales, it brought people who might otherwise never meet together into a camaraderie.

Sharing stories is also part of festivals, by way of theatre or more improvised scenarios. It is at our heart to pass on oral knowledge, and we haven’t lost that desire. We may do this now via YouTube more than face to face (which is a shame), but the desire to get out and talk directly is innate, as evidenced by how many people have done just that since Covid-19 restrictions eased. Religion, folklore, ritual and a desire for escapism, alongside our desire to celebrate things or others (saints, gods, seasons, harvest), are all reasons why festivals endure. Just as children will instinctively dance when music is played, maybe it is our innate nature to enjoy festivals because they foster the inter-relationships we all crave to some degree. We may be diverse and believe different things, but we can also come together and respect the perspectives of others. Never more so than through our shared love of celebration.


Candice Louisa Daquin is a psychotherapist and editor who has worked in Europe, Canada and the USA. Daquin’s own work is widely published; she has written five books of poetry, the most recent of which, Pinch the Lock, was published by Finishing Line Press. Her website is www.thefeatheredsleep.com.


PLEASE NOTE: ARTICLES CAN ONLY BE REPRODUCED IN OTHER SITES WITH DUE ACKNOWLEDGEMENT TO BORDERLESS JOURNAL.


Sometimes Less is More…

By Candice Louisa Daquin

When you read sci-fi novels where most of the world lives in small sections of the planet, in endless skyscrapers, the future can feel a little dystopian. As practical as living in close proximity is, some of us yearn to be far from the madding crowd. As our world swells in number (7.753 billion as of 2020, projected to reach 9.8 billion in 2050 and 11.2 billion in 2100, according to UN statistics), is it feasible to live off the grid anymore? Is it becoming more difficult not to be part of the mainstream?

During a time of illness, I watched a strange TV show, Alaskan Bush People, that I would not usually entertain. It was a wilderness show about a family who chose to live off the grid. I watched it the way we view any reality TV, with disbelief and morbid curiosity. However, with time, I began to get involved. I admired that these eccentric people — even if some of it was spoofed for the camera — could live in this way. They valued being able to live off the land. I began to wonder if we put too much onus on city-urban dwelling to the detriment of other lifestyles, and whether we judged those who lived more basically, assuming ourselves sophisticated. If the grid failed in some way, if electricity or the internet went down, or a giant EMP burst took everything out, we’d need those lost skills; we’d value those kinds of people more. Maybe we should know that now, before it happens, and not get caught up in judging people by how large their house is or what car they drive. After all, we’re rapidly hurtling toward a future where ‘big’ is going to be problematic and finding alternatives will be prized.

When I moved from a large city to a smaller one, I felt completely cut off from what I termed the trappings of city living, such as the ballet, theatre, good bookstores and interesting alternative restaurants. It took me some time to adjust and settle into a slower life with fewer options. Part of me never stopped missing the variety of a large city, its diverse heart. But I did appreciate the calm that came with a slower pace of life. Sometimes less is more. Moreover, when I met people from big cities, I noticed how their identities hinged on their experiences of ‘culture’ and how judgmental they were about what counted and what did not. Even the use of words like ‘native’ or ‘naïve’ artist seemed patronising and racist. Who said one culture or city had more value than another? When did we start respecting the businessman over the farmer, when our very existence depends upon the latter? It’s a little like what happened during Covid-19. We realised the value of nurses and front-line workers a little late in the day.

There are many reasons people crave moving from larger communities to smaller ones. The most obvious is retirement. You may live in a large city, but it’s expensive and fast-paced, and when you retire it is possible you need different things. You may swap the city for the beach, mountains or lakes. You may find a retirement community has more to offer at that juncture in your life; you may want to have a horse farm or live in another country with more sun. The retiring Baby Boomer generation has caused a massive uptick in house prices throughout desirable parts of America, as they take their affluence to other areas and bring their expectations with them. “Baby boomers held an average wealth of $629,683 in their 50s, equivalent to $704,158 in today’s value. Worse off is Generation X who, on average, owned $396,293 when they started reaching their 50s.” Boomers may be the last ‘affluent’ generation in America to have this mobility and generational wealth. It has changed the landscape of America in terms of house prices.

Take, for example, New Braunfels: a sleepy little town, boring but by a river, with an outlet mall nearby. It is currently growing at a rate of 5.96% annually, and its population has increased by 76.03% since the most recent census, which recorded 57,740 residents in 2010. Retirees began to move in because it was affordable, had year-round good weather, and you could get a lot more for your money than in the more traditional retiree communities of Florida and Arizona. This incoming wave perpetuated another: an exodus of large companies from expensive states like California, wishing to re-settle in cheaper ones. They brought jobs and housing. Before you knew it, this little town was one of the fastest growing in America, which is baffling given it has little to recommend it. But exodus isn’t always based upon seeking the best; it is often about seeking the most practical, which in some ways this was. More baffling: Texas is home to seven of the 15 fastest-growing cities, which, when you compare the beauty of other states, seems nonsensical, but speaks to consumers’ need for less expensive, warmer states, seemingly at any cost.

However, some smaller communities exist by choice before retirement. Historically, there have been reasons people have chosen to live separately. Not long ago, the majority of the world was rural, and historically that was always the case. But in the last 100 years this has drastically changed, with more opting for urban living. Religious difference and cultural practice are among the most common reasons people have chosen to live apart. In the 1960s and 1970s, ‘fringe’ groups and sub-cultures became more familiar to the mainstream. Perhaps because in the 1950s the idea of being a ‘teenager’ really took off, emancipating young people into being more diverse and following their own interests over their parents’. This led to more sub-cultures popping up. That said, is it really such a recent phenomenon?

Alexander the Great was barely out of his teens when he began the conquests that ravaged a quarter of the known world. Other famous historical conquests were at the hands of what we’d deem today very young people. So younger people have always sought to strike out on their own and forge their identities. The suffragettes of the early 1900s, the Zazous in France during WW2, the Jazz Age of the 1920s, the fin-de-siècle artists from 1880 onwards … the list is endless. Existentialists, LGBTQ, nudists, Dadaists, the counterculture of the 1960s: there are so many explosions one would be forgiven for thinking there is no mainstream, but in reality these groups have always been the minority and often fleeting.

Youth and age aren’t the sole determinants for such sub-cultures to evolve. People seem divided into those who seek homogenisation and those who seek diversity. For some it may not be a choice, such as LGBTQ people, those on the spectrum, or isolated communities that were ‘discovered’. But for others, it’s a deliberate attempt to dislocate from the mainstream to express their individual perspectives. Of those isolated communities and uncontacted peoples, it is hard to establish how many would have wished to become mainstream and how much choice they had in the matter. Some indigenous peoples are in voluntary isolation and do not require ‘saving’, as per the modern cultural assumption. Some indigenous groups live on protected national lands, such as Brazil’s Vale do Javari, and others inhabit North Sentinel Island in India.

I have visited Quaker, Shaker, Mennonite and Amish communities, as they offer fascinating insights on how to live outside the mainstream. Some do without electricity; others have seemingly flexible prescriptions whereby their ‘young’ can leave the community once adult and spend time in the outside world before choosing whether or not to return, a practice known among the Amish as ‘rumspringa’. This seems risky, as many could seek the excitement of the unknown, but ironically most return to the community. It reinforces the idea that small communities have staying power, which large communities may dismiss.

There are groups of youth doing one thing, the middle-aged doing another, and a whole spectrum of interests in between. I find this particularly interesting when you go to a fair or show and suddenly thousands of people, all interested in the same thing, turn out. It makes you wonder: where have they been hiding? I have experienced this at rock concerts, medieval and renaissance fairs, Comic Con, tattoo expos and vampire balls. I attended out of interest but as an outsider. Watching people who are committed to their passions get together in fantastic outfits is a fast insight into how many sub-groups exist. Perhaps all of us have, within our main group, sub-genres of interest.

Back in the day we called these cults, clans, cliques and ‘other’, but most of those terms have become insulting to later generations, who saw the impact of labeling. After one of the first American mass murders committed at a school (Columbine), the two shooters were described as ‘Goths’, and consequently many who dressed in Goth style were attacked. Sadly, the Goth movement had nothing to do with violence, but this is what happens when we assume people different from us must have negative attributes: “Qualitative results reveal that students themselves highlight the importance of exposure to diverse others, family upbringing, the media, and several other key factors as important considerations in how they treat other people; this suggests a multitude of ways that people create their beliefs.” The same happens in America with the Church of Satan, which does invoke the fallen angel, Lucifer, as an alternative god-head, but does not condone or sanction many of the ‘evil’ practices associated with Satanism. It isn’t hard to understand why there would be misunderstanding with such extremes, but what of less extreme smaller communities?

The Mormon church owns not only much of Utah but swathes of other states too. It is one of the richest religions based out of America and has a huge recruitment reach worldwide. When Mitt Romney, an elder in the Mormon church of America, ran for President, one of the reasons he lost was a fear of Mormonism. The ‘other’ aspect of their faith set them apart from more mainstream Christianity. However, this is shifting: as more politicians of Muslim and Hindu faith become key figures, the fear of ‘other’ is lessening. One could argue some fear of ‘other’ isn’t a bad thing, but it’s the extent to which we react to it that matters. I may not approve of Mormonism; I may think it’s a phony, made-up version of Christianity (The Book of Mormon tells the history of two tribes of Israel—the fair-skinned, ‘virtuous’ Nephites and the dark-skinned, ‘conniving’ Lamanites. Much of its ‘story’ is a direct retelling of The Bible, unoriginally claiming the same events occurred in North America as in Israel. To me, it seemed like racism dressed up as scripture). Mormonism has been said to act like a pyramid scheme, but should I be prejudiced against someone on the basis of their being Mormon alone? No. We can be cautious about or disagree with a religion without being prejudiced against it. On the other hand, shouldn’t we be conscientious about maintaining truth, which means that if something perturbs us, like the Church of Satan or Mormonism, we bring it to light for others to make an informed choice? Perhaps with faith there is no room for choice; it is a matter of faith, and none of us can persuade another to change their perspective. This might be why wars are so often about faith.

Currently, throughout America, there are many sects and groups who thrive in relative obscurity, untouched by the mainstream. Whilst group polarisation clearly exists, the famous stories of cults throughout the world committing mass suicide, or fighting against authorities like the Branch Davidians, aren’t as commonplace now, but that doesn’t mean such groups aren’t out there. Social media has made it easier to stay underground and thrive, but people always find ways. Whether those communities can come together depends upon how incompatible they are. Near where I live there is a conservative Jewish community where only conservative Jews live. They chose to live separately because of a high number of hate crimes throughout America, where Jews continue to be among the most targeted groups.

Other groups have become more comfortable co-existing. Twenty years ago, you would not have seen as much diversity as today. In my neighborhood, there are people of every culture and skin colour — Sikhs, Jews, LGBTQ people, single parents, tattooed bikers, affluent conservatives, communists. It has been interesting to see how they are able to come together over a mutual interest and get along. When it’s a special event like Halloween, everyone lets their children roam free to trick-or-treat. They do not avoid certain houses like they once did. There is an acceptance that we have more in common than we have differences, and even if we vote differently, look different and believe different things, we can put some of that aside for a common good.

Just recently I was asked how I could tolerate someone who was, say, a Trumpster. It got me thinking that there must be a cut-off in terms of what we do tolerate. For example, if someone were a racist, a Nazi or a pedophile, I would not wish to be in touch with them or live next door to them. But both my neighbors voted for Trump and I did not, and that isn’t enough of an ideological divide for us to not run in the same circle. Interestingly, they are both Hispanic, despite the idea that Trumpsters were Anglo, which isn’t always the case. It is those perpetuated stereotypes that cause the most harm. We can get past differences in ideology, but most of us have sticking points, such as extreme hate, prejudice or harm to children, that would be unrecoverable differences. This is how society polices itself to some extent and legitimises blame. If we didn’t, then racism would be more acceptable, but the nuance is sometimes subtle.

The media has a powerful influence on people and can be responsible for promoting a stereotype of a particular group or enhancing scapegoating behavior. People let loose on social media and are uninhibited in their vitriol. This can create more divisions between us. It is difficult to police prejudice because it involves opinion, which may not always show itself in ways that are unlawful. But when we consider communities, communities can thrive with difference without becoming contentious. Perhaps because our wish to be united is greater than our wish for division. Insularity is often misrepresented. Although when you drive through parts of the American South as a person of colour, you could be forgiven for thinking insular people can be hateful, because there are towns where you will definitely not be welcome. Some groups may not outright say they don’t accept others (people of colour, for example), but they will actively encourage segregation through their insularity. This may be unavoidable as much as it is racist, but how can we really change that? Would it work to demand racists accept people of colour as next-door neighbours? Would it be good for the people of colour to be part of that experiment?

Another concern is a subject brought up by the famed linguist, Professor Anvita Abbi, in relation to bringing distant or smaller cultures into the mainstream and the impact of doing so. Dr. Abbi received her Ph.D. from Cornell University, USA, and began teaching Linguistics at Kansas State University, where, she says, she “realised that a large number of Indian languages especially those spoken by the marginalized communities are under-researched.” This led to Abbi wishing to “unearth the vast knowledge base buried in the linguistic structure of Great Andamanese before it is lost to the world.” In the process, as she recorded in her book, Voices from the Lost Horizon, she realised this language was “a moribund language of the only surviving pre-Neolithic tribe, the remnants of the first migration out of Africa 70,000 years ago.” Awareness of the Great Andamanese resulted in inevitable negatives: “Outsider-contact has brought diseases, subjugation, sexual assault, and ultimately decimation of the tribal culture, tribal life, and tribal language.” But what has been learned from this outside culture is invaluable. Sadly, as Dr. Abbi says, “Jarawas maintained the isolation and now they regret the interaction with us.” This, if we consider other ‘first contact’ scenarios, seems a universal response.

‘Mainstreaming’ is a colonial model, which can suppress the indigenous dignity of people in favour of assimilation. But assimilation isn’t the same as ‘fitting in’, because often the qualities of incoming cultures are derided by this colonial model, leaving those incoming feeling disrespected and alienated. In America, Mexicans are considered ‘less than’ other immigrants (Asians predominantly) because they may have lower education rates. This breeds a division between immigrants that undermines those least appreciated by the host country. With Asians set to overtake Hispanics as America’s largest immigrant group, this has been at the forefront of race relations lately, with some tensions building, as for a long time it was anticipated America would become Hispanic. When Donald Trump was President, he actively encouraged immigration from certain countries over others, because he believed those countries had more valuable people. This sounds an awful lot like the argument for eugenics and, at its core, it shares a lot with racists who believe certain groups have more potential than others.

When Abbi was asked the ideal way for Great Andamanese integration to occur, where language and cultures were not eroded but blended with the mainstream, she said in her experience, “[t]he idea of mainstreaming and merging these tribes into our civilisation is nothing but usurping their rights to their land, forest, water, and way of life. ‘Development’ may kill these tribes. These tribes have amalgamated their life with nature so well that they are aware of secrets of life. Any kind of interference will disturb this harmony.” Perhaps we can learn from the poor, exploitative outcomes of assimilation between developed communities and those they perceive as less developed. The fault in perceiving difference as ‘less than’ lies in not appreciating the dignity and abilities of those cultures. Linguistically and socially, they may have many ideas more advanced than mainstream culture’s, but they are relegated to ‘less than’ in xenophobic or colonialist thought.

Take the Native Americans as one example. They believed the earth was for everyone and that no one group should own it. They are often considered one of the first environmentalist cultures because of their acute awareness of balance and the need to give back to the land rather than rape it. When colonialists came to America, they didn’t respect that; they demanded ownership of shared lands and sometimes worked the land to death. Slavery and mistreatment of land have that in common: the need to conquer and own, and a capitalist model of growth. Those under the yoke of such tyranny do not thrive; only the ruling minority do. In this sense, it is not far removed from fiefdoms and seems to be a penchant of humans given the opportunity. But what happens when we visit cultures where a more egalitarian approach is mainstream? Less oppression and greed in favour of sharing?

It could be argued this is why capitalist-model countries like America still fear Communism and Socialism. They recognise this alternative model would undermine the oppressive aspects of Capitalism. Whilst no one ethos appears to work without serious flaws and hypocrisy, we’d probably do better to work together, blending aspects of all, than continue a ‘cold war’ about our differences. When you look at the recent antagonisms between countries, it becomes apparent that war solves nothing, and the wealth which could be poured into helping countries is being squandered on military posturing and grandstanding. Until larger communities respect the dignities of smaller groups, we cannot expect this to change. On the other hand, can we afford to give up that military grandstanding if other large countries insist on becoming the conquerors we once were? How can we unite without becoming vulnerable?

Studies have shown that integration helps overcome prejudice and racism. When people have LGBTQ children, they are more likely to become accepting of LGBTQ people, and racists become less racist when people of colour move into their neighbourhoods. This suggests some of the hate is more ignorance and fear, although that doesn’t justify it. But should the minority have to stomach that hate to find acceptance for their progeny? Maybe they always have. If we consider the years it has taken some minorities to become more mainstream, it has always been through personal sacrifices. Even Martin Luther King Jr’s murder galvanised more social and racial change in America. Such tragedies create martyrs, harbingers of change, but at what cost? Should it take such extremes as assassinations and mass shootings to bring about change? It seems human nature only understands things when they’re extreme. A case in point is the environment, where campaigners warned for decades that we’re dooming future generations while business interests were put first.

With so much division even on subjects that can be proven, such as climate change, how can we hope to lay down our differences and come together? Perhaps the best we can hope for is that, if enough of us try to embrace difference instead of letting our xenophobic tendencies frighten us, we will do a better job.

Immigration in America is considered a ‘problem’, but it can equally be a solution if we redefine things. Immigration is the bedrock of how America came into existence — from the Native Americans who came across the Bering Strait and made a deserted land home, to the European conquerors who stole it but equally populated it from diverse cultures. As much as we have fought and hurt one another, we have needed each other.

Each epoch shifts what matters to that particular generation, and perhaps it is the fear of becoming obsolete, or an inability to get on board with new ideas (or a fear that old ideas will be ignored), that causes inter-generational strife. But again, if we balance and appreciate the diverse perspectives we all have to offer, we are stronger together than apart. If we humble ourselves and remember to learn from those cultures that may not have had as much attention given to them but hold great wisdom, we may learn alternate ways of cooperating and thriving. If harmony is the goal for most of us, we need to vote and avoid dictators taking that freedom away.


Can We Create a Better World by Just Wishing for it?

By Candice Louisa Daquin

The wish to laugh and shrug off the differences that create unhappiness and wars is a universal one. The majority of us want to avoid unhappiness at any cost. There is, however, a downside to being too open about unhappiness in the attempt to avoid it. When we begin to pathologize everything as a disorder, we may inadvertently neglect our ability to generate better mental health.

Before mental illness was discussed en masse, it was private and considered shameful. This had obvious detrimental effects on those suffering, but one could also argue there was a benefit to not making everything so extremely public. As with any argument, there are pros and cons to how far we publicise mental health. The extreme of ignoring it didn’t work. But does the extreme of talking about it to death really help people as much as we think?

In the second half of the 20th century, owing in part to the neglect of, and the need for, improved mental health care, societies began to shift from encouraging the suppression of emotion to recognising psychological distress and its impact. Institutions, and then the de-institutionalisation movement, became ways of coping with people who struggled to function in society. But these people didn’t choose to be unhappy. Whilst it’s obvious this shift to publicising mental health instead of hiding it has been highly beneficial in some regards, we should also consider its far-reaching ramifications.

“(Historically) Many cultures have viewed mental illness as a form of religious punishment or demonic possession. In ancient Egyptian, Indian, Greek, and Roman writings, mental illness was categorised as a religious or personal problem. In the 5th century B.C., Hippocrates was a pioneer in treating mentally ill people with techniques not rooted in religion or superstition; instead, he focused on changing a mentally ill patient’s environment or occupation or administering certain substances as medications. During the Middle Ages, the mentally ill were believed to be possessed or in need of religion. Negative attitudes towards mental illness persisted into the 18th century in the United States, leading to stigmatisation of mental illness, and unhygienic (and often degrading) confinement of mentally ill individuals,” states an article on this issue.

By publicising everything, in reaction to the days when mental health was viewed with more stigma, we have not improved suicide statistics or mental illness numbers as we’d logically assume. When something is freed of stigma and shame, more people admit to suffering from mental illness than ever before, which will make it seem like more people have mental illness, when it could simply be that they are more willing to admit to having it. On the other hand, there is an observed phenomenon of conditions becoming socially contagious.

How can we be sure we’re not increasing mental health numbers by making it so acceptable to be mentally ill? By over-emphasising it on social media? Publicising the struggle to avoid stigma is positive, but the degree to which we discuss mental illness may be so open as to increase numbers or over-diagnose people. For example, everyone gets sad sometimes; that doesn’t mean everyone suffers from clinical depression. Everyone gets anxious sometimes, but that doesn’t mean everyone suffers from an anxiety disorder. The distinction is: is it a disorder or a feeling? Do clinicians spend enough time considering this when they give patients a life-long diagnosis? And what is the effect of such a diagnosis?

When psychiatrists diagnose mass numbers of people, especially easily influenced teenagers, with serious life-changing mental illnesses, the reported numbers immediately swell. Who is to say they would be that large if diagnosis weren’t so open-ended? Nebulous? Open to outside influence? Or even subject to the pressure of pharmaceutical companies and desperate doctors wanting quick fixes? What of parents who don’t know how to handle their rebellious teen? Is that mental illness or just life? If they demand treatment and the teen is labeled mentally ill, does the teen fulfil that prophecy? And if they hadn’t been diagnosed, would their reaction and outcome be different?

Our innate ability to laugh and shrug things off comes from challenges in life so terrible we had no choice but to go forward. If we remove those challenges, are we teaching our kids how to cope with hard things, or wrapping them in cotton wool and medicating them? When a family of ten children routinely ended up losing eight, how else could families cope with such tragedy but through the coping mechanism of laughter and the ability to shrug off despair and horror? It did not mean anyone was less caring or feeling, but that sensitivity had to be weighed against our ability to endure. We could argue we endure less pain now than ever before, as we are less likely to lose great numbers of people we know, or to die of disease, famine and other historical causes of early death. Many will never even see the body of a dead relative, so how can they process that loss?

The modern world brings with it its own attendant risks and stressors. People growing up in 1850 may not have had to worry in the same way about looking young to keep a job, or trying to ‘do it all’. On the other hand, they might have had to worry about not having a society that helped them if they lost a job, or how to stop their families from starving or their village from being raided. They had fewer social cushions in that sense and a riskier day-to-day. This is starkly true when we compare the recent pandemic outbreak with, say, the plagues of earlier centuries. People died in the street and were left to rot, whereas now, even as we struggled and many died, we had a modicum of order. For all our terrors with Covid-19, it could have been far, far worse, and historically has been. I say this from a position of privilege, having lived in a society with access to medical care, and I’m fully aware many still do not; but nevertheless, if we directly compare the experience of the Black Death with Covid-19, we can see tangible improvement in what those suffering could access.

This means that, whether we believe it or not, appreciate it or not, we have overall a better quality of life than even 50 years ago. At the same time, we may have swapped some deficits for others. It may seem a minor consolation for the myriad modern-day woes, but we are better off than our grandparents, who were called ‘The Silent Generation’. They grew up learning not to speak of their struggles but to cope with them silently. These days we have outlets. And in other ways we are more alone; it is a strange mixture of progress and backtracking. Some would argue our grandparents had a simpler, healthier life. But if average life expectancy is anything to go by, we are growing older because, for the majority, our access to medical care and overall nutrition has improved. On the other hand, more of us grow old but grow old sick, which is not, perhaps, something to aspire to.

Consider how badly many of us eat: in truth, we do ourselves no favours when so many of us are obese and suffering from diseases of modern living, driven by lack of exercise, heavy drinking, lack of sleep and fast food. It might be most accurate to say we have swapped old deficits, such as dying of curable diseases, malnutrition or lack of access to care and antibiotics, for modern ones, like rising cancer rates and rising autoimmune disorders, all of which are increasing with the swell of the modern world and its lifestyle.

What it comes down to is this: through the wars of the past, people stood next to each other in trenches whilst their friends were blown to pieces or died in agony. They had PTSD[1] then; they suffered from depression and anxiety, but they also had no choice but to carry on. For some, the only way out was suicide or going AWOL[2], while many stuffed their feelings down and didn’t speak of it. Clinicians came to believe this way of coping caused illness, and that recognition, along with other factors, led to an improved mental health system.

But now, in 2022, you might be forgiven for thinking EVERYTHING was a disease and EVERYONE suffered from something, and you might find yourself wondering if some of this perceived increase was the direct result of going from one extreme to the other. Initially, nobody was mentally ill. Nowadays, who isn’t? Is this a better model?

Having worked with mentally ill people for years as a psychotherapist, I can attest that mental illness is a reality for many. I knew it was before I ever worked in the field, and it was one reason I chose that field: I wanted to help others because I saw viscerally what happened to those who did not receive help. Despite this, I came to see the value of sometimes putting aside all the labels and diagnoses and medications and treatments and trying to just get on with the process of living. If we tell someone they are mentally ill, and medicate them and coddle them and tell them they don’t need to try because they are so sick, then we don’t give them much motivation to see what else they can do.

True, many are too sick to do anything but survive, and that in and of itself is a big achievement. So, when we talk about the need to motivate ourselves beyond labels, we’re talking about those we’d call high-functioning: people who may suffer from depression or anxiety but are still able to do a great deal despite it. Do medication, therapy and labels really help them make the most of their lives? Does putting them on disability for years, without reviewing whether things could change or have changed, help? Can they learn something from our ancestors, who had to just laugh and get on with it, no matter how tough things got?

It may seem a very old-fashioned approach to consider ‘toughing it out’, and having come to America and seen how much onus Americans put on toughing it out, I have mixed feelings about its value. The idea of being tough enough implies there is always the reverse (not being tough enough), and that feels judgmental. Being judgmental, I think, has no place in recovery.

What does have a place in recovery is doing the best you can and not letting labels define or defeat you. In this sense, I see a lot of commonalities between those struggling today and those who struggled 150 years ago. Maybe we can all learn from them and combine that with some modern prescriptivism that gives us more chance to laugh and thrive, rather than fall under the yoke of a diagnosis and its self-fulfilling prophecy.

I have had many clients who felt their diagnosis disincentivised them from any course of action other than being a patient. The medication route alone is fraught with ignorance. For so long, SSRIs[3] and other anti-depressants were heralded as lifesavers for depressed people, but what proof existed for this aside from the hope that a cure had been found? Years later, studies showed only around 30% of people seemed to respond to anti-depressants versus placebo.

Then second- and third-generation drugs were created, all the while commanding exorbitant prices, and patients routinely took two, three or four medications for one ‘illness.’ Aside from the expense and the physical toll taking that much medication can exact, there was a mental cost. Patients felt over-medicated, but not happier, not ‘better.’ By putting their faith in drugs, they lost their faith in other ways of getting ‘better’, and some spiralled downward. The reality is we are all different and we process life differently. Some of us are more forward-focused; others, through imitation, genes or experience, may not be. It isn’t a deficit or illness, it’s a personality, one that can change somewhat but should also be understood as part of the diversity of how humans cope.

Treatment-Resistant Depression became the new diagnosis when modern medication failed, and new drugs were prescribed in tandem with current ones, but this led to people taking more drugs, for longer periods of time, often with little improvement. How much of this is due to a negligent approach to treatment that only saw drugs as the answer? Meanwhile, therapy was cut back or became prohibitively expensive, closing off other options for treatment. It’s logical that therapy can help people avoid feeling isolated, but when the system prefers to medicate rather than provide therapy, many end up taking for years medicines that were only ever meant as stopgaps.

Should the media, or your general physician, be the one telling you what drugs you should be taking, if any? Preying on the desperation of families with for-profit medication muddies the waters further. The disparity of information means no one source can be trusted, especially as information is ever-changing. More recently, a study suggested that anti-depressants may not work at all: it was commonly held that clinical depression was caused by a chemical imbalance, yet studies show that correcting that supposed imbalance does not improve depression as was once thought.

This shows us that psychiatry still has a long way to go, and when it claims things as facts, they rarely are. It suggests we should not blindly trust what has become a profit-led industry, where many practitioners see patients only briefly but somehow still diagnose them with serious mental disorders. Surely we should consider equally the importance of conservative diagnoses and recognise that normal variants are not necessarily disorders. In many cases, under-diagnosing rather than over-diagnosing might work better.

For example, I know of many (too many) patients who told me they were diagnosed with bipolar disorder before the age of 21, by a regular non-mental-health doctor or by a psychiatrist. Their subsequent mistrust of the system is understandable, given that experience. How can someone tell you that you have bipolar disorder at 17 years of age, from a 20-minute conversation?

Even the diagnostic criteria for bipolar I or II in the DSM (the Diagnostic and Statistical Manual of Mental Disorders) are flawed, because they are too generalised and only highly trained professionals can grasp the nuance. Most are not that trained and therefore take it at face value when a diagnostic tool says someone has bipolar disorder because they experienced an episode of mania. But firstly, are they defining mania correctly? Is the patient describing mania accurately, or being led? Were there mitigating factors?

If you diagnose a child with a serious mental disorder and medicate them, how can you be sure their brain isn’t affected by taking that strong medication before it has fully developed? How can you be sure they are not becoming what they are told they are? Too often, people spend years under the cloud of medication, only to emerge and realise that what was a discrete episode of depression was medicated for decades, robbing them of the ability to recover. Doesn’t a label make it likely that some will feel helpless?

Moreover, how much power does a label have over our subconscious? If we are told we will not be able to do something, why would we even try? If we believe we are depressed, are we less or more likely to fight against it? Isn’t some fighting a good thing? Likewise, diagnosing older people with a disease like bipolar disorder (which typically emerges around puberty or early adulthood) shows the mistakes of the psychiatric world. How can a 70-year-old man ‘suddenly’ be bipolar unless he has a brain tumour or the like? Dementia is often misdiagnosed as bipolar disorder because badly trained doctors seek answers for aberrant behaviour without considering the whole story, such as asking how someone of 70 could develop a disease that typically affects those around the age of 18. Sure, some can slip through the gaps, but often it’s the frustration of the family or doctor colouring the diagnosis. Such life-long labels should not be given lightly.

What if we treated mental illness differently, depending upon its severity? Consider the value of improving real-world ways of coping with it, instead of relying on medications that were only ever meant as a stopgap and were never developed to be taken for years on end. We could stop over-medicating without due cause, stop medicating young people on very loose diagnostic criteria, and stop assuming that everyone who says they feel depressed or anxious is clinically depressed or anxious, or that medication is their only solution.

Organisations that take vulnerable teens, who often have co-morbid diagnoses of drug or alcohol abuse alongside mental illness, into the wilds seem to be a real-world way of encouraging those young people to find coping mechanisms outside of addiction and reliance upon medication. Equally, when a young person (or anyone, really) is productively employed in something they feel has meaning, anxiety and depression can improve.

We’ve seen this with Covid-19 and the necessary isolation of so many school children. Whilst it was unavoidable, rates of depression spiked, in part because, as studies show, people need interaction with each other. This is why online learning has poorer outcomes than classroom learning, and why older people are less at risk of dementia if they socialise. We are social animals; we feed off each other and we empower each other. Finding your place in the world is always, to some extent, in relation to others.

We may never avoid war completely, or our human tendency for strife, but we also have a powerful other side that urges people to do good, to help each other, to laugh and shrug off the differences that divide us. What good does division ever do? Unhappiness is unavoidable at times, but sometimes it’s a choice. We can choose to recognise something is hard and actively pursue ways of improving it. We can struggle through and feel good about the struggle and the effort we put in. If we take all that away and don’t encourage people to try, we give them no way out. Sometimes there is no way out of suffering or mental illness, but often we cannot know that unless we have tried.

Many years ago, people valued older people because they were considered wise and able to impart valuable life lessons to impetuous youth. Nowadays, the elderly are not respected and are often siphoned off into homes before their time, because people find them an inconvenience. There is a theory that humans evolved grandparenthood because grandparents were an intrinsic part of the family make-up. This may explain why humans are among the only mammals to live long after menopause; most animals die shortly after their reproductive years end, as if nature holds that once those years are behind you, you have no value. The grandparent theory demonstrates the continuing value of grandparents, and we can learn a lot from what nature already knows. It is never too late to have value, never too late to learn and grow, and never too late to laugh and come together, setting differences aside.

Those who achieve that may well be happier and live healthier lives, as laughter is shown to be a great anti-ager as well as a boost to our overall mental and physical health. Of course, what we can learn from the extremism found in the cult of positivity is that there must be balance: we cannot expect to be happy all the time, or unaffected by tragedy when it occurs. But attempting to move beyond tragedy, to reclaim ourselves and our futures, seems to be the way to avoid going down that dark tunnel of no return.

Experience shows we are what we think. We don’t have to be positive 24/7; any extreme sets us up for burnout and puts too much pressure on us to be ‘up’ all the time, when it’s natural to have down times. But striving for happiness, or contentment, or just finding ways to shrug off the smaller things and come together: those are things most of us wish for. So it does no harm to direct our energies accordingly and prioritise our ability to cope. Perhaps our differences matter less, sometimes, than what we have in common, and what we can do to make this world a more livable place.


[1] Post-traumatic Stress Disorder

[2] Absent without Official Leave

[3] Selective Serotonin Reuptake Inhibitors

Candice Louisa Daquin is a psychotherapist and editor, having worked in Europe, Canada and the USA. Daquin’s own work is widely published; she has written five books of poetry, the most recent, Pinch the Lock, published by Finishing Line Press. Her website is www.thefeatheredsleep.com.

PLEASE NOTE: ARTICLES CAN ONLY BE REPRODUCED IN OTHER SITES WITH DUE ACKNOWLEDGEMENT TO BORDERLESS JOURNAL.


Is It Okay to be Ordinary?

By Candice Louisa Daquin


I had a client who struggled with her self-identification as ‘ordinary.’ A Millennial, she’d grown up with social media and the perpetual confrontation with perfectionism it can embody. Growing up middle-class, she’d felt the pressure to reach certain goals, even when they were not her own. We talked about how this was true well before the advent of social media. Society has long held ideas of what individuals ‘should’ accomplish, depending upon background and parenting. Many children were given no incentive and left to rot on the shelf, whilst others were hounded by external pressures: relatives, academic institutions, peers, or those they admired.

For girls this ‘pressure’ is more recent, as historically girls were not expected to achieve in the same way boys were. In the last 100 years this has begun to shift, with women gaining traction in the career stakes. However, as with any advance, there are pitfalls, and some women now are putting ever-increasing pressure on themselves to ‘become everything.’

By ‘become everything’ I mean: mother, provider, educated, career success, caregiver, slim, healthy and attractive. And for some, this is attainable. I know many women who function well with huge responsibilities, not least thriving careers, multiple children, and sufficient energy to stay fit, eat well and not indulge in vices like smoking.

But for every woman able to juggle all of the above, there are many who find it too much. Unfortunately, if one person can juggle everything, society can be unkind and denigrate those who cannot, as if this were somehow a failing. In large populations, survival of the fittest is most acute, and social media shines a light on success, leaving many feeling ‘less than’. Let’s examine whether not being able to ‘do it all’ is indeed a failing.

Firstly: everyone is different. But that is no consolation for those who perceive they compare unfavourably to others. What comfort is it to know you are different, when that translates as ‘not being able to do what others can’? Especially in a competitive world where failure isn’t really tolerated. Moreover, how do we feel good about ourselves when we’re confronted with so many examples of success and are constantly under scrutiny, by ourselves and others?

In the past – before social media – it was harder to compare yourself against a large swath of people. We weren’t as in touch with each other and what each other was doing. Imagine the famous Brontë sisters, living on a remote moor, having to send letters to far-flung friends, rarely seeing anyone. In some ways this was unbearable; in other ways it avoided direct scrutiny. In the past, we may not have had the rapidity of expectation we have today, given we did less, because the opportunities to do more were harder to obtain. The class divide was more stringent: if you were born the son of a labourer, chances were that’s what you’d become, and climbing the social ladder was all but impossible. Today, if a young woman lives in a city, she has access to potential avenues of education, finding a partner, having children, having a career, buying a home, working out in a gym, making friends, and so on. We are no longer stuck in houses on the moors or in far-off villages in the same numbers. Even so, the class divide exists, even if we don’t want to admit it. If your parents have the funds to send you to the best school, your outcome is likely to be more fortunate than that of someone who has no means.

But realistically and statistically, how many young women today can bank on having it all? Even if they are capable of working extremely long days in a job, successfully raising children and sustaining a marriage, finding time for socialising and working out and more; even if they can somehow juggle those stressors and are glad to be emancipated from previous restrictions, how many can keep this level of pressure up, and at what cost? For every woman who can, there are doubtless those who can’t. And it’s not always about choice: it can be due to ill health, mental health issues, stamina, different aspirations, or simply a lack of interest in competing. Perhaps we should get away from the idea that having it all is the only legitimate form of ‘success.’

Case in point: just the other day a client boasted of working 16-hour days, because for her this was a point of pride. What wasn’t said was that after ten years of 16-hour days, she got seriously ill and had to quit work. What wasn’t said was that her eldest daughter ran away from home because she spent many days unsupervised by workaholic parents. When the client stopped working 16-hour days, she saw that the façade of ‘having it all’ was just that. She spent more time with her kids without feeling she was failing by doing so. Her marriage, which had been hanging by a thread, healed. She had to de-program the idea that she was ‘failing’ by not doing it all; part of that was realising a 16-hour work day isn’t very balanced. Some do it, but there is invariably a sacrifice, and the question should always be: how much is too much, and why do we as a society tolerate this?

Just as we should tell our daughters (and sons) that self-worth shouldn’t be tied to shallow modes of success, it’s worth admitting that many can and do ‘have it all.’ It very much depends upon what we seek in life. If our goal is to be a workaholic, then we should consider whether we realistically have time for other things, and not attempt to have it all so much as aim for what we want. Society, however, tells us we’re lacking if we’re not able to do it all, even if that’s not what we want to do. Why does society put this pressure on people? Perhaps because it’s the nature of competition: if one person can do it, then you have to try harder to match them. Soon, even people who didn’t want what you wanted are striving for it. The alternative might be obscurity.

You see this most in countries with the largest populations. To get what was once relatively attainable, people have to work harder and harder because the relative competition is greater. This leads to bottleneck pressure, where top universities don’t have space for all the qualified applicants. Cheating, subterfuge, paying off professors and other methods are employed to ensure a child gets that coveted placement. Equally, more and more unpaid internships are expected of young people, meaning only those who can afford to work without pay succeed. The price paid for what was once attainable through regular hard work is now exaggerated.

I recall as a young person envying and admiring young women who seemed capable of getting up at 5am to work out, sending their child off to school with a packed lunch, keeping their husband engaged in the marriage, finding time to stock the fridge and clean the house, whilst working gruelling hours in a highly demanding career. Young women who seemed to have boundless energy and ambition, always seemed to have clear skin and glossy hair and wear flattering clothes, and genuinely seemed happy. I felt there must be something wrong with me because I couldn’t cope with such a ‘full dance card’. I’m sympathetic, then, to the young women who, two decades later, talk to me of similar feelings of inadequacy.

What do you say to someone who feels inadequate when you know there are many who can have it all?

Interestingly, those young women who had it all sometimes also sought counselling. And whilst you might secretly expect them to reveal to a counsellor that they could never cope, that they were hanging by a thread, that was often not the case. Instead, these young women would express something none of us might anticipate.

Feelings of failure and incompleteness.

What?

Surely if you ‘have it all’ you couldn’t possibly feel empty, or have feelings of failure? After all – you have it all!

But that’s the funny thing about ‘having it all’: it’s as much an illusion as believing you have nothing. These are two extremes. And many high achievers are never truly content, because it is the feeling of not doing enough that drives them.

Have you ever met someone who clearly didn’t ‘have it all’ but seemed so happy? Yes. It is a little-known fact that people with less are often more content. The old adage that money won’t buy you happiness isn’t far wrong. Of course, most of us want to be in a position where we do not have to worry about finances; if you don’t have enough to live without those fears, you don’t have the luxury of being content with less. However, if you are in that lower-middle band, where you may not have a lot or achieved what you believe defines success, but you have enough not to worry about putting food on your plate or fixing your roof after a bad storm, then you may know contentment. Perhaps this is why people in socialist-friendly countries like Norway and Sweden tend to rate highest on the world happiness scale: basic needs such as maternity leave, child credits and healthcare are taken care of whether you work or not, leaving them free to consider the actual process of being content.

Being content looks different to different people.

So an over-achiever who lives and dies by a measure of success that often verges on the unrealistic may not be as content as someone who isn’t such a high achiever. This is why there are two types of people who most commonly drop out of university: the extreme under-achiever (no surprise) and the extreme over-achiever. The latter is a surprise to many, and the reason is equally shocking. Over-achievers are more likely to implode, self-sabotage and leave university for what many of us would consider absurd reasons. Case in point: when I was a teacher, I often saw straight-A students drop out because they made an A-. On that basis alone. Their grade was the kind most would kill for, but to them, a 95 instead of 100 was the equivalent of abject failure. Many times I have met super-intelligent people working ‘regular’ jobs because, for this reason, they couldn’t finish university and get the job they wanted.

These discrepancies in how we perceive high achievers teach us that just looking at someone doesn’t tell us who they are. Our perception of others is often wholly wrong. If you doubt that, consider a family of ten: two parents, eight kids. Invariably, each child holds a different perspective on their childhood and experience. They do not grow up to be the same, even if they’re identical twins. Why? Because our individualism comes from our perspective, and no two people hold the same perspective. Perspective is more than what religion you are, what gender, what life experiences you have. It is the combination of everything, and then the actual vantage point from which your perspective is formed.

Imagine a room with those ten people in it. Something significant happens, and ten people walk away with ten slightly different perspectives and experiences, just by being individuals. It is why eyewitness testimony is so unreliable: human beings are too subjective, too bound by their own perspectives, to be objective.

Consider then the over-achiever: whilst you may envy their success, they may be tormented. Equally, they may be happy yet still feel they have not succeeded by their own standards. Someone ‘ordinary’, if any such definition can be made (the argument being that we’re all extraordinary in our own way), may be more content without having achieved as much. In this case, it is more than okay to be ‘just ordinary.’ Modern society offers many examples of exacerbating stressors, such as the pressure to gain a certain grade to get into a coveted school. Even for an over-achiever it may not be enough, and then even the over-achievers ‘fail.’ By not having those pressures in the first place, you avoid the potential success but also the downfall. It’s not easy being under the spotlight with everyone expecting you to perform. The advantage some over-achievers have is that they may excel under pressure, or be galvanised by it. But what of those who don’t find pressure helps them?

Back to my client. She was not content to be ordinary because she’d grown up with the competing pressures of a high-achieving brother who ‘could do no wrong’ and her own feeling of inadequacy. But how much of that was hard truth versus her perspective? In reality, it was all her perspective, and because of that, it was also possible to shift that perspective, giving her a chance to be less judgmental of herself.

How do you change the way you perceive things? How do you stop judging yourself for not measuring up if you are bombarded by subliminal messages that you’re not ‘enough?’

What constitutes ‘enough’ when we have disparate views on what enough means? If we have low self-esteem, we’re more likely to judge ourselves harshly for not being enough than if we are content or somewhat satisfied. If we’re too satisfied, or narcissistic, we may believe we’re far ‘better’ than we truly are. The delusion exists at both ends, telling us that extremes usually don’t work out. There’s a big difference between being ambitious and letting ambition determine your self-worth. Literally speaking, there is no such thing as not being enough. There is no such thing as someone being better than someone else. Those ideas of inequity begin and end with our faulty perceptions. If we see everyone as potentially equal and follow the path we want to be on, we’re more likely to reach contentment than by striving to please others or following someone else’s agenda for us.

Likewise, if someone is more beautiful, so what? If your boyfriend leaves you for someone more beautiful, it’s going to hurt, but you’re better off without him. If someone more beautiful is also more talented, then good for them. If someone more beautiful and more talented gets a job over you, good for them. You don’t have to take it personally, because it’s not personal. There will always be someone more talented, more beautiful, richer, more ‘successful’ in the eyes of the world. Once you realise that, you are free. Free to be yourself. Unapologetically. In many ways it comes down to freeing yourself of what others think. This is not something we can usually do until we reach a degree of maturity. If you’ve ever noticed that older people don’t seem to give a hoot what people think of them, this might be why!

Perception is a funny thing. Whilst you may be lamenting your ineptitude or hating your failures, someone else may dream of being exactly where you are. Someone may admire you for exactly who you are. That’s because our internal notions of unquestionable truth are slanted by our inability to be objective about ourselves. How many times have you admired a person who seemingly feels they can do no right? Artists who think their work is awful, when so many love it? Dancers who leave their careers thinking they’re never going to be good enough, when they enraptured entire audiences? Models who believe they are ugly? Novelists who never write a second novel because they thought the first inferior? The perceptions we hold can be faulty and sabotage our creations. When that straight-A student thinks she’s mediocre, someone else might be dying to be just like her. Conversely, the student who doesn’t put as much onus on being the best may be less tortured.

Navigating the world is often challenging. Ensuring you have enough self-respect, but not too much or too little, is a life-long balancing act. It helps if you can read the cues of others, instead of assuming everything is a personal attack. An interesting book about the different people you work with taught me early on that not only are no two people alike, but we must consider how people come to the table, and what forms their differences, if we want to stand a chance of communicating with them. In today’s world, where neuro-diversity is more commonplace and many people have complicated communication styles and diametric differences, it’s more important than ever to learn how to communicate with people you may struggle to understand.

Consider that the world had approximately half the population it has in 2022 only sixty years ago: competition is fiercer than ever. Being ‘ordinary’ could make it harder to secure certain jobs. But ironically, employers gravitate to ‘ordinary’ because they want reliable, efficient workers, and are often put off by high achievers (and, of course, low achievers) as too extreme. The competition at the top isn’t an enviable place when you consider the varied pressures it entails, without any guarantee of the level of attainment striven for.

On the other hand, I appreciate my Millennial client’s perspective, deluged as she is with Instagram photos of seemingly perfect people in every sense of the word. Everywhere, people want to be something more than they are. The few who reach contentment realise the years wasted focusing on ‘more’. This doesn’t mean ambition is bad, or desiring to achieve is wrong, but like everything in life, it’s about balance. I told my client that sometimes imagining how she will feel in ten or twenty years’ time is a valuable exercise. Why? Because we don’t do that. We think of what we want in the future (financial security, good health, etc.) but not what we will FEEL in the future. Sometimes imagining what you’ll feel in ten or twenty years’ time is a way to gauge what you really need (and don’t need).

Ultimately there will come a time in many of our lives when we look back and say, ‘That wasn’t important; why did I spend so much time focused on it?’ The old adage applies: on your deathbed, will you wish you had spent more time in the office? At the same time, there are those who live for work, and there is nothing wrong with that. It’s about finding what we want, rather than what we think we should be. Avoid the ‘if only’; consider what you really want now, and in the future, and aim for that. Be realistic. In so doing, you may come to see that being ‘ordinary’ is a privilege not afforded everyone, and something to cherish. Nobody is perfect. Nobody has to be perfect. Perfection is recognising that who you are is enough. For some of us, being ‘ordinary’ is the nirvana we hope to attain.

PLEASE NOTE: ARTICLES CAN ONLY BE REPRODUCED IN OTHER SITES WITH DUE ACKNOWLEDGEMENT TO BORDERLESS JOURNAL.


Seasons in the Sun

By Candice Louisa Daquin


I was lucky enough to avoid the era of self-help books for the most part, but I remember when I was a kid, many adults joined groups devoted to such pursuits. At the time it seemed sensible – I mean, what’s wrong with helping yourself? Or letting others? By the time I was old enough to work, one of my early student jobs was, unsurprisingly, in a bookstore. However, I was surprised at how many self-help books were still selling. Eventually I noticed those self-help books all shared the same theme: ‘Be positive. It’s the answer to everything.’

The message didn’t sit well with me once I began working with clients. It seemed the pressure of being positive and having to re-frame everything you thought/felt/said on a daily basis would be exhausting rather than healing.

Yesterday was one such example. In a small group of ten, I was talking to a mom who had recently had breast cancer. I said, “That’s a shame,” about something she’d mentioned. She immediately turned to me and said: “I try to reframe everything to a positive, so I don’t think it’s a shame; I’m all right with it.” The conversation was shut down as fast as if I had said, “I don’t care.” Fine, I thought, this is her way of handling things post-breast-cancer. Made sense. She’d been through a lot and being positive was working for her. Nobody else in the group said anything other than positive platitudes, and I left feeling like I had bathed in honey, but not in a good way.

When we can’t have balanced, adult conversations that include an acceptance that not everything is going to be 100 percent peachy 24/7, we fall into conversations of platitudes, small talk and falsity. Granted, I’m usually one of the first to be positive, but like anything, balance, and a time and a place, factor into this. If someone comes to me crying because they lost a parent, I’m not going to say, “I try to reframe everything into a positive.” That would be insensitive. Likewise, sometimes people feel they cannot talk about things because they’re going to be deliberately or inadvertently shut down for saying something that isn’t shining and positive.

The reality, however, is we do suffer. We can be scared. We are exhausted sometimes. We may have fears, or feel overwhelmed, or just depressed for no reason. None of those emotions are ‘wrong’ and by aggressively reframing every perceived ‘negative’ thought we’re cutting out our need to express ourselves. Yes, like everything we need to keep in mind there are definite advantages to thinking positively, but we may have gone too far.

My clients overwhelmingly share with me that they are exhausted and wary of the positivity police. By this they mean the mothers-in-law who shut them down for saying they’re tired or fed-up or cranky or perimenopausal. They embrace their ‘squad’ of female and male friends whom they can go out with for an evening and talk candidly with. No positivity police around. They are stressed from having to watch what they say, much like others complain that they cannot comment on a girl’s pretty dress for fear of being labeled toxic or inappropriate.

Linguistic change is good. It can help erase some common stereotypes, but if it shuts us all down and prevents candour, then it’s also harmful. When you jump on someone for not being ‘positive enough’ it isn’t very different from telling someone to ‘cheer up’ when they have clinical depression, or ‘get over it.’ And we all know how well that goes.

Granted, sensitivity training has fallen by the wayside since social media, but in some ways, it has reinvented itself as the erasure police: groups of people who take it upon themselves to pontificate on the right others have to their feelings. If those feelings are racist or sexist, then maybe someone should say something, but if they’re simply about how we’re feeling, does it help to tell someone they should be more positive?

Like anything, it’s how you say it, when you say it and why you say it. I have a client who is so negative it harms her, and so it behooves me to try to reframe her thinking – but I do so respectfully and in the context of therapy. If we were friends out for lunch, I wouldn’t shame her in front of others by saying she was too negative and needed to be more grateful and positive. That’s not friendship; that’s gaslighting[1].

The other day I was talking to a client whose husband died a few years ago. She was told by relatives, both his and hers, that she was ‘taking too long to grieve and needed to get back to living’. On the surface, she agreed, but later on, she felt bullied by their words, as if she’d fallen short of what was socially acceptable. She told me angrily (and has given me permission to share this) that she didn’t think anyone had the right to dictate how she should feel about losing her husband, or the duration of that grief, nor did it help to be shamed into thinking she wasn’t ‘doing enough’ to get over his death.

Again, clinically, if a client is experiencing challenges with grief beyond the intensity of the experience, they may wish to process this and find ways through it that are more expedient. But that’s a very different thing from being told, by those you want to trust, that you’re failing to get over something for which there should be no prescribed mourning period. The cult of positivity can be a stranglehold when it goes too far, as anything can.

Is there an alternative?

We can agree that positivity tends to benefit the beholder and others, whilst negativity can be harmful. We can also agree everything must come at the right time, and in the right balance for the individual. People are different. Ironically, some of my clients and acquaintances report they feel ‘less pressured’, ‘more relaxed’ and ‘less judged’ when they hang out with sardonic, less positive people. Contrary to popular belief, the most positive person in the room isn’t always the most popular.

I can relate to this because when too much positivity is heaped on my plate, I feel akin to a performing seal; it’s inauthentic, tiring, and doesn’t leave me feeling positive. Sometimes a really hard day, with plenty of negative experiences, can act as a better reminder of the value of life than someone shouting out positivisms. Likewise, if I watch a film where everyone is radiant and happy, it can seem less authentic, and sometimes it’s the struggle and the endurance, rather than, say, the happy ending, that captures my interest.

Why else do we appreciate dark humour? Or like watching psychological thrillers? It’s a bit like people saying you can only appreciate happiness if you have experienced grief. There is a wise lesson in the necessity for balance and reflection on both. If we police every sentence and dictate every action into what we believe are positive traits, we may be exhausting our natural state. It is possible to be realistic without being negative or positive. It is possible to be reflective without giving up hope or forgetting how challenging life can be. For some, extreme positive thinking appears to work, just as working out in a gym for five hours a day works for some. But not everyone lives in extremes; in fact, most of us do not. If I’m hearing from my clients that they are exhausted by having to maintain the appearance of positivity, then we’re doing something very wrong in thinking this is the answer to everything.

There is a time and a place for everything, or a season. And sometimes when the sun shines and it’s a beautiful day, we feel capable of maintaining a positive mood all day, and it seems to radiate from us. Whilst this may be an ideal, it’s not likely to be possible to sustain and if we don’t manage to sustain it, should we feel ashamed of that or believe we have failed?

Ironically, I could sit at a table with a bunch of self-deprecating, sardonic folk and have a real laugh, where we’re not all pretending everything is peachy but we’re pretty happy all things considered, and that’s without a need to reframe each word into a positive. There is something very liberating and freeing about being yourself, not having to watch what you say to such an extreme. When we police ourselves, we’re not doing therapy, or work on ourselves, so much as we’re being self-conscious about what we say, and often inauthentic.

So many times, I talk to people I meet, and they are unremittingly positive, but later on, when they know me better, they reveal a totally different side of themselves. A man I have known for some time makes a conscious point of being positive 24/7 because he’s in marketing and believes instilling positivity is how he sells efficiently. But when he’s had a glass of wine in the evening, he will reveal a completely different side of himself to us at the table. I can see why he needs to maintain the positive side for work, and I admire that, but I often wonder if he is a little tortured by the pressure he puts on himself to be like this 24/7, when it’s clearly not his ‘natural’ state.

Funnily enough, I like the ‘real’ side of him far better. It’s more realistic, less clichéd and narrowly focused, and, I believe, just as content, without the need to put the shine on all the time. Growing up with friends and family who were not afraid to be realistic or even negative, I can see both the value and the downside of negativity. Being negative all the time reminds me of the saying ‘nothing comes from nothing’, whilst realism is underrated and underutilized. I wouldn’t want to be as self-defeating as some of those I grew up with; I believe in empowering people and supporting them, which involves believing in them and being positive. But I also think too much of a good thing can be insincere.

Sometimes when a client comes to me and says they’re having an awful time, I know they do not want me to reframe that and tell them tomorrow will be better, or if only they could see the positive in the event, things will improve. That would be offensive and inauthentic. Rather, they want me to sit with them, digest their experience, share it and be a friend. Sometimes a friend doesn’t need to make things better, they just need to care and show up.

The extreme end of positivity feels a lot like a cult. It’s unrelenting, it changes language and natural feelings, it acts like a cancel culture on many authentic emotions and shuts them down. One of the best things I ever heard from a therapist was from a colleague who told me: “Sometimes it just stinks and you have to throw your hands up and say I give up! Until something changes, which it might not, for a long time, but eventually it will. If during that time you feel awful, that’s just how you are going to feel and it’s okay to feel that.” That therapist was radically different in their approach. There’s something honest and real about this advice, that I don’t find when I get overdosed on mantras, affirmations, positive sayings and memes.

Too much of anything can be an overdose, and that includes positivity. While a teen may get a lot from a positive meme on Instagram, they may also feel less alone by reading a negative meme. The point is to avoid the extremes of embracing darkness or pretending it doesn’t exist. The point is to consider that we’re humans, not trainable robots. Don’t we already stifle emotions enough by trying to be strong all the time? When did being honest about how you feel become anathema? If your heart tells you that you are exhausted by trying too hard, then examine this. Take a step back. Be authentic to yourself. Don’t follow the crowd. Follow your gut instinct. Sometimes the extra slice of really sugary cake isn’t delicious, it’s nauseating. Stay balanced kingfisher!


[1]  A form of emotional abuse and manipulation. https://www.healthline.com/health/gaslighting



A Post-Pandemic Future …?

By Candice Louisa Daquin


Having been a reluctant fan of apocalyptic fiction since reading George Orwell’s Nineteen Eighty-Four (1949), I studied virology when the AIDS pandemic struck and read a great number of books on viruses and infectious diseases. Despite this preparedness, and the knowledge that it was not a case of IF but WHEN the next virus would strike, I think I speak for most of us when I say we were still all unprepared for Covid-19.

What the pandemic has taught us thus far is immeasurable, and I believe, or at least hope, the lessons will last several generations. That said, it’s human nature to want to move on. Not because we don’t care, but because part of being alive is putting trauma and suffering behind us and ensuring those who survive truly survive, which means living. Is that insensitive or just the nature of the beast? It can be insensitive, especially to the millions who have lost loved ones, but it’s also how humans generally operate.

Is it possible to move on and live a full life irrespective of this global tragedy without losing our compassion and responsibility to stop this from ever happening again?

The reality is: it will happen again, and for many of us, in our lifetime. What we can do is be better prepared, with all that this entails.

What steps are being taken to move toward a post-pandemic future? What are we doing differently? And why?

The pandemic divided us; it physically kept us apart. Some who were well versed in social skills, true extroverts, struggled when they emerged from the worst of the pandemic. They found it hard, from lack of practice, to do the things they used to be so skilled at. I recall sitting at lunch with a friend who used to be the life and soul of any social event. She struggled for, as she put it, ‘her words’. Having become so used to speaking less and not being face-to-face, she said it felt ‘overwhelming’ and ‘strange’, and she looked forward to going home.

That is a habit we must break. The comfort of the living room and the immediate family is intoxicating. We can rapidly get used to living in a smaller, seemingly safer, changed world where we see fewer people, go out less, and become accustomed to an intimate circle. For some of us this was always our life, and maybe not as challenging a shift as it was for those who previously socialised a great deal.

In a way, the pandemic was harder on the extrovert than the introvert, because while introverts aren’t averse to socialising, they can find it exhausting, whereas extroverts gain energy from it. When you put an extrovert in a forced setting without social opportunity, they may struggle more than someone used to their own company.

But it’s not as simple as extroverts and introverts. Many of us are a little of both, depending on the situation. I can go out with a big group one day, but on other days I want to be alone. Few of us are extremes. Most of us are ‘ambiverts’, a combination of extrovert and introvert.

For those who do thrive on socialising, the pandemic was particularly challenging, but there are many ways to be affected, not least the tension and anxiety all of us picked up on or directly experienced.

Fortunately, technology became our best friend as we Zoomed more and met via video chats throughout the world. It opened up an international stage more than we’ve ever experienced and gave children a new normal in terms of how they learned online. But learning solely online had deleterious effects on underperformers. This ‘unfinished learning’[1] particularly impacted youth who might already have been struggling in the educational system.

Having taught Critical Thinking online for years, I genuinely believe online learning cannot replace in-class learning. There are huge draws to learning from the comfort of home, especially for adult learners who do so after work[2]. “In comparisons of online and in-person classes, however, online classes aren’t as effective as in-person classes for most students. Only a little research has assessed the effects of online lessons for elementary and high school students, and even less has used the ‘gold standard’ method of comparing the results for students assigned randomly to online or in-person courses.”[3] The amount of information retained is drastically smaller, and the social engagement of a classroom has benefits that are hard to quantify but necessary for social development. When you rob children of the opportunity to socialize with each other, you isolate them at a crucial stage in their development.

Some kids with learning disabilities[4] are particularly affected by this, as are those who come from unsafe or impoverished backgrounds, where they may not have equal access to technology or reliable internet. They may not have parents who can help them if they are stuck or be able to work from home or have access to lunch. All those necessary elements to the education system were lost in our need to stay home and protect each other. A generation of children will always remember this time as a result.

On the other hand, they have mastered technology in a way that few older generations can boast of, and they are conversant in all the myriad ways of communicating with a wide range of technologies and devices. They are adaptable, versatile and fearless when it comes to tackling the rigors of online learning. For some who dislike social settings, it may also be a vast improvement[5].

Women left the workforce in droves[6] when the pandemic hit, with two million fewer in the workforce. The inverse was true of men, who began to return to work, having previously been dropping in numbers whilst women’s participation rose. The Pew Research Center found: “What accounts for the larger labor force withdrawals among less-educated women than men during the pandemic? It is complex but there seems to be a consensus that it partly reflects how women are overrepresented in certain health care, food preparation and personal service occupations that were sharply curtailed at the start of the pandemic. Although women overall are more likely than men to be able to work remotely, they are disproportionately employed in occupations that require them to work on-site and in close proximity to others.” Jobs men traditionally do, like physical labor, were in high demand, whilst many jobs traditionally filled by women were shut down, often not returning[7].

We can be glad our restaurants are open again; we’re opening borders, we’re flying abroad, we’re living again. But let’s also spare a moment to think of those who lost so much it’s almost impossible to conceive. Covid was the third leading cause of death in America during the height of the pandemic; how did this many deaths become normal? Covid killed an estimated 13% of people over 80. Aside from the tragedy of a generation of elderly dying[8], and the loss of grandparents and parents for so many, we’ve also seen younger people dying from a virus, which has shaken the belief younger people have that they are impervious to viruses similar to the flu. What effect will this have on their sense of safety going forward?

And what of the health consequences for those who technically survived a bout of the virus but developed ‘long Covid’ or, worse, the side-effects and lingering legacy of being seriously ill?[9] How many lung transplants will occur? How will ‘long haulers’ cope with lingering serious effects? What of those who live in countries where treatment isn’t an option? How many chronic illnesses will continue for decades as a result of this pandemic? It’s not enough to point to those who have died; we must also include those who survived, but at such a high cost.

Financially, we have collectively poured money into research, vaccines, countermeasures and prevention, but where has that money actually come from? And can we feasibly borrow that much from our coffers without a reckoning? Economist Anton Korinek, an associate professor with a joint appointment in the University of Virginia’s Department of Economics and the Darden School of Business, thinks: “People sometimes frame the policy response to COVID-19 as a trade-off between lives and livelihoods, and they ask whether it’s worth killing our economy to save people’s lives. But what they forget is that people won’t go back to a normal life and consumer demand won’t really recover if the virus is spreading through our country and killing people.” But the result of these hard choices and repeat closures is a predicted worldwide recession, one that had already been mounting prior to the pandemic but promises to be far greater in its aftermath. I don’t think we’ve even begun to see the fallout; it begins with massive inflation, but that’s just the start[10].

History tells us that when we go through challenging times and survive – ‘the near miss experience’, as it’s known – we want to live more than ever before[11]. But economically this will not be possible for the many who are robbed of their financial security by inflation, redundancy, underemployment and post-Covid illness. We should be mindful that none of us are all right if many of us are still suffering, and we should support those who struggle; this battle with Covid should have taught us all to care more about each other.

Perhaps these are the steps we can take toward a new post-pandemic future, in which we consider ways we may be better prepared for an inevitable future of emerging viruses. We can try to avoid spilling into areas with high disease potential. “According to a group of UN biodiversity experts, around 1.7 million unidentified viruses circulate in animal populations, of which 540,000 to 850,000 have the capacity to infect humans.” So, we can avoid wet markets and sloppy scientific research, both of which are vectors for the spread of viruses. We can pay more emerging-virus hunters[12] to seek out those viruses and begin work on treatments before they devastate countries. We can be borderless in our unanimous approach to equity for all, especially access to healthcare.

In America, we learned we were far from unassailable. In a New York Times article about Covid deaths, the authors wrote: “For all the encouragement that American health leaders drew from other countries’ success in withstanding the Omicron surge, the outcomes in the U.S. have been markedly different. Hospital admissions in the U.S. swelled to much higher rates than in Western Europe, leaving some states struggling to provide care. Americans are now dying from Covid at nearly double the daily rate of Britons and four times the rate of Germans.” Nothing can diminish that fatal statistic or rectify the unnecessary deaths[13]. Our healthcare system, considered superior, proved to be full of holes. Without some type of socialised healthcare, our costs are too high and our resources too scarce. We don’t value front-line workers like nurses, porters, assistants and care staff, and we do not pay them for the risks they take; whilst we do pay doctors good wages, we have severe shortages of knowledge and progress. Finding out we didn’t have enough ventilators, masks for medical staff, PPE and beyond exposed the shame of putting profit over people.[14]

It is no surprise, then, that the UK and USA were among the top offenders in the rise and spread of the pandemic, and their death rates exposed this. No one ethnic group appears to be at greater risk of dying from the virus based on ethnicity alone, but Hispanic, Black and Native American (AIAN) people are about twice as likely to die from COVID-19 as their White counterparts, and Hispanic and AIAN people are at one and a half times greater risk of COVID-19 infection than White people[15]. This is caused by social reasons (inequality), not ethnicity, as can be seen in Africa and some Asian countries having some of the lowest Covid mortality rates. In the article ‘Racism not Genetics’ in Scientific American, the authors point out that “the genes that influence skin colour are distributed independently of genes that influence the risk for any particular disease. Given the heterogeneity of groups we call “black” or “white,” treating those categories as proxies for genetic variation almost always leads us astray.”[16]

Even if there are increased susceptibilities related to blood type[17] and age (more than 81% of COVID-19 deaths occur in people over age 65; the number of deaths among people over 65 is 97 times higher than among people aged 18-29), the real risk is how healthy the population is and whether they have safe access to healthcare[18]. Both America and the UK failed because they put profit above people and have large populations of sickly people[19]. Going forward this needs to change, which means redesigning what we prioritise. People need to have access to healthcare and make lifestyle changes that will reduce their risks, which they cannot do if they cannot afford to see a doctor or, in the case of the UK, find it hard to see one because of long wait times and reduced staffing. It’s not as simple as socialising healthcare, as the UK proved; this alone doesn’t save lives. What saves lives is considering the larger picture.

But politicians gain from older populations dying; consider what happened in Brazil when the President denied the danger of Covid, and for a time Brazil had the highest Covid mortality[20]. This is the harsh truism rarely mentioned: it benefits those in control of a society to lose the most fragile members who will use up precious resources. Much like a form of eugenics, it behooves them to let it happen, and there are many examples[21]. For a politician looking for ways to reduce healthcare costs, what is better than some of the potentially most expensive ‘customers’ dying? This happened in France, where a large number of elderly people died one summer; shockingly little was said at the time, but all signs pointed to a collective sigh of relief from those in power who benefited from fewer older people making claims on an already taxed medical system[22].

When Italy[23], Spain[24] and Brazil[25] became epicenters of Covid-19 deaths, they did so because of ill-preparedness, and it’s a cautionary tale to witness which countries succumbed to the ravages of Covid-19 repeatedly versus those who learned from them. What we have learned is that more, not less, needs to be done, and if a country keeps its borders open, including air travel and business travel, then as much as it hopes to save its economy, it does so at the expense of its most vulnerable. For some countries this was a conscious choice (economy over lives), whereas for others it was poor communication and slow response times. For some, a lack of money; for others, a desire to gain at any cost. All this speaks of the tapestry that is the pandemic’s aftermath (and truly, is it really vanquished?)[26].

I’d love to say the post-pandemic future looks rosy, but the only way that happens is if we learn from our mistakes, which history tells us we rarely do. The most important thing is empathy: when we saw others take their masks off and simply not care if the vulnerable died, we saw how far we as humans can fall. But we also saw how wonderful humans can be, including the infinite sacrifice and compassion of thousands who sought to help strangers. If there is a way we can reward the good and not the bad, if we can get our priorities right and stop paying sports figures astronomical sums, and instead emphasise compassion, kindness and diligence, we can all grow together.

I was particularly moved by youth who, in the turmoil of the pandemic, created inventions or systems to help others[27]. Believing youth are our future, and thus our hope, it gives me great faith when I see those too young to vote care for strangers and seek to do their part. We should always encourage this, as we should encourage a continued dialogue into how we can create an international rapid response to emerging diseases. It is not if, but when, and now all of us should know this and have no excuse for putting our heads in the sand again. Yes, it hurts to think of it; yes, we’d rather go off and have fun, but what fun is it if we are only postponing the inevitable return of a lethal virus? Part of being responsible for our planet and each other is not avoiding the harsh truths: of environmental change and devastation, global poverty, continued inequality and elitism, and of course, the increasing risk of deadly diseases.

We have within us all the power to effect change. The steps we take toward a post-pandemic future must necessarily include keeping our eyes open and not taking the easy road. Sure, governments don’t want to spend the money on research, science, virus hunters, predictions and preparedness, but I challenge anyone to say this isn’t exactly what they need to do. We must keep this in mind when we vote and protest. We should be marching about this as much as any other cause, because it affects us all equally and brings us together with one cause.

Thinking in terms of one world, we are less divided than ever before, and whilst we were separated, I think we also found ways to come together, if we choose to. I say we should, because together, globally, we learn more than we ever would divided. With Russia’s offensive on Ukraine, we see the lunacy of war: the futility, the devastation and the waste. Instead of pouring millions into wars and keeping the rich rich at the cost of the poor and overworked, we should consider how we can all rise out of the mire and evolve towards a better future. But to achieve this we cannot be complacent, and we cannot let our guard down.



.

Candice Louisa Daquin is a Psychotherapist and Editor, having worked in Europe, Canada and the USA. Daquin's own work is widely published; she has written five books of poetry, the most recent, Pinch the Lock, published by Finishing Line Press. Her website is www.thefeatheredsleep.com.

.

PLEASE NOTE: ARTICLES CAN ONLY BE REPRODUCED IN OTHER SITES WITH DUE ACKNOWLEDGEMENT TO BORDERLESS JOURNAL.

Categories
The Observant Immigrant

A Bouquet of Retorts

By Candice Louisa Daquin

Language. Children learn it before they understand its importance. Adults can struggle to learn additional languages because the brain is less elastic as it ages. The formulation of language is a key component of what makes humans human. Our language (though not our communication) is speculated to be among the most complicated and rule-bound of any living creature's. Unless that is disproved in the future, when we may be able to translate other animals' languages, we can all agree the impact of language on us is invaluable.

Why then are our language skills diminishing? For every person who has benefited from being able to look up information online and thus know a little about a lot of subjects, we have simultaneously reduced our linguistic breadth. We are increasingly tempted to take shortcuts, both in writing (texts and emails) and in how we speak to each other. It may be tempting to blame this on social media, but it's not that simple. Nor is it new: throughout time, enormous value has been placed on 'banter', 'ridicule', 'sarcasm' and pithy retorts.

Perhaps people who can summon lightning-fast repartee are considered witty, nimble-minded, fashionable. Contemplate those who have been considered 'cool' socially. Those who had the quick response, the short soundbite that cut to the chase or was easily repeatable, were often admired. Just recently, Ukrainian President Volodymyr Zelenskyy, in response to an American offer of a flight to get him and his family out of harm's way, replied: "I need ammunition, not a ride."

People applauded his response because in many ways it captures the crux of what Volodymyr Zelenskyy and his country are going through and portrays him as a brave leader. Just as Clint Eastwood in the Dirty Harry series delivered the much-emulated "Go ahead, make my day" and Arnold Schwarzenegger coined "hasta la vista, baby" in the Terminator series, alongside other iconic statements by film stars, celebrities, politicians and authors. It's the admiration of cutting to the chase, emulated by millions, as a means of extracting the essence of what we're trying to say, or merely being glib. We utilise quotes now more than ever, through social media, which makes it easy. The only question is: when we scroll through quotes and need ever-increasing variety in our lives, are we really absorbing the meaning behind the soundbite or merely parroting it?

There is a history behind the proverb, the quote and the parable. They were a means of remembering wisdom easily, for those who might have been illiterate, or before books were widely available. Along with songs, this was a method of retaining what was not written. Religion has employed this through easy-to-remember choruses and proverbs; it has long been human nature to reflect on life through such sayings. In the 14th century, the popular proverb "He who sups with the Devil should have a long spoon" and sayings like it were a means of social control or wisdom, depending on their use. In Aesop's famous fables we find: "We hang the petty thieves and appoint the great ones to public office." We can admire the truism of this, just as we admire the bravado of a screen hero saying "Go ahead, make my day." Perhaps Confucius said it best: "They must often change who would be constant in happiness or wisdom." And quotes are a ready means of growth that don't require the commitment of reading, say, The Four Books and Five Classics of Confucianism.

But when do quotes stop being educational and become, instead, inadequate shortcuts to knowledge? Where I live, people don't often read whole books anymore. They excuse this by saying they are 'too busy' to read. I work a 60-hour week and I read. I can understand emergency-room doctors and new parents not having time to read, but regular folk? You can tell how much someone reads by asking them what their favourite books are. If more than one of their answers is a high-school book, chances are they haven't read much since high school. Reading is an evolutionary experience and we grow as we read. Many people have forsaken television (which isn't a bad thing) but also the genre of fiction, believing it has no worth. Are we 'better' for reading less fiction, and for reading fewer complete books and more online news channels, texts and memes? Do we lose something? And if so, what do we lose when we absorb language in this different, shortened way?

When was the last time you sat down and had a really in-depth conversation that wasn't about something as serious as your parents' dementia or a breast-cancer diagnosis? Simply analysing a book you read, a play you went to see, a film, or a discussion of politics or history or psychology? Granted, some of us may never have sought to do this, and that's okay, but among those who used to analyse, it seems analysis is less mainstream and now very specific to one's job field. Fewer people sit on Sunday afternoons and read the paper from front to back. More scroll quickly, gleaning the basic amuse-bouche but nothing of substance. If you are a marketing analyst, you analyse market trends. If you work in the financial sector, you may analyse financial impact. If you are an economist, you may consider economic development. Because most of us work such long hours, do we really have the time, energy (or desire) to analyse things we don't have to?

It could be a sign of the times, of modernity, cultural shifts, progress even, that we don't need to delve as deeply. No longer are we subject to poring over piles of textbooks to hand-write a paper. And some of that progress facilitates other knowledge, such as the ability to navigate the web and beyond: to understand HTML, design websites and publish books all by ourselves, things that formerly would have seemed impossible. The scope of what we can do with technology has expanded our choices recreationally and professionally. Kids are creating entire music albums in their bedrooms with affordable equipment, people are making whole films on their iPhones, others earn a living filming themselves for social media platforms, models are made by Instagram photos; we have all become graphic designers and editors of our own stories.

On the flip side, jobs that once promised a living wage, such as graphic design, photography, editing, translation and music production, are being replaced by cheaper options. Platforms like Fiverr can design your tattoo, create a corporate logo, a book cover, anything you should desire, for a fraction of the price a professional would charge, because their sellers are borderless, not beholden to the rules of old, and could well be a 16-year-old practising graphic design from their bedroom. On the face of it, there is nothing wrong with that. If someone in Indonesia wants to offer those services for a fraction of the price a US supplier would charge, and undercuts them in the process, isn't that just the consequence of a free marketplace? If that Indonesian supplier wishes to charge what is for them a relatively fair wage, but what the US market would consider a pauper's wage, is it exploitative of the Indonesian? Or must we accept an individual's choice to make that decision for themselves?

Should we question what 'choice' means if there is, effectively, less choice when someone offers services for less? What is equitable? Ethical? With technology, half the time we don't know the country, the nationality or the details of the people we work with, and it becomes less important. The only reason it matters is if someone is working from Nepal for $3 an hour whilst someone else is working from Australia for $25 an hour. Yet if everyone is aware of that inequity but agrees nonetheless, who are we to complain? Cost of living varies, and perhaps charging commensurate with your cost of living is a more realistic model than across-the-board flat rates. How much has changed since the internet opened borders and countries to a greater freedom of commerce and services than ever before? As with anything, there is exploitation and there is improvement, and there is no one simple answer to ensuring everything is fair, or that nobody is exploited.

How does this relate to language? Or to a series of retorts? It comes down to shifting social mores and expectations, and with these comes a modification of language, much like that in the science fiction novels of the 1960s, where a homogenised world watered down uniqueness in favour of uniformity and created a melting pot in which language, among other things, was diluted for simplicity's sake. Interestingly, science fiction also created entirely new languages (conlangs, or constructed languages) and worlds, so one could say it added to the future as much as it abstracted from it. Using a posteriori languages (those that borrow words and grammar from existing languages) has become commonplace, from Spanglish to Yiddish in America.

In the article ‘SMS & Its Negative Effects on Language’ (www.itstillworks.com), the authors note the mass use of shortened ‘slang’ language employed by societal groups, not least teenagers. Such habits have been speculated to carry over into reduced grammatical and spelling abilities, although conversely it could be argued that if teens are writing more (even badly), it encourages those who formerly may not have written at all. If you consider how much your social skills have deteriorated since the pandemic, because of lockdowns and less direct socialising, is it that hard to believe persistent use of abbreviations and icons could replace language fluency? In the article ‘Alienating the Audience: How Abbreviations Hamper Scientific Communication’ (www.psychologicalscience.org), the authors point to the loss of deeper meaning when constant acronyms are employed.

In The Times of India article ‘Shortening language has negative implications’, the authors point out that misuse of technology (always being ‘on’ and responsive to it) ironically reduces efficiency, and that ‘infomania’ can cause an overload of information. Being ‘dumbed down’ by technology and linguistic abbreviation could reduce the sharpness of our knowledge. Conversely, The Atlantic argues in its article ‘Teens Aren’t Ruining Language’ that while ‘fad’ words may have a different trajectory online, they don’t have the power to ‘debase’ linguistic standards: “How much a person’s vernacular changes over time may have as much to do with personality and social standing as it has to do with age. The extent to which teenagers are credited with (or blamed for) driving lasting change to language … is grossly overstated.”

Whatever language we speak, we may be aware of this shift away from depth: not only reading fewer complete works but expecting a synopsis instead. How does this affect conversations? Social interactions? What do we value and consider ‘worth’, and what does the shift from meaning to soundbite imply? It is good to be able to Google everything and to understand things we may never have looked up if the internet did not exist. But simultaneously, we’re aware that what’s online isn’t always factual so much as a series of compiled opinions. If history is written by the victor, doesn’t it stand to reason that what we assume is ‘fact’ shifts dramatically? What people in China are exposed to right now as ‘incontrovertible truth’ is not the same as what people in Russia are being exposed to, or America, or India. With so much variation in what represents pure truth, shouldn’t we worry about that? Or are we imagining a past where truth existed in a purer state than it ever did?

Journalists used to have to write objectively about subjects, unless they were writing opinion pieces or columns. Even with the latter, there was a responsibility to report news from both sides and not let personal bias taint the reporting or the information being presented. When you read news articles now, if you step back and try to put your own political views aside, what do you think of the reporting? Is it objective? Take any side you want; what you’ll find is often blatant or subtle bias, invested in projecting one perspective as the ‘only truth.’ Even the most basic Wikipedia definition of journalistic objectivity states: “To maintain objectivity in journalism, journalists should present the facts whether or not they like or agree with those facts. Objective reporting is meant to portray issues and events in a neutral and unbiased manner, regardless of the writer’s opinion or personal beliefs.” Why then is this not demanded?

I admire journalists who can step outside their own views and write on a subject without that natural bias. Non-journalists are biased and look to journalism as a fair reporter of facts, from which we can make up our own minds. Reporting shouldn’t be an opportunity to tear one side down to promote another. In America, the backlash against Donald Trump was perhaps the greatest witnessed in this country, because of the deep divide between voters and the horror many felt towards him. He was considered dangerous for the country, and irrespective of whether that was true, the majority of news outlets were 100 percent against him. When I brought this up, I was told I was obviously a racist who supported Donald Trump, otherwise why would I even care? This missed the point. I cared not because I wanted to defend Trump, but because I felt objective, rational, unbiased news reporting had been completely eroded.

It’s more important to me that we retain that objectivity even in the face of things we may personally revile. A journalist who is unable to be objective forfeits the right to condemn another, because they are not utilising that objectivity in their analysis. Maybe we cannot expect regular, everyday people to be free of bias, but when the moral underpinning of your job requires it, you owe it to your readership not to pander to their outrage and stir the pot, but to present an objective overview. The same is true of social media ‘conversation’, where a subject is presented and people sound off, often becoming offensive, outrageous, and exceeding the remit expected if we were all sitting in a room together. The anonymity afforded by a screen and physical distance seems to have opened a Pandora’s box of horrors.

People can be unrecognisably offensive in their attacks on others, for no discernible reason. It should be possible to discuss any subject without people devolving into personal attack and ad hominem. Has our use of language also been altered by our anonymity online? The oft-disputed Sapir–Whorf hypothesis states that the language we speak influences the way we think. This verges on potential fascism if it judges one language superior to another, but the point is taken. The complexity of language has been linked to higher IQs, which may explain why Finnish and Japanese students tend to rank in the top ten academically, their languages being among the most complex. Likewise, polyglots, who speak multiple languages, and those who are musical or mathematical (each often considered a language of its own) tend to possess higher IQs. The problem is we will never know what other factors play a part, including nature and nurture.

This paradox between cancel culture, which looks to demote those who say things deemed offensive, and the increasingly offensive backlash and gaslighting of others, seems to point to the hypocrisy of ‘free speech.’ It’s only free if the right person is saying the right thing; otherwise you’re liable to be cancelled, but watch out, because that’s subject to social fashions that vary wildly. Beyond that hypocrisy, what of the actual quality of communication? Have we devolved to the point of only being able to say what we really think anonymously? Why do people write reams online but say little in a real-life group? Are we hiding behind rules and etiquette that only create sub-groups with no tolerance for other groups? Does groupthink ever produce anything other than subjective thinking?

If aliens came to Earth, they might notice humans seem to admire ridicule and socially sanctioned sarcasm, calling it wit, without needing a deeper conversation. Twitter represents this phenomenon most acutely because the length of a post is restricted and thus you have to encapsulate what you want to convey, honing it to that word count. That’s great if you’re delivering a highly edited statement, but how naturalistic is it as a form of legitimate communication? Does it replace your grandmother sitting at the kitchen table with you for an hour? Maybe it doesn’t have to, but do we still sit at the kitchen table and have those conversations? And if we don’t, will that affect what human communication is evolving (or devolving) into?

Sometimes flim-flam is appealing in its starry simplicity. We embrace digi-fiction, written and read on computers, changing how we process fiction. We can be attracted to the code-switch of language that takes the guesswork out of communication. For neurodiverse populations, and for people increasingly using actual code such as HTML as a form of language, it may seem a logical next step: an improvement on the guesswork of complex modes of communication that were challenging for so many. At the same time, when we lose the ability to read a book from beginning to end, we lose the patience and the journey of that process, which, if not replaced, may be a genuine loss we cannot even fathom. Then again, in the spirit of all possibilities, it could be that we leave behind what is not necessary and embrace a haiku perspective of saying more with less.

I confess, when I read a ‘classic’ novel I am aware of how much superfluous information exists that doesn’t strictly need to. It is interesting to consider how much language we once used to say one thing, compared to now. The medium of social media means we’re busier than ever and take our ‘fix’ of what attracts us (quickly) before signing off. Long poems therefore hold less attraction than shorter memes. We fixate on the easily presented, the humorous, the immediate. Nuance, subtlety, slow burns: those are almost luxuries we may leave for rare nights in the bath. Novels are changing to adjust to this phenomenon. Graphic novels are gaining further traction, even songs. Our entire social fabric has altered, and in some ways this was inevitable if you recall we always admired the witty, fast retort, going back to Marie Antoinette, Gloria Swanson and beyond.

Does this mean all language must conform to this new rigour? Or will pockets of devotees to other forms of communication endure? When I browse through bookstores, I notice there are many styles of writing, including the long-winded and the easy read. My fear is not that we read ‘easier’ books but that we stop reading altogether, believing scrolling on our phones compensates for the discipline of reading a book. One may argue: do we need discipline? But learning is invariably discipline, and part of honing rigorous learning habits is being challenged. We can do crosswords, play chess, solve sudoku and go to the gym to maintain a healthy body and mind, but the ‘imaginarium’ of fiction and the need for creative expression are, for many of us, equally necessary.

Fiction isn’t a waste of time simply because it isn’t literal. Fiction, as a speculative field, has inspired science, politics, social advancement and a sense of possibility. Fiction can thrill, entice, or simply entertain. Not all forms of entertainment are equal. Whilst I confess I do watch television, I recognise the lasting value of a book compared to a serial, because it requires more of us mentally. For highly disciplined souls, there may be no need to ‘indulge’ in fiction or television. Maybe reading Scientific American or poring over The Financial Times or Anglers Digest will be their choice. But language has a trickle-down effect, and you can guarantee it will eventually permeate all sectors of our lives.

Do we want to completely dilute the value of further explanation, detail and depth in favour of the glamorous soundbite? Or is it possible to harness the value of succinct communication while retaining the continued relevance of detail? When I read what passes for scientific news in popular media, it concerns me that we are picking and choosing for ‘clickbait’ purposes, and this leads to the proliferation of inaccuracy. Case in point, the startling headline: ‘The Epstein-Barr Virus (EBV) causes MS!’ Then, in the meat of the article, we find that some 95 percent of humans carry EBV, whilst only about 35.9 per 100,000 people (95% CI: 35.87–35.95) have MS. We are leaving out the most important connections in favour of scaremongering journalism, which only serves to increase (inaccurate) neuroticism when it should seek to educate and elucidate.

It’s not that too much information is bad for us, it’s that too much incomplete information can distract us from truth, and we may learn to gloss over what matters in favour of what shines brightest. Sometimes it is necessary to finish the chapter.
