
Piano Board Keys

By Candice Louisa Daquin

In 1967, the US Supreme Court ruled in Loving v. Virginia that black and white Americans had a legal right to intermarry. Between 2000 and 2010, the number of white and black biracial Americans doubled, while the population of adults with a white and Asian background increased by 87% (Pew Research). The term “horizontal hostility” describes black mixed-race experiences of societal black rejection, and how this perception of ‘(in)authenticity’ impacts self-perception and the expression of ethnic identity.

Recently, something happened to me personally that segued into a greater story of biracial identification in America. I have lived in four countries so far, and nowhere has racial identity been as contentious as in the USA. When the Queen died, like many others, I did a post saying ‘Rest in Peace’. I am by no means a monarchist, but serving for seventy years felt like an impressive feat. I was immediately jumped on by a few who felt I was pro-colonialism and a “white privilege oppressor”.

As a psychotherapist, I often bite my tongue and do not express myself when others are insulting or being triggered. I have grown to respect the value of doing this, because saying anything to clarify or defend contentious subjects too often inflames things. However, as this was posted publicly, I had to clarify. My point is not about what happened to me, but about the assumptions that individual made in calling me a “white privileged oppressor” and in presuming I am white.

If people of colour decide the degree of melanin in another’s skin represents their race and culture, they will only end up emulating what was done to people of colour by white-skinned racists. Two wrongs do not make a right. This comes up a lot as we discuss what it means to be a person of colour. African-American presidential candidate Ben Carson accused President Obama of not being able to understand “the experience of black Americans” because he was “raised white”. Those who are mixed-race are more likely than those of a single race to fail to ‘please’ either side.

Just ask celebrities like the singers Mariah Carey or Shakira, who have struggled with this their entire lives. Or the singer Lenny Kravitz (Black, Jewish, and Native American), who recalled having to fill out the ‘race’ sections on school forms: “My great-grandmother’s Cherokee Indian. My father’s a Russian Jew. My mom’s Bahamian. [I thought], ‘what the hell do I put on this thing?’ The teachers came over and [said], ‘Black. That’s what you are.’ And so, so many parts of your heritage are just squashed. That’s it.” (Huffington Post, 2013). Obviously, if you can ‘pass’, then you have that attendant privilege. Where I live, about 70 percent of the population is Hispanic, and only recently has there been talk of ‘white’ Hispanics versus ‘brown’ Hispanics, which goes back to the caste system in countries like Mexico: historically, the darker you were, the more ‘Indio’ and thus serving class you were considered; the lighter you were, the more Spanish. Ultimately, these sub-categories seek to further divide people rather than describe them.

Fortunately, this racist tide is beginning to turn as people understand skin colour should never confer privilege, even if historically it was warped to do so. Perhaps, as in any culture, there is a desire to stand out from the average: if you are lighter or darker than most, you may be admired more, or less. Ironically, in countries like England, Canada, France and Germany, where everyone is trying to tan and become darker, darker skin is considered more attractive. So, we have two polar opposites: parts of Asia, where women may even bleach their skin to become lighter, and parts of Europe (and America), where people may literally die to tan.

In this day and age, so many of us are ‘mutts’, meaning we are so mixed we carry Black, Asian, European, everything. But we are still striated into colours because of racism and casteism. Those are not the only reasons, though; it is also about how we identify and how others identify us.

“Individuals who do not fit monoracial categories may be oppressed on systemic and inter-personal levels because of underlying assumptions and beliefs in singular, discrete racial categories” (Johnston, Marc P., and Kevin L. Nadal. 2010. “Multiracial Microaggressions: Exposing Mono-racism in Everyday Life and Clinical Practice.”). I was assumed to be Anglo because I look it, so as far as others were concerned, I could not understand the experience of being of colour because I don’t have any colour. Even if I were married to a person of colour, with children of colour, and my parents were of colour, it would still be about my individual experience. But the flaw lies in assuming we can have an individual experience. We can’t. We are moulded by our family and our ancestors, and whilst some of us may not know where we come from, DNA testing makes it more possible to find out. This should alleviate some of the worst racism, but it hasn’t. Both sides seem further apart than ever before.

Author and activist James Baldwin defined his stance thus: “he was a Negro by choice and by depth of involvement–by experience, in fact.” Meaning, even if someone did not ‘look’ black, if they were, and identified as such, they were. The one-drop rule is a long-held legal principle of ‘racial classification’, prominent in the 20th-century United States. It asserts that any person with even one ancestor of black ancestry (“one drop” of “black blood”) is considered black. Before the American Civil War, free individuals of mixed race (free people of colour) were considered legally white if they had less than either one-eighth or one-quarter African ancestry (depending on the state). Equally, during slavery in America, those born to an enslaved mother were automatically enslaved from birth. Racial integrity laws have existed throughout time among different groups and are essentially used to oppress a particular group. In theory, they could be easier to enact now, given DNA testing.

Ironically, I have more blood of ‘colour’ than many who, if we were in a photograph together, would be assumed to be of colour whilst I would not. That is understandable; what is not understandable is when people deny mixed-race individuals their identity by seeking to label them, or condemn them for being able to ‘pass’. Ethnic groups and racially distinctive groups vary but can also be the same. Respecting someone’s ethnicity and race is necessary in order to avoid becoming as bigoted and discriminatory as the past.

When George Zimmerman fatally shot Trayvon Martin, he was called a ‘White Hispanic’ for three reasons. One, he was light-skinned. Two, his last name was a non-Hispanic name. Three, he shot a black child. It was an example of the media manipulating the truth in order to make Zimmerman seem more racist. Perhaps Zimmerman is simply a racist, or maybe he would have shot a kid no matter their skin colour; we may never know. We know Martin called Zimmerman racist things like ‘cracker’, but since society says a person of colour cannot be racist, that was not considered. Whilst most of us hopefully want violence against young black men to end, we shouldn’t deny that much violence toward young black men is perpetrated by young black men. Lack of opportunity seems to kill young black men as much as racism does, but maybe the two are the same thing, coming from different directions.

What we can say is that our society hasn’t given young black men chances, and that can lead to increased temptation toward crime or violence. Surely if a young black man is shot for simply walking down a street, nobody should justify it. Just as with Breonna Taylor and so many innocents killed for the colour of their skin. However, we should be able to make this argument without turning the perpetrator into a white man when he was not. It is a classic example of manipulating the truth in order to make something more about racism than it may have been. Or perhaps it was purely about racism; but if people of colour cannot be racist toward one another, then how can it be? There are so many issues here. What we do know is that two wrongs don’t make a right.

Pew Research has found most Americans who are mixed race (61 percent) identify with one race because they ‘look’ like that race, which points to how we look as continuing to be the determinant of racial identity, even if it is inaccurate and often leaves people feeling they have lost half of their identity. The survey also found that the way people describe their personal racial background does not always match the way they think others see them: “Six-in-ten Americans with a white and black background (61%) believe they are seen as black; only 19% say they would be seen as multiracial (an additional 7% say they would be perceived as white only).” The shift is happening; case in point, Rachel Dolezal, who was the head of a local chapter of the NAACP and identified herself as African-American. But her Montana birth certificate said she was born to two Anglo parents. Dolezal earned a master’s degree from the historically black Howard University in Washington, D.C., and was a professor of Africana Studies at Eastern Washington University. Her contemporaries assumed she was African-American. It shows that whilst for many years people with black heritage may have sought to deny it, now some Anglos seek to be black.

One of my best friends had a red-haired white mother and a Jamaican father. She was 70 percent ‘Anglo’ because her Jamaican father was not entirely black but mixed race himself. But she ‘looked’ black and identified as black, whilst her brother looked white and identified as white. Which are they? Is identity sufficient to say? Or how others perceive us? I can say I’m mixed race, but if I tell people I’m a black woman or a Latin woman, I might be laughed at because I don’t look like I am. Would it even be right to say so? What is right? It depends upon whom you’re speaking to. These are reductive discussions of identity that parody race and don’t allow individuals to say who they are.

My siblings could look black whilst I could look white. This can leave people feeling they have ‘racial imposter syndrome’, where a person feels they are appropriating a culture that is actually not their own! If we feel liminal, as if we drift between cultures but belong to none, isn’t that often because of the stereotyping that goes on within cultures as much as between them?

I’m Jewish, but I do not believe in God, nor do I go to Temple, so when I have tried to join Jewish writing groups, I have been shunned as not being Jewish enough. When I worked for a Jewish organisation, I was considered Jewish, but I was the ‘wrong’ kind of Jew because I was Mizrahi and Sephardi rather than Ashkenazi. In other settings, I wasn’t brown enough to be considered a Mizrahi or Sephardi Jew. The absurdity of all the microaggressive ways a person can be catalogued or disqualified wasn’t lost on me. It is worse for those who are more obviously mixed race but don’t possess whatever a given group demands for admission, and who are also racially attacked by other groups. For example, what does ‘you act white’ really mean? That you are not speaking with the right accent, or that you should know another language, or wear different clothes? My other friend is constantly told she is not Latina enough because she has no accent, her Spanish is perfect rather than Tex-Mex, and she likes to eat Indian food. Does one group have more of a ‘claim’ to being of colour?

References:

https://www.pbs.org/wgbh/pages/frontline/shows/jefferson/mixed/onedrop.html

https://www.npr.org/sections/codeswitch/2017/06/08/462395722/racial-impostor-syndrome-here-are-your-stories

https://www.npr.org/2010/12/20/132209189/how-multi-ethnic-people-identify-themselves

https://theconversation.com/who-counts-as-black-71443

https://www.cnn.com/2016/10/10/health/biracial-black-identity

https://www.tandfonline.com/doi/full/10.1080/01419870.2019.1642503


Candice Louisa Daquin is a psychotherapist and editor who has worked in Europe, Canada and the USA. Daquin’s own work is widely published; she has written five books of poetry, the latest, Pinch the Lock, published by Finishing Line Press. Her website is www.thefeatheredsleep.com.




We had Joy, We had Fun…

By Candice Louisa Daquin


Heortology (the study of festivals) has expanded beyond its initial Christian focus to embrace all festivals and their enduring appeal and necessity in human culture. Festivals remind us to celebrate, and celebration is a positive experience. The very idea of festivals is ancient. No existing history book is old enough to document when the first festival took place or what its origins were, but it’s a safe bet it had some kind of worship element attached. Modern festivals often also land on old pagan holidays, whilst others are more obvious in their origins. Many who attend festivals have no idea of their origins but go for entirely celebratory reasons. We have learned a lot about the history of varied festivals, but another question to consider is: why are humans drawn to festivals, and what do they provide us?

Imagine the ancient world. As much as we think we know now, the ancients knew a tremendous amount too, considering their lack of modern resources. This may well be down to the ‘necessity is the mother of invention’ paradigm. Or perhaps we severely underestimate our ancient ancestors, in our egocentric belief that the modern world knows best, just as we underestimate the knowledge of animals and their abilities to survive. Perhaps we could even say we have lost the art of survival and wouldn’t know how to survive if our computers went offline, our cars did not work and the supermarkets were empty.

What we do know is that the ancients were able to amass a great deal of knowledge, despite lacking the easy access we have today with our modern telescopes and technology. They had to understand mathematics and science at the very core to establish theorems on the universe and our place in it. Whilst many were later corrected, it is surprising how many ancient scientists, mathematicians and philosophers got it right, almost against all odds. It is fair then to say we dismiss the richness of the ancient world when we imagine everyone lived ignorant lives, which was not the case. When ignorance did reign, it did so deliberately, with the quashing of knowledge by various religious groups and the resulting periods of ‘dark ages’.

The ancient world was in touch with what it means to be human. Being human isn’t knowing how to work your iPhone or microwave. It’s not having a huge house with a swimming pool and driving a Lexus. Nor is it eternal youth, fame and glory. Being human is about surviving, just as it is with any animal. When we then add an awareness of our own being, which, it is argued, not all animals possess, we become the modern human we recognise today: a being who has choice, the ability to reflect and learn, and a tendency to seek beyond themselves. In seeking beyond oneself, we find an innate or shaped desire for ‘more’, and that ‘more’ has often come in the guise of a God-head or spirituality of some kind.

Whether we believe humans are prone to worshipping gods or being spiritual because gods actually exist, or because we just have a propensity to create them, is immaterial. The outcome is the same. The God gene hypothesis proposes that human spirituality is influenced by heredity and that a specific gene, called vesicular monoamine transporter 2 (VMAT2), predisposes humans towards spiritual or mystic experiences. Perhaps that is what is at work: in essence, a transmitter in our brain that makes it more likely we will believe in God (which could explain why some people believe so fervently whilst others do not). Or perhaps we simply find meaning in believing in a spirituality beyond the temporal world. What we do know is that for as long as humans have existed since evolving from their primate ancestors, they have formed meaning around some kind of spiritual observance, and festivals were tied to this worship.

Why do we do this? We are born part of something (a family) but are also separate (an individual). Perhaps festivals, and what they represent, are the coming together of all things: nature; the seasons; marking time (birth and death); marking passages (fertility, menstruation, maturity, marriage, children, dying). These are the cornerstones of meaning, with or without God. I say without God because, for many, notions of God are tied to nature, so it’s more the world around them than specific deities. For others, it’s the manifold destinies of humanity, or the history of deities. But whatever the reason, there is a sense of coming together in celebration of being alive, and acknowledging that life. A festival in that sense, irrespective of its actual purpose (the harvest, pagan holidays, etc.), is a ‘fest’ of life. Maybe this is why we can have such a happy time being part of it.

Growing up, neither of my parents liked festivals. They thought they were silly. I remember a street festival I went to as a child, for Fête du Travail (Labour Day) in France. I dressed up as the princess and the frog (taking my toy Kermit with me) and felt an excitement like I had never felt before. The throngs of people and other children, the food, the smells, the magicians, the shows and the things to see. It was like walking through a market of treasures. I couldn’t understand why neither of my parents liked this; to me, it felt like a jewel had opened. But for some, festivities are synonymous with rituals and a degree of adherence to religion, even when they are not. And rather than entering into the spirit of it and enjoying it, they feel what it represents is a form of social control.

In France, as in many countries, festivals abound. The national Fête du Citron (Menton Lemon Festival) draws crowds from around the world, as do the Festival de Cannes film festival — near where I grew up — and the Fête des Lumières (festival of lights) in Lyon. More traditional festivals include the Défilé du 14 Juillet (Bastille Day). In the Middle Ages in France, on Midsummer’s Day at the end of June, people would celebrate one last party (fête de la Saint-Jean, or St. John’s Day). Bonfires would mark this longest day, and young men would jump over the flames. This also happened on the first Sunday of Lent (le Dimanche de la Quadragésime), when fires were lit to dance around before lit torches were carried. Historically, religion dominated many of the autumn and winter festivals.

In France, Christmas is marked over twelve days with the Feast of the Innocents, the Feast of the Fools and New Year’s Eve, and culminates in the Feast of the Kings with its traditional galette des Rois. Events include Candlemas (Chandeleur) with its candlelight procession. Likewise, many Christian societies have some celebration connected to Easter (Pâques, in French) or its pagan roots. In France (and New Orleans in America) these include Shrove Tuesday (typically Mardi Gras in America), marking the last feast day before Lent, and many others until Pentecost Sunday. My favourite ‘fest’ was Shrove Tuesday (also known as Fat Tuesday or Pancake Day in other countries) because my grandma would make pancakes, despite our being Jewish. The notion was to eat before Christian Lent and a period of fasting, which has much in common with Muslim beliefs too (unsurprisingly, since God is one in the Jewish, Christian and Muslim faiths). In America, fish options are served every Friday for much the same reason.

Far more impressive and immersive festivals occur in India, with Hinduism celebrating among the highest number of festival days in the world. Over 50 festivals are celebrated throughout India by people of different cultures and religions, and they form an integral part of the country’s rich heritage. The ancient Hindu festival of spring, colours and love known as Holi is one. “Holi is considered as one of the most revered and celebrated festivals of India and it is celebrated in almost every part of the country. It is also sometimes called as the ‘festival of love’ as on this day people get to unite together forgetting all resentments and all types of bad feeling towards each other.” Holi is celebrated on the last full moon of the lunar month of Phalgun, the 12th month in the Hindu calendar (which corresponds to February or March in the Gregorian calendar).

With social media, more of the world has been granted access to the visual beauty of Holi: “This ancient tradition marks the end of winter and honors the triumph of good over evil. Celebrants light bonfires, throw colourful powder called gulal, eat sweets, and dance to traditional folk music.” One of the most popular legends in Hindu mythology says the Holi festival marks Lord Vishnu’s triumph over King Hiranyakashyapu, who killed anyone who disobeyed him or worshipped other gods. With coloured powder thrown on people as part of the celebration, many countries now celebrate Holi, just as Indians may celebrate Halloween or Día de Muertos. The crossover effect may seem to dilute individual cultural value and smack of appropriation but, in reality, it is more a sign of respecting other cultures, learning about them, and celebrating with them.

Mexico, which I now live near, celebrates over 500 festivals yearly and is consequently one of the most festive cultures in the world. In San Antonio, TX, where I currently live, we celebrate many of these fiestas alongside American ones, the most popular being Día de Muertos, Día de la Virgen de Guadalupe, Cinco de Mayo and Día de la Candelaria (like the French Candlemas, celebrated after Three Kings Day, which is a bigger holiday than Christmas in Mexico). The variables in cultures are fascinating. In San Antonio, we get a huge influx of Mexican tourists over Christmas because they aren’t home celebrating; they do so a few days later. We have a fiesta in San Antonio that is much like those in Mexico, owing to our large Mexican population, and it’s heartening to see the merging of the two.

As a child, I celebrated the Jewish Pilgrim Festivals—Pesaḥ (Passover), Shavuot (Feast of Weeks, or Pentecost), and Sukkoth (Tabernacles)—and the High Holidays—Rosh Hashana (New Year) and Yom Kippur (Day of Atonement). But I attended a school that celebrated all faiths, so we also observed Ramadan, the Muslim sacred month of fasting, akin to Christian Lent. Growing up, my friends of all faiths celebrated Eid-ul-Fitr, or simply Eid, one of the religious festivals of the Muslim community, marking the end of Ramadan. This festival is celebrated on the day after the crescent moon is sighted, with devotees offering prayers at mosques and then feasting with their near and dear ones.

We would also celebrate Kwanzaa, a worldwide celebration of African culture running from December 26 to January 1 and culminating in a communal feast called Karamu. Its creator was a major figure in the black power movement in America: “Maulana Karenga created Kwanzaa in 1966 during the aftermath of the Watts riots as a specifically African-American holiday. Karenga said his goal was to ‘give black people an alternative to the existing holiday of Christmas, and give black people an opportunity to celebrate themselves and their history, rather than simply imitate the practice of the dominant society.’”

Are we socially controlled when we attend festivals? Given we have a choice, I would say no. Someone who chooses to be part of something isn’t signing up for life; they’re passing through. Since my childhood, I have been lucky enough to attend many festivals in many countries. For me it is a reaffirming experience, seeing people from all walks of life come together in happiness. I like nothing better than dressing up, meeting with others and walking through streets thronged with people. Be they carnivals or even political events, there is an energy you rarely feel anywhere else.

The maypole festival, believed to have started in Roman Britain around 2,000 years ago, when soldiers celebrated the arrival of spring by dancing around decorated trees thanking their goddess Flora, is an especially interesting festival because it is still practised almost as in ancient times. The ribbons and floral garlands that adorn the pole represent feminine energy, and the beauty of the ritual remains something to behold.

Likewise, another event, Guy Fawkes Night, is steeped in ritual and British history, with much symbolism in the burning of straw dummies representing Guy Fawkes, which are thrown onto bonfires. The act of throwing a dummy on the fire to represent a person has been performed since the 13th century to drive away evil spirits. What most people seem to take away from Guy Fawkes Night are the abundant fireworks in a beautiful night sky, alongside children and families holding sparklers and eating roasted chestnuts in the cold, wrapped up in mittens. It’s a beloved ritual and a chance to ‘be festive’ even if it’s not a specific festival. As much as anything, it marks time, another year, another November, and gives wonderful memories. If we didn’t mark time or have those memories, we’d still have others, but there is an ease with festivals because they do it for us, unconsciously.

Young collegiates often attend festivals that involve dancing and sometimes drugs. Again, this is not a modern occurrence but has been going on for years as a rite of passage into adulthood. The desire of the young to get out, meet others, dance and enjoy life is primeval, and possibly a part of who we are as humans, marking a potent stage in our lives. Recently I went to a birthday party at a night club. I observed the diverse throngs of party-goers and revelled in that abundant diversity. In just one night I saw Pakistani women in saris, Japanese girls in anime costumes with ears, a pagan woman with huge curled bull horns and a floor-length leather dress, Jamaican families in neon shorts and t-shirts, transgender people wearing spandex dresses and big wigs, Hispanic Westsiders covered in tattoos and gold necklaces, lesbian and gay couples holding hands, and old couples in sensible church clothes, including one old black man in a pork-pie hat and waistcoat.

I thought of all the diversity that had come to this club to dance the night away, all ages, all genders and backgrounds and ethnicities, and I thought how wonderful it was that one place could hold them all. In many ways this is the essence of a festival, especially nowadays, when anyone can attend most festivals. Years ago, festivals were segregated by subject: only followers of that subject usually attended, and you could be harmed if you tried to attend as an outsider. The advantage we have today is that we are more accepting of outsiders, and when you attend festivals now, you see a wide range of people. Maybe this is the best opportunity we have to put down our differences and celebrate our similarities.

When I lived in Canada, I loved the homage paid to the different seasons in varied outdoor festivals where, shaking off the lethargy of winter, Canadians would celebrate with fairgrounds, amusements, shows and food, among other things. It was like a period of renewal. Likewise, during my time in England, the Notting Hill Carnival celebrated Afro-Caribbean culture, so essential to and entrenched in English culture, with gorgeous street displays and floats, as well as some of the best music around. The idea of welcoming everyone into the fold helps to remove tensions between cultures and promote a feeling of unity, whilst not denying the unique properties of those cultures and ensuring they are promoted in their adopted countries. It may be idealistic and not entirely accurate, but it’s a better step than ignoring that those myriad cultures exist.

As Halloween and Día de Muertos fast approach, I am thinking of how many of my neighbours attend these parties, despite some of them belonging to very conservative churches. Just last year, we all sat outside in the green spaces and had a mini fireworks display. I sat next to my little four-year-old neighbour and watched her face as the older kids, dressed in all sorts of costumes, shrieked at the fireworks and ran around with neon bangles, throwing glow powder at each other. I saw how inculcated we are since childhood, but despite this I truly believe festivities are in our hearts, even if we weren’t introduced to them at an early age. Children mark their growing up by the events of their lives, and it’s not just their birthdays they celebrate but the touchstones of their culture and, nowadays, many other cultures.

My Egyptian grandfather used to tell me about the Nile festival, which celebrated the flooding of the river and the replenishing of life in Egypt. Without the Nile, Egypt couldn’t exist, and the ancients knew this. They employed methods to enhance the flooding and gave thanks for it. Gratitude like this can be found in many celebrations, including the American Thanksgiving (although this is a double-edged sword, given the history of genocide of the Native Americans by European pilgrims and invaders) and harvest festivals throughout the world. A celebration of life through food and music is at the core of the human ability to endure and overcome hardship. More recently, many of us celebrated healthcare workers by singing from our windows and displaying messages of thanks. We do this because it symbolises essential parts of our lives, without which we would suffer.

Owing to its melting-pot past, Egypt celebrates the Coptic Orthodox Christmas; the more ancient Abu Simbel Sun Festival, tied to the Egyptian sun god Ra (who in turn was one inspiration for the Christian God many years later); Sham Ennessim, the national festival marking the beginning of spring, which originates from the ancient Egyptian Shemu festival; Ramadan; and the Muslim Eid al-Adha (honouring the willingness of Ibrahim (Abraham) to sacrifice his son Ismail (Ishmael) as an act of obedience to Allah’s command). As a Jew, my grandfather’s family celebrated Passover, the festival celebrating the Jews’ Exodus from Egypt, despite our family still living there! Nowadays it is no longer safe to live in Egypt as a Jew, but the memory of all peoples’ experiences is preserved through ancient festivals and events, marking our shared history.

Before the advent of mass-produced entertainment, festivals were also a highlight in any village or town, because they were the entertainment. Travelling theatres and shows for children, even booksellers and traders of items not commonly found locally, could be found at such events, their wares bartered for or purchased; it was almost a spilling out of the market-square economy that kept such villages alive. Perhaps, building on our natural tendency to barter for things we want, we learned to invite others from outside for specific events to gain greater reach. With this trading and bartering came the accoutrements: eating, drinking, dancing. Not only did this increase diversity and knowledge of foods and drinks from other locales, it brought people who might otherwise never meet together into a camaraderie.

Sharing stories is also part of festivals, by way of theatre or more improvised scenarios. It is at our heart to pass on oral knowledge, and we haven’t lost that desire. We may do this now via YouTube more than face to face (which is a shame), but the desire to get out and talk directly is innate, as evidenced by how many people have done just that since Covid-19 restrictions eased. Religion, folklore, ritual and a desire for escapism, alongside our desire to celebrate things or others (saints, gods, seasons, harvest), are all reasons why festivals endure. Just as children will instinctively dance when music is played, maybe it is in our innate nature to enjoy festivals because they foster the inter-relationships we all crave to some degree. We may be diverse and believe different things, but we can also come together and respect the perspectives of others. Never more so than through our shared love of celebration.






Sometimes Less is More…

By Candice Louisa Daquin

When you read sci-fi novels in which most of the world lives in small sections of the planet, in endless skyscrapers, the future can feel a little dystopian. As practical as living in close proximity is, some of us yearn to be away from the madding crowd. As our world swells in number (7.753 billion as of 2020, projected to reach 9.8 billion in 2050 and 11.2 billion in 2100, according to UN statistics), is it feasible to live off the grid any more? Is it becoming more difficult not to be part of the mainstream?

During a time of illness, I watched a strange TV show, Alaskan Bush People, that I would not usually entertain. It was a wilderness show about a family who chose to live off the grid. I watched it the way we view any reality TV, with disbelief and morbid curiosity. However, with time, I began to get involved. I admired that these eccentric people — even if some of it was spoofed for the camera — could live in this way. They valued being able to live off the land. I began to wonder if we put too much onus on city-urban dwelling to the detriment of other lifestyles; if we judged those who lived more basically, assuming we were the sophisticated ones. If the grid failed in some way, if electricity or the internet went down, or a giant EMP burst took everything out, we’d need those lost skills, and we’d value those kinds of people more. Maybe we should know that now, before it happens, and not get caught up in judging people by how large their house is or what car they drive. After all, we’re rapidly hurtling toward a future where ‘big’ is going to be problematic and finding alternatives will be prized.

When I moved from a large city to a smaller one, I felt completely cut off from what I termed the trappings of city living, such as the ballet, theatre, good bookstores and interesting alternative restaurants. It took me some time to adjust and settle into a slower life with fewer options. Part of me never stopped missing the variety of a large city, its diverse heart. But I did appreciate the calm that came with a slower pace of life. Sometimes less is more. Moreover, when I met people from big cities, I noticed how their identities hinged on their experiences of ‘culture’ and how judgmental they were about what counted and what did not. Even the use of words like ‘native’ or ‘naïve’ artist seemed patronising and racist. Who said one culture or city had more value than another? When did we start respecting the businessman over the farmer, when our very existence depends upon the latter? It’s a little like what happened during Covid-19: we realised the value of nurses and front-line workers a little late in the day.

There are many reasons people crave moving from larger communities to smaller ones. The most obvious is retirement. You may live in a large city, but it is expensive and fast-paced, and when you retire, it is possible you need different things. You may swap the city for the beach, mountains or lakes. You may find a retirement community has more to offer at that juncture in your life, or you may want to have a horse farm or live in another country with more sun. The retiring Baby Boomer generation has caused a massive uptick in house prices throughout desirable parts of America, as they take their affluence to other areas and bring their expectations with them. “Baby boomers held an average wealth of $629,683 in their 50s, equivalent to $704,158 in today’s value. Worse off is Generation X who, on average, owned $396,293 when they started reaching their 50s.” Boomers may be the last ‘affluent’ generation in America to have this mobility and generational wealth, and they have changed the landscape of America in terms of house prices.

Take, for example, one town. New Braunfels was a sleepy little place with little to recommend it: boring, but by a river, with an outlet mall nearby. It is currently growing at a rate of 5.96% annually, and its population has increased by 76.03% since the most recent census, which recorded 57,740 people in 2010. Retirees began to move in because it was affordable, had year-round good weather, and you could get a lot more for your money than in the more traditional retiree communities of Florida and Arizona. This incoming wave precipitated another: an exodus of large companies from expensive states like California wishing to re-settle in cheaper ones. They brought jobs and housing. Before you knew it, this little town was one of the fastest-growing in America, which is baffling given how little it has to offer. But exodus isn’t always based upon seeking the best; it is often about seeking the most practical, which in some ways this was. More baffling: Texas is home to seven of the 15 fastest-growing cities in America which, when you compare the beauty of other states, seems nonsensical, but it speaks to consumers’ need for less expensive, warmer states, seemingly at any cost.

However, some smaller communities exist by choice before retirement. Historically, there have been many reasons people have chosen to live separately. Not long ago, the majority of the world was rural, as it had been throughout history. But in the last 100 years this has drastically changed, with more people opting for urban living. Religious difference and cultural practice are among the most common reasons people have chosen to live apart. In the 1960s and 1970s, ‘fringe’ groups and sub-cultures became more familiar to the mainstream. Perhaps because in the 1950s the idea of being a ‘teenager’ really took off, emancipating young people into being more diverse and following their own interests over their parents’. This led to more sub-cultures popping up. That said, is it really such a recent phenomenon?

Alexander the Great was barely out of his teens when he began the conquests that ravaged a quarter of the planet. Other famous historical conquests were at the hands of what we’d deem today very young people. So younger people have always sought to strike out on their own and forge their identities. The suffragettes of the early twentieth century, the Zazou in France during WW2, the Jazz Age of the 1920s, the fin de siècle amongst artists from 1880 onwards… the list is endless. Existentialists, LGBTQ people, nudists, Dadaists, the counterculture of the 1960s: there have been so many explosions that one would be forgiven for thinking there is no mainstream, but in reality these groups have always been the minority, and often fleeting.

Youth and age aren’t the sole determinants for such sub-cultures to evolve. People seem divided into those who seek homogenisation and those who seek diversity. For some it may not be a choice, such as LGBTQ people, those on the spectrum, or isolated communities that were ‘discovered’. But for others, it’s a deliberate attempt to dislocate from the mainstream to express their individual perspectives. Of those isolated communities and uncontacted peoples, it is hard to establish how many would have wished to become mainstream and how much choice they had in the matter. Some indigenous peoples are in voluntary isolation and do not require ‘saving’, as per the modern cultural assumption. Some indigenous groups live on national grounds, such as the Brazilian Vale do Javari and those who inhabit North Sentinel Island in India.

I have visited Quaker, Shaker, Mennonite and Amish communities, as they offer fascinating insights on how to live outside the mainstream. Some do without electricity; others have seemingly flexible prescriptions whereby their young can leave the community once adult and spend time in the outside world before choosing whether to return, a practice known as ‘rumspringa’. This seemed risky, as many could seek the excitement of the unknown, but ironically most return to the community. It reinforces the idea that small communities have staying power, which large communities may dismiss.

There are groups of youth doing one thing, the middle-aged doing another, and a whole spectrum of interests in between. I find this particularly interesting when you go to a fair or show and suddenly thousands of people, all interested in the same thing, turn out. It makes you wonder: where have they been hiding? I have experienced this at rock concerts, medieval and renaissance fairs, comic cons, tattoo expos and vampire balls. I attended out of interest, but as an outsider. Watching people who are committed to their passions get together in fantastic outfits is a fast insight into how many sub-groups exist. Perhaps all of us have, within our main group, sub-genres of interest.

Back in the day, we called these cults, clans and cliques, but most of those terms have become insulting to later generations, who saw the impact of labelling. After one of the first American mass murders committed at a school (Columbine), the two shooters were described as ‘Goths’, and consequently many who dressed in Goth style were attacked. Sadly, the Goth movement had nothing to do with violence, but this is what happens when we assume people different from us must have negative attributes: “Qualitative results reveal that students themselves highlight the importance of exposure to diverse others, family upbringing, the media, and several other key factors as important considerations in how they treat other people; this suggests a multitude of ways that people create their beliefs.” The same happens in America with the Church of Satan, which does worship the fallen angel, Lucifer, as an alternative God-head, but does not condone or sanction many of the ‘evil’ practices associated with Satanism. It isn’t hard to understand why there would be misunderstanding with such extremes, but what of less extreme, smaller communities?

The Mormon church not only dominates Utah but owns much of other states too. It is one of the richest religions based out of America and has a huge recruitment reach worldwide. When Mitt Romney, an elder in the Mormon church of America, ran for President, one of the reasons he lost was fear of Mormonism. The ‘other’ aspect of the faith set Mormons apart from more mainstream Christianity. However, this is shifting: as more politicians of Muslim and Hindu faith become key figures, the fear of ‘other’ is lessening. One could argue some fear of ‘other’ isn’t a bad thing; it’s the extent to which we react to it that matters. I may not approve of Mormonism. I may think it’s a phony, made-up version of Christianity (The Book of Mormon talks of the history of two tribes of Israel—the fair-skinned, ‘virtuous’ Nephites and the dark-skinned, ‘conniving’ Lamanites. Much of its ‘story’ is a direct retelling of the Bible, unoriginally claiming the same events occurred in North America as in Israel. To me, it seemed like racism dressed up as scripture). Mormonism has been said to act like a pyramid scheme. But should I be prejudiced against someone on the basis of their being Mormon alone? No. We can be cautious of or disagree with a religion without being prejudiced against it. On the other hand, shouldn’t we be conscientious about maintaining truth, which means that if something perturbs us, like the Church of Satan or Mormonism, we bring it to light so others can make an informed choice? Perhaps with faith there is no room for choice; it is a matter of faith, and none of us can persuade another to change their perspective. This might be why wars are so often about faith.

Currently, throughout America, there are many sects and groups who thrive in relative obscurity, untouched by the mainstream. Whilst group polarisation clearly exists, famous stories of cults committing mass suicide, like the Branch Davidians, or fighting against the authorities, aren’t as commonplace now, but that doesn’t mean such groups are not out there. Social media has made it easier to be underground and thrive, but people have always found ways. Whether those communities can come together depends upon how incompatible they are. Near where I live there is a conservative Jewish community where only conservative Jews live. They chose to live separately because of a high number of hate crimes throughout America, where Jews continue to be the most attacked group.

Other groups have become more comfortable co-existing. Twenty years ago, you would not have seen as much diversity as today. In my neighbourhood, there are people of every culture and skin colour — Sikhs, Jews, LGBTQ people, single parents, tattooed bikers, affluent conservatives, communists. It has been interesting to see how they are able to come together over a mutual interest and get along. When it’s a special event like Halloween, everyone lets their children roam free to trick-or-treat. They do not avoid certain houses as they once did. There is an acceptance that we have more in common than we have differences, and even if we vote differently, look different and believe differently, we can put some of that aside for the common good.

Just recently I was asked how I could tolerate someone who was, say, a Trumpster. It got me thinking that there must be a cut-off in terms of what we do tolerate. For example, if someone were a racist, a Nazi or a pedophile, I would not wish to be in touch with them or live next door to them. Both my neighbours voted for Trump and I did not, but that isn’t enough of an ideological divide for us not to run in the same circles. Interestingly, they are both Hispanic; there was this idea that Trumpsters were Anglo, which isn’t always the case. It is those perpetuated stereotypes that cause the most harm. We can get past differences in ideology, but most of us have sticking points, such as extreme hate, prejudice or harm to children, that would be unrecoverable differences. This is how society polices itself to some extent and legitimises blame. If we didn’t, then racism would be more acceptable, but the nuance is sometimes subtle.

The media has a powerful influence on people and can be responsible for promoting a stereotype of a particular group or enhancing scapegoating behaviour. People let loose on social media and are uninhibited in their vitriol, which can create more divisions between us. It is difficult to police prejudice because it involves opinion, which may not always show itself in ways that are unlawful. But communities can thrive with difference without becoming contentious, perhaps because our wish to be united is greater than our wish for division. Insularity is often misrepresented, although when you drive through parts of the American South as a person of colour, you could be forgiven for thinking ‘insular people’ can be hateful, because there are towns where you will definitely not be welcome. Some groups may not outright say they don’t accept others (people of colour, for example), but they will actively encourage segregation through their insularity. This may be as unavoidable as it is racist, but how can we really change that? Would it work to demand racists accept people of colour as next-door neighbours? Would it be good for the people of colour to be part of that experiment?

Another concern is a subject brought up by the famed linguist, Professor Anvita Abbi, in relation to bringing distant or smaller cultures into the mainstream and the impact of doing so. Dr. Abbi received her Ph.D. from Cornell University, USA, and began teaching Linguistics at Kansas State University, where, she says, she “realised that a large number of Indian languages especially those spoken by the marginalized communities are under-researched.” This led to Abbi wishing to “unearth the vast knowledge base buried in the linguistic structure of Great Andamanese before it is lost to the world.” In the process, as she recorded in her book, Voices from the Lost Horizon, she realised this language was “a moribund language of the only surviving pre-Neolithic tribe, the remnants of the first migration out of Africa 70,000 years ago.” Awareness of the Great Andamanese resulted in inevitable negatives: “Outsider-contact has brought diseases, subjugation, sexual assault, and ultimately decimation of the tribal culture, tribal life, and tribal language.” But what outsiders have learned from this culture is invaluable. Sadly, as Dr. Abbi says, “Jarawas maintained the isolation and now they regret the interaction with us,” which, if we consider other ‘first contact’ scenarios, seems a universal response.

‘Mainstreaming’ is a colonial model which can suppress the indigenous dignity of people in favour of assimilation. But assimilation isn’t the same as ‘fitting in’, because often the qualities of incoming cultures are derided by this colonial model, leaving those incoming feeling disrespected and alienated. In America, Mexicans are considered ‘less than’ other immigrants (Asians predominantly) because they may have lower education rates. This breeds a division between immigrants that undermines those least appreciated by the host country. With Asians set to overtake Hispanics in America, this has been at the forefront of race relations lately, with some tensions building, as for a long time it was anticipated America would become Hispanic. When Donald Trump was President, he actively encouraged immigration from certain countries over others because he believed those countries had more valuable people. This sounds an awful lot like the argument for eugenics and, at its core, it shares a lot with racists who believe certain groups have more potential than others.

When Abbi was asked the ideal way for Great Andamanese integration to occur, where language and cultures were not eroded but blended with the mainstream, she said in her experience, “[t]he idea of mainstreaming and merging these tribes into our civilisation is nothing but usurping their rights to their land, forest, water, and way of life. ‘Development’ may kill these tribes. These tribes have amalgamated their life with nature so well that they are aware of secrets of life. Any kind of interference will disturb this harmony.” Perhaps we can learn from the poor, exploitative outcomes of assimilation between developed communities and those they perceive as less developed. The fault in perceiving difference as ‘less than’ is a failure to appreciate the dignity and abilities of those cultures. Linguistically and socially, they may have many ideas more advanced than mainstream culture’s, yet they are relegated to ‘less than’ in xenophobic or colonialist thought.

Take the Native Americans as one example. They believed the earth was for everyone and that no one group should own it. They are often considered one of the first environmentalist cultures because of their acute awareness of balance and the need to give back to the land rather than rape it. When colonialists came to America, they didn’t respect that; they demanded ownership of shared lands and worked the land, sometimes to death. Slavery and mistreatment of land have that in common: the need to conquer and own, and a capitalist model of growth. Those under the yoke of such tyranny do not thrive; only the ruling minority do. In this sense, it is not far removed from fiefdoms and seems to be a penchant of humans given the opportunity. But what happens when we visit cultures where a more egalitarian approach is mainstream? Less oppression and greed, in favour of sharing?

It could be argued this is why capitalist-model countries like America still fear Communism and Socialism: they recognise this alternative model would undermine the oppressive aspects of Capitalism. Whilst no one ethos appears to work without serious flaws and hypocrisy, we would probably do better to work together, blending aspects of all, than continue a ‘cold war’ about our differences. When you look at the recent antagonisms between countries, it becomes apparent that war solves nothing, and that the wealth which could be poured into helping countries is being squandered on military posturing and grandstanding. Until larger communities respect the dignities of smaller groups, we cannot expect this to change. On the other hand, can we afford to give up that military grandstanding if other large countries insist on becoming the conquerors we once were? How can we unite without becoming vulnerable?

Studies have shown that integration helps overcome prejudice and racism. When people have LGBTQ children, they are more likely to become accepting of LGBTQ people, and racists become less racist when people of colour move into their neighbourhoods. This suggests some of the hate is more ignorance and fear, although that doesn’t justify it. But should the minority have to stomach that hate to find acceptance for their progeny? Maybe they always have. If we consider the years it has taken some minorities to become more mainstream, it has always been through personal sacrifice. Even Martin Luther King Jr’s murder galvanised more social and racial change in America. Such tragedies create martyrs, harbingers of change, but at what cost? Should it take such extremes as assassinations and mass shootings to bring about change? It seems human nature only understands things when they’re extreme. A case in point is the environment, and the long years during which campaigners have warned we’re dooming future generations whilst business interests were put first.

With so much division even on subjects that can be proven, such as climate change, how can we hope to lay down our differences and come together? Perhaps the best we can hope for is that, if enough of us try to embrace difference instead of letting our xenophobic tendencies frighten us, we will do a better job.

Immigration in America is considered a ‘problem’, but it can equally be a solution if we redefine things. Immigration is the bedrock of how America came into existence — from the Native Americans who came across the Bering Strait and made a deserted land home, to the European conquerors who stole it but equally populated it from diverse cultures. As much as we have fought and hurt one another, we have needed each other.

Each epoch shifts what matters to that particular generation, and perhaps it is the fear of being obsolete, or an inability to get on board with new ideas (or a fear that old ideas will be ignored), that causes inter-generational strife. But again, if we balance and appreciate the diverse perspectives we all have to offer, we are stronger together than apart. If we humble ourselves and remember to learn from those cultures that may not have had as much attention given them but held great wisdom, we may learn alternative ways of cooperating and thriving. If harmony is the goal for most of us, we need to vote, and to avoid dictators taking that freedom away.





Can We Create a Better World by Just Wishing for it?

By Candice Louisa Daquin

The wish to laugh and shrug off the differences that create unhappiness and wars is a universal one. The majority of us want to avoid unhappiness at any cost. There is, however, a downside to trying to avoid unhappiness by being too open about it. When we begin to pathologise everything as a disorder, we may inadvertently neglect our ability to generate better mental health.

Before mental illness was discussed en masse, it was private and considered shameful. This had obvious detrimental effects on those suffering, but one could also argue there was a benefit to not making everything so extremely public. As with any argument, there are pros and cons to how far we publicise mental health. The extreme of ignoring it didn’t work. But does the extreme of talking about it to death really help people as much as we think?

In the second half of the 20th century, owing in part to the neglect of, and need for, improved mental health care, societies began to shift from encouraging the suppression of emotion to a recognition of psychological distress and its impact. Institutions, and then the de-institutionalisation movement, became ways of coping with people who struggled to function in society. But these people didn’t choose to be unhappy. Whilst it’s obvious this shift to publicising mental health instead of hiding it has been highly beneficial in some regards, we should also consider its far-reaching ramifications.

“(Historically) Many cultures have viewed mental illness as a form of religious punishment or demonic possession. In ancient Egyptian, Indian, Greek, and Roman writings, mental illness was categorised as a religious or personal problem. In the 5th century B.C., Hippocrates was a pioneer in treating mentally ill people with techniques not rooted in religion or superstition; instead, he focused on changing a mentally ill patient’s environment or occupation or administering certain substances as medications. During the Middle Ages, the mentally ill were believed to be possessed or in need of religion. Negative attitudes towards mental illness persisted into the 18th century in the United States, leading to stigmatisation of mental illness, and unhygienic (and often degrading) confinement of mentally ill individuals,” states an article on this issue.

By publicising everything, in reaction to the days when mental health was viewed with more stigma, we have not improved suicide statistics or mental illness numbers as we’d logically assume. When something is freed of stigma and shame, more people admit to suffering from mental illness than ever before, which will make it seem that more people have mental illness, when it could simply be that more are willing to admit to having it. On the other hand, there is an observed phenomenon of things becoming socially contagious.

How can we be sure we’re not increasing those numbers by making it so acceptable to be mentally ill? By over-emphasising it on social media? Publicising the struggle to avoid stigma is positive, but the degree to which we discuss mental illness may be so open as to inflate numbers or over-diagnose people. For example, everyone gets sad sometimes; that doesn’t mean everyone suffers from clinical depression. Everyone gets anxious sometimes, but that doesn’t mean everyone suffers from an anxiety disorder. The distinction is: is it a disorder or a feeling? Do clinicians spend enough time considering this when they give patients a life-long diagnosis? And what is the effect of such a diagnosis?

When psychiatrists diagnose mass numbers of people, especially easily influenced teenagers, with serious life-changing mental illnesses, the reported numbers immediately swell. Who is to say they would be that large if diagnosis weren’t so open-ended? So nebulous? So open to outside influence, or even to the pressure of pharmaceutical companies and desperate doctors wanting quick fixes? What of parents who don’t know how to handle their rebellious teen? Is that mental illness or just life? If they demand treatment and the teen is labelled mentally ill, does the teen fulfil that prophecy? And if they hadn’t been diagnosed, would their reaction and outcome be different?

Our innate ability to laugh and shrug things off comes from challenges in life so terrible we had no choice if we wanted to go forward. If we remove those challenges, are we teaching our kids how to cope with hard things, or wrapping them in cotton wool and medicating them? When a family of ten children might routinely lose eight of them, how else could families cope with such tragedy but through laughter and the ability to shrug off despair and horror? It did not mean anyone was less caring or less feeling, but that sensitivity had to be weighed against our ability to endure. We could argue we endure less pain now than ever before, as we are less likely to lose a great number of people we know, or to die of disease, famine and other historical causes of early death. Many will never even see the body of a dead relative, so how can they process that loss?

The modern world brings with it its own attendant risks and stressors. People growing up in 1850 may not have had to worry in the same way about looking young to keep a job, or trying to ‘do it all.’ On the other hand, they might have had to worry about having no society to help them if they lost a job, or how to stop their families from starving or their village from being raided. They had fewer social cushions in that sense, and a riskier day-to-day existence. This is starkly apparent when we compare the recent pandemic with, say, the plagues of earlier centuries. Then, people died in the street and were left to rot, whereas now, even as we struggled and many died, we had a modicum of order. For all our terrors with Covid-19, it could have been far, far worse, and has been. I say this from a position of privilege, living in a society with access to medical care, and I’m fully aware many still do not, but if we directly compare the experience of the Black Death with Covid-19, we can see tangible improvement in what those suffering could access.

This means that whether we believe it or not, appreciate it or not, we have an overall improved quality of life compared to even 50 years ago. At the same time, we may have swapped some deficits for others. It may seem a minor consolation for the myriad modern-day woes, but we are better off than our grandparents, who were called ‘The Silent Generation’. They grew up learning not to speak of their struggles but to cope with them silently. These days we have outlets. And yet in other ways we are more alone; it is a strange mixture of progress and back-tracking. Some would argue our grandparents had a simpler, healthier life. But if average life expectancy is anything to go by, we are growing older because, for the majority, access to medical care and overall nutrition have improved. On the other hand, more of us grow old but sick-old, which is not, perhaps, something to aspire to.

And in truth, we do ourselves no favours: many of us eat badly, are obese and suffer from diseases of modern living driven by lack of exercise, heavy drinking, lack of sleep and fast food. It might be most accurate to say we have swapped old deficits, such as dying of curable diseases, malnutrition or lack of access to care and antibiotics, for modern ones like rising cancer rates and autoimmune disorders, all of which swell with the modern world and its lifestyle.

What it comes down to is this: through the wars of the past, people stood next to each other in trenches whilst their friends were blown to pieces or died in agony. They had PTSD[1] then; they suffered from depression and anxiety, but they also had no choice but to carry on. For some, the only way out was suicide or going AWOL[2], while many stuffed their feelings down and didn’t speak of it. Clinicians came to believe this way of coping caused illness, and that belief, along with other factors, led to an improved mental health system.

But now, in 2022, you might be forgiven for thinking EVERYTHING was a disease and EVERYONE suffered from something, and you might find yourself wondering if some of this perceived increase is the direct result of going from one extreme to the other. Initially, nobody was mentally ill. Nowadays, who isn’t? Is this a better model?

Having worked with mentally ill people for years as a psychotherapist, I can attest that mental illness is a reality for many. I knew it was before I ever worked in the field, and it was one reason I chose that field: I wanted to help others because I saw viscerally what happened to those who did not receive help. Despite this, I came to see the value of sometimes putting aside all the labels, diagnoses, medications and treatments, and trying to just get on with the process of living. If we tell someone they are mentally ill, medicate them, coddle them and tell them they don’t need to try because they are so sick, we don’t give them much motivation to see what else they can do.

True, many are too sick to do anything but survive, and that in and of itself is a big achievement. So, when we talk about the need to motivate ourselves beyond labels, we’re talking about those we’d call high-functioning: people who may suffer from depression or anxiety but are still able to do a great deal despite it. Do medication, therapy and labelling really help them make the most of their lives? Does putting them on disability for years, without reviewing whether things could change or have changed, help? Can they learn something from our ancestors, who had to just laugh and get on with it, no matter how tough things got?

It may seem a very old-fashioned approach to consider ‘toughing it out’, and having come to America and seen how much emphasis is put on toughing it out, I have mixed feelings about its value. The idea of being tough enough means there is always the reverse (not being tough enough), and that feels judgmental. Being judgmental, I think, has no place in recovery.

What does have a place in recovery is doing the best you can and not letting labels define or defeat you. In this sense, I see a lot of commonalities between those struggling today and those who struggled 150 years ago. Maybe we can learn from them and combine that with modern practice, giving ourselves more chance to laugh and thrive rather than fall under the yoke of a diagnosis and its self-fulfilling prophecy.

I have had many clients who felt their diagnosis disincentivised them from any course of action other than being a patient. The medication route alone is fraught with ignorance. For so long, SSRIs[3] and other anti-depressants were heralded as lifesavers for depressed people, but what proof existed for this aside from the hope that a cure had been found? Years later, studies suggested only around 30% of people responded to anti-depressants versus placebo.

Then second- and third-generation drugs were created, all the while charging exorbitant prices, and patients routinely took two, three or four medications for one ‘illness.’ Aside from the expense and the physical toll of taking that much medication, there was a mental cost. Patients felt over-medicated, but not happier, not ‘better.’ By putting their faith in drugs, they lost their faith in other ways of getting ‘better’, and some spiralled downward. The reality is we are all different and we process life differently. Some of us are more forward-focused; others, through imitation, genes or experience, may not be. It isn’t a deficit or an illness; it’s a personality, which can change somewhat but should also be understood as part of the diversity of how humans cope.

Treatment-resistant depression became the new diagnosis when modern medication failed, and new drugs were prescribed in tandem with current ones, but this led to people taking more drugs, for longer periods of time, often with little improvement. How much of this is due to a negligent approach to treatment that only saw drugs as the answer? Meanwhile, therapy was cut back or became prohibitively expensive, cutting off other options for treatment. It’s logical that therapy can help people avoid feeling isolated, but when the system prefers to medicate rather than provide therapy, many end up taking for years medicines that were only ever meant as stopgaps.

Should the media, or your general physician, be the one telling you what drugs you should be taking, if any? Preying on the desperation of families with for-profit medication muddies the waters further. The disparity of information means no one source can be trusted, especially as information is ever-changing. More recently, a study suggested anti-depressants may not work at all: it was commonly held that clinical depression was caused by a chemical imbalance, yet correcting that imbalance does not improve depression as was once thought.

This shows us that psychiatry still has a long way to go, and that when it claims things as facts, they rarely are. It suggests we should not blindly trust what has become a profit-led industry, where many practitioners see patients for a short time yet somehow still diagnose them with serious mental disorders. Surely we should give equal weight to conservative diagnosis and recognise that normal variants are not necessarily disorders. In many cases, under-diagnosing rather than over-diagnosing could work better.

For example, I know of many (too many) patients who told me they were diagnosed with bipolar disorder before the age of 21, by a regular non-mental-health doctor or by a psychiatrist. Their subsequent mistrust of the system is understandable, given that experience. How can someone tell you that you have bipolar disorder at 17 years of age, from a 20-minute conversation?

Even the diagnostic criteria for bipolar 1 or 2 in the DSM (the Diagnostic and Statistical Manual of Mental Disorders) are flawed, because they are too generalised, and only highly trained professionals can grasp the nuance. Most are not that highly trained and therefore take it at face value when a diagnostic tool says someone is bipolar because they have experienced an episode of mania. But firstly, are they defining mania correctly? Is the patient describing mania correctly, or being led? Were there mitigating factors?

If you diagnose a child with a serious mental disorder and medicate them, how can you be sure their brain isn’t affected by taking strong medication before it has reached full development? How can you be sure they are not becoming what they are told they are? Too often, people spend years under the cloud of medication, only to emerge and realise that what was a discrete episode of depression was medicated for decades, robbing them of the ability to recover. Doesn’t a label make it likely that some will feel helpless?

Moreover, how much power does a label have over our subconscious? If we are told we will not be able to do something, why would we even try? If we believe we are depressed, are we less or more likely to fight against it? Isn’t some fighting a good thing? Likewise, diagnosing older people with a disease like bipolar disorder (a disease that emerges after puberty) shows the mistakes of the psychiatric world. How can a 70-year-old man ‘suddenly’ become bipolar unless he has a brain tumour or some other organic cause? Dementia is often misdiagnosed as bipolar disorder because badly trained doctors seek answers for aberrant behaviour without considering the whole story, such as asking how someone of 70 could develop a disease that typically affects those around the age of 18. Sure, some can slip through the gaps, but often it’s the frustration of the family or doctor colouring the diagnosis. Such life-long labels should not be given lightly.

What if we treated mental illness differently depending upon its severity? Consider the value of improving real-world ways of coping despite it, instead of relying on medications that were only ever meant as stopgaps and were not developed to be taken for years on end. Nor over-medicating without due cause. Nor medicating young people based on very loose diagnostic criteria. Nor assuming everyone who says they feel depressed or anxious is clinically depressed or anxious, or that medication is their only solution.

Organisations that take vulnerable teens, who often have co-morbid diagnoses of drug or alcohol abuse alongside mental illness, into the wilds seem to offer a real-world way of encouraging those young people to find coping mechanisms outside of addiction and reliance upon medication. Equally, when a young person (or anyone, really) is productively employed in something they feel has meaning, anxiety and depression can improve.

We’ve seen this with Covid-19 and the necessary isolation of so many school children. Whilst it was unavoidable, rates of depression spiked, in part because, as studies show, people need interaction with each other. This is why online learning has poorer outcomes than classroom learning; this is why older people are less at risk of dementia if they socialise. We are social animals: we feed off each other and we empower each other. Finding your place in the world is always, to some extent, in relation to others.

We may never completely avoid war or our human tendency for strife, but we also have a powerful other side that urges people to do good, to help each other, to laugh and shrug off the differences that divide us. What good does division ever do? Unhappiness is unavoidable at times, but sometimes it’s a choice. We can choose to recognise something is hard and actively pursue ways of improving it. We can struggle through and feel good about the struggle and the effort we put in. If we take all that away and don’t encourage people to try, we give them no way out. Sometimes there is no way out of suffering or mental illness, but often we cannot know that unless we have tried.

Many years ago, people valued older people because they were considered wise and able to impart valuable life lessons to impetuous youth. Nowadays, the elderly are not respected and are often siphoned off into homes before their time, because people find them an inconvenience. Yet there is a theory that humans live long past menopause precisely because grandparents were an intrinsic part of the family make-up. Most mammals die shortly after their reproductive years end, as if nature assumes that once reproduction is behind you, you have no value; humans are distinct in living long after menopause. The grandparent theory explains that longevity by the value grandparents add, and we can learn a lot from what nature already knows. It is never too late to have value, never too late to learn and grow, and never too late to laugh and come together, setting differences aside.

Those who achieve that may well be happier and live healthier lives, as laughter is shown to be a great anti-ager as well as a boost to our overall mental and physical health. Of course, the extremism found in the cult of positivity illustrates that there must be balance: we cannot expect to be happy all the time, or unaffected by tragedy when it occurs. But refusing to stay in that tragedy, attempting to move beyond it and reclaim ourselves and our futures, seems to be the way to avoid going down that dark tunnel of no return.

Experience shows we are what we think. We don’t have to be positive 24/7; to some extent, any extreme sets us up for burnout and puts too much pressure on us to be ‘up’ all the time, when it’s natural to have down times. But striving for happiness, or contentment, or just finding ways to shrug off the smaller things and come together: those are things most of us wish for. So it does no harm to direct our energies accordingly and prioritise our ability to cope. Perhaps our differences matter less, sometimes, than what we have in common, and what we can do to make this world a more livable place.


[1] Post-traumatic Stress Disorder

[2] Absent without Official Leave

[3] Selective Serotonin Reuptake Inhibitors

Candice Louisa Daquin is a Psychotherapist and Editor, having worked in Europe, Canada and the USA. Daquin’s own work is widely published; she has written five books of poetry, the most recent, Pinch the Lock, published by Finishing Line Press. Her website is www.thefeatheredsleep.com


PLEASE NOTE: ARTICLES CAN ONLY BE REPRODUCED IN OTHER SITES WITH DUE ACKNOWLEDGEMENT TO BORDERLESS JOURNAL.

Categories
The Observant Immigrant

Is It Okay to be Ordinary?

By Candice Louisa Daquin


I had a client who struggled with her self-identification as ‘ordinary.’ A Millennial, she’d grown up with social media and the relentless confrontation with perfectionism it can embody. Growing up middle-class, she’d felt the pressure to reach certain goals, even if they were not her own. We talked about how this was true well before the advent of social media. Society has long held ideas of what individuals ‘should’ accomplish, depending upon their background and parenting. Many children were given no incentive and left to rot on the shelf, whilst others were hounded by external pressures: relatives, academic institutions, peers, or those they admired.

For girls this ‘pressure’ is more recent, as historically girls were not expected to achieve in the way boys were. In the last 100 years this has begun to shift, with women gaining traction in the career stakes. However, as with any advance, there are pitfalls, and some women now are putting ever-increasing pressure on themselves to ‘become everything.’

By ‘become everything’ I mean: mother, provider, educated, career success, care-giver, slim, healthy and attractive. And for some, this is attainable. I know many women who function well with huge responsibilities, not least thriving careers, multiple children, and sufficient energy to stay fit, eat well and not indulge in vices like smoking.

But for every woman who is able to juggle all of the above, there are many who find it too much. Unfortunately, if one person can juggle everything, society can be unkind and denigrate those who cannot, as if this were somehow a failing. In large populations, survival of the fittest is most acute, and social media shines a light on success, leaving many feeling ‘less than’. Let’s examine whether not being able to ‘do it all’ is indeed a failing.

Firstly: everyone is different. But that is no consolation for those who perceive that they compare unfavourably to others. What comfort is it to know you are different, when that translates as ‘not being able to do what others can’, especially in a competitive world where failure isn’t really tolerated? Moreover, how do we feel good about ourselves when we’re confronted with so many examples of success and are constantly under scrutiny, by ourselves and others?

In the past – before social media – it was harder to compare yourself against a large swath of people. We weren’t as in touch with each other and what each other was doing. Imagine the famous Brontë sisters, living on a remote moor, having to send letters to far-flung friends, rarely seeing anyone. In some ways this was unbearable; in other ways it avoided direct scrutiny. In the past, we may not have had the rapidity of expectation we have today, given we did less, because the opportunities to do more were harder to obtain. The class divide was more stringent: if you were born the son of a labourer, chances were that’s what you’d be, and climbing the social ladder was prohibitively difficult. Today, if a young woman lives in a city, she has access to potential avenues of education, finding a partner, having children, having a career, buying a home, working out in a gym, making friends, and so on. We are no longer stuck in houses on the moors or in far-off villages in the same numbers. Even so, the class divide exists, even if we don’t want to admit it: if your parents have the funds to send you to the best school, your outcome is likely to be more fortunate than that of someone who has no means.

But realistically and statistically, how many young women today can bank on having it all? Even if they are capable of working extremely long days – holding down a job, children and a marriage, finding time for socialising, working out and more – even if they can somehow juggle those stressors and are glad to be emancipated from previous restrictions, how many can keep this level of pressure up, and at what cost? For every woman who can, there are, doubtless, those who can’t. And it’s not always about choice; it can be due to ill health, mental health issues, stamina, different aspirations, or simply a lack of interest in competing. Perhaps we should get away from the idea that having it all is the only legitimate form of ‘success.’

Case in point: just the other day a client boasted of working 16-hour days because, for her, this was a point of pride. What wasn’t said was that after ten years of 16-hour days, she became seriously ill and had to quit work. What wasn’t said was that her eldest daughter ran away from home because she spent many days unsupervised by workaholic parents. When the client stopped working 16-hour days, she saw that the façade of ‘having it all’ was just that. She spent more time with her kids, without feeling she was failing by doing so. Her marriage, which had been hanging on by a thread, healed. She had to de-programme the idea that she was ‘failing’ by not doing it all – part of that was realising a 16-hour work day isn’t very balanced. Some do it, but there is an invariable sacrifice, and the question should always be: how much is too much, and why do we as a society tolerate this?

Just as we should tell our daughters (and sons) that their self-worth shouldn’t be tied to shallow modes of success, it’s worth admitting that many can and do ‘have it all.’ It very much depends upon what we seek in life. If our goal is to be a workaholic, then we should consider whether we realistically have time for other things, and not attempt to have it all so much as aim for what we want. Society, however, tells us we’re lacking if we’re not able to do it all – even if that’s not what we want to do. Why does society put this pressure on people? Perhaps because it’s the nature of competition: if one person can do it, then you have to try harder to match them. Soon, even people who didn’t want what you wanted are striving for it. The alternative might be obscurity.

You see this most in countries with the largest populations. To get what was once relatively attainable, people are having to work harder and harder because the relative competition is greater. This leads to a bottleneck of pressure, where top universities don’t have space for all the qualified applicants. Cheating, subterfuge, paying off professors and other methods are employed to ensure a child gets that coveted placement. Equally, more and more unpaid internships are expected of young people, meaning only those who can afford to work without pay succeed. The price being paid for what was once attainable through regular hard work is now exaggerated.

I recall as a young person how I envied and admired young women who seemed capable of getting up at 5am to work out, sending their child off to school with a packed lunch, keeping their husband engaged in the marriage, finding time to stock the fridge and clean the house, whilst working gruelling hours in a highly demanding career. Young women with boundless energy and ambition, who always seemed to have clear skin and glossy hair, wear flattering clothes, and be genuinely happy. I felt there must be something wrong with me because I couldn’t cope with such a ‘full dance card’. I’m sympathetic, then, to the young women who, two decades later, talk to me of similar feelings of inadequacy.

What do you say to someone who feels inadequate when you know there are many who can have it all?

Those young women who had it all, interestingly, sometimes also sought counselling. And whilst you might secretly expect them to reveal to a counsellor that they could never cope, that they were hanging by a thread, that was often not the case. Instead, these young women would express something none of us might anticipate.

Feelings of failure and incompleteness.

What?

Surely if you ‘have it all’ you couldn’t possibly feel empty, or have feelings of failure? After all – you have it all!

But that’s the funny thing about ‘having it all’: it’s as much an illusion as believing you have nothing. These are two extremes. And many high achievers are never truly content, because it is the feeling of not doing enough that drives them.

Have you ever met someone who clearly didn’t ‘have it all’ but who seemed so happy? Yes. It is a little-known fact that people with less are often more content. The old adage that money won’t buy you happiness isn’t far wrong. Of course, most of us want to be in a position where we do not have to worry about finances; if you don’t have enough to live without those fears, you don’t have the luxury of being content with less. However, if you are in that lower-middle band, where you may not have achieved what you believe describes success but have enough not to worry about putting food on your plate or fixing your roof after a bad storm, then you may know contentment. Perhaps this is why people in socialist-friendly countries like Norway and Sweden tend to rate highest on the world happiness scale: basic needs such as maternity leave, child credits and health care are taken care of whether you work or not, leaving people free to consider the actual process of being content.

Being content looks different to different people.

So an over-achiever who lives and dies by their measure of success, which often verges on the unrealistic, may not be as content as someone who isn’t such a high achiever. This is why there are two types of people who most commonly drop out of university: the extreme under-achiever (no surprise) and the extreme over-achiever. The latter is a surprise to many, and the reason is equally shocking. Over-achievers are more likely to implode, self-sabotage and leave university for what many of us would consider absurd reasons. Case in point: when I was a teacher, I often saw straight-A students drop out because they made an A-, on that basis alone. Their grade was the kind most would kill for, but to them, a 95 instead of 100 was the equivalent of abject failure. Many times I have met super-intelligent people working ‘regular’ jobs because, for this reason, they couldn’t finish university and get the job they wanted.

These discrepancies in how we perceive high achievers teach us that just looking at someone doesn’t tell us who they are. Our perception of others is often wholly wrong. If you doubt that, consider a family of ten: two parents, eight kids. Invariably, each child holds a different perspective on their childhood and experience. They do not grow up the same, even if they’re identical twins. Why? Because our individualism comes from our perspective, and no two people hold the same perspective. Perspective is more than your religion, your gender or your life experiences; it is the culmination of everything, and then the actual vantage point from which your view is formed.

Imagine a room with those ten people in it. Something significant happens, and ten people walk away with ten slightly different perspectives and experiences, just by being individuals. It is why eyewitness testimony is so unreliable: human beings are too subjective and bound by their own perspectives to be objective.

Consider, then, the over-achiever: whilst you may envy their success, they may be tormented. Equally, they may be happy, yet still feel they have not succeeded by their own standards. Someone who is ‘ordinary’, if any such definition can be made (the argument being that we’re all extraordinary in our own way), may be more content without having achieved as much. In this case, it is more than okay to be ‘just ordinary.’ Modern society possesses many exacerbating stressors, such as the pressure to gain a certain grade to get into a coveted school. Despite being an over-achiever, it may not be enough, and then even the over-achievers ‘fail.’ By not having those pressures in the first place, you avoid the potential success but also the downfall. It’s not easy being under the spotlight with everyone expecting you to perform. The advantage some over-achievers have is that they may excel under pressure, or be galvanised by it. But what of those who don’t find pressure helps them?

Back to my client. She was not content to be ordinary because she’d grown up with the competing pressures of a high-achieving brother who ‘could do no wrong’ and her own feeling of inadequacy. But how much of that was hard truth versus her perspective? In reality it was all her perspective, and because of that, it was also possible to shift that perspective and give her a chance at being less judgmental of herself.

How do you change the way you perceive things? How do you stop judging yourself for not measuring up if you are bombarded by subliminal messages that you’re not ‘enough?’

What constitutes ‘enough’ when we’ve got disparate views on what enough means? If we have low self-esteem, we’re more likely to judge ourselves harshly on not being enough than if we are content or somewhat satisfied. If we’re too satisfied, or narcissistic, we may believe we’re far ‘better’ than we truly are. The delusion exists at both ends of the spectrum, telling us that extremes usually don’t work out. There’s a big difference between being ambitious and letting ambition determine your self-worth. Literally speaking, there is no such thing as not being enough. There is no such thing as someone being better than someone else. Those ideas of inequity begin and end with our faulty perceptions. If we see everyone as potentially equal and follow the path we want to be on, we’re more likely to reach contentment than by striving to please others or following someone else’s agenda for us.

Likewise, if someone is more beautiful, so what? If your boyfriend leaves you for someone more beautiful, it’s going to hurt, but you’re better off without him. If someone more beautiful is also more talented, then good for them. If someone more beautiful and more talented gets a job over you, good for them. You don’t have to take it personally, because it’s not personal. There will always be someone more talented, more beautiful, richer, more ‘successful’ in the eyes of the world. Once you realise that, you are free. Free to be yourself. Unapologetically. In many ways it comes down to freeing yourself of what others think. This is not something we can usually do until we reach a degree of maturity. If you’ve ever noticed that older people don’t seem to give a hoot about what people think of them, this might be why!

Perception is a funny thing. Whilst you may be lamenting your failures and ineptitude, someone else may dream of being exactly where you are. Someone may admire you for exactly who you are. That’s because our internal notions of unquestionable truth are slanted by our inability to be objective about ourselves. How many times have you admired a person who seemingly feels they can do no right? Artists who think their work is awful, when so many love it? Dancers who leave their careers thinking they’re never going to be good enough, when they have enraptured entire audiences? Models who believe they are ugly? Novelists who never write a second novel because they thought the first inferior? The perceptions we hold can be faulty and sabotage our creations. When that straight-A student thinks she’s mediocre, someone else might be dying to be just like her. Conversely, the student who doesn’t put as much onus on being the best may be less tortured.

Navigating the world is often challenging. Ensuring you have enough self-respect, but not too much or too little, is a life-long balancing act. It helps if you can read the cues of others instead of assuming everything is a personal attack. An interesting book about the different people you work with taught me early on that not only are no two people alike, but we must consider how people come to the table, and what forms their differences, if we want to stand a chance of communicating with them. In today’s world, where neuro-diversity is more commonplace and many people have complicated communication styles and diametric differences, it’s more important than ever to learn how to communicate with people you may struggle to understand.

Consider that only sixty years ago the world had approximately half the population it has in 2022: competition is fiercer than ever. Being ‘ordinary’ could make it harder to secure certain jobs. But ironically, employers gravitate to ‘ordinary’ because they want reliable, efficient workers and are often put off by high achievers (and, of course, low achievers) as too extreme. The top isn’t an enviable place when you consider the varied pressures it entails, with no guarantee of attaining the level striven for.

On the other hand, I appreciate my Millennial client’s perspective, deluged as she is with Instagram photos of seemingly perfect people in every sense of the word. Everywhere, people want to be something more than they are. The few who reach contentment realise the years wasted focusing on ‘more’ – which doesn’t mean ambition is bad, or desiring to achieve is wrong, but like everything in life, it’s about balance. I told my client that imagining how she will feel in ten or twenty years’ time is a valuable exercise. Why? Because we don’t do that. We think of what we want in the future (financial security, good health etc.) but not what we will FEEL in the future. Sometimes imagining what you’ll feel in ten or twenty years’ time is a way to gauge what you really need (and don’t need).

Ultimately, there will come a time in many of our lives when we look back and say, ‘That wasn’t important; why did I spend so much time focused on it?’ As the old adage asks: on your deathbed, will you wish you had spent more time in the office? At the same time, there are those who live for work, and there is nothing wrong with that. It’s about finding what we want, rather than what we think we should be. Avoid the ‘if only’; consider what you really want now and in the future, and aim for that. Be realistic. And in so doing, you may come to see that being ‘ordinary’ is a privilege not afforded everyone, and something to cherish. Nobody is perfect. Nobody has to be perfect. Perfection is recognising that who you are is enough. For some of us, being ‘ordinary’ is the nirvana we hope to attain.


Candice Louisa Daquin is a Psychotherapist and Editor, having worked in Europe, Canada and the USA. Daquin’s own work is widely published; she has written five books of poetry, the most recent, Pinch the Lock, published by Finishing Line Press. Her website is www.thefeatheredsleep.com


PLEASE NOTE: ARTICLES CAN ONLY BE REPRODUCED IN OTHER SITES WITH DUE ACKNOWLEDGEMENT TO BORDERLESS JOURNAL.

Categories
The Observant Immigrant

Seasons in the Sun

By Candice Louisa Daquin


I was lucky enough to avoid the era of self-help books for the most part, but I remember, when I was a kid, many adults joining groups devoted to such pursuits. At the time it seemed sensible: what’s wrong with helping yourself, or letting others help you? By the time I was old enough to work, one of my early student jobs was, unsurprisingly, in a bookstore, and I was surprised at how many self-help books were still selling. Eventually, those self-help books all seemed to share the same theme: ‘Be positive. It’s the answer to everything.’

The message didn’t sit well with me once I began working with clients. It seemed the pressure of being positive, of having to reframe everything you thought, felt or said on a daily basis, would be exhausting rather than healing.

Yesterday was one such example. In a small group of ten, I was talking to a mom who had recently had breast cancer. I said, “That’s a shame,” about something that was said. She immediately turned to me and said: “I try to reframe everything to a positive, so I don’t think it’s a shame, I’m all right with it.” The conversation was shut down as fast as if I had said, “I don’t care.” Fine, I thought, this is her way of handling things post-breast-cancer. It made sense: she’d been through a lot, and being positive was working for her. Nobody else in the group said anything other than positive platitudes, and I left feeling like I had bathed in honey, but not in a good way.

When we can’t have balanced, adult conversations that include an acceptance that not everything is going to be 100 percent peachy 24/7, we fall into conversations of platitudes, small talk and falsity. Granted, I’m usually one of the first to be positive, but like anything, balance and a sense of time-and-place factor into this. If someone comes to me crying because they lost a parent, I’m not going to say, “I try to reframe everything into a positive.” That would be insensitive. Likewise, sometimes people feel they cannot talk about things because they’re going to be deliberately or inadvertently shut down for saying something that isn’t shining and positive.

The reality, however, is that we do suffer. We can be scared. We are exhausted sometimes. We may have fears, or feel overwhelmed, or just depressed for no reason. None of those emotions are ‘wrong’, and by aggressively reframing every perceived ‘negative’ thought, we’re cutting out our need to express ourselves. Yes, there are definite advantages to thinking positively, but we may have gone too far.

My clients overwhelmingly share with me that they are exhausted by, and wary of, the positivity police. By this they mean the mothers-in-law who shut them down for saying they’re tired or fed up or cranky or peri-menopausal. They embrace the ‘squad’ of female and male friends they can go out with for an evening and talk to candidly, no positivity police around. They are stressed from having to watch what they say, much as others complain that they cannot comment on a girl’s pretty dress for fear of being labelled toxic or inappropriate.

Linguistic change is good. It can help erase some common stereotypes, but if it shuts us all down and prevents candour, then it’s also harmful. Jumping on someone for not being ‘positive enough’ isn’t very different from telling someone with clinical depression to ‘cheer up’ or ‘get over it.’ And we all know how well that goes.

Granted, sensitivity training has fallen by the wayside since the advent of social media, but in some ways it has reinvented itself as the erasure police: groups of people who take it upon themselves to pontificate on the right others have to their own feelings. If those feelings are racist or sexist, then maybe someone should say something, but if they’re simply about how we’re feeling, does it help to tell someone they should be more positive?

Like anything, it’s how you say it, when you say it and why you say it. I have a client who is so negative it does her harm, and so it behooves me to try to reframe her thinking – but I do so respectfully, and in the context of therapy. If we were friends out for lunch, I wouldn’t shame her in front of others by saying she was too negative and needed to be more grateful and positive. That’s not friendship; that’s gaslighting[1].

The other day I was talking to a client whose husband died a few years ago. She was told by relatives, both his and hers, that she was ‘taking too long to grieve and needed to get back to living’. On the surface, she agreed, but later she felt bullied by their words, as if she’d fallen short of what was socially acceptable. She told me angrily (and has given me permission to share this) that she didn’t think anyone had the right to dictate how she should feel about losing her husband, or the duration of that grief; nor did it help to be shamed into thinking she wasn’t ‘doing enough’ to get over his death.

Again, clinically, if a client is experiencing challenges with grieving beyond the intensity of the experience, they may wish to process this and find ways through it that are more expedient. But that’s a very different thing from being told, by those you want to trust, that you’re failing to get over something that shouldn’t have a prescribed mourning period. The cult of positivity can be a stranglehold when it goes too far, as anything can.

Is there an alternative?

We agree that positivity tends to benefit the beholder and others, whilst negativity can be harmful. We also agree everything must come at the right time, and in the right balance for the individual. People are different. Ironically, some of my clients and acquaintances report they feel ‘less pressured’, ‘more relaxed’ and ‘less judged’ when they hang out with sardonic, less positive people. Contrary to popular belief, the most positive person in the room isn’t always the most popular.

I can relate to this, because when too much positivity is heaped on my plate, I feel akin to a performing seal: it’s inauthentic, tiring, and doesn’t leave me feeling positive. Sometimes a really hard day, with plenty of negative experiences, can act as a better reminder of the value of life than someone shouting out positivisms. Likewise, if I watch a film where everyone is radiant and happy, it can seem less authentic; sometimes it’s the struggle and the endurance, rather than, say, the happy ending, that captures my interest.

Why else do we appreciate dark humour, or like watching psychological thrillers? It’s a bit like people saying you can only appreciate happiness if you have experienced grief. There is a wise lesson in the necessity for balance and reflection of both. If we police every sentence and dictate every action into what we believe are positive traits, we may be exhausting our natural state. It is possible to be realistic without being negative or positive. It is possible to be reflective without giving up hope or forgetting how challenging life can be. For some, extreme positive thinking appears to work, just as working out in a gym for five hours a day works for some. But not everyone lives in extremes; in fact, most of us do not. If I’m hearing from my clients that they are exhausted by having to maintain the appearance of positivity, then we’re doing something very wrong in thinking this is the answer to everything.

There is a time and a place for everything, or a season. Sometimes, when the sun shines and it’s a beautiful day, we feel capable of maintaining a positive mood all day, and it seems to radiate from us. Whilst this may be an ideal, it’s not likely to be sustainable, and if we don’t manage to sustain it, should we feel ashamed or believe we have failed?

Ironically, I could sit at a table with a bunch of self-deprecating, sardonic folk and have a real laugh, where we’re not all pretending everything is peachy but are pretty happy all things considered, and that’s without any need to reframe each word into a positive. There is something very liberating and freeing about being yourself, not having to watch what you say to such an extreme. When we police ourselves, we’re not doing therapy, or work on ourselves, so much as being self-conscious about what we say, and often inauthentic.

So many times I talk to people I meet and they are unremittingly positive, but later, when they know me better, they reveal a totally different side of themselves. A man I have known for some time makes a conscious point of being positive 24/7, because he’s in marketing and believes instilling positivity is how he sells efficiently. But when he’s had a glass of wine and it’s the evening, he will reveal to us at the table, as we sit talking, a completely different side of himself. I can see why he needs to maintain the positive side for work, and I admire that, but I often wonder if he is a little tortured by the pressure he puts on himself to be like this 24/7, when it’s clearly not his ‘natural’ state.

Funnily enough, I like the ‘real’ side of him far better. It’s more realistic, less clichéd and narrowly focused, and, I believe, just as content, without the need to put the shine on all the time. Growing up with friends and family who were not afraid to be realistic or even negative, I can see both the value and the downside of negativity. Being negative all the time reminds me of the saying ‘nothing comes from nothing’, whilst realism is underrated and underutilised. I wouldn’t want to be as self-defeating as some of those whom I grew up with; I believe in empowering people and supporting them, which involves believing in them and being positive. But I also think too much of a good thing can be insincere.

Sometimes, when a client comes to me and says they’re having an awful time, I know they do not want me to reframe that and tell them tomorrow will be better, or that if only they could see the positive in the event, things would improve. That would be offensive and inauthentic. Rather, they want me to sit with them, digest their experience, share it and be a friend. Sometimes a friend doesn’t need to make things better; they just need to care and show up.

The extreme end of positivity feels a lot like a cult. It’s unrelenting, it changes language and natural feelings, and it acts like a cancel culture on many authentic emotions, shutting them down. One of the best things I ever heard from a therapist was from a colleague who told me: “Sometimes it just stinks and you have to throw your hands up and say, I give up! Until something changes, which it might not, for a long time, but eventually it will. If during that time you feel awful, that’s just how you are going to feel, and it’s okay to feel that.” That therapist was radically different in their approach. There’s something honest and real about this advice that I don’t find when I’m overdosed on mantras, affirmations, positive sayings and memes.

Too much of anything can be an overdose, and that includes positivity. While a teen may get a lot from a positive meme on Instagram, they may also feel less alone reading a negative one. The point is to avoid the extremes of embracing darkness or pretending it doesn’t exist. The point is to remember we’re humans, not trainable robots. Don’t we already stifle emotions enough by trying to be strong all the time? When did being honest about how you feel become anathema? If your heart tells you that you are exhausted by trying too hard, then examine this. Take a step back. Be authentic to yourself. Don’t follow the crowd. Follow your gut instinct. Sometimes the extra slice of really sugary cake isn’t delicious; it’s nauseating. Stay balanced, kingfisher!


[1] A form of emotional abuse and manipulation. https://www.healthline.com/health/gaslighting


Candice Louisa Daquin is a Psychotherapist and Editor, having worked in Europe, Canada and the USA. Daquin’s own work is widely published; she has written five books of poetry, the most recent, Pinch the Lock, published by Finishing Line Press. Her website is www.thefeatheredsleep.com


PLEASE NOTE: ARTICLES CAN ONLY BE REPRODUCED IN OTHER SITES WITH DUE ACKNOWLEDGEMENT TO BORDERLESS JOURNAL.

Categories
The Observant Immigrant

A Post-Pandemic Future …?

By Candice Louisa Daquin


Having been a reluctant fan of apocalyptic fiction since reading George Orwell’s Nineteen Eighty-Four (1949), I studied virology when the AIDS pandemic struck and have read a great number of books on viruses and infectious diseases. Despite this preparedness, and the knowledge that it was not a case of IF but WHEN the next virus would strike, I think I speak for most of us when I say we were still unprepared for Covid-19.

What the pandemic has taught us thus far is immeasurable, and I believe, or at least hope, those lessons will last several generations. That said, it’s human nature to want to move on. Not because we don’t care, but because part of being alive is putting trauma and suffering behind us and ensuring those who survive truly survive, which means living. Is that insensitive, or just the nature of the beast? It can be insensitive, especially to the millions who have lost loved ones, but it’s also how humans generally operate.

Is it possible to move on and live a full life irrespective of this global tragedy without losing our compassion and responsibility to stop this from ever happening again?

The reality is: it will happen again, and for many of us, in our lifetime. What we can do is be better prepared, with all that this entails.

What are the steps being taken to move toward the new post-pandemic future? What are we doing differently? And why?

The pandemic divided us; it physically kept us apart. Some who were well-versed in social skills, true extroverts, struggled when they emerged from the worst of the pandemic. They found it hard to do the things they used to be so skilled at, from lack of practice. I recall sitting at lunch with a friend who used to be the life and soul of any social event. She struggled, as she put it, for ‘her words’. Having become so used to speaking less and not being face-to-face, she said it felt ‘overwhelming’ and ‘strange’, and she looked forward to going home.

That is a habit we must break. The comfort of the living room and the immediate family is intoxicating. We can rapidly get used to living in a smaller, seemingly safer, changed world where we see fewer people, go out less, and become accustomed to an intimate circle. For some of us this was always our life, and maybe not as challenging a shift as it was for those who previously socialised a great deal.

In a way, the pandemic was harder on the extrovert than the introvert, because while introverts aren’t averse to socialising, they can find it exhausting, whereas extroverts gain energy from it. When you put an extrovert in a forced setting without social opportunity, they may struggle more than someone used to their own company.

But it’s not as simple as extroverts and introverts. Many of us are a little of both, depending on the situation. I can go out with a big group one day, but on other days I want to be alone. Few of us are extremes; most of us are ‘ambiverts’, a combination of the two.

For those who do thrive on socialising, the pandemic was particularly challenging, but there are many ways to be affected, not least the tension and anxiety all of us picked up on or directly experienced.

Fortunately, technology became our best friend as we Zoomed more and met via video chats throughout the world. It opened up an international stage more than we’ve ever experienced and gave children a new normal in terms of learning online. But learning solely online had deleterious effects on underperformers. This ‘unfinished learning’[1] particularly impacted youth who might already have been struggling in the educational system.

Having taught Critical Thinking online for years, I genuinely believe online learning cannot replace in-class learning. There are huge draws to learning from the comfort of home, especially for adult learners who do so after work[2]. “In comparisons of online and in-person classes, however, online classes aren’t as effective as in-person classes for most students. Only a little research has assessed the effects of online lessons for elementary and high school students, and even less has used the ‘gold standard’ method of comparing the results for students assigned randomly to online or in-person courses.”[3] The amount of information retained is drastically smaller, and the social engagement of a classroom has benefits that are hard to quantify but necessary for social development. When you rob children of the opportunity to socialise with each other, you isolate them at a crucial stage in their development.

Some kids with learning disabilities[4] are particularly affected by this, as are those who come from unsafe or impoverished backgrounds, where they may not have equal access to technology or reliable internet. They may not have parents who can help them when they are stuck, or who are able to work from home, or they may lose access to school lunch. All those necessary elements of the education system were lost in our need to stay home and protect each other. A generation of children will always remember this time as a result.

On the other hand, they have mastered technology in a way few older generations can boast of, and they are conversant in all the myriad ways of communicating with a wide range of technologies and devices. They are adaptable, versatile and fearless when it comes to tackling the rigours of online learning. For some who dislike social settings, it may even be a vast improvement[5].

Women left the workforce in droves[6] when the pandemic hit, with two million fewer in the workforce. The inverse was true for men, who began to return to work after years of declining participation whilst women’s rose. The Pew Research Center found: “What accounts for the larger labor force withdrawals among less-educated women than men during the pandemic? It is complex but there seems to be a consensus that it partly reflects how women are overrepresented in certain health care, food preparation and personal service occupations that were sharply curtailed at the start of the pandemic. Although women overall are more likely than men to be able to work remotely, they are disproportionately employed in occupations that require them to work on-site and in close proximity to others.” Jobs men traditionally do, like physical labour, were in high demand, whilst many jobs traditionally filled by women were shut down, often never to return[7].

We can be glad our restaurants are open again; we’re opening borders, we’re flying abroad, we’re living again. But let’s also spare a moment to think of those who lost so much it’s almost impossible to conceive. Covid was the third leading cause of death in America during the height of the pandemic; how did this many deaths become normal? Covid killed an estimated 13% of people over 80. Aside from the tragedy of a generation of elderly dying[8], and the loss of grandparents and parents for so many, we’ve also seen younger people dying from a virus, which has shaken the belief younger people hold that they are impervious to viruses similar to the flu. What effect will this have on their sense of safety going forward?

And what of the health consequences for those who technically survived a bout of the virus but developed ‘long Covid’, or worse, the side-effects and lingering legacy of being seriously ill?[9] How many lung transplants will occur? How will ‘long haulers’ cope with lingering serious effects? What of those who live in countries where treatment isn’t an option? How many chronic illnesses will continue for decades as a result of this pandemic? It’s not enough to count those who have died; we must also include those who survived, but at such a high cost.

Financially we have collectively poured money into research, vaccines, countermeasures and prevention, but where has that money actually come from? And can we feasibly borrow that much money from our coffers without a reckoning? Economist Anton Korinek, an associate professor with a joint appointment in the University of Virginia’s Department of Economics and the Darden School of Business, thinks: “People sometimes frame the policy response to COVID-19 as a trade-off between lives and livelihoods, and they ask whether it’s worth killing our economy to save people’s lives. But what they forget is that people won’t go back to a normal life and consumer demand won’t really recover if the virus is spreading through our country and killing people.” But the result of these hard choices and repeated closures is that economists now predict an impending recession of global proportions, one which had already been mounting prior to the pandemic but promises to be far greater in its aftermath. I don’t think we’ve even begun to see the fallout; it begins with massive inflation, but that’s just the start[10].

History tells us that when we go through challenging times and survive (‘the near miss experience’, as it’s known), we want to live more than ever before[11]. But economically this will not be possible for the many who are robbed of their financial security by inflation, redundancy, underemployment and post-Covid illness. None of us are all right while many of us are still suffering; if this battle with Covid has taught us anything, it is that we should care more about each other and support those who struggle.

Perhaps these are the steps we can take to move toward a new post-pandemic future, where we consider ways we may be better prepared for an inevitable future of emerging viruses. We can try to find ways to avoid spilling into areas with high disease potential. “According to a group of UN biodiversity experts, around 1.7 million unidentified viruses circulate in animal populations, of which 540,000 to 850,000 have the capacity to infect humans.” So, we can avoid wet markets and sloppy scientific research, both of which are vectors for the spread of viruses. We can pay more virus hunters [12] to seek out emerging viruses and begin work on treatments before they devastate countries. We can be borderless in our unanimous approach to equity for all, especially access to healthcare.

In America, we learned we were far from unassailable. In a New York Times article about Covid deaths, the authors wrote: “For all the encouragement that American health leaders drew from other countries’ success in withstanding the Omicron surge, the outcomes in the U.S. have been markedly different. Hospital admissions in the U.S. swelled to much higher rates than in Western Europe, leaving some states struggling to provide care. Americans are now dying from Covid at nearly double the daily rate of Britons and four times the rate of Germans.” Nothing can diminish that fatal statistic or rectify the unnecessary deaths[13]. Our healthcare system, considered superior, proved to be full of holes. Without some type of socialised healthcare, our costs are too high and our resources too scarce. We don’t value front-line workers like nurses, porters, assistants and care staff, and we do not pay them for the risks they take; and whilst we do pay doctors good wages, we still suffer severe shortages. Finding out we didn’t have enough ventilators, masks for medical staff, PPE and beyond exposed the shame of putting profit over people. [14]

It is no surprise then that the UK and USA were among the top offenders in the rise and spread of the pandemic, and their death rates exposed this. No one ethnic group appears to be at greater risk of dying from the virus based on ethnicity alone, but Hispanic, Black, and Native American (AIAN) people are about twice as likely to die from COVID-19 as their White counterparts, and Hispanic and AIAN people are at one and a half times greater risk of COVID-19 infection than White people[15]. This is caused by social reasons (inequality), not ethnicity, as can be seen in Africa and some Asian countries having some of the lowest Covid mortality rates. In the article ‘Racism not Genetics’ in Scientific American, the authors point out: “the genes that influence skin colour are distributed independently of genes that influence the risk for any particular disease. Given the heterogeneity of groups we call “black” or “white,” treating those categories as proxies for genetic variation almost always leads us astray.”[16]

Even if there are increased susceptibilities related to blood type[17] and age (more than 81% of COVID-19 deaths occur in people over age 65; the number of deaths among people over 65 is 97 times higher than among people aged 18-29), the real risk is how healthy the population is and whether they have safe access to healthcare[18]. Both America and the UK failed because they put profit above people and have large populations of sickly people[19]. Going forward this needs to change, which means redesigning what we prioritise. People need to have access to healthcare and to make lifestyle changes that will reduce their risks, which they cannot do if they cannot afford to see a doctor, or, in the case of the UK, find it hard to see one because of long wait times and reduced staffing. It’s not as simple as socialising healthcare; as the UK proved, this alone doesn’t save lives. What saves lives is considering the larger picture.

But politicians gain from older populations dying. Consider what happened in Brazil when the President denied the danger of Covid, and for a time Brazil had the highest Covid mortality[20]. This is the harsh truism rarely mentioned: it benefits those in control of a society to lose the most fragile members, who will suck up precious resources. Much like a form of eugenics, it behooves them to let it happen, and there are many examples[21]. For a politician who is looking for ways to reduce healthcare costs, what is better than some of the potentially most expensive ‘customers’ dying? This happened in France, where thousands of elderly people died during one summer’s heatwave; shockingly little was said at the time, but all signs pointed to a collective sigh of relief from those in power, who benefited from fewer older people making claims on an already taxed medical system[22].

When Italy [23], Spain [24] and Brazil [25] became epicenters of Covid-19 deaths, they did so because of ill preparedness, and it is a cautionary tale to witness which countries succumbed to the ravages of Covid-19 repeatedly, versus those who learned from them. What we have learned is that more, not less, needs to be done, and if a country keeps its borders open, including air travel and business travel, then as much as it hopes to save its economy, it does so at the expense of its most vulnerable. For some countries this was a conscious choice (economy over lives), whereas for others it was poor communication and slow response times. For some, a lack of money; for others, a desire to gain at any cost. All this speaks of the tapestry that is the pandemic’s aftermath (and truly, is it really vanquished?)[26].

I’d love to say a new post-pandemic future looks rosy, but the only way that happens is if we learn from our mistakes, which history tells us we rarely do. The most important thing is empathy: when we saw others take their masks off and simply not care if the vulnerable died, we saw how far we as humans can fall. But we also saw how wonderful humans can be, including the infinite sacrifice and compassion of thousands who sought to help strangers. If there is a way we can reward the good and not the bad, if we can get our priorities right and stop paying sports figures astronomical sums but instead emphasise compassion, kindness, and diligence, we can all grow together.

I was particularly moved by the youth who, in the turmoil of the pandemic, created inventions or systems to help others[27]. Believing youth are our future, and thus our hope, it gives me great faith when I see those too young to vote care for strangers and seek to do their part. We should always encourage this, as we should encourage a continued dialogue into how we can create an international rapid response to emerging diseases. It is not if, but when, and now all of us should know this and have no excuse for putting our heads in the sand again. Yes, it hurts to think of it; yes, we’d rather go off and have fun; but what fun is it if we are only postponing the inevitable return of a lethal virus? Part of being responsible for our planet and each other is not avoiding the harsh truths: of environmental changes and devastation, global poverty, continued inequality and elitism, and of course, the increasing risk of deadly diseases.

We have within us all the power to effect change. The steps we take toward a post-pandemic future must necessarily include keeping our eyes open and not taking the easy road. Sure, governments don’t want to spend the money on research, science, virus hunters, predictions and preparedness, but I challenge anyone to say this isn’t exactly what they need to do. It is necessary we keep this in mind when we vote and protest. We should be marching about this as much as any other cause, because it affects us all equally and brings us together in one cause.

Thinking in terms of one world, we are less divided than ever before, and whilst we were separated, I think we also found ways to come together, if we chose to. I say we should, because together, globally, we learn more than we ever would divided. With the Russian offensive on Ukraine, we see the lunacy of war, the futility, the devastation and waste. Instead of pouring millions into wars and keeping the rich rich at the cost of the poor and overworked, we should consider how we can all rise out of the mire and evolve towards a better future. But in order to achieve this we cannot be complacent, and we cannot let our guard down.


[1] https://www.mckinsey.com/industries/education/our-insights/covid-19-and-education-the-lingering-effects-of-unfinished-learning

[2] https://www.forbes.com/sites/dereknewton/2021/03/31/the-worst-of-times-for-online-education/?sh=401d57623a5a

[3] https://www.edweek.org/technology/opinion-how-effective-is-online-learning-what-the-research-does-and-doesnt-tell-us/2020/03

[4] https://www.washingtonpost.com/education/2021/05/20/students-disabilities-virtual-learning-failure/

[5] https://penntoday.upenn.edu/news/how-technology-making-education-more-accessible

[6] https://www.shrm.org/resourcesandtools/hr-topics/behavioral-competencies/global-and-cultural-effectiveness/pages/over-1-million-fewer-women-in-labor-force.aspx

[7] https://www.pewresearch.org/fact-tank/2022/01/14/some-gender-disparities-widened-in-the-u-s-workforce-during-the-pandemic/

[8] https://www.statista.com/statistics/1191568/reported-deaths-from-covid-by-age-us/

[9] https://www.hopkinsmedicine.org/health/conditions-and-diseases/coronavirus/covid-long-haulers-long-term-effects-of-covid19

[10] https://news.virginia.edu/content/economist-societal-costs-covid-19-outweigh-individual-costs

[11] https://www.psychologytoday.com/us/blog/understanding-grief/201803/aftereffects-the-near-death-experience

[12] https://www.france24.com/en/africa/20201218-gabon-s-virus-hunters-in-search-of-the-next-covid-19

[13] https://www.nytimes.com/interactive/2022/02/01/science/covid-deaths-united-states.html

[14] https://www.theguardian.com/us-news/2022/feb/06/us-covid-death-rate-vaccines

[15] https://www.cdc.gov/coronavirus/2019-ncov/community/health-equity/race-ethnicity.html

[16] https://blogs.scientificamerican.com/voices/racism-not-genetics-explains-why-black-americans-are-dying-of-covid-19/

[17] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8286549/

[18] https://www.bbc.com/news/world-us-canada-52245690

[19] https://theconversation.com/why-has-the-uks-covid-death-toll-been-so-high-inequality-may-have-played-a-role-156331

[20] https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(21)00529-8/fulltext

[21] https://foreignpolicy.com/2021/03/18/china-covid-19-killed-health-care-workers-worldwide/

[22] https://www.nbcnews.com/news/world/hong-kong-covid-outbreak-rcna20033

[23] https://www.france24.com/en/tv-shows/revisited/20210528-covid-19-in-europe-codogno-the-italian-town-where-it-all-began

[24] https://www.nytimes.com/2020/03/13/world/europe/spain-coronavirus-emergency.html

[25] https://www.scielo.br/j/rsbmt/a/8FzbQZY57WRTwYL9MnBKBQp/?lang=en

[26] https://www.nature.com/articles/d41586-021-03003-6

[27] https://circle.tufts.edu/latest-research/deeply-affected-pandemic-youth-are-committed-helping-others

.

Candice Louisa Daquin is a Psychotherapist and Editor, having worked in Europe, Canada and the USA. Daquin’s own work is also widely published; she has written five books of poetry, the most recent, Pinch the Lock, published by Finishing Line Press. Her website is www.thefeatheredsleep.com.

.


A Bouquet of Retorts

By Candice Louisa Daquin

Language. Children learn it before they understand its importance. Adults can struggle to learn additional languages because the brain is less elastic as it ages. The formulation of language is a key component of what makes humans human. Our language (though not our communication) is speculated to be among the most complicated and rule-bound of any living creature’s. Even if this is disproved in the future, when we are able to translate other animals’ languages, we can all agree the impact of language on us is invaluable.

Why then are our language skills diminishing? For every person who has benefited from being able to look up information online and thus know a little bit about a lot of subjects, we have simultaneously reduced our language breadth. We are increasingly tempted to take shortcuts linguistically, both in writing (texts and emails) and in how we speak to each other. It may be tempting to blame this on social media, but it’s not that simple. This is not new: throughout time, there has been enormous value put on ‘banter’, ‘ridicule’, ‘sarcasm’ and pithy retorts.

Perhaps people who can summon lightning-fast repartee are considered witty, nimble-minded, fashionable. Contemplate those who have been considered ‘cool’ socially. Those who had the quick response, the short soundbite that cut to the chase or was easily repeatable, were often admired. Just recently Ukrainian President Volodymyr Zelenskyy, in response to President Biden’s offer of a flight to get him and his family out of harm’s way, replied: “I need ammunition, not a ride.”

People applauded his response because in many ways it describes the crux of what Volodymyr Zelenskyy and his country are going through and portrays him as a brave leader. Just as Clint Eastwood in the Dirty Harry series said the much-emulated words, “Go ahead, make my day”, and Arnold Schwarzenegger popularised “hasta la vista, baby” in the Terminator series, alongside other iconic statements made by film stars, celebrities, politicians and authors. It’s the admiration of cutting to the chase, emulated by millions, as a means of extracting the essence of what we’re trying to say, or merely being glib. We utilise quotes now more than ever, through the social media medium which makes it easy. The only question is: when we scroll through quotes and need ever-increasing variety in our lives, are we really absorbing the meaning behind the soundbite or merely parroting it?

There is a history behind the proverb, quote and parable. They were a means of remembering wisdom easily, for those who might have been illiterate, or before books were widely available. Along with songs, this was a method of retaining what was not written. Religion has employed this through easy-to-remember choruses and proverbs; it has long been human nature to reflect on life through such sayings. In the 14th century, the popular proverb “He who sups with the Devil should have a long spoon” and other sayings were a means of social control or wisdom, depending. In quotes from Aesop (of the famous fables), “We hang the petty thieves and appoint the great ones to public office.” We can admire the truism of this, just as we admire the bravado of a film hero saying “make my day.” Perhaps Confucius said it best: “They must often change who would be constant in happiness or wisdom.” And quotes are a ready means of growth that don’t require the commitment of reading, say, The Four Books and Five Classics of Confucianism.

But when do quotes stop being educational and become, instead, inadequate shortcuts to knowledge? Where I live, people don’t read whole books very often anymore. They excuse this by saying they are ‘too busy’ to read. I work a 60-hour week and I read. I can understand emergency room doctors and new parents not having time to read, but regular folk? You can tell how much someone reads by asking them what their favourite books are. If they quote more than one high school book, chances are they haven’t read much since high school. Reading is an evolutionary experience and we grow as we read. Many people have forsaken television (which isn’t a bad thing) but also the genre of fiction, believing it has no worth. Are we ‘better’ for reading less fiction, and for reading fewer complete books and more online news channels and texts and memes? Do we lose something? And if so, what do we lose when we absorb language in this different, shortened way?

When was the last time you sat down and had a really in-depth conversation that wasn’t about your parents’ dementia or a breast cancer diagnosis or something that serious, but was simply an analysis of a book you read, a play you went to see, a film, or a discussion of politics or history or psychology? Granted, some of us may never have sought to do this, and that’s okay, but among those who used to analyse, it seems analysis is less mainstream and now very specific to your job field. Fewer people sit on Sunday afternoons and read the paper from front to back. More scroll quickly, gleaning the basic amuse-bouche but nothing of substance. If you are a marketing analyst, you analyse market trends. If you work in the financial sector, you may analyse financial impact. If you are an economist, you may consider economic development. Because most of us work such long hours, do we really have time, energy (or desire) to analyse things we don’t have to analyse?

It could be a sign of the times, of modernity, cultural shifts, progress even, that we don’t need to delve as deeply. We are no longer subject to poring through piles of textbooks to handwrite a paper. And some of that progress facilitates other knowledge, such as an ability to navigate the www… and beyond: to understand HTML and design websites and publish books all by ourselves, things that formerly would have seemed impossible. The scope of things we can do with technology, for example, has expanded our choices recreationally and professionally. Kids are creating entire music albums in their bedrooms with affordable equipment, people are making whole films on their iPhones, others earn a living filming themselves for social media platforms, models are made by Instagram photos, and we have all become graphic designers and editors of our own stories.

On the flip side of that, jobs that once promised a living wage, such as graphic design, photography, editing, translation and music production, are being replaced by cheaper options. Platforms like Fiverr can design your tattoo, create a corporate logo, a book cover, anything you should desire, for a fraction of the price a professional would charge, because their sellers are borderless, not beholden to the rules of old, and could well be a 16-year-old practising graphic design skills from their bedroom. On the face of it, there is nothing wrong with that. If someone in Indonesia wants to offer those services for a fraction of the price a US supplier would charge, and undercuts them in the process, isn’t that just the consequence of a free marketplace? If that Indonesian supplier wishes to charge what for them is a relatively fair wage, but in the US market would be considered a pauper’s wage, is it exploitative to the Indonesian? Or must we accept an individual’s choice to make that decision for themselves?

Should we question what ‘choice’ means if there is, effectively, less choice when someone offers services for less? What is equitable? Ethical? With technology, half the time we don’t know the country, nationality or details of the people we work with, and it becomes less important. The only reason it is important is if someone is working from Nepal for $3 an hour whilst someone is working from Australia for $25 an hour. Simultaneously, if everyone is aware of that inequity but agrees nonetheless, then who are we to complain? Cost of living varies, and maybe charging commensurate with your cost of living is a more realistic model than across-the-board flat rates. How much has changed since the internet opened borders and countries to a greater freedom of choice in commerce and services than ever before? As with anything, there is exploitation and there is improvement, and there’s no one simple answer to ensuring everything is fair, or nobody is exploited.

How does this relate to language? Or a series of retorts? It comes down to shifting social mores and what our expectations are. With this comes a modification of language, much like what you read about in science fiction novels of the 1960s, where the homogenised aspect of the world watered down uniqueness in favour of uniformity and created a melting pot where language, among other things, was diluted for simplicity’s sake. Interestingly, science fiction also created entirely new languages (conlangs, or constructed artificial languages) and worlds, so one could say it added to the future as much as it abstracted it. Using a posteriori languages (those that borrow words and grammar from existing languages) has become commonplace, from Spanglish to Yiddish in America.

In the article ‘SMS & Its Negative Effects on Language’ (www.itstillworks.com), the authors note the mass use of shortened ‘slang’ language employed by societal groups, not least teenagers. Such habits have been speculated to carry over into reduced grammatical and spelling abilities, although conversely it could be argued that if teens are writing more (even badly), it encourages those who formerly may not have written at all. If you think how much your social skills have deteriorated since the pandemic because of lockdown and less direct socialising, is it that hard to believe persistent use of abbreviations and icons could replace language fluency? In the article ‘Alienating the Audience: How Abbreviations Hamper Scientific Communication’ (www.psychologicalscience.org), the authors point to the loss of deeper meaning when employing constant acronyms.

In The Times of India article ‘Shortening language has negative implications’, the authors point out that a misuse of technology (always being ‘on’ and responsive to it) ironically reduces efficiency, and that ‘infomania’ can cause an overload of information. Being ‘dumbed down’ by technology and linguistic abbreviation could reduce the sharpness of our knowledge. Conversely, The Atlantic says in its article ‘Teens Aren’t Ruining Language’ that while ‘fad’ words may have a different trajectory online, they don’t have the power to ‘debase’ linguistic standards. “How much a person’s vernacular changes over time may have as much to do with personality and social standing as it has to do with age. The extent to which teenagers are credited with (or blamed for) driving lasting change to language … is grossly overstated.”

Whatever language we speak, we may be aware of this shift away from seeking depth: not only reading fewer complete works but expecting a synopsis instead. How does this affect conversations? Social interactions? What do we value and consider ‘worth’, and what does the shift from meaning to soundbite imply? It is good to be able to Google everything and think we understand things we may not have looked up if the internet did not exist. But simultaneously we’re aware that what’s online isn’t always factual so much as a series of compiled opinions. If history is written by the victor, then doesn’t it stand to reason that what we assume is ‘fact’ shifts dramatically? What people in China right now are exposed to as ‘incontrovertible truth’ is not the same as what people in Russia are being exposed to, or America, or India. With so much variation in what represents pure truth, shouldn’t we worry about that? Or are we imagining a past where truth existed in a purer state than it ever did?

Journalists used to have to write objectively about subjects, unless they were writing opinion pieces or columns. Even with the latter, there was a responsibility to report news from both sides, and not let personal bias taint the reporting or the information being presented. When you read news articles now, if you step back and try to put your own political views aside, what do you think about the reporting of those subjects? Is it objective? Take any side you want, but what you’ll find is often blatant or subtle bias, invested in projecting one perspective as the ‘only truth.’ Even the most basic Wikipedia definition of journalistic objectivity states: “To maintain objectivity in journalism, journalists should present the facts whether or not they like or agree with those facts. Objective reporting is meant to portray issues and events in a neutral and unbiased manner, regardless of the writer’s opinion or personal beliefs.” Why then is this not demanded?

I admire journalists who can step outside of their own views and write on a subject without that natural bias. Non-journalists are biased and look to journalism as a fair reporter of facts, from which we can make our own minds up. Reporting shouldn’t be an opportunity to tear one side down to promote another. In America, the backlash against Donald Trump was perhaps the greatest witnessed in this country, because of the deep divide in voters and the horror felt towards him by many. He was considered dangerous for the country, and irrespective of whether that was true, the majority of news outlets were 100 percent against him. When I brought this up, I was told I was obviously a racist who supported Donald Trump, otherwise why would I even care? This missed the point. I cared not because I wanted to defend Trump, but because I felt objective, rational, non-biased news reporting had been completely eroded.

It’s more important to me that we retain that objectivity even in the face of things we may personally revile. A journalist who is unable to be objective forfeits the right to condemn another, because they are not utilising that objectivity in their analysis. Maybe we cannot expect regular, everyday people to be free of bias, but when the moral underpinning of your job requires it, then you owe it to your readership not to pander to their outrage and stir the pot, but to present an objective overview. The same is true of social media ‘conversation’, where a subject is presented and people sound off, often becoming offensive, outrageous and exceeding the remit expected if we were all sitting in a room together. The anonymity afforded by a screen and physical distance seems to have opened a Pandora’s box of horrors.

People can be unrecognisably offensive in their attacks on others, for no discernible reason. It should be possible to discuss any subject without people devolving into personal attack and ad hominem. Has our use of language also been altered by our anonymity online? The oft-disputed Sapir–Whorf hypothesis states that the language we speak influences the way we think. This verges on potential fascism if it judges one language superior to another, but the point is taken. The complexity of language has been linked to higher IQs, which may explain why Finnish and Japanese students tend to be in the top ten achieving academically, their languages being among the most complex. Likewise, people who are polyglots and speak multiple languages, or those who are musical (often considered a language) or mathematical (likewise), tend to possess higher IQs. The problem with this is we will never know what other influencing factors play a part, including nurture and nature.

This paradox between cancel culture, which looks to demote those who say things deemed offensive, and the increasingly offensive backlash and gaslighting of others, seems to point at the hypocrisy of ‘free speech.’ It’s only free if the right person is saying the right thing; otherwise you’re liable to be cancelled, but watch out, because that’s subject to social fashions that vary wildly. Beyond that hypocrisy, what of the actual quality of communication? Have we devolved to the point of only being able to say what we really think anonymously? Why do people write reams online but say little in a real-life group? Are we hiding behind rules and etiquette that only create sub-groups with no tolerance for other groups? Does groupthink ever produce anything other than subjective thinking?

If aliens came to Earth, they might notice humans seem to admire ridicule and socially sanctioned sarcasm and call it wit, without needing to have a deeper conversation. Twitter represents this phenomenon most acutely because the actual length of your post is restricted, and thus you have to encapsulate what you want to convey by honing it to that character count. That’s great if you’re delivering a highly edited statement, but how naturalistic is it as a form of legitimate communication? Does it replace your grandmother sitting at the kitchen table with you for an hour? Maybe it doesn’t have to, but do we still sit at the kitchen table and have those conversations? And if we don’t, will that affect what human communication is evolving (or devolving) into?

Sometimes flim-flam is appealing, in its starry simplicity. We embrace Digi-Fiction written and read from computers, changing how we process fiction. We can be attracted to the code-switch of language that takes the guesswork out of communication. For neurodiverse populations, and for people increasingly using actual code, such as HTML, as a form of language, it may seem like a logical next step: an improvement on the guesswork of complex modes of communication that were challenging for so many. At the same time, when we lose the ability to read a book from beginning to end, we lose the patience and journey of that process, which, if not replaced, may be a genuine loss we cannot even fathom. Then again, in the spirit of all possibilities, it could be we leave behind that which is not necessary and embrace a Haiku perspective of saying more in less.

I confess, when I read a ‘classic’ novel I am aware of how much superfluous information exists that doesn’t strictly need to. It is interesting to consider how much language we used to use to say one thing, compared to now. The medium of social media means we’re busier than ever and take our ‘fix’ of what attracts us (quickly) before signing off. Therefore, long poems have less attraction than shorter memes. We fixate on the easily presented, the humorous and immediate. Nuance, subtlety, slow burns: those are almost luxuries we may leave for rare nights in the bath. Novels are changing to adjust to this phenomenon. Graphic novels are gaining further traction, even songs. Our entire social fabric has altered, and in some ways this was inevitable if you recall we always admired that witty, fast retort, going back to Marie Antoinette, Gloria Swanson and beyond.

Does this mean all language must conform to this new rigour? Or will pockets of devotees to other forms of communication endure? When I browse through bookstores, I notice there are many styles of writing, including the long-winded and the easy read. My fear is not that we read ‘easier’ books but that we stop reading altogether, believing scrolling on our phones compensates for the discipline of reading a book. One may argue, do we need discipline? But learning is invariably discipline, and part of honing rigorous learning habits is being challenged. We can do crosswords, play chess and sudoku, and go to the gym to maintain a healthy body and mind, but the ‘imaginarium’ of fiction and the need for creative expression is, for many of us, equally necessary.

Fiction isn’t a waste of time simply because it isn’t literal. Fiction, as a speculative field, has inspired science, politics, social advancement and a sense of possibility. Fiction can thrill, entice, or simply entertain. Not all forms of entertainment are equal. Whilst I confess I do watch television, I recognise the lasting value of a book compared to a serial, because it requires more of us mentally. For those highly disciplined souls, there may be no need to ‘indulge’ in fiction, or television. Maybe reading Scientific American or poring over The Financial Times or Angler’s Digest will be their choice. But language has a trickle-down effect, and you can guarantee it will eventually permeate all sectors of our lives.

Do we want to completely dilute the value of further explanation, detail and depth in favour of the glamorous soundbite? Or is it possible to harness the value of succinct communication and retain the continued relevance of detail? When I read what passes for scientific news in popular media, it concerns me that we are picking and choosing for ‘clickbait’ purposes, and this leads to the proliferation of inaccuracy. Case in point, the startling headline: ‘The Epstein-Barr virus (EBV) causes MS!’ Then, in the meat of the article, we find that 95 percent of humans have EBV, whilst only 35.9 per 100,000 people (95% CI: 35.87, 35.95) have MS. If nearly everyone carries the virus but only a tiny fraction ever develop the disease, the virus alone plainly cannot be the whole explanation; yet we leave out the most important connections in favour of scaremongering journalism, which only serves to increase (inaccurate) neuroticism when it should seek to educate and elucidate.

It’s not that too much information is bad for us; it’s that too much incomplete information can distract us from truth, and we may learn to gloss over what matters in favour of what shines brightest. Sometimes it is necessary to finish the chapter.

.

Candice Louisa Daquin is a Psychotherapist and Editor, having worked in Europe, Canada and the USA. Daquin’s own work is also widely published; she has written five books of poetry, the most recent, Pinch the Lock, published by Finishing Line Press. Her website is www.thefeatheredsleep.com.

.


The Paradox of Modern Communication

By Candice Louisa Daquin

Microaggression is a relatively new term used to describe less direct actions that can cause offence. It is a term that has been setting social media alight for the past few years, sometimes for good, sometimes not so much.

Having just attended another course on this buzzword du jour, it struck me how absurd some of the ‘lessons’ on identifying microaggression were. One could argue it’s pop-psychology and political correctness gone awry.

Before we get to the absurd extremes of policing microaggression, it’s important to explain why acknowledgement of passive-aggression and gaslighting (manipulating a victim into doubting their own perception of reality) does matter and should be discussed (if perhaps not exhaustively).

Think back to those days of old when eve teasing in offices was considered de rigueur, when a woman of colour with an afro was told it wasn’t an acceptable ‘hairstyle’ and was threatened with being fired if she didn’t ‘tone it down’. Think of a Muslim worker being told he couldn’t pray during work hours, a Jewish colleague not being given time off for Hanukkah, a black footballer having bananas thrown at him whilst he played, or an Indian politician being called a ‘Paki.’

These all still happen despite being labelled as racism, bigotry, hate and microaggression in the workplace and beyond. They are intolerable and unacceptable and can lead to suicide, depression, financial insecurity and feelings of ostracism. The laws protecting against these are sometimes hard to enforce. People who do follow through on a complaint are considered ‘troublemakers’ and their careers are thwarted by this unfair reputation.

Stereotyping is part of human nature, even as we try to rinse it out. When we are unfamiliar with a culture (and sometimes when we are familiar with it), we reduce people to descriptors that can be stereotypical. In my case, I’m often assumed to be English because I have a slight English accent. Incorrectly assuming this, people often ask me if I would like ‘tea’ (I hate tea) or make fun of my accent. Friends from other parts of the world have the same experience. It’s annoying and a constant reminder that I’m not American. I’m an immigrant.

For people of colour, this is even more extreme. If you are light-skinned you are ‘assumed’ not to be a person of colour, though you might be (many are); if you are dark-skinned you are more likely to be ‘assumed’ a person of colour (though you might not be); and if you have African or Asian ‘features’ (as judged by a stereotype of what constitutes ‘being African’ or ‘being Asian’), you are assumed to be African or Asian, though you might not be.

However, this is a complete minefield, and I want to take a few scenarios to illustrate this point in relation to the trend toward calling out ‘microaggressions’, which, if you let it go too far, can be every bit as exclusionary, judging and alienating as if we go in the other direction and return to a mass acceptance of bigotry.

Before I share some examples, I should say there may be a middle ground where we can all be relatively certain of fair treatment, even if this cannot undo historical bias and the mistreatment of our ancestors in the past. I would say this is the place we want to aim for, but we’re not there yet. My issue with microaggression is that the whistle is blown so often we’re becoming blunted to real outrages, in favour of a daily knee-jerk response to little errors in our current way of communicating, which also, by default, leaves us fearful of saying or doing anything for fear of offence. Yes, it is possible to go too far. And before you say I’m coming from a position of privilege and that’s why I’m saying this: no, I’m not. That’s an example of what I’m talking about: taking things so far with political correctness that none of us can say or do anything without fear of serious reprisal.

My friend attended the same course on microaggression. We discussed it afterwards. She is African American. I am mixed-race but have white skin. She noticed the course tended to use examples with black and white characters and wondered why, when race and ethnicity were not specifically mentioned in the course. The point of this microaggression course is to point out the varied ways you can be microaggressive without knowing it, and what you could do about it. However, as my friend pointed out, it’s almost a no-win situation.

The first scenario of microaggression is a (seemingly) white, blonde woman commenting on a (seemingly) black woman’s hair, saying “I like your hair”. Why is this microaggressive? Because it’s culturally insensitive: it’s the (white) mainstream fetishising ‘exotic’ hair, which can cause the person being complimented to feel embarrassed, self-conscious, or stigmatised and singled out for their (non-white) hair.

Ironically, if my friend were not African American, could she even say what she thought without being criticised for being microaggressive herself? Fortunately, she is, so this is what she said, and she cannot be called out for saying it because she’s African American herself:

“I thought it was a stupid scenario. A blonde woman is just as likely as a black woman to have people comment on her hair. She wouldn’t think it was racist/insensitive, unless the person went too far and started touching it or saying things that were sexually inappropriate like ‘You have really sexy hair’. Most of the black women I know would be happy if someone said they liked their hair and wouldn’t think it was culturally insensitive if that person wasn’t the same race/ethnicity – as much as anything because how can they be sure they’re not (of the same race/ethnicity)? Plenty of people who don’t look mixed-race are, or their parent may be, and they might be a light-skinned person of colour saying to a darker-skinned person of colour ‘I like your hair’, in which case that cancels out the microaggression, which is assumptive at best. But even if it were a white person saying this, if their motive was simply ‘I like your hair’, then assuming they mean anything more/less than that is assumptive, and thus more of a microaggression than the original statement.”

My take on it (although by sharing this I can be accused of being microaggressive because I have light-coloured skin) is:

“I have told friends of colour (all races/nationalities) ‘I like your hair.’ Never once did I mean anything less or more than that statement. Recently I saw a girl with hair down to her knees. I said ‘Wow! I like your hair!’ She was (seemingly) a white girl with brown hair, though I have no real idea of her ethnicity (nor did I think about it). She was really happy. But according to the microaggression lessons, if I say the very same thing to a girl who does look (to me) to be African American or a ‘minority’ (which is kind of racist in its own right, and thus absurd, because how can any of us know for sure what someone ‘is’? By thinking about it so much, I find this more potentially offensive than not thinking of it), I would be being microaggressive?”

My friend and I talked about this at length until we got to the impossible scenario, which is this: two women meet and one says to the other, “I like your hair.” The other woman says, “Thanks! I like your hair also.” One is microaggressive, but the other isn’t?

This is microaggressive because if one of the women is a person of colour (in any way), then the person saying “I like your hair” to the (supposed) person of colour is being microaggressive, because it’s insensitive for a person who is not of colour to say this to a person of colour.

But what if neither knows what the other person is, because it’s not clear, or they don’t want to assume (which is a good thing)? What if they’re both saying it for the same reason?

Well then, according to the course, they could still be microaggressive, because one of the other concerns is someone ‘hitting’ on someone else in some way, or pointing out something ‘personal’ about them, which could make them feel uncomfortable.

Play the scenario again: two white women (we’ve removed race for the time being, because this was one of the ‘issues’, and we want to see if there are any others in the scenario) meet, and one says to the other, “I like your hair”, and the other woman says, “Thanks! I like your hair also.” The one who said it first could be accused of being microaggressive because she overstepped the work relationship, which should be professional (meaning, no personal comments). The other person may have replied out of feeling she had no choice but to respond. If one was a boss and one was not, then it could be a microaggression of power, and if one was richer than the other, it could be a subtle put-down of her income. And so the list goes on.

Find anything absurd yet?

Again, let me qualify this by saying I am all for equality, and for treating people compassionately, with dignity, respect and cultural sensitivity. But I think, as my friend said, this can go too far and become a minefield of absurdity.

We laughed and then asked: what if the two women were wearing t-shirts that said ‘I am heterosexual’, so the issue of sexuality was removed, and both women were white or both were black, so the issue of race was removed? Would it be okay to say it then? My friend cleverly pointed out that wearing a t-shirt that said I was heterosexual (as stupid as that is) would be deemed offensive to those who were not. So basically, the bottom line is: you cannot win; you cannot stop going down the rabbit hole.

Here’s the truth. If someone comes into work and touches your hair or your pregnant belly without you asking and makes a big fuss, then you might feel they have violated your space. But if they literally said, “I like your hair”, it would not necessarily mean they were being microaggressive. I know a lot of people of colour who compliment each other all the time. If we segregate compliments by race, we’re dividing people, not bringing everyone together. If we say only a black woman can say to a white woman “I like your hair” but not vice versa, we’re creating absurd rules.

Obviously, such a rule is necessary sometimes. The “N” word is a word people of colour can and do sometimes use with each other, but if a person outside their race says it to them, it is racist. That’s true. It is a double standard that stands because of the history of racism and discrimination, murder and hate. I still think anyone, irrespective of their race, shouldn’t use the “N” word because of its history, but that’s beside the point.

Sometimes a white woman might say something derogatory to a black woman, and that statement will be wrapped up in a passive-aggressive ‘compliment’ such as “I like your hair.” But until you know the motivation, isn’t assuming this impossible to prove, and thus impossible to police? By shutting all such comments down, aren’t we dividing each other more, segregating our language and what can and cannot be said to each other? How does this help if we then become so afraid of saying anything to anyone?

Suppose my colleague who lives alone asks how my New Year was, and I say, “Not bad, how was yours? Did you spend it alone?” This is a microaggression because I’m potentially ‘shaming’ my colleague for living alone, whereas in reality I asked because they had previously told me they spend New Year alone. Can you see how we’re increasingly walking on eggshells? How does it help relationships to be that stilted and self-conscious? Isn’t it true that we’ll likely offend everyone repeatedly in little ways, but if they ‘know’ us, they will know we didn’t mean to, and that it was a blunder rather than something intentional?

So how do we police and protect when it is intentional? Someone I work with once said to me, “You are a very long-winded writer with lots of detail, aren’t you, Candy?” I felt ‘hurt’ because I thought it wasn’t entirely true and was a negative characterisation. But were they gaslighting me or simply stating their opinion of my writing? Does everyone I know have to think I’m a good writer? Is it wrong to allow that I could be more long-winded than they are? Does it nullify my writing? Or is it simply an opinion, not wrong or right? I let it go because I suspected it was not intended to be gaslighting or passive-aggressive, and at the same time I considered how I could avoid being too long-winded.

Surely the same can be done in any conversation or interaction without us having to police every single sentence or condemn people for things they may simply not have meant? I hate the statement “you are too sensitive” because it implies there is such a thing as being too sensitive; yet, as we all know, there are times when we’re too sensitive to what is being said, and it’s our assumptions of (their assumptions) that hurt us more than what they actually said.

Yes, if someone says “I think people of colour need to comb their hair more and keep it straight”, that’s obviously out-and-out racist and awful. It’s unacceptable. But if someone says “I like your hair”, they may simply mean it, perhaps because their own hair isn’t thick or a nice colour, or is lank and lifeless, and they perceive your hair to be beautiful. It may be nothing more than that, and why should it be a statement only people in your ethnic group can say? I get told “I like your hair” quite a lot, because I have very long hair. Granted, I don’t have afro hair, but my mom does, and she was told “I like your hair” a lot too. She didn’t think it was racist, and it likely wasn’t.

Now if she were told, “I like your hair and your chest, want to go to bed?”, that wouldn’t be okay. Just as “I like your hair – how do you people with afro hair manage it?” wouldn’t be acceptable. Are the nuances easy to remember or understand? Especially for the neurodiverse (those whose brains process social cues differently), they can be a confusing minefield, and this might be one reason we’re more likely to impose rules than try to pick apart nuance: modes of communication seem to have intricate layers that are hard to unpick. Isn’t it simpler, then, to put rules in place that call someone out for microaggression? But in so doing, we shame good intentions as well as bad.

Maybe microaggression policing can go a little too far in its zeal to police everything and everyone, and we’d be more cohesive if we didn’t impose a multitude of rules on conversation. I’d like to be able to talk to people without fear. I think we can do this again. I can be trusted not to deliberately gaslight or obfuscate or passive-aggressively shame or put down. I know the difference between someone who says, “Oh, I like your hair” in a bitchy voice, and someone who simply says, “I like your hair!” I’m not going to treat them both like thought criminals, or continually watch what I say. I’m going to trust myself to be conscious and listen to others, and in learning what their boundaries are, hopefully adhere to them most of the time. By treating us as if we live in a nanny state, we lose the art of communication and of coming together. I think we need to get it back, more now than ever before.

.

Candice Louisa Daquin is a Psychotherapist and Editor, having worked in Europe, Canada and the USA. Daquin’s own work is also widely published; she has written five books of poetry, the most recent, Pinch the Lock, published by Finishing Line Press. Her website is www.thefeatheredsleep.com.

.


To be or Not to be…

 Candice Louisa Daquin takes a close look at death and suicide.

The Death of Socrates by Jacques-Louis David (1748-1825) Courtesy: Creative Commons

Recently I’ve been grappling with the imminent death of a very dear friend. She is very young. Maybe her youth, or maybe my attachment to her, makes this particularly moving for me. I have dealt with death before, and it is never as simple as going softly into the night. Yet it is something we must all deal with, multiple times, often intimately and up close and personal when it comes to our turn. This may be a macabre subject but, in a way, it shouldn’t be, because it’s the one thing we are all guaranteed to experience. Maybe it’s time to see death and dying differently?

In my friend’s case, she wanted passionately to live. She didn’t want to get cancer in her thirties and have the rest of her days taken up with trying to survive. It’s unfair, it’s horrible, it’s what most of us fear. Despite this, her grace leading up to her death, including her time in the hospice, has been startling to me. I saw a change in her that I found unexpected and illuminating. One could argue it was the hand of God touching her; those who do not believe might credit inner peace. As much as I rail against her dying, and stupidly wish something could prevent it, what helps is knowing that what I feel and think is nothing compared to what her family is experiencing. That puts things in perspective and demands I rein in my feelings. By that, I don’t mean ignore the sadness, but put it in a context, because everything is a context.

My context is this: a few years back my beloved cat died, and I was very upset. But I knew he was a cat who had lived a long life, and he was ‘just a cat’ versus a human. For me, that context is different (despite my ultimately believing us all equal). If my father passed on, it would devastate me more than, say, a stranger’s death, and that’s just being honest. When my grandparents died, I was very upset, but I put it in the context of expecting grandparents to die, and whilst I still think of them and miss (some of) them, I’m able to cope. I think this is true of my clients; they rationalise by contextualising. It’s how we cope with grief and fear. When we cannot do this, we usually develop some type of imbalance, and that can cause us to have unexpected responses, like feeling as if it is the end of the world when a pet turtle has passed on. Everything is context-based. While it might seem insensitive to admit this, it’s what keeps us sane.

People can be dismissive of others’ pain. I have often heard people say things like: “Well, her parents died more than a year ago, don’t you think she should be ‘over’ it by now?” When their own time comes to lose their parents, they may finally understand why most people don’t get ‘over’ losing parents, or siblings, or their children. Others believe you are weak if you don’t ‘suck it up’, or that there is some time limit on grieving. In the faith I grew up in, we tend to give a year for grieving parents (or close relatives), which allows the griever not to feel guilty (or guilted) for their emotions; it frees them. On the other hand, it could be negative if, say, after a year they feel worse, as many do, because grief is not linear. Nothing is perfect. The important thing is to have empathy and realise we all process differently, and nothing is wrong or right except for the individual.

If we all die, then we shouldn’t stigmatise illness or fear it, but many people do. It is the great unknown. In modern times, a worse fear is suffering. We may not know what happens when we die, but we know suffering, and nobody wants to suffer, yet increasing numbers of people do. Why? We have always suffered to some extent. In wars, people would lie on battlefields for long periods, dying of awful wounds. In modern times we harnessed the power of painkillers and believed ourselves free of pain and thus of suffering. Despite this, quite the opposite phenomenon has occurred. In some ways, people suffer more.

Why? Because with the advent of ways to prolong life came medications that extend life, but at a price. When a person gets cancer, say, it’s no longer a death sentence. Many recover fully and go on to live long, healthy lives. For those who get a late diagnosis, however, the cancer might have spread so much that eventually it will take their life. This is where modern medicine seems to shine. There are a plethora of medications that prolong life: not cures, but drugs that cause remission or sustain life for a period of time.

If you are someone with children or grandchildren, living an extra two years might be worth the astronomical sum it costs (if you can afford it), but at what cost? Many of these drugs cause horrible side-effects and do not reduce suffering; in fact, it is the suffering (toward death) that becomes the experience. You could say we have prolonged suffering in our attempts to give people ‘more time.’ The question then is: is it worth it, and what do we mean by ‘more time’?

Whilst doctors want to offer hope, they often do so more out of a stubborn desire to ‘try anything’ than because the six months they may give a dying person is really beneficial, when you consider the sheer backbreaking cost (medical bills being the number one cause of bankruptcy in America) and the very small gains – six months more of life, having spent all your money on a treatment that chiefly benefits Big Pharma, according to Dr. Azra Raza in her groundbreaking book about death and dying, The First Cell. The unwillingness of doctors to give up is admirable and very human (who wants to tell someone there is no hope?), but it brings with it a false promise.

In the modern world people have smaller families by and large, and as such many of us, when we get older and more likely to die, may have fewer obvious incentives to live if we get sick. It’s not so much a wish to die as a pragmatism about death and dying. Easy in theory, perhaps less so in practice, but we must all eventually deal with this issue. Better to consider it before the time comes – hence the rise of living wills, whereby we inform doctors and loved ones what we want to happen should we get sick. Again, macabre maybe, but eminently sensible, and useful because doctors are not mind-readers and cannot predict what a patient coming into Emergency may wish by way of life-sustaining treatment.

Speaking of life-sustaining treatment, one long-held debate relates to dementia. Right now, if you were to write in a living will that, should you get dementia, you do not want any life-sustaining treatment, that directive would apply only if something acute happened to you, such as a heart attack or your kidneys shutting down. For many with dementia, though, it is the day-to-day living that is hard, both on the individual, who on some level knows they are not themselves, and on those who care for them. Whilst it seems inhumane to some to consider letting people with dementia die, those lucid enough to know they have dementia will often wish to die rather than live without any hope of recovery, in the mental state of a child without any chance of growing up. Since dementia is so commonplace and we’re all living longer, this is a real issue, and yet little is done to provide for the growing number of dementia patients who may have nobody to look after them and no funds for full-time care. What should be done?

Clients of mine with parents suffering from dementia will often describe the agonising decision to put them in care, if they have the financial wherewithal, or to take care of them at home, often at great expense (loss of job, career). Some wish nothing more than the opportunity to care for their parents or loved ones, whilst others feel guilty for wishing it were anyone but them. Both perspectives are understandable. For some there is a redemptive quality to caring, and it comes naturally. For others, financial or emotional reasons may make caring for another incredibly hard. It is not surprising that older carers, such as the husband of a woman with Alzheimer’s, often die before the person they care for. The act of caring is as exhausting as it may be redemptive, and nobody should be blamed for being unable to do it. Sometimes, however, there is no choice, given the rising costs of elder care, especially for dementia. It makes me wonder what will happen if the predicted numbers of elderly develop dementia. Will it be commonplace to see them walking the streets with nobody to look after them? Take me, for example: should my father develop dementia while I live on a different continent, what would I do? This is something many of my clients fear, and yet little is done by world governments to resolve it.

Some people believe dementia patients should be allowed to access euthanasia. As of now, all countries with some degree of access to euthanasia require that the patient be ‘sound of mind’ – which would not apply to someone with dementia. Even if you write a living will stipulating that you wish to have euthanasia should you develop dementia, by the time the stipulation applied you would no longer be sound of mind, which negates your eligibility. Furthermore, what of unscrupulous relatives or friends jockeying for inheritance or other financial gain? How can such a provision be protected from misuse? How can we gauge whether someone with full-blown dementia really wants to die? The reason this matters is that by the time a woman is 80 she has a one-in-three chance of developing dementia of some kind, and by the time she’s 90 that chance is one in two. This already affects, and will increasingly affect, a huge swath of the population, and as such these conversations need to be had. The only way to do that is to consider what end of life means and how best to end life.

End-of-life conversations are common in therapy. Clients may be bereaved. They may have a partner dying of a terminal illness; they may have lost a child, parent, or friend. Perhaps it is a relief to have counselors, because many people don’t want to ‘go there’ and talk about death and dying, as if it were contagious. It’s a hard subject, a sad subject, and difficult to put a positive spin on. Life can be tough enough without contemplating death before the time comes, but that’s exactly what we should do to protect ourselves.

The other subject not discussed in depth is suicide. It’s one thing for us to die of disease – we know that’s a possibility – but we’ve always struggled with the idea of suicide. For most of history, suicide has in various ways been condemned by society at large, but the question is why? Is it an ancient fear of the ‘unnatural’ impulse to die? That because we work so hard to stay alive, it seems absolutely wrong to wish to die? Or something else? Whilst Émile Durkheim (1858-1917) wrote on suicide and many since have questioned his conclusions, no one I am aware of has managed to explain our collective horror in relation to suicide.

Let’s play devil’s advocate for a moment. What is it about another’s decision to take their own life that causes people to react so strongly? Is it the same as abortion – Pro-life versus Pro-choice? Or the visceral horror we all have about dying? If the latter, why does it matter so much to us what others do? What right do we have to intervene? Yet we do, societally and individually. One might say we’re just trying to save lives, but do we put as much passion into feeding the starving or rescuing women from abusive relationships as we do into banning abortion and making suicide illegal or socially frowned upon? Where it’s not illegal, it’s considered ‘weak’ and a ‘cop out’ (to use an American term). The only society where suicide was really ‘embraced’, if such a thing can exist, is Japan, and nobody really knows why that singular country took the notion of suicide to such an art form.

Suicide bombers today may be Muslim fundamentalists protesting against the tyranny of an oppressive country. They are sold on the idea that the afterlife will be a paradise, and much of what they are told comes not from the Koran but from the machinations of those who impel them toward acts of terror. When a young person blows themselves up to destroy others, this is not the suicide we think we understand, but an act of anger, revenge, justice or ignorance. Suicide in its more common form is less about revenge or justice but can be about ignorance or anger. Ignorance, in that many young people kill themselves almost on impulse, with little forethought or planning – a reactionary, ‘of the moment’ act that shocks loved ones who didn’t see it coming, because it may not have been coming. There is an element of ignorance to this: they may be copying others, or reacting abruptly, and had they known more in that moment, they would possibly have looked back on the act and regretted it. As such, the impulsive element of the act could be viewed as ignorance or naivety.

Anger is a very common reason for suicide, although the one everyone is most familiar with is despair (depression). Ironically, anger is as much an incentive for suicide as despair, because of the heightened tendency to react under both emotions. It is in that heightened state that many attempt or complete suicide, and those who survive often realize there were other options they wish they had taken. For the committed, suicide is a personal decision, perhaps based on a terminal diagnosis or long-term suffering. Despite this, people intervene and tell anyone who wishes to die rather than linger and suffer that they are weak for taking this option. I have always found that fascinating and awful, because I see it exactly the way I saw putting my cat to sleep. I did it out of mercy and love. I did not want my cat to suffer. A human being is not a cat, but the same emotion applies. Where is the true difference?

The difference lies in religion. People who condemn others for taking their lives are not universally religious, but many are. The tenets of a religion might dictate that someone should not take their life. I have always wondered why. Because I believe most religions were written and created by humans, it raises the question: why were they so intent on stopping people from taking their own lives? I can understand that if a God exists and says we should not take our lives, this would give pause to many. But it also creates a challenge, because what God would wish someone to suffer terribly?

This is a very personal decision, and that’s my final word on whether someone should choose suicide if they are suffering. To some extent it should be the right of the individual, and never anyone else, what they do with their life and death. I recall Brittany Maynard (1984-2014), a young woman with terminal cancer in her twenties, talking of her right to end her life rather than suffer brutally, and I felt then that even if I did believe in God, I would not strip her of that right, because I did not want her to suffer and I could not believe any God or human would. Again, if we have tools in place to deal with these issues, then people who are suffering terribly do not have to battle through this at their most vulnerable.

Many of us may not know this, but suicide, technically speaking, is incredibly common. Old people often stop eating and will themselves to die when they have had enough of life. Technically that’s suicide, even if they do not see it that way. Maybe it’s more natural than we imagined? Likewise, hospices and long-term care facilities will medicate terminal patients at the end to hasten their death and relieve some of the suffering – effectively causing a person to die before their ‘time’, if we think in God-given terms. I do not begrudge this, because these carers see people suffer in ways few of us can imagine; if there is not a mercy in ending a life of agony, then I don’t know what mercy really is.

The stigma of suicide is incredibly pervasive. Just like mental illness, suicide is seen as ‘not right in the head’ and a weakness. How sad that society believes judging people during their worst times is the right thing to do. How is that going to help someone choose anything else? And what of their right to end their suffering? This raises the question: is it right to die before your time? And by right, what do we mean? Who is the judge of what is right and wrong? For some, it’s easy: it’s God, and most Gods say suicide is ‘wrong’ – but again, why? And at what cost? For those of us without God, right and wrong become moral principles that we try to adhere to. Is there a downside to morality being the choice of humans rather than something greater than us? Absolutely. We are flawed and liable to influence, but sometimes trying our best is all we can do. I would like to think I would be merciful before judging if someone I loved dearly wished to end their life. It would of course depend upon the circumstances, as it should.

Switzerland is the only country in the world where you can elect an assisted death without a specific medical justification. In America, in the ten states that permit some degree of assisted dying, you need to be close to death and have a terminal illness.

Why is death sad?

Switzerland is testing a 3-D-printed pod that its creator says can painlessly end someone’s life in a matter of minutes. The device is called ‘Sarco’, and users can potentially end their lives at the push of a button. The pod fills with nitrogen gas, which lowers the oxygen level in the person’s body until they fall unconscious, and this occurs fast – within a minute. The idea of a truly painless death by euthanasia is why it was invented: users of Sarco do not suffocate; they die quickly of oxygen deprivation after they have fallen asleep.

Some have argued a machine like this could glamorise suicide (although how is not explained) and that it may be overly appealing to a mentally ill person seeking death. I don’t really agree. If there are checks and balances in place that work effectively, this won’t be an issue. The real issue is: who wants to consider people taking their lives when death is sad? And how do we assess, without bias, who is ‘eligible’ for euthanasia and who can be helped another way that doesn’t involve a premature death? After all, someone in the throes of depression may legitimately wish to die, but what if a year later they are well enough to be glad they didn’t? Those kinds of grey areas must be resolved before euthanasia could ever be expanded or widespread.

The Netherlands and Belgium permit assisted suicide for patients with unbearable physical or psychological suffering, which differs from the USA, where it can only be for terminal, end-stage physical illness. Switzerland has fewer qualms, making it a destination for ‘suicide tourism’ – which again is very sad, but perhaps what is sadder is that people have to resort to this at all, and what of those who don’t have the means? People assume those who are pro-euthanasia must be eugenicists or simply not care about others, but often the extreme opposite is true, just as it is with Pro-Choice advocates, who believe choice is freedom and the basic right of any woman.

Should we be able to die because we’re simply ‘tired of life’, or should we be expected to carry on to the ‘bitter’ end? This is such a cultural dilemma, and maybe suicide became more acceptable in Japan because the culture reframed the concept of death, making suicide honourable (or a means of regaining honour), just as suicide bombers find honour in the act. Should we need honour, or some justification, for suicide to be part of any argument about end of life? Maybe this is also connected to concepts of the death penalty. Many believe America will not have the death penalty in years to come – that it is a failed system that doesn’t work and scapegoats the most vulnerable. I see the death penalty as very different (a consequence for a bad act) from legally assisted suicide (to end suffering), and thus, when I consider arguments about end of life with clients, my considerations are about their quality of life and not the myriad ways we can consider forms of dying.

In an ideal scenario nobody would wish to die, perhaps, because so much can be found in living, and those who love us would not wish to lose us. Realistically, however, people get sick, people get tired, people suffer. As long as we recognise this in ourselves and others, debates about what end of life represents will be part of a larger, and necessary, conversation. Perhaps if we cannot see anything positive in this, we can at least not shun it until it is too late. Likewise, we could consider that death is not the worst thing that can happen to us – suffering might be (or the suffering of those we love) – and compassionate ways of alleviating suffering should be part of the conversation.


Candice Louisa Daquin is a psychotherapist and editor, having worked in Europe, Canada and the USA. Daquin’s own work is published widely; she has written five books of poetry, the most recent, Pinch the Lock, published by Finishing Line Press. Her website is www.thefeatheredsleep.com.


PLEASE NOTE: ARTICLES CAN ONLY BE REPRODUCED IN OTHER SITES WITH DUE ACKNOWLEDGEMENT TO BORDERLESS JOURNAL.