It worked for a friend of a friend of mine!

“You can just put an onion in a sock and lay it on your child’s chest to reduce the fever from measles. I’ve heard from people who say that it works!”

This is an honest-to-God post I read on an anti-vaccine forum (albeit slightly paraphrased). To set the record straight, there is absolutely no scientific or medical reasoning, research or evidence to suggest that this works. This post is what we call anecdotal evidence: little stories, lacking any real evidence, based on hearsay, that is, information passed on by word of mouth.

The truth hurts

The internet is rife with forums providing anecdotal evidence, mostly to do with cancer cures, ‘natural’ remedies and detox diets, and all lacking bona fide, rigorous, peer-reviewed evidence. So why do people appear to choose anecdotal evidence over actual facts? According to Scientific American, it has to do with the outcome.

Anecdotal evidence usually involves believing in a relationship between two things that does not actually exist, which produces a ‘false positive’ but largely harmless result, whereas failing to believe in a connection that does exist could lead to a harmful result. Doesn’t make sense? Basically, it means that it is easier to believe in a non-existent relationship if it promises the answer that you want than to accept the evidence to the contrary. This has to do with the human tendency to seek out patterns and see relationships between things even where none exist, in order to avoid the uncomfortable truth of the situation.

This is the opposite of scientific rigour, which normally involves trying to disprove a relationship between two things, and so guards against generating false-positive results.
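
To make the false-positive idea concrete, here is a minimal sketch in Python (the numbers are entirely invented for illustration): most measles fevers settle on their own, so an ineffective ‘onion in a sock’ would still generate plenty of “it worked for me!” anecdotes, and only a comparison against an untreated control group, as a rigorous study would make, reveals that the remedy adds nothing.

```python
# A hypothetical illustration: an ineffective remedy still produces anecdotes,
# because most fevers resolve on their own. The assumed recovery rate (80%)
# and group size (1000) are made up for the sketch.
import random

random.seed(1)
N = 1000            # people in each group (assumed)
P_RECOVER = 0.8     # chance a fever improves within a week regardless of treatment (assumed)

onion_group = sum(random.random() < P_RECOVER for _ in range(N))     # tried the remedy
control_group = sum(random.random() < P_RECOVER for _ in range(N))   # did nothing

print(f"'It worked!' anecdotes from the onion group: {onion_group} / {N}")
print(f"Recoveries in the untreated control group:   {control_group} / {N}")
# Both counts come out around 800: the anecdotes look impressive on their own,
# but the control comparison shows the remedy adds nothing.
```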

It comes down to hurt feelings

It is a mystery to me (and, I might add, a frustrating one) that people choose to believe word-of-mouth over scientific evidence, whether that evidence proves or disproves something. However, scholars investigating anecdotal evidence believe that the strength of belief in it stems from the boundaries between the ‘experts’ (scientists, doctors etc.) and the ‘lay’ people (the general public). As scientific knowledge increases, so-called ‘lay’ experts are disproven or disregarded, and their previous standing in lay society is lowered. So they argue against the ‘experts’.

Besides the multitude of websites and forums helping to unite ‘lay experts’, the media also plays a role. An example is a case concerning mobile phones and brain cancer, in which an individual sued a mobile phone company, blaming it for his wife’s brain cancer. Although there was no evidence to back up the claim, the media reported it as fact. This helped spawn the public belief that mobile phones cause brain cancer when, in actual fact, a very large meta-analysis of studies investigating whether phones cause cancer found no evidence to support the claim.

Last, but not least, anecdotal evidence can also be presented in the form of testimonies, which are almost always used to promote a product or an agenda.

Think, ‘by drinking this product, I lost 20 kg in one week!’ Another, and very famous, example of testimony used as anecdotal evidence is Andrew Wakefield’s response to the editors after his paper describing a (supposed) link between autism and the MMR vaccine was retracted for being fraudulent. He wrote:

“Clinician’s duties are to their patients, and the clinical researcher’s obligation is to test hypotheses of disease pathogenesis on the basis of the story as it is presented to him by the patient or the patient’s parent” (Wakefield 1998, 905; italics mine). Parents have said “my child has a problem with his/her bowels which I believe is related to their autism”…

I have hesitated to write about vaccines and autism, as I don’t really want to deal with the responses that will no doubt appear; however, the statement above is a perfect example of anecdotal evidence. Wakefield treated the parents’ conclusions about their child’s symptoms as evidence enough to demonstrate a link between the MMR vaccine and autism, when what he really had was anecdotal evidence. I don’t want to delve into this debate any further, because it has been demonstrated time and time again, including in a massive meta-analysis, that there is no link.

Final thoughts (and I am preparing for the onslaught)

So what do we learn from this? To me, it seems that people want to believe anecdotal evidence because it is easier to believe than actual peer-reviewed research and evidence-based data.

I also feel that people lean on anecdotal evidence because they do not need to be held accountable for whatever it is they are claiming, even while pushing an agenda or trying to discredit the experts. It is also frustrating that (as an expert) you can tell someone until you are blue in the face that ‘no, that is not true’, and they still will not believe you! The latter point is supported by scholars, who wrote that:

…anecdotal evidence is used to demarcate the boundaries of science…and as a site for contestation and negotiation between experts and lay actors in public scientific controversies….

So next time you hear someone say “it’s true, it happened to a friend of a friend” or “I read about it on X, Y, Z”, please question the source and the actual scientific-based evidence!

The fork in the road

Sorry everyone! There has been a gap between posts because I had a big decision to make…

This continues on from my earlier post on grieving for your career, and the article in Nature Blogs. I finally found a job, not as a researcher but as a medical writer: a good job utilising the hard-won skills from my time as a researcher and PhD.

But even though I was ready for a change and I was excited to start on a new path, I still felt guilt, and shame. Why, you ask? I felt that I was giving up on trying to maintain a research career, that I had failed as a post doc or as a career researcher. This is despite the fact that I have been at the bench for more than 15 years! I felt that people who were still in research, successful researchers and academics, would look down upon me and judge me, or view my change in career as selling out…

To make matters worse, after I had started my new job, I was offered a senior researcher position at a good institute, in a good lab… Now what was I going to do?

I spent more than two weeks agonising over the decision. Do I move back to my country of birth, but to another city? Do I start all over again, setting myself up, making new friends? The project was so appealing, and it would be my project, one that I could grow with.

What was the catch, you ask? It came down to the insecurity of funding in research and the length of the contracts. Compared with the permanent position I had just accepted, in an industry that also has a lot of potential for growth, a short-term contract paled. Oh, but it was a tough decision! I love being at the bench, and I love Australia, but I love writing and communication, and I love Paris… Finally, I ended up talking to a therapist to help me work through the decision.

This was actually more beneficial than I thought. The therapist identified patterns in what I said and thought about myself, regarding my identity and where I fit in the world. I have always seen myself as a scientist, and getting the PhD made me feel like I was part of an elite group. Deciding to change careers made me feel like I was losing a part of myself, and that maybe I wasn’t ‘special’ anymore. But I am, and will always be, a scientist. The therapist pointed out that the way I think and the way I approach problems come from my time as a scientist. He commented that my identity wouldn’t change just because I am no longer at the bench…!

My new job is very science-heavy. I do just as much writing and just as many literature searches as before, except I am not at the bench.

It seems to me that researchers and scientists often forget that there are alternative careers that utilise our skills. Junior researchers, PhD students and postdocs are more than aware of this, but moving into an alternative career requires mentoring, something that is often, albeit not always, lacking. This is why, when we are unemployed, we have no idea how to find these other careers. Leaving research is often not considered, and sometimes it feels like it is not encouraged unless a lab head thinks you ‘won’t make it’.

There is that toxic phrase again!

Final Thought

So what is the moral of this rambling prose? Buried deep in the writing, somewhere, is the thought that we scientists should not feel ashamed or guilty for leaving research to pursue other careers! Whether we realise it or not, we have been shaped by our time in research, and this is what makes us desirable candidates for many jobs.

I’m not saying these jobs are easy to find. It took me six months, a lot of stress and many internet searches to find my job, but there are other options. It is about time there was more support for helping people leave research without making them feel embarrassed or like failures.

End of rambling prose.

Hair today, gone tomorrow*

*sorry, I can’t miss the opportunity for a good pun!

Recently I decided to make an effort to reduce my plastic consumption and waste. To this end, I decided to ditch the (relatively) expensive plastic razors and research reusable safety razors. In doing so, I stumbled upon the obvious truth that there is no real difference between men’s and women’s razors, and that the distinction is the result of a very successful marketing campaign.

King of his domain

History tells us that King Camp Gillette (his real name!) developed the first safety razor with disposable blades. Whilst it was not the first safety razor on the market, it was the first with blades that could be removed and replaced. This change revolutionised the industry, as well as reducing the price of razors.

The other half

As these razors were no longer restricted to the elite, meaning that shaving was no longer a sign of class status amongst men, Gillette decided to target an untapped market: women.

The marketing campaigns coincided with a drastic change in women’s fashion, and it has been argued that the effect was likely circular, with fashion dictating women’s hair removal and women’s hair removal dictating fashion.

Fashion dictates fashion

During the Victorian era, women’s clothing had covered both arms and legs, so body hair was neither a concern nor noticed. However, the evolution of more daring clothing revealed the arms, the shoulders and (gasp) the armpits. This is where the clever marketing campaigns began. Prior to the 1900s, advertisements had involved describing a product rather than telling people why they needed to buy it.

Securing a market 101: target insecurities

The marketing campaigns surrounding women’s hair removal, however, centred on promoting disgust and repulsion about body hair. These campaigns preyed on women’s desire to be ‘on trend’, as well as focusing on the loneliness of unmarried or single women. In 1915, Gillette launched its Milady Razor, and thus began the market for women-specific razors.

In the 1940s and 1950s, dresses became longer again due to a shortage of nylon, and as a result sales of razors slowed. Rather than promoting shaving as an ‘on trend’ fashion necessity, the campaigns started using words such as “unsightly”, “unwanted”, “embarrassing” and “unhygienic”, thereby cementing the idea that if you had body hair, you were unclean.

A change of focus

Later marketing ploys targeted the idea of femininity and women’s empowerment. The 1960s saw the Scaredy Kit, a shaving kit aimed at women who were reluctant to shave (possibly due to a fear of razors).

Also around this time, campaigns used phrases such as “smooth like a child”, promoting the idea that being brought back to a pre-pubescent state was somehow the ultimate form of femininity.

The irony of these campaigns is the image of teenage girls shaving to show their maturity, while bringing themselves back to a pre-pubescent, childlike state.

Particularly distasteful when we look at the current marketing campaigns that focus on sexuality and desirability!

Final Thought

So what have we learnt from this? One, separate men’s and women’s razors exist simply because of a desire to corner every segment of the market, and two, the marketing campaigns targeting women were morally questionable!

Full disclosure: I’m still buying a safety razor!

The potential of youth

Previously, I wrote about defying ageing with a focus on miracle creams and lotions. The general consensus is that you cannot stop the ageing process.

However, an article recently popped up in my inbox discussing how the blood of young mice can rejuvenate older mice. Cue images of horror movies where older people harvest young people for their blood!

In reality it is far more complicated than injecting the blood of a young person. We need to know how the factors in young blood act to ‘rejuvenate’.

The potential of umbilical cord blood

In this particular study, the older mice that had received plasma from the umbilical cord blood (UCB) of young mice formed more neural connections, and showed improved memory and learning compared with control mice.

The researchers found expression of a UCB-specific protein in the hippocampus of the older mice that had received the UCB plasma.

Previously, studies had only been able to demonstrate the ‘rejuvenating’ effects of young blood on older animals through a technique called parabiosis, in which the circulatory systems of two mice are joined (ewww!). Obviously there would be ethical issues in humans, and even in animal research it is a proof-of-principle technique that is not overly practical. So knowing that we are able to identify factors in the plasma that can ‘rejuvenate’ is a big win.

UCB can also repair damaged tissue

That same year, another article demonstrated that stem cells isolated from human UCB can prevent kidney failure in rats suffering from acute kidney injury. Currently, human UCB cells are used to treat a range of diseases such as:

  • Immune deficiency
  • Leukaemias
  • Blood diseases such as aplastic anaemia and Fanconi anaemia
  • Metabolic storage diseases
  • Thalassaemia

Final thought

It is undeniable that there are properties of young blood that can ‘defy the ageing process’. In terms of medical research, it seems that these factors may be able to counteract age-related memory loss and promote repair of damaged organs. Unfortunately, UCB relies on tissues being donated, and it has obvious limitations as well as ethical considerations. At the moment these experiments are ‘proof-of-principle’, but they pave the way for more UCB factors to be isolated that may help promote tissue rejuvenation. Think repairing damaged spinal cords!

And, let’s face it, eventually the cosmetic industry will jump on this bandwagon to promise ‘age-defying’ treatments!!

Side note

Many hospitals collect human umbilical cord blood. Please consider donating your child’s umbilical cord blood and tissue for medical research or to be used in life-saving treatments.

Australia: www.abmdr.org.au/auscord

US: http://www.parentsguidecordblood.org/en/donate-cord-blood

UK: www.nhsbt.nhs.uk/cordblood

Europe: http://www.eurocord.org/eurocord-registry.php

When life gives you lemons

In the six months leading up to the end of my contract as a postdoc, and during my search for employment, I experienced a range of emotions not unlike the five stages of grief. First I was in denial, then I progressed through bargaining, anger and depression, and finally towards acceptance. Writing this piece was cathartic, but I also think that it is important to discuss the mental health of researchers…

You can grieve for a career

How is it that I can grieve for a lack of employment? In actual fact, it is more than possible; it makes sense. Grieving is a natural response to loss, and just as we can grieve for the loss of a loved one, we can grieve for a loss of self-identity, self-worth and our place in the world.

Denial

Faced with an ending contract, the prospect of a lack of financial security, and the fact that I am a foreigner with visa requirements, I threw myself head first into finding work. I somewhat naively (given I had worked for many years as a research assistant and had seen first-hand the plight of the postdoc) thought that with my 15+ years’ research experience and a decent number of first-author publications, I would be inundated with responses!

What followed was email silence. So I told myself that maybe I was applying a little too early, and that people were not interested in my applications because I was still employed. Denial. I convinced myself that these were the reasons and that I still would not have a problem finding a new job.

Bargaining

While often the bargaining stage occurs after denial, it can also occur early on in the grieving process. Bargaining often comes in the form of a promise to change an action or behaviour. For me, the bargaining stage was a period of great productivity fuelled by desperation, as well as a period of guilt. I felt guilty that I had obviously (in my mind) not taken advantage of opportunities presented to me. So I reasoned that if I invested more in X, Y and Z, I would improve my chances of employment. I undertook a part-time Masters Degree, I started my blog, and I emailed every contact I had no matter how tenuous the link. I asked people for advice and went to networking and career events.

Anger

I transitioned to the anger phase quickly. I was angry with everyone who was happy in their job. I was angry with people who had permanent contracts and took them for granted, with people who didn’t care about their work, and with those who did not take advantage of career-enhancing opportunities. I was angry at the lack of career mentorship. I cried all the time out of frustration; the slightest thing would set me off. Then there were the roadblocks to career advancement, for example being told that I was too old to do another postdoc and therefore not eligible for many fellowships (despite only being 30-something!).

Depression

This naturally progressed into the depression stage. Scholars have found that, for those facing or experiencing unemployment, doubts about self-worth, one’s abilities and place in society, and one’s capacity to provide an income and financial security are the driving forces of the depression stage. I also felt shame that I was unable to find a job as a researcher, and that I was disappointing the people who had given me opportunities.

Finally… Acceptance

However, a chance networking event showed me I could look outside the box. This helped my transition into acceptance.

What needs to be said more often is that even if you don’t work in a lab, it doesn’t mean that you aren’t a scientist. Rather than fighting against what is happening and wallowing further in self-pity, I have come to the conclusion that I am trying the best I can. It is as simple as that. My unemployment is a reflection of the status quo in academia and research and is, unfortunately, common. What we also need to remember is that there is no shame in looking for career alternatives that still utilise hard-won scientific skills!

Final Thought

This piece was originally written for the blog section of a newspaper, but they have asked me to write about something different so I decided to publish it here. Although this is a very personal piece, I think it is important to discuss how unemployment affects your mental health, and to maybe put my somewhat erratic mood swings into perspective! I didn’t write this to gain sympathy, but to put a voice to a common situation.

March for Science

On Saturday, April 22nd, I participated in the March for Science. I was expecting that, given it was an election weekend in France, not many people would march. I was proven wrong, and it was great to see that the march had a good turnout!

Even though the March for Science originated in the US in response to funding cuts for research, the sentiment has been echoed around the world. Researchers everywhere, including Europe and Australia, are facing reduced funding, reduced support and a lack of recognition for the hard work they do.

Being a scientist is not a stable, long-term career by any stretch of the imagination. Yet we persist with it out of passion, and out of the understanding that society will not move forward, nor will issues such as (gasp) climate change be tackled, if we don’t have researchers. Thus the need for continued funding.

So maybe each country, and even each researcher, had a different reason for marching on the 22nd, but I for one was glad that people were motivated to do it, and that others could see just how many scientists there actually are!

Images of the March for Science (Paris)

The Paris March for Science.
“Breaking News: Science is more effective than magic (p<0.05)”.
This may have been my favourite! “Sticking your head in the sand is not a solution to Global Warming…Your ass will still get hot!!”
“Effect size, not hand size, matters!”

Final Thought

The images shown are from the March for Science in Paris. Thanks to Rebecca Whelan and Rachel Macmaster for the photos.

The myth of the tissue-destroying white-tailed spider

Warning: if you do not like spiders, or are squeamish, maybe don’t read this post!

When I was at university, I found a red bump on my elbow that progressed to an actual hole. Many doctors’ visits and anti-inflammatory steroid injections later, I had an impressive scar and, perhaps, an impressive story.

A persistent myth

My doctor told me that the hole was the result of a bite from a white-tailed spider (Lampona cylindrata and Lampona murina), which causes tissue necrosis. Anyone in Australia will have heard about people being bitten by a white-tailed spider and ending up requiring multiple skin grafts or, in the worst-case scenario, amputation! In actual fact, spider bite-induced necrosis (necrotic arachnidism) is linked to only one spider, the brown recluse (Loxosceles reclusa), which is found in the south-central and south-eastern areas of the United States. A compound found in the spider’s venom triggers an acute immune response that results in inflammation-driven tissue destruction.

The link between the white-tailed spider and tissue necrosis is in fact an urban legend that has persisted since the 1980s.

So if the white-tailed spider doesn’t actually cause tissue necrosis, how did I get a hole in my elbow?

The jury is still out

The theories put forward focus on Mycobacterium ulcerans infection at the bite site resulting in an ulcer, or Staphylococcus aureus infection resulting in cellulitis (a bacterial skin infection).

It is unlikely that the majority of cases are the result of an M. ulcerans infection. Firstly, this type of infection is predominantly localised to tropical areas. Secondly, studies have shown that white-tailed spider venom does not carry this bacterium.

The second theory, that the tissue necrosis results from an S. aureus infection causing cellulitis, is more likely. I couldn’t find a straightforward answer, but it seems that most researchers and clinicians feel that the S. aureus infection occurs when the bacteria enter at a site of broken skin, i.e. a bite site that someone has scratched.

Final thought

So, despite a lack of evidence linking the white-tailed spider to necrotic arachnidism, the myth persists. I mean, what is going to have viewers glued to their TV or clicking on links:

“I lost my leg to a spider bite!” or, “I scratched a spider bite and now I have a bacterial infection!”

??

Tip: don’t enter ‘tissue ulcer’ into Google Images if you are of a weak constitution…!

Sources

This post was inspired by a recent post in Australian Geographic.

https://www.researchgate.net/profile/Scott_Weinstein2/publication/263096757_A_phoenix_of_clinical_toxinology_White-tailed_spider_Lampona_spp_bites_A_case_report_and_review_of_medical_significance/links/546d38d50cf2a7492c55b3df/A-phoenix-of-clinical-toxinology-White-tailed-spider-Lampona-spp-bites-A-case-report-and-review-of-medical-significance.pdf

http://www.ijam-web.org/article.asp?issn=2455-5568;year=2016;volume=2;issue=2;spage=256;epage=259;aulast=Fegley

https://www.researchgate.net/profile/Maria_Lima4/publication/302556178_Phoneutria_nigriventer_Venom_and_Toxins_A_Review/links/5785002008ae36ad40a4b43d.pdf#page=45

http://journals.plos.org/plosntds/article?id=10.1371/journal.pntd.0002770

https://www.mja.com.au/system/files/issues/186_02_150107/joh10634_fm.pdf

Disclaimer: the image used in this post is of the common ‘jumping spider’ and is not a white-tailed spider.

All watched over by machines of loving grace

I recently saw a documentary at the Palais de Tokyo as part of their exhibition entitled “All watched over by machines of loving grace.” The documentary, by BBC journalist Adam Curtis, was a fascinating insight into systems theory, cybernetics and ecology.

So of course, I took to the trusty scholarly search engines to find out more.

A (vicious) circle

Early scholars of the movement described nature as an electrical circuit, with amplifiers and dampeners of the natural order. In terms of ecology, systems theory described nature as a self-governing machine that responded to changes in the environment and adjusted to maintain a natural balance. In essence, an ordered cycle of life.

A systems theory cycle

This is called a feedback loop, i.e. there is a cause and an effect. Following on from this, another factor can then influence the original input.

Feedback loop

Cybernetics

No, I’m not talking about robots!

Cybernetics is at the heart of systems theory, describing nature as a system that can be controlled and managed. Cybernetics considers nature in the bigger picture, looking at the response of the environment to changes.

Cybernetics introduced the concept of ‘negative feedback’: when the output that feeds back into the network is out of equilibrium, it is reduced (dampened) in order to maintain the steady state.

Negative feedback loop
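
As a rough illustration of the idea (not from the documentary, and with made-up numbers), here is a minimal Python sketch of a negative feedback loop: the output is compared with an equilibrium set point and the deviation is dampened on each cycle, so the system settles back into a steady state.

```python
# A minimal sketch of a negative feedback loop. The set point, gain and
# starting state are assumed values chosen purely for illustration.
SET_POINT = 100.0   # the equilibrium the system tries to maintain (assumed)
GAIN = 0.5          # how strongly the feedback dampens the deviation (assumed)

state = 140.0       # start the system out of equilibrium
for step in range(10):
    error = state - SET_POINT   # the output fed back into the network
    state -= GAIN * error       # negative feedback reduces the deviation
    print(f"step {step}: state = {state:.1f}")

# The state converges towards 100. With the feedback removed (GAIN = 0),
# the disturbance would simply persist.
```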

Earth as a spaceship

Cybernetics spawned the early environmental movement in the 1970s, based on the modelling of ecological feedback loops. Scholars and activists realised that if a steady state of ecological systems could not be maintained, irreversible damage or a catastrophe would occur.

This produced the idea of the Earth as a spaceship: a self-contained object requiring all systems to exist and work in harmony in order to maintain a sustainable environment within the ‘spaceship’. If not, water, air or food would be compromised. In fact, cybernetics also contributed to the development of the Doomsday Clock, a metaphorical countdown to the end of the world based on the (dis)equilibrium between the population and our environment.

It’s not just science fiction

Systems theory feedback loops are used in everything from psychology (understanding people’s responses to the environment around them), to machine learning and computers, to the development of the internet.

Final Thought

The most fascinating focus of the documentary was the realisation that man’s reliance on machines to ‘improve’ our quality of life and increase productivity in industry has destroyed the idea of an ecological cybernetic system. The early theorists failed to anticipate that the negative feedback loop would not adjust to a rapidly changing human population, one that was in disequilibrium with its environment. This can be seen in the rapid extinction of animal and plant species, as well as in the wealth of some countries versus the absolute poverty of their neighbours.

It really was such an interesting documentary, and I urge you all to watch it (link included in first section).

Sources

Bernard C. Patten and Eugene P. Odum. The cybernetic nature of ecosystems. The American Naturalist, Vol. 118, No. 6 (Dec., 1981), pp. 886-895

http://www.its.dept.uncg.edu/hdf/facultystaff/Tudge/Bronfenbrenner%201995.pdf

http://maft.dept.uncg.edu/hdf/facultystaff/Tudge/Bronfenbrenner%201977.pdf

https://staff.washington.edu/jhannah/geog270aut07/readings/population/Ehrlich%20-%20Population%20Bomb%20Ch1.pdf

The Communication Series: Critical Theory

So far I have discussed the post-modernist approach from the feminist critique perspective, with a focus on the use of language and communication to dominate. However, the emerging theory also includes the critical approach to understanding communication. The goal of this theory is to produce communication that is free from domination and that meets the needs of all individuals.

Am I too critical?

This means we could say that the aim of critical theory is to deconstruct structures of communication (i.e. in organisations) so that domination cannot be used to control people. The theory holds that communicative domination is not just a coercive force; it can also be found in attitudes and culture. For example, how a company sees itself and its employees can also be a form of domination.

Dominating culture and attitudes

One study that shows an example of creating a culture through ‘dominating’ communication looked at the removal of temporary workers who would normally cover for absent employees. The remaining workers were forced to work harder to make up for their teammate’s absence. Instead of creating a workplace where the employees had close bonds, this created tension and negativity when they interacted with both their employers and the absent employees. And in the absent person, it created feelings of guilt and a sense of betrayal. The attitude and culture around being “absent” was “dominating” the team.

Why have communication theories?

Although this is not a conventional form of domination, this use of the critical perspective showed that domination can be pervasive within an organisation, in many different forms.

So in this example, the questions would have been: “Is there domination?”, “Who is dominating?” and “How is it affecting attitude and culture?” Once these questions have been answered, the follow-up questions of “How can we change the communication?” and “What are we trying to achieve?” should be asked.

The aim of these theories is to understand how communication occurs and the effect that it has. For organisations, understanding how their communication does and doesn’t work (by using the theories) allows communication to be improved, changing the culture and structure of a workplace.

Sources

Agger, Ben. (1991). Annual review of sociology, 105-131.

Alvesson, Mats, & Deetz, S. (2006). The Sage handbook of organization studies, 255.

Buzzanell, Patrice M, & Liu, Meina. (2005). Journal of Applied Communication Research, 33(1), 1-25.

Cheney, George, Christensen, Lars Thøger, Zorn Jr, Theodore E, & Ganesh, Shiv. (2010). Organizational communication in an age of globalization: Issues, reflections, practices: Waveland Press.

Cooper, Robert, & Burrell, Gibson. (1988).  Organization studies, 9(1), 91-112.

Deetz, Stanley A. (1982).  Western Journal of Communication (Includes Communication Reports), 46(2), 131-149.

Harvey, Michael, Speier, Cheri, & Novecevic, Milorad M. (2001). International Journal of Human Resource Management, 12(6), 898-915.

Johansson, Catrin, & Heide, Mats. (2008). Corporate Communications: An International Journal, 13(3), 288-305. doi: 10.1108/13563280810893661

Mehta, Rajiv, Larsen, Trina, Rosenbloom, Bert, & Ganitsky, Joseph. (2006). Industrial Marketing Management, 35(2), 156-165.

Mumby, Dennis K., & Stohl, Cynthia. (1991). Discourse & Society, 2(3), 313-332. doi: 10.1177/0957926591002003004

Papa, Michael J, Singhal, Arvind, Ghanekar, Dattatray V, & Papa, Wendy H. (2000). Communication Theory, 10(1), 90-123.

Peng, Wei, & Litteljohn, David. (2001). International Journal of Contemporary Hospitality Management, 13(7), 360-363.

Shockley-Zalabak, Pamela S. (2012). Fundamentals of organizational communication (8th ed., pp. 27-68). Boston: Pearson/Allyn and Bacon.

Vaara, Eero, & Tienar, Janne. (2008).  Academy of Management Review, 33(4), 985-993.

van Vuuren, Mark, & Elving, Wim JL. (2008). Corporate Communications: An International Journal, 13(3), 349-359.