Thursday, 3 July 2014

Don't mess with our emotions, Facebook!

There are no emoticons to express how I feel about Facebook this week. It certainly isn't happy; there's a touch of bemusement and a large dash of fury, but the best way to describe my feelings is utter bewilderment that a major global corporation can treat its customers' emotions and well-being so carelessly.

So why am I so hot under the collar? Yesterday my business students at Anglia Ruskin University got 'the lecture', and anyone who has studied market research or marketing with me will know the content of this heartfelt declaration. It is my responsibility as a CIM Chartered Marketer and member of the Market Research Society to send my students out into the commercial world with a strong grasp of the ethics involved in researching customer behaviour. We are governed by legislation and should follow strict principles of research ethics to ensure that no one is ever again harmed through research. The principles of the MRS Code of Conduct and all other research codes stem from the 1947 Nuremberg Code and the 1964 Declaration of Helsinki: participation in research must be voluntary, the rights and well-being of participants must be protected, no one should be harmed or adversely affected, and researchers must be transparent about the subject and purpose of data collection.

We follow these codes because it matters, because people died and were experimented upon, because we are human, because we know better. So where do Facebook get off thinking this doesn't apply to them?

If you've missed the news (read The Guardian's article here), it has just been revealed via a published research article that Facebook manipulated around 700,000 user feeds over one week in 2012, as an experiment to see whether hiding emotional words, without users' knowledge, would affect the status updates and likes they then posted. This is now being investigated by the UK's Information Commissioner's Office. The Guardian reported that Monika Bickert, Head of Policy, claimed these experiments were necessary to enable Facebook to remain innovative and continue to improve its platform. She reportedly said, "It's concerning when we see legislation that could possibly stifle that sort of creativity and that innovation."

I think it's more concerning that Facebook didn't consider how religiously some users follow status updates and how much those updates can dictate their emotional state. Cyber bullying is rife. Some users may suffer from anxiety, depression or other serious conditions which mean that every emotional word is interpreted to the nth degree. If that emotion was dampened or enhanced through manipulation of the user's feed, it could have had a devastating effect.

There are plenty of news articles online predicting that Facebook will decline by 2017, and you can understand why the company needs to constantly develop and innovate. Good on them for recognising that market research plays a part in their new product development process, but doing it by stealth, without consideration of the ethical and moral issues, is wrong, and the Information Commissioner's Office may feel it's criminal.
