If you read this Manchester public relations firm’s blog on the regular then chances are our Friday media wrap, The Blagger’s Blog, hasn’t gone unnoticed. And, if you tuned in last week, there’s even more likelihood that a reference to a certain ‘covert experiment’ conducted by the world’s most talked-about social network set a few alarm bells ringing.
So many, in fact, that we’ve deemed it worthy of a standalone social media agency post.
As per those headlines, then, Facebook selected almost 700,000 users at random for a test to see whether or not content could emotionally manipulate the public. These poor, unsuspecting users were exposed to either an abnormally high number of positive stories in their News Feed, or an equally abnormal number of negative ones. Their behaviour during this period was monitored to see how they reacted to the tampered feeds, and whether the exposure had any impact on their overall emotional state.
The experiment itself actually dates back to 2012, but the results were only published in the Proceedings of the National Academy of Sciences in June 2014, which is why it’s only within the last few weeks that the proverbial effluence has hit the fan with full force, prompting the network to issue a sort-of apology in which it claimed there was no unnecessary collection of personal data as a result of the observations.
What’s more, Katherine Sledge Moore, a psychology professor at Elmhurst College in Illinois, added to the get-out clauses, and is quoted on bbc.co.uk as saying: “Based on what Facebook does with their newsfeed all of the time and based on what we’ve agreed to by joining Facebook, this study really isn’t that out of the ordinary…
“…The results are not even that alarming or exciting.”
In many ways she has a fair point.
There are abundant algorithms at work behind the scenes of the Big Blue ensuring we only receive content and posts deemed ‘relevant’ to our interests and social circles, a judgement based on likes, group memberships, friends and probably a number of other criteria. Yet we’d say this still raises questions as to whether presumptions and assumptions can actually benefit end consumers, or whether they are indeed the maternal parents of all cock-ups.
Consider the detritus that ensued after Apple took it upon itself to push a free copy of the new U2 album into the library of every iTunes user. Setting aside the fact that said Irish platinum-sellers lost any credibility they had years ago, and now represent the creative tour de force one associates with a moss-covered rock, the decision seems to have been founded on the ability to create a shiny advert promoting the deal, rather than on solid research into whether most people would actually want to own a copy of yet another run-of-the-mill release from Bono et al.
Needless to say, though, Facebook has taken this one step further: first by presuming that this kind of emotional manipulation would be accepted by the public, and second by deciding what we are exposed to on the network each and every day, both before and after the test itself. While we can’t imagine what our News Feeds would look like if the actions of all our acquaintances and associations were visible (an educated guess being ‘messy’), it’s possible to argue that this approach goes against the modern mantra of personal choice for consumers, regardless of whether they are consuming information, electrical products or food.
Perhaps the most fascinating aspect of all this, though, is that if any other brand had conducted a secret study of this kind we would expect a huge fallout, and possibly even a boycott by those unwittingly involved. Instead, beyond the obvious outrage, there has been no significant fall in daily log-ins. That in itself suggests that, when it comes to the digital powers that be, We The People are becoming ever more passive about how influential they are, ignoring the signs that, should the moral or ethical rot ever set in, the impact on our lives could and would be significant. That goes for everything from our psychological frames of mind to our ability to source current affairs reports from a comprehensive and unbiased range of outlets, and with it the chance of forming a relatively neutral understanding of a given situation.
Or maybe this is all too Orwellian?