
What have we learnt from Facebook recently?

This blog post is by Nikhol Hui from our Applied Innovation team.

Facebook founder Mark Zuckerberg has famously said that the social media site is firmly a technology company, not a media one… but is this really the case when the platform promotes content and news articles, often tailored to an individual based on their past activity? That question has made large waves in the news agenda over the past few weeks. Controversy surrounding the platform’s influence over voters in the US election, coupled with rumours of a secret task force tackling fake news, has prompted a discussion around Facebook’s role and responsibility in society.

The backstory: earlier this year, the company fired the group of staff whose job it was to populate the ‘Trending News’ sidebar. Instead, Facebook decided to rely on an ‘unbiased’ algorithm to choose which news stories to surface. That algorithm uses hundreds of variables to determine whether or not a post should appear in a user’s news feed, predicting whether a given user will like, click, comment, share, hide or even mark a post as spam. It predicts each of these outcomes with a degree of confidence – the relevancy score – does this for every candidate post, then ranks them and displays them in that order. Essentially, the post at the top of the news feed is the one you are most likely to engage with… supposedly. This raises a discussion about the editorial responsibility Facebook should carry if it is deciding what information people see.
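
To make that ranking idea a little more concrete, here is a minimal sketch in Python of how predicted engagement outcomes might be rolled up into a single relevancy score and used to order a feed. The outcome fields, weights and numbers below are invented purely for illustration – Facebook’s real model uses hundreds of signals and its weights are not public.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    # Predicted probabilities of each engagement outcome for one user.
    # The real system reportedly uses hundreds of signals; these few
    # fields are illustrative only.
    p_like: float
    p_click: float
    p_comment: float
    p_share: float
    p_hide: float
    p_spam: float

# Hypothetical weights: positive engagement raises the score,
# hiding a post or marking it as spam lowers it.
WEIGHTS = {
    "p_like": 1.0,
    "p_click": 0.5,
    "p_comment": 2.0,
    "p_share": 3.0,
    "p_hide": -2.0,
    "p_spam": -5.0,
}

def relevancy_score(post: Post) -> float:
    """Combine the predicted outcomes into one relevancy score."""
    return sum(weight * getattr(post, field) for field, weight in WEIGHTS.items())

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order candidate posts so the highest-scoring one appears first."""
    return sorted(posts, key=relevancy_score, reverse=True)

if __name__ == "__main__":
    candidates = [
        Post("a", p_like=0.30, p_click=0.50, p_comment=0.05,
             p_share=0.02, p_hide=0.01, p_spam=0.00),
        Post("b", p_like=0.10, p_click=0.20, p_comment=0.01,
             p_share=0.01, p_hide=0.20, p_spam=0.05),
    ]
    for post in rank_feed(candidates):
        print(post.post_id, round(relevancy_score(post), 3))
```

The point of the sketch is simply that the feed is an ordering problem: every post gets a score from predicted behaviour, and whatever scores highest is what you see first.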

For us in comms, this triggers an interesting discussion about how Facebook is being used today, and about the way people consume content. We need to understand how and why people engage with particular types of content. Are we inherently programmed to engage with those who agree with what we already believe? Scrutiny of Facebook’s methods of distributing news on the site continues, raising questions about the responsibility it has as an organisation with such enormous reach, and about the unintended consequences of its ‘unbiased’ algorithms reinforcing bias within digital microcosms – and, in some cases, even propagating untruths and ‘fake news’.

For those of us charged with driving and changing opinion, this discussion highlights how hard it is to get issues, messages and stories to leap between communities, particularly around contentious or challenging topics. We may need to identify the people or platforms that sit between communities which have already formed their opinions. It’s time to look beyond traditional sources of influence – the media, which arguably failed to sway the outcome of the US election despite an almost universally pro-Clinton stance – to other nodes of influence deeper within our target audiences’ communities.

At the very least, media strategies need to account for the way content amplifies within different communities and ensure every audience is covered. This is as true today as it has always been, despite the rapid ascent of social media – as this story has shown. If Facebook could find a way to ensure news is shared across communities, not just among like-minded friends within your social media echo chamber, could it become the ‘connecting platform’ for media content?

Until that happens, the fact is that people tend to share posts their friends have shared; they like what their friends like. Creating content that people in different communities will not only react to, but can relate to, could help spark something special for brands. It will be important to understand not just what that content is, but how to propagate it to drive real reach and change across an increasingly divided society.
