
What does Facebook really know about you?

By Peter Greste, Anne Davies and Janine Cohen for Four Corners.

If you are reading this, the chances are you also have a Facebook account.

Of the 20 million Australians who use the internet, the vast majority of us also use Facebook — with over 16 million Australian accounts.

Around the world, the social media platform has almost 2 billion users — more than a quarter of everybody on the planet, and that includes those either too young to go online, or too poor to have access.

At the same time, more and more of us are spending more and more time on Facebook — 1.7 hours a day, on average.

Facebook has turned that into a business earning more than $US16 for every user around the world.

That translated into a staggering profit of more than $US10 billion last year — 177 per cent more than in 2015.

"They are arguably the most successful company in human history at just gathering people's time and turning that time into money," New York Times journalist John Herrman told Four Corners.

Zuckerberg's vision for the future

So when Mark Zuckerberg, Facebook's co-founder and CEO, lays out his plans for the future, we need to take notice.

Facebook declined to take part in the Four Corners program, which airs this evening, but in February, Mr Zuckerberg posted a letter almost 6,000 words long on his own Facebook page, outlining his vision.

It was an extraordinary document that placed Facebook at the centre of just about everything we do.

One commentator said it turned Mr Zuckerberg into "the president of the internet".

"History is the story of how we've learned to come together in ever greater numbers — from tribes to cities to nations," Mr Zuckerberg wrote.

"At each step, we built social infrastructure like communities, media and governments to empower us to achieve things we couldn't on our own ... Facebook stands for bringing us closer together and building a global community."

In Mr Zuckerberg's vision, Facebook can help detect terrorist attacks, and locate people who might be caught up in them.

It can help develop real-world communities, improve "civic engagement", get more people to vote, or even detect when someone is considering taking their own life.

For analysts like journalist John Herrman, the letter was a response to a growing body of criticism about the power and influence of the online colossus.

"It's a document that really felt like an attempt to take some responsibility but it wasn't apologetic. It was bold and it seems to suggest that the solution to Facebook's problems is more Facebook," he said.

What Herrman means is, whenever there's been a complaint about Facebook — the way it helped turbo-charge fake news during the US election for example, or how some young people have used its live video-streaming service to broadcast their own suicides — Mr Zuckerberg has suggested fixes that involve ever more engagement with the site.

In the case of fake news, it's piloting a way to identify stories that have been flagged by independent fact-checkers; in the case of suicides, it's working out how to use artificial intelligence to identify anybody contemplating self-harm and to alert their friends to respond.

Facebook builds a picture of who we are for advertisers

Facebook's monitoring isn't limited to what we do on their site, though.

Even if we are not logged on to our accounts, the company follows much of what we do online.

Any business that wants to advertise on Facebook can embed a tracking tool, called Facebook Pixel, in their website.

It's a piece of code that sends detailed information about what we did on the website to Facebook. The business can then ask Facebook to send ads to any user who might have visited their site.

The technology itself is not unusual. Cookies are a routine part of many websites, and in the case of Pixel, it helps advertisers reach people who've shown an interest in their businesses.
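Stripped of the advertising machinery, a tracking pixel is just a tiny resource request that carries identifying information back to a central server, which files each visit under the same user. The sketch below illustrates that mechanic in Python — the URL, parameter names and `user-42` cookie are all illustrative, not Facebook's actual implementation:

```python
from urllib.parse import urlencode, urlparse, parse_qs

def pixel_url(event, page, user_cookie):
    """Build the tiny image request a tracking pixel fires.
    The browser attaches the user's identifying cookie, tying
    this page visit to a known account."""
    params = {"ev": event, "dl": page, "uid": user_cookie}
    return "https://tracker.example/tr?" + urlencode(params)

def record_visit(log, url):
    """Tracker side: parse the pixel request and file the visit
    under the user's cookie."""
    qs = parse_qs(urlparse(url).query)
    uid = qs["uid"][0]
    log.setdefault(uid, []).append((qs["ev"][0], qs["dl"][0]))
    return log

log = {}
record_visit(log, pixel_url("PageView", "https://shoes.example/boots", "user-42"))
record_visit(log, pixel_url("AddToCart", "https://shoes.example/cart", "user-42"))
# The tracker now holds a cross-site browsing trail keyed to one user.
```

Because the same cookie travels with every such request, visits to unrelated websites all land in the same per-user log — which is what makes the cross-site picture possible.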

But because Facebook collates data from disparate websites and associates it with user profiles, the company gains a deep view of our browsing history that helps build a nuanced picture of who we are and what we like — even what we believe and how we feel.

"Facebook has sentiment analysis tools," said Adam Helfgott, a digital marketing analyst who runs New York-based company MadHive.

"As we are writing on Facebook, or talking to our friends, they are very well aware of our sentiment, our mood, so it is able to put that data together with what it knows about our likes, our friends and so on, to give us things that are likely to keep us online for longer."

That information is incredibly useful for advertisers wanting to locate potential buyers with increasingly sophisticated precision.

But it's also a boon for anybody in the business of persuasion, including politicians, who have been leveraging Facebook's marketing tools to reach voters with messages more closely tailored to their beliefs than ever before.

In a statement to Four Corners, the company said it placed the privacy of its users at the core of everything it did.

"We work to keep people informed about privacy from they day they sign up for Facebook and beyond … We're focused on helping people understand how to use the tools we've created so they can make informed decisions and control their experience," the statement said.

What does our data say about us?

From marketing to communications, to politics and even commerce, Facebook is changing the world as it grows.

And yet while Facebook becomes ever more deeply embedded in our own lives, it is opaque about what it does with our data.

Rebecca MacKinnon runs a group called Ranking Digital Rights, which monitors the openness and accountability of each of the big digital companies. Facebook is not one of their best performers.

"Facebook has a responsibility to inform people of what is happening to their data, so there can be a conversation with their community about whether people agree they are using it appropriately," she said.

"Right now they're not providing enough information for that conversation to take place."

In 2014, Facebook researchers published the results of a study, in which they deliberately skewed the "mood" of the news feeds of almost 700,000 users.

They tweaked the algorithms to give some users predominantly negative posts, and others generally more positive news to see just how much they could influence people. They then monitored users' posts to see if their moods shifted.

The company later apologised for the study, admitting its researchers had crossed a line, but the results were still revealing: they confirmed the more negative our news feeds, the worse we tend to feel; and the more positive, the happier we become.

The shift was quite small for each user but statistically significant overall, proving the social media platform is so deeply embedded in our lives that it has an impact not just on what we see of the world, but also how we feel about it.
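How can a "quite small" shift per user still be statistically significant? Because with hundreds of thousands of users, even a tiny average difference stands far above the noise. The sketch below makes that concrete with a two-sample z statistic; the means, standard deviation and group sizes are illustrative numbers, not the study's actual figures:

```python
import math

def z_score(mean_a, mean_b, sd, n):
    """Two-sample z statistic for two equal-sized groups sharing a
    common standard deviation. Larger |z| means a less plausible
    fluke; |z| > ~2 is conventionally 'significant'."""
    standard_error = sd * math.sqrt(2.0 / n)
    return (mean_a - mean_b) / standard_error

# A difference of 0.02 "positive words per post" is tiny...
small = z_score(5.02, 5.00, sd=1.0, n=100)      # ~0.14: indistinguishable from noise
# ...but across ~350,000 users per group it is unmistakable.
large = z_score(5.02, 5.00, sd=1.0, n=350_000)  # ~8.4: highly significant
```

The effect size never changes; only the sample size does — which is exactly why an experiment run on almost 700,000 news feeds could detect mood shifts far too subtle to notice in any one person.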

Psychology Professor Ethan Kross from the University of Michigan said these networks were "now a part of our life".

"What makes them so interesting is how quickly they've transformed the way human beings operate, how we interact with one another and so the real challenge is to understand how to navigate the networks optimally."

You can watch the full program, Cracking the Code: What Facebook Really Knows About You, at 8:30pm on Monday on ABC TV and iView.

This post originally appeared on ABC News.


© 2017 Australian Broadcasting Corporation. All rights reserved.
