One of Mark Zuckerberg's top lieutenants was in Sydney this week, as Facebook launched a charm offensive on the Australian media industry.
Adam Mosseri, a ten-year veteran of the social media behemoth and the man in charge of one of its most critical features, the news feed, was in town, offering olive branches to Australia's top publishers, including executives from Fairfax Media and News Corp.
Facebook is not the most popular entity on the planet among publishers, who - despite benefiting from its massive reach - are also being hurt by the network's growing dominance of the digital advertising market, alongside Google.
The good news for the local media industry was that Mosseri came bearing gifts: Facebook is testing new tools designed to weed out fake news and clickbait and to help publishers grow their audiences, and in theory even their revenues, including through subscriptions.
The bad news is that his visit suggests the $US500 billion media disrupter is taking business in this part of the world much more seriously.
Facebook has a low corporate profile in Australia, compared to its rival in the digital ad duopoly.
Senior executives from Google Australia such as former CEOs Maile Carnegie (now at ANZ) and Nick Leeder (now in France), current CEO Jason Pellegrino and engineering boss Alan Noble have been on the national business radar for a while now.
Google's Sydney office has also carved out an important position in the universe of Alphabet, its parent company. It was where one of the search giant's most important products - Google Maps - was conceived and built.
In contrast, Facebook has been much more inconspicuous here. Until now.
'Bad for business'
Mosseri is the most senior Facebook executive to visit these shores to date, and his arrival came amid a global backlash against the two giants of the internet.
Facebook has been under fire for allowing fake news to proliferate on its platform during the US election, and after.
This week, Roger McNamee, an early investor in Facebook and Google, warned they were "causing enormous harm to society, to democracy and to the economy."
None of this has had any impact on either company's profits yet, but Facebook says it wants to fix the problem anyway.
"It is bad for business," Mosseri told journalists at a briefing. "It [fake news] erodes trust in our platform, not only with people, but with publishers and with advertisers. We are an ad-based business, and that can be really, really painful."
Fake news shot to prominence during the US presidential election campaign when sites masquerading as authentic news outlets published completely untrue stories - that the Pope had endorsed Donald Trump, or that the Democrats were operating a human trafficking ring from a DC pizza joint - that subsequently spread like wildfire on Facebook.
The term has since been corrupted, after being adopted by politicians here and in the US to describe reports they disagree with.
Living in 'filter bubbles'
The rise of fake news exposed a troubling flaw inherent to Facebook.
The news feed is designed to show content you will engage with, so you spend more time on Facebook, so it can serve you more ads. An algorithm chooses what to show you, based on what you have previously consumed or "liked", and what your friends (and their friends) have engaged with.
Throw fake stories into a platform like this, with 2 billion users, combine them with today's hyper-partisan environment - the fact that people increasingly live in "filter bubbles", only seeking out information that reinforces their existing views - and you have a recipe for widespread misinformation on an unprecedented scale.
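The feedback loop described above can be caricatured in a few lines of code. This is a toy sketch only - Facebook's real ranking system is vastly more complex and not public, and every name, field and weight below is invented for illustration:

```python
# Toy model of engagement-based feed ranking. All names and weights
# are invented; this is not Facebook's actual algorithm.

def engagement_score(post, user_history):
    """Score a post by how much it overlaps with topics the user
    has previously engaged with, weighted by its past engagement."""
    overlap = len(set(post["topics"]) & set(user_history))
    return overlap * post["prior_engagement"]

def rank_feed(posts, user_history):
    """Order candidate posts so the most 'engaging' appear first."""
    return sorted(posts,
                  key=lambda p: engagement_score(p, user_history),
                  reverse=True)

posts = [
    {"id": 1, "topics": ["politics"], "prior_engagement": 9},
    {"id": 2, "topics": ["sport"], "prior_engagement": 5},
    {"id": 3, "topics": ["politics", "sport"], "prior_engagement": 2},
]
user_history = ["politics"]  # topics this user has "liked" before

feed = rank_feed(posts, user_history)
print([p["id"] for p in feed])  # → [1, 3, 2]
```

Because content aligned with what the user already engages with always scores highest, each round of ranking narrows the feed a little further - the mechanism behind the "filter bubble".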
For its part, Facebook believes the motivation for most fake news is financial rather than ideological. Some repeat offenders in this bizarre cottage industry, operating from places far from the US such as the Balkans, run the same stories under different headlines aimed at opposing audiences. What they want is page views to sell cheap ads against (ads often served by Google).
And the social network doesn't buy into the claims that fake news affected the US election result, either. "Did false news swing the election one way or another? I don't think so," said Mosseri. "There wasn't enough of it, or enough exposure, for that to make any sense mathematically."
People are the answer
But Facebook still wants to eradicate it, and it's testing new features and processes designed to better identify fake stories and stop them from going viral. These include the ability for users to flag disputed stories and alerts to discourage people from sharing problematic content.
A big part of the proposed solution will be human. Facebook says it will employ machine learning to identify fake news in parts of the world and refer this content to human fact checkers.
Mosseri says there has been a decline in false news on Facebook since the US election, but he's unsure whether that is just because the campaign is over.
According to the Facebook executive, fake news tends to spike in places where there is an election.
It wasn't a problem during Australia's last federal election, and as we head into what looks like an extremely divisive plebiscite over marriage equality, let's hope it stays that way.