Buried beneath this week's headlines about parental consent and age verification, the Online Privacy Bill signalled the potential for a significant change in the government's approach to regulating Big Tech. And it's long overdue.
Over the last few years, Australian regulation has leaned heavily on content moderation, demanding the speedy takedown of certain types of content. But this is neither a systemic solution nor commensurate with the scale of the problem. It essentially dooms regulators to an endless game of content 'whack-a-mole'.
Instead, regulation needs to focus on how platforms work: on their systems, features and algorithms, all of which create and amplify risks. As yet another scandalous document leaked this week described it: "The mechanics of (Facebook's) platform are not neutral." None of them are.
Elements of the Online Privacy Bill bravely square up to these mechanics by paving the way for a code that would regulate how platforms can and cannot use data - the fuel of their risky machines. It stipulates that, at least for young people, social media platforms can only use data in ways demonstrated to be in children's best interests.
For example, social media's algorithms - trained to serve kids all sorts of awful content just to maximise interactions - aren't currently well regulated. Platforms can satisfy Australian regulation simply by quickly taking down content flagged by regulators. If this bill passes, social media platforms may instead need to demonstrate that their algorithms, built and trained on young people's data, function in children's best interests in the first place. This could be a very welcome shift towards regulating the systems and operations of tech platforms.
This shift is also part of a global trend in tech regulation. Governments are increasingly trying to regulate 'upstream' systems and operations, before online harms happen. Regulations are taking aim at data collection and use and at business purchases and acquisitions, and are creating obligations for a 'duty of care' towards users. All of these upstream approaches move beyond specific content and attempt to create accountability for the risks and harms of platforms.
But the Online Privacy Bill has a long way to go before it delivers an online privacy code that realises this massive potential (even if, for now, only for young people). The legal structures that create this code stipulate that the information commissioner must draft it alongside regulated industries - aka the very social media platforms we're trying to protect kids from in the first place. Under Australian law, they have to work so closely together that industry in fact gets the first opportunity to draft the code.
This is obviously a giant fly in the ointment, and while the attorney-general's office was keen to remind us there are still two potential code drafters in the mix (industry or the information commissioner), the information commissioner herself was more categorical, stating: "The code will be developed by Industry."
This co-regulatory approach - letting industry draft the regulations that regulators then enforce - just won't work for Big Tech. Where an industry is full of good corporate citizens acting in good faith, it may be a great way forward. But as Frances Haugen pointed out in Canberra last week, Big Tech is not that and should not be trusted. They are freeloading on the goodwill of Australian regulators, hard earned by other, better-behaved industries.
For proof, we need look no further than our woefully inadequate mis- and disinformation code, drafted by Big Tech's industry body Digi (co-founded by Facebook). ACMA, the regulator that oversees this code, did not pull any punches, saying the code failed to meet expectations (and it continues to). In the wake of this criticism, Digi announced an oversight board of three whole people - a laughable attempt to shore up a wholly inadequate code.
If the social media industry drafts this online privacy code, it, too, will be woefully inadequate. Facebook is absolutely prepared to serve young people harmful products in pursuit of profits. Asking them, or their industry body, to pen the code would be deeply irresponsible.
As Australia pivots towards upstream, systemic tech regulation, we also need to pivot away from self- and co-regulatory models. Our light-touch, downstream focus has allowed Big Tech to profit from risk for too long.