Hughes said “it’s time to break up Facebook” and “the government must hold Mark accountable”. He was referring to the huge power Zuckerberg holds through controlling the algorithms that keep Facebook – and the more recently acquired platforms Instagram and WhatsApp – ticking over. Those algorithms put Facebook’s vast body of user data to work.
I know we don’t exactly have the strongest reputation on privacy right now, to put it lightly.
Facebook’s business model is built on harvesting platform data about its users, crunching that to generate behavioural inferences like “divorced, male, no children, interested in weight loss”, and then selling this package to advertisers.
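To make this concrete, here is a toy sketch of how raw platform signals might be turned into advertiser-facing segments. This is an illustration only, not Facebook's actual system: the field names, page names and rules are all invented.

```python
# Toy sketch (NOT Facebook's real pipeline) of deriving saleable
# audience segments from raw platform signals. All names are invented.
from dataclasses import dataclass, field


@dataclass
class UserSignals:
    pages_liked: set[str] = field(default_factory=set)
    relationship_status: str = "unknown"
    gender: str = "unknown"


def infer_segments(signals: UserSignals) -> set[str]:
    """Apply simple hand-written rules to derive advertiser-facing labels."""
    segments = set()
    if signals.relationship_status == "divorced":
        segments.add("divorced")
    if signals.gender == "male":
        segments.add("male")
    # Interest inference: liking diet/fitness pages implies "weight loss".
    if signals.pages_liked & {"KetoRecipes", "GymLife", "RunningTips"}:
        segments.add("interested in weight loss")
    return segments


user = UserSignals(pages_liked={"GymLife"},
                   relationship_status="divorced", gender="male")
print(infer_segments(user))
# e.g. {'divorced', 'male', 'interested in weight loss'}
```

Real systems infer such labels statistically from millions of signals rather than hand-written rules, but the output – a package of behavioural labels sold to advertisers – is the same in kind.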
Technology scholar Shoshana Zuboff calls the process of collecting and selling user data “surveillance capitalism”.
That’s because developers are not the customer – nor are the users who are clicking on like buttons or buying yoga pants. Facebook’s customers are advertisers.
Facebook sells one product: a powerful capacity to personalise and target ads that is unmatched by any other platform. This generated US$16.9 billion in revenue in the last quarter of 2018.
It seems reasonable to assume it’s going to do everything it can to protect its ability to keep collecting the raw material for that profit.
But recent questions put to Facebook by US Senator Josh Hawley reveal that Facebook is still not willing or able to share its plans on privacy relating to metadata collection and use.
In response to the senator, Kevin Martin, Facebook’s Vice President of US Public Policy, said:
[…] there are still many open questions about what metadata we will retain and how it may be used. We’ve committed to consult safety and privacy experts, law enforcement, and governments on the best way forward.
This year’s election will be the first in Australia where the parties will be advertising more on social and digital platforms than traditional media (TV, radio, newspapers and magazines).
There are a few key reasons for this. First, social media is far cheaper, sometimes costing as little as a few cents per click. Unlike heritage media, digital and social advertising is extremely targeted, and can be done in the “dark”, so your opponents may not even be aware of the message you are pushing out.
Digital and social advertising can also be shared or even created by users themselves, further increasing the reach of a party’s messaging. This gets around the Australian Electoral Commission rules on advertising – technically they are not ads since no party is paying for them to be shared on people’s feeds.
Throw into the mix laws on political advertising – which allow parties to advertise up to and on election day on social media, but not traditional media – and we are likely seeing the first largely digitally driven election campaign in Australian political history.
From a campaign perspective, Palmer is ticking many of the right boxes: a mix of different platforms on digital and social; heritage media ads for mass market awareness featuring candidates selected from the middle; the use of memes and user-generated content; and even text messaging.
Despite the ubiquity of his ads, though, Palmer is still struggling to connect with most voters. This demonstrates a very important aspect of any advertising campaign: the actual brand still needs to be seen as offering real value to voters.
The UAP has used text messages like the one below, for example, to try to counter its negative perception among voters by delivering positive campaign promises.
The ‘Grim Reaper’ strategy and micro-targeting
One of the most effective ads ever run in Australia was the “Grim Reaper” AIDS awareness campaign in 1987, which showed how well “scare campaigns” and negative messaging can work, given the right context and framing. Another reason the ad worked so well was its micro-messaging: it personalised the issue and made it tangible to anyone sexually active.
Basically, negative messaging works on the theory that what you fear, you will avoid – or the “fight or flight response”. Negative political ads highlight the level of risk and consequence of a certain party’s policies – and then emphasise how to avoid this by not voting for them.
Trouble is, most ads on TV are losing their potency. As attitudes towards political messaging and brands become increasingly negative, voters are less likely to watch ads in their entirety. Many people also don’t see them as being personally relevant.
Social media, though, provides an excellent delivery mechanism for these types of messages. Digital ads can be personalised and focused on issues that voters have already expressed an interest in and therefore find relevant to their lives.
Social media ads can also be altered to be even more targeted as the campaign goes on, based on voter responses. And their speed of production – taking only a matter of hours to produce and place online – allows digital advertising to do what heritage media no longer can: provide a more fluid, grassroots dynamic to campaigning.
That said, even on social media, negative advertising is not as effective if it comes from the party alone. But when combined with information from third-party sources, such as the media, its effectiveness can increase. For example, the Liberal Party used the 10 Network image in this ad to support its claims on Labor’s tax policies.
The major parties are aware of this and are creating ads specifically targeting this demographic on Snapchat, WhatsApp and Instagram. Some of these are “dark social” ads (meaning they can only be seen by the target market) or are user-made so as not to be subject to disclosure rules.
For more general audiences, Labor has created ads like this one on Facebook that highlight issues young voters are concerned about, such as wage increases and penalty rates. Ads like this also attempt to engage with these voters by asking them to sign petitions – a form of experiential marketing that’s proved highly effective with young audiences, as seen through platforms such as Change.org.
Groups like the Australian Youth Climate Coalition are tapping into experiential marketing by combining online advertising with a call for offline action on issues that appeal to young voters, such as climate change. Part-rock concert, part-protest, these events might remind some of the rallies that proved so popular during the Gough Whitlam era.
The increasing influence of lobbying groups
One of the more interesting developments of this election so far is the increasing sophistication, knowledge and strategies of political lobbying groups, or Australia’s equivalent to America’s PACs.
GetUp! is one such group, collecting A$12.8 million in donations in the last 12 months alone. Among the group’s tactics are direct phone calls to voters, partly achieved through “phone parties” where volunteers freely offer their time, phones and other resources to call people in targeted electorates. GetUp! has a goal of making 1 million phone calls in the lead-up to the election.
Other well-funded groups, such as the right-aligned Advance Australia, are also seeking to influence the narrative in the election, particularly in electorates like Warringah, where it has released ads against Tony Abbott’s challenger, Zali Steggall.
In part to counter the influence of lobbying groups, the Australian Council of Trade Unions has launched its own advertising campaign featuring working Australians describing how hard it is to make ends meet.
The rise of these groups in Australian politics opens a Pandora’s box on just who can influence elections without standing a single candidate – an issue now confronting many Western democracies. As many in politics would know, where there is money, there is power, and where there is power, there are those seeking to influence it.
But beyond these immediate responses, this terrible incident presents an opportunity for longer term reform. It’s time for social media platforms to be more open about how livestreaming works, how it is moderated, and what should happen if or when the rules break down.
With the alleged perpetrator apparently flying under the radar prior to this incident in Christchurch, our collective focus is now turned to the online radicalisation of young men.
As part of that, online platforms face increased scrutiny, and Facebook and YouTube have drawn criticism.
After the original livestream was disseminated on Facebook, YouTube became a venue for re-uploading and propagating the recorded footage.
Both platforms have made public statements about their efforts at moderation.
In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload […]
Focusing chiefly on livestreaming is somewhat reductive. Although the shooter initially streamed his own footage, the greater challenge of controlling the video largely relates to two issues:
the length of time it was available on Facebook’s platform before it was removed
the moderation of “mirror” video publication by people who had chosen to download, edit, and re-upload the video for their own purposes.
These issues illustrate the weaknesses of existing content moderation policies and practices.
Not an easy task
Content moderation is a complex and unenviable responsibility. Platforms like Facebook and YouTube are expected to balance the virtues of free expression and newsworthiness with socio-cultural norms and personal desires, as well as the local regulatory regimes of the countries they operate in.
People might reasonably expect platforms like Facebook and YouTube to have thorough controls over what is uploaded on their sites. However, the companies’ huge user bases mean they often must balance the application of automated, algorithmic systems for content moderation (like Microsoft’s PhotoDNA, and YouTube’s ContentID) with teams of human moderators.
And while some algorithmic systems can be effective at scale, they can also be subverted by competent users who understand aspects of their methodology. If you’ve ever found a video on YouTube where the colours are distorted, the audio playback is slightly out of sync, or the image is heavily zoomed and cropped, you’ve likely seen someone’s attempt to get around ContentID algorithms.
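To illustrate why such tricks work, here is a minimal sketch of a perceptual “average hash” over a hypothetical 4x4 grayscale image. Production systems like PhotoDNA and ContentID are far more robust than this, but the same cat-and-mouse dynamic applies: transform the content enough and a naive fingerprint stops matching.

```python
# Minimal "average hash" sketch over a toy 4x4 grayscale image (values
# 0-255, invented for illustration). Not a real fingerprinting system.

def average_hash(pixels):
    """One bit per pixel: 1 if the pixel is brighter than the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a, b):
    """Count the bit positions where two hashes differ."""
    return sum(x != y for x, y in zip(a, b))

image = [
    [200, 210, 10, 20],
    [205, 215, 15, 25],
    [ 30,  40, 50, 60],
    [ 35,  45, 55, 65],
]

# A "zoom and crop" evasion: keep only the bottom-right 2x2 region,
# then naively upscale it back to 4x4 by repeating pixels.
cropped = [row[2:4] for row in image[2:4]]
zoomed = [[cropped[r // 2][c // 2] for c in range(4)] for r in range(4)]

print(hamming(average_hash(image), average_hash(zoomed)))  # 12 bits differ
```

Twelve of the sixteen hash bits flip, so an exact- or near-match lookup against the original's fingerprint fails, even though a human would recognise the zoomed clip instantly.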
For online platforms, the response to terror attacks is further complicated by the difficult balance they must strike between their desire to protect users from gratuitous or appalling footage and their commitment to inform people seeking news through their platform.
Facebook and YouTube’s challenges in addressing the issue of livestreamed hate crimes tell us something important. We need a more open, transparent approach to moderation. Platforms must talk openly about how this work is done, and be prepared to incorporate feedback from our governments and society more broadly.
A good place to start is the Santa Clara principles, generated initially from a content moderation conference held in February 2018 and updated in May 2018. These offer a solid foundation for reform, stating:
companies should publish the numbers of posts removed and accounts permanently or temporarily suspended due to violations of their content guidelines
companies should provide notice to each user whose content is taken down or account is suspended about the reason for the removal or suspension
companies should provide a meaningful opportunity for timely appeal of any content removal or account suspension.
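The first of those principles is essentially an accounting exercise. As a hypothetical sketch, the per-guideline numbers it calls for could be tallied from a platform's moderation log like this (the guideline names and counts here are invented):

```python
# Hypothetical sketch of the per-guideline transparency numbers the
# Santa Clara principles call for. Rules and counts are invented.
from collections import Counter

# Each moderation action records the guideline invoked and the action taken.
actions = [
    {"rule": "violent content", "action": "post_removed"},
    {"rule": "violent content", "action": "account_suspended"},
    {"rule": "spam", "action": "post_removed"},
    {"rule": "spam", "action": "post_removed"},
    {"rule": "hate speech", "action": "account_suspended"},
]

def transparency_report(actions):
    """Tally removals and suspensions per content guideline."""
    report = {}
    for a in actions:
        counts = report.setdefault(a["rule"], Counter())
        counts[a["action"]] += 1
    return report

for rule, counts in transparency_report(actions).items():
    print(rule, dict(counts))
```

The hard part for platforms is not this aggregation but deciding what counts as one "action", and being willing to publish the result at all.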
A more socially responsible approach to platforms’ roles as moderators of public discourse necessitates a move away from the black-box secrecy platforms are accustomed to — and a move towards more thorough public discussions about content moderation.
In the end, greater transparency may facilitate a less reactive policy landscape, where both public policy and opinion have a greater understanding around the complexities of managing new and innovative communications technologies.
For social media and search engines, the law is back in town.
Prompted by privacy invasions, the spread of misinformation, a crisis in news funding and potential interference in elections, regulators in several countries now propose a range of interventions to curb the power of digital platforms.
A newly published UK report is part of this building global momentum.
Shortly after Valentine’s Day, a committee of the British House of Commons published its final report into disinformation and “fake news”. It was explicitly directed at Facebook CEO Mark Zuckerberg, and it was less a love letter than a challenge to a duel.
The report found:
Companies like Facebook should not be allowed to behave like ‘digital gangsters’ in the online world, considering themselves to be ahead of and beyond the law.
The committee was particularly vexed by Zuckerberg himself, concluding:
By choosing not to appear before the Committee … Mark Zuckerberg has shown contempt.
Its far-reaching recommendations included giving the UK’s Information Commissioner greater capacity to be “… an effective ‘sheriff in the Wild West of the Internet’.”
Then, on February 12, the Cairncross Review – an independent analysis led by UK economist and journalist Frances Cairncross – handed down its report, A Sustainable Future for Journalism.
On the sustainability of the production and distribution of high-quality journalism, Cairncross wrote: “Public intervention may be the only remedy. The future of a healthy democracy depends on it.”
And a week later, the Digital, Culture, Media and Sport Committee of the House of Commons issued its challenge in its final report on disinformation and “fake news”:
The big tech companies must not be allowed to expand exponentially, without constraint or proper regulatory oversight … only governments and the law are powerful enough to contain them.
How do the responses of the three reports compare?
ACCC inquiry broadest in scope
First, it’s important to note that the scope of these three inquiries varied significantly.
The ongoing ACCC inquiry, billed as a world-first and set to hand down its final report in June, is seeking to assess the impact of digital platforms on media and advertising, with a focus on news.
The Cairncross Review was narrower in intent, addressing “the sustainability of the production and distribution of high quality journalism, and especially the future of the press, in this dramatically changing market.”
And the House of Commons committee had a very direct brief to investigate fake news. It then chose to focus on Facebook.
As such, the three inquiries overlap substantially, but the ACCC investigation is unequivocally the broadest in scope.
Not just distribution platforms
However, all three reports land in roughly the same place when it comes to characterising these businesses. They all see digital platforms as more than just conduits of other people’s content – and this brings certain responsibilities.
The ACCC says digital intermediaries are “considerably more than mere distributors or pure intermediaries” when it comes to the supply of news and journalism.
The Cairncross Review stresses there is a “fundamental difference” between distributors and content creators.
The House of Commons committee proposes “a new category of tech company” as a legal mechanism for having digital platforms assume liability for harmful content.
Need more oversight
A related important point is that all three reviews recommend that digital platforms are brought more squarely into the legal and regulatory environment.
By this, they don’t just mean cross-industry laws that apply to all businesses. There is some of that – for example, adapting competition laws so certain conduct is regulated.
But these inquiries also raise the prospect of specific rules for platforms as part of communications regulation. How they go about this shows the point at which the inquiries diverge.
The ACCC has flagged the need for further work on a platforms code of practice that would bring them into the orbit of the communications regulator, the ACMA.
The platforms would be bound by the code, which would require them to badge content produced under established journalistic standards. It would be the content creators – publishers and broadcasters, not the platforms – that would be subject to these standards.
In the UK, Cairncross proposes a collaborative approach under which a new regulator would monitor and report on platforms’ initiatives to improve reliability of news – perhaps, in time, moving to specific regulatory obligations.
In Australia, the ACCC has proposed what others refer to as a new “algorithms regulator”. This would look at how ads and news are ranked in search results or placed in news feeds, and whether vertically integrated digital platforms that arrange advertising favour their own services.
The algorithms regulator would monitor, investigate and report on activity, but would rely on referral to other regulators rather than have its own enforcement powers.
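As a toy illustration of the kind of self-preferencing check such a regulator might run: do the platform's own services appear systematically higher in ranked results than third parties? The service names and ranking data below are invented.

```python
# Toy self-preferencing check, the sort of analysis an "algorithms
# regulator" might run on ranked ad placements. All data is invented.

def mean_rank(results, owned):
    """Average 1-based position of platform-owned vs third-party listings."""
    own = [i + 1 for i, r in enumerate(results) if r in owned]
    other = [i + 1 for i, r in enumerate(results) if r not in owned]
    return sum(own) / len(own), sum(other) / len(other)

# Ranked placements for a hypothetical query, top result first.
results = ["PlatformAds", "RivalA", "PlatformVideo", "RivalB"]
owned = {"PlatformAds", "PlatformVideo"}

own_avg, other_avg = mean_rank(results, owned)
print(own_avg, other_avg)  # 2.0 3.0 -> owned services rank higher on average
```

A real investigation would aggregate over millions of queries and control for relevance, but the underlying question – whether ownership predicts rank – is the same.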
For its part, Cairncross does recommend new codes on aspects such as indexing and ranking of content and treatment of advertising. The codes would be overseen by a new regulator but they would be developed by platforms and a move to a statutory code would only occur if they were inadequate.
In contrast to both these reviews, the House of Commons committee’s Code of Ethics is concerned with “online harms”. Right from the outset, it would be drawn up and enforced by a new regulator in a similar way to Ofcom, the UK communications regulator, enforcing its Broadcasting Code.
It says this would create “a regulatory system for online content that is as effective as that for offline content industries”. Its forcefulness on this is matched by its recommendation on algorithms: it says the new regulator should have access to “tech companies’ security mechanisms and algorithms, to ensure they are operating responsibly”.
Both the ACCC and Cairncross pointedly avoid this level of intervention.
However, the ACCC does raise the prospect of a new digital platforms ombudsman. Apart from delivering 11 preliminary recommendations, the ACCC also specified nine proposed areas for further analysis and assessment. Among these areas, the ACCC suggested the idea of such an ombudsman to deal with complaints about digital platforms from consumers, advertisers, media companies and businesses.
And then there is data privacy.
This is where the ACCC and the House of Commons committee delivered some of their most significant recommendations. It’s also where regulators in other jurisdictions have been turning their attention, often on the understanding that the market power of digital platforms is largely derived from their ability to access user data.
Earlier this month, Germany’s Federal Cartel Office (Bundeskartellamt) found that Facebook could no longer merge a person’s data from their Instagram, Facebook and WhatsApp accounts without their explicit consent.
In Germany, the law has spoken. In Australia and the UK, it’s still clearing its throat.
Facebook hasn’t escaped without scandal. It’s been subject to data breaches and allegations that it has failed to protect user privacy. Reports suggest that numerous Facebook users have responded to these incidents by giving up the platform.
But the data say otherwise. Preliminary findings from my recent research suggest that although Australian Facebook users do care about the privacy and security of their personal information, this is not enough to drive them to leave the platform.
One of the most prominent scandals Facebook has been caught up in involves allegations made in March 2018: that analytics company Cambridge Analytica was using personal data from Facebook users to help political parties in their election campaigns.
This, and other news stories about Facebook’s use of user data, received widespread international attention, including significant coverage in Australia. Numerous news reports claimed that large numbers of Australians were deleting their Facebook accounts as part of the #DeleteFacebook trend. As one news story contended:
Many Australians are for the first time discovering just how much Facebook knows about them and many are shocked, leading them to quit the platform.
Statistics on Australians’ use of Facebook, however, show no change in numbers since the Cambridge Analytica scandal first received public attention. There were 15 million monthly active Facebook users 12 months ago (just before the scandal erupted) and this figure remained steady over the course of the year. Facebook is still far and away the most highly used social media platform in Australia.
The study: what I wanted to know
In September and October 2018, I conducted a study involving in-depth telephone interviews with 30 Australians who were current or past Facebook users.
An equal number of females and males participated across a broad age distribution (10 participants aged 18-40, 10 participants aged 41-60, and 10 participants aged 61 and over) and geographical distribution (10 participants living in rural towns or areas, 20 participants living in cities or major towns).
During the interview, the participants were asked:
whether they were bothered or concerned about Facebook’s use of information about them
if they had ever changed their use of Facebook or privacy settings in response to these concerns
what kinds of personal information they would not want internet companies like Facebook or apps to access or use
what steps these companies should take to protect users’ information.
In the interview questions there was no direct mention of Cambridge Analytica or any other scandal about Facebook. I wanted to see if participants spontaneously raised these events and issues in their responses.
The findings show people continue to use Facebook for a wide variety of reasons. For some people, their business depended on their active Facebook use, so they could advertise their offerings and connect with potential clients:
I know that if I did delete it, I’d be harming myself and my business, so yes, so that’s kind of the main reason I keep it.
For most people, however, the key incentive was the desire to connect with family and friends. This included being able to find old friends and reconnect with them, as well as maintaining ties with current friends and family members.
Several people commented that using Facebook was the best way of knowing about the lives of their adult children, grandchildren or other young relatives.
That’s the only thing that’s kept me on there – because my kids are on there and I just want to see what they’re doing, and what and who they’re hanging around with.
For others, being part of a community (for example, a health-related group) was an important way of alleviating isolation and loneliness.
These comments suggest Facebook is an important tool to support social interactions in a world in which people are more dispersed and physically separated from friends and family.
The drawbacks: mundane trivia, too many ads
The drawbacks of being on Facebook reported by participants included feeling annoyed by aspects such as having to see other people’s mundane trivia, random friend requests or too many ads:
Sometimes there’s a lot of nonsense that goes up on there, people posting you don’t want to get involved in, or I think it’s stupid or rude or whatever it may be.
Some people also talked about not liking feeling that you are being watched, and their anxiety about their personal information being accessed and identity theft or bank details being stolen. However, these issues were not considered serious enough for them to leave Facebook.
Apart from targeted advertising, most people were unsure how Facebook might use their personal information. Very few participants mentioned the Cambridge Analytica scandal or related issues, such as the use of Facebook for political campaigning or to disseminate “fake news”. Even when they did refer to these issues, they had difficulty explaining exactly how personal data were involved:
Well, I know Facebook collected the data for that Cambridge business and they collected it via a quiz with an app, and then passed it on to other parties. So I think that’s all they do. I think it’s just maybe for them to earn money off it. I don’t really know.
Data privacy: employing workarounds to stay on Facebook
Most people thought they were careful in not revealing too much information about themselves on Facebook, and therefore protected their data. They reported engaging in practices such as avoiding uploading details about themselves, limiting their number of friends or the type of friends, blocking or unfriending people who annoyed them, clearing their history regularly and being very selective about what photos to upload (including of their children).
Several people mentioned they had recently checked and changed their privacy settings, often in response to a prompt from Facebook to do so:
I keep my personal stuff to myself, and then I share what I want to share through my friends. And I’ve got strict privacy things in place so that I only get things to people that I know, rather than people I don’t know. So that’s fine with me.
These findings show it’s not so much that Australian Facebook users don’t care about their personal data privacy and security. They do think about these issues and have their own ways of managing them.
Australians think Facebook serves them well. They consider other people’s over-sharing or having to see too many ads as more of a problem than alleged political manipulation or other misuse of their information by Facebook.
Two reports out this week – one into the operations of Facebook and Google, the other into the competitive neutrality of the ABC and SBS – present the federal government with significant policy and political challenges.
The first is by far the more important of the two.
It is the interim report by the Australian Competition and Consumer Commission of its Digital Platforms Inquiry, and in a set of 11 preliminary recommendations it proposes far-reaching changes to media regulation.
Of particular interest are its preliminary recommendations for sustaining journalism and news content.
These are based on the premise that there is a symbiotic relationship between news organisations and the big digital platforms. Put simply, the news organisations depend heavily on these platforms to get their news out to their audiences.
The problem, the ACCC says, is that the way news stories are ranked and displayed on the platforms is opaque. All we know – or think we know – is that these decisions are made by algorithms.
The ACCC says this lack of transparency causes concerns that the algorithms and other policies of the platform giants may be operating in a way that affects the production of news and journalistic content.
To respond to this concern, the preliminary recommendation is for a new regulatory authority to be established. It would have the power to peer into these algorithms and monitor, investigate and report on how content – including news content – is ranked and displayed.
The purpose would be to identify the effects of the algorithms and other policies on the production of news and journalistic content.
It would also allow the authority to assess the impact on the incentives for news and journalistic content creation, particularly where news organisations have invested a lot of time and money in producing original content.
In this way, the ACCC is clearly trying to protect and promote the production of public-interest journalism, which is expensive but vital to democratic life. It is how the powerful are held to account, how wrongdoing is uncovered, and how the public finds out what is going on inside forums such as the courts and local councils.
So far, the big news media organisations have concentrated on these aspects of the ACCC interim report and have expressed support for them.
However, there are two other aspects of the report on which their response has been muted.
The first of these is the preliminary recommendation that proposes a media regulatory framework that would cover all media content, including news content, on all systems of distribution – print, broadcast and online.
The ACCC recommends that the government commission a separate independent review to design such a framework. The framework would establish underlying principles of accountability, set boundaries around what should be regulated and how, set rules for classifying different types of content, and devise appropriate enforcement mechanisms.
Much of this work has already been attempted by earlier federal government inquiries – the Finkelstein inquiry and the Convergence Review – both of which produced reports for the Gillard Labor government in 2012.
Their proposals for an overarching regulatory regime for all types of media generated a hysterical backlash from the commercial media companies, who accused the authors of acting like Stalin, Mao, or the Kim clan in North Korea.
So if the government adopts this recommendation from the ACCC, the people doing the design work can expect some heavy flak from big commercial media.
The other aspect of the ACCC report that is likely to provoke a backlash from the media is a preliminary recommendation concerning personal privacy.
Here the ACCC proposes that the government adopt a 2014 recommendation of the Australian Law Reform Commission that people be given the right to sue for serious invasions of privacy.
The media have been on notice over privacy invasion for many years. As far back as 2001, the High Court developed a test of privacy in a case involving the ABC and an abattoir company called Lenah Game Meats.
Now, given the impact on privacy of Facebook and Google, the ACCC has come to the view that the time has arrived to revisit this issue.
The ACCC’s interim report is one of the most consequential documents affecting media policy in Australia for many decades.
The same cannot be said of the other media-related report published this week: that of the inquiry into the competitive neutrality of the public-sector broadcasters, the ABC and SBS.
This inquiry was established in May this year to make good on a promise made by Malcolm Turnbull to Pauline Hanson in 2017.
He needed One Nation’s support for the government’s changes to media ownership laws, without which they would not have passed the Senate.
Hanson was not promised any particular focus for the inquiry, so the government dressed it up in the dull raiment of competitive neutrality.
While it had the potential to do real mischief – in particular to the ABC – the report actually gives both public broadcasters a clean bill of health.
There are a couple of minor caveats concerning transparency about how they approach the issue of fair competition, but overall the inquiry finds that the ABC and SBS are operating properly within their charters. Therefore, by definition, they are acting in the public interest.
This has caused pursed lips at News Corp which, along with the rest of the commercial media, took this opportunity to have a free kick at the national broadcasters. But in the present political climate, the issue is likely to vanish without trace.
While the government still has an efficiency review of the ABC to release, it also confronts a political timetable and a set of opinion polls calculated to discourage it from opening up another row over the ABC.
The report says Google and Facebook each possess substantial power in markets such as online search and social media services in Australia.
It’s not against the law simply to possess substantial market power. But these companies would breach Australia’s misuse of market power law, introduced in November 2017, if they engaged in any conduct with the effect, likely effect or purpose of substantially lessening competition – essentially, blocking rivalry in a market.
Moving forward, the ACCC has indicated it will scrutinise the accumulation of market power by these platforms more proactively. Noting that “strategic acquisitions by both Google and Facebook have contributed to the market power they currently hold”, the ACCC says it intends to ask large digital platforms to provide advance notice of any planned acquisitions.
While such pre-notification of certain mergers is required in jurisdictions such as the US, it is not currently a requirement in other sectors under Australian law.
At the moment the ACCC is just asking the platforms to do this voluntarily – but has indicated it may seek to make this a formal requirement if the platforms don’t cooperate with the request. It’s not currently clear how this would be enforced.
The ACCC has also recommended the standard for assessing mergers should be amended to expressly clarify the relevance of data acquired in the transaction as well as the removal of potential competitors.
The law doesn’t explicitly refer to potential competitors in addition to existing competitors at present, and some argue platforms are buying up nascent competitors before the competitive threat becomes apparent.
According to the ACCC, there is a “lack of transparency” in Google’s and Facebook’s arrangements concerning online advertising and content, which are largely governed by algorithms developed and owned by the companies. These algorithms – essentially a complex set of instructions in the software – determine what ads, search results and news we see, and in what order.
The problem is nobody outside these companies knows how they work or whether they’re producing results that are fair to online advertisers, content producers and consumers.
The report recommends a regulatory authority be given power to monitor, investigate and publish reports on the operation of these algorithms, among other things, to determine whether they are producing unfair or discriminatory results. This would only apply to companies that generate more than A$100 million per annum from digital advertising in Australia.
These algorithms have come under scrutiny elsewhere. The European Commission has previously fined Google €2.42 billion for giving unfair preference to its own shopping comparison services in its search results, relative to rival comparison services, thereby contravening the EU law against abuse of dominance. This decision has been criticised, though, for failing to provide Google with a clear way of complying with the law.
The important questions following the ACCC’s recommendation are:
what will the regulator do with the results of its investigations?
if it determines that the algorithm is producing discriminatory results, will it tell the platform what kind of results it should achieve instead, or will it require direct changes to the algorithm?
The ACCC has not recommended the regulator have the power to make such orders. It seems the most the regulator would do is introduce some “sunshine” to the impacts of these algorithms which are currently hidden from view, and potentially refer the matter to the ACCC for investigation if this was perceived to amount to a misuse of market power.
If a digital platform discriminates against competitive businesses that rely on its platform – say, app developers or comparison services – so that rivalry is stymied, this could be an important test case under our misuse of market power law. This law was amended in 2017 to address longstanding weaknesses but has not yet been tested in the courts.
This last recommendation, for a statutory cause of action for serious invasions of privacy, was previously made by the Australian Law Reform Commission in 2008 and 2014, and would finally allow individuals in Australia to sue for harm suffered as a result of such an invasion.
If consent is to be voluntary and specific, companies should not be allowed to “bundle” consents for a number of uses and collections (both necessary and unnecessary) and require consumers to consent to all or none. These are important steps in addressing the unfairness of current data privacy practices.
But the effectiveness of these changes would depend to a large extent on whether the government would also agree to improve funding and support for the federal privacy regulator, which has been criticised as passive and underfunded.
Another recommended change to consumer protection law would make it illegal to include unfair terms in consumer contracts and impose fines for such a contravention. Currently, for a first-time unfair contract terms “offender”, a court could only “draw a line” through the unfair term such that the company could not force the consumer to comply with it.
Making such terms illegal would increase the incentive for companies drafting standard form contracts to ensure they do not include terms that create a significant imbalance between them and their customers and are not reasonably necessary to protect their legitimate interests.
The ACCC might also take action on these standard terms under our misleading and deceptive conduct laws. The Italian competition watchdog last week fined Facebook €10 million for conduct including misleading users about the extent of its data collection and practices.
The ACCC appears to be considering the possibility of even broader laws against “unfair” practices, which regulators like the US Federal Trade Commission have used against bad data practices.
Final report in June 2019
As well as 11 recommendations, the report mentions nine areas for “further analysis and assessment” which in itself reflects the complexity of the issues facing the ACCC.
The ACCC is seeking responses and feedback from stakeholders on the preliminary report, before creating a final report in June 2019.
Increasingly, news producers depend on social media and search engines to connect with consumers. Google is used for 95% of searches (98% on mobile devices).
The rise of Google, Facebook and other digital platforms has been accompanied by unprecedented pressures on traditional news organisations.
Most obviously, classified advertising revenue has been unbundled from newspapers.
In 2001, classified advertising revenue stood at A$2 billion. By 2016, it had fallen to A$200 million. The future of newspapers’ ability to produce news is under a cloud, and digital platforms help control the weather.
Of course, advertisers care too.
But the stakeholders with the most to gain or lose are us, Australian citizens.
Our lives are mediated by Google, Facebook, Apple, Amazon, Twitter and others as never before. Google answers our search queries; Facebook hosts friends’ baby snaps; YouTube (owned by Google) distributes professional and user-generated videos; Instagram (owned by Facebook) hosts our holiday snaps.
As the ACCC notes, they have given us tremendous benefits, for minimal (apparent) cost.
And they’ve done it at lightning speed. Google arrived in 1998, Facebook in 2004 and Twitter in 2006. They are mediating what comes before our eyes in ways we don’t understand and (because they keep their algorithms secret) in ways we can’t understand.
What does the ACCC recommend?
The ACCC’s preliminary recommendations are far-reaching and bold.
First, it suggests an independent review to address the inadequacy of current media regulatory frameworks.
This would be a separate, independent inquiry to “design a regulatory framework that is able to effectively and consistently regulate the conduct of all entities which perform comparable functions in the production and delivery of content in Australia, including news and journalistic content, whether they are publishers, broadcasters, other media businesses, or digital platforms”.
This is a commendable and urgent proposal. Last year, cross-media ownership laws were repealed as anachronistic in a digital age. To protect media diversity and plurality, the government needs to revisit the issue of regulatory frameworks.
Second, it proposes privacy safeguards. Privacy in Australia is dangerously under-protected. Digital platforms such as Google and Facebook generate revenue by knowing their users and targeting advertising with an accuracy unseen in human history.
As the ACCC puts it, “the current regulatory framework, including privacy laws, does not effectively deter certain data practices that exploit the information asymmetries and the bargaining power imbalances that exist between digital platforms and consumers.”
It makes a number of specific preliminary recommendations, including creating a right to erasure and the requirement of “express, opt-in consent”.
It wants the penalties for breaches of our existing Privacy Act increased. It recommends the creation of a third-party certification scheme, which would enable the Office of the Australian Information Commissioner to give complying bodies a “privacy seal or mark”.
And it wants a new or existing organisation to monitor attempts by vertically-integrated platforms such as Google to favour their own businesses. This would happen where Google gives prominence in search results to products sold through Google platforms, or prominence to stories from organisations with which it has a commercial relationship.
The organisation would oversee platforms that generate more than A$100 million annually, and which disseminate news, or hyperlinks to news, or snippets of news.
It would investigate complaints and even initiate its own investigations in order to understand how digital platforms are disseminating news and journalistic content and advertising.
As it notes,
The algorithms operated by each of Google and Facebook, as well as other policies, determine which content is surfaced and displayed to consumers in news feed and search results. However, the operation of these algorithms and other policies determining the surfacing of content remain opaque. (p10)
It makes other recommendations, touching on areas including merger law, pre-installed browsers and search engines, takedown procedures for copyright-infringing content, implementing a code of practice for digital platforms and changing the parts of Australian consumer law that deal with unfair contract terms.
Apart from its preliminary recommendations, there are further areas on which it invites comment and suggestions.
These include giving media organisations tax offsets for producing public interest news, and making subscribing to news publications tax deductible for consumers.
Platforms could be brought into a co-regulatory system for flagging content that is subject to quality control, creating their own quality mark. And a new ombudsman could deal with consumer complaints about scams, misleading advertising and the ranking of news content.
All of these recommendations and areas of interest will generate considerable debate.
The ACCC will accept submissions in response to its preliminary report until February 15.
At the Centre for Media Transition, we played a background role in one aspect of this inquiry.
Earlier this year, we were commissioned by the ACCC to prepare a report on the impact of digital platforms on news and journalistic content. It too was published on Monday.
Our findings overlap with the ACCC on some points, and diverge on others.
Many thorny questions remain, but one point is clear: the current regime that oversees digital platforms is woefully inadequate. Right now, as the ACCC notes, digital platforms are largely unregulated.
New ways of thinking are needed. A mix of old laws (or no laws) and new media spells trouble.
You should not trust all the metrics platform companies offer.
A recent article in The Atlantic revealed that a lawsuit has been filed against Facebook, because it allegedly overstated its video statistics for years, “exaggerating the time spent watching them by as much as 900%.”
The article asserts that “hundreds of journalists” lost their jobs after Facebook lured news companies to invest in video. However, Facebook has denied these claims.
The Spinoff, a digital native media company in New Zealand, recently posted a story with a similar analysis of the platform’s impact, saying that Facebook’s fake video statistics “smashed New Zealand journalism” because when it encouraged news companies to invest in online video production, they followed.
New Zealand media were collateral damage in Facebook’s obsessive desire to grow at all costs.
Additionally, two recent studies demonstrate how much news companies rely on Facebook to drive their traffic. My own research on four New Zealand news companies shows that 24% of their traffic came from social media, and most of that came from Facebook. Together, social and search drove 47% of New Zealand news sites’ traffic.
… news organisations are making major investments in social media and report receiving significant amounts of traffic from social media.
These studies confirm news companies’ Facebook dependency in terms of traffic, although the level of dependency differs between media outlets. The Reuters study found that Facebook’s algorithm changes in January had a severe impact on some of the news companies’ traffic. However, the severity of impact differed between media outlets. For example, Le Monde saw its interactions drop by almost a third, but for The Times these grew.
In New Zealand, The Spinoff suffered a substantial drop in its traffic after Facebook’s algorithm tweaks: its traffic from the platform dropped from 50% to 30%. We have seen similar drops in other markets, too.
The Nieman Lab, which reports on digital media innovation, found that following Facebook’s algorithm change, the drop in referral traffic was not universal. Some not-for-profit media organisations benefited.
Bottom line: The decline in referrals to publishers from Facebook is not universal, and in the face of those declines, other sources of traffic are more important than ever.
Indeed, it would be wise for all news outlets to grow their direct traffic which delivers better user engagement and monetisation.
Lesson three: don’t do it for the money
You don’t make much money on Facebook.
My research suggests if news companies abandoned Facebook, they would hardly lose any money. For the companies studied, social media traffic made up 0.03%–0.14% of their total revenue, and social shares 0.009%–0.2% of total revenue. These figures don’t take into account advertising income from platforms or subscription conversion.
News companies continue to distribute their content on social media platforms because they gain attention there. They believe they can turn this attention into money, but so far with little success.
However, some news publishers have reported that they have gained digital subscriptions from the platforms. The Reuters study notes Facebook delivers news companies audience engagement that is “considered more cost effective at driving digital subscription sales.”
The bottom line is there is no data yet to verify how well news publishers manage to convert social media attention to digital subscriptions. Google, Facebook, Amazon and Apple are all offering, or are in the process of offering, digital subscription services to news publishers, but how well these will work for news companies remains to be seen.
The recent Cambridge Analytica scandal and congressional testimony of Facebook CEO Mark Zuckerberg has brought global attention to the power and influence of Facebook as a platform. It has also invigorated discussions about how such platforms should be regulated.
Meanwhile, the Australian Competition and Consumer Commission (ACCC) has been conducting an inquiry into the influence of digital platforms on media and advertising markets in Australia.
Submissions to the inquiry by a range of media outlets, advertisers, as well as Google and Facebook, were published last week. Although Facebook has expressed interest in participating in regulatory debates, its submission is a disappointing early indication of how we might expect the company to downplay its magnitude and its roles in future regulatory debates.
The purpose of the inquiry
Late in 2017, the Federal Treasurer, Scott Morrison, directed the ACCC to conduct the inquiry into digital platforms, including search engines, social networks and other aggregators. As part of the ongoing inquiry, the ACCC will consider:
the impact of digital platforms on the supply of news and journalistic content and the implications of this for media content creators, advertisers and consumers.
It came about as a result of negotiations between the government and the former independent Senator Nick Xenophon. Xenophon insisted on the inquiry in exchange for his support for the government’s changes to media ownership laws.
To some extent, the inquiry retreads familiar ground. Old anxieties about declining revenues for journalistic organisations and the advent of internet technologies and internet-focused stakeholders continue a conversation that has been going for well over a decade.
News outlets air grievances
In total, the ACCC published 57 submissions. This includes contributions from most major Australian media organisations, industry bodies, unions and advertisers.
Many respondents took the opportunity to criticise the narrow scope of the inquiry. The inquiry’s scope is somewhat frustrating considering the complexities digital platforms present. They impact not just media and journalism markets, but also aspects of political, social and everyday life.
While the ABC’s submission was generally favourable in its discussion of online platforms, other Australian media organisations used the inquiry as an opportunity to air grievances about the impact of digital platforms.
Seven West Media and Ten argued that the vast stores of personal data that platforms like Facebook and Google can leverage impose a barrier to entry on traditional publishers.
The platforms respond
In their submissions, Facebook and Google both attempted to build a narrative that emphasised how the tools and systems they provide can empower journalists and other content creators. Meanwhile, they minimised or outright ignored the opportunity to discuss the broader concerns of the broadcasters, publishers and individuals who are stakeholders in the industries Facebook and Google are operating in.
Google’s short response to the inquiry is not particularly interesting, in part due to its brevity and its focus on championing Google’s notionally positive influence on publishers. Facebook had significantly more to say in its 56-page submission, which also gives context to Mark Zuckerberg’s recent comments welcoming the potential for regulation.
Facebook plays the underdog
Facebook’s submission reveals how the company portrays itself to regulators, with an interesting element of self-deprecation. Take for example, the statement that:
Facebook is popular, but it is just one small part of how Australians connect with friends, family and the world around us.
Given a user-base that dwarfs the population of, well, even the most populous countries, Facebook’s most compelling option for presenting itself as an underdog in this space is to compare itself by share of “attention”, rather than share of market.
Facebook presents “multi-homing” – the practice of having and using a variety of apps on your phone – as a key concern. It paints a picture of precarity in a marketplace that it dominates.
Facebook’s arguments about competition also ring hollow because the platform’s design and scale allows it to benefit from significant network effects.
Put simply, a network effect is when existing and new users benefit from the growth of a network. A familiar example of these effects can be seen in the services of mobile phone network providers. Telstra and Optus provide cheaper, or no-cost calls or messaging between customers of their own service.
But the similarities end there. While you could still call a friend with a competing mobile phone provider, there is no such interoperability with platforms like Facebook. This design helps Facebook protect its market power by keeping total control over the Facebook platform’s network.
If you decide to leave Facebook, you sever the connections between yourself and other users of the platform. Given Facebook’s focus on augmenting social functions this can, quite literally, be an ostracising endeavour. In spite of both the recent Cambridge Analytica revelations, and several #deletefacebook campaigns, we’re yet to see a significant exodus of users from the platform.
A disappointing response
Facebook has a colossal user base. Over two billion people use the platform each month, and almost three quarters of those people use Facebook on a daily basis. It owns Instagram and WhatsApp – each of which are profoundly successful platforms in their own right.
Facebook is a titan of this industry, and the sooner it stops pretending to be a bit player, the richer our discourse about platforms and their role in society can become.