Regulating Facebook, Google and Amazon is hard given their bewildering complexity



Governments are attempting to regulate tech giants, but the digital disruption genie is already out of the bottle.
Shutterstock

Zac Rogers, Flinders University

Back in the 1990s – a lifetime ago in internet terms – the Spanish sociologist Manuel Castells published several books charting the rise of information networks. He predicted that in the networked age, more value would accrue in controlling flows of information than in controlling the content itself.

In other words, those who positioned themselves as network hubs – the routers and switchers of information – would become the gatekeepers of power in the digital age.

With the rise of internet juggernauts Google, Facebook, Amazon and others, this insight seems obvious now. But over the past two decades, a fundamentally new business model emerged which even Castells had not foreseen – one in which attracting users onto digital platforms takes precedence over everything else, including what the user might say, do, or buy on that platform.

Gathering information became the dominant imperative for tech giants – aided willingly by users charmed first by novelty, then by the convenience and self-expression afforded by being online. The result was an explosion of information, which online behemoths can collate and use for profit.




Read more:
Here’s how tech giants profit from invading our privacy, and how we can start taking it back


The sheer scale of this enterprise means that much of it is invisible to the everyday user. The big platforms are now so complex that their inner workings have become opaque even to their engineers and administrators. If the system is now so huge that not even those working within it can see the entire picture, then what hope do regulators or the public have?

Of course, governments are trying to fight back. The GDPR laws in Europe, the ACCC Digital Platforms report in Australia, and the DETOUR Act introduced to the US Congress in April – all are significant attempts to claw back some agency. At the same time, it is dawning on societies everywhere that these efforts, while crucial, are not enough.




Read more:
Consumer watchdog calls for new measures to combat Facebook and Google’s digital dominance


Gatekeepers reign supreme

If you think of the internet as a gigantic machine for sharing and copying information, then it becomes clear that the systems for sorting that information are vitally important. Think not just of Google’s search tool, but also of the way Google and Amazon dominate cloud computing – the largely invisible systems that make the internet usable.

Over time, these platforms have achieved greater and greater control over how information flows through them. But it is an unfamiliar type of control, exercised through autonomous, self-teaching systems that are increasingly inscrutable to humans.

Information gatekeeping is paramount, which is why platforms such as Google, Amazon and Facebook have risen to supremacy. But that doesn’t mean these platforms necessarily need to compete or collude with one another. The internet is truly enormous, a fact that has allowed each platform to become emperor of a growing niche: Google for search, Facebook for social, Amazon for retail, and so on. In each domain, they played the role of incumbent, disruptor, and innovator, all at the same time.

Now nobody competes with them. Whether you’re an individual, business, or government, if you need the internet, you need their services. The juggernauts of the networked age are structural.

Algorithms are running the show

For these platforms to stay on top, innovation is a constant requirement. As the job of sorting grows ever larger and more complex, we’re seeing the development of algorithms so advanced that their human creators have lost the capacity to understand them. And if the output satisfies the task at hand, the inner workings of the system are considered of minor importance.

Meanwhile, the litany of adverse effects is undeniable. This brave new machine-led world is eroding our capacity to identify, locate, and trust authoritative information, in favour of speed.

It’s true that the patient was already unwell; societies have been hollowed out by three decades of market fundamentalism. But as American tech historian George Dyson recently warned, self-replicating code is now out there in the cyber ecosystem. What began as a way for humans to coax others into desired behaviours now threatens to morph into nothing less than the manipulation of humans by machines.

The digital age has spurred enormous growth in research disciplines such as social psychology, behavioural economics, and neuroscience. They have yielded staggering insights into human cognition and behaviour, with potential uses that are far from benign.

Even if this effort had been founded with the best of intentions, accidents abound when fallible humans intervene in complex systems with fledgling ethical and legal underpinnings. Throw malign intentions into the mix – election interference, information warfare, online extremism – and the challenges only mount.

If you’re still thinking about digital technologies as tools – implying that you, the user, are in full control – you need to think again. The truth is that no one truly knows where self-replicating digital code will take us. You are the feedback, not the instruction.

Regulators don’t know where to start

A consensus is growing that regulatory intervention is urgently required to stave off further social disruption, and to bring democratic and legal oversight into the practices of the world’s largest monopolies. But, if Dyson is correct, the genie is already out of the bottle.

Entranced by the novelty and convenience of life online, we have unwittingly allowed Silicon Valley to pull off a “coup from above”. It is long past time that the ideology that informed this coup, and is now governing so much everyday human activity, is exposed to scrutiny.




Read more:
Explainer: what is surveillance capitalism and how does it shape our economy?


The challenges of the digital information age extend beyond monopolies and privacy. This regime of technologies was designed with little thought for how it might be exploited. Those vulnerabilities are extensive and will continue to be abused, and now that this technology is so intimately a part of daily life, remediation should be pursued without fear or favour.

Yet legislative and regulatory intervention can only be effective if industry, governments and civil society combine to build, by design, a digital information age worthy of the name – one that doesn’t leave us all open to exploitation.

The Conversation

Zac Rogers, Research Lead, Jeff Bleich Centre for the US Alliance in Digital Technology, Security, and Governance, Flinders University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Here’s how tech giants profit from invading our privacy, and how we can start taking it back



Your online activity can be turned into an intimate portrait of your life – and used for profit.
Shutterstock.com

Katharine Kemp, UNSW

Australia’s consumer watchdog has recommended major changes to our consumer protection and privacy laws. If these reforms are adopted, consumers will have much more say about how we deal with Google, Facebook, and other businesses.

The proposals include a right to request erasure of our information; choices about whether we are tracked online and offline; potential penalties of A$10 million or more for companies that misuse our information or impose unfair privacy terms; and default settings that favour privacy.




Read more:
Consumer watchdog calls for new measures to combat Facebook and Google’s digital dominance


The report from the Australian Competition and Consumer Commission (ACCC) says consumers have growing concerns about the often invisible ways companies track us and disclose our information to third parties. At the same time, many consumers find privacy policies almost impossible to understand and feel they have no choice but to accept.

My latest research paper details how companies that trade in our personal data have incentives to conceal their true practices, so they can use vast quantities of data about us for profit without pushback from consumers. This can preserve companies’ market power, cause harm to consumers, and make it harder for other companies to compete on improved privacy.

The vicious cycle of privacy abuse.
Helen J. Robinson, Author provided

Privacy policies are broken

The ACCC report points out that privacy policies tend to be long, complex, hard to navigate, and often create obstacles to opting out of intrusive practices. Many of them are not informing consumers about what actually happens to their information or providing real choices.

Many consumers are unaware, for example, that Facebook can track their activity online when they are logged out, or even if they are not a Facebook user.




Read more:
Shadow profiles – Facebook knows about you, even if you’re not on Facebook


Some privacy policies are outright misleading. Last month, the US Federal Trade Commission reached a US$5 billion settlement with Facebook as a penalty for repeatedly misleading users: personal information could be accessed by third-party apps without a user’s consent, so long as one of the user’s Facebook “friends” gave consent.

If this fine sounds large, bear in mind that Facebook’s share price went up after the FTC approved the settlement.

The ACCC is now investigating privacy representations by Google and Facebook under the Australian Consumer Law, and has taken action against the medical appointment booking app HealthEngine for allegedly misleading patients while it was selling their information to insurance brokers.

Nothing to hide…?

Consumers generally have very little idea about what information about them is actually collected online or disclosed to other companies, and how that can work to their disadvantage.

A recent report by the Consumer Policy Research Centre explained how companies most of us have never heard of – data aggregators, data brokers, data analysts, and so on – are trading in our personal information. These companies often collect thousands of data points on individuals from various companies we deal with, and use them to provide information about us to companies and political parties.

Data companies have sorted consumers into lists on the basis of sensitive details about their lifestyles, personal politics and even medical conditions, as revealed by reports by the ACCC and the US Federal Trade Commission. Say you’re a keen jogger, worried about your cholesterol, with broadly progressive political views and a particular interest in climate change – data companies know all this about you and much more besides.

So what, you might ask. If you’ve nothing to hide, you’ve nothing to lose, right? Not so. The more our personal information is collected, stored and disclosed to new parties, the more our risk of harm increases.

Potential harms include fraud and identity theft (suffered by 1 in 10 Australians); being charged higher retail prices, insurance premiums or interest rates on the basis of our online behaviour; and having our information combined with information from other sources to reveal intimate details about our health, financial status, relationships, political views, and even sexual activity.




Read more:
Why you might be paying more for your airfare than the person seated next to you


In written testimony to the US House of Representatives, legal scholar Frank Pasquale explained that data brokers have created lists of sexual assault victims, people with sexually transmitted diseases, Alzheimer’s, dementia, AIDS, sexual impotence or depression. There are also lists of “impulse buyers”, and lists of people who are known to be susceptible to particular types of advertising.

Major upgrades to Australian privacy laws

According to the ACCC, Australia’s privacy law is not protecting us from these harms, and falls well behind privacy protections consumers enjoy in comparable countries in the European Union, for example. This is bad for business too, because weak privacy protection undermines consumer trust.

Importantly, the ACCC’s proposed changes wouldn’t just apply to Google and Facebook, but to all companies governed by the Privacy Act, including retail and airline loyalty rewards schemes, media companies, and online marketplaces such as Amazon and eBay.

Australia’s privacy legislation (and most privacy policies) only protect our “personal information”. The ACCC says the definition of “personal information” needs to be clarified to include technical data like our IP addresses and device identifiers, which can be far more accurate in identifying us than our names or contact details.




Read more:
Explainer: what is surveillance capitalism and how does it shape our economy?


Whereas some companies currently keep our information for long periods, the ACCC says we should have a right to request erasure to limit the risks of harm, including from major data breaches and reidentification of anonymised data.

Companies should stop pre-ticking boxes in favour of intrusive practices such as location tracking and profiling. Default settings should favour privacy.

Currently, there is no law against “serious invasions of privacy” in Australia, and the Privacy Act gives individuals no direct right of action. According to the ACCC, this should change. It also supports plans to increase maximum corporate penalties under the Privacy Act from A$2.1 million to A$10 million (or 10% of turnover or three times the benefit, whichever is larger).

Increased deterrence from consumer protection laws

Our unfair contract terms law could be used to attack unfair terms imposed by privacy policies. The problem is, currently, this only means we can draw a line through unfair terms. The law should be amended to make unfair terms illegal and impose potential fines of A$10 million or more.

The ACCC also recommends Australia adopt a new law against “unfair trading practices”, similar to those used in other countries to tackle corporate wrongdoing including inadequate data security and exploitative terms of use.

So far, the government has acknowledged that reforms are needed but has not committed to making the recommended changes. The government’s 12-week consultation period on the recommendations ends on October 24, with submissions due by September 12.

Katharine Kemp, Senior Lecturer, Faculty of Law, UNSW, and Co-Leader, ‘Data as a Source of Market Power’ Research Stream of The Allens Hub for Technology, Law and Innovation, UNSW

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Australian media regulators face the challenge of dealing with global platforms Google and Facebook



‘Google and Facebook are global companies, headquartered in the US, for whom Australia is a significant but relatively small market.’
Shutterstock/Roman Pyshchyk

Terry Flew, Queensland University of Technology

With concerns growing worldwide about the economic power of digital technology giants such as Google and Facebook, there was plenty of interest internationally in Australia’s Digital Platforms Inquiry.

The Australian Competition and Consumer Commission (ACCC) inquiry was seen as undertaking a forensic account of market dominance by digital platforms, and the implications for Australian media and the rights of citizens around privacy and data protection.

The inquiry’s final report, released last month, has been analysed from perspectives such as competition policy, consumer protection and the future of journalism.




Read more:
Consumer watchdog calls for new measures to combat Facebook and Google’s digital dominance


But the major limitation facing the ACCC, and the Australian government, in developing new regulations for digital platforms is jurisdictional authority – given these companies are headquartered in the United States.

More ‘platform neutral’ approach

Among the ACCC’s 23 recommendations is a proposal to reform media regulations to move from the current platform-specific approaches (different rules for television, radio, and print media) towards a “platform-neutral” approach.

This will ensure comparable functions are effectively and consistently regulated:

Digitalisation and the increase in online sources of news and media content highlight inconsistencies in the current sector-specific approach to media regulation in Australia […]

Digital platforms increasingly perform similar functions to media businesses, such as selecting and curating content, evaluating content, and ranking and arranging content online. Despite this, virtually no media regulation applies to digital platforms.

The ACCC’s recommendations to harmonise regulations across different types of media draw on major Australian public inquiries from the early 2010s, such as the Convergence Review and the Australian Law Reform Commission’s review of the national media classification system. These reports identified the inappropriateness of “silo-ised” media laws and regulations in an age of digital convergence.




Read more:
What Australia’s competition boss has in store for Google and Facebook


The ACCC also questions the continued appropriateness of the distinction between platforms and publishers in an age where the largest digital platforms are not simply the carriers of messages circulated among their users.

The report observes that such platforms are increasingly at the centre of digital content distribution. Online consumers increasingly access social news through platforms such as Facebook and Google, as well as video content through YouTube.

The advertising dollar

While the ACCC inquiry focused on the impact of digital platforms on news, we can see how they have transformed the media landscape more generally, and where issues of the wider public good arise.

Their dominance over advertising has undercut traditional media business models. Online now accounts for about 50% of total advertising spend, and the ACCC estimates that 71 cents of every dollar spent on digital advertising in Australia goes to Google or Facebook.

All media are now facing the implications of a more general migration to online advertising, as platforms can better micro-target consumers rather than relying on the broad brush approach of mass media advertising.

The larger issue facing potential competitors to the digital giants is the accumulation of user data. This includes the lack of transparency around algorithmic sorting of such data, and the capacity to use machine learning to apply powerful predictive analytics to “big data”.

In line with recent critiques of platform capitalism, the ACCC is concerned about the lack of information consumers have about what data the platforms hold and how it’s being used.

It’s also concerned the “winner-takes-most” nature of digital markets creates a long term structural crisis for media businesses, with particularly severe implications for public interest journalism.

Digital diversity

Digital platform companies do not sit easily within a recognisable industry sector as they branch across information technology, content media, and advertising.

They’re also not alike. While all rely on the capacity to generate and make use of consumer data, their business models differ significantly.

The ACCC chose to focus only on Google and Facebook, but they are quite different entities.

Google dominates search advertising and is largely a content aggregator, whereas Facebook for the most part provides display advertising that accompanies user-generated social media. This presents its own challenges in crafting a regulatory response to the rise of these digital platform giants.

A threshold issue is whether digital platforms should be understood to be media businesses, or businesses in a more generic sense.

Communications policy in the 1990s and 2000s commonly differentiated digital platforms as carriers. This indemnified them from laws and regulations relating to content that users uploaded onto their sites.

But this carriage/content distinction has always coexisted with active measures on the part of the platform companies to manage content that is hosted on their sites. Controversies around content moderation, and the legal and ethical obligations of platform providers, have accelerated greatly in recent years.

To the degree that companies such as Google and Facebook increasingly operate as media businesses, this would bring aspects of their activities within the regulatory purview of the Australian Communication and Media Authority (ACMA).

The ACCC recommended ACMA should be responsible for brokering a code of conduct governing commercial relationships between the digital platforms and news providers.




Read more:
Consumer watchdog: journalism is in crisis and only more public funding can help


This would give it powers related to copyright enforcement, allow it to monitor how platforms are acting to guarantee the trustworthiness and reliability of news content, and minimise the circulation of “fake news” on their sites.

Overseas, but over here

Companies such as Google and Facebook are global companies, headquartered in the US, for whom Australia is a significant but relatively small market.

The capacity to address competition and market dominance issues is limited by the fact that real action could only meaningfully occur in their home market of the US.

Australian regulators are going to need to work closely with their counterparts in other countries and regions: the US and the European Union are the two most significant in this regard.

Terry Flew, Professor of Communication and Creative Industries, Queensland University of Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.

What Australia’s competition boss has in store for Google and Facebook



Google will find it harder to expand, but there’s only so much the ACCC can do.
Shutterstock

Caron Beaton-Wells, University of Melbourne

Central to the Australian Competition and Consumer Commission’s Digital Platforms inquiry were two questions:

  • do Google and Facebook hold substantial power in crucial digital markets?

  • does this power pose a risk to competitive processes?

In its Final Report released by the government on Friday, the ACCC correctly answered both with a resounding “yes”.


ACCC, July 28, 2019

The ACCC did not set out to determine whether either company has broken the competition rules. That can only be determined in an investigation of specific conduct based on specific facts and evidence.

The report itemises six such investigations already underway.

Having identified risks, the ACCC did set out to determine how they might be contained.

Its proposals are rightly cautious, reflecting the complexities of digital markets and the challenges in ensuring that any intervention protects the competitive process rather than individual competitors.

With market power comes dangers

The ACCC points out that substantial power won by serving consumers is not against the law.

It acknowledges that Google and Facebook provide services that are highly valued.

And it emphasises the distinctive features of digital markets that contribute to this power: extraordinary economies of scale, network effects, massive accumulations of data and the use of highly sophisticated data analytic techniques.

These features help Google dominate internet search and internet search advertising and help Facebook dominate social networks and display advertising.

While they also help deliver value for consumers, they can be used against new entrants that may offer a better deal and against other businesses (such as traditional media companies) that have come to rely on Google and Facebook to deliver services to customers.

The ACCC wants to reduce the risks…

There are no quick fixes. The ACCC rightly rejected the idea that platforms such as Google and Facebook be broken up.

Given the highly interconnected complex nature of the markets in which the major platforms participate, divestiture would not guarantee, and might in fact harm, consumer welfare.

The report recommends instead building up the ACCC’s capacity to aggressively enforce the competition rules and to review acquisitions that would further entrench the dominant players’ market power.

Many of the other recommendations are designed to ameliorate imbalances in information and bargaining power between the platforms and business users, and between the platforms and consumers in relation to the collection and use of their personal data.




Read more:
Consumer watchdog calls for new measures to combat Facebook and Google’s digital dominance


Implementing these recommendations presents challenges, not the least of which is to ensure they don’t themselves damage competition.

…hunt out abuses…

The ACCC proposes the establishment of a new specialist branch within the ACCC to build and sustain the skills needed to continue studying digital platforms and enforcing their competition and consumer rules.

This is a welcome initiative. It replicates similar capacity-building initiatives in the United States and Europe.

The report is peppered with references to European cases in which Google has been subject to thundering fines for various abuses of dominance. It also invokes the European mantra that these powerful companies have “special responsibility”.

But the Australian misuse of market power prohibition may not be as flexible as the one in Europe. The ACCC has recommended broadening the unfair trading law in order to allow it more flexibility, and not only for use in dealing with digital platforms.

The recently amended section 46 of the Competition and Consumer Act will play a role, but it is yet to be taken for a proper run and, in the digital context, its application will be complicated by the rapid pace of innovation in digital markets.

…and scrutinise mergers…

In an acknowledgement that digital mergers are different, the ACCC wants to ensure the merger laws pay attention to mergers with potential as well as actual competitors, and to mergers with the owners of data assets.

It also wants Google and Facebook to voluntarily notify it of any future acquisitions. This is a polite request backed by a thinly veiled threat of repercussions.

But the report also implies that neither of these proposals may be enough.

Still more changes to the merger law might be needed to persuade judges of the need to stem unhealthy concentration in the Australian economy generally.

Australia almost certainly needs a compulsory notification regime, triggered by a combination of turnover and transaction value thresholds to ensure nascent competitors are not snuffed out.




Read more:
ACCC wants to curb digital platform power – but enforcement is tricky


Both of these are bigger conversations that the Commission needs to engage government and business in.

…while not offering much for legacy media…

The Commission has stepped away from a proposal in its preliminary report that there be a special regulator to oversee the relationships between platforms and media organisations, significant business users and advertisers.

It might have listened to criticism that the proposal would benefit traditional players in disrupted industries more than it benefits consumers.

The advertising industry is highly fragmented, complex and constantly changing. The evidence that the new platforms are distorting competition in the industry is questionable at best. The ACCC has sensibly suggested it needs to thoroughly examine dynamics in the ad tech supply chain before firming up any recommendation.

For the media industry, the compromise is that each platform be required to negotiate a code of conduct to be overseen and enforced by the Australian Communications and Media Authority.

Whether this will address media concerns about the appropriation of their content and about short notice periods for algorithm changes that can make their products hard to find remains to be seen.




Read more:
Digital platforms. Why the ACCC’s proposals for Google and Facebook matter big time


But, recognising that the platforms are themselves knee-deep in the media business, the ACCC has called for a wholesale overhaul of media regulation to level the playing field and remove regulatory impediments to competition, an idea the government seems to have accepted.

…and upgrading protections for privacy

The call for broad ranging reform of our privacy laws to wrench them into the digital age is also likely to be accepted by government.

The platforms might grumble at additional privacy requirements imposed country by country without an international standard, but the proposal to work with them on the development of an enforceable code at least allows them a seat at the table, and a chance to ensure the regulations are workable.

The challenge will be to ensure that the regulatory burdens don’t disproportionately hurt small businesses and prospective entrants, the ones the ACCC wants to help.

An imminent ACCC-led reform that will help both new entrants and consumers is the Consumer Data Right, which will give consumers more control of their data and enable them to move it between suppliers.




Read more:
We can put a leash on Google and Facebook, but there’s no saving the traditional news model


The ACCC’s work on digital platforms has just begun and there is a long and bumpy road ahead. The government should give it the time and money it will need to get on with it.


Caron Beaton-Wells is host of the Competition Lore podcast, exploring competition policy and law in a digital age.

Caron Beaton-Wells, Professor, Melbourne Law School, University of Melbourne

This article is republished from The Conversation under a Creative Commons license. Read the original article.

We can put a leash on Google and Facebook, but there’s no saving the traditional news model


Amanda Lotz, Queensland University of Technology

Living with two preteens, I get almost daily requests to approve new apps. My standard response is to ask my kids to describe the app, why they want it, and how it makes money.

The last question is important, and not just to avoid in-app charges. Understanding the forces that drive the online economy is crucial for consumers and, increasingly, for citizens. All the new tools we access come at a cost, even when they seem to be free.

How technology companies make money is a good question for digital media users of any age. It lies at the heart of the Australian Competition and Consumer Commission’s inquiry into the power and profits of Google and Facebook, the world’s two most ubiquitous digital platforms.


Australians’ time spent online.
ACCC Digital Platforms Inquiry Final Report

The competition watchdog’s job was to look at how online search engines, social media and digital content aggregators wield power in media and advertising, how that undermines the viability of traditional journalism (print in particular), and what can be done about it.

Limited recommendations

Its final report makes a swag of recommendations to limit these platforms’ market dominance and use of personal data.




Read more:
What Australia’s competition boss has in store for Google and Facebook


One example is requiring devices to offer consumers a choice of search engine and default browsers. Google now requires Android phones to pre-install Google apps. This feeds a “default bias” that contributes to it being used for 95% of Australian searches.

Another is reforming Australia’s privacy laws to address the digital environment. Platforms’ “take it or leave it” policies now give consumers little choice on having their data harvested.




Read more:
Consumer watchdog calls for new measures to combat Facebook and Google’s digital dominance


But on the area of concern central to the inquiry’s establishment – the decline in journalism – the recommendations are relatively minor:

  • a code of conduct to treat news media businesses “fairly, reasonably and transparently”
  • “stable and adequate” government funding for the ABC and SBS
  • government grants (A$50 million a year) to support original local journalism
  • tax incentives to encourage philanthropic support for journalism.

The reality is that there is little governments can do to reverse the technological disruption of the journalism business.

Targeted revolution

The internet has made stark that news organisations aren’t primarily in the journalism business. The stories they produce play an incomparable social role, but the business model is to deliver an audience to advertisers.


Australian advertising expenditure by media format and digital platform.
ACCC

Social media and search give advertisers better tools to target messages to more precise groups of potential consumers. It is a phenomenally better mousetrap.

Traditional advertising is expensive and inefficient. An advertiser pays to reach a broad audience, most with no interest in what is being advertised.

Search allows advertisers to pay to reach people precisely when they are looking for something. Google knows what you are interested in, and serves up advertising accordingly. In the last quarter alone advertising in its properties (Search, Maps, Gmail, YouTube, Play Store and Shopping) made US$27.3 billion in revenue.

Social media platforms have a different model, but one no less damaging to the old newspaper business model. It’s a bit more like traditional mass media advertising, selling the attention of users to advertisers, but in a far more targeted way.

To the extent Facebook, Instagram, Twitter and so on capture your attention, and effectively monetise content made by others through sharing, they also undercut traditional news businesses.

Follow the money

No regulation can fix this. As the competition watchdog’s report notes, Australian law does not prohibit a company from having substantial market power. Nor does it prohibit a company “from ‘out-competing’ its rivals by using superior skills and efficiency”.

No one – not even the tech companies – is necessarily to blame for the technological innovation that has disrupted traditional news organisations.

To see that, as with my kids understanding how their apps make money, it’s just a case of following the money.

Amanda Lotz, Fellow, Peabody Media Center; Professor of Media Studies, Queensland University of Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Consumer watchdog calls for new measures to combat Facebook and Google’s digital dominance



Facebook and Google potentially face fresh curbs on their market power.
Shutterstock.com

Rob Nicholls, UNSW and Katharine Kemp, UNSW

The Australian Competition and Consumer Commission (ACCC) has called for “holistic, dynamic reforms” to address the online dominance of digital behemoths such as Google and Facebook.

A 600-page report, released today, makes 23 recommendations for regulating digital platforms – covering competition law, consumer protection, media regulation, and privacy.

Most of the suggested reforms are aimed squarely at countering the dominance of Facebook and Google, which the ACCC says has distorted a range of markets including advertising and media.




Read more:
ACCC wants to curb digital platform power – but enforcement is tricky


The ACCC recommends forming a new branch to deal specifically with Google and Facebook. But it doesn’t propose itself as the sole watchdog: the report also recommends a regulatory role for the Australian Communications and Media Authority (ACMA).

Meanwhile, the Office of the Australian Information Commissioner (OAIC) is called upon to develop an enforceable code to regulate platforms’ use of data. And even the Australian Tax Office will potentially be involved, as part of a proposal to introduce measures to encourage philanthropic funding of public-interest journalism.

Digital platforms with more than a million active users in Australia will be required to provide ACMA with codes to address the imbalance in the bargaining relationship between these platforms and news media businesses. These codes are expected to recognise the need for value-sharing and monetisation of news content.

Under the recommendations, ACMA would also be expected to monitor digital platforms’ efforts to identify reliable and trustworthy news, and to manage a mandatory take-down code for content that breaches copyright.

Market muscle

The ACCC report highlights the “substantial market power” enjoyed by Google and Facebook in their respective domains of web searching and social media. While it is not unlawful for firms to have this degree of power, it does mean they are likely to be subject to the (as yet untested) misuse of market power law introduced in 2017.

The ACCC is concerned that current merger laws do not go far enough, given large platforms’ ability to remove future competitive threats by simply buying start-ups outright. Such acquisitions may also increase the platforms’ access to data. The ACCC considers that either or both of these could entrench a platform’s market power.

As a result, the report recommends changes to Australia’s merger laws to expressly require consideration of the effect of potential competition, and to recognise the importance of data. It also recommends that platforms should be obliged to notify the ACCC in advance of any proposed acquisition.

This is not a substantial change to the existing law, which already allows consideration of anti-competitive effects. But it is a signal that the ACCC will be focusing on this issue.

The ACCC also wants Google to allow Australian users of Android devices to choose their search engine and internet browser – a right already enjoyed by Android users in the European Union.

Empowering consumers

The ACCC recommends substantial changes to Australian Consumer Law, to address the huge inequalities in bargaining power between digital platforms and consumers when it comes to terms of use, and particularly privacy.

The report’s most significant proposal in this area is to outlaw “unfair practices”, in line with similar bans in the US, UK, Europe, Canada, and elsewhere. This would capture conduct that falls outside existing laws governing misuse of market power, misleading or deceptive conduct, and unconscionable conduct.

This could be relevant, for example, where a digital platform imposes particularly invasive privacy terms on its users, which far outweigh the benefits of the service provided. The ACCC also called for digital platforms to face significant fines for imposing unfair contract terms on users.

The report recommends a new mandatory standard to bolster digital platforms’ internal dispute resolution processes. This would be reinforced by the creation of a new ombudsman to assist with resolving disputes and complaints between consumers and digital platforms.

Protecting privacy

The ACCC found that digital platforms’ privacy policies are long, complex, vague, and hard to navigate, and that many platforms do not provide consumers with meaningful control over how their data is handled.

The report therefore calls for stronger legal privacy protections, as part of a broader reform of Australian privacy law. This includes agreeing with the Australian Law Reform Commission on the need for a statutory tort for serious invasions of privacy.

Legal action ahead?

The ACCC also highlighted several matters on which it is considering future actions. These include the question of whether Facebook breached consumer law by allowing users’ data to be shared with third parties (potentially raising similar issues to the investigation by the US Federal Trade Commission, which this week resulted in a US$5 billion fine against Facebook), and whether Google has collated users’ location data in an unlawful way.




Read more:
Digital platforms. Why the ACCC’s proposals for Google and Facebook matter big time


In a statement, Treasurer Josh Frydenberg and federal communications minister Paul Fletcher accepted the ACCC’s overriding conclusion that there is a need for reform.

The federal government will now begin a 12-week public consultation process, and said it expects to release its formal response to the report by the end of the year.

Rob Nicholls, Senior lecturer in Business Law, UNSW and Katharine Kemp, Senior Lecturer, Faculty of Law, UNSW, and Co-Leader, ‘Data as a Source of Market Power’ Research Stream of The Allens Hub for Technology, Law and Innovation, UNSW

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Facebook is now cleaner, faster and group-focused, but still all about your data


Belinda Barnet, Swinburne University of Technology

Have you noticed your Facebook feed looks different lately?

It’s a bit more “zen”, uncluttered and faster. Instagram-like story posts are displayed first, and a separate feed allows you to keep up with the latest activity in your groups.

Someone has assembled a ring of comfy chairs in your lounge room and invited the local mums and bubs group over for hot cocoa and biscuits. Even the hearts are squishier.

Facebook hearts are now bigger and squishier.
Screen shot June 4 2019

According to Facebook’s CEO Mark Zuckerberg, it’s “the biggest change to the app and website in the last five years”.

This cosmetic change could represent the first step in Facebook’s “privacy pivot” announced in March 2019. But we’re still waiting to hear exactly what will be happening with our data.




Read more:
Privacy pivot: Facebook wants to be more like WhatsApp. But details are scarce


Pile on Facebook

Facebook has been under immense pressure from both the Federal Trade Commission in the United States, and governments around the world in the wake of a string of privacy scandals (including Cambridge Analytica).

After live-streamed terrorism in New Zealand, Jacinda Ardern is leading a global charge for regulation and oversight. The recent Christchurch Call meeting resulted in tech companies and world leaders signing an agreement to eliminate terrorist and violent extremist content online.

Everyone is piling on Facebook, even Zuckerberg’s original platform co-founder Chris Hughes.

Hughes said “it’s time to break up Facebook” and “the government must hold Mark accountable”. He was referring to the huge power Zuckerberg holds through controlling the algorithms that keep Facebook – and more recently acquired platforms Instagram and WhatsApp – ticking over. Those algorithms put Facebook’s vast body of user data to work.




Read more:
The ‘Christchurch Call’ is just a start. Now we need to push for systemic change


Putting it lightly

Zuckerberg admits that changes must be made, saying in April:

I know we don’t exactly have the strongest reputation on privacy right now, to put it lightly.

Facebook’s business model is built on harvesting platform data about its users, crunching that to generate behavioural inferences like “divorced, male, no children, interested in weight loss”, and then selling this package to advertisers.

Technology scholar Shoshana Zuboff calls the process of collecting and selling user data “surveillance capitalism”.

Privacy was never part of Facebook’s floor plan.

In its defence, it doesn’t sell identifiable data, and it has clamped down on developer access to its data.

That’s because developers are not the customer – nor are the users who are clicking on like buttons or buying yoga pants. Facebook’s customers are advertisers.

Facebook sells one product: a capacity to personalise and target ads that is unmatched by any other platform. This turned a profit of US$16 billion in the last quarter of 2018.

It seems reasonable to assume it’s going to do everything it can to protect its ability to keep collecting the raw material for that profit.

But recent questions put to Facebook by US Senator Josh Hawley reveal that Facebook is still not willing or able to share its plans on privacy relating to metadata collection and use.

In response to the senator, Kevin Martin, Vice President of US Public Policy at Facebook said:

[…] there are still many open questions about what metadata we will retain and how it may be used. We’ve committed to consult safety and privacy experts, law enforcement, and governments on the best way forward.

Chat, shop, watch … and wait

You can now easily navigate straight to Groups, Marketplace and Watch on Facebook.
Screen shot June 4 2019

At a developer conference last month, Zuckerberg outlined his proposed changes: mainly, change the focus to communities and privacy, make messaging faster and encrypted, and transform the user experience.

The square logo is now a circle. There’s a lot of white space, and someone KonMari’d the title bar.

Shopping within Facebook is prioritised through the Marketplace feed, and you can watch shows and online videos in groups through the Watch function.

Facebook Messenger loads faster, the interface is cleaner and a dating service may soon be available in Australia.

What hasn’t changed is the core product: the capacity for Facebook to collect platform data and generate behavioural inferences for advertisers.




Read more:
Why are Australians still using Facebook?




Belinda Barnet, Senior Lecturer in Media and Communications, Swinburne University of Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Facebook videos, targeted texts and Clive Palmer memes: how digital advertising is shaping this election campaign


Andrew Hughes, Australian National University

This year’s election will be the first in Australia where the parties will be advertising more on social and digital platforms than on traditional media (TV, radio, newspapers and magazines).

There are a few key reasons for this. First, social media is far cheaper, sometimes costing as little as a few cents per click. Unlike heritage media, digital and social advertising is extremely targeted, and can be run in the “dark”, so your opponents may not even be aware of the message you are pushing out.

Digital and social advertising can also be shared or even created by users themselves, further increasing the reach of a party’s messaging. This gets around the Australian Electoral Commission rules on advertising – technically they are not ads since no party is paying for them to be shared on people’s feeds.

Throw into the mix laws on political advertising – which allow parties to advertise up to and on election day on social media, but not traditional media – and we are likely seeing the first largely digitally driven election campaign in Australian political history.




Read more:
Election explainer: what are the rules governing political advertising?


Here are a few ways the parties are using advertising in the campaign so far and what makes this election unique:

What you can do with A$30 million

Among all the candidates running this year, perhaps no one has used political advertising as prolifically as Clive Palmer. This shows what money can buy.

The most recent Nielsen figures put the cost of Palmer’s ads since September at around A$30 million, though Palmer says himself he’s spent at least A$50 million. This compares to just A$16 million spent in total advertising during the last federal election, with Labor and the Coalition accounting for more than 90% of that.

From a campaign perspective, Palmer is ticking many of the right boxes: a mix of different platforms on digital and social; heritage media ads for mass market awareness featuring candidates selected from the middle; the use of memes and user-generated content; and even text messaging.

This United Australia Party ad has over 2.4 million views on YouTube thus far, making it the most viewed election ad on the platform.

Despite the ubiquity of his ads, though, Palmer is still struggling to connect with most voters. This demonstrates a very important aspect to any advertising campaign: the actual brand still needs to be seen as offering real value to voters.

The UAP has used text messaging like this one below, for example, to try to change its negative perception with voters by delivering positive campaign promises.

UAP text message advertisement.
ABC

The ‘Grim Reaper’ strategy and micro-targeting

One of the most effective ads ever done in Australia was the “Grim Reaper” AIDS awareness campaign in 1987, which showed how well “scare campaigns” and negative messaging can work, given the right context and framing. The ad’s micro-messaging was another aspect that worked so well: it personalised the issue and made it tangible to anyone sexually active.

Basically, negative messaging works on the theory that what you fear, you will avoid – or the “fight or flight response”. Negative political ads highlight the level of risk and consequence of a certain party’s policies – and then emphasise how to avoid this by not voting for them.




Read more:
Why scare campaigns like ‘Mediscare’ work – even if voters hate them


Trouble is, most ads on TV are losing their potency. As attitudes towards political messaging and brands become increasingly negative, voters are less likely to watch ads in their entirety. Many people also don’t see them as being personally relevant.

Social media, though, provides an excellent delivery mechanism for these types of messages. Digital ads can be personalised and focused on issues that voters have already expressed an interest in and therefore find relevant to their lives.

Personalised messaging from the LNP on Facebook, targeting voters in the seat of Ryan in western Brisbane.
Facebook Ad Library

Social media ads can also be altered to be even more targeted as the campaign goes on, based on voter responses. And their speed of production – taking only a matter of hours to produce and place online – allows digital advertising to do what heritage media no longer can, providing a more fluid, grassroots dynamic to campaigning.

This ad by Labor featuring Prime Minister Scott Morrison in bed with Palmer, for example, was released on social media within 24 hours of the preference deal struck between the Coalition and Palmer’s UAP.

Labor’s Facebook ad depicting Scott Morrison in bed with the UAP’s Clive Palmer over their preference dealing.
Facebook/Click here to watch the video

That said, even on social media, negative advertising is not as effective if it comes from the party alone. But when combined with information from third-party sources, such as the media, its effectiveness can increase. For example, the Liberal Party used the 10 Network image in this ad to support its claims on Labor’s tax policies.


Facebook Ad Library

Youth engagement

Youth voter enrolment is at an all-time high in Australia, driven, in part, by engagement and participation in the marriage equality plebiscite in 2017.

The major parties are aware of this and are creating ads specifically targeting this demographic on Snapchat, WhatsApp and Instagram. Some of these are “dark social” ads (meaning they can only be seen by the target market) or are user-made so as not to be subject to disclosure rules.

For more general audiences, Labor has created ads like this one on Facebook that highlight issues young voters are concerned about, such as wage increases and penalty rates. Ads like this also attempt to engage with these voters by asking them to sign petitions – a form of experiential marketing that’s proved highly effective with young audiences, as seen through platforms such as Change.org.

Labor Facebook ad inviting voters to sign a petition demanding a higher wage.
Facebook Ad Library

Groups like the Australian Youth Climate Coalition are tapping into experiential marketing by combining online advertising with a call for offline action on issues that appeal to young voters, such as climate change. Part-rock concert, part-protest, these events might remind some of the rallies that proved so popular during the Gough Whitlam era.

The AYCC is using a combination of online and offline strategies to engage with young voters.
Facebook Ad Library

The increasing influence of lobbying groups

One of the more interesting developments of this election so far is the increasing sophistication, knowledge and strategies of political lobbying groups, or Australia’s equivalent to America’s PACs.

GetUp! is one such group, collecting A$12.8 million in donations in the last 12 months alone. Among the group’s tactics are direct phone calls to voters, partly achieved through “phone parties” where volunteers freely offer their time, phones and other resources to call people in targeted electorates. GetUp! has a goal of making 1 million phone calls in the lead-up to the election.

A GetUp! video ad encouraging voters to host ‘calling parties’

Other well-funded groups, such as the right-aligned Advance Australia, are also seeking to influence the narrative in the election, particularly in electorates like Warringah, where it has released ads against Tony Abbott’s challenger, Zali Steggall.

In part to counter the influence of lobbying groups, the Australian Council of Trade Unions has launched its own advertising campaign featuring working Australians describing how hard it is to make ends meet.

The ACTU’s “Change the Government, Change the Rules” campaign.

The rise of these groups in Australian politics opens a Pandora’s Box on just who can influence elections without even standing a single candidate – an issue that’s becoming part of politics now in many Western democracies. As many in politics would know, where there is money, there is power, and where there is power, there are those who are seeking to influence it.

Andrew Hughes, Lecturer, Research School of Management, Australian National University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Anxieties over livestreams can help us design better Facebook and YouTube content moderation



Livestream on Facebook isn’t just a tool for sharing violence – it has many popular social and political uses.
glen carrie / unsplash, CC BY

Andrew Quodling, Queensland University of Technology

As families in Christchurch bury their loved ones following Friday’s terrorist attack, global attention now turns to preventing such a thing ever happening again.

In particular, the role social media played in broadcasting live footage and amplifying its reach is under the microscope. Facebook and YouTube face intense scrutiny.




Read more:
Social media create a spectacle society that makes it easier for terrorists to achieve notoriety


New Zealand’s Prime Minister Jacinda Ardern has reportedly been in contact with Facebook executives to press the case that the footage should not be available for viewing. Australian Prime Minister Scott Morrison has called for a moratorium on amateur livestreaming services.

But beyond these immediate responses, this terrible incident presents an opportunity for longer term reform. It’s time for social media platforms to be more open about how livestreaming works, how it is moderated, and what should happen if or when the rules break down.

Increasing scrutiny

With the alleged perpetrator apparently flying under the radar prior to this incident in Christchurch, our collective focus is now turned to the online radicalisation of young men.

As part of that, online platforms face increased scrutiny, and Facebook and YouTube have drawn criticism.

After the original livestream was disseminated on Facebook, YouTube became a venue for re-uploading and propagating the recorded footage.

Both platforms have made public statements about their efforts at moderation.

YouTube noted the challenges of dealing with an “unprecedented volume” of uploads.

Although it’s been reported that fewer than 4,000 people saw the initial stream on Facebook, Facebook said:

In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload […]

Focusing chiefly on livestreaming is somewhat reductive. Although the shooter initially streamed his own footage, the greater challenge of controlling the video relates largely to two issues:

  1. the length of time it was available on Facebook’s platform before it was removed
  2. the moderation of “mirror” video publication by people who had chosen to download, edit, and re-upload the video for their own purposes.

These issues illustrate the weaknesses of existing content moderation policies and practices.

Not an easy task

Content moderation is a complex and unenviable responsibility. Platforms like Facebook and YouTube are expected to balance the virtues of free expression and newsworthiness with socio-cultural norms and personal desires, as well as the local regulatory regimes of the countries they operate in.

When platforms perform this responsibility poorly (or, utterly abdicate it) they pass on the task to others — like the New Zealand Internet Service Providers that blocked access to websites that were re-distributing the shooter’s footage.

People might reasonably expect platforms like Facebook and YouTube to have thorough controls over what is uploaded on their sites. However, the companies’ huge user bases mean they often must balance the application of automated, algorithmic systems for content moderation (like Microsoft’s PhotoDNA, and YouTube’s ContentID) with teams of human moderators.




Read more:
A guide for parents and teachers: what to do if your teenager watches violent footage


We know from investigative reporting that the moderation teams at platforms like Facebook and YouTube are tasked with particularly challenging work. They seem to have relatively high staff turnover, with moderators quickly burnt out by severe workloads as they handle the worst content on the internet. They are supported with only meagre wages, and what could be viewed as inadequate mental healthcare.

And while some algorithmic systems can be effective at scale, they can also be subverted by competent users who understand aspects of their methodology. If you’ve ever found a video on YouTube where the colours are distorted, the audio playback is slightly out of sync, or the image is heavily zoomed and cropped, you’ve likely seen someone’s attempt to get around ContentID algorithms.
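Those evasion tactics exploit a basic property of fingerprint matching. A toy “average hash” sketch (an illustrative assumption only – PhotoDNA and ContentID use far more sophisticated, transformation-resistant fingerprints) shows why some edits slip through while others are still caught:

```python
def average_hash(pixels):
    """pixels: an 8x8 grid of grayscale values (0-255). Returns a 64-bit fingerprint:
    one bit per cell, set if that cell is brighter than the frame's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints; 0 means a match."""
    return bin(a ^ b).count("1")

# Original frame: left half dark, right half bright.
original = [[40] * 4 + [200] * 4 for _ in range(8)]

# A uniform brightness shift preserves each pixel's relation to the mean,
# so the fingerprint is unchanged and the upload is still caught.
brightened = [[p + 30 for p in row] for row in original]

# A heavy zoom/crop (here the bright half fills the whole frame) destroys
# the relative structure the hash depends on, so the fingerprint diverges.
cropped = [[200] * 8 for _ in range(8)]

print(hamming(average_hash(original), average_hash(brightened)))  # 0
print(hamming(average_hash(original), average_hash(cropped)))     # 32
```

Colour distortion, audio desync and cropping are all attempts to push a re-upload past this kind of distance threshold, which is why platforms must fall back on human moderators when the fingerprints fail.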

For online platforms, the response to terror attacks is further complicated by the difficult balance they must strike between their desire to protect users from gratuitous or appalling footage and their commitment to inform people seeking news through their platform.

We must also acknowledge the other ways livestreaming features in modern life. Livestreaming is a lucrative niche entertainment industry, with thousands of innocent users broadcasting hobbies with friends from board games to mukbang (social eating), to video games. Livestreaming is important for activists in authoritarian countries, allowing them to share eyewitness footage of crimes, and shift power relationships. A ban on livestreaming would prevent a lot of this activity.

We need a new approach

Facebook and YouTube’s challenges in addressing the issue of livestreamed hate crimes tells us something important. We need a more open, transparent approach to moderation. Platforms must talk openly about how this work is done, and be prepared to incorporate feedback from our governments and society more broadly.




Read more:
Christchurch attacks are a stark warning of toxic political environment that allows hate to flourish


A good place to start is the Santa Clara principles, generated initially from a content moderation conference held in February 2018 and updated in May 2018. These offer a solid foundation for reform, stating:

  1. companies should publish the numbers of posts removed and accounts permanently or temporarily suspended due to violations of their content guidelines
  2. companies should provide notice to each user whose content is taken down or account is suspended about the reason for the removal or suspension
  3. companies should provide a meaningful opportunity for timely appeal of any content removal or account suspension.

A more socially responsible approach to platforms’ roles as moderators of public discourse necessitates a move away from the black-box secrecy platforms are accustomed to — and a move towards more thorough public discussions about content moderation.

In the end, greater transparency may facilitate a less reactive policy landscape, where both public policy and opinion have a greater understanding around the complexities of managing new and innovative communications technologies.

Andrew Quodling, PhD candidate researching governance of social media platforms, Queensland University of Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.

The law is closing in on Facebook and the ‘digital gangsters’


Sacha Molitorisz, University of Technology Sydney and Derek Wilding, University of Technology Sydney

For social media and search engines, the law is back in town.

Prompted by privacy invasions, the spread of misinformation, a crisis in news funding and potential interference in elections, regulators in several countries now propose a range of interventions to curb the power of digital platforms.

A newly published UK report is part of this building global momentum.




Read more:
Why are Australians still using Facebook?


Shortly after Valentine’s Day, a committee of the British House of Commons published its final report into disinformation and “fake news”. It was explicitly directed at Facebook CEO Mark Zuckerberg, and it was less a love letter than a challenge to a duel.

The report found:

Companies like Facebook should not be allowed to behave like ‘digital gangsters’ in the online world, considering themselves to be ahead of and beyond the law.

The committee was particularly vexed by Zuckerberg himself, concluding:

By choosing not to appear before the Committee … Mark Zuckerberg has shown contempt.

Its far-reaching recommendations included giving the UK’s Information Commissioner greater capacity to be “… an effective ‘sheriff in the Wild West of the Internet’.”

The law is back in town

In December 2018, the Australian Competition and Consumer Commission (ACCC) handed down its preliminary report into the impact of digital platforms. It tabled a series of bold proposals.




Read more:
Digital platforms. Why the ACCC’s proposals for Google and Facebook matter big time


Then, on February 12, the Cairncross Review – an independent analysis led by UK economist and journalist Frances Cairncross – handed down its report, A Sustainable Future for Journalism.

Referring to sustainability of the production and distribution of high-quality journalism, “Public intervention may be the only remedy,” wrote Cairncross. “The future of a healthy democracy depends on it.”

And a week later, the Digital, Culture, Media and Sport Committee of the House of Commons issued its challenge in its final report on disinformation and “fake news”:

The big tech companies must not be allowed to expand exponentially, without constraint or proper regulatory oversight … only governments and the law are powerful enough to contain them.

How do the responses of the three reports compare?

ACCC inquiry broadest in scope

First, it’s important to note that the scope of these three inquiries varied significantly.

The ongoing ACCC inquiry, billed as a world-first and set to hand down its final report in June, is seeking to assess the impact of digital platforms on media and advertising, with a focus on news.




Read more:
Attention economy: Facebook delivers traffic but no money for news media


The Cairncross Review was narrower in intent, addressing “the sustainability of the production and distribution of high quality journalism, and especially the future of the press, in this dramatically changing market.”

And the House of Commons committee had a very direct brief: to investigate fake news. Within that brief, it chose to focus on Facebook.

As such, the three inquiries overlap substantially, but the ACCC investigation is unequivocally the broadest in scope.

Not just distribution platforms

However, all three reports land in roughly the same place when it comes to characterising these businesses. They all see digital platforms as more than just conduits of other people’s content – and this brings certain responsibilities.

The ACCC says digital intermediaries are “considerably more than mere distributors or pure intermediaries” when it comes to the supply of news and journalism.

The Cairncross Review stresses there is a “fundamental difference” between distributors and content creators.

The House of Commons committee proposes “a new category of tech company” as a legal mechanism for having digital platforms assume liability for harmful content.

Need more oversight

A related point is that all three reviews recommend digital platforms be brought more squarely into the legal and regulatory environment.

By this, they don’t just mean cross-industry laws that apply to all businesses. There is some of that – for example, adapting competition laws so certain conduct is regulated.




Read more:
Google and Facebook cosy up to media companies in response to the threat of regulation


But these inquiries also raise the prospect of platform-specific rules as part of communications regulation. How they would go about this is where the inquiries diverge.

News reliability

The ACCC has flagged the need for further work on a platforms code of practice that would bring them into the orbit of the communications regulator, the ACMA.

The platforms would be bound to the code, which would require them to badge content produced under established journalistic standards. It would be the content creators – publishers and broadcasters, not platforms – that would be subject to these standards.

In the UK, Cairncross proposes a collaborative approach under which a new regulator would monitor and report on platforms’ initiatives to improve reliability of news – perhaps, in time, moving to specific regulatory obligations.

Algorithms regulator

In Australia, the ACCC has proposed what others refer to as a new “algorithms regulator”. This would look at how ads and news are ranked in search results or placed in news feeds, and whether vertically integrated digital platforms that arrange advertising favour their own services.

The algorithms regulator would monitor, investigate and report on activity, but would rely on referral to other regulators rather than have its own enforcement powers.

Unsurprisingly, the leading digital platforms in Australia oppose the new algorithms regulator. Equally unsurprisingly, media companies think the proposal doesn’t go far enough.




Read more:
Facebook needs regulation – here’s why it should be done by algorithms


For its part, Cairncross does recommend new codes covering aspects such as the indexing and ranking of content and the treatment of advertising. These codes would be overseen by a new regulator, but they would be developed by the platforms themselves; a move to a statutory code would occur only if they proved inadequate.

In contrast to both these reviews, the House of Commons committee's Code of Ethics is concerned with "online harms". Right from the outset, it would be drawn up and enforced by a new regulator, much as Ofcom, the UK communications regulator, enforces its Broadcasting Code.

It says this would create “a regulatory system for online content that is as effective as that for offline content industries”. Its forcefulness on this is matched by its recommendation on algorithms: it says the new regulator should have access to “tech companies’ security mechanisms and algorithms, to ensure they are operating responsibly”.

Both the ACCC and Cairncross pointedly avoid this level of intervention.

However, the ACCC does raise the prospect of a new digital platforms ombudsman. Alongside its 11 preliminary recommendations, the ACCC specified nine areas for further analysis and assessment; one of these is an ombudsman to deal with complaints about digital platforms from consumers, advertisers, media companies and businesses.

Data privacy

And then there is data privacy.

This is where the ACCC and the House of Commons committee delivered some of their most significant recommendations. It’s also where regulators in other jurisdictions have been turning their attention, often on the understanding that the market power of digital platforms is largely derived from their ability to access user data.

Earlier this month, Germany's Federal Cartel Office (Bundeskartellamt) found that Facebook could no longer merge a person's data from their Instagram, Facebook and WhatsApp accounts without their explicit consent.

In Germany, the law has spoken. In Australia and the UK, it's still clearing its throat.

Sacha Molitorisz, Postdoctoral Research Fellow, Centre for Media Transition, Faculty of Law, University of Technology Sydney and Derek Wilding, Co-Director, Centre for Media Transition, University of Technology Sydney

This article is republished from The Conversation under a Creative Commons license. Read the original article.