Commentary: Google should not be allowed to run experiments on users without their knowledge

It was recently discovered that Google modified search results for some users in Australia. This isn’t the first time Google has been caught experimenting on users without adequate disclosure, says a communication researcher.

A Google search bar. (Photo: Unsplash/@charlesdeluvio)

BRISBANE: On Jan 13, the Australian Financial Review reported that Google had removed some Australian news content from search results for a subset of local users.

Speaking to The Guardian, a Google spokesperson confirmed the company was “running a few experiments that will each reach about 1 per cent of Google Search users in Australia to measure the impacts of news businesses and Google Search on each other”.

So what are these “experiments”? And how concerned should we be about Google’s actions?

READ: Commentary: Is news worth a lot or a little? Google and Facebook want to have it both ways

ENGINEERING OUR ATTENTION

Google’s experiment (which is supposed to run until early February) involves displaying an “alternative” news website ranking for certain Australian users – at least 160,000, according to The Guardian.

A Google spokesperson told The Conversation the experiment didn’t prevent users in the test group from accessing a news story. Rather, they would not discover the story through Search and would have to find it another way, such as directly on a publisher’s website.

Google’s experiment is a form of “A/B testing”, which classically involves dividing a population randomly in half – into groups A and B – and subjecting each group to a different “stimulus”.

READ: Commentary: Is this the end of Google as we know it?

For example, in the case of web design, the two groups may be served different versions of a page to test changes to the layout, colour scheme or any other element.

Performance in A/B testing is judged on a range of factors, such as which links are clicked first, or the average time spent on a page. If group A perused the site longer than group B, the modification tested on group A may be considered favourable.
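
To make the mechanics concrete, here is a minimal, hypothetical sketch in Python of how such a test might work. The user IDs, dwell times and the simple 50/50 split rule are all invented for illustration – real platforms log actual behaviour rather than simulating it:

```python
import random
import statistics

# A minimal, hypothetical A/B test: users are bucketed into two groups,
# each group is served a different page variant, and average time on
# page is compared. All numbers below are simulated for illustration.

def assign_group(user_id: int) -> str:
    """Deterministically split users 50/50 into groups A and B."""
    return "A" if user_id % 2 == 0 else "B"

dwell_times = {"A": [], "B": []}
for user_id in range(10_000):
    group = assign_group(user_id)
    # Pretend variant A holds attention slightly longer on average;
    # in a real test this would be measured from logs, not simulated.
    mean_seconds = 45.0 if group == "A" else 42.0
    dwell_times[group].append(max(0.0, random.gauss(mean_seconds, 15.0)))

mean_a = statistics.mean(dwell_times["A"])
mean_b = statistics.mean(dwell_times["B"])
print(f"Group A: {mean_a:.1f}s on page | Group B: {mean_b:.1f}s")
# If group A consistently stays longer, the variant it was shown may be
# judged favourable and rolled out to all users.
```

Because the group split is computed silently from a user’s ID, individuals have no way of knowing which variant they were served – which is precisely why disclosure matters.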

In Google’s case, we don’t know the motivation behind the tests. But we do know a small subset of users received different results to the majority and were not alerted.

(Photo: Unsplash)

The experiment has resulted in the promotion of dubious news sources – some known to publish disinformation (false content that intends to mislead) and misinformation (false claims spread regardless of intent) – over trusted ones.

When asked about this ranking, Google’s spokesperson said it was a “single anecdotal screenshot” and the experiment didn’t “remove results that link to official government departments and agencies”.

INTENT TO MANIPULATE

A/B testing is a widespread practice. It can range from being fairly benign – such as to determine the best location for an advertisement banner – to much more invasive, such as Facebook’s infamous mood experiment.

In January 2012, Facebook conducted an experiment on about 700,000 users without their knowledge or explicit consent. It adjusted users’ news feeds to artificially alter the balance of positive and negative content each user saw.

READ: Commentary: Imagine a world with more than one Facebook. Here’s why you can’t

One reported aim, according to Facebook’s own researchers, was to examine whether emotional states could spread from user to user on the platform. Results were reported in the Proceedings of the National Academy of Sciences.

Following the study’s publication, Facebook’s “experiment” was widely condemned by academics, journalists and the public as ethically dubious: it set out to manipulate users’ emotions and did so without obtaining informed consent.

Similarly, it’s unlikely users caught in the midst of Google’s Australian news experiment would realise it.

And while the direct risk to those being tested may seem lower than with Facebook’s mood experiment, tweaking news results on Google Search introduces its own set of risks. As research has shown, platforms and news media both play a large role in spreading conspiracy theories.

READ: Commentary: We are living in a golden age of ignorance

NO EXCUSE FROM SCRUTINY

Google tried to downplay the significance of the experiment, noting that it conducts “tens of thousands of experiments in Google Search” each year.

But this doesn’t excuse the company from scrutiny. If anything, it’s even more concerning.

A man looking frustrated in front of a laptop. (Photo: Unsplash/Tim Gouw)

Imagine if a police officer pulled you over for speeding and you said: “Well, I speed thousands of times each year, so why should I pay a fine just this one time I’ve been caught?”

If this is just one experiment among tens of thousands, as Google has admitted, in what other ways have we been manipulated in the past? Without basic disclosures, it’s difficult to know.

This isn’t the first time Google has been caught experimenting on users without adequate disclosure. In 2018, the company released Google Duplex, a speech-enabled digital assistant that could purportedly make restaurant and other personal service bookings on a user’s behalf.

READ: Commentary: Bad news. Artificial intelligence is biased

In the Duplex demos, Google played audio of an AI-enabled speech agent making bookings via conversations with real service workers. What was missing from the calls, however, was a disclosure that the agent opening the call was a bot, not a human.

Critics questioned the deceptiveness of the technology, given its mimicry of human speech.

Google’s controversial dismissal in December of world-leading AI ethics researcher Timnit Gebru (former co-lead of its ethical AI team) cast a further shadow over the company’s internal culture.

READ: Commentary: Artificial intelligence and automation would actually benefit Singapore

WHAT NEEDS TO CHANGE

Digital media platforms including Google, Facebook, Netflix and Amazon (among others) exert enormous power over our lives. They also have vast political influence.

It’s no coincidence Google’s news ranking experiment took place against the backdrop of the escalating news media bargaining code debate, wherein the federal government wants Google and Facebook to negotiate with Australian news providers to pay for using their content.

Google’s spokesperson confirmed the experiment is “directly connected to the need to gather information for use in arbitration proceedings, should the code become law”.

READ: Google says it will shut search engine in Australia if forced to pay for news

A smartphone with the Google app icon is seen in front of a displayed Australian flag in this illustration taken on Jan 22, 2021. (Photo: REUTERS/Dado Ruvic/Illustration)

While users benefit from the services big tech provides, we need to appreciate we’re more than mere consumers of these services. The data we forfeit are essential input for the massive algorithmic machinery that runs at the core of enterprises such as Google.

The result is what digital media scholars call an “algorithmic culture”. We feed these machines our data and in the process tune them towards our tastes. Meanwhile, they feed us back more things to consume, in a giant human-machine algorithmic loop.

READ: Commentary: Confidence to face an AI-dominated future requires preparing Singaporeans for jobs not yet created

Until recently, we have largely been uncritical participants in these algorithmic loops and experiments, willing to use “free” services in exchange for our data. But we need to rethink our relationship with platforms and hold them to a higher standard of accountability.

Governments should mandate minimum standards of disclosure for platforms’ user testing. A/B testing by platforms can still be conducted properly with adequate disclosures, oversight and opt-in options.

In the case of Google, to “do the right thing” would be to adopt a higher standard of ethical conduct when it comes to user testing.

Listen to a lawyer and a media professor break down WhatsApp's new terms of service on CNA's Heart of the Matter podcast:


Daniel Angus is Associate Professor in Digital Communication in the School of Communication at Queensland University of Technology. This commentary first appeared on The Conversation.


Source: CNA/el