Spying on Big Tech may work better than splitting firms up

Parmy Olson | Updated: 16 Jan 2022, 09:53 PM IST

Over the years, as social media firms gorged themselves on the data of billions to fuel vast profits, the information flow never went the other way. Now the tables are turning. A promising piece of legislation in the US Congress tackling Big Tech’s undue influence would force these companies to share data on how people use their platforms. Sure, that doesn’t have the same ring to it as ‘break up Big Tech’, but it could be more effective at stopping hate speech and political divisiveness on social media. A spotlight on what people are viewing and why could guide regulators to solutions and harness the power of public pressure. Think of the impact of Facebook whistle-blower Frances Haugen and multiply that.

Governments are engaged in an array of efforts to rein in tech giants; US antitrust regulators got a go-ahead last week to pursue a lawsuit against Meta that could lead to its breakup. That could take years, though, and doesn’t deal directly with the psychological harms caused by its sites. But under the proposed law, researchers would get anonymized user data to study in detail how hate speech, conspiracy theories and more spread across social media. Companies that fail to share such data would be penalized.

Haugen had shared thousands of internal documents with publications to show the extent to which Facebook knew of the toll Instagram was taking on the mental health of teenagers. The revelations were a bombshell. But the world can’t rely on whistle-blowers, and Meta appears to have clamped down on internal research on side effects. So outsiders need to step in.

America’s Platform Accountability and Transparency Act (PATA) would create an avenue for academics to study in detail user activity on Meta’s Facebook and Instagram; Alphabet’s YouTube; Twitter; ByteDance’s TikTok; and others. The bill also grants some access to an even wider array of parties, including non-profit organizations, according to Brandon Silverman, who is helping senators with the bill’s nuts and bolts. Silverman founded CrowdTangle, a social analytics tool that Facebook bought in 2016; he left the company last year.

The data would cover millions of people, broken down by factors such as age, approximate location, gender and race, and get matched with the content these groups are viewing. The findings should help assess how and why something like hate speech draws in users. Such concrete data could turn accepted wisdom on its head. Critics of Facebook, for instance, often say that its most troubled users fall into conspiracy rabbit holes because algorithms recommend such content. But what if Facebook’s algorithms don’t always work that way? What if people visit YouTube expressly to confirm absurd beliefs? “What’s happening is people are arriving on YouTube by seeing something on Twitter or Facebook, and they’re actually searching for it,” said Nate Persily, a professor of law at Stanford Law School who designed the framework on which PATA is based. Regulators and advocacy groups can’t pressure Facebook to change anything if they don’t know exactly why so many people have viewed QAnon or anti-vaccine posts on those sites, Persily said. That’s why evidence-gathering is so critical. “Right now we only have glimpses,” added the professor.

For all its revelations, Haugen’s research dump barely scratched the surface of the most troubling activity on social media. It couldn’t show exactly how a minority of people made posts about a stolen election go dangerously viral ahead of the 6 January insurrection in the US last year. To find the precise activity around those outliers, researchers need huge amounts of current and historical data, including information on how people hop between different social media platforms.

This would be an unprecedented glimpse into a world that social media firms have never wanted people to see. PATA, and a similar proposal in Europe that has a better chance of passing given congressional gridlock, could divulge data even more intricate than what researchers obtained from Facebook before 2018. That’s when the company shut researchers out of studying user activity on the site in the wake of the Cambridge Analytica privacy scandal. Even with the limited insights researchers could glean then, more than 130 studies on Facebook’s side effects and activities were published before that shutdown. No similar insights into user behaviour have ever been provided by YouTube or TikTok.

Online platforms, and Meta in particular, will argue that they have bent over backward to be transparent, even publishing regular transparency reports. But civil society groups and researchers have long rolled their eyes at their lack of useful detail. That’s no surprise. Few businesses would willingly reveal their toll on human well-being. But being forced to look at the problem is the first step to solving it. ©Bloomberg

Parmy Olson is a Bloomberg Opinion columnist covering technology
