
Facebook is working hard on technology that would allow it to use encrypted information as a part of its targeted advertising. That's according to reporting from The Information, which noted that Facebook has been hiring experts in homomorphic encryption.
That technology is meant to allow data analysis on encrypted information without ever revealing or exposing its contents. In theory, that would let medical researchers, for example, search for rare diseases across wide sets of data without compromising patient privacy. It could also enable fraud detection without decrypting account information, since decrypting that information would leave it more vulnerable to hackers.
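To make the idea concrete, here is a minimal sketch of the simpler, partially homomorphic case, using the open-source python-paillier ("phe") library. The purchase amounts and the scenario are hypothetical, and the fully homomorphic schemes Facebook is reportedly exploring are far more elaborate; this is just the core concept, that you can compute on data you cannot read.

```python
# Illustrative sketch only: Paillier encryption is *partially* homomorphic,
# meaning a party holding only ciphertexts can add them together without
# ever seeing the underlying numbers. (Hypothetical values; not Facebook's
# actual system.) Requires: pip install phe
from phe import paillier

# The data owner generates a keypair and encrypts some private values.
public_key, private_key = paillier.generate_paillier_keypair()
purchases = [19.99, 42.50, 7.25]
encrypted_purchases = [public_key.encrypt(x) for x in purchases]

# A third party holding only the public key and the ciphertexts can still
# compute on them, here summing the encrypted values, without decrypting.
encrypted_total = sum(encrypted_purchases, public_key.encrypt(0))

# Only the holder of the private key can decrypt the result.
print(private_key.decrypt(encrypted_total))  # expected: 69.74, the plaintext sum
```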
It's not surprising that Facebook would be interested in the technology, though, as The Information points out, it's relatively new to the field. Facebook's entire business model is built on its ability to gather as much information as it can about its users' activity, interactions, and conversations, and then use that information to show what it calls "personalized ads."
End-to-end encryption, like the kind used in Facebook-owned WhatsApp, makes it impossible for the social media giant to read the contents of those messages and mine them for data. It's one of the reasons Facebook hasn't been able to monetize WhatsApp the way it does the Facebook app. But, to users, that's a benefit, not a problem to be solved.
Besides, I can't be the only one who thinks Facebook is missing an important piece here. Facebook seems to assume that its customers are interested in encryption solely to keep their information safe from cyber threats. To be fair, I've never met anyone who wants hackers to get into their account or snoop on their private messages.
The thing is, the same is true of Facebook itself. Most of Facebook's users, especially those using WhatsApp, would very much like to keep their personal information private from Facebook. And I've never met anyone who thought that what Facebook, or any of its apps, really needs is more ads.
More importantly, coming up with a way to keep information encrypted while still letting Facebook use it for ad targeting defeats the point entirely. Sure, I can understand why Facebook would want to do this, but the company clearly isn't thinking about what actually matters to users, which isn't the appearance of privacy but actual privacy.
Privacy is something Facebook has talked about many times. I remember hearing Erin Egan, the company's VP of policy, talk at CES in 2019 about how everything Facebook does is with "privacy by design." There were 200 or so people in the room, all of whom audibly laughed, because no one outside of Facebook believes that's true.
That reaction is quite revealing. Facebook clearly thinks that if it gives people encryption to protect their data from "bad guys," people will be fine with the company doing whatever it wants with the information it collects. Never mind that a handful of leaks over the past few years demonstrate that it's not even particularly good at that.
There's a bigger issue here, to be honest. Facebook isn't capable of seeing itself the way the rest of the world does. Facebook's founder, Mark Zuckerberg, can't imagine that Facebook could be going too far because he sees the company only through his own best intentions, as the version that exists in his mind, one with a noble purpose: "to give people the power to build community and bring the world closer together."
The problem is, that isn't how other people see Facebook. Most people, despite using Facebook as a convenient place to keep in touch with friends or family, see it as a place that foments division by amplifying divisive content and misinformation. They see it as a place that "must be listening to my conversations," because of the "creepy ads" it shows in their news feed.
Those ads, driven by gathering as much information as possible, are highly profitable, and it isn't hard to see why they would distort Facebook's version of reality. Money has a way of doing that, and there's simply too much money to be made collecting personal data. It really doesn't matter if you insist, as Egan did, that it's all done in a way where "privacy is protected" on Facebook.
No one outside of Facebook thinks that's what privacy means. In fact, when it comes to abusing privacy, Facebook is one of the bad guys. If only the company understood that people want their data kept private from everyone, including Facebook.