Everything is not awesome at 1 Hacker Way. No, the House of Zuck is in serious trouble over many aspects of the business model that Sheryl Sandberg perfected for Facebook. However, Facebook has been very good at keeping attention focused on what third parties have done with Facebook’s data, whether that use violated Facebook’s terms of service, and all the usual overly legalistic twisting and turning to avoid legal liability that we are so familiar with.
The real question, of course, is not so much what some app developers did with “Facebook’s data” but rather what Facebook was doing with your data and how Facebook profits from creating an addictive product. It was only a matter of time before this came up, and it’s unfortunate that it came up as Facebook tries to get the “royalty deadbeat” monkey off their back and onto Twitter’s–more on that later.
But as the Guardian reports, it was Sheryl Sandberg–nowhere to be seen in the current crisis–who developed Facebook’s stalker-esque business model that made Bentham’s panopticon look like a high school lark.
Under Sandberg’s leadership, an ad model that took advantage of Facebook’s social graph emerged, starting with “engagement ads” that invited users to “like” the page of an advertiser and interact with the brand. Later, Facebook developed “custom audiences”, allowing external advertisers to merge the data they had about individuals with Facebook’s data.
This meant companies could micro-target their existing customers on the platform, layering their own customer data with Facebook’s invaluable information about likes, friends and biographical material.
It is this exploitation of your Facebook addiction that makes them rich–just ask Sean Parker. (“‘God only knows what it’s doing to our children’s brains,’ Parker said.”)
But it also meant that Facebook’s “product” was you. And it meant that they needed you good and hooked–“addicted,” as Sean Parker has said–to all those likes and comments and comments on comments, to keep you working for them for free in the great tradition of drug-fueled manipulators, as Professor Alter demonstrates in his very important book Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked.
Spotify has also had recent problems with users’ privacy. They came up with the usual excuse: it’s all for your benefit. It’s not that we want to sell your data out the back door; it’s that we want to give you a better user experience.
No one believes this, of course. Although often maligned for using music as a loss leader for other high-margin products, at least Apple has a product that is quite distinct from the music they sell and a brand that has always had extremely high integrity. Unlike Google, Apple has no nonprosecution agreement for violating the Controlled Substances Act–the one that directly implicated Sheryl Sandberg during her Google days.
In an upcoming interview with MSNBC, Apple CEO Tim Cook makes the key point–the major difference between data scrapers and people who actually have a product:
“If our customer was our product, we could make a ton of money. We’ve elected not to do that….We don’t subscribe to the view that you have to let everybody in who wants to or if you don’t you don’t believe in free speech….Because we’re like the guy in the corner store. What you sell in that store says something about you.”
But to focus solely on what Cambridge Analytica did with Facebook’s data is to accept the premise that Facebook wants you to accept. The real question is: what is Facebook doing with your data? Because it is your data. Facebook may be able to flash its terms of service in your face, but prosecutors are going to look at that TOS with a jaundiced eye once the nasty stuff starts to come out.
For example, Buzzfeed reports on a memo written by Facebook senior manager Andrew Bosworth:
“We connect people. Period. That’s why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends. All of the work we do to bring more communication in. The work we will likely have to do in China some day. All of it,” VP Andrew “Boz” Bosworth wrote.
“So we connect more people,” he wrote in another section of the memo. “That can be bad if they make it negative. Maybe it costs someone a life by exposing someone to bullies.
“Maybe someone dies in a terrorist attack coordinated on our tools.”
I’ve always said that there is a “Pinto memo” out there somewhere at Facebook and Google and that Big Tech is going to get taken down by a Jeffrey Wigand-style whistleblower. (See Grimshaw v. Ford Motor Co., 119 Cal.App.3d 757 (1981) and the Tobacco Master Settlement Agreement (1998).) The downfall will be over the addiction issue, which is, of course, directly tied to the data issue, which is tied to the indifference issue–the amoral indifference to “maybe someone dies in a terrorist attack coordinated on our tools.”
I often get criticized for saying that these are bad people. Actually, it can’t be said enough. And they’re not that different from other bad people we’ve dealt with over the years.
What also can’t be said enough: Where was the board?