You Furnish The Emotion and I’ll Furnish the Votes

“You furnish the pictures and I’ll furnish the war.”

Attributed to an 1898 cable from William Randolph Hearst to Frederic Remington, then on assignment in Cuba before the Spanish-American War.


William Randolph Hearst was one of the original purveyors of fake news in the mass media.  (You may know him as Charles Foster Kane from the Orson Welles classic Citizen Kane.  And if you don’t know Citizen Kane, make time to watch it.)

In many ways, Hearst is a historical antecedent (and perhaps a role model) for Mark Zuckerberg.  And it should not be lost on you that Hearst was elected and re-elected to the U.S. House of Representatives as a Tammany man from the old 11th District.  He ran unsuccessfully for the Democratic Party nomination for President of the United States in 1904 (which he resoundingly lost to Judge Alton B. Parker, who just as resoundingly lost the election to the incumbent, Theodore Roosevelt).

I firmly believe that Zuckerberg intends to run for President.  If it turns out that I’m wrong, you can all snicker, but if I’m right….  Why do I believe it?  Because I also believe that the hangover from the Facebook Kool-Aid kegger creates a pathological case of high-functioning techno-fabulism in which Zuckerberg believes that He Is The One to Save the World.  In short, delusions of grandeur on a global scale.  And who at Facebook will tell him he isn’t The One?  He’s hired political hacks to advise him, ones who have worked on the campaigns of successful candidates like President Obama.  Take this to the bank:  Candidates matter.  And this:  Lightning doesn’t strike twice.  These guys will be dining out on Barack Obama for many years to come.

Why do we care what Zuckerberg’s plans are?  Aside from the nausea induced by the possibility of a royalty deadbeat and purveyor of fake news sitting in the White House…ahem…there is actually a much more sinister reason, one well stated this week in a USA Today op-ed by Roger McNamee, who runs Elevation Partners, a venture capital outfit I have a lot of time for.

I invested in Google and Facebook years before their first revenue and profited enormously. I was an early adviser to Facebook’s team, but I am terrified by the damage being done by these Internet monopolies….

Facebook and Google get their revenue from advertising, the effectiveness of which depends on gaining and maintaining consumer attention. Borrowing techniques from the gambling industry, Facebook, Google and others exploit human nature, creating addictive behaviors that compel consumers to check for new messages, respond to notifications, and seek validation from technologies whose only goal is to generate profits for their owners….

How does this work? A 2013 study found that average consumers check their smartphones 150 times a day. And that number has probably grown. People spend 50 minutes a day on Facebook. Other social apps such as Snapchat, Instagram and Twitter combine to take up still more time. Those companies maintain a profile on every user, which grows every time you like, share, search, shop or post a photo. Google also is analyzing credit card records of millions of people….

Consider a recent story from Australia, where someone at Facebook told advertisers that they had the ability to target teens who were sad or depressed, which made them more susceptible to advertising. In the United States, Facebook once demonstrated its ability to make users happier or sadder by manipulating their news feed. While it did not turn either capability into a product [yet, that we know of], the fact remains that Facebook influences the emotional state of users every moment of every day. Former Google design ethicist Tristan Harris calls this “brain hacking.”

Roger McNamee just lost his membership in the Silicon Valley Tech Bros Club for violating the first rule of Fight Club.  But having said that, please appreciate the sheer balls it takes to do what Roger did in that op-ed (and you should read the whole thing for full effect).  I firmly believe he’s correct, too.

This makes Facebook’s manipulation of innocent customers actually worse than Ford and the Pinto’s exploding gas tank.  Ford’s acceptance of the cost/benefit analysis of fixing the gas tank was cold-blooded, and someone should have gone to jail.  But it wasn’t as if Ford were trading on the gas tank as a feature.  What Roger is saying is what we all have suspected:  these people at Facebook–starting with POTUS wannabe Zuckerberg–know exactly what they are doing.  And what they are doing demonstrates that data lords are not that different from drug lords.  Except that drug lords never dreamed they could tap into a junkie market the size of the whole planet.

If you doubt the addiction, try this experiment.  Try only using your phone for phone calls for 36 hours.  Just check your email on your computer.  No email, Facebook, Twitter, Google on your phone for 36 hours.  See how you react.

The Diagnostic and Statistical Manual of Mental Disorders (“DSM”) addresses a number of these Internet-based addictions, including Internet gambling disorder and Internet addiction disorder, treating Internet gambling disorder as a manifestation of the larger category of gambling disorder.  So when Roger says Facebook and Google “borrow” techniques from the gambling industry, he means they are playing with well-known and documented addictive pathologies for profit.  And just imagine the private research that Facebook is able to conduct with over a billion users.  At some point, with a large enough sample size, the predictive power of probability may as well be certainty.
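That last point about sample size is basic statistics: the margin of error of a simple yes/no measurement shrinks in proportion to one over the square root of the sample size, so at Facebook scale the uncertainty all but disappears.  A minimal Python sketch of the arithmetic (the 0.5 proportion and the 1.96 multiplier for a 95% confidence level are conventional worst-case assumptions, not figures from Facebook):

```python
import math

def standard_error(p: float, n: int) -> float:
    """Standard error of an observed proportion p from a sample of size n."""
    return math.sqrt(p * (1 - p) / n)

# Worst-case proportion (p = 0.5) at a conventional 95% confidence level.
for n in (1_000, 1_000_000, 1_000_000_000):
    margin = 1.96 * standard_error(0.5, n)
    print(f"n = {n:>13,}: margin of error ~ +/-{margin:.5%}")
```

At a thousand users the margin of error is around three percentage points; at a billion users it is a few thousandths of a percentage point, which is the sense in which probability "may as well be certainty."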

In fact, Facebook data lord Adam Kramer (more about him shortly) said in an interview that the large user base was one of the reasons he joined Facebook:

Q: Why did you join Facebook?

A: Facebook data constitutes the largest field study in the history of the world. Being able to ask–and answer–questions about the world in general is very, very exciting to me. At Facebook, my research is also immediately useful: When I discover something, we can use this to make improvements to the product. In an academic position, I would have to have a paper accepted, wait for publication, and then hope someone with the means to usefully implement my work takes notice. At Facebook, I just message someone on the right team and my research has an impact within weeks if not days.

Q: What are some of the interesting questions you’ve answered since you’ve been here?

A: Do emotions spread contagiously? What do the words we choose have to say about how we are and who we are?

DSM-5 (the 2013 edition of the DSM) includes Internet addiction disorder in its appendix, which is where pathologies under study start out in the psychiatric definitional world.  This essentially means that the DSM’s editors think there is something there, but that there hasn’t yet been enough documentation to arrive at a uniform definition.

Roger cites this 2014 study commissioned by Facebook, “Experimental evidence of massive-scale emotional contagion through social networks,” written by Adam D. I. Kramer of Facebook’s “Core Data Science Team” and two academics from Cornell.  (Cornell was one of the first campuses outside Harvard to adopt the early version of Facebook.)

The study concluded:

Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. Emotional contagion is well established in laboratory experiments, with people transferring positive and negative emotions to others….

In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred.

These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. This work also suggests that, in contrast to prevailing assumptions, in-person interaction and nonverbal cues are not strictly necessary for emotional contagion, and that the observation of others’ positive experiences constitutes a positive experience for people.

Now what ever do you suppose that POTUS aspirant Zuckerberg did with this emotional contagion methodology, hmmm?  And remember, the only reason we know about this study at all is because it was published.  My bet is that the two academics probably demanded that the research be published.  I wonder which particular emotion Facebook was interested in measuring?

The twist on this particular study is that it was done on the sly.  Had the study been conducted purely in an academic environment, it would have to have been approved as human subject research by the Cornell Institutional Review Board.  Subjects are given the opportunity to opt out.  This is exactly the kind of thing that Mr. Kramer evidently found oh so frustrating about academic life.

The study’s subjects were Facebook customers–well, users anyway–and none of them were told that they were being observed, much less offered an opportunity to opt out of being studied like lab rats.  That made the study the Menlo Park version of The Truman Show.

I’m not the only one who found it disturbing.  The Proceedings of the National Academy of Sciences published the paper, but also found it necessary to include an Editorial Expression of Concern regarding the underhanded nature of not informing the subjects that they were subjects.

Questions have been raised about the principles of informed consent and opportunity to opt out in connection with the research in this paper. The authors noted in their paper, “[The work] was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.” When the authors prepared their paper for publication in PNAS, they stated that: “Because this experiment was conducted by Facebook, Inc. for internal purposes, the Cornell University IRB [Institutional Review Board] determined that the project did not fall under Cornell’s Human Research Protection Program.” This statement has since been confirmed by Cornell University.

Obtaining informed consent and allowing participants to opt out are best practices in most instances under the US Department of Health and Human Services Policy for the Protection of Human Research Subjects (the “Common Rule”). Adherence to the Common Rule is PNAS policy, but as a private company Facebook was under no obligation to conform to the provisions of the Common Rule when it collected the data used by the authors, and the Common Rule does not preclude their use of the data. Based on the information provided by the authors, PNAS editors deemed it appropriate to publish the paper. It is nevertheless a matter of concern that the collection of the data by Facebook may have involved practices that were not fully consistent with the principles of obtaining informed consent and allowing participants to opt out.

Quick—when you signed up for your Facebook account, did you know you were agreeing to be a lab rat?

I am not an authority on IRBs and human subject research, but I have encountered it in the legal context.  My impression of that encounter would lead me to believe that Cornell got it wrong–the study should have been submitted to their IRB and should have followed the “Common Rule” which basically says you have to tell people they are being studied and allow them to opt out.  Based on the unusual editorial comment from the Proceedings of the National Academy of Sciences, they have concerns, too, so it’s not just Roger and it’s not just me.

So here’s the question:  What if a candidate for President controlled “the largest field study in the history of the world”?  What if that candidate used that data for polling at a minimum and in an effort to control public opinion in an extreme case, all based on “Facebook’s Data Use Policy”?   How comfortable are you that any result that candidate produces isn’t somehow tainted by the exploitation of addictive behaviors that Roger McNamee describes in his op-ed?

Who needs Tammany when you’ve got the Zuck?
