Must Read: Weapons of Math Destruction

If you’re looking for longreads…you know, books…for your summer reading list, I’d strongly suggest Weapons of Math Destruction by Cathy O’Neil.  The author is a confirmed quant who has extraordinary insight into both the beneficial and the destructive power of algorithms.

Here’s an excerpt:

The math-powered applications powering the data economy [are] based on choices made by fallible human beings.  Some of these choices were no doubt made with the best intentions.  Nevertheless, many of these models encoded human prejudice, misunderstanding, and bias into the software systems that increasingly managed our lives.  Like gods, these mathematical models were opaque, their workings invisible to all but the highest priests in their domain:  mathematicians and computer scientists.  Their verdicts, even when wrong or harmful, were beyond dispute or appeal.  And they tended to punish the poor and the oppressed in our society, while making the rich richer.

I came up with a name for these harmful kinds of models:  Weapons of Math Destruction, or WMDs for short.

Needless to say, the one algorithm that affects all of us is Google’s search WMD.  What Google shows you in search results, how much Google knows about you, and how much that knowledge shapes which results you see are all examples of mathematical models that encode human prejudice and make the choices the highest priests at Google decide you would want, based on inputs scraped from your Gmail, Google Voice, YouTube viewing, and what you search for online, among other things.

Facebook knows a phenomenal amount of information about you, profiles you, and tracks your every move on Facebook as well as off Facebook to the extent possible.  And remember–if Mark Zuckerberg ever runs for President, we will have, for the first time, a true data lord as a candidate in the campaign.

When Spotify programs a playlist, they very consciously use scraped data about your listening habits and those of other Spotify users to create an algorithm that includes–and excludes–recordings.  We’re a long way from “it has a nice beat and you can dance to it” now.

When Amazon lets you search books they’ll only show you all the search results if you “sign in”–meaning identify yourself to Amazon’s algorithm.  And Alexa?  Ah, Alexa.  A little box that sits in the corner listening to you.  Google may send cars around to take pictures of your house and put them on the Internet, but Amazon’s algorithms actually run inside your house.

Algorithms, especially Google’s algorithms and what Facebook includes as news, can have a profound effect on democracy itself.  When Dr. Robert Epstein first wrote in 2013 about how Google’s algorithm could rig elections (a subject he also discussed on PBS NewsHour), he was frequently attacked as something of a quack by Google executives, who denied that such a thing was even possible.  Given the sudden discovery of fake news, the secret workings of Google’s algorithm in shaping what we read have taken on some pretty real interest.  I think this background makes Cathy O’Neil’s writing all the more compelling.

Here’s a talk on the subject she gave at the Personal Democracy Forum: