In the new “Steve Jobs” movie (the new new one, with Fassbender, not Kutcher), there’s a running debate between Jobs and his ex-girlfriend Chrisann Brennan over whether or not Jobs is the father of her child – Jobs says no, Brennan says yes. At one point, Brennan attacks Jobs for a quote he gave TIME magazine in 1983: “28 percent of the male population of the United States could be the father.” Ouch.
Jobs defends himself by proclaiming that he used some algorithm to get that statistic. But the quote’s implication still stands, obviously. Jobs could try to hide behind this “algorithm,” but at the end of the day he still essentially called his ex a slut in one of the nation’s largest news publications.
As I read William G. Thomas’ “Computing and the Historical Imagination”, my mind returned to this part of the film. In particular, Jobs’ remarks strike me as awfully similar to Time on the Cross: The Economics of American Negro Slavery, which Thomas cites as an early example of computational methods colliding with and fueling historical argument. Thomas explains that Time on the Cross and its authors received intense criticism not just for the accuracy of their data, but also for their arguments, which seemed to paint slavery in a much softer light.
Certainly, computers have a crystal-ball aura about them that makes hiding behind their predictions incredibly tantalizing. Now more than ever, it is easy to feed data into a given program or website and receive, seconds later, some output that we can immediately spin into an argument. Often, the mere fact that the computer – the pinnacle of exactness and precision – handled the work is enough for us to accept its output without a second glance. Or, as Kim wrote last week, that vague sense of digital creations being inherently “different” and unique gives them an air of authority all on its own. But the real danger, I think, comes not when we produce faulty data, but when we position arguments as the product of a computer or an algorithm so that we might absolve ourselves of our responsibility for them.