I recently learned of people who think that Facebook is listening to them through the microphones in their phones. This weekend someone told me a story about how she was reading a book out in nature, and when she got back to her car, her phone was recommending that very book to her. Another person told of mentioning to her business partner a thing they needed to get, not over email but in person, and soon after getting ads for it. Everyone seems to have a story like this. People think Facebook must be listening to them, because it just seems like magic. The algorithms at Facebook, it turns out, collect information to predict your interests, your wants, and your concerns, and target ads to you for just what you want at just the right time, so that people feel the only explanation is that Facebook is listening. But Facebook isn’t listening to you. As an article in Wired argued about six months ago, they don’t have to. You are giving them the information. They target ads to you the way mind readers get people to reveal things through suggestive questions. Because we can’t see the algorithms and can’t tell how they work, we think it must be magic. And for most of us, it feels creepy.
In this regard, Facebook is a lot like the market. We put information into the market by buying things at a certain price, by buying cheaper versions of the same thing or more expensive versions, by deciding to spend our money on travel and food rather than shoes or clothes, by leaving one job for another, by turning down jobs that don’t pay enough or taking jobs that are below our credentials, by renting the too-small apartment for too much money, or by passing up the small apartment when the rent is too high. We think we as individuals have made an agreement with the landlord, or with the people at Macy’s, or with our boss, not unlike the way we think that Facebook is listening to us. We think it is about individuals making decisions, but far less freedom and choice is involved than we suppose. The market collects the information to sell us the goods at the price point it expects we will buy, and we buy them and think we have made decisions, when in fact the market has collected masses of information to give us what we want at the price we are willing to pay.
The magic of Facebook creeps us out. But this creepiness should be extended to the market as well, if not more so. The collective cringe at Facebook’s capacity to target us with the things we didn’t even know we wanted should be extended to the market’s work of giving us the things we want at the price we are willing to pay for them. Instead of supposing that this is the happy magic of the market, we should find the magic cringe-worthy. The cringe at Facebook betrays the truth of the impersonal determination of our desires. What’s surprising is only that we don’t also cringe at the market. The cringe at Facebook is the recognition that we are getting played. If we think that’s less the case with the market, it is only because its workings are even more opaque to the everyday worker and consumer than Facebook’s are.
In response to the recent revelations that Facebook’s data became available to other vendors, many people are calling for regulation of Facebook. Zeynep Tufekci of the University of North Carolina offered guidelines this week on NPR for how Facebook could become a force for good community instead of for feeding consumption. Cathy O’Neil has similarly argued for how to make Facebook a place for community, beginning with making its algorithms more transparent. Facebook itself is rushing to show how it plans to better regulate itself in the face of grossly underinformed U.S. Senators, who were unable to grasp either the workings of Facebook or the implications of the current situation. But again, this interest in regulating Facebook, not unlike the move to regulate previous utilities like telephones, should point to a greater willingness to regulate the market. If we wish to regulate Facebook because we think that algorithms cannot tell us what is good and might themselves need human input and human restraints, then we should be able to see how the market similarly cannot tell us what is good and needs human input and human restraints.
Arguments against human interference in the market have held that human beings can never have enough information to determine the best distribution. Only markets can. This is exactly the situation Facebook is in. They have the most information. And yet our response to that is to cringe. To the extent that you cringe at Facebook’s vast information about us and the decisions that follow from it, you should cringe at the notion that the market has more information and can better determine and organize our lives. This operation is not freedom from tyranny; it is the tyranny of market forces absent human engagement and deliberation.