Not long ago, Zeynep Tufekci, a sociologist who studies social media, wrote that she wanted to pay for Facebook. More precisely, she wants the company to offer a cash option (about twenty cents a month, by her calculation) for people who value their privacy but still want a rough idea of what their friends’ children look like. In return for Facebook agreeing not to record what she does and not to show her targeted ads, she would pay roughly what the company makes from selling the ads she sees right now. Not surprisingly, her request seems to have been ignored. But the question remains: just why doesn’t Facebook want Tufekci’s money? One reason, I think, is that accepting it would expose the arbitrage scheme at the core of Facebook’s business model, and the ridiculous degree to which people undervalue their personal data.
Since the late eighteenth century or so, there have been three main business models in entertainment and media: selling content, selling advertisements, or both. Some media are driven predominantly by payment (books, movies, and Netflix), others by advertisements (Google and broadcast television), and some rely on both (cable, newspapers, and magazines). Alongside the classic models, since the nineteen-nineties or so there has been a fourth way: giving your customers stuff in exchange for their personal data, which you then use to make money. Facebook is not the only company that operates this way, but it is the champion, widely assumed to have more data than anyone else. That data is useful for advertising, which is Facebook’s main source of revenue. But the data is also an asset. Facebook’s two-hundred-and-seventy-billion-dollar valuation (the company made a profit of three billion dollars last year) rests on some faith that piling up all of that data has value in and of itself. It’s like a virtual Fort Knox—with a gold mine attached to it. One reason Mark Zuckerberg is so rich is that the stock market assumes that, at some point, he’ll figure out a new way to extract profit from all the data he’s accumulated about us.
Seen this way, Facebook’s lack of interest in Tufekci’s proposal makes sense (at least for Facebook). For the most valuable innovation at the heart of Facebook was probably not the social network (Friendster thought of that) so much as the creation of a tool that convinced hundreds of millions of people to hand over so much personal data for so little in return. As such, Facebook is a company fundamentally driven by an arbitrage opportunity: the difference between what Facebook gets from our data and what it costs to provide people with a place to socialize. That arbitrage might evaporate in a world of rational payments. If we were smart about the accounting, we’d be asking Facebook to pay us.
The trick is that most people think they are getting a good deal out of Facebook. We think of Facebook as “free,” and, as marketing professors explain, “consumers overreact to free.” Most people don’t feel that they are actually paying when the payment is personal data and there is no tangible sensation of having handed anything over. If you give each of your friends a hundred dollars, you might be out of money and have a harder time buying dinner. But you can hand over your personal details or photos to a hundred merchants without feeling any poorer.
What does it really mean, then, to pay with data? Something subtler is going on than with more traditional means of payment. Jaron Lanier, the author of “Who Owns the Future?,” sees our personal data as not unlike labor: you don’t lose it by giving it away, but if you don’t get anything back you’re not receiving what you deserve. Information, he points out, is inherently valuable. When billions of people hand their data over to just a few companies, the effect is a giant wealth transfer from the many to the few.
A different way of understanding the surrender of personal data is more intimate. To pay with data is to make yourself more vulnerable to the outside world. Just as we all perk up when someone says our name, we are inherently more receptive to whoever knows more about us. The more data you give away, the more commercially customized your world becomes, and the harder it gets to ignore advertisements and other intrusions. Those willing to pay will be able to grab your attention and, in certain cases, exploit your weaknesses.
Ultimately, what makes Tufekci’s proposition so interesting is that it asks us to think harder about what it means to pay with data or attention instead of money. Every business has somewhat mixed motives, but the companies we pay live and die by how well they serve their customers. The businesses we pay with attention or data, by contrast, are conflicted: we are their customers, but we are also their products, ultimately resold to others. We are unlikely to stop loving free stuff. But we always pay in the end, and it is worth asking how.