
How Personalization Changes Society


by Cory Doctorow

Former MoveOn.org executive director Eli Pariser’s new book The Filter Bubble: What the Internet Is Hiding from You is a thoughtful, often alarming look at the dark side of Internet personalization. Pariser is concerned that invisible “smart” customization of your Internet experience can make you parochial, exploiting your cognitive blind spots to make you overestimate the importance or prevalence of certain ideas, products and philosophies and underestimate others. In Pariser’s view, invisible, unaccountable, commercially driven customization turns into a media-bias-of-one, an information system that distorts your perception of reality. Pariser doesn’t believe that this is malicious or intentional, but he worries that companies with good motives (“let’s hide stuff you always ignore; let’s show you search results similar to the kinds you’ve preferred in the past”) and bad (“let’s spy on your purchasing patterns to figure out how to trick you into buying stuff that you don’t want”) are inadvertently, invisibly and powerfully changing the discourse.

Pariser marshals some good examples and arguments in favor of this proposition. Students whose teachers believe they are stupid end up acting stupid — what happens when the filters decide we’re dumb, or smart, or athletic, or right wing, or left wing? He cites China and reiterates the good arguments we’ve heard from the likes of Rebecca MacKinnon: that the Chinese politburo gains more political control by shaping which messages and arguments you see (through paid astroturfers) than by merely censoring the Internet. Pariser cites research from cognitive scientists and behavioral economists on how framing and presentation can radically alter our perception of events. Finally, he convincingly describes how a world of messages that you have to consciously tune out is different from one in which the tuning out is done automatically — for example, if you attend a town hall meeting in which time is taken up with discussion of issues that you don’t care about, you still end up learning what your neighbors care about. This creates a shared frame of reference that strengthens your community.

Pariser also points out — correctly, in my view — that filtering algorithms are editorial in nature. When Google’s programmers tweak and modify their ranking algorithm to produce a result that “feels” better (or that users click on more), they’re making an editorial decision about the sort of response they want their search results to elicit. Putting more-clicked things higher up is an editorial decision: “I want to provide you with the sort of information whose utility is immediately obvious.” And while this is, intuitively, a useful way to present stuff, there’s plenty of rewarding material whose utility can’t be immediately divined or described (I thought of Jonah Lehrer’s How We Decide, which describes an experiment in which subjects who were asked to explain why they liked certain pictures made worse choices than ones who weren’t asked to explain their preferences). When we speak of Google’s results as being driven by “relevance,” we act as though there were a Platonic, measurable, independent idea of “relevance” that was separate from judgment, bias, and editorializing. Some relevance can’t be divined a priori — how relevant is an open window to Fleming’s Petri dish?
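To make the point concrete, here is a minimal sketch of what a click-driven re-ranker looks like. This is not Google’s algorithm, and every name and number in it (the CLICK_WEIGHT constant especially) is a made-up assumption; the sketch only shows where the editorial judgment hides, in the decision about how much past popularity should count.

```typescript
// Toy re-ranker: results that past users clicked more float to the top.
// Purely illustrative; the weighting constant is an arbitrary, editorial choice.
interface Result {
  url: string;
  textScore: number; // how well the page matches the query text (0..1)
  clickRate: number; // fraction of past searchers who clicked this result (0..1)
}

const CLICK_WEIGHT = 2.0; // editorial knob: how much popularity counts

function rank(results: Result[]): Result[] {
  return [...results].sort(
    (a, b) =>
      b.textScore + CLICK_WEIGHT * b.clickRate -
      (a.textScore + CLICK_WEIGHT * a.clickRate)
  );
}

// A heavily clicked page outranks a better textual match once CLICK_WEIGHT
// is large enough -- the "feels better" judgment lives in that number.
const demo = rank([
  { url: "https://example.com/obscure-but-deep", textScore: 0.9, clickRate: 0.05 },
  { url: "https://example.com/popular-listicle", textScore: 0.7, clickRate: 0.4 },
]);
console.log(demo.map((r) => r.url));
```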

There were places where I argued with Pariser’s analysis, however. On the one hand, some of Pariser’s predictions seem overly speculative: “What if augmented reality as presently practiced by artists and futurists becomes commonplace?” On the other hand, Pariser’s futures are too static: he presumes a world in which filtering tools become increasingly sophisticated, but anti-filtering tools (ad-blockers, filter-comparison tools, etc.) remain at present-day levels. The first wave of personalization on the Web was all about changing how your browser displayed the information it received; the trend toward modular, fluid site design built around XML, CSS, DHTML, AJAX, etc., makes it even more possible to block, rearrange, and manage the way information is presented to you. That is, even as site designers are becoming increasingly sophisticated in the way they present their offerings to you, you are getting more and more power to break that presentation, to recombine it and filter it yourself. Filters that you create and maintain are probably subject to some of the dangers that Pariser fears, but they’re also a powerful check against the alarming manipulation he’s most anxious about. Pariser gives short shrift to this, dismissing with hand-waving the fact that the net makes it theoretically easier than ever to see what the unfiltered (or differently filtered) world looks like: the filters, he argues, will make it so we don’t even want to go outside of them.
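A reader-side filter doesn’t have to be exotic. Here is a minimal user-script-style sketch of the idea: because modern pages are built from labeled, modular pieces, the reader can strip out the personalized ones before reading. The selectors below are hypothetical placeholders, not the markup of any real site; the point is only that the same modularity that lets publishers customize a page lets you un-customize it.

```typescript
// Remove personalized modules from a page before reading it.
// The selectors are invented examples, not any real site's markup.
const PERSONALIZED_SELECTORS = [
  '[data-module="recommended-for-you"]',
  ".personalized-feed",
  "aside.because-you-read",
];

function stripPersonalization(root: Document = document): number {
  let removed = 0;
  for (const selector of PERSONALIZED_SELECTORS) {
    root.querySelectorAll(selector).forEach((node) => {
      node.remove();
      removed++;
    });
  }
  return removed;
}

// Run once on load; a real browser extension would also watch for modules
// inserted later via AJAX (e.g. with a MutationObserver).
console.log(`removed ${stripPersonalization()} personalized blocks`);
```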

I don’t believe that anti-filters or personal filters will automatically act as a check against manipulative customization, but I believe that they have this potential. The Filter Bubble is mostly a story about potential — the potential of filtering technology to grow unchecked. And against that, I think it’s worth discussing (and caring about, and working for) the potential of a technological response to that chilling future.

Piece originally posted at BoingBoing.
