You may not realize it, but you could be in one right now. In fact, if you have Google or Facebook open, you almost certainly are. Filter bubbles are created by algorithms that adjust what you see in your Facebook News Feed, your Google results and even the stories on Yahoo! News. These algorithms pick up on your clicking habits and try to give you more of what you want.
Online organizer Eli Pariser has a problem with this. Pariser, author of the book The Filter Bubble, spoke at TED2011 in February about the danger these algorithms pose. Describing himself as politically progressive, he said he realized the problem when his conservative friends stopped popping up in his News Feed. Facebook had recognized that he clicked on his liberal friends’ links more often, and quietly weeded out the less-often-clicked right-wingers. The result, Pariser said, was that Facebook was presenting him with a narrower worldview.
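The filtering Pariser describes can be sketched in a few lines of code. This is a toy illustration, not Facebook's actual ranking system, and the friend names and click counts are invented; the point is only to show how ranking a feed by past click frequency pushes less-clicked voices out of view.

```python
from collections import Counter

def rank_feed(posts, clicks):
    """Order posts so that authors the reader has clicked most often come first.

    posts  -- list of (author, headline) tuples
    clicks -- Counter mapping each author to the reader's past click count
    """
    return sorted(posts, key=lambda post: clicks[post[0]], reverse=True)

# Hypothetical reader who clicks liberal friends' links far more often.
clicks = Counter({"liberal_friend": 9, "conservative_friend": 1})

posts = [
    ("conservative_friend", "Op-ed on tax policy"),
    ("liberal_friend", "Op-ed on climate policy"),
]

ranked = rank_feed(posts, clicks)
# The less-clicked friend's post sinks toward the bottom of the feed;
# cut the feed off at the top few items and that voice disappears entirely.
```

A real feed algorithm weighs many more signals, but the feedback loop is the same: what you clicked yesterday shapes what you are shown today.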
But that’s not out of line with Mark Zuckerberg’s News Feed philosophy. Pariser quoted the Facebook founder as saying, “A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.” But should it be?
Pariser said that when he was growing up in rural Maine, the Internet meant something very different to him. It was a way to connect with the world, to learn new things. It was supposed to be good for democracy. “But there’s this kind of shift in how information is flowing online,” he explained, “and it’s invisible, and if we don’t pay attention to it, it could be a real problem.”
The trouble, he argued, is that these algorithms—the new gatekeepers of information—don’t have the ethics of human editors. And that’s something developers should work on, he believes. “The best editing gives us a bit of both,” he said. “It gives us a little bit of Justin Bieber, and a little bit of Afghanistan. It gives us some information vegetables; it gives us some information dessert.”