
When it comes to content, Google and Facebook are offering us too much candy and not enough carrots. That's according to political activist and former MoveOn.org executive director Eli Pariser, who warned that the "invisible algorithmic editing of the web" via personalized search results, news feeds, and other customized content threatens to limit our exposure to new information and narrow our outlook.

Pariser, who describes his political leanings as "progressive," said at the annual TED conference that he has always made an effort to befriend both liberals and conservatives on Facebook so he could keep track of the issues each group was discussing. Over time, however, something strange happened: his conservative Facebook friends disappeared from his news feed. He realized that Facebook's algorithm had "edited them out" because he was clicking more on links from liberal friends than conservative ones.
Google is also guilty of tweaking what it shows users based on their past online behavior. Pariser highlighted how two users can receive drastically different Google results after querying the same term, because the search engine monitors 57 signals to tweak and personalize what it returns. "There is no standard Google anymore," Pariser noted. "This moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see," he said of this algorithmic editing. Because algorithms determine what we see online based on our browsing, reading, and clicking history, we risk being exposed to a narrower range of viewpoints, opinions, and content sources, Pariser argued. "If you take all of these filters together, all of these algorithms, you get what I call a filter bubble. Your filter bubble is your own personal, unique universe of information that you live in online," he said. "What's in your filter bubble depends on who you are and it depends on what you do, but the thing is that you don't decide what gets in...and more importantly you don't actually see what gets edited out."
Companies have billed the personalization of information as a way of serving up content that is more relevant to a user's interests. When it rolled out personalized search to all users, Google boasted that the feature would "[help] people get better search results." According to The Facebook Effect, Mark Zuckerberg explained the utility of Facebook's "News Feed" by telling his staff, "A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa." Pariser appealed to the tech executives from companies like Facebook and Google present at the TED conference to reconsider their approach in order to create the internet "that we all dreamed of it being": one that introduces us to alternate, novel perspectives and challenges us to think in new ways. "We really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility," Pariser said. "The thing is that the algorithms don't yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they're going to decide what we get to see and what we don't get to see, then we need to make sure that they're not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important."
Smarter, more "concerned" algorithms are necessary to ensure we have a balanced information diet, Pariser said. "The best editing gives us a bit of both," he said. "It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It gives us some information vegetables and it gives us some information dessert."
Otherwise, he warned, we risk consuming too much "fast food" content. "Instead of a balanced information diet, you can end up surrounded by information junk food," Pariser said.