The Danger of Personalization: Filter Bubbles

By Director of Technology Robert Derby

Have you ever been in a situation where a group of people were given the same information to make an informed choice, and roughly half of them disagreed with you? Afterward, did you think, "How could we all be looking at the same information and not come to the same conclusion?" Many variables go into a decision, but a critical one is that, paradoxically, in our connected world we are getting more and more data filtered to agree with our existing views.

In 2011, internet activist and Upworthy co-founder Eli Pariser coined the phrase "filter bubble" in his bestseller, The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think. Mr. Pariser also recorded a TED Talk on the same topic.
 
A "filter bubble" describes how websites like Google and Facebook use algorithms to selectively guess what information a person would like to see, based on information about them (such as location, past click behavior, and search history). As a result, people become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles. Back in 2011, Google used about 57 "signals" (variables) to tailor data for your searches; today there are over 200. Most major websites are using or developing personalization algorithms.
 
Quite literally, this means that you and I can run the same search on Google and get different results based on the data collected about us. For example, if two people search the term "BP," one could receive links to investment news about BP while the other receives links to coverage of the Deepwater Horizon oil spill.
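For the technically curious, the idea behind this kind of personalization can be sketched in a few lines of code. This is a deliberately simplified toy model, not Google's actual algorithm: it assumes a user's "signals" are just topics gathered from past clicks, and ranks results for the same query by how well they match those topics.

```python
# Toy sketch of signal-based personalization (illustrative only; not any
# real search engine's algorithm). Two users issue the identical query
# "BP" but receive different top results because their click-history
# "signals" differ.

RESULTS_FOR_BP = [
    {"title": "BP stock hits 52-week high", "topics": {"investing", "energy"}},
    {"title": "Deepwater Horizon spill coverage", "topics": {"environment", "news"}},
    {"title": "BP quarterly earnings report", "topics": {"investing"}},
]

def personalize(results, clicked_topics):
    """Rank results by how many topics overlap the user's click history."""
    return sorted(results,
                  key=lambda r: len(r["topics"] & clicked_topics),
                  reverse=True)

investor_signals = {"investing", "finance"}
environmentalist_signals = {"environment", "climate"}

print(personalize(RESULTS_FOR_BP, investor_signals)[0]["title"])
print(personalize(RESULTS_FOR_BP, environmentalist_signals)[0]["title"])
```

Running the sketch shows the investor's top result is the stock story while the environmentalist's is the spill coverage, even though the query never changed. Real systems use hundreds of signals and far more sophisticated models, but the filtering principle is the same.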
 
Filter bubbles are driven by commercial interests. The more personalized the information a website provides, the more relevant the ads it can target, which is how these companies make money. Likewise, the more data collected about each of us, the more comprehensive a profile of our behavior the companies can build and sell as a commodity.
 
Facebook is currently under fire, as reported in the New York Times, for the impact it may have had on the presidential election, including the proliferation of fake news. According to a report published by the Pew Research Center, 62 percent of adults get their news from social media, with more than 40 percent getting it from Facebook.
 
These personalization algorithms are likely to become far more advanced in the near future. As educators, we must prepare students to think critically as the tools we use become more tightly filtered toward information that reinforces our existing views. We have a responsibility to offer many viewpoints on topics of discussion so that students can draw their own conclusions and form their own opinions.

La Jolla Country Day School

© 2024 La Jolla Country Day School 
