Submitted by Anne Landman on
The efforts of Google, Yahoo, Facebook and other major websites to tailor our online experiences to our supposed interests can undermine our ability to see the world as it really is. Instead, we are fed the view of the world these companies "think" we would like to see. Their proprietary algorithms "personalize" the news, selecting stories they predict we will want to see rather than stories that challenge us, contradict our views, or that we would be unlikely to encounter on our own. Because the algorithms work invisibly, users have no way of knowing what is being edited out and kept from them.

The algorithms choose information based on what we usually look at, creating a feedback loop that Internet guru Eli Pariser calls "autopropaganda": unknowingly indoctrinating yourself with your own views. If you and a stranger each search Google for the exact same term, you can get strikingly different results, based on Google's analysis of what each of you usually looks at. Internet users thus get an edited worldview built on personal information over which they have no control. And because this filtering is hidden, users are left without the ability to seek out unfamiliar sources of news and information that might challenge them or give them a broader view of the world.