A filter bubble arises when search results are created by a website algorithm that selectively guesses what information a user would like to see based on information about the user (such as location, past click behavior, and search history). As a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles. (Wikipedia)
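To make the mechanism concrete, here is a minimal toy sketch (all data, names, and the scoring rule are hypothetical, not any real search engine's method) of relevance-only personalized ranking. It scores each article simply by whether its topic matches the user's past clicks, so familiar viewpoints float to the top and disagreeing ones sink out of sight:

```python
def personalized_rank(articles, click_history):
    """Rank articles by overlap with topics the user clicked before.

    Toy relevance signal: an article is "relevant" if the user has
    previously clicked something on the same topic. Nothing rewards
    unfamiliar or challenging content, which is the bubble effect.
    """
    clicked_topics = {item["topic"] for item in click_history}
    # sorted() is stable, so ties keep their original order;
    # reverse=True puts topic-matching (True-keyed) articles first.
    return sorted(
        articles,
        key=lambda a: a["topic"] in clicked_topics,
        reverse=True,
    )

articles = [
    {"title": "Opposing view op-ed", "topic": "politics-right"},
    {"title": "Familiar take", "topic": "politics-left"},
    {"title": "Another familiar take", "topic": "politics-left"},
]
history = [{"topic": "politics-left"}]

ranked = personalized_rank(articles, history)
print([a["title"] for a in ranked])
# The opposing view lands last, even though the user never chose to exclude it.
```

Real personalization systems use far richer signals (location, dwell time, social graph), but the structural point is the same: if the objective is only predicted relevance, content that challenges the user is systematically demoted.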
Eli Pariser | Beware the Online Filter Bubble
As web companies strive to tailor their services (including news and search results) to our personal tastes, there's a dangerous unintended consequence: We get trapped in a "filter bubble" and don't get exposed to information that could challenge or broaden our worldview. Eli Pariser argues powerfully that this will ultimately prove to be bad for us and bad for democracy.
"What we're seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don't yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they're going to decide what we get to see and what we don't get to see, then we need to make sure that they're not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important."