Filter bubble
A filter bubble or ideological frame is a state of intellectual isolation that can result from personalized searches, recommendation systems, and algorithmic curation. (Techopedia defines a filter bubble as "the intellectual isolation that can occur when websites make use of algorithms to selectively assume the information a user would want to see, and then give information to the user according to this assumption," which "can cause users to get significantly less contact with contradicting viewpoints, causing the user to become intellectually isolated." Retrieved October 10, 2017.) The search results are based on information about the user, such as their location, past click-behavior, and search history. Consequently, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles, resulting in a limited and customized view of the world. The choices made by these algorithms are only sometimes transparent. Prime examples include Google Personalized Search results and Facebook's personalized news-stream. However, there are conflicting reports about the extent to which personalized filtering happens and whether such activity is beneficial or harmful, with various studies producing inconclusive results. The term ''filter bubble'' was coined by internet activist
Eli Pariser
circa 2010. In his influential book of the same name, ''The Filter Bubble'' (2011), Pariser predicted that individualized personalization by algorithmic filtering would lead to intellectual isolation and social fragmentation. The bubble effect may have negative implications for civic
discourse
, according to Pariser, but contrasting views regard the effect as minimal and addressable. According to Pariser, users get less exposure to conflicting viewpoints and are isolated intellectually in their informational bubble. He related an example in which one user searched Google for "BP" and got investment news about BP, while another searcher got information about the
Deepwater Horizon oil spill
, noting that the two search results pages were "strikingly different" despite their use of the same keywords. The outcome of the 2016 U.S. presidential election has been associated with the influence of social media platforms such as Twitter and Facebook, which in turn has called into question the effects of the "filter bubble" phenomenon on user exposure to
fake news
and echo chambers, spurring new interest in the term, with many concerned that the phenomenon may harm democracy and
well-being
by making the effects of misinformation worse.


Concept

Pariser defined his concept of a filter bubble in more formal terms as "that personal
ecosystem
of
information
that's been catered by these algorithms." An internet user's past browsing and search history is built up over time when they indicate interest in topics by "clicking links, viewing friends, putting movies in
their
queue, reading news stories," and so forth. An internet firm then uses this information to target advertising to the user, or to make certain types of information appear more prominently in search results pages. This process is not random; it operates under a three-step process, per Pariser, who states, "First, you figure out who people are and what they like. Then, you provide them with content and services that best fit them. Finally, you tune in to get the fit just right. Your identity shapes your media." Pariser also reports that, judging by link-click data in site-traffic measurements, filter bubbles can be collective or individual. As of 2011, one engineer had told Pariser that Google looked at 57 different pieces of data to personally tailor a user's search results, including non-cookie data such as the type of computer being used and the user's physical location. Pariser's idea of the filter bubble was popularized by his TED talk in May 2011, in which he gave examples of how filter bubbles work and where they can be seen. In a test seeking to demonstrate the filter-bubble effect, Pariser asked several friends to search for the word "Egypt" on Google and send him the results. Comparing two of the friends' first pages of results, he found that while there was overlap between them on topics like news and travel, one friend's results prominently included links to information on the then-ongoing
Egyptian revolution of 2011
, while the other friend's first page of results did not include such links. In ''The Filter Bubble'', Pariser warns that a potential downside to filtered searching is that it "closes us off to new ideas, subjects, and important information," and "creates the impression that our narrow self-interest is all that exists." In his view, filter bubbles are potentially harmful to both individuals and society. He criticized
Google
and
Facebook
for offering users "too much candy and not enough carrots." He warned that "invisible algorithmic editing of the web" may limit our exposure to new information and narrow our outlook. According to Pariser, the detrimental effects of filter bubbles include harm to society at large, in that they have the possibility of "undermining civic discourse" and making people more vulnerable to "propaganda and manipulation." Many people are unaware that filter bubbles even exist: an article in ''The Guardian'' noted that "more than 60% of Facebook users are entirely unaware of any curation on Facebook at all, believing instead that every single story from their friends and followed pages appeared in their news feed." In brief, Facebook decides what goes in a user's news feed through an algorithm that takes into account "how you have interacted with similar posts in the past."
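Pariser's three-step process (infer who the user is, fit content to that profile, tune the fit) can be illustrated with a minimal ranking sketch. The scoring scheme, data, and function names below are invented for illustration and do not represent any real platform's algorithm:

```python
from collections import Counter

def rank_feed(candidate_items, click_history):
    """Rank candidate items by topic overlap with the user's past clicks.

    A toy stand-in for algorithmic personalization: items matching the
    inferred profile float to the top, so unfamiliar topics sink.
    """
    profile = Counter()                      # step 1: infer what the user likes
    for item in click_history:
        profile.update(item["topics"])

    def score(item):                         # step 2: fit content to the profile
        return sum(profile[t] for t in item["topics"])

    return sorted(candidate_items, key=score, reverse=True)  # step 3: tune the ranking

history = [{"topics": ["investing"]}, {"topics": ["investing", "energy"]}]
candidates = [
    {"title": "BP share price rises", "topics": ["investing"]},
    {"title": "Deepwater Horizon spill report", "topics": ["environment"]},
]
ranked = rank_feed(candidates, history)
```

Because items are scored only by overlap with past clicks, topics the user has never engaged with (here, the environmental story) sink to the bottom of the feed, which is precisely the mechanism the filter-bubble critique targets.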


Extensions of concept

A filter bubble has been described as exacerbating a phenomenon called ''splinternet'' or ''cyberbalkanization'', which happens when the internet becomes divided into sub-groups of like-minded people who become insulated within their own online community and fail to get exposure to different views. (The term ''cyberbalkanization'' is a hybrid of ''cyber'', relating to the internet, and ''Balkanization'', referring to the region of Europe historically subdivided by languages, religions, and cultures; it was coined in a 1996 paper by MIT researchers Van Alstyne and Brynjolfsson.) This concern dates back to the early days of the publicly accessible internet. Other terms have been used to describe the phenomenon, including "ideological frames" and "the figurative sphere surrounding you as you search the internet." The concept of a filter bubble has been extended into other areas, to describe societies that self-segregate not only according to political views but also by economic, social, and cultural situation. That bubbling results in a loss of the broader community and creates the sense that, for example, children do not belong at social events unless those events were especially planned to be appealing for children and unappealing for adults without children. Barack Obama's farewell address identified a similar concept to filter bubbles as a "threat to [Americans'] democracy," i.e., the "retreat into our own bubbles, ...especially our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions... And increasingly, we become so secure in our bubbles that we start accepting only information, whether it's true or not, that fits our opinions, instead of basing our opinions on the evidence that is out there."


Comparison with echo chambers

Both "echo chambers" and "filter bubbles" describe situations where individuals are exposed to a narrow range of opinions and perspectives that reinforce their existing beliefs and biases, but there are some subtle differences between the two, especially in practices surrounding social media. Specific to
news media
, an echo chamber is a metaphorical description of a situation in which beliefs are amplified or reinforced by communication and repetition inside a closed system. Based on the sociological concept of selective exposure theory, the term is a metaphor based on the acoustic echo chamber, where sounds reverberate in a hollow enclosure. With regard to social media, this sort of situation feeds into explicit mechanisms of ''self-selected personalization'', which describes all processes in which users of a given platform can actively opt in and out of information consumption, such as a user's ability to follow other users or select into groups. In an echo chamber, people are able to seek out information that reinforces their existing views, potentially as an unconscious exercise of
confirmation bias
. This sort of feedback regulation may increase political and
social polarization
and extremism. This can lead to users aggregating into homophilic clusters within social networks, which contributes to group polarization. "Echo chambers" reinforce an individual's beliefs without factual support. Individuals are surrounded by those who acknowledge and follow the same viewpoints, but they also possess the agency to break outside of their echo chambers. Filter bubbles, on the other hand, are implicit mechanisms of ''pre-selected personalization'', where a user's media consumption is created by personalized algorithms; the content a user sees is filtered through an AI-driven algorithm that reinforces their existing beliefs and preferences, potentially excluding contrary or diverse perspectives. In this case, users have a more passive role and are perceived as victims of a technology that automatically limits their exposure to information that would challenge their world view. Some researchers argue, however, that because users still play an active role in selectively curating their own newsfeeds and information sources through their interactions with search engines and social media networks, they directly assist in the filtering process carried out by AI-driven algorithms, thus effectively engaging in self-segregating filter bubbles. Despite their differences, the usage of these terms goes hand-in-hand in both academic and platform studies. It is often hard to distinguish between the two concepts in social-network studies, due to limited access to the filtering algorithms, which might otherwise enable researchers to compare the agencies of the two concepts. This type of research will only grow more difficult to conduct, as many social media networks have begun to limit the API access needed for academic research.
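The contrast between self-selected personalization (the echo-chamber mechanism) and pre-selected personalization (the filter-bubble mechanism) can be sketched in a toy model. All names, data, and scoring below are hypothetical; real platforms use far richer signals:

```python
def self_selected_feed(posts, followed):
    """Self-selected personalization: the user explicitly opts in to sources;
    the platform applies no further filtering (echo-chamber mechanism)."""
    return [p for p in posts if p["author"] in followed]

def pre_selected_feed(posts, engagement_by_topic, k=2):
    """Pre-selected personalization: an algorithm ranks everything by inferred
    preference and silently truncates the feed (filter-bubble mechanism)."""
    ranked = sorted(posts,
                    key=lambda p: engagement_by_topic.get(p["topic"], 0),
                    reverse=True)
    return ranked[:k]

posts = [
    {"author": "alice", "topic": "politics-left"},
    {"author": "bob", "topic": "politics-right"},
    {"author": "carol", "topic": "sports"},
]
# The echo-chamber user chose whom to follow; the bubble user chose nothing,
# yet both end up without the "politics-right" post.
echo = self_selected_feed(posts, followed={"alice"})
bubble = pre_selected_feed(posts, {"politics-left": 5, "sports": 3})
```

The point of the sketch is that the two feeds can look identical from the outside; what differs is where the agency lies, which is exactly why the two concepts are hard to disentangle empirically.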


Reactions and studies


Media reactions

There are conflicting reports about the extent to which personalized filtering happens and whether such activity is beneficial or harmful. Analyst Jacob Weisberg, writing in June 2011 for ''
Slate
'', conducted a small, non-scientific experiment to test Pariser's theory, in which five associates with different ideological backgrounds conducted a series of searches: "
John Boehner
," "
Barney Frank
," " Ryan plan," and "
Obamacare
," and sending Weisberg screenshots of their results. The results varied only in minor respects from person to person, and any differences did not appear to be ideology-related, leading Weisberg to conclude that a filter bubble was not in effect, and to write that the idea that most internet users were "feeding at the trough of a ''Daily Me''" was overblown. Weisberg asked Google to comment, and a spokesperson stated that algorithms were in place to deliberately "limit personalization and promote variety." Book reviewer Paul Boutin did a similar experiment among people with differing search histories and again found that the different searchers received nearly identical search results. Interviewing programmers at Google off the record, one journalist found that user data used to play a bigger role in determining search results, but that Google had found, through testing, that the search query itself is by far the best determinant of which results to display. There are reports that Google and other sites maintain vast "dossiers" of information on their users, which might enable them to personalize individual internet experiences further if they chose to do so. For instance, the technology exists for Google to keep track of users' histories even if they don't have a personal Google account or are not logged into one. One report stated that Google had collected "10 years' worth" of information amassed from varying sources, such as
Gmail
,
Google Maps
, and other services besides its search engine, although a contrary report held that trying to personalize the internet for each user was technically challenging for an internet firm to achieve despite the huge amounts of available data. Analyst Doug Gross of
CNN
suggested that filtered searching seemed to be more helpful for
consumer
s than for ''
citizens
'', and would help a consumer looking for "pizza" find local delivery options based on a personalized search and appropriately filter out distant pizza stores. Organizations such as ''
The Washington Post
'', ''
The New York Times
'', and others have experimented with creating new personalized information services, with the aim of tailoring search results to those that users are likely to like or agree with.


Academic studies and reactions

A scientific study from Wharton that analyzed personalized recommendations found that these filters can create commonality, not fragmentation, in online music taste. Consumers reportedly use the filters to expand their taste rather than to limit it. Harvard law professor
Jonathan Zittrain
disputed the extent to which personalization filters distort Google search results, saying that "the effects of search personalization have been light." Further, Google lets users shut off personalization features if they choose, by deleting Google's record of their search history and setting Google not to remember their search keywords and visited links in the future. A study from ''Internet Policy Review'' addressed the lack of a clear, testable definition of filter bubbles across disciplines, which often results in researchers defining and studying filter bubbles in different ways. The study also noted a lack of empirical data for the existence of filter bubbles across disciplines and suggested that the effects attributed to them may stem more from preexisting ideological biases than from algorithms. Similar views can be found in other academic projects, which likewise raise concerns about the definitions of filter bubbles and the relationships between the ideological and technological factors associated with them. A critical review of filter bubbles suggested that "the filter bubble thesis often posits a special kind of political human who has opinions that are strong, but at the same time highly malleable" and that it is a "paradox that people have an active agency when they select content but are passive receivers once they are exposed to the algorithmically curated content recommended to them." A study by Oxford, Stanford, and Microsoft researchers examined the browsing histories of 1.2 million U.S. users of the Bing Toolbar add-on for Internet Explorer between March and May 2013. They selected 50,000 of those users who were active news consumers, then classified the news outlets they visited as left- or right-leaning based on whether the majority of voters in the counties associated with user IP addresses voted for Obama or Romney in the 2012 presidential election.
They then identified whether news stories were read after accessing the publisher's site directly, via the Google News aggregation service, via web searches, or via social media. The researchers found that while web searches and social media do contribute to ideological segregation, the vast majority of online news consumption consisted of users directly visiting left- or right-leaning mainstream news sites, and consequently being exposed almost exclusively to views from a single side of the political spectrum. Limitations of the study included selection issues such as Internet Explorer users skewing older than the general internet population; Bing Toolbar usage and the voluntary (or unknowing) sharing of browsing history selecting for users who are less concerned about privacy; the assumption that all stories in left-leaning publications are left-leaning (and likewise for right-leaning ones); and the possibility that users who are ''not'' active news consumers may get most of their news via social media, and thus experience stronger effects of social or algorithmic bias than users who essentially self-select their bias through their choice of news publications (assuming they are aware of the publications' biases). A study by Princeton University and New York University researchers aimed to study the impact of filter bubbles and algorithmic filtering on social media polarization. They used a mathematical model called the "stochastic block model" to test their hypothesis on the environments of Reddit and Twitter. The researchers gauged changes in polarization in regularized and non-regularized social media networks, specifically measuring the percent changes in polarization and disagreement on Reddit and Twitter. They found that polarization increased by 400% in non-regularized networks, while in regularized networks polarization increased by 4% and disagreement by 5%.
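As an illustration of the modeling approach, a two-block stochastic block model can be simulated in a few lines. The parameters and the crude segregation measure below are invented for illustration and do not reproduce the Princeton/NYU study's setup:

```python
import random

def sbm_edges(n_per_group, p_in, p_out, seed=0):
    """Generate the edges of a two-block stochastic block model: nodes
    0..n-1 form group A and n..2n-1 form group B; within-group pairs
    connect with probability p_in, cross-group pairs with p_out."""
    rng = random.Random(seed)
    n = 2 * n_per_group
    edges = []
    for u in range(n):
        for v in range(u + 1, n):
            same_group = (u // n_per_group) == (v // n_per_group)
            if rng.random() < (p_in if same_group else p_out):
                edges.append((u, v))
    return edges

def segregation(edges, n_per_group):
    """Fraction of edges that stay within one group -- a crude proxy for
    how 'bubbled' the network is."""
    same = sum(1 for u, v in edges if u // n_per_group == v // n_per_group)
    return same / len(edges)

# A bubble-like network: dense within groups, sparse across them.
edges = sbm_edges(n_per_group=50, p_in=0.2, p_out=0.01)
print(round(segregation(edges, 50), 2))  # close to 1.0 when p_in >> p_out
```

Varying `p_out` relative to `p_in` is what lets such models represent anything from a fully mixed network to two nearly disconnected ideological camps.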


Platform studies

While algorithms do limit political diversity, some of the filter-bubble effect is the result of user choice. A study by data scientists at Facebook found that users have one friend with contrasting views for every four Facebook friends that share an ideology. No matter what Facebook's algorithm for its News Feed is, people are simply more likely to befriend/follow people who share similar beliefs. The nature of the algorithm is that it ranks stories based on a user's history, resulting in a reduction of "politically cross-cutting content by 5 percent for conservatives and 8 percent for liberals." However, even when people are given the option to click on a link offering contrasting views, they still default to their most-viewed sources. "[U]ser choice decreases the likelihood of clicking on a cross-cutting link by 17 percent for conservatives and 6 percent for liberals." A cross-cutting link is one that introduces a point of view different from the user's presumed point of view, or what the website has pegged as the user's beliefs. A study by Levi Boxell, Matthew Gentzkow, and Jesse M. Shapiro suggests that online media isn't the driving force of political polarization. The paper argues that polarization has been driven by the demographic groups that spend the least time online. The greatest ideological divide is experienced among Americans older than 75, yet only 20% of them reported using social media as of 2012. In contrast, 80% of Americans aged 18–39 reported using social media as of 2012. The data suggest that the younger demographic wasn't any more polarized in 2012 than it had been when online media barely existed in 1996. The study highlights differences between age groups and how news consumption remains polarized as people seek information that appeals to their preconceptions.
Older Americans usually remain stagnant in their political views, as traditional media outlets continue to be their primary source of news, while online media is the leading source for the younger demographic. Although algorithms and filter bubbles weaken content diversity, this study reveals that political polarization trends are primarily driven by pre-existing views and failure to recognize outside sources. A 2020 study from Germany utilized the Big Five psychology model to test the effects of individual personality, demographics, and ideology on user news consumption. Basing their study on the notion that the number of news sources users consume affects their likelihood of being caught in a filter bubble (higher media diversity lessening the chances), their results suggest that certain demographics (higher age and male) along with certain personality traits (high openness) correlate positively with the number of news sources consumed. The study also found a negative ideological association between media diversity and the degree to which users align with right-wing authoritarianism. Beyond identifying individual user factors that may influence the role of user choice, this study also raises questions about associations between the likelihood of users being caught in filter bubbles and user voting behavior. The Facebook study found it "inconclusive" whether or not the algorithm played as big a role in filtering News Feeds as people assumed. The study also found that "individual choice," or confirmation bias, likewise affected what gets filtered out of News Feeds. Some social scientists criticized this conclusion, because the point of protesting the filter bubble is that the algorithms and individual choice work together to filter out News Feeds.
They also criticized Facebook's small sample size, about "9% of actual Facebook users," and the fact that the study's results are "not reproducible" because it was conducted by "Facebook scientists" who had access to data that Facebook does not make available to outside researchers. Though the study found that only about 15–20% of the average user's Facebook friends subscribe to the opposite side of the political spectrum, Julia Kaman of Vox theorized that this could have potentially positive implications for viewpoint diversity. These "friends" are often acquaintances with whom we would not likely share our politics without the internet. Facebook may foster a unique environment in which a user sees, and possibly interacts with, content posted or re-posted by these "second-tier" friends. The study found that "24 percent of the news items liberals saw were conservative-leaning and 38 percent of the news conservatives saw was liberal-leaning," and that "liberals tend to be connected to fewer friends who share information from the other side, compared with their conservative counterparts." This interplay can provide diverse information and sources that could moderate users' views. Similarly, a study of
Twitter
's filter bubbles by
New York University
concluded that "Individuals now have access to a wider span of viewpoints about news events, and most of this information is not coming through the traditional channels, but either directly from political actors or through their friends and relatives. Furthermore, the interactive nature of
social media
creates opportunities for individuals to discuss political events with their peers, including those with whom they have weak social ties." According to these studies, social media may be diversifying information and opinions users come into contact with, though there is much speculation around filter bubbles and their ability to create deeper
political polarization
. One driver of, and possible solution to, the problem is the role of emotions in online content. A 2018 study shows that different emotions in messages can lead to polarization or convergence: joy is prevalent in emotional polarization, while sadness and fear play significant roles in emotional convergence. Since it is relatively easy to detect the emotional content of messages, these findings can help in designing more socially responsible algorithms that attend to the emotional content of algorithmic recommendations.
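As a sketch of how the emotional content of messages might be detected, a minimal keyword-lexicon classifier is shown below. The lexicon is invented and deliberately tiny; real systems use large published lexicons or trained models:

```python
# A deliberately tiny, invented emotion lexicon for illustration only --
# production systems rely on far larger lexicons or machine-learned classifiers.
LEXICON = {
    "joy": {"happy", "celebrate", "win"},
    "sadness": {"loss", "grief", "mourn"},
    "fear": {"threat", "danger", "warning"},
}

def dominant_emotion(text):
    """Return the emotion whose keywords appear most often in the text,
    or None if no lexicon word matches."""
    words = text.lower().split()
    counts = {emo: sum(w in kws for w in words) for emo, kws in LEXICON.items()}
    best = max(counts, key=counts.get)
    return best if counts[best] > 0 else None

print(dominant_emotion("A warning about the danger ahead"))  # fear
```

A recommender that scored candidate items with such a tagger could, for example, down-weight fear-heavy items in contexts where the goal is convergence rather than engagement.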
Social bots have been utilized by different researchers to test polarization and related effects that are attributed to filter bubbles and echo chambers. A 2018 study used social bots on Twitter to test deliberate user exposure to partisan viewpoints. The study claimed it demonstrated partisan differences between exposure to differing views, although it warned that the findings should be limited to party-registered American Twitter users. One of the main findings was that after exposure to differing views (provided by the bots), self-registered Republicans became more conservative, whereas self-registered liberals showed less ideological change, if any at all. A different study, from the People's Republic of China, utilized social bots on ''Weibo'', the largest social media platform in China, to examine the structure of filter bubbles with regard to their effects on polarization. The study draws a distinction between two conceptions of polarization: one in which people with similar views form groups, share similar opinions, and block themselves from differing viewpoints (opinion polarization), and one in which people do not access diverse content and sources of information (information polarization). By utilizing social bots instead of human volunteers and focusing on information polarization rather than opinion polarization, the researchers concluded that there are two essential elements of a filter bubble: a large concentration of users around a single topic and a uni-directional, star-like structure that impacts key information flows. In June 2018, the platform DuckDuckGo conducted a study of Google Search. For this study, 87 adults in various locations around the continental United States googled three keywords at the exact same time: immigration, gun control, and vaccinations. Even in private browsing mode, most people saw results unique to them. 
Google included certain links for some participants that it did not include for others, and the News and Videos infoboxes showed significant variation. Google publicly disputed these results, saying that Search Engine Results Page (SERP) personalization is mostly a myth. Google Search Liaison Danny Sullivan stated that "Over the years, a myth has developed that Google Search personalizes so much that for the same query, different people might get significantly different results from each other. This isn't the case. Results can differ, but usually for non-personalized reasons." When filter bubbles are in place, they can create specific moments that scientists call 'Whoa' moments. A 'Whoa' moment is when an article, ad, or post appears on a user's computer in relation to a current action or current use of an object. Researchers coined the term after a young woman, whose daily routine included drinking coffee, opened her computer and noticed an advertisement for the same brand of coffee that she was drinking: "Sat down and opened up Facebook this morning while having my coffee, and there they were two ads for Nespresso. Kind of a 'whoa' moment when the product you're drinking pops up on the screen in front of you." "Whoa" moments occur when people are "found," meaning that advertising algorithms target specific users based on their "click behavior" to increase their sales revenue. Several designers have developed tools to counteract the effects of filter bubbles (see below). Swiss radio station SRF voted the word ''filterblase'' (the German translation of filter bubble) word of the year 2016.
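The per-user divergence the DuckDuckGo study reports can be made concrete by comparing the ranked result lists two users receive for the same query. The sketch below is illustrative only: it uses generic overlap metrics rather than the study's actual methodology, and the result lists are made up.

```python
def jaccard(results_a, results_b):
    """Set overlap of two result lists: 1.0 means identical sets of links."""
    sa, sb = set(results_a), set(results_b)
    return len(sa & sb) / len(sa | sb) if (sa | sb) else 1.0

def mean_rank_shift(results_a, results_b):
    """Average absolute change in rank for links both users saw,
    or None if the two lists share no links."""
    shared = set(results_a) & set(results_b)
    if not shared:
        return None
    return sum(abs(results_a.index(u) - results_b.index(u))
               for u in shared) / len(shared)

# Hypothetical result lists for the same query from two users:
user_a = ["example.org/a", "example.org/b", "example.org/c"]
user_b = ["example.org/b", "example.org/a", "example.org/d"]
```

Here `jaccard(user_a, user_b)` is 0.5 (two of four distinct links are shared) and `mean_rank_shift(user_a, user_b)` is 1.0, the kind of per-user variation in links and ordering the study describes.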


Countermeasures


By individuals

In ''The Filter Bubble: What the Internet Is Hiding from You'', internet activist Eli Pariser highlights how the increasing occurrence of filter bubbles further emphasizes the value of one's bridging
social capital
as defined by Robert Putnam. Pariser argues that filter bubbles reinforce a sense of social homogeneity, which weakens ties between people with potentially diverging interests and viewpoints. In that sense, high bridging capital may promote social inclusion by increasing our exposure to a space that goes beyond self-interest. Fostering one's bridging capital, such as by connecting with more people in an informal setting, may be an effective way to reduce the filter bubble phenomenon. Users can take many actions to burst through their filter bubbles, for example by making a conscious effort to evaluate what information they are exposing themselves to, and by thinking critically about whether they are engaging with a broad range of content. Users can consciously avoid news sources that are unverifiable or weak. Chris Glushko, the VP of Marketing at IAB, advocates using
fact-checking
sites to identify fake news. Technology can also play a valuable role in combating filter bubbles. Some browser plug-ins aim to help people step out of their filter bubbles by making them aware of their personal perspectives and showing content that contradicts their beliefs and opinions. In addition to plug-ins, there are apps created with the mission of encouraging users to open up their echo chambers. News apps such as ''Read Across the Aisle'' nudge users to read different perspectives if their reading pattern is biased towards one side or ideology. Although apps and plug-ins are tools humans can use, Eli Pariser stated, "certainly, there is some individual responsibility here to really seek out new sources and people who aren't like you." Since web-based advertising can further the effect of filter bubbles by exposing users to more of the same content, users can block much advertising by deleting their search history, turning off targeted ads, and downloading browser extensions. Some use anonymous or non-personalized search engines such as YaCy, DuckDuckGo, Qwant, Startpage.com, Disconnect, and Searx in order to prevent companies from gathering their web-search data. Swiss daily ''Neue Zürcher Zeitung'' is beta-testing a personalized news engine app which uses machine learning to guess what content a user is interested in, while "always including an element of surprise"; the idea is to mix in stories which a user is unlikely to have followed in the past. The European Union is taking measures to lessen the effect of the filter bubble. The
European Parliament
is sponsoring inquiries into how filter bubbles affect people's ability to access diverse news. Additionally, it introduced a program aimed at educating citizens about social media. In the U.S., the CSCW panel suggests the use of news aggregator apps to broaden media consumers' news intake. News aggregator apps scan all current news articles and direct readers to different viewpoints regarding a certain topic. Users can also use a diversity-aware news balancer which visually shows media consumers whether they are leaning left or right when reading the news, indicating right-leaning with a bigger red bar or left-leaning with a bigger blue bar. A study evaluating this news balancer found "a small but noticeable change in reading behavior, toward more balanced exposure, among users seeing the feedback, as compared to a control group".
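A balancer of the kind described could derive its red/blue bar from per-outlet leaning labels. The sketch below is a hypothetical reconstruction: the outlet names, leaning scores, and text-bar rendering are invented for illustration and do not come from the cited study.

```python
# Hypothetical sketch of the news-balancer feedback described above.
# Outlet leanings are illustrative placeholders, not real classifications.
LEANING = {"outlet_a": -1.0, "outlet_b": 1.0, "outlet_c": 0.0}  # -1 left ... +1 right

def balance_score(articles_read):
    """Mean leaning of the outlets read: below 0 leans left, above 0 leans right."""
    if not articles_read:
        return 0.0
    return sum(LEANING.get(o, 0.0) for o in articles_read) / len(articles_read)

def render_bar(score, width=10):
    """Text stand-in for the red/blue bar: '<' marks for left, '>' for right."""
    n = round(abs(score) * width)
    return ("<" * n) if score < 0 else (">" * n)
```

For instance, a reader whose history is two left-labeled articles and one right-labeled one gets a score of about -0.33 and a short left-pointing bar, the visual nudge the study found produced "a small but noticeable change in reading behavior".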


By media companies

In light of recent concerns about information filtering on social media, Facebook acknowledged the presence of filter bubbles and has taken strides toward removing them. In January 2017, Facebook removed personalization from its Trending Topics list in response to problems with some users not seeing highly talked-about events there. Facebook's strategy is to reverse the Related Articles feature it had implemented in 2013, which posted related news stories after the user read a shared article; the revamped feature instead posts articles from different perspectives on the same topic. Facebook is also attempting to implement a vetting process whereby only articles from reputable sources will be shown. Along with the founder of
Craigslist
and a few others, Facebook has invested $14 million into efforts "to increase trust in journalism around the world, and to better inform the public conversation". The idea is that even if people are only reading posts shared from their friends, at least these posts will be credible. Similarly, Google, as of January 30, 2018, has also acknowledged the existence of filter bubble difficulties within its platform. Because current Google searches pull algorithmically ranked results based upon "authoritativeness" and "relevancy", which show and hide certain search results, Google is seeking to combat this. By training its search engine to recognize the
intent
of a search query rather than the literal syntax of the question, Google is attempting to limit the size of filter bubbles. As of now, the initial phase of this training will be introduced in the second quarter of 2018. Questions that involve bias and/or controversial opinions will not be addressed until a later time, prompting a larger problem that still exists: whether the search engine acts as an arbiter of truth or as a knowledgeable guide by which to make decisions. In April 2017 news surfaced that Facebook,
Mozilla, and Craigslist contributed the majority of a $14M donation to
CUNY's "News Integrity Initiative," aimed at eliminating fake news and creating more honest news media. Later, in August, Mozilla, makers of the
Firefox
web browser, announced the formation of the Mozilla Information Trust Initiative (MITI). The MITI would serve as a collective effort to develop products, research, and community-based solutions to combat the effects of filter bubbles and the proliferation of fake news. Mozilla's Open Innovation team leads the initiative, striving to combat misinformation with a specific focus on literacy, research, and creative interventions.
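Google's reported shift from literal syntax toward query intent, described earlier in this section, can be sketched as a toy contrast between exact-term matching and matching against intent categories. The categories and cue words below are hypothetical illustrations, not Google's actual taxonomy or method.

```python
# Toy contrast between literal term matching and a crude intent classifier.
# Intent categories and cue words are hypothetical illustrations.
INTENTS = {
    "how_to": {"how", "fix", "install", "make"},
    "definition": {"what", "meaning", "define"},
}

def literal_match(query, doc):
    """Literal syntax: does every query word appear verbatim in the document?"""
    return set(query.lower().split()) <= set(doc.lower().split())

def classify_intent(query):
    """Pick the intent whose cue words overlap the query most, or None."""
    words = set(query.lower().split())
    best, best_hits = None, 0
    for intent, cues in INTENTS.items():
        hits = len(words & cues)
        if hits > best_hits:
            best, best_hits = intent, hits
    return best
```

A literal matcher only retrieves documents containing the exact query words, while even this crude intent classifier maps differently worded questions ("how do i fix my bike", "bike repair how") onto the same category, the kind of generalization intent-based ranking aims for.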


Ethical implications

As the popularity of
cloud services
increases, personalized
algorithms used to construct filter bubbles are expected to become more widespread. Scholars have begun considering the effect of filter bubbles on the users of
social media
from an ethical standpoint, particularly concerning the areas of personal freedom,
security, and information bias. Filter bubbles in popular social media and personalized search sites can determine the particular content seen by users, often without their direct consent or cognizance, due to the algorithms used to curate that content. Self-created content manifested from behavior patterns can lead to partial information blindness. Critics of the use of filter bubbles speculate that individuals may lose autonomy over their own social media experience and have their identities socially constructed as a result of the pervasiveness of filter bubbles. Technologists, social media engineers, and computer specialists have also examined the prevalence of filter bubbles.
Mark Zuckerberg, founder of Facebook, and Eli Pariser, author of ''The Filter Bubble'', have expressed concerns regarding the risks of privacy and information polarization. The information of the users of personalized search engines and social media platforms is not private, though some people believe it should be. The concern over privacy has resulted in a debate as to whether or not it is moral for information technologists to take users' online activity and manipulate future exposure to related information. Some scholars have expressed concerns regarding the effects of filter bubbles on individual and social well-being, i.e. the dissemination of health information to the general public and the potential effects of internet search engines to alter health-related behavior. A 2019 multi-disciplinary book reported research and perspectives on the roles filter bubbles play in regard to health misinformation. Drawing from various fields such as journalism, law, medicine, and health psychology, the book addresses different controversial health beliefs (e.g. alternative medicine and pseudoscience) as well as potential remedies to the negative effects of filter bubbles and echo chambers on different topics in health discourse. A 2016 study on the potential effects of filter bubbles on search engine results related to suicide found that algorithms play an important role in whether or not helplines and similar search results are displayed to users and discussed the implications their research may have for health policies. Another 2016 study, from the ''Croatian Medical Journal'', proposed some strategies for mitigating the potentially harmful effects of filter bubbles on health information, such as: informing the public more about filter bubbles and their associated effects, users choosing to try search engines other than Google, and more explanation of the processes search engines use to determine their displayed results.
Since the content seen by individual social media users is influenced by algorithms that produce filter bubbles, users of social media platforms are more susceptible to
confirmation bias, and may be exposed to biased, misleading information. Social sorting and other unintentional discriminatory practices are also anticipated as a result of personalized filtering. In light of the 2016 U.S. presidential election, scholars have likewise expressed concerns about the effect of filter bubbles on
democracy
and democratic processes, as well as the rise of "ideological media". These scholars fear that users will be unable to "[t]hink beyond [t]heir narrow self-interest" as filter bubbles create personalized social feeds, isolating them from diverse points of view and their surrounding communities. For this reason, an increasingly discussed possibility is to design social media with more serendipity, that is, to proactively recommend content that lies outside one's filter bubble, including challenging political information, and, eventually, to provide empowering filters and tools to users. A related concern is in fact how filter bubbles contribute to the proliferation of "
fake news
" and how this may influence political leaning, including how users vote. Revelations in March 2018 of Cambridge Analytica's harvesting and use of user data for at least 87 million Facebook profiles during the 2016 presidential election highlight the ethical implications of filter bubbles. Co-founder and whistleblower of Cambridge Analytica Christopher Wylie, detailed how the firm had the ability to develop "psychographic" profiles of those users and use the information to shape their voting behavior. Access to user data by third parties such as Cambridge Analytica can exasperate and amplify existing filter bubbles users have created, artificially increasing existing biases and further divide societies.


Dangers

Filter bubbles have stemmed from a surge in media personalization, which can trap users. The use of AI to personalize offerings can lead to users viewing only content that reinforces their own viewpoints without challenging them. Social media websites like Facebook may also present content in a way that makes it difficult for users to determine its source, leaving them to decide for themselves whether the source is reliable or fake. That can lead to people becoming used to hearing what they want to hear, which can cause them to react more radically when they see an opposing viewpoint. The filter bubble may cause the person to see any opposing viewpoints as incorrect and could thus allow the media to force views onto consumers. Researchers explain that the filter bubble reinforces what one is already thinking. This is why it is extremely important to utilize resources that offer various points of view.


See also

* Algorithmic curation *
Algorithmic radicalization
* Allegory of the Cave * Attention inequality * Communal reinforcement * Content farm * Dead Internet theory *
Deradicalization
*
Echo chamber (media)
*
False consensus effect
*
Group polarization
*
Groupthink
*
Infodemic
*
Information silo
* Media consumption * Narrowcasting * Search engine manipulation effect * Selective exposure theory * Serendipitous discovery, an antithesis of filter bubble * ''The Social Dilemma'' *
Stereotype


Notes


References


Further reading

* Pariser, Eli. ''The Filter Bubble: What the Internet Is Hiding from You'',
Penguin Press
(New York, 2011)


External links


Beware Online Filter Bubbles
TED Talks, March 2011