Alt-right pipeline

From Wikipedia, the free encyclopedia

Revision as of 02:49, 28 October 2022

The alt-right pipeline (also called the alt-right rabbit hole) is a conceptual model regarding internet radicalization toward the alt-right movement. It describes a phenomenon in which consuming antifeminist or anti-SJW content increases exposure to the alt-right or similar right-wing extremism through algorithmic bias and online communities.

Process

Use of the internet allows individuals with heterodox beliefs to alter their environment, which in turn has transformative effects on the user. Influence from external sources such as the internet can be gradual so that the individual is not immediately aware of their changing understanding or surroundings. Members of the alt-right refer to this radicalization process as "taking the red pill" in reference to the method of immediately achieving greater awareness in The Matrix. This is in contrast to the gradual nature of radicalization described by the alt-right pipeline.[1][2] Many far-right extremists recognize the potential of this radicalization method and actively share right-wing content with the intention of gradually radicalizing those around them. These individuals may use racist imagery or humor under the guise of irony or insincerity to make alt-right ideas more acceptable.[2][3]

YouTube has been identified as a major element in the alt-right pipeline. This is facilitated through an "Alternative Influence Network", in which various right-wing scholars, pundits, and internet personalities interact with one another to boost performance of their content. These figures may vary in their ideologies between conservatism, libertarianism, or white nationalism, but they share a common opposition to feminism, progressivism, and social justice that allows viewers of one figure to quickly acclimate to another.[4] They often prioritize right-wing social issues over right-wing economic issues, with little discussion of fiscal conservatism. Some individuals in this network may not interact with one another, but a collection of interviews, internet debates, and other interactions create pathways for users to be introduced to new content.[5]

YouTube's algorithmic method of video suggestions assists users in quickly finding new content similar to what they have previously watched, allowing users to more deeply explore an idea once they have expressed interest.[1][6] When a user is exposed to content featuring certain political or culture war issues, this recommendation system may lead them to different ideas or issues, including Islamophobia, opposition to immigration, antifeminism, or reproduction rates.[1] Recommended content is often somewhat related, which creates an effect of gradual radicalization across multiple issues, referred to as a pipeline. Radicalization also takes place in interactions with other radicalized users online, on varied platforms such as Gab, Reddit, 4chan, or Discord.[1] Major personalities in this chain often have a presence on Facebook and Twitter, though YouTube is typically their primary platform for messaging and earning income.[6] The effect of this algorithmic bias has been replicated in some studies,[5] though the extent of its influence is unclear.[7][8]
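The gradual drift described above can be illustrated with a toy model. This sketch is not YouTube's actual recommendation algorithm; the one-dimensional "extremity" score, the evenly spaced catalog, and the nearest-neighbour recommender are all simplifying assumptions made purely for illustration.

```python
def recommend(current, catalog, k=5):
    """Toy recommender: return the k catalog items whose score on a
    hypothetical 0-to-1 'extremity' scale is closest to the item just watched."""
    return sorted(catalog, key=lambda item: abs(item - current))[:k]

def simulate_drift(start=0.1, steps=50):
    """Follow a viewer who, at every step, chooses the most extreme of the
    similar items the recommender surfaces."""
    catalog = [i / 1000 for i in range(1001)]  # evenly spaced extremity scores
    position = start
    for _ in range(steps):
        # Each recommendation is only marginally different from the current
        # item, so every individual step looks harmless.
        position = max(recommend(position, catalog))
    return position
```

In this model no single recommendation is a large jump, yet a viewer who repeatedly selects the most extreme suggestion drifts from 0.1 to 0.2 on the scale over 50 steps, mirroring the incremental, "somewhat related" character attributed to the pipeline.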

Content

The alt-right pipeline has been found to begin with the intellectual dark web community, which is made up of internet personalities who are unified by an opposition to identity politics and political correctness, such as Joe Rogan, Ben Shapiro, Dave Rubin, and Jordan Peterson. The intellectual dark web community overlaps and interacts with the alt-lite community, whose members include Steven Crowder, Paul Joseph Watson, Mark Dice, and Sargon of Akkad. This community in turn overlaps and interacts with the alt-right community, including figures such as James Allsup, Black Pigeon Speaks, Varg Vikernes, and Red Ice.[5] The most extreme endpoint often involves fascism or belief in an international Jewish conspiracy,[2] though the severity of extremism can vary between individuals.[6]

The antifeminist Manosphere has been identified as another early point in the alt-right pipeline.[9] The men's rights movement often discusses men's issues more visibly than other groups, attracting young men with interest in such issues when no alternative is made available. Many right-wing internet personalities have developed a method to expand their audiences by commenting on popular media; videos that criticize movies or video games for supporting left-wing ideas are more likely to attract fans of the respective franchises.[6]

The format presented by YouTube has allowed extremists of all ideologies to access new audiences through this means.[6] The same process has also been used to facilitate far-left radicalization. The internet community BreadTube developed through the use of this pipeline process to introduce users to left-wing content and mitigate exposure to right-wing content,[6][10] though the pipeline process has been found to be less effective in spreading left-wing extremism due to the larger variety of opposing left-wing groups that limits interaction and overlap.[10] This dichotomy can also cause a "whiplash polarization" in which individuals are converted between far-right and far-left politics.[6]

Psychological factors

The psychological factors of radicalization through the alt-right pipeline are similar to other forms of radicalization, including normalization, acclimation, and dehumanization. Normalization involves the trivialization of racist and antisemitic rhetoric. Individuals early in the alt-right pipeline will not willingly embrace such rhetoric, but will adopt it under the guise of dark humor, causing it to be less shocking over time. This may sometimes be engineered intentionally by members of the alt-right to make their beliefs more palatable and provide plausible deniability for extreme beliefs. Acclimation is the process of being conditioned to seeing bigoted content. By acclimating to controversial content, individuals become more open to slightly more extreme content. Over time, conservative figures appear too moderate and users seek out more extreme voices. Dehumanization is the final step of the alt-right pipeline, where minorities are seen as lesser or undeserving of life and dehumanizing language is used to refer to people that disagree with far-right beliefs.[1]

The process is associated with young men that experience loneliness, meaninglessness, or a lack of belonging.[6] An openness to unpopular views is necessary for individuals to accept beliefs associated with the alt-right pipeline. It has been associated with contrarianism, in which an individual uses the working assumption that the worldviews of most people are entirely wrong. From this assumption, individuals are more inclined to adopt beliefs that are unpopular or fringe. This makes several entry points of the alt-right pipeline, such as libertarianism, effective: such ideologies attract individuals with traits that make them susceptible to radicalization when exposed to other fringe ideas.[11]

Motivation for pursuing these communities varies, with some people finding them by chance while others seek them out. Interest in video games is associated with the early stages of the alt-right pipeline.[6]

Prevention

Many social media platforms have recognized the potential of radicalization and implement measures to limit its presence. High-profile extremist commentators such as Alex Jones have been banned from several platforms, and platforms often have rules against hate speech and misinformation. In 2019, YouTube announced a change to its recommendation algorithm to reduce conspiracy theory related content.[6] Some extreme content, such as explicit depictions of violence, is typically removed on most social media platforms. On YouTube, content that expresses support of extremism may have monetization features removed, may be flagged for review, or may have public user comments disabled.[7]

See also

References

  1. ^ a b c d e Munn, Luke (2019-06-01). "Alt-right pipeline: Individual journeys to extremism online". First Monday. doi:10.5210/fm.v24i6.10108. ISSN 1396-0466. S2CID 184483249.
  2. ^ a b c Evans, Robert (2018-10-11). "From Memes to Infowars: How 75 Fascist Activists Were "Red-Pilled"". Bellingcat. Retrieved 2022-10-27.
  3. ^ Wilson, Jason (2017-05-23). "Hiding in plain sight: how the 'alt-right' is weaponizing irony to spread fascism". The Guardian. Retrieved 2022-10-28.
  4. ^ Lewis, Rebecca (2018-09-18). Alternative Influence: Broadcasting the Reactionary Right on YouTube (Report). Data & Society.
  5. ^ a b c Ribeiro, Manoel Horta; Ottoni, Raphael; West, Robert; Almeida, Virgílio A. F.; Meira, Wagner (2020-01-27). "Auditing Radicalization Pathways on YouTube". FAT* '20: Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency: 131–141. doi:10.1145/3351095.3372879. ISBN 9781450369367. S2CID 201316434.
  6. ^ a b c d e f g h i j Roose, Kevin (2019-06-08). "The Making of a YouTube Radical". The New York Times. ISSN 0362-4331. Retrieved 2022-10-26.
  7. ^ a b Ledwich, Mark; Zaitsev, Anna (2020-02-26). "Algorithmic extremism: Examining YouTube's rabbit hole of radicalization". First Monday. doi:10.5210/fm.v25i3.10419. ISSN 1396-0466.
  8. ^ Munger, Kevin; Phillips, Joseph (2020). "Right-Wing YouTube: A Supply and Demand Perspective". The International Journal of Press/Politics. 27 (1): 186–219. doi:10.1177/1940161220964767. ISSN 1940-1612.
  9. ^ Mamié, Robin; Horta Ribeiro, Manoel; West, Robert (2021-06-21). "Are Anti-Feminist Communities Gateways to the Far Right? Evidence from Reddit and YouTube". 13th ACM Web Science Conference 2021. WebSci '21. New York, NY, USA: Association for Computing Machinery: 139–147. doi:10.1145/3447535.3462504. ISBN 978-1-4503-8330-1. S2CID 232045966.
  10. ^ a b Cotter, Kelley (2022-03-18). "Practical knowledge of algorithms: The case of BreadTube". New Media & Society: 146144482210818. doi:10.1177/14614448221081802. ISSN 1461-4448. S2CID 247560346.
  11. ^ Hermansson, Patrik; Lawrence, David; Mulhall, Joe; Murdoch, Simon (2020-01-31). The International Alt-Right: Fascism for the 21st Century?. Routledge. pp. 57–58. ISBN 978-0-429-62709-5.