
User:Mcoop23/sandbox


Being Bold is important on Wikipedia.


Editing Wikipedia Articles


“Yik Yak”

Wikipedia’s “Yik Yak” article outlines the history, features, and controversies surrounding this social media application. Each fact in the article appears to be referenced with an appropriate, reliable source. These sources include information directly from Yik Yak’s website, stories from a variety of newspapers and other news outlets such as the Huffington Post and the Business Journal, and reports from national news organizations such as NBC and NPR. The facts in the article are easy to trace to their respective sources, which contributes to the article’s overall reliability.

Because the article topic is “Yik Yak,” everything in the article relates either to the Yik Yak organization itself or to the social media app. The discussion of the organization’s history, the explanation of how the app works, and the coverage of the controversies surrounding the platform each add to the article’s relevance and clarity.

When I initially read the article, I did not notice any bias. Looking back at the article’s composition, however, a majority of the article focuses on the controversies surrounding Yik Yak as an organization and as a social media platform. While the language itself is not biased, the emphasis on cyberbullying and on Yik Yak’s practice of deleting posts that mention its competitors conveys some bias against the company. I would therefore argue that the article is not completely neutral, as it leans slightly against Yik Yak as a social media platform.

Most of the sources cited in the article are neutral, as they come from well-known national news outlets such as NBC and the Chicago Tribune. However, some of the information is cited from blogs and editorials written about Yik Yak. Because these blogs and editorials reflect individuals’ opinions about the organization, they are biased sources, and the article does not identify them as such.

As I mentioned above, a majority of the article is spent discussing the specific controversies surrounding the Yik Yak application, so the controversial viewpoints are definitely overrepresented. Conversely, viewpoints about the benefits of Yik Yak, such as its use in suicide prevention, are very much underrepresented; including them could help offset some of the bias evident in the article.

In terms of citations, all of the links that I checked were active and worked. The article’s authors paraphrased their sources appropriately; I did not detect any plagiarism or close paraphrasing.

Though the article mentions Yik Yak laying off 60 percent of its employees in December 2016, it provides little further information on this point. More could be included about the company’s current financial state, its operations, and its future.

After reading the “Talk” page, it is clear that other editors have noted bias within the article as well. The page explicitly says, “This page needs attention,” which indicates to me that the article is not as reliable or neutral as it should be. The page also indicates that the article is part of WikiProject Apps, an effort to develop Wikipedia’s coverage of applications.

The “Yik Yak” article has been assigned a “Start-Class” rating on the quality scale. This indicates that the article is far from complete and still developing, and that some of its sources may not be adequate or reliable. I agree with this rating: the article is clearly incomplete in terms of content, and biased sources and biased content are found within it. The article needs more work in order to provide Wikipedia users with appropriate, relevant information.


“Filter Bubble”

Wikipedia’s “Filter Bubble” article details this emerging phenomenon: the hypothesis that search engines and social media platforms, such as Google and Facebook, use personalized search results to limit the information to which an individual user is exposed based on what the user would like to see. If these platforms do create filter bubbles, individuals could effectively be isolated in their own cultural spheres, surrounded by other people who think and believe the same things they do. Each fact in the article is referenced with an appropriate, reliable source, including USA Today, the Chicago Tribune, the Huffington Post, and Science magazine. The facts embedded in the article can easily be traced back to the sources listed at its conclusion.

To a certain extent, everything in the article was relevant to the topic, though the “filter bubble” topic is very broad and can cover a wide array of information. One particular element distracted me from the overall topic: the authors repeatedly refer to Eli Pariser and his book, “The Filter Bubble.” While Pariser appears to be an expert on the subject, the article discusses his theories and opinions more than it introduces readers to other sources of information. Because Pariser’s theories are referenced so frequently, his own position is evident, which affects the neutrality of the article. I felt that this detracted significantly from the article’s overall content.

As I mentioned above, the article appears to be lacking in neutrality. Filter bubbles are a contentious subject because they have the potential to invade individuals’ privacy, and the bias against them is particularly evident throughout the article. The authors discuss essentially only one side of the argument, the side against filter bubbles, using Eli Pariser’s theories to describe the negative aspects of filter bubbles and the harmful ways in which social media users can be affected. Though the article mentions studies that discount the filter bubble effect, these studies are presented with little substance or scientific backing. The article is thus clearly not neutral; it is heavily biased against filter bubbles because of their harmful effects on individuals’ social interactions and culture.

Information in this article comes from a variety of sources, including USA Today, the Chicago Tribune, the Huffington Post, Science magazine, NPR, CNN, The Atlantic, and The Wall Street Journal. Each of these is a credible, neutral source with reliable information. The viewpoints of Eli Pariser, drawn from several of his publications, are also referenced throughout the article. Though the article qualifies these as his viewpoints, they still affect its overall neutrality.

Because of the evident bias, the negative aspects of filter bubbles are significantly overrepresented, while viewpoints in favor of filter bubbles are greatly underrepresented.

Each of the links to the citations works, and the editors of the article paraphrased these sources well. I did not detect any close paraphrasing or plagiarism.

After reading the “Talk” page associated with this article, it is clear that much discussion took place during the editing and review process. Editors disagreed with some of the items mentioned in the article and felt that it was incomplete. At the bottom of the “Talk” page is a subsection called “Issues,” which details the problems with the article, many of which I addressed above. One important thing I had not considered is that direct quotes are generally discouraged in Wikipedia articles, yet many direct quotes are embedded in this one. This further undermines the article’s reliability and credibility. Ultimately, the general consensus is that the article is superficial, subjective, and biased, and that it does not go into enough depth on the topic of filter bubbles.

This Wikipedia article has been given a C-Class rating, which indicates that the article is substantive but is either missing significant information or contains too much irrelevant material. I agree with this rating because of the article’s clear biases and its lack of in-depth information pertinent to the topic. The fact that Eli Pariser’s subjective thoughts on filter bubbles dominate the article shows that it is not nearly as objective as it should be. Further, the article introduces particular aspects of filter bubbles but does not provide additional details to support these claims. Though the article has the potential to be highly credible and informative, significant improvements are needed to develop its quality and content.

--Mcoop23 (talk) 17:30, 6 February 2017 (UTC)


My Plan to Contribute


As I mentioned in my evaluation of the "Filter Bubble" article, much emphasis is placed on Eli Pariser's theories regarding the influence and prevalence of the filter bubble. This emphasis is justified to a degree, as Pariser coined the term "filter bubble" and is an expert in the field. However, I would argue that the article gives him a little too much weight. I therefore hope to contribute to the article by incorporating other reliable sources that discuss the topic; a multitude of perspectives on the "filter bubble" will increase the article's reliability and credibility.

--Mcoop23 (talk) 22:18, 12 February 2017 (UTC)


"Filter Bubble" Draft


Ethical Implications of the Filter Bubble

The emergence of new technologies in the twenty-first century, and the ways in which they are regulated, carry significant implications for security, ethics, and personal freedom.[1] The use of filter bubbles in popular social media and personalized search sites dictates the particular content seen by users, often without their direct consent or cognizance.[2] As the popularity of cloud services increases, the personalized algorithms used to construct filter bubbles will inevitably become more widespread.[2] Filter bubbles could therefore cause individuals to lose autonomy over their own social media platforms and allow their identities to be socially constructed without their awareness.[2]

Social media platforms, such as Facebook, personalize users’ news feeds based on recent activity, such as comments and “likes”.[3] Consequently, individual users are not exposed to differing points of view, which reinforces their confirmation bias.[3] Additionally, social sorting and other unintentional discriminatory practices could arise from personalized filtering.[4] Though this concern is difficult to quantify, it is important to consider the acceptability of such practices from an ethical standpoint.[4]

Filter bubbles should also be considered from the perspective of technologists, social media engineers, and computer specialists.[5] When evaluating this issue, users of personalized search engines and social media platforms must understand that their information is not private.[5] This raises the question of whether it is ethical for these information technologists to take users’ online activity and manipulate their future exposure to information.[5]

A majority of social media users utilize these platforms as their primary source of news.[6] Accordingly, the ethical and moral considerations surrounding filter bubbles are important because of the potential for exposure to biased, misleading information.[7]




I mistakenly put this on my Talk page rather than in my Sandbox. I apologize for the confusion!!

--Mcoop23 (talk) 15:10, 26 March 2017 (UTC)

  • Hi Mcoop23! I've looked over the article. I'm concerned that this reads a little like an academic paper rather than an encyclopedia article. It's very well written, but it does seem to contain some original research on your part. By this I mean that you've drawn your own conclusions from the source material, and while they are almost certainly correct, some of the sources you've used do not explicitly state these points. That's one of the main differences between academic papers/essays and encyclopedia articles. It's not an overwhelmingly huge issue here - most of this can be fixed with a little tweaking and by finding sources that back up the claims that aren't clearly stated in the current sourcing. Also, some of this is already in the reaction section to a certain degree, so we could probably condense some of this a little as well. Let me know what you think. Shalor (Wiki Ed) (talk) 04:53, 5 April 2017 (UTC)
  1. ^ Al-Rodhan, Nayef. "The Many Ethical Implications of Emerging Technologies". Scientific American. Retrieved 2017-04-05.
  2. ^ a b c Dechesne, F.; Van den Hoven, M. J; Warnier, M. E. (2011). "Requirements for Reconfigurable Technology: a challenge to Design for Values" (PDF). 1st International Workshop on Values in Design—Building Bridges Between RE, HCI and Ethics.
  3. ^ a b El-Bermawy, Mostafa M. "Your Filter Bubble is Destroying Democracy". WIRED. Retrieved 2017-04-05.
  4. ^ a b Zuiderveen Borgesius, Frederik J.; Trilling, Damian; Möller, Judith; Bodó, Balázs; de Vreese, Claes H.; Helberger, Natali (2016). "Should we worry about filter bubbles?". Internet Policy Review. 5 (1). doi:10.14763/2016.1.401. S2CID 52211897.
  5. ^ a b c "The Filter Bubble Raises Important Issues – You Just Need To Filter Them Out For Yourself". Rainforest Action Network. Retrieved 2017-04-05.
  6. ^ Newton, Casey (2016-11-16). "The author of The Filter Bubble on how fake news is eroding trust in journalism". The Verge. Retrieved 2017-04-05.
  7. ^ "How to Burst the "Filter Bubble" that Protects Us from Opposing Views". MIT Technology Review. November 29, 2013. Retrieved 2017-04-05.