
Wikipedia launches first global code of conduct to combat site abuses


New York, USA (CU): Wikipedia, the free online encyclopedia, depends primarily on unpaid volunteers to manage user behavior issues. The online platform celebrated its 20th anniversary last month. On Tuesday, the foundation that runs Wikipedia launched its first global code of conduct, aiming to address allegations of harassment and a lack of diversity that it has long struggled to combat.

María Sefidari, the chair of the board of trustees of the non-profit Wikimedia Foundation, said, “We need to be much more inclusive. We are missing a lot of voices, we’re missing women, we’re missing marginalized groups.” She added, “A code of conduct without enforcement…is not going to be useful. We’re going to figure this out with the communities.”

Online networks have faced severe criticism over offensive behavior, abusive language and other objectionable material, forcing them to revise and more strictly enforce their content rules. The online encyclopedia relies mostly on unpaid volunteers to deal with user behavior problems, unlike Facebook Inc and Twitter Inc, which take more top-down approaches to content moderation. Wikimedia said that more than 1,500 Wikipedia volunteers from five continents and 30 languages were involved in drafting the new binding rules after the Board of Trustees voted to create them in May last year.

Katherine Maher, the executive director of the Wikimedia Foundation, said, “There’s been a process of change throughout the communities. It took time to build the support that was necessary to do the consultations for people to understand why this is a priority.” Maher added that fears among some users that the new rules would make the platform more centralized were unfounded.

The new code of conduct forbids abuse on and off the platform, prohibiting actions such as hate speech; insults, stereotypes or attacks based on personal traits; threats of physical harm; and hounding, or following users across different pages to repeatedly criticize their work. It also forbids the deliberate insertion of misleading or biased information into content. Compared with major social media platforms, which have struggled to control disinformation, Wikipedia is regarded as a reasonably reliable and trustworthy platform.

There are 230,000 volunteer editors working on crowdsourced articles on Wikipedia and over 3,500 administrators who can take action on those sites, such as blocking accounts or banning edits. Complaints are handled by committee members and users chosen by the communities. Wikimedia said the next stage of the project will focus on enforcing the new rules.

Maher said training will be provided for communities and interested task forces of users. She added that Wikimedia has no immediate plans to expand its small trust and safety team, a group of about a dozen staff who currently handle urgent issues such as death threats or the sharing of individuals’ private information.
