Digital regulation must empower people to make the internet better
Amanda Keton is general counsel for the Wikimedia Foundation (https://wikimediafoundation.org/). Christian Humborg is the executive director of Wikimedia Deutschland (https://www.wikimedia.de/).

As COVID-19 spread rapidly across the world in 2020, people everywhere were hungry for reliable information. A global network of volunteers rose to the challenge, consolidating information from scientists, journalists and medical professionals and making it accessible to everyday people.

Two of them live almost 3,200 kilometers from one another. Dr. Alaa Najjar (https://www.thenationalnews.com/arts-culture/2021/08/16/arab-medical-professional-alaa-najjar-honoured-by-wikipedia-for-covid-19-coverage/) is a Wikipedia volunteer and medical doctor who spends breaks during his emergency room shifts addressing COVID-19 misinformation on the Arabic version of the site. Sweden-based Dr. Netha Hussain (https://timesofindia.indiatimes.com/nri/indian-doctor-in-sweden-is-a-prominent-frontline-warrior-against-covid-fake-news/articleshow/85626467.cms), a clinical neuroscientist and doctor, spent her downtime editing COVID-19 articles in English and Malayalam (a language of southwestern India), later focusing her efforts on improving Wikipedia articles about COVID-19 vaccines.

Thanks to Najjar, Hussain and more than 280,000 other volunteers, Wikipedia emerged as one of the most trusted sources of up-to-date, comprehensive knowledge about COVID-19 (https://www.who.int/news/item/22-10-2020-the-world-health-organization-and-wikimedia-foundation-expand-access-to-trusted-information-about-covid-19-on-wikipedia), spanning nearly 7,000 articles in 188 languages. Wikipedia's reach and its ability to support knowledge-sharing on a global scale, from informing the public about a major disease to helping students study for tests, are only made possible by laws that enable its collaborative, volunteer-led model to thrive.

As the European Parliament considers new regulations, such as the Digital Services Act (DSA), aimed at holding Big Tech platforms accountable for illegal content amplified on their websites and apps, it must protect citizens' ability to collaborate in service of the public interest.

Lawmakers are right to try to stem the spread of content that causes physical or psychological harm, including content that is illegal in many jurisdictions. As they weigh the comprehensive DSA's many provisions, we welcome some of the proposed elements, including requirements for greater transparency about how platforms' content moderation works.

But the current draft also includes prescriptive requirements for how terms of service must be enforced. At first glance, these measures may seem necessary to curb the rising power of social media, prevent the spread of illegal content and ensure the safety of online spaces. But what happens to projects like Wikipedia? Some of the proposed requirements could shift power further away from people and toward platform providers, stifling digital platforms that operate differently from the large commercial ones.

Big Tech platforms work in fundamentally different ways than nonprofit, collaborative websites like Wikipedia. All of the articles created by Wikipedia volunteers are available for free, without ads and without tracking our readers' browsing habits. The commercial platforms' incentive structures maximize profit and time on site, using algorithms that leverage detailed user profiles to target people with the content most likely to influence them. They deploy still more algorithms to moderate content automatically, which produces errors of both over- and under-enforcement. Computer programs, for example, often mistake artwork (https://www.npr.org/2021/10/20/1047623196/vienna-museums-artwork-onlyfans) and satire (https://www.theatlantic.com/international/archive/2018/05/germany-facebook-afd/560435/) for illegal content, while failing to grasp the human nuance and context needed to enforce platforms' actual rules.

The Wikimedia Foundation (https://wikimediafoundation.org/) and country-based affiliates such as Wikimedia Deutschland (https://www.wikimedia.de/) support Wikipedia volunteers and their autonomy in deciding what information belongs on Wikipedia and what does not. The online encyclopedia's open editing model is grounded in the belief that people should decide what information stays on Wikipedia, applying established, volunteer-developed rules on neutrality and reliable sources.

This model ensures that, for any given Wikipedia article on any subject, the people who know and care about its topic enforce the rules about what content is allowed on the page. What's more, our content moderation is transparent and accountable: all conversations between editors on the platform are publicly accessible. It is not a perfect system, but it has largely worked to make Wikipedia a global source of neutral and verified information.

Forcing Wikipedia to operate more like a commercial platform, with a top-down power structure and no accountability to our readers and editors, would arguably subvert the DSA's public-interest intentions by leaving our communities out of important decisions about content.

The internet is at an inflection point. Democracy and civic space are under attack in Europe and around the world. Now more than ever, all of us need to think carefully about how new rules will foster, not hinder, an online environment that allows for new forms of culture, science, participation and knowledge.

Lawmakers can engage with public-interest communities such as ours to develop standards and principles that are more inclusive, more enforceable and more effective. But they should not impose rules aimed solely at the most powerful commercial internet platforms.

We all deserve a better, safer internet. We call on lawmakers to work with collaborators across sectors, including Wikimedia, to design regulations that empower citizens to improve it, together.