Wikipedia:Wikipedia Signpost/2023-10-23/News and notes
Where have all the administrators gone?
Record low number of active administrators
We have had eight successful candidacies for adminship so far in 2023, just one more than in 2021, the worst-ever year for RfA. The number of active administrators started 2023 at 500, took a sharp dive in mid-February for reasons The Signpost has not been able to determine, touched 452 a couple of times in April, and held steady around 460 until October.
On October 18, we hit 448 active admins, a new low for at least a decade. Indeed, to find the last time the English Wikipedia had fewer than 449 active admins, we have to go back to 2005.[a]
The reasons for this are cloudy and have been covered before by The Signpost, for instance at "Administrator cadre continues to contract – more" in January 2022.
In a recent Administrators' noticeboard discussion titled "Twelve fewer administrators", BeanieFan11 noted that in not one month this year has there been a net gain of administrators, and that all but one month so far have seen a net loss, some of them large. Per the admins' newsletter:
- January: +3, -11, net -8
- February: +1, -5, net -4
- March: +1, -2, net -1
- April: +1, -1, net 0 (the only month without a negative net)
- May: +1, -4, net -3
- June: +1, -3, net -2
- July: +1, -8, net -7
- August: +1, -4, net -3
- September: +2, -4, net -2
- October: +1, -12, net -11
- Overall: +13, -53, net -40
At least one admin departure was fallout from the WP:ROADS controversy, which ended in a content fork and several departures, including that of Rschen7754, who resigned as both administrator and editor. – B
- ^ According to RickBot updates to WP:List of administrators, which began in 2014, and charts at User:Widefox/editors before that. Anomalous data from 11–28 September 2021 are excluded.
Knowledge Equity Fund
The Wikimedia Foundation has published comprehensive notes from the recent community call about the Knowledge Equity Fund on Meta. The notes include a Q&A. The WMF also highlights that the Fund has helped it to make new connections:
We contacted user groups and connected the grantees with them geographically or thematically, explaining the objectives of the fund. We are also trying to create new synergies between Wikimedia user groups and external groups to increase our impact.
A few examples of connections we made are:
- Project Multatuli, which we connected with Wikimedia Indonesia
- Create Caribbean were connected with Noircir, Wiki Cari UG, Whose Knowledge, Projet:Université de Guyane and WikiMujeres
- Black Cultural Archives were connected with Noircir, Whose Knowledge and Wikimedia UK
- Criola were connected with Whose Knowledge, WikiMujeres and Mujeres (mulheres) LatinoAmericanas in Wikimedia and
- Data for Black Lives which we connected with AfroCrowd and Black Lunch Table
Through these connections, we have seen positive synergies within the movement at large.
An ongoing English Wikipedia Village Pump Request for Comment on the controversial fund stands at 35:23 in favour of adopting the following non-binding resolution:
The English Wikipedia community is concerned that the Wikimedia Foundation has found itself engaged in mission creep, and that this has resulted in funds that donors provided in the belief that they would support Wikimedia Projects being allocated to unrelated external organizations, despite urgent need for those funds to address internal deficiencies.
We request that the Wikimedia Foundation reappropriates all money remaining in the Knowledge Equity Fund, and we request that prior to making non-trivial grants that a reasonable individual could consider unrelated to supporting Wikimedia Projects that the Foundation seeks approval from the community.
– AK
Community rejects proposal to create policy about large language models
The RfC was closed with the following summary:
There is an overwhelming consensus to not promote. 1 editor would promote to policy, 7 editors prefer guideline, and 30 editors were against promotion. 2 editors were fine with either policy or guideline. [...] The most common and strongest rationale against promotion (articulated by 12 editors, plus 3 others outside of their !votes) was that existing P&Gs [policies and guidelines], particularly the policies against vandalism and policies like WP:V and WP:RS, already cover the issues raised in the proposals. 5 editors would ban LLMs outright. 10-ish editors believed that it was either too soon to promote or that there needed to be some form of improvement. On the one hand, several editors believed that the current proposal was too lax; on the other, some editors felt that it was too harsh, with one editor suggesting that Wikipedia should begin to integrate AI or face replacement by encyclopedias that will. (2 editors made a bet that this wouldn't happen.)
Editors who supported promoting to guideline noted that Wikipedia needs to address the use of LLMs and that the perfect should not be the enemy of the good. However, there was no general agreement on what the "perfect" looked like, and other editors pointed out that promoting would make it much harder to revise or deprecate if consensus still failed to develop.
Similarly, on Wikimedia Commons a page collecting guidance about AI-generated media (particularly the use of generative AI models such as DALL-E, Stable Diffusion or Midjourney), also created in December 2022, is still marked as "a work in progress page", although it appears to have progressed somewhat further towards reflecting community consensus.
In any case, discussions about generative AI are continuing across the Wikimedia movement, including in off-wiki fora such as the "ML, AI and GenAI for Wikimedia Projects" Facebook group and the "Wikimedia AI" group on Telegram (non-public, but with a public invite link). At Wikimania 2023, the topic was the subject of various sessions, including two panels titled "AI advancements and the Wikimedia projects" (video) and "ChatGPT vs. WikiGPT: Challenges and Opportunities in harnessing generative AI for Wikimedia Projects" (video). The September edition of the Wiki Education Foundation's "Speaker Series" likewise took up the topic "Wikipedia in a Generative AI World", featuring three speakers including Aaron Halfaker (User:EpochFail), a former research scientist at the Wikimedia Foundation and developer of the AI-based ORES system that is still widely used for vandalism detection and other purposes. – H
Several European regulation efforts may adversely affect Wikimedia projects
In its EU Policy Monitoring Report for September, Wikimedia Europe highlights several legislative efforts that are ongoing on the continent. Some of them raise concerns regarding their possible impact on Wikipedia and other Wikimedia projects:
- The EMFA (European Media Freedom Act) is "intended to help a pluralistic media landscape", but also contains problematic provisions, e.g. a requirement for online platforms to warn "media providers, who can be media outlets but also individuals, such as journalists [...] ahead of moderating their content and to give them a fast-track channel to contest decisions. Some lawmakers even suggest that online platforms be prohibited from deleting content by media providers before the provider has had a chance to reply. All this is highly problematic, seeing that disinformation is sometimes produced by media providers." Efforts to exempt Wikimedia projects or at least non-profit "online encyclopaedias" succeeded initially but then were in jeopardy again. However, negotiations are expected to continue into 2024.
- The controversial Regulation to Prevent and Combat Child Sexual Abuse (CSAR) proposed by EU Commissioner Ylva Johansson is reported to have "stalled somewhat" recently. It would cover Wikimedia projects too, "and the Wikimedia Foundation has provided [already in 2022] constructive feedback, outlining some risks and challenges posed by the scanning technologies used. Wikimedia is also criticising the idea to scan direct, interpersonal communication in a general manner and without judicial oversight."
- In France, the proposed Loi SREN "would introduce some provisions on data retention and user identification, in order to not allow already banned users to re-register. That would require the collection of heaps of data and the compulsory identification of all users. Wikimedia projects are squarely in the scope of this proposal." Initial efforts to "take our projects out of the fireline" have failed.
– H
Brief notes
- Annual reports: Wikimedistas de Bolivia, Tyap Wikimedians User Group
- Wikimania videos are up: Two months after Wikimania (which took place in Singapore as an in-person event, for the first time since 2019), video recordings of the conference's sessions have become available on Commons (joining unedited recordings that were already published during the event on YouTube).
- Global bans: Gustin Kelly, since 18 October 2023
- Articles for Improvement: This week's Article for Improvement is Power (social and political). Please be bold in helping improve this article!
Discuss this story
Record low number of active administrators
The net -40 admins this year refers to all administrators, not active ones, which is partially a function of changes to inactivity desysop practice. Aside from the weird one-off drop from 8 February (498) to 9 Marchish (462), depending on how exactly you assess it the drop in active admins during 2023 is somewhere around 10-15 (eg. there are 449 as of last update). CMD (talk) 07:17, 23 October 2023 (UTC)
I'm not familiar with the admin or arbcom case in question, but the record of that arbitration paints a different picture than you are presenting (e.g. he or she "regularly performed deletions that do not comply with the deletion policy" among other things), and it substantiates my point, that it takes the Wikipedia equivalent of World War II to dislodge bad admins from their posts. Calling such positions the "mop" as some do, and saying it is "no big deal" is delusional, as if it was such a triviality, admins that bad would be desysopped right and left. Coretheapple (talk) 14:22, 26 October 2023 (UTC)
And bang goes another admin: Wikipedia:Arbitration/Requests/Case#Lourdes. Oh well. Gråbergs Gråa Sång (talk) 09:50, 2 November 2023 (UTC)
Knowledge Equity Fund
Six(!) digits on that particular KEF grant, presumably. And what does it mean for Wikimedia? All I can find is more happy marketing corpo speak at [1] that "They will be receiving a one year grant of $100,000, which they will use to launch a Movement Scientists Fellowship. This Fellowship will match racial justice leaders with machine learning research engineers to develop data-based machine learning applications to drive change in the areas of climate, genetics, and economic justice. They will also launch a new series of educational programs, such as free and open oral histories that promote data literacy." I applaud what those NGOs are doing - but I don't see why we should be funding them?? (Bonus points for anyone who can point me to where that particular grant proposal actually exists, I couldn't even find it through Google or meta search). --Piotr Konieczny aka Prokonsul Piotrus| reply here 10:55, 23 October 2023 (UTC)