Wikipedia:Wikipedia Signpost/2023-12-24/Apocrypha

Apocrypha

Local editor discovered 1,380 lost subheadings in ancient Signpost scrolls. And what he found was shocking.

[Image: screenshot of lines of text on a black background] What it feels like to chew Signpost gum.

Yeah, you got clickbaited. Anyway, here's the deal:

Recently, I wrote and deployed an argosy of scripts (covered in more detail here) to extract 1,380 lost subheadings from the revision history of the Signpost's main page. These are now in their respective articles' header templates (and from there, in the module indices that serve as an article database). While this allows for much broader flexibility in our display methods, that isn't very exciting (or at least not until these display methods are actually put into practice). What is exciting — or at least mildly amusing — is a whirlwind tour of the never-before-seen Signpost greatest-hits compilation.
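In case you're wondering what "an argosy of scripts" boils down to in practice: the core of it is just walking the main page's revision history through the MediaWiki API and scraping whatever passed for a subheading in each era of markup. Here's a minimal sketch of that idea, not the actual scripts; the API endpoint and parameters are real, but the "blurb" parameter and the regex are placeholders for the various formats the front page actually used over the years:

import re
import requests

API = "https://en.wikipedia.org/w/api.php"
HEADERS = {"User-Agent": "signpost-subheading-salvage/0.1 (example sketch)"}

def revisions(title, limit=200):
    """Yield (timestamp, wikitext) for revisions of `title`, newest first."""
    params = {
        "action": "query", "format": "json", "formatversion": "2",
        "prop": "revisions", "titles": title,
        "rvprop": "timestamp|content", "rvslots": "main", "rvlimit": "50",
    }
    fetched = 0
    while True:
        data = requests.get(API, params=params, headers=HEADERS, timeout=30).json()
        for rev in data["query"]["pages"][0].get("revisions", []):
            yield rev["timestamp"], rev["slots"]["main"]["content"]
            fetched += 1
            if fetched >= limit:
                return
        if "continue" not in data:
            return
        params.update(data["continue"])

# Placeholder pattern: the front-page markup changed repeatedly, so the real
# scripts had to handle several formats, not one tidy named parameter.
BLURB = re.compile(r"\|\s*blurb\s*=\s*(.+)")

for ts, text in revisions("Wikipedia:Wikipedia Signpost"):
    for m in BLURB.finditer(text):
        print(ts, m.group(1).strip())

From there it's a matter of deduplicating the hits, matching each one to the article it belonged to, and writing it into that article's header template.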

Basically, the subheadings were introduced in July 2012, as part of the perpetual effort to keep the Signpost modern and bumpin' — they started out as simple excerpts from articles that were shown on the main page. In 2017, they started being incorporated by default into the RSS-description templates — these are invisible, displaying nowhere on the article page itself, but providing metadata in the HTML — and began to assume their current form (brief, couple-sentence-long hooks). Well, I went through and put all of them into RSS-description templates, so now there are 2,464 articles with machine-readable subheadings, out of 5,462 Signpost articles in total — i.e., there are precisely 2,998 articles from before July 2012 that just never had headlines in the first place. Well, whatevs.
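If you want to sanity-check those numbers yourself, the bookkeeping is nothing fancier than scanning each article's wikitext for an RSS-description template and tallying the hits. A toy sketch, with the template name in the regex as a stand-in rather than the template's actual title:

import re

# Placeholder name; substitute whatever the RSS-description template is
# actually called on the wiki.
RSS_DESC = re.compile(r"\{\{\s*Signpost RSS description\s*\|", re.IGNORECASE)

def tally(articles):
    """`articles` maps page title -> wikitext; returns (tagged, untagged)."""
    tagged = sum(1 for text in articles.values() if RSS_DESC.search(text))
    return tagged, len(articles) - tagged

# On the real corpus this works out to 2,464 tagged articles out of 5,462,
# leaving 2,998 older pieces with no headline at all.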

Some were missing, some were messed up, and some were just typos — honorable mentions to the 2018-02-20 humor column ("headline?") and the 2017-12-18 blog feature ("."). Among the rest, a few extremes stood out, which I had nothing better to do than put into tables for my own amusement — and maybe, dear reader, yours as well.

All-time shortest headlines

Brawny
U2 Too
Mooned
Alexa
Paris
Viral
Bots
F*&!
How
Law

All-time longest headlines

The story of Wikipedia; Wikipedia reanimated and republished; UK government social media rules; death of Italian Wikipedia administrator
Paid editing service announced; Commercial exploitation of free images; Wikipedia as a crystal ball; Librarians to counter systemic bias
Reciprocity and reputation motivate contributions to Wikipedia; indigenous knowledge and "cultural imperialism"; how PR people see Wikipedia
Student attitudes towards Wikipedia; Jesus, Napoleon and Obama top "Wikipedia social network"; featured article editing patterns in 12 languages
Productive collaboration around coordinated protest marches; Media and political personalities comment on Wikipedia at its 16th birthday celebration
Research at Wikimania 2019: More communication doesn't make editors more productive; Tor users doing good work; harmful content rare on English Wikipedia
A grizzly bear, Operation Mascot, Freedom Planet & Liberty Island, cosmic dust clouds, a cricket five-wicket list, more fine art, & a terrible, terrible opera...
Conflict dynamics, collaboration and emotions; digitization vs. copyright; WikiProject field notes; quality of medical articles; role of readers; best wiki paper award
Lawyer goes to court to discover Wikipedian's identity; Storming Wikipedia; Wikimedia UK Secretary in conflict-of-interest controversy; Does Wikipedia need a "right to reply" box?
After the apocalypse, when zombies and aliens take over the Earth in a thousand years and dig up Wikipedia's servers but can only find talk pages without their accompanying articles, what will they think??

The longest among these is 206 characters long. I wonder if that fits into the modern display template? Oh, if only a highly stable genius had made it easy to retrieve and format Signpost article metadata... if only you could type something short and memorable like {{Signpost/snippet/autofill|article|2018-10-28|Humour}} and have it automatically render the full snippet template... but alas: there's no sufficiently handsome and wise programmer among us, capable of such heroic deeds.

Haha sike.[1]
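For the terminally curious, here is a rough Python analogue of what that autofill has to do under the hood: look the article up in the module index by date and title, then hand its stored metadata to the snippet template. The real thing is wikitext backed by the Lua module indices mentioned above, and the parameter layout and field names below are guesswork for illustration, not the template's actual signature.

# Hypothetical slice of a module index: (date, title) -> stored metadata.
INDEX = {
    ("2018-10-28", "Humour"): {
        "subheading": "...",  # the stored hook for that article
    },
}

def autofill(kind, date, title):
    """Expand a short call into a filled-in snippet template (illustrative only)."""
    meta = INDEX[(date, title)]
    return "{{{{Signpost/snippet|{}|{}/{}|subheading={}}}}}".format(
        kind, date, title, meta["subheading"]
    )

print(autofill("article", "2018-10-28", "Humour"))
# -> {{Signpost/snippet|article|2018-10-28/Humour|subheading=...}}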

All-time shortest subheadings

Update on EranBot, our new copyright violation detection bot ("Help wanted!")
Sawtpedia: Giving a Voice to Wikipedia Using QR Codes ("Sounds good!")
April Fools' through the ages, part two ("2011 and on.")
Wikipedia does not need you ("Get over it!")
5, 10, and 15 Years ago: September 2022 ("Yes, again.")
Enough time left to vote! IP ban ("Just do it!")
Maher stepping down ("UCC launch.")
Who tells your story on Wikipedia ("You can!")
Flyer22 Frozen ("RIP.")
A Festival Descends on the City: The Edinburgh Fringe, Pt. 2 ("Lo!")

All-time longest subheadings

Khan Academy's Smarthistory and Wikipedia collaborate
To many Wikimedians, the Khan Academy would seem like a close cousin: the academy is a non-profit educational website and a development of the massive open online course concept that has delivered over 227 million lessons in 22 different languages. Its mission is to give "a free, world-class education to anyone, anywhere." This complements Wikipedia's stated goal to "imagine a world in which every single person on the planet is given free access to the sum of all human knowledge", then go and create that world. It should come as no surprise, then, that the highly successful GLAM-Wiki (galleries, libraries, archives, museums) initiative has partnered with the Khan Academy's Smarthistory project to further both its and Wikipedia's goals.
Directing Discussion: WikiProject Deletion Sorting
This week, we uncovered WikiProject Deletion Sorting, Wikipedia's most active project by number of edits to all the project's pages. This special project seeks to increase participation in Articles for Deletion nominations by categorizing the AfD discussions by various topic areas that may draw the attention of editors. The project was started in August 2005 with manual processes that are continued today by a bevy of bots, categories, and transclusions. The project took inspiration from WikiProject Stub Sorting and some historical discussions on deletion reform. As the sheer number of AfDs continues to grow, the project is seeking better tools to manage the deletion sorting process and attract editors to comment on these deletion discussions.
Improved video support imminent and Wikidata.org live
The TimedMediaHandler extension (TMH), which brings dramatic improvements to MediaWiki's video handling capabilities, will go live to the English Wikipedia this week following a long and turbulent development, WMF Director of Platform Engineering Rob Lanphier announced on Monday ... Wikidata.org, a new repository designed to host interwiki links, launched this week and will begin accepting links shortly. The site, which is one half of the forthcoming Wikidata trial (the other half being the Wikidata client, which will be deployed to the Hungarian Wikipedia shortly) will also act as a testing area for phase 2 of Wikidata (centralised data storage). The longer term plan is for Wikidata.org to become a "Wikimedia Commons for data" as phases 2 and 3 (dynamic lists) are developed, project managers say.
First chickens come home to roost for FDC funding applicants; WMF board discusses governance issues and scope of programs
The first round of the Wikimedia Foundation's new financial arrangements has proceeded as planned, with the publication of scores and feedback by Funds Dissemination Committee (FDC) staff on applications for funding by 11 entities—10 chapters, independent membership organisations supporting the WMF's mission in different countries, and the foundation itself. The results are preliminary assessments that will soon be put to the FDC's seven voting members and two non-voting board representatives. The FDC in turn will send its recommendations to the board of trustees on 15 November, which will announce its decision by 15 December. Funding applications have been on-wiki since 1 October, and the talk pages of applications were open for community comment and discussion from 2 to 22 October, though apart from queries by FDC staff, there was little activity.
Small Wikipedias' burden
In a certain way, writing Wikipedia is the same everywhere, in every language or culture. You have to stick to the facts, aiming for the most objective way of describing them, including everything relevant and leaving out all the everyday trivia that is not really necessary to understand the context. You have to use critical thinking, trying to be independent of your own preferences and biases. To some effect, that's all there is to it. Naturally, Wikipedians have their biases, some of which can never be cured. Most Wikipedians tend to like encyclopedias; but millions of people in the world don't share that bias, and we represent them rather poorly. I'm also quite sure that an overwhelming majority of Wikipedia co-authors are literate. Again, that's not true for everyone in this world. Yet we have other, less noticeable but barely less fundamental biases.
WMF and the German chapter face up to Toolserver uncertainty
The Toolserver is an external service hosting the hundreds of webpages and scripts (collectively known as "tools") that assist Wikimedia communities in dozens of mostly menial tasks. Few people think that it has been operating well recently; the problems, which include high database replication lag and periods of total downtime, have caused considerable disruption to the Toolserver's usual functions. Those functions are highly valued by many Wikimedia communities ... In 2011, the Foundation announced the creation of Wikimedia Labs, a much better funded project that among other things aimed to mimic the Toolserver's functionality by mid-2013. At the same time, Erik Möller, the WMF's director of engineering, announced that the Foundation would no longer be supporting the Toolserver financially, but would continue to provide the same in-kind support as it had done previously.
Hoaxes draw media attention; Sue Gardner's op-ed; Women of Wikipedia
Hoaxes draw media attention: On New Year's Day, the Daily Dot reported that a "massive Wikipedia hoax" had been exposed after more than five years. The article on the Bicholim conflict had been listed as a "Good Article" for the past half-decade, yet turned out to be an ingenious hoax. Created in July 2007 by User:A-b-a-a-a-a-a-a-b-a, the meticulously detailed piece was approved as a GA in October 2007. A subsequent submission for FA was unsuccessful, but failed to discover that the article's key sources were made up. While the User:A-b-a-a-a-a-a-a-b-a account then stopped editing, the hoax remained listed as a Good Article for five years, receiving in the region of 150 to 250 page views a month in 2012. It was finally nominated for deletion on 29 December 2012 by ShelfSkewed—who had discovered the hoax while doing work on Category:Articles with invalid ISBNs—and deleted the same day.
FDC's financial muscle kicks in
The WMF's Funds Dissemination Committee has published its recommendations for the inaugural round 1 of funding. Requests totalled US$10.4M, nearly all of the FDC's budget for both first and second rounds. The seven-member committee of community volunteers appointed in September advises the WMF board on the distribution of grant funds among applying Wikimedia organizations. The committee, which has a separate operating budget of $276k for salaries and expenses, considered 12 applications for funds, from 11 chapters and from the WMF itself for its non-core activities. The decision-making process included community and FDC staff input after October 1, the closing date for submissions. Taken together, the volunteers decided to endorse an average of 81% of the funding sought—a total of $8.43M, which went to 11 of the 12 applicants. This leaves $2.71M to be distributed in round 2, for which applications are due in little more than three months' time.
The ups and downs of September and October, plus extension code review analysis
The Wikimedia Foundation's engineering report for September 2012 was published this week on the Wikimedia Techblog and on the MediaWiki wiki, giving an overview of all Foundation-sponsored technical operations in that month (as well as brief coverage of progress on Wikimedia Deutschland's Wikidata project, phase 1 of which is edging its way towards its first deployment). Three of the seven headline items in the report have already been covered in the Signpost: problems with the corruption of several Gerrit (code) repositories, the introduction of widespread translation memory across Wikimedia wikis, and the launch of the "Page Curation" tool on the English Wikipedia, with development work on that project now winding down. The report also drew attention to the end of Google Summer of Code 2012, the deployment to the English Wikipedia of a new ePUB (electronic book) export feature, and improvements to the WLM app aimed at more serious photographers.
Wobbly start to ArbCom election, but turnout beats last year's
At the time of writing, this year's election has just closed after a two-week voting period. The eight seats were contested by 21 candidates. Of these, 15 have not been arbitrators (Beeblebrox, Count Iblis, Guerillero, Jc37, Keilana, Ks0stm, Kww, NuclearWarfare, Pgallert, RegentsPark, Richwales, Salvio giuliano, Timotheus Canens, Worm That Turned, and YOLO Swag); four candidates are sitting arbitrators (David Fuchs, Elen of the Roads, Jclemens, and Newyorkbrad); and two have previously served on the committee (Carcharoth and Coren). Four Wikimedia stewards from outside the English Wikipedia stepped forward as election scrutineers: Pundit, from the Polish Wikipedia; Teles, from the Portuguese Wikipedia; Quentinv57, from the French Wikipedia; and Mardetanha, from the Persian Wikipedia. The scrutineers' task is to ensure that the election is free of multiple votes from the same person, to tally the results, and to announce them. The full results are expected to be released within the next few days and will be reported in next week's edition of the Signpost.

The older subheadings tended to be longer (although I trimmed some of the most egregious ones when parsing them in). That last one is a whopping 1,070 characters. Let's see that monster in a snippet:

Boy oh boy!!!!!!! By the way, while we're on the subject, anyone wanna help fix all of this crap?

  1. ^ Please note that this autofill template is a grotesque hack which relies in turn on other grotesque hacks, and nobody should use it for anything serious or load-bearing. I use it here only to flex.