Wikipedia talk:Wikipedia Signpost/2013-09-04/News and notes

From Wikipedia, the free encyclopedia

Discuss this story

Article creation bot finishes first run

Not only is Swedish Wikipedia now overflowing with hundreds of thousands of stubs based on outdated taxonomic information, but all of this outdated information is now being copied en masse to other language wikis and Wikidata, from which it will eventually work its way to English Wikipedia, polluting our hand-built mostly-up-to-date taxonomic data with boatloads of crap.

  • First rule of article creation bots: Never build article creation bots.
  • Second rule of article creation bots: If you're going to build articles based on third-party databases, use only the most specific, specialized, up-to-date databases available, not huge, generalized databases that don't bother to keep their data current.

Kaldari (talk) 22:26, 6 September 2013 (UTC)[reply]

As an active Wikidata user, I'm always highly concerned with the prospect of bad information coming over to Wikidata. Where is the bot operator's plan for Wikidata published? Sven Manguard Wha? 04:27, 7 September 2013 (UTC)[reply]

I agree completely. Even with all the resources of English Wikipedia, we have hundreds of thousands of poorly maintained and poorly watched articles that were created either by bots or by users in systematic ways. A smaller wiki has no chance of maintaining hundreds of thousands of micro-stubs. I think the root of the problem is inherent or assumed notability for certain classes of things, but as long as we have the flawed notability standards, we need to at least use discretion in proliferating these articles, with a mind for the resources required to maintain them as thousands of micro-stubs versus fewer summary-style articles. Gigs (talk) 15:10, 9 September 2013 (UTC)[reply]
I agree that taxonomic bots create a lot of crap; look at this list of suspected duplicates on Wikidata.
But, in defense of the Swedes, their Swedish lakes project is a rather good example of bot generation of articles: take several reliable sources, prepare, get consensus, and generate articles that people can add to with more text and photos.
And I really like this: "Our next initiative we are working with is to get all data of Swedish communes, cities and towns 100% correct in Wikidata (and also a semiautomatic update link to Wikidata from the Swedish statistical authorities' databases). We thought our articles on these subjects were fine, but find we need to put in 6-9 months' time to get the data from fine to 100% correct, and all the relevant data elements in place in Wikidata, even if it is only a few thousand articles. When we are ready we will have all the base data for these entities taken from Wikidata (not giving much improvement), but more important, we will be able to provide 100% quality data for other language versions to semiautomatically get data (or generate articles) on these subjects, where we feel a special responsibility to secure global quality." If other projects take the same initiative for Polish, French, German, ... communes, cities and towns, that would be a huge step for Wikidata! --Atlasowa (talk) 08:54, 11 September 2013 (UTC)[reply]

Illustrations and jokes

The Japanese have made this, so I'm not surprised by the 9:1 ratio. I'm happy that the foundation chose to experiment with that. --NaBUru38 (talk) 05:58, 7 September 2013 (UTC)[reply]