Wikipedia talk:Good articles/GAN Backlog Drives/January 2022

Reviews that started before Jan 1st

My guess is these reviews would not be eligible even if significant reviewing work continued throughout January, but I am asking because I started these in December; they are two of the oldest articles I could find whose reviews are still not completed, pending improvements. I'm unsure how to categorize such cases, namely Talk:Repatriation tax avoidance/GA1 and Talk:China–Pakistan Free Trade Agreement/GA1. ~ 🦝 Shushugah (he/him • talk) 11:39, 29 December 2021 (UTC)[reply]

In previous drives we counted reviews opened before the beginning of the drive, but only if the actual review didn't start before the drive began. Since these reviews appear to be already in progress, I wouldn't count them in January, although it's great to be reviewing articles anytime! (t · c) buidhe 13:04, 29 December 2021 (UTC)[reply]
I concur with buidhe. As reviewing has actually taken place in those examples, I would not count them. Trainsandotherthings (talk) 16:26, 29 December 2021 (UTC)[reply]

Progress

BlueMoonset, are you able to keep track of the progress chart and graph, as you have before? No worries if not; I can do it if needed. --Usernameunique (talk) 22:06, 31 December 2021 (UTC)[reply]

Usernameunique, I am. Note that I typically won't post the new day until sometime after 01:00 UTC, because I use the Wikipedia:Good article nominations/Report page numbers, which are more accurate than the numbers at the top of the WP:GAN page (the Report counts the actual nominations listed on WP:GAN, while the templates that calculate the top-of-page numbers have some phantom reporting in them). Once I get the Report numbers, I back out all the changes to WP:GAN between 00:00 and 01:00 to get the real midnight numbers, which then go into the chart and graph. I hope to have the first entries for both in about 90 minutes from now. (It looks like one reviewer has jumped the gun by about an hour, probably out of confusion as to when midnight UTC is; looks like an AGF moment.) BlueMoonset (talk) 23:36, 31 December 2021 (UTC)[reply]
Thanks as always, BlueMoonset. --Usernameunique (talk) 23:47, 31 December 2021 (UTC)[reply]
BlueMoonset, is it just me, or is the "Report" number of outstanding nominations currently one higher than the WP:GAN number? As of 01:00 UTC, the Report page says 231 nominations; currently the WP:GAN page shows only 232, despite having two nominations added since the Report ran. If I'm not seeing things, this is a phantom number in the opposite direction from previously. --Usernameunique (talk) 03:00, 19 January 2022 (UTC)[reply]
Usernameunique, sorry I didn't see this earlier. I've been checking (and it's been a bit of a moving target, with one more new nomination at 03:01 and nominations reported as passing at 03:01, 03:41, and 04:01, in the middle of my doing a comprehensive check). Taking the 01:00 numbers from the Report, it is indeed 388 total and 231 unreviewed. Add the 3 new ones and you get 391 total and 234 unreviewed; subtract the 3 passes, and you get 388 total and 234 unreviewed. When I check now, the Nominations page says 389 total and 234 unreviewed: the outstanding total is one higher on the Nominations page than the Report page, and the unreviewed nominations numbers match. So it looks like the Nominations page is still one higher than it ought to be; I did check all the counts for each subsection on the Nominations page and compared them to the Report page, and the Report is, as best I can tell, parsing and counting that page correctly. BlueMoonset (talk) 04:25, 19 January 2022 (UTC)[reply]
No worries, BlueMoonset. It's finicky, to say the least. I added up the numbers again last night (starting with the 01:00 report, adding and subtracting based on subsequent activity on the nominations page, and then comparing the final number to the current tally on the nominations page), and didn't get a discrepancy for total nominations, but did get a discrepancy of one for unreviewed nominations (233 for report+math, 232 for the nominations page). The tallying is below. As far as I can tell, the nominations page had the correct count, as doing a Ctrl+F for "(start review)" and "(discuss review)" adds up to the numbers at the top of the page.
Tally (total nominations / unreviewed nominations, time in UTC)
388/231 1:00 report
389/232 2:01
390/233 2:41
390/234 3:01
389/234 3:41
388/234 4:01
389/235 4:21
388/235 5:21
387/235 5:41
387/234 7:01
386/234 7:21
385/234 7:41
385/233 8:01
385/232 8:13 nominations page
As you pointed out, however, the tallying gets more complicated as the hours go by. I'll try to remember to take another look at the numbers tonight. --Usernameunique (talk) 17:49, 19 January 2022 (UTC)[reply]
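
A minimal sketch (in Python) of the reconciliation arithmetic described above, for anyone repeating it: start from an hourly Report baseline, apply each observed change to the nominations page, and compare the result with the page's own tally. The figures and event list are illustrative, taken from the 19 January discussion above, not pulled from the actual Report.

  # Baseline from the 01:00 Report: (total nominations, unreviewed nominations).
  baseline = (388, 231)

  # Each subsequent event as a (delta_total, delta_unreviewed) pair:
  # a new nomination is (+1, +1); a review being opened is (0, -1);
  # a pass/fail of a nomination that was under review is (-1, 0).
  events = [(+1, +1), (+1, +1), (+1, +1), (-1, 0), (-1, 0), (-1, 0)]

  def reconcile(baseline, events):
      total, unreviewed = baseline
      for d_total, d_unreviewed in events:
          total += d_total
          unreviewed += d_unreviewed
      return total, unreviewed

  expected = reconcile(baseline, events)   # -> (388, 234)
  page_tally = (389, 234)                  # what the top of WP:GAN showed
  print("total off by", page_tally[0] - expected[0],
        "; unreviewed off by", page_tally[1] - expected[1])

With these inputs the sketch reports the total as one higher on the nominations page than expected, matching the conclusion above.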
Stats from 00:05 UTC: Per the top of the nominations page, there are 376 nominations, 226 of them unreviewed. Doing a Ctrl+F for "(start review)" (224) and "(discuss review)" (152), however, turns up two discrepancies. These appear to be 2017 vote of no confidence in the government of Mariano Rajoy (still shown as under review, even though /GA1 was deleted), and 2022 College Football Playoff National Championship (under review since 14 January, but for some reason not showing as under review on the nominations page). Thus, I think the "true" count as of 00:05 is 376 nominations, 225 of them unreviewed. --Usernameunique (talk) 00:30, 20 January 2022 (UTC)[reply]
Update: That left one discrepancy—the top of the page said 226 unreviewed nominations, but the "true" number appeared to be 225. From looking at the talk pages of each article in Category:Good article nominees awaiting review, the discrepancy was caused by Krüper's nuthatch, which was listed in the category despite a review having been started at 23:22 on 19 January. --Usernameunique (talk) 00:48, 20 January 2022 (UTC)[reply]
I think the first point to make is that while the tally was 376 nominations, 226 unreviewed, according to the top of the page, the numbers on the Report page for the same time (though not collected until 01:00, the Report was using the page as it was at 00:00, and at 23:41 and 00:05, for that matter) were 375 and 225 respectively. However, since the Report considered the 2017 vote of no confidence in the government of Mariano Rajoy nomination to be malformed, it may have been omitted from the count entirely or partially. I'm afraid today is going to be inconclusive. Krüper's nuthatch has been fixed, as has 2017 vote of no confidence in the government of Mariano Rajoy (both involved edits to the article's talk page); the Talk:2022 College Football Playoff National Championship/GA1 review page wasn't formatted properly in the top section (the bot is very specific about what it expects to see, and if reviewers mess with it, the GAN page can show the nom as unreviewed or as reviewed by "Example"); it took me two edits to fix the problem. BlueMoonset (talk) 02:23, 20 January 2022 (UTC)[reply]
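
A rough sketch (again Python, with illustrative figures) of the Ctrl+F consistency check used above: count the "(start review)" and "(discuss review)" markers in the rendered text of the nominations page and compare them with the totals shown at the top. The page text is passed in as a string rather than fetched, to keep the sketch self-contained.

  def count_nominations(page_text):
      # "(start review)" appears once per unreviewed nomination,
      # "(discuss review)" once per nomination already under review.
      unreviewed = page_text.count("(start review)")
      under_review = page_text.count("(discuss review)")
      return {"unreviewed": unreviewed,
              "under review": under_review,
              "total": unreviewed + under_review}

  # Illustrative text matching the 00:05 check above (224 + 152 = 376).
  sample = "(start review) " * 224 + "(discuss review) " * 152
  counts = count_nominations(sample)
  assert counts == {"unreviewed": 224, "under review": 152, "total": 376}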

Let's do this!

Hi! Just wanted to share some encouragement with everyone. There's a ton of great work that merits more recognition. There are plenty of editors waiting for feedback that will take their articles over the finish line. There are plenty of articles that need to be failed and given time to improve through constructive comments and advice. But most importantly, there are plenty of us to tackle this backlog. I can't think of a better way to start the year than reading some great articles. Let's do this! :D Santacruz Please ping me! 13:16, 1 January 2022 (UTC)[reply]

Issues with copyvio?

Hi! I've had a bunch of trouble trying to check for copyvio using the Toolforge tool, due to 502 errors and other issues. Is the drive affecting the site's performance somehow? Santacruz Please ping me! 14:18, 2 January 2022 (UTC)[reply]

Who is supposed to tabulate the review count?

Am I supposed to manually add a count to two lines to tabulate each participant's total number of articles reviewed and the total number of old nominations reviewed, or is that something that happens towards the end? For now I have just manually listed the reviews in between the div tags. ~ 🦝 Shushugah (he/him • talk) 19:05, 2 January 2022 (UTC)[reply]

Shushugah, the coordinators will do that at the end of the drive. You need to add only the reviews (and whether they're old) between the div tags. --Usernameunique (talk) 00:55, 3 January 2022 (UTC)[reply]

BlueMoonset, you just added this back to the list of old nominations, despite a review being started yesterday. Is there something I am missing about the review, or was this inadvertent? Thanks, --Usernameunique (talk) 01:54, 3 January 2022 (UTC)[reply]

Usernameunique, it was deliberate: the nomination wasn't showing up on the WP:GAN page or the GAN Reports page as under review, so I added it back as unreviewed. However, it looks like the reason that it wasn't showing up in either place is that although the review page was created, it is malformed: the required header that's boilerplate for all GA review pages isn't there, causing it not to show up on the GAN page. I've just done the repair to the review page, and have removed my addition to the backlog page. The bot run in five minutes should take care of the rest. BlueMoonset (talk) 05:17, 3 January 2022 (UTC)[reply]

Nominations that reach 90 days during review

I was wondering whether nominations that reach 90 days old during the review count as old or not. For example, the nominations at the end of the table say "Eligible for an extra .5 point starting x January", but it's not clear whether that means when the review is started or when it is finished. eviolite (talk) 15:39, 3 January 2022 (UTC)[reply]

Eviolite, it means when the review is started. So if a review of Goo Hara, for example, is started on or after 6 January, it will be worth 1.5 points, but if started before 6 January, it will be worth 1 point. --Usernameunique (talk) 17:34, 3 January 2022 (UTC)[reply]

Review length

I'd like to remind everyone that there is a minimum review length required to receive credit in the drive, officially 1,000 characters, although reviewers have discretion not to count other short reviews. Some articles may not need a long review and that's fine, but in those instances you will not receive credit. Thanks. Trainsandotherthings (talk) 02:58, 4 January 2022 (UTC)[reply]

How pedantic are we being with this? I've got Talk:Gerard Gosselin/GA1 at 939 bytes, Talk:CitySpire/GA1 at 930 bytes, etc. I don't use any templates, try not to quote excessively, and generally avoid filling things with fluff, but those are slightly shorter reviews of articles in quite good shape. I guess I could fill space with fluff or boilerplate, but do we really want to encourage that? Hog Farm Talk 03:05, 4 January 2022 (UTC)[reply]
I'm interested in hearing what the other coords have to say on the subject. I didn't implement the minimum length requirement, I'm just enforcing it. Courtesy pings @Lee Vilenski: @GhostRiver: @Usernameunique:. Trainsandotherthings (talk) 03:28, 4 January 2022 (UTC)[reply]
My inclination would be that the main criterion just be that the review is clearly adequate, as opposed to something very brief such as Talk:60 Wall Street/GA1. Hog Farm Talk 03:34, 4 January 2022 (UTC)[reply]
I tend to focus more on checking quickfails than passes, and since the page size tool doesn't always faithfully transcribe review sizes, what I'm more looking for is a detailed delineation of why an article falls under one of the quickfail criteria. You can only improve an article if you know what its problems are, after all. — GhostRiver 05:22, 4 January 2022 (UTC)[reply]
One of my reviews, Talk:Music of Middle-earth/GA1, came out a bit short (862 bytes); the article is simply solid. I know you need some cut-off point, and that this one is really below the line, and that's totally OK. But I wonder if I should still list this review in any case (and let the coordinators decide in the end), or is it my responsibility to make sure my reviews are above the threshold? I'm also asking because it was an "old" article; if the review is considered too short, would I still get the half point for it? Maybe it would even be an option to generally award half a point when the review is an edge case length-wise? --Jens Lallensack (talk) 20:33, 4 January 2022 (UTC)[reply]
To be honest, I'm not a fan of short reviews such as the examples above. Even if an article unquestionably meets the good-article criteria, there is always something that could be improved. If a review of a 5,000-word article contains only half a dozen comments, my instinctive first reaction is normally that the reviewer likely did not take a close look at the article. Such an initial reaction might be incorrect, of course, and as it is, my reviews probably tend toward the longer side. So for something more objective, it's worth taking a look at what the instructions (particularly Step 3: Reviewing the article) say:
  • Based on the good article criteria, decide whether the article could be immediately passed or immediately failed.
  • Do not quick pass the nomination; an in-depth review must be performed to determine whether a nomination passes all of the good article criteria.
  • If the article is considered fully compliant with the good article criteria, provide a review on the review page justifying that decision and "pass" the nomination.
The instructions envision that an article can be passed without requesting any improvements from the nominator. Yet in all cases, they call for "an in-depth review", and state that the review must "justify" the "decision [to] 'pass' the nomination". As Talk:60 Wall Street/GA1, which Hog Farm pointed out, goes to show, a review may justify the decision to pass an article while not convincing anyone of the review's thoroughness. The bigger question, I think, is whether short reviews of the type discussed above can be considered "in-depth" reviews.
To answer the more immediate question—how does all this apply to checking the thoroughness of a review during this drive—I do think that assume good faith applies, especially if a reviewer with some short reviews has other reviews which are longer and/or demonstrate an understanding of the relevant criteria. Likewise, taking a look at the article itself helps to see whether the review really did have little ground to cover. Jens Lallensack, Music of Middle-earth might get by on this measure, but I'd be more concerned about Talk:Ruth Crosby Noble/GA1, which is a very short review (3 comments; less than 300 bytes) for an article that seems to have breadth issues. --Usernameunique (talk) 08:31, 5 January 2022 (UTC)[reply]
Thanks, all of that makes a lot of sense. I agree, and have removed the 300-byte review from my list (though I'm not sure the article has breadth issues; the sources just seem to be very sparse). --Jens Lallensack (talk) 08:51, 5 January 2022 (UTC)[reply]

Old nominations

From 94 nominations that were 90+ days old to 4 in only 7 days—not bad. GhostRiver, Lee Vilenski, and Trainsandotherthings, at the risk of getting ahead of ourselves here (as happened last drive), I propose changing the rule from articles becoming eligible for the extra .5 point 10 days at a time (90+ days old, then 80+ days, then 70+ days, etc.) to 15 at a time (90, 75, 60, etc.). The primary benefit is that it will give a greater selection of old articles for people to review; currently, for example, 11 nominations are between 80 and 89 days old, but 24 are between 75 and 89 days old. In the last drive, when it got down to one or two articles left which nobody wanted to review, it effectively removed the points-based incentive to review old articles for the latter half of the drive. --Usernameunique (talk) 16:17, 7 January 2022 (UTC)[reply]

This seems like a reasonable idea to me. Trainsandotherthings (talk) 17:52, 7 January 2022 (UTC)[reply]
I am also a bit worried about us keeping up with checking people's GA reviews, we are falling behind and will continue doing so if people keep reviewing at this rate (it's a good problem to have, but I don't want us to have to check 200 reviews on the last day). Trainsandotherthings (talk) 18:05, 7 January 2022 (UTC)[reply]
Could non-coordinating volunteers help out with checking reviews? Maybe not to say "approved", but at least to flag issues? ~ 🦝 Shushugah (he/him • talk) 18:52, 7 January 2022 (UTC)[reply]
I like that idea! This will even give coordinators time to focus on their own GA reviews. – Kavyansh.Singh (talk) 19:09, 7 January 2022 (UTC)[reply]
Buidhe, a note on this edit: There are still four nominations that are 90+ days old, so (unless I'm missing something) we're waiting for those to be taken before expanding it. Per the above, if all 90+ day-old nominations are taken, then all 75+ day-old nominations (not 80+) will become eligible for the extra half point. Of course, if I am missing something (always somewhere between possible and likely), please feel free to undo my undo. --Usernameunique (talk) 09:58, 8 January 2022 (UTC)[reply]
I'm afraid I'm not understanding. As you state, "In the last drive, when it got down to one or two articles left which nobody wanted to review, it effectively removed the points-based incentive to review old articles for the latter half of the drive." I don't see how 10 vs. 15 days would fix that problem, because if last time's pattern repeats, the list will never be expanded before 90 days. What would help the issue would be expanding the list by opening it up when it gets below five articles or so. (t · c) buidhe 10:05, 8 January 2022 (UTC)[reply]
Not implementing @Buidhe's change now creates an incentive to wait to review 75–90-day-old articles until the 90+ ones have been taken. But perhaps we should not change anything now, and instead just declare the older half of open nominations "old" in the next drive so there is more supply. —Kusma (talk) 10:32, 8 January 2022 (UTC)[reply]
For future drives the best way to do it might be the twenty articles that are the oldest unreviewed. (t · c) buidhe 10:36, 8 January 2022 (UTC)[reply]
At any given time? —Kusma (talk) 13:53, 8 January 2022 (UTC)[reply]
Yes. (t · c) buidhe 14:37, 8 January 2022 (UTC)[reply]
I would be in favor of dropping down to 75 days now, as I know I've already had a few thoughts of, "Well, that article isn't at 90 days yet, so I should wait to pick it up until I can get the extra point." I'm not sure there's a major benefit to instead just waiting until those four remaining older articles are picked up. — GhostRiver 15:38, 8 January 2022 (UTC)[reply]
Agreed, let's drop down to 75 days now. There are only 2 older articles currently eligible. Trainsandotherthings (talk) 16:18, 8 January 2022 (UTC)[reply]
I'd been on the fence, but the above discussion has pushed me over. Let's do as suggested and drop it to 75 now. --Usernameunique (talk) 17:17, 8 January 2022 (UTC)[reply]
@Usernameunique I'd prefer that any rule modifications be communicated somewhat in advance, to make it clearer to people who are reviewing that their 75-day-old nominations (assuming there are still 90+ day ones) don't automatically become worth 1.5 points while someone else was strategically focusing on the older 90+ day nominations that fewer people are reviewing (for whatever reason). Something like January 15th to expand to all nominations 75+ days and older seems like a reasonable balance.
Regarding reviewing, Kavyansh.Singh indicated a willingness to review the reviews, and between us we have 40+ reviews, so we could maybe do a QPQ review-of-a-review or something, if that's desirable. At this point, I haven't gotten any feedback, so something would be better than waiting till the very end if there are issues with my reviews. ~ 🦝 Shushugah (he/him • talk) 23:31, 10 January 2022 (UTC)[reply]

There are currently five nominations that are 75+ days old (in order: 123 days, 89, 85, 80, and 75). Another two are 74 days old. Does anyone have thoughts on preemptively dropping the threshold down to 60 days, which would make about 30 articles eligible for the extra half point? --Usernameunique (talk) 15:02, 13 January 2022 (UTC)[reply]

I'm tentatively supportive of dropping the threshold again. I noticed we had gotten most of the 75+ day old articles yesterday. It might be worth waiting another day or so, but I wouldn't object to dropping the threshold now. Trainsandotherthings (talk) 17:52, 13 January 2022 (UTC)[reply]
I'd like to see the now-125-day-old one taken first, if possible. There are currently five nominations that qualify as "old", and another two will come online in 19 hours. Maybe drop to 60 on Saturday afternoon or evening, if the existing old ones haven't run out by then. BlueMoonset (talk) 04:57, 14 January 2022 (UTC)[reply]
Looks like we've found ourselves a hero for the 125-day-old nomination. That gives us two old nominations (90 & 86 days old), with another two that will reach 75 days shortly. I'm happy to drop the threshold now, or to give it another day. --Usernameunique (talk) 22:01, 14 January 2022 (UTC)[reply]
Now that the oldest nomination has been taken, I would be in favor of dropping it down in the hopes of stimulating some more reviews, now that we appear to have stalled a bit. — GhostRiver 03:45, 15 January 2022 (UTC)[reply]
Absent any objections, I'll drop it to 60 in a few hours. --Usernameunique (talk) 21:07, 15 January 2022 (UTC)[reply]

There are currently only four 60+ day-old nominations remaining; they range from 61 to 63 days old. I'd suggest dropping it to 45 at the end of tomorrow. This would create about 35 "old" nominations, and would probably sustain us through the end of the drive. Anyone else have thoughts? --Usernameunique (talk) 21:11, 24 January 2022 (UTC)[reply]

Now down to three. I'd be in favor of dropping it again to help power through the last week of the drive. I'm insanely impressed at how effective this one has been, and definitely think we should adopt more aggressive dropping procedures in future drives. — GhostRiver 18:39, 25 January 2022 (UTC)[reply]
Agreed on both points. Trainsandotherthings (talk) 18:44, 25 January 2022 (UTC)[reply]
Just dropped it to 45+, which gives 28 old nominations. As GhostRiver says, it's seriously impressive. According to the report's numbers, on 31 December there were 290 nominations that were 30+ days old, and 91 that were 90+ days old. According to the 25 January report, there are only 56 nominations that are 30+ days old, and the oldest is only 64 days old. --Usernameunique (talk) 19:43, 25 January 2022 (UTC)[reply]

This review has hit a bit of a snag, and I've only just started doing GA reviews during this backlog drive so I'm here to ask for advice.

The nominator translated the article from French, but doesn't have access to the sources and has not verified them. (Q1. Can someone tell me if this is normally enough to fail a GA outright? My thinking here is that in this case every reference has an implicit "verification needed" maintenance tag on it, and if those were explicitly in the article, it would quickfail.) I could, hypothetically, attempt to verify every reference myself. But there are a lot of them. The "what the GA criteria are not" article says that "Not checking at least a substantial proportion of sources to make sure that they actually support the statements they're purported to support" is a mistake to avoid. I agree wholeheartedly. But when I see GA reviews that have been done quite quickly, apparently to everyone's satisfaction, I can't help but wonder whether the mistake described in "what the GA criteria are not" is actually the norm. (Also, checking a thoroughly referenced article takes longer than translating the text itself in the first place.) So, Q2: is it? Could someone confirm/deny this for me?

Another editor is on hand to help, but they're the author of most of the French article. This is great from the perspective of having another editor familiar with the topic, who can answer questions about why things in the article are the way they are. They could also check to make sure the article has been translated accurately. But if the purpose of a GA reviewer checking references in the first place is to have a second, uninvolved editor verify them, obviously they can't help with that.

As the article stands, I would also not recommend a pass based on criteria 1a and 3b: I have already observed that there is at least one section that should probably be moved into its own, separate article, and the prose tends to wander. This is primarily because the article is heavily, at times exclusively, based on a book-length biography of Dmitrieff, which of course takes contextual detours and thus gets a bit tricky to summarize into encyclopedia style. (Before anyone asks: I do not think it depends so heavily on this book that it is a reliability problem - I just see this as the more-than-likely reason for its style and format.) It is also partly because supporting articles that could be wikilinked to did not exist when the original French article was written. I don't think this is impossible to fix within the recommended time constraints of a GA review, although I do think the concerns are significant. However, the nominator will struggle to deal with these concerns usefully without access to the original sources. On the other hand, I do have access to the main sources, but if I make those edits, I think that makes me too involved in the article to review it.

Q3. So, in light of all that, what do I do now? Fail it, edit it, renominate it with the other two editors? Is it possible to turn the review over to someone else? If so, should I make the changes I suggested before or after doing that? Some other option...? I'm happy to stand as a late-addition co-nominator/assistant/whatever if someone else can lead the review.

And, finally, Q4. Am I setting my interpretation of the GA criteria inappropriately high? This is a GA on French Wikipedia, and I understand that GA criteria are not meant to be a high bar, so maybe I am. If so, I would nonetheless prefer to cancel my review rather than pass the article myself. -- asilvering (talk) 22:18, 8 January 2022 (UTC)[reply]

asilvering Thanks for your diligence in performing GA reviews! What I might do in this case is pick some references at random and ask for quotes that back them up (Google Translate French -> English is reasonably reliable in my experience, even if you don't speak French). If you have access to the sources but the nominator does not, is it possible to email electronic copies of the material so that they can work with it? I do think that if you are making substantial changes to the article, then you become involved, but then you could ask for a second reviewer to pass the article. (t · c) buidhe 02:06, 9 January 2022 (UTC)[reply]
asilvering I've been watching the backlog drive and thinking of doing some reviews; I can come in as the second reviewer if you want to get involved in editing the article directly. I'm not very experienced with GA reviewing, but I can read French and she seems interesting. ~ L 🌸 (talk) 07:08, 10 January 2022 (UTC)[reply]
@LEvalyn That rarest of heroes: a Reviewer Two you actually want to hear from! I've added you to the GAR page. -- asilvering (talk) 07:39, 10 January 2022 (UTC)[reply]

Advice wanted

Just looking for some general advice - how do people cope with unpleasant nominators? I've just had someone continue to be grumpy and snarky even after their article passed the criteria, and it's leaving me dispirited about this drive. HenryCrun15 (talk) 06:43, 17 January 2022 (UTC)[reply]

I would first make sure that they know they are behaving inappropriately, with something like "Do you still wish that I continue to review this article? You are giving me a hard time; let's try to stay constructive". If it gets unbearable, I would not wait too long before withdrawing from the review, leaving a statement that you were not able to reach a constructive solution with the nominator. A new reviewer who is taking over may adjust to the situation much more easily, and – should they get similar problems – will be in a much stronger position. That may sound harsh, but I think that risking your personal motivation is obviously bad for the project as a whole. I would be very interested to hear how others would handle this, though. --Jens Lallensack (talk) 08:49, 17 January 2022 (UTC)[reply]
What Jens said. Kingsif (talk) 21:37, 17 January 2022 (UTC)[reply]
HenryCrun15, you're talking, I take it, about your review of Erdős–Straus conjecture? If so, you're not alone; I've engaged with that nominator two or three times, every one of which has been a distinctly unpleasant experience. That level of abrasiveness and rudeness would not be tolerated in most corners of the real world, and there is no reason that it should be here. But the solution, as Jens said, is relatively simple—you should feel free to disengage and turn your attention elsewhere, even if that means stopping in the middle of a review. Editing Wikipedia is supposed to be fun, after all, and thankfully, most editors help keep it that way. --Usernameunique (talk) 22:56, 17 January 2022 (UTC)[reply]
Most of the time everyone at GA is lovely, but sometimes I find nominators can't see the wood for the trees, i.e. they've written the whole article and somehow can't accept any well-meant criticism. In those rare cases I would either do what Jens said (and fail the review) or, if we are close to the end, just finish up and move on. That's happened to me maybe twice so far in around 30 reviews. As a footnote, I've also found that the nominator of the mentioned review can act aggressively. Mujinga (talk) 12:04, 19 January 2022 (UTC)[reply]

16th Jan increase?

The progress table suggests that the nom count went up by 9 on 16th Jan, but the overall "change since beginning" count apparently shows a decrease of 9 on the same day. Should this maybe have been 9 in the other direction (232 to 241), as it then skews the following days, or have I read this wrong? Bungle (talkcontribs) 18:29, 18 January 2022 (UTC)[reply]

Pinging @BlueMoonset: who should be able to answer this. Trainsandotherthings (talk) 02:07, 19 January 2022 (UTC)[reply]
Bungle, the change columns reflect changes in the unreviewed nominations, not the total nominations. So although the number of total nominations increased by 6 from 389 to 395 between the 15th and 16th, the number of unreviewed nominations increased by 9, from 230 to 239. Since the drive started with 462 unreviewed nominations, this means we went from a decrease of 232 through the 15th (462 − 230) to a decrease of 223 (462 − 239) through the 16th. If the total decrease had been to 241, the number of unreviewed nominations would also have had to decrease by 9, rather than increase by 9. BlueMoonset (talk) 02:50, 19 January 2022 (UTC)[reply]
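
A small worked example (Python, using only the figures quoted in the reply above) of how the chart's change columns are derived: they track unreviewed nominations, not totals, measured against the drive's starting count of 462.

  DRIVE_START_UNREVIEWED = 462

  def change_columns(unreviewed_by_day):
      # For each day's unreviewed count, return (count, change from the
      # previous day, total decrease since the drive began).
      rows, previous = [], None
      for count in unreviewed_by_day:
          daily = None if previous is None else count - previous
          rows.append((count, daily, DRIVE_START_UNREVIEWED - count))
          previous = count
      return rows

  # 15 January: 230 unreviewed; 16 January: 239 unreviewed.
  print(change_columns([230, 239]))
  # -> [(230, None, 232), (239, 9, 223)]
  # The +9 day shrinks the cumulative decrease from 232 to 223.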
@BlueMoonset: Absolutely, thanks (I did note at the end I may have read it wrongly). Thanks for clarifying :) Bungle (talkcontribs) 06:51, 19 January 2022 (UTC)[reply]

Lowest number of unreviewed nominations since 7 March 2014

We've just attained the lowest number of unreviewed nominations in nearly eight years, since 7 March 2014. As of midnight, we had 191 unreviewed nominations; on 7 March 2014 we had 181, down from 194 the day before. Our most recent low point was 196 unreviewed nominations on 29 April 2020, so we're in even more impressive territory. (After 2014, the next stop is early 2012.)

For total nominations, the 332 we hit at midnight is still short of the 326 we achieved a few months ago, on both 7 and 17 October 2021, but after that we jump all the way back to 3 July 2016, with 325 (and 316 the previous day). Then it's over two years further back to 315 on 29 March 2014, all of which are within reach. It will be interesting to see how far we get in the next week. BlueMoonset (talk) 01:46, 25 January 2022 (UTC)[reply]

Kudos to the coords and all participants for an exceedingly successful backlog drive! Part of the reason it got so low was that we started the drive at a lower point than other backlog drives, and part of it is your hard work! (t · c) buidhe 19:05, 28 January 2022 (UTC)[reply]
Massive congratulations to the coords on this one. I'm sad that my exam season overlaps with half the drive and so I haven't been able to do more reviewing, but this drive has been a great success. 👏 A. C. Santacruz Please ping me! 19:18, 28 January 2022 (UTC)[reply]
We're down even further after today: 316 total noms (lowest since 2 July 2016) and 183 unreviewed noms (lowest since 7 March 2014). If we drop any further, the totals start moving through March 2014, and the unreviewed into January 2012. BlueMoonset (talk) 01:19, 30 January 2022 (UTC)[reply]
And, with one day to go, 310 total noms (lowest since 25 March 2014), and 176 unreviewed noms (lowest since 15 January 2012; the numbers from 23 January to 7 February 2012 are clearly problematic). BlueMoonset (talk) 02:08, 31 January 2022 (UTC)[reply]
Excellent work everyone. Is there a theoretical cutoff for GAN not being backlogged, and if so, what is it? One week? Two weeks? Four weeks? —Kusma (talk) 13:49, 31 January 2022 (UTC)[reply]
When we give out the shiny things for participants, we should absolutely say something about reducing the backlog to its lowest number in an entire decade! — GhostRiver 15:08, 31 January 2022 (UTC)[reply]

Final results

The final results are in as of midnight on 1 February 2022: 299 total nominations, of which 165 were unreviewed:

  • 299 total nominations is the lowest total since 7 March 2014, when the total stood at 298
  • 165 unreviewed nominations is the lowest total since 12 January 2012, when the number stood at 163

This leaves 134 active reviews still to be concluded at that time. If either number dips lower as people finish off their reviews (and maybe also grab new ones because people do that), I'll post about it here. Note that the next two milestones for total nominations are 297 on 28 February 2014, and 292 on 15 February 2012. So if the total decreases by two more, we go back another two years... BlueMoonset (talk) 01:13, 1 February 2022 (UTC)[reply]

Can't believe I forgot the two most important numbers:

  • Total nominations were reduced by 225: 524 to 299
  • Unreviewed nominations were reduced by 297: 462 to 165

BlueMoonset (talk) 02:13, 1 February 2022 (UTC)[reply]

Now at 296 total nominations... --Usernameunique (talk) 16:01, 1 February 2022 (UTC)[reply]
I think we need to continue the midnight timing. I did see 295 during the day, but midnight was 297, which ties 28 February 2014. BlueMoonset (talk) 02:09, 2 February 2022 (UTC)[reply]
Down tonight to 295 total nominations despite an influx of new noms. As noted above, this is the lowest number of nominations since 15 February 2012. Congratulations to all concerned. BlueMoonset (talk) 01:03, 3 February 2022 (UTC)[reply]

Two more days

As a reminder, the drive ends at the end of January—two days and a few hours from now. Please ensure all articles are listed on the drive page by then. Also, it is a good time to grab any more articles you intend on reviewing; any review posted before the drive ends will be counted in the points tally. Pinging all participants: A. C. Santacruz, Alanna the Brave, Amitchell125, ArnabSaha, Artem.G, AryKun, Asilvering, Bilorv, BubbaDaAmogus, Bungle, Casliber, Catlemur, ComplexRational, David Fuchs, Ealdgyth, Eddie891, Elli, Etriusus, Eviolite, ExcellentWheatFarmer, Ezlev, FormalDude, Ganesha811, Gerald Waldo Luis, GhostRiver, Gug01, HenryCrun15, HickoryOughtShirt?4, Hog Farm, Hurricane Noah, I'ma editor2022, JackFromWisconsin, Jens Lallensack, JohnFromPinckney, JPxG, Kyle Peake, Kavyansh.Singh, Kingsif, Kusma, LEvalyn, Mark83, mhawk10, MrLinkinPark333, MSG17, Muboshgu, Mujinga, Olivaw-Daneel, Paparazzzi, PCN02WPS, Pickersgill-Cunliffe, Realmaxxver, REDMAN 2019, Reidgreg, Sammi Brie, Sennalen, Shushugah, simongraham, Skarmory, starsandwhales, Steelkamp, Sturmvogel_66, Tayi Arajakate, The C of E, The Most Comfortable Chair, Trainsandotherthings, Vacant0, Vanamonde93, Vaticidalprophet, Vice regent, Vogon101, WhinyTheYounger, Z1720, Zmbro. --Usernameunique (talk) 19:02, 29 January 2022 (UTC)[reply]

@Usernameunique: Your ping did not work. You can't send more than 50 notifications in one edit; it will simply not do anything. —Kusma (talk) 20:38, 29 January 2022 (UTC)[reply]
I got mine. How is it not working? BubbaDaAmogus (talk) 20:56, 29 January 2022 (UTC)[reply]
You probably received the second (now invisible) ping. —Kusma (talk) 21:02, 29 January 2022 (UTC)[reply]
Thanks for the heads up, Kusma. Hopefully the second attempt (1; 2; 3; 4) worked. BubbaDaAmogus, you might have got the second ping, which I then deleted (see links in previous sentence). --Usernameunique (talk) 21:04, 29 January 2022 (UTC)[reply]
Ok. I don't understand how the whole "pinging" system works yet, so it's gonna be a while before this looks easier to me. BubbaDaAmogus (talk) 21:07, 29 January 2022 (UTC)[reply]

Final steps

The drive ended yesterday, though obviously many of the reviews are still being finished up. GhostRiver, Lee Vilenski, Trainsandotherthings, how do you feel about giving it a week or so to let the dust settle, then going through and tallying points/handing out barnstars? (I should be able to help with that.) As GhostRiver said, we should also mention how effective this drive was; I can draft an update to the previous message which gives some of the great stats from the drive. Anything else we should be doing? --Usernameunique (talk) 15:41, 1 February 2022 (UTC)[reply]

I agree that a week sounds good to give reviewers time to finish up and nominators time to make some changes. We should also start checking some of the completed reviews now, however, just to get a jump on that. — GhostRiver 17:06, 1 February 2022 (UTC)[reply]

Takeaways

While the drive is still fresh in our minds, does anyone have any thoughts about the drive—what worked, what didn't, things to try in the next one, and so on? Obviously, the drive was effective; this is seen in the total numbers, and also in the way that the backlog of old nominations in particular was crushed. A couple of things that I think helped were:

  1. The drive was somewhat close in time to the last one (so the beginning numbers were high, but not huge);
  2. The drive was also still somewhat removed in time from the last one (so editors weren't burned out);
  3. The list of old nominations focused attention on those nominations, and was periodically dropped (e.g., from 90 to 75 days) when only a few remained.

For the next drive, I think we might:

  1. Start in May or June (i.e., with a three- or four-month break) to try to accommodate the above two factors, while keeping the backlog's overall progression trending down.
  2. Focus attention not just on unreviewed nominations, but on stagnated reviews. The list of abandoned reviews does this to a degree, but it's a bit of an afterthought. Also, there are plenty of old reviews and old holds, which perhaps the coordinators could put some effort into nudging along.

--Usernameunique (talk) 15:59, 1 February 2022 (UTC)[reply]

Another takeaway is, of course, that this drive coincided with the beginning of the WikiCup, which means that Cup participants were likely looking for the quick points that can be found in reviewing, and it also means that many people were nominating, so by doing more reviews, we hopefully can keep the backlog down a while longer. This is to say that May is probably a marginally more attractive option than June, as May would coincide with a new Cup round (and also has one more day in the month). — GhostRiver 17:08, 1 February 2022 (UTC)[reply]
We did have several issues with good-faith but very inexperienced reviewers jumping in and unintentionally causing problems. We should consider ways to encourage new reviewers to seek assistance before jumping in head-first. Trainsandotherthings (talk) 17:13, 1 February 2022 (UTC)[reply]
Offer 0.5 points for mentoring a new user? —Kusma (talk) 17:22, 1 February 2022 (UTC)[reply]
That really seems like a good thing to do, but I don't think we should be giving points for it. Does mentoring help our drive's aim of "[reducing] the backlog of Good Article nominations"? – Kavyansh.Singh (talk) 18:08, 1 February 2022 (UTC)[reply]
@Trainsandotherthings I think this would mostly be cleared up if the reviews were checked more quickly, so having more co-ordinators or a pre-approved list of experienced editors who can check off reviews (as suggested below) seems to me the best way to do this? Speaking as a noob who started doing GA reviews for this backlog drive, I pitched in because the idea of having someone come by to check was very reassuring. I felt like I could join in without breaking anything too badly, because someone would come by to tell me I'd made a mess, without me having to draw a lot of attention to myself in order to get someone to check my work. If there was more "encouraging new reviewers to seek assistance before jumping in", I think I wouldn't have joined, because I would have read it as a polite way to tell noobs to gtfo. I like to think I was a net positive in this backlog drive (the people whose articles I reviewed seemed happy with my reviews, at least...), and it certainly made me feel more like "a wikipedia editor" or "a member of the community" rather than "some rando with a wikipedia account", so it was personally valuable too. I wouldn't want a future someone in the same position to feel scared off. -- asilvering (talk) 22:22, 7 February 2022 (UTC)[reply]
This was a highly productive backlog drive; thanks to all who participated and made it possible. The only thing I'll note is that a majority of the reviews in this drive were (or at least are) not checked. It was all good for the first few days (including the innovative idea of mentioning a fact in the edit summary when checking reviews). While not a major issue, a long list of unchecked reviews may cause some problems later. If there is something wrong with someone's review, they should be told that within a day or two, rather than at the end of the drive when the remaining reviews are checked. So, as proposed by Shushugah above, I suggest that in any future drives we either (#1) allow any non-coordinator in good standing to check other reviews, like the QPQ system, or (#2) increase the number of coordinators to (X), giving them time to check reviews as well as manage their own GA reviews/nominations. I personally favor #1. Thoughts? – Kavyansh.Singh (talk) 17:29, 1 February 2022 (UTC)[reply]
Yeah, I tried to keep up at first, but the volume simply got too high (and I've been busier than normal IRL, planning and executing a move two hours away from my current home) so I fell way behind. This is a good problem to have, since it means we had lots of reviews done. I'll be hacking away at the backlog along with the other coords. In the future, we could certainly add more coords. I'm a bit hesitant about allowing non-coords to check other people's reviews, but I'm not entirely opposed to it either. Trainsandotherthings (talk) 17:46, 1 February 2022 (UTC)[reply]
I don't love the idea of allowing non-coords to check either (especially as a QPQ system), because it opens up the way for some of the overzealous new reviewers who have been doing drive-bys to do their own drive-by review checks. Adding more coords would be fine, or alternatively (here comes my education degree) assigning specific tasks to specific coordinators a few days in advance rather than an "anyone can do whatever" approach that might lead to lopsided participation. — GhostRiver 17:57, 1 February 2022 (UTC)[reply]
Yeah, I know that just like drive-by reviews, there may be drive-by checks. I don't propose it as a QPQ system where it would be mandatory to check; it would be a voluntary process. And I suggested "any non-coordinator in good standing" (emphasis mine this time). And @GhostRiver, what are the tasks of drive coordinators besides checking reviews? There must be something else, but I cannot recall. Dividing the tasks also seems like a good idea. – Kavyansh.Singh (talk) 18:06, 1 February 2022 (UTC)[reply]
Coords are also more or less the go-to people when editors have questions related to the drive. We generally help make sure everything goes smoothly. It's also a convenient excuse for me to put off some of my projects I've been meaning to do onwiki ;). Trainsandotherthings (talk) 18:15, 1 February 2022 (UTC)[reply]
Just look at how Asiatic cheetah was instantly nominated by a user who hadn't edited the article; he promised to fix the article [1], then decided to change its name to get away with it, and then the article was also instantly passed as GA by the reviewer. It's a shame; someone should delist it. 90.149.247.199 (talk) 13:33, 2 February 2022 (UTC)[reply]
I don't see any reason to believe that both nominator and reviewer didn't act in good faith. Probably they simply were inexperienced, and that is ok; we can't expect people to be perfect from the start. However, with all those "citation needed" tags in place, I also think that this article needs reassessment. --Jens Lallensack (talk) 14:10, 2 February 2022 (UTC)[reply]

Assistance needed checking reviews

@Lee Vilenski, GhostRiver, and Usernameunique: we still have a lot of reviews that need to be checked so we can finish up and award barnstars. I've been checking some but I've been busy IRL, so I'd appreciate if you three could help out as well. Thanks. Trainsandotherthings (talk) 20:05, 26 February 2022 (UTC)[reply]

I can spend some time checking the next few days. Obviously I can't check any of my own, though, or reviews of my articles. — GhostRiver 22:13, 26 February 2022 (UTC)[reply]
@Lee Vilenski, GhostRiver, Usernameunique, and Trainsandotherthings: Just a reminder, as there are still a bunch of unchecked reviews and there haven't been any review checks since 8th March. AryKun (talk) 11:43, 24 March 2022 (UTC)[reply]
My irl schedule is pretty hectic but I'll try to get some checks done this weekend. Trainsandotherthings (talk) 11:45, 24 March 2022 (UTC)[reply]

Maybe we need a GAN Backlog Drive Backlog Drive ;) More seriously, for the backlog drive after this one, we should either have more coordinators, or have non-coordinators checking reviews, or something else more creative. —Kusma (talk) 08:49, 29 March 2022 (UTC)[reply]

Maybe, for next time, let participants count their points themselves, with coordinators only doing spot checks? Jens Lallensack (talk) 09:25, 29 March 2022 (UTC)[reply]
And/or allow people to flag up reviews for checking by coords (we could have a section under the count where people could just say "looks ok ~~~~" or "looks fine except for maybe Talk:AwesomeArticle/GA1, coords decide plz", and if the section gets three thumbs-ups from experienced reviewers, the coords can just accept that, subject to spot checks if paranoid). It seems clear we need a process that is easier on the coords than the perfectionist "sign off every single review". —Kusma (talk) 09:37, 29 March 2022 (UTC)[reply]
I don't know what the solution is. I am in the middle of a horrible depressive episode caused by my bipolar disorder and everything feels like I'm trying to swim through molasses. I'm sorry. — GhostRiver 14:43, 29 March 2022 (UTC)[reply]
I'm sorry to hear that. Please do whatever you need to do for yourself. Unwatch the page if it helps. @Lee Vilenski, @Usernameunique, @Trainsandotherthings: I think there are about 250 reviews to check. If we get two to five more volunteers (I volunteer if you'll have me) and everyone checks 30-50 reviews, this will be done in under a week and people can finally get their barnstars and medals. —Kusma (talk) 15:06, 29 March 2022 (UTC)[reply]
Yeah, noting that mine have already been checked (I added an old tag back, hope you don't mind), I can help check some others. I am again relegated somewhat to mobile so it would actually be easier for me. Kingsif (talk) 16:12, 29 March 2022 (UTC)[reply]
I will work on this when I get a chance. I'll try to review at least a few today. Real life has been pretty busy and I've had limited energy for Wikipedia, but I will do as much as I can. By my count, I've checked a bit over 100 reviews so far. Doing all 250 remaining would take me a long time. I see Usernameunique has checked one review and Lee Vilenski has checked none - could both of you please pitch in? Trainsandotherthings (talk) 17:14, 29 March 2022 (UTC)[reply]
@Trainsandotherthings, looks like you're the only remaining coordinator. Perhaps you should co-opt a few people so you don't have to do all the work by yourself (so far you have not accepted any of the offers of help). You could ask the wider GA community to help at WT:GAN so a few more people can pitch in. I'm not too worried about getting a barnstar or not, but I am kind of worried about future backlog drives if this one never gets closed. —Kusma (talk) 14:25, 2 April 2022 (UTC)[reply]
@Kusma: Yeah, at this point I'm willing to accept any offers of help; doing this all alone would take me far too long. If anyone still watching this page wants to pitch in, please feel free, just post here first so we know. Trainsandotherthings (talk) 14:29, 2 April 2022 (UTC)[reply]
I can help checking the reviews! – Kavyansh.Singh (talk) 14:46, 2 April 2022 (UTC)[reply]
OK, I made a start. Five down, lots to go. Maybe @Kingsif will join in as well? —Kusma (talk) 14:46, 2 April 2022 (UTC)[reply]
@Trainsandotherthings, Kusma, and Kavyansh.Singh: I just looked at... every unchecked review, and marked a bunch as no good (because somebody has to). Some of the rest I think need a closer look and might be no good, but most seem good to me, in case you were wondering. Kingsif (talk) 03:47, 4 April 2022 (UTC)[reply]
Thanks for checking. If you mark relatively extensive reviews like this [2] as "no good", I must have misunderstood the drive criteria, I fear. I tend to keep my reviews short and simple, but that doesn't mean I didn't address all the criteria. Jens Lallensack (talk) 06:48, 4 April 2022 (UTC)[reply]
That's a 2.5k+ review that should count. —Kusma (talk) 07:23, 4 April 2022 (UTC)[reply]
Why are you going by review page size only? If that was all that was needed, it would have been easy to tally counts months ago. No, 2.5k of template or fluff doesn't count as a review. In this specific case, there are a dozen or so fragmentary remarks on punctuation, wikilinks, and word choice, and then it is failed. It doesn't even pretend to address more criteria than "is well-written". Just remarking that the rest were at least checked, especially before a fail, seems like the bare minimum for anyone looking at the review to accept it was done properly. Kingsif (talk) 12:43, 4 April 2022 (UTC)[reply]
I am not going by page size only, but that is one of the few pieces of guidance we have. The review in question does address the content as well, and was accepted as helpful by the nominator. I see that the guidelines mention "check all criteria", but that is something that wouldn't happen for a failed review outside the backlog drive; I don't think it is a reasonable thing to ask for. —Kusma (talk) 12:56, 4 April 2022 (UTC)[reply]
I think our expectations probably do differ, but besides that, all I can say is that if there is no evidence at the review page that the reviewer has attempted a sufficient review (and quickfailing based on a dozen or so poorly-placed commas and the like is a dubious fail anyway), then it surely can't be accepted as one. Kingsif (talk) 13:06, 4 April 2022 (UTC)[reply]

Speaking with my coord hat on, I'm not sure I'd say the Guadeloupe woodpecker review is extensive enough to get credit. I don't think I'd have quickfailed that article personally, but every reviewer is different. Noting that quickfails are eligible to be counted so long as the review is still thorough enough. It's borderline, but I'd say this review isn't quite there. Trainsandotherthings (talk) 14:50, 4 April 2022 (UTC)[reply]

I am not complaining; I just want to understand what I did wrong here, so as not to repeat the same mistake in the next drive. I still do not understand at all why this review didn't pass. Not extensive enough in which way? Is it too short? It isn't a quickfail, by the way; although I was clear that the article needs substantial work, it was the author who decided to withdraw. Jens Lallensack (talk) 15:09, 4 April 2022 (UTC)[reply]
Oh, I didn't realize it was a withdrawal by the nominator. This is my first time acting as a coord. I was hoping the other coords would help out here, but GhostRiver has IRL circumstances taking up her time, and the other two have ignored all pings and attempts at communication. In the drive rules we have a minimum length requirement listed, and that may need to be revisited for the next drive. To be honest, I may not do this again if the other coords leave me to do all the checking of reviews myself; it's highly unfair to me. With the delegating I've done to try and get this finished, since it's already hilariously late, I don't really have time to check everyone's work. I invite further comment from the editor who checked the review in question as well. Trainsandotherthings (talk) 15:39, 4 April 2022 (UTC)[reply]
Thank you for all the effort you put into this. I'm happy to help out checking remaining reviews as well. It would be best, of course, if we all follow the same standards while checking, at least approximately. Regarding the required minimum length: we have many highly experienced writers in Wikipedia, and for those articles, I sometimes spend two hours or more and still struggle to reach 1k of review (without fluff and templates, which I never add). I thought this minimum length was a good choice. Jens Lallensack (talk) 15:55, 4 April 2022 (UTC)[reply]
I think I explained, but, well, the review is long enough for me. However, even if Jens did read the whole article and mentally checked off all GA criteria, none of that is noted in the review. Reading the review, they point out a bunch of minor style points, then say that the article needs serious work to be well written. With this note, an encouragement to quickfail and no offer to continue reviewing, the nominator says ok, they'll withdraw it. Meeting a set byte length doesn't trump noting you have reviewed all GA criteria. So if Jens wants to know what to do better: even just say "I checked the images, they're fine", and so on for all criteria. It's like a paper trail, but without it, the review page is barely (if at all) different from a drive-by. Kingsif (talk) 22:51, 5 April 2022 (UTC)[reply]
@Kingsif Thanks for your explanation. However, my points are not "minor style points"; most of them address serious content flaws that derive from incorrect translation of a French Featured Article. You mentioned I merely made "fragment remarks on punctuation, wikilinks, word choice", but I didn't make any remarks on punctuation; I only listed wikilinks if they link to the wrong article and thus critically mislead the reader; and I only commented on word choice if the issue was serious enough to completely change the meaning of the sentence. Minor issues I actually fixed myself during the review, at least some of those that were immediately clear to me (there were too many), see [3]. You also say I didn't offer to continue reviewing, which is simply not true. I wrote "Could you try to split longer sentences and simplify, and also check if every information makes sense to you. I would then do another attempt." – How much more encouragement do you want to continue working on an article? What are you looking for instead? You criticise that I stated the article needs substantial work, but this is my assessment and I need to be honest with the nominator. And again, it is not a quickfail (it was in fact the nominator who failed it themselves in the end). Your last point is that I didn't indicate that I checked all GA criteria. I have a few thoughts on this. First, it is not a listed requirement to do this, and I'm not the only drive participant who does not do this out of principle. I personally consider this to be "fluff", one of the things you were criticising about my review, although I don't see any fluff in my review; I always try to avoid it. So requiring something that was never a requirement previously may not be completely fair to participants, in my opinion. Second, I do not see how a copy-paste checklist of GA criteria, or a sentence like "all other criteria are met", or similar approaches, would provide evidence that those criteria were checked in sufficient detail. It would have taken me a second to add such copy-paste sentences or templates, but it is nothing I want to do. Third, I think it is very reasonable in such cases to follow WP:AGF and assume that reviewers know what the GA criteria are even if they do not explicitly mention each criterion. I am interested to hear your thoughts on this. Thanks, Jens Lallensack (talk) 08:10, 6 April 2022 (UTC)[reply]
I didn't say your review was fluff; I was saying that, in general, fluff is how most reviews get over 2.5k, and so byte length is a poor metric for review quality. With your review, the issue is your attempt at being concise. You say that substantial work needs to be done without indicating what that is – that's not a review. If a nominator were to fix every point you listed and then nominate the article again, presumably all that other substantial work would still remain? So it's not useful. And while a short sentence saying you've checked X criteria doesn't explain how you've done that, it does give the nominator (and anyone reading the review in the future) confidence that you have done it, as you've bothered/remembered to mention it. I wouldn't AGF; in some of my own noms, I have indeed asked reviewers what they thought of X or Y, or of specific parts I've been concerned about myself, if it's not mentioned. Article improvement is always the goal here, and even if it seems perfunctory to you, a nod that everything an article really needs to be most useful to a reader is present is, I think, helpful. Of course, what I think on that is beside the point: your or my personal philosophy on assuming a reviewer did do a review doesn't mean anything. Drive-by passes and fails would always stand with that outlook. No, if the GANR guidelines (and those of the backlog drive, if you think it's being singled out – it isn't) want all criteria checked, the review should show, as a minimum, that all criteria have been checked. Though, in checking these backlog drive reviews, I have given leeway for not noting one or two, but skipping most looks lazy even if it isn't. If the organizer(s) had to ask the reviewer of every dubious review whether they had done some reviewing they hadn't noted, they'd be here until Christmas. If they took it on good faith that every review a reviewer said they'd done was good (which I see you did propose above and which was not accepted), there would be no checking at all. Maybe your review was helpful in seeing that article get some improvement, though I maintain that telling a nominator it needs substantial work is all but telling them to withdraw it, and that's what should be the take-home here. Kingsif (talk) 13:22, 6 April 2022 (UTC)[reply]
I was very precise about what the nominator needs to do (and at least they understood it), and again, I was not telling them to withdraw. And a relatively extensive review on which I spent a full hour is not, by definition, a drive-by review. But fine by me, let's move on to important things now. Jens Lallensack (talk) 13:41, 6 April 2022 (UTC)[reply]

"No good" reviews[edit]

@Lee Vilenski, GhostRiver, Usernameunique, Trainsandotherthings, Kusma, and Kavyansh.Singh: Kingsif has marked quite a few reviews as "no good" and non-qualifying, including a lot that seem comprehensive enough to me. Speaking for my own, the ones they've marked are on articles where the nominators are really experienced at writing about the subject and there's honestly nothing to criticize except grammar, formatting, and perhaps some ref errors. I really don't see what criteria they're disqualifying these on. AryKun (talk) 05:51, 13 April 2022 (UTC)[reply]

You can read the above, if you like, but it is quite simple: if the review page does not indicate that most criteria have been checked, it is not a complete review. It cannot be taken on good faith that the reviewer has checked things they haven't mentioned (not even with a perfunctory "checked, fine"), and someone checking reviews cannot be expected to also check the article before and after promotion, or to ask the reviewer about their process, to get a full picture of what isn't noted. Reviews are written out for a reason; if it is not written down, how is anyone to know it was done? I was actually pretty lenient on many short/template reviews that I didn't disqualify. Kingsif (talk) 09:17, 13 April 2022 (UTC)[reply]
Literally every other coord checking reviews seems to have been fine with the reviews you're disqualifying. AryKun (talk) 09:21, 13 April 2022 (UTC)[reply]
As a nominator, I would not be fine with a review like, say, this one you want to get credit for. Maybe you put a lot of effort into it, but nothing indicates that. You asked what criteria these were disqualified on, and I told you; if you want somebody else to look through your four bullet points that just tell people to link X, fine. But, besides skipping all but half of one criterion, you don't even give a reason why you are suggesting the edits, so I don't see how anyone would find them valid reviews. Kingsif (talk) 09:25, 13 April 2022 (UTC)[reply]
On the surface, there is nothing wrong with short reviews if the article is actually well-written, but it's important that all of the criteria are checked, which is why we have things like GA Progress boxes. Even when everything is fine in a section, I tend to write out "Good", and I note things like "No stability concerns" to make sure that I'm checking for all of the above. In this capacity, a quickfail is more likely to pass inspection than a quick pass, because a fail means that a specific criterion has been checked against and found wanting. (Apologies if this is only semi-coherent, I'm going through a medication adjustment at the moment and my thoughts are a little scattered). — GhostRiver 14:08, 13 April 2022 (UTC)[reply]
I unfortunately haven't had much time for Wikipedia lately, but I will at some point double-check Kingsif's checks and vacate any I disagree with, in my capacity as a coord. I can't promise a timeframe on that, though; I have been and remain very busy irl. Unfortunately, Lee Vilenski and Usernameunique have totally abandoned their obligations as coordinators... Trainsandotherthings (talk) 12:56, 13 April 2022 (UTC)[reply]
My apologies, I thought we could help you instead of creating more work for you. Kingsif has now unfortunately managed to kill off the momentum that had developed in early April among people helping to check reviews. —Kusma (talk) 16:34, 13 April 2022 (UTC)[reply]
I think that, overall, the backlog drive should not require too much of reviewers. (It should not be harder to get credit for reviews here than it is over at the WikiCup.) Judging reviews harshly now also does not make much sense, unlike during the backlog drive itself, when people could have learned to write reviews that satisfy the judges. —Kusma (talk) 16:31, 13 April 2022 (UTC)[reply]

Review checking progress

Over the past few days I've been able to knock out a significant number of reviews. By my best estimate, we have fewer than 100 remaining. I'm hoping to get this closed out by the end of the week. Trainsandotherthings (talk) 18:13, 9 May 2022 (UTC)[reply]

@Trainsandotherthings: I checked a few more reviews. I am not sure about Talk:Columbia University tunnels/GA1 in particular. The review is long enough (for the drive, at least), and does appear to have checked the various criteria, but the article was delisted recently due to WP:V issues. – Kavyansh.Singh (talk) 16:31, 10 May 2022 (UTC)[reply]
Wow. I just read through that GAR. We should not count that review if it missed problems serious enough to get the article delisted that were present at the time of the review, which appears to be the case here. Trainsandotherthings (talk) 16:35, 10 May 2022 (UTC)[reply]

If I am not wrong, it appears that only my reviews and Jens's are left; the rest all appear to have been checked and totalled. – Kavyansh.Singh (talk) 18:29, 10 May 2022 (UTC)[reply]

Yours are now done as well, so just Jens is left. There are just so many that this user did that it may take a while, and a lot of them are right on the borderline for length. Cautiously optimistic we can get this finished today or tomorrow. Trainsandotherthings (talk) 18:36, 10 May 2022 (UTC)[reply]

Statistics

Total number of participants: 73

{| class="wikitable"
|+ Statistics for GAN January drive
! Participant !! # reviews !! # "old" reviews !! "Extra" points for old reviews !! TOTAL points
|-
| A. C. Santacruz || 4 || 1 || 0.5 || 4.5
|-
| Alanna the Brave || 3 || 0 || 0 || 3
|-
| Amitchell125 || 16 || 2 || 1 || 17
|-
| ArnabSaha || 2 || 0 || 0 || 2
|-
| Artem.G || 8 || 6 || 3 || 11
|-
| AryKun || 20 || 8 || 4 || 24
|-
| asilvering || 7 || 2 || 1 || 8
|-
| Bilorv || 1 || 0 || 0 || 1
|-
| BubbaDaAmogus || 0 || 0 || 0 || 0
|-
| Bungle || 2 || 1 || 0.5 || 2.5
|-
| Casliber || 2 || 0 || 0 || 2
|-
| Catlemur || 1 || 0 || 0 || 1
|-
| ComplexRational || 1 || 0 || 0 || 1
|-
| David Fuchs || 5 || 1 || 0.5 || 5.5
|-
| Ealdgyth || 11 || 0 || 0 || 11
|-
| Eddie891 || 2 || 0 || 0 || 2
|-
| Elli || 0 || 0 || 0 || 0
|-
| Etriusus || 3 || 0 || 0 || 3
|-
| Eviolite || 19 || 5 || 2.5 || 21.5
|-
| ExcellentWheatFarmer || 4 || 1 || 0.5 || 4.5
|-
| ezlev || 5 || 3 || 1.5 || 6.5
|-
| FormalDude || 1 || 0 || 0 || 1
|-
| Ganesha811 || 5 || 3 || 1.5 || 6.5
|-
| Gerald Waldo Luis || 1 || 0 || 0 || 1
|-
| GhostRiver || 48 || 5 || 2.5 || 50.5
|-
| Gug01 || 0 || 0 || 0 || 0
|-
| HenryCrun15 || 3 || 0 || 0 || 3
|-
| HickoryOughtShirt?4 || 2 || 0 || 0 || 2
|-
| Hog Farm || 11 || 4 || 2 || 13
|-
| Hurricane Noah || 0 || 0 || 0 || 0
|-
| I'ma editor2022 || 0 || 0 || 0 || 0
|-
| JackFromWisconsin || 1 || 0 || 0 || 1
|-
| Jens || 46 || 13 || 6.5 || 52.5
|-
| JohnFromPinckney || 1 || 1 || 0.5 || 1.5
|-
| JPxG || 0 || 0 || 0 || 0
|-
| K. Peake || 15 || 0 || 0 || 15
|-
| Kavyansh.Singh || 25 || 7 || 3.5 || 28.5
|-
| Kingsif || 5 || 4 || 2 || 7
|-
| Kusma || 10 || 2 || 1 || 11
|-
| LEvalyn || 3 || 2 || 1 || 4
|-
| Mark83 || 7 || 3 || 1.5 || 8.5
|-
| Mhawk10 || 0 || 0 || 0 || 0
|-
| MrLinkinPark333 || 3 || 2 || 1 || 4
|-
| MSG17 || 1 || 1 || 0.5 || 1.5
|-
| Muboshgu || 2 || 2 || 1 || 3
|-
| Mujinga || 23 || 5 || 2.5 || 25.5
|-
| Olivaw-Daneel || 6 || 2 || 1 || 7
|-
| Paparazzzi || 2 || 1 || 0.5 || 2.5
|-
| PCN02WPS || 8 || 5 || 2.5 || 10.5
|-
| Pickersgill-Cunliffe || 6 || 0 || 0 || 6
|-
| Realmaxxver || 3 || 1 || 0.5 || 3.5
|-
| REDMAN 2019 || 0 || 0 || 0 || 0
|-
| Reidgreg || 1 || 0 || 0 || 1
|-
| Sammi Brie || 7 || 6 || 3 || 10
|-
| Sennalen || 1 || 1 || 0.5 || 1.5
|-
| simongraham || 15 || 0 || 0 || 15
|-
| Skarmory || 1 || 0 || 0 || 1
|-
| starsandwhales || 0 || 0 || 0 || 0
|-
| Steelkamp || 3 || 0 || 0 || 3
|-
| Sturmvogel_66 || 5 || 0 || 0 || 5
|-
| Tayi Arajakate || 2 || 2 || 1 || 3
|-
| The C of E || 1 || 1 || 0.5 || 1.5
|-
| The Most Comfortable Chair || 0 || 0 || 0 || 0
|-
| Trainsandotherthings || 3 || 2 || 1 || 4
|-
| Usernameunique || 11 || 7 || 3.5 || 14.5
|-
| Vacant0 || 14 || 7 || 3.5 || 17.5
|-
| Vanamonde || 0 || 0 || 0 || 0
|-
| Vaticidalprophet || 8 || 0 || 0 || 8
|-
| Vice regent || 3 || 0 || 0 || 3
|-
| Vogon101 || 3 || 1 || 0.5 || 3.5
|-
| WhinyTheYounger || 2 || 2 || 1 || 3
|-
| Z1720 || 11 || 10 || 5 || 16
|-
| zmbro || 1 || 0 || 0 || 1
|}
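For anyone reading the table later, the scoring appears to work out as follows (this is inferred from the figures above, not an official restatement of the drive rules): each review is worth one point, and each "old" review earns an extra half point on top of that, i.e.

<math display="block">\text{TOTAL points} = \text{reviews} + 0.5 \times \text{old reviews}</math>

For example, 20 reviews, 8 of them "old", give 20 + 0.5 × 8 = 24 points, matching the AryKun row.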

@Lee Vilenski, GhostRiver, Usernameunique, Trainsandotherthings, Kusma, and Kingsif: — Just a few reviews left to check, then we'll be able to complete this table and, finally, wrap everything up! – Kavyansh.Singh (talk) 06:53, 11 May 2022 (UTC)[reply]

WE ARE DONE

312 checks by yours truly later, we are FINALLY FINISHED checking all the reviews. I will award barnstars over the next 24–48 hours and announce the winner. Trainsandotherthings (talk) 16:53, 13 May 2022 (UTC)[reply]

For procedural reasons (it is improper for me to award myself a barnstar), @GhostRiver: could you award me my barnstar (the invisible barnstar) for this drive? And any others you're up to awarding as well. If not, I will eventually get to them all besides my own. Trainsandotherthings (talk) 16:58, 13 May 2022 (UTC)[reply]
Stats above have been updated. Thanks a lot to everyone who helped make this drive possible. – Kavyansh.Singh (talk) 17:14, 13 May 2022 (UTC)[reply]
Well done to all involved. I must have missed the bit where it says barnstars started from 3 points rather than the older minimum of 2, and it seems I only just miss out :( Glad to have contributed nonetheless. Bungle (talkcontribs) 17:08, 13 May 2022 (UTC)[reply]

Barnstars have all been awarded, including one to myself, since I didn't get a reply to my ping. Congratulations to Jens for winning, and thank you to everyone who reviewed! A special thanks to those who helped check reviews. Trainsandotherthings (talk) 04:07, 15 May 2022 (UTC)[reply]

@Trainsandotherthings: And special thanks to you for, if I may say, single-handedly managing most of the drive. And of course, thanks to all those who helped check the reviews! Un-watching. – Kavyansh.Singh (talk)