
Wikipedia talk:WikiProject New page/Archive

From Wikipedia, the free encyclopedia

Name of the project

I think the name of the project should be Wikipedia:WikiProject New pages. Otolemur crassicaudatus (talk) 15:36, 30 December 2007 (UTC)

I have changed the project name to Wikipedia:WikiProject New page studies. Otolemur crassicaudatus (talk) 16:26, 30 December 2007 (UTC)

I have again changed the project name into Wikipedia:WikiProject New page. Otolemur crassicaudatus (talk) 16:31, 30 December 2007 (UTC)

Good luck with this, it seems a worthwhile thing to do. Nick mallory (talk) 06:43, 31 December 2007 (UTC)

Facts on newly created pages

The section "Facts on newly created pages" needs to be expanded. Otolemur crassicaudatus (talk) 08:05, 31 December 2007 (UTC)

Userbox

I should create a userbox for this project shortly. :) Rt. 22:12, 31 December 2007 (UTC)

It's not much, but you can view it by adding {{User WikiProject New page}} or just clicking the link. :) Rt. 20:40, 1 January 2008 (UTC)

I have changed the display text from "new page wikiproject" to "WikiProject New page". This gives a clearer understanding. Otolemur crassicaudatus (talk) 14:43, 2 January 2008 (UTC)

End-of-shift report

On a voluntary basis, a newpage patroller could fill out a form that looks like this:

Newpage patrol end-of-shift report

| Date | Begin | End | Total pages patrolled | Bot-created | CSD-tagged | SOBMTI | CSD-deleted | CSD-denied | CSD-removed | CSD-pending | PROD / AFD | Redirects | Cleanup / tagged |
|------|-------|-----|-----------------------|-------------|------------|--------|-------------|------------|-------------|-------------|------------|-----------|------------------|
| Date | 12:30 | 13:30 | 30 | 30 | 30 | 30 | 30 | 30 | 30 | 30 | 30 | 30 | 30 |

Total pages patrolled - Actually, the total number of pages that were created during your shift, regardless of whether or not you actually saw them. Approximated by the number of entries in Special:Newpages from the beginning to the end of your shift.

Bot-created - Number of stubs created by bots, mass-creators, etc., during your shift. Easily determined from the bot's list of contributions.


The following can be determined from your list of personal contributions. It is a good idea to wait a while before filling out that section.

CSD-tagged - Number of pages you tagged with CSD during your shift, as determined by the number of user warnings you produced.

SOBMTI - "SomeOne Beat Me To It" - Number of pages you wanted to tag for speedy, but someone else did before you could (no practical way to follow-up).

CSD-deleted - Number of pages you tagged that no longer existed at the end of your shift, as determined by the number of user warnings not associated with an existing page.

CSD-denied - Number of CSD tags you inserted that were removed by admins and/or editors deemed reliable (including yourself), or removed following edits that made the article satisfactory.

CSD-removed - Number of CSD tags you inserted that were removed by article authors, requiring reinsertion.

CSD-pending - Number of CSD tags you inserted that were still in place at the end of your shift.

PROD/AFD - Number of pages you nominated for PROD or AFD during your shift.

Redirects - Number of pages you redirected during your shift, including page moves you made because of inappropriate titles.

Cleanup/tagged - Number of pages you tagged but not for deletion, or cleaned up yourself.

--Blanchardb-MeMyEarsMyMouth-timed 05:04, 1 January 2008 (UTC)

This chart is an excellent idea, Blanchardb. I support this. However, I think the "Begin" and "End" columns are not really necessary. Otolemur crassicaudatus (talk) 08:59, 1 January 2008 (UTC)

Another chart could be created showing which CSD criteria are used. Otolemur crassicaudatus (talk) 09:01, 1 January 2008 (UTC)

My first real one

Newpage patrol end-of-shift report

| Date | Begin | End | Total pages patrolled | Bot-created | CSD-tagged | SOBMTI | CSD-deleted | CSD-denied | CSD-removed | CSD-pending | PROD / AFD | Redirects | Cleanup / tagged |
|------|-------|-----|-----------------------|-------------|------------|--------|-------------|------------|-------------|-------------|------------|-----------|------------------|
| January 1st | 14:13 UTC | 14:56 UTC | 106 | 78+ | 9 | 0 | 8 | 1 | 1 | 0 | 1 | 4 | 1 |
| January 2nd | 2:55 UTC | 3:45 UTC | 73 | 11++ | 8 | 9 | 6 | 2 | 1 | 0 | 1 | 3 | 5 |
| January 2nd | 12:45 UTC | 14:00 UTC | 98 | 30 | 8 | 2 | 7 | 1 | 0 | 0 | 0 | 1 | 6 |
| January 3rd | 23:43 UTC | 00:49 UTC | 150 | 114 | 21 | 5 | 20 | 1 | 2 | 0 | 3 | 2 | 5 |
| January 6th | 12:45 UTC | 14:22 UTC | 99 | 15 | 13 | 3 | 11 | 1 | 1 | 1 | 1 | 1 | 5 |
| January 20th | 13:23 UTC | 14:22 UTC | 85 | 33 | 7 | 2 | 7 | 1 | 0 | 0 | 0 | 1 | 10 |

+55 by Olivier (talk · contribs), 7 by Dpaajones (talk · contribs), 7 by Blofeld of SPECTRE (talk · contribs), 9 by Rtphokie (talk · contribs)

++All by Rtphokie (talk · contribs)

--Blanchardb-MeMyEarsMyMouth-timed 15:15, 1 January 2008 (UTC)
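The ratios implicit in a report like the one above can be derived mechanically. A minimal sketch in Python, using the January 1st row from the table above; the field names and the `shift_summary` helper are illustrative, not an established project tool:

```python
# Sketch: derive summary ratios from one end-of-shift report row.
# Field names follow the table above; this is an illustration only.

def shift_summary(report):
    """Return the bot-created share of new pages and the CSD hit rate
    (tagged pages actually deleted by the end of the shift)."""
    bot_share = report["bot_created"] / report["total_patrolled"]
    hit_rate = (report["csd_deleted"] / report["csd_tagged"]
                if report["csd_tagged"] else 0.0)
    return bot_share, hit_rate

# The January 1st shift from the table above:
jan1 = {"total_patrolled": 106, "bot_created": 78,
        "csd_tagged": 9, "csd_deleted": 8}
share, rate = shift_summary(jan1)
print(f"bot-created: {share:.0%}, CSD hit rate: {rate:.0%}")
# bot-created: 74%, CSD hit rate: 89%
```

Numbers like these would let shifts be compared despite very different volumes of bot-created stubs.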

My End-of-shift report

Newpage patrol end-of-shift report

| Date | CSD-tagged | SOBMTI | CSD-deleted | CSD-denied | CSD-removed | PROD | AFD | Cleanup / tagged |
|------|------------|--------|-------------|------------|-------------|------|-----|------------------|
| January 2nd | 5 | 2 | 4 | 1 | 0 | 0 | 0 | 3 |

Otolemur crassicaudatus (talk) 14:34, 2 January 2008 (UTC)

New db- template

I have created a new speedy deletion template to be used on attempted copy-and-paste page moves (speedy G6 housekeeping). The template is {{db-copypaste}}. --Blanchardb-MeMyEarsMyMouth-timed 15:29, 2 February 2008 (UTC)

ClueBot V

Yesterday, a bot that would automatically analyze new pages and look for speedy deletion candidates was approved for a three-day trial period, but it was deactivated about three hours later after complaints that an inordinate number of false positives violated WP:BITE. An admin even threatened to block such a bot. The bot's creator, himself an admin, is looking for ways to improve his bot for a second trial, and I would like your input. Please comment at Wikipedia:Bots/Requests for approval/ClueBot V‎. --Blanchardb-MeMyEarsMyMouth-timed 21:46, 20 February 2008 (UTC)


Stopping Leaking of New Page

One of the things I have noticed when doing quick research on notability and sources for new pages I check is that, within minutes of being added, they are already indexed by Google and friends. Does anyone know if MediaWiki supports some kind of waiting period to block robot crawlers (outside crawlers like Google's) from seeing pages before editors and admins have had a chance to triage the obvious problems? --Marcinjeske (talk) 03:05, 15 April 2008 (UTC)

This is beyond our jurisdiction, actually. You could state your concerns at MediaWiki, which makes the software that Wikipedia uses. --Blanchardb-MeMyEarsMyMouth-timed 11:09, 15 April 2008 (UTC)


Well, I was mainly wondering if MediaWiki already had this functionality - checking some other resources - it does not. The best it does is use a static robots.txt to ward search bots off from User space, talk pages, and other special purpose pages. But because 1) it depends on cooperation from the search bots and 2) it happens at the URL level, before any info is known about the page contents, age, or status, there is no way to leverage that functionality to do the above.
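The URL-level, voluntary nature of robots.txt can be demonstrated with Python's standard-library parser; the rules below are illustrative, in the spirit of MediaWiki's static robots.txt, not a copy of Wikipedia's actual file:

```python
# Demonstrates why robots.txt cannot quarantine new articles: rules match
# URL prefixes only, and compliance is voluntary on the crawler's part.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Illustrative rules resembling MediaWiki's static robots.txt:
rp.parse("""
User-agent: *
Disallow: /wiki/User:
Disallow: /wiki/Special:
""".splitlines())

# User and Special pages can be warded off by URL prefix...
print(rp.can_fetch("*", "https://example.org/wiki/Special:NewPages"))
# ...but a brand-new article lives at the same kind of URL as a vetted one,
# so no prefix rule can distinguish it:
print(rp.can_fetch("*", "https://example.org/wiki/Brand_new_spam_page"))
```

Blocking by age or status would therefore need server-side support rather than a robots.txt rule, which is what the proposal below is about.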

But mainly I posted here because this is the logical place to get a sense of whether editors thought this should be done. No point in convincing developers to do something if it is never going to get used. So, what would people think of:

The ability to block search bots (generally, but in particular those from the big engines like Google) from visiting:

  • articles newly created - to provide time for wiki editors to vet and improve a page before it is exposed to the world
    • a major idea being that articles filled with self-promotion or spam which get speedily deleted would never show up in Google results, reducing the incentive for that sort of vandalism
    • a big question is how long an article should be quarantined. A few hours at least? At most a few days? We wouldn't want it to end up like US copyright, where a definite timeline gets extended into an indefinite one.
    • what if, instead of time, it was edit count, or more specifically the number of unique editors? We would pick some threshold, say 3, as the minimal consensus needed for something to be accessible to search engines. That would guarantee that for the most blatant CSD stuff (user 1 creates, user 2 tags, admin 1 deletes), no casual users get misled by Wikipedia's high PageRank into reading an article touting services or random thoughts.
  • articles tagged for copyvio or speedy deletion - if there is such high doubt that the article should exist, why let the search engines see it
    • the danger here is that malicious editors might use this feature to keep legitimate pages out... but then they would be quickly corrected for inappropriately tagging pages - by definition, pages like this are either going to disappear very soon, or the tags will be removed and the search engine will be let in.
  • articles recently edited - again, while vandalism and other inappropriate content is usually quickly removed (thanks to vigilant editors and our own awesome WikiBots), we run the risk that in the meantime, that content gets cached in the external world.
    • again, the question is how long... I would say on the order of a few minutes to an hour after the last edit. The best determination would be made by looking at logs of articles and seeing how long vandalism takes to revert on average.
    • this would remove at least a bit of the incentive to vandalize/spam Wikipedia
  • detecting requests coming from bots in an active way would also allow MediaWiki to enforce the currently voluntary constraints of robots.txt

There are possible negative effects:

  • detecting what is a robot is not an exact science... there could be false positives where legitimate readers may be blocked from reading articles...
    • but this can be addressed by providing a user-easy but machine-difficult way to view the article in these cases.
    • if the technical ability exists to block one or more types of users based on the content or state of the article, that ability could be misused in the future
    • there is the potential for page creators to be confused about why their new page is not showing up in search engines, but there really should not be a user expectation that listing is instant (even though in practice Google has the site within minutes of creation)
  • the MediaWiki software would have increased work in identifying search bot requests and examining the page for tests of freshness and "minimal consensus"
    • as long as the tests are kept simple, they can be introduced naturally in the course of serving the page, and only under the condition that the requesting agent is a search bot.
      • given the minimal-editor test, you do not need to exhaustively search the history... any article with more than a few revisions is likely to have multiple editors; once three unique editors are counted, the test succeeds
      • since copyvio and speedy deletion tags should appear at the beginning of an article, only the first few lines of the article need to be examined
    • The processing saved by not serving a portion of new/revised pages to search bots may offset the computational cost of doing the checks

So, how good/bad does that sound as a technological measure to augment and reflect consensus? --Marcinjeske (talk) 15:58, 19 April 2008 (UTC)
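The "minimal consensus" gate proposed above fits in a few lines. A hypothetical sketch, assuming the revision history is available as an ordered list of author names; none of this is existing MediaWiki behaviour:

```python
# Sketch of the proposed "minimal consensus" gate: only expose a page to
# search bots once some number of distinct editors have touched it.
# Illustration of the proposal above, not existing MediaWiki code.

def visible_to_search_bots(revision_authors, threshold=3):
    """True once `threshold` unique editors appear in the history.

    As noted above, there is no need to scan the whole history: stop
    as soon as the threshold is reached.
    """
    seen = set()
    for author in revision_authors:
        seen.add(author)
        if len(seen) >= threshold:
            return True
    return False

# The blatant-CSD case: creator writes, patroller tags -> still hidden,
# so the page never reaches a search index before an admin deletes it.
print(visible_to_search_bots(["Spammer1", "Patroller1"]))
print(visible_to_search_bots(["A", "B", "A", "C", "B"]))
```

The early exit is the point of the "no exhaustive search" remark above: the cost per bot request stays bounded regardless of how long the history is.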

This has bothered me in the past as well when doing NPP. MediaWiki and Wikipedia now have {{NOINDEX}}, so all transclusions of mw:Help:New pages and mw:Help:Recent changes could (should?) be tagged to prevent Google slurping them up. This might be an issue for the Village Pump, as it would mean altering quite a few user pages. John Vandenberg (chat) 10:20, 20 December 2008 (UTC)

Nonconstructive comments re new pages

Having created a stub article for Daniel J. O'Hern, in advance of further expansion of the article, I received this comment on my talk page stating that "It is quite annoying to see a new page completely depending on only one reference. Try use some more references for this article. Review WP:RS and WP:V before creating any other article." from User:Otolemur crassicaudatus within seconds of the article's creation. While I couldn't care less as to what this one editor finds annoying, I sincerely hope that attacks on editors of new articles are not the goal and objective of this WikiProject. When I see new articles that I have an issue with, I try my best to edit and expand the articles. While I appreciate the fact that people enjoy criticizing rather than doing, the simple statement that "this article would benefit from additional expansion and sources" -- without the personal annoyance commentary or references to policies already fulfilled by the article -- would be far more productive in the future. Alansohn (talk) 17:28, 4 June 2008 (UTC)

Proposal to limit the creation of new articles

I've written down some thoughts about a proposal to limit the creation of new articles, while allowing anonymous users to create articles (which is not the case now). Your thoughts and comments will be highly valued, see User:Plrk/On the creation of articles. Plrk (talk) 21:28, 14 September 2008 (UTC)

template for directing people to their user page

I see lots of pages that are titled the same as their username (or very similar). Is there a template already that I can put on their user talk page? --Clubmarx (talk) 22:13, 18 October 2008 (UTC)

There used to be {{userfy}} to be placed on the articles themselves, but because it was rarely used it is now a redirect to {{notability}}. However, you can move the page to user space yourself and notify the creator with {{userfied}} on his talk page. Don't forget to put a {{db-rediruser}} tag at the old location. -- Blanchardb -MeMyEarsMyMouth- timed 22:41, 18 October 2008 (UTC)

New CSD criterion

Patrollers should be aware of a new CSD criterion that will certainly cut down the number of AfD's we get.

Please familiarize yourselves with the wording of this criterion, called A9 non-notable album, because someone unfamiliar with it might misapply it.

  1. An article about a musical recording which does not indicate why its subject is important or significant and where the artist's article has never existed or has been deleted. This is distinct from questions of verifiability and reliability of sources, and is a lower standard than notability; to avoid speedy deletion an article does not have to prove that its subject is important/significant, just give a reasonable indication of why it might be important/significant. A9 does not apply to other forms of creative media, products, or any other types of articles.

This means if an album or a song was made by an artist who does have an article, you must either wait for the article on the artist to be deleted, or run an AfD on the album independently of the artist. -- Blanchardb -MeMyEarsMyMouth- timed 16:45, 27 October 2008 (UTC)

snapshots

In order to conduct useful research, it is necessary to have well-defined inputs, and to resample using the same inputs. The interface of Special:Newpages is very dynamic, which is helpful for people whacking vandals and helping newcomers, but isn't terribly useful for reviewing it. We need snapshots, and we need more than one set of eyes to view the same snapshot and make deductions, so that reviewers can compare notes.

To this end, I have documented the transclusion interface of mw:Help:New_pages, which does allow for snapshots to be set up for review. As an example, I have set up a transclusion example of "last seven days of unpatrolled pages" at User:Jayvdb/NPP7; that example is a bit too dynamic, and the parameters may not be sufficient to list every unpatrolled page, but it shows how transclusion can be used. We can ask for a "days" parameter if that would be useful.

Also, since the mw:Recentchanges table removes entries older than mw:Manual:$wgRCMaxAge, we can reconstruct a list of pages created on a given day using bots, which can also try to determine what happened to each new page (was it prodded, speedied, sent to AfD, improved and/or approved?). John Vandenberg (chat) 10:15, 20 December 2008 (UTC)
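The bot-side classification described above could start from simple tag detection on the current wikitext. A rough sketch; the tag names ({{db-...}}, {{prod}}, the AfD notice) are the common ones, but a real bot would also need to follow template redirects and consult the deletion log:

```python
# Sketch of classifying what happened to a new page from its wikitext.
# Illustration of the idea above, not an existing bot.
import re

def classify_new_page(wikitext):
    """Guess a new page's status from its current wikitext.

    `None` stands for a page that no longer exists (deleted)."""
    if wikitext is None:
        return "deleted"
    if re.search(r"\{\{\s*db-", wikitext, re.I):
        return "speedy-tagged"
    if re.search(r"\{\{\s*(prod|proposed deletion)", wikitext, re.I):
        return "prodded"
    if re.search(r"\{\{\s*article for deletion", wikitext, re.I):
        return "at-afd"
    return "surviving"

print(classify_new_page("{{db-spam}} Buy now!"))  # speedy-tagged
print(classify_new_page(None))                    # deleted
print(classify_new_page("A plain stub."))         # surviving
```

Run daily over the reconstructed list of created pages, this would yield exactly the per-shift counts the end-of-shift reports above collect by hand.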

New template

Hi, there's a discussion you might want to be aware of, at Wikipedia talk:Article wizard 2.0/maintenance#New template. cheers, Rd232 talk 11:09, 21 September 2009 (UTC)

I am (opinion) a fairly longtime wikipedian

but have not done too much "behind the curtain" stuff, and recently ran into a couple of articles that were labeled (I paraphrase) as needing to be reviewed. Because they fell within my strike zone (American baseball references) I decided to do that, but am not sure how or where to mark them as Reviewed, and did not want to remove the "needs to be reviewed" tag before following the proper process. Anyone want to help me out? Einar aka Carptrash (talk) 21:41, 21 April 2010 (UTC)

inactive

To people participating in this project,
I am going to hold a re-verification of the members to identify the active ones. To declare that you are active, please write "# {{user|<you>}}" in the Active section. ~~EBE123~~ talkContribs 18:58, 4 June 2011 (UTC)

Active

  1. Ebe123 (talk · contribs)
  2. SilentBobxy2 (talk · contribs)
  3. Ankit Maity (talk · contribs)

Inactive

This project has been basically inactive for a very, very long time. Most discussions, especially those concerning issues with New Page Patrolling and its development, take place at WP:NPP and its associated pages, where your ideas and suggestions may find more resonance. --Kudpung กุดผึ้ง (talk) 16:51, 5 June 2011 (UTC)

Full list of pages:

--Kudpung กุดผึ้ง (talk) 17:28, 5 June 2011 (UTC)

This morning I patrolled around 100 articles, with the backlog starting from 11 May. Please help bring it current. There are so many articles being added to Wikipedia every day! Is this project going to reach 5 million articles by the end of the year? Divide et Impera (talk) 20:15, 10 June 2011 (UTC)

Restructuring

I would like to restructure the whole project. Here are the primary things:

  1. Renaming to Wikipedia:WikiProject New Page Patrol;
  2. Bring everything up to date;
  3. Members, remove inactive ones.


I would like consensus before making the changes. ~~EBE123~~ talkContribs 22:25, 10 June 2011 (UTC)

With all due respect, I feel that instead of attempting to revive a completely dead project that may shortly be proposed for deletion, you may prefer to enlist more support for the main project at WP:NPP, which has superseded this one. New page patrolling is in a sorry state, and patrollers urgently need to be encouraged to do the job of patrolling more accurately. Any development of NPP as a system is already being carried out and has been the subject of intensive research since October last year. We have recently recrafted all the pages to be more informative, and have bots running that gather statistics. As a result of this research, major new board-wide policy regarding the creation of new pages was established two weeks ago through discussion by almost 600 editors, and is soon to be implemented. The effect will be to reduce the number of pages to be patrolled by up to 80%, possibly making New Page Patrol a redundant system. --Kudpung กุดผึ้ง (talk) 00:58, 11 June 2011 (UTC)