User:Rich Farmbrough/Image Filter/The hard questions

Who decides what categories are offered to users?

  • Some categories that might be requested might themselves be considered offensive to offer - for example, and in particular, a category for images of women. (This is the liberal dilemma: the liberal respects the right of others to believe what they want, unless it becomes illiberal, thus betraying the fundamental illiberality of the liberal.)
  • The more categories there are, the more work is needed - even tagging as a short-cut can be a huge job - and if the number of categories runs to thousands, the proposed solutions start to run into technical issues.
  • Any system without a severity rating is fairly limited.
  • The system would also need to cope with moving images, sound files and film clips.
  • The system should distinguish between line drawings, monochrome pictures, photos of artwork, and real and faked images.
  • People will want multiple sets of filters: NSFW, home, with-the-kids and so on (a sketch of what such preferences might look like follows this list).
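
To make the combinatorial problem concrete, below is a minimal sketch of what per-reader filter preferences might look like, assuming an invented set of category names, a numeric severity scale and a per-media-type switch - none of these details come from the actual proposal. The point is that every additional category, severity level and named profile multiplies both the tagging work and the interface complexity.

```python
# Minimal, purely illustrative sketch of per-reader filter preferences.
# Category names, the severity scale and the media-type list are assumptions,
# not part of any Wikimedia proposal.
from dataclasses import dataclass, field

@dataclass
class FilterProfile:
    name: str
    # category -> highest severity the reader is willing to see (0 = hide all)
    thresholds: dict = field(default_factory=dict)
    media_types: tuple = ("image", "video", "audio")

@dataclass
class MediaTags:
    # category -> severity assigned when the file was accessioned
    severities: dict
    media_type: str = "image"

def is_hidden(tags: MediaTags, profile: FilterProfile) -> bool:
    """Hide the file if any tagged category exceeds the profile's threshold."""
    if tags.media_type not in profile.media_types:
        return False
    return any(sev > profile.thresholds.get(cat, float("inf"))
               for cat, sev in tags.severities.items())

# One reader may keep several profiles and switch between them.
profiles = [
    FilterProfile("work",          {"nudity": 0, "violence": 1}),
    FilterProfile("with-the-kids", {"nudity": 0, "violence": 0, "medical": 1}),
    FilterProfile("home",          {"violence": 2}),
]
artwork_nude = MediaTags({"nudity": 1})  # e.g. a photograph of a classical painting
print([(p.name, is_hidden(artwork_nude, p)) for p in profiles])
# [('work', True), ('with-the-kids', True), ('home', False)]
```

Even this toy version already needs answers to every question above: who chose "nudity" and "medical" as categories, who assigned a severity of 1 to the painting, and whether a line drawing should score the same as a photograph.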

Who decides which images are in which categories?

Categorisers on Commons are already backlogged. License patrol on en: is having to be done by bot. We simply do not have the resources to implement this, nor is there a massive pool of Wikimedians waiting to join and do these types of jobs. We are at the top end of the ogee.

How is this kept culturally neutral?

While the proposal supported cultural neutrality in the naming of classes of images (i.e. "nudes" not "smut"), the received wisdom is that different communities have different norms - in fact this is the basis for the US "community standards" definition of obscenity. Reflecting this on the Wikimedia projects is like our early attempts to cover small languages - at some point you have, pragmatically, to draw the line. Unlike the language case, however, we do not have figures for the number of people who don't wish to see skin lesions, compared with the number who don't want to see spiders. There is, I think, some kind of mental map that says "Nudes, sex, violence, images of Muhammed" and we're done. Nothing could be further from the truth. Virtually every religion has a spectrum of prohibited images, depending on sect, era, locality and use - these may require sophisticated knowledge to assess. Many people find it offensive to see the ubiquitous "FCUK" logo everywhere. Images of drug use, sick people and medical situations raise the same issue. Phobics come in more shades than I can count (at least without taking my socks off). Leaving any of these groups out in the cold is not culturally neutral, and that's without looking at the "hard cases".

How do we avoid helping censors?

The creation of any filter-specific categorisation system is de facto in the public domain (free as in beer, if not as in speech). Since categorising 10 million files into several hundred categories probably amounts to hundreds of thousands of hours of work (a rough estimate follows the list below), this represents a substantial gift to:

  1. repressive regimes, who will be able to filter selectively
  2. repressive cyber cafes/ISPs/schools/libraries/places of work, who will be able to insist on filter settings.
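
To put a rough number on the effort claimed above: even at one minute per file for a single classification pass, 10 million files already comes to well over a hundred thousand hours of volunteer time, before disputes, appeals and re-reviews. A back-of-envelope sketch, with the per-file time purely an assumption:

```python
# Back-of-envelope check on the "hundreds of thousands of hours" figure.
# The per-file review time and working hours per year are assumptions.
files = 10_000_000      # order of magnitude of files on Commons
minutes_per_file = 1    # optimistic: one minute to judge every category at once
hours = files * minutes_per_file / 60
print(f"{hours:,.0f} hours")                                    # ~166,667 hours
print(f"{hours / 2000:,.0f} person-years at 2,000 hours/year")  # ~83 person-years
```

Any of that work, once done and published, can be reused wholesale by the parties listed above.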

How do we avoid filter creep?

  1. The social pressure of "if you are looking with a filter you must be a pervert/blasphemer/terrorist" - we have already seen this on the Meta discussion page.
  2. The proposals to have filtering opt-out rather than opt-in, on the grounds that users will be offended before they know about or set the filter - again, this has already been proposed and supported.
  3. Drift of filter boundaries to include more images as an application of the precautionary principle
  4. The proposal to include text - in fact the existing system can already include text; an image making this very point was included in the discussion on Meta, but I don't think anyone realised - which shows the point is valid but that the image totally fails to convey it.

Does this solve a problem or move it?

The underlying question "What images is it appropriate to show?" can be answered in two ways:

  • The existing answer "Those that provide significant additional appropriate encyclopaedic information to a page"
  • The new answer which we must all as global liberals embrace "It depends on the reader."

The second answer, however, suffers from simply iterating the question to:

  • "What images is it appropriate to have in X filter category?"

And in fact it makes the matter worse, because rather than making one judgement about an image - when we use it - we now need to make several (probably many) when the image is accessioned.

The small questions

  • What are the PR implications? "Wikipedia retains blasphemous/indecent images", "Wikipedia makes seeing porn the default", etc.
  • What about context? An image might be appropriate on one page and not on another.
  • Is this an enabler for inappropriate images on pages, on the grounds that they will be filtered?
  • Will this chill donations of images?
  • Will this discourage editors?
  • Can page layouts designed for images work as well when the images are hidden?
  • Do we create liability for ourselves (collectively or individually) if something is wrongly classified?
  • Will there be "lamest ever" filter wars? (You betcha!)