New Jim Code
New Jim Code is a term coined by sociologist and Princeton University professor Ruha Benjamin in her book Race After Technology. It refers to algorithmic bias, specifically the unfair outcomes that result from prejudiced systems and developers, and examines the discrepancy between the supposedly "neutral" nature of algorithms and the biased output of automated tools.
Benjamin defines the term as "the employment of new technologies that reflect and reproduce existing inequities but that are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era".[1]: 5
Benjamin lists four dimensions as vital parts of the New Jim Code:
- Impartiality
- Personalization
- Merit
- The framework of a forward-looking enterprise that promises social progress
The term draws on the "New Jim Crow", coined by Michelle Alexander,[1]: 8 to highlight how existing racist systems extend into technology. It challenges the belief that technological innovation is race-agnostic, positing instead that race remains a factor in how algorithms are designed and deployed.
Examples
Data Sharing
One example Benjamin cites of algorithmic inequity is data sharing, which can result in the unequal targeting of marginalized groups by those who obtain the shared data. While data sharing appears to be a positive phenomenon, it has posed data privacy issues and allows acts of discrimination based on data generalizations. Stigmas also play a part in data sharing issues, as when African Americans have been denied services due to supposed health risks such as sickle cell.[2]
Predictive Guilt
Law enforcement's use of algorithms and AI tools is exemplified by products such as the Snapshot DNA Phenotyping Service by Parabon NanoLabs, which uses a technique called DNA phenotyping: samples collected from crime scenes are used to construct facial renditions of persons involved. Critics have questioned its validity; Dr. Yaniv Erlich compared the phenotyping to "science fiction", and other scientists have noted the lack of peer review.[3] These criticisms raise concerns about the effectiveness of the software and the implications its output could have for suspects.
Reactions to New Jim Code
In response to the New Jim Code, Ruha Benjamin posits that there are ways of thinking about technology development that can change the underlying culture that enables it.[1]: 183 She suggests that by viewing technology as a tool and scrutinizing the politics and purpose of that tool, thinkers will question the idea of technology design as a "solution" to the societal and cultural problems that contribute to inequity.[1]: 178–183 In doing so, she believes, the creation of technology that promotes and centers equity, rather than valuing innovation and profit, can begin over time.[1]: 183
References
- ^ a b c d e Benjamin, Ruha (2019). Race After Technology. Polity Press. https://www.ruhabenjamin.com/race-after-technology.
- ^ Heeney, C.; Hawkins, N.; de Vries, J.; Boddington, P.; Kaye, J. (2010-03-29). "Assessing the Privacy Risks of Data Sharing in Genomics". Public Health Genomics. 14 (1): 17–25. doi:10.1159/000294150. ISSN 1662-4246. PMC 2872768. PMID 20339285.
- ^ Southall, Ashley (2017-10-19). "Using DNA to Sketch What Victims Look Like; Some Call It Science Fiction". The New York Times. ISSN 0362-4331. Retrieved 2023-12-04.