Draft:New Jim Code


New Jim Code is a term coined by sociologist and Princeton University professor Ruha Benjamin in her book Race After Technology. The term concerns algorithmic bias, specifically the unfair outcomes that result from prejudiced systems and developers, and examines the discrepancy between the supposedly “neutral” nature of algorithms and the biased output of automated tools.

Benjamin defines the term as “the employment of new technologies that reflect and reproduce existing inequities but that are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era”.[1]: 5

Benjamin lists four dimensions as vital parts of the New Jim Code:

  1. Impartiality
  2. Personalization
  3. Merit
  4. The framework of a forward-looking enterprise that promises social progress

The term draws on Michelle Alexander’s “New Jim Crow”[1]: 8  to highlight how existing racist systems extend into technology. Understanding race in the context of technology is central to understanding the terminology: the concept challenges the belief that technological innovation is race-agnostic, arguing instead that race does shape algorithmic outcomes.

Examples

Data Sharing

One example of algorithmic inequity that Benjamin cites is data sharing, which can result in the unequal targeting of marginalized groups by those who obtain the shared data. While data sharing may appear to be a positive phenomenon, it has posed data privacy issues and allows discrimination to occur based on data generalizations. Stigma also plays a part in data sharing issues, as when African Americans are denied services because of supposed health risks such as sickle cell.[2]

Predictive Guilt

The use of algorithms and AI tools by law enforcement is seen in products such as the Snapshot DNA Phenotyping Service by Parabon NanoLabs, which uses a technique called DNA phenotyping: DNA samples collected from crime scenes are used to construct facial renditions of the people involved. Critics have questioned its validity; Dr. Yaniv Erlich compared the phenotyping to “science fiction”, and other scientists have noted the lack of peer review.[3] These criticisms raise concerns about the effectiveness of the software and the implications its output could have for suspects.

Reactions to New Jim Code

In response to the New Jim Code, Ruha Benjamin posits that there are ways of thinking about technology development that can change the underlying culture that enables it.[1]: 183  She suggests that by viewing technology as a tool and attending to the politics and purpose of that tool, thinkers will question the idea of technology design as a “solution” to the societal and cultural problems that contribute to inequity.[1]: 178–183  In doing so, she believes, the creation of technology that promotes and centers equity, rather than valuing innovation and profit, can begin over time.[1]: 183

References

  1. ^ a b c d e Benjamin, Ruha (2019). Race After Technology. Polity Press. https://www.ruhabenjamin.com/race-after-technology.
  2. ^ Heeney, C.; Hawkins, N.; de Vries, J.; Boddington, P.; Kaye, J. (2010-03-29). "Assessing the Privacy Risks of Data Sharing in Genomics". Public Health Genomics. 14 (1): 17–25. doi:10.1159/000294150. ISSN 1662-4246. PMC 2872768. PMID 20339285.
  3. ^ Southall, Ashley (2017-10-19). "Using DNA to Sketch What Victims Look Like; Some Call It Science Fiction". The New York Times. ISSN 0362-4331. Retrieved 2023-12-04.