Computing Must Pay Attention to Outcomes to Achieve Equity

There is a push gathering momentum across ACM for increased attention to cultural competency [12] in the training of computing professionals. In fact, the CS202X: ACM/IEEE-CS/AAAI Computer Science Curricula Taskforce [4] has a knowledge area subcommittee devoted to SEP, or Society, Ethics, and Professionalism [5]. This subcommittee is currently proposing a new set of student learning outcomes under the heading of Justice, Equity, Diversity, and Inclusion. These new learning foci include identifying the underlying assumptions and values built into system designs, and recognizing and describing the impact of diversity and under-representation of identities in the computing profession. The emphasis on underlying values and on describing impacts is not accidental. As many have come to recognize, computing has a legacy of compartmentalizing social concerns [2] (and social costs), and this segregation of intellectual concerns is often facilitated by the distance between those who design [8] and those who experience the impacts of those designs. This new curricular push within ACM and IEEE is about equipping future computer science graduates with the cultural competencies they need to connect the values informing local design practices to the public impacts of those designs.

Examining impact as a means of identifying the cultural values of a system is not a new idea. For many years now, those engaged in fairness studies have used terminology such as “disparate impact” to point to systems with racist, sexist, or otherwise discriminatory outcomes. The term itself comes from US labor law, where it “refers to practices in employment, housing, and other areas that adversely affect one group of people of a protected characteristic more than another, even though rules applied by employers or landlords are formally neutral” [13]. This notion of ‘formal neutrality’ has particular implications for computing, because the rules or algorithms imposed within a decision-making system [10] often make no explicit reference to gender, race, or any other protected class of people. Because of this absence of explicit prejudice, algorithmic designers may feel wrongly maligned when a system produces or amplifies inequities and they find themselves vilified in the public eye or facing legal proceedings. In retrospect, it is often possible to see how latent classism [7], racism [6], ableism [3], and/or sexism [11] shaped a particular system design. Computing professionals are increasingly being confronted with the reality that discriminatory intent can be irrelevant to determining accountability when an adopted system is found to have a disparate impact.
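To make the idea of ‘formal neutrality’ concrete, here is a minimal sketch in Python, using invented applicants, group labels, and a hypothetical screening rule (not data from any of the systems cited here). It shows how a rule that never mentions a protected attribute can still be checked for disparate impact by comparing selection rates across groups, the “four-fifths rule” heuristic that grew out of US employment-discrimination practice.

    # A minimal, hypothetical illustration (invented data): a screening rule that
    # never mentions a protected attribute, evaluated with the selection-rate ratio
    # ("four-fifths rule") used in disparate-impact analysis.

    applicants = [
        # (group, years of unbroken full-time experience) -- invented values
        ("A", 9), ("A", 7), ("A", 8), ("A", 6), ("A", 10),
        ("B", 4), ("B", 6), ("B", 3), ("B", 5), ("B", 7),
    ]

    def neutral_rule(years: int) -> bool:
        """A facially neutral screen: require at least 6 years of unbroken experience."""
        return years >= 6

    def selection_rate(group: str) -> float:
        members = [years for g, years in applicants if g == group]
        return sum(neutral_rule(y) for y in members) / len(members)

    rate_a = selection_rate("A")   # 1.00 with the invented data above
    rate_b = selection_rate("B")   # 0.40 with the invented data above
    print(f"Selection rate A: {rate_a:.2f}, B: {rate_b:.2f}")
    print(f"Impact ratio (B/A): {rate_b / rate_a:.2f}")  # below ~0.8 is a common red flag

Nothing in neutral_rule refers to a protected class; the gap only becomes visible when outcomes are compared across groups, which is exactly the kind of impact-oriented analysis the proposed learning outcomes ask students to be able to perform.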

The increasing expectations for social accountability that computing professionals face mean that computing programs must better equip students to see how racism (for example) and other inequities can get baked into, or inadvertently hidden behind, an otherwise ‘neutral’ rule. In CICS 305 this week, students read prominent sociological theorist Ruha Benjamin’s article in Science [1], in which she walks the reader through the racially disparate impact of an algorithmically driven health care optimization service. In this example, an algorithm used to predict risk, and thus to optimize the assignment of healthcare services, produced similar risk scores regardless of race, and those scores underreported the higher health risks that Black patients were experiencing. In analyzing the disparate outcomes of this tool, researchers determined that the mistake was the use of care costs as a proxy for the level of need for that care, the assumption being that more money is spent by and for sicker people. This assumption ignored the well-documented healthcare disparities experienced by Black people in the United States and, as a result, treated healthcare spending as a neutral variable. As Benjamin states in her article, “coded inequity is perpetuated precisely because those who design and adopt such tools are not thinking carefully about systemic racism.”
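To see how a proxy variable can do this kind of harm, here is a minimal sketch with invented numbers (it is not the actual model or data from the study Benjamin describes [1]): a program ranks patients by observed cost, treating cost as a stand-in for need, and refers the highest-cost half to extra care. When one group incurs lower costs at the same level of need, say because of unequal access to care, ranking on the proxy quietly under-refers that group.

    # A minimal sketch (invented numbers) of the proxy problem: rank patients by
    # observed cost as a stand-in for need, and refer the top half to extra care.

    patients = [
        # (group, underlying need, observed annual cost) -- all values invented;
        # group Y incurs lower costs than group X at the same level of need
        ("X", 8, 9000), ("X", 6, 7000), ("X", 4, 4500), ("X", 2, 2000),
        ("Y", 8, 6000), ("Y", 6, 4000), ("Y", 4, 2500), ("Y", 2, 1000),
    ]

    # 'Formally neutral' referral rule: refer the highest-cost half of all patients.
    by_cost = sorted(patients, key=lambda p: p[2], reverse=True)
    referred = set(by_cost[: len(patients) // 2])

    for group in ("X", "Y"):
        high_need = [p for p in patients if p[0] == group and p[1] >= 6]
        referred_high_need = [p for p in high_need if p in referred]
        print(f"Group {group}: {len(referred_high_need)} of {len(high_need)} high-need patients referred")

With these invented numbers, every high-need patient in group X is referred but only half of those in group Y are, even though the referral rule never looks at group membership. The inequity enters through the choice of target variable, which is why the corrective work in the study Benjamin discusses focused on predicting health need more directly rather than predicting cost.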

Taking the opportunity to learn about the history of systemic racism (or sexism, or ableism) in one’s country, community, or profession is just one way that those hoping to join the computing industry can equip themselves to be better developers and more equitable designers. This recognition of the need for a less siloed, less intellectually segregated education is precisely why the new student learning outcomes are being proposed for the SEP knowledge area mentioned at the beginning of this entry. None of us can know everything there is to know. Fortunately, developing cultural competency does not mean knowing everything. It is reasonable to expect that we do more to learn the history of our field and about the identities of the people who experience the products of our designs. It is reasonable to expect researchers to become minimally educated in theories of race, intersectionality, and the ethical collection of human data [9]. And we can develop an openness to examining the impacts of the systems and tools we design so that, when problems are identified, we can demonstrate accountability by making swift corrections that realize the values of justice, equity, diversity, and inclusion that we claim on our website.

[A longer version of this entry will appear in ACM SIGCAS: Computers and Society Magazine]

References:

[1] Ruha Benjamin. 2019. Assessing risk, automating racism. Science 366, 6464 (25 Oct. 2019), 421-422. DOI: https://doi.org/10.1126/science.aaz3873 

[2] CHM Editorial. 2021. Decoding Racism in Technology. CHM Blog. (5 May 2021) Retrieved from  https://computerhistory.org/blog/decoding-racism-in-technology/ 

[3] Kevin Collier. 2021. Twitter’s racist algorithm is also ageist, ableist and Islamophobic, researchers find. NBC News (9 Aug. 2021). Retrieved from https://www.nbcnews.com/tech/tech-news/twitters-racist-algorithm-also-ageist-ableist-islamaphobic-researchers-rcna1632

[4] CS202X. 2022. ACM/IEEE-CS/AAAI Computer Science Curricula. Retrieved from https://csed.acm.org/ 

[5] CS202X. 2022. Society, Ethics, and Professionalism. Retrieved from https://csed.acm.org/create-your-website-with-blocks/knowledge-areas/society-ethics-and-professionalism/

[6] Avriel Epps-Darling. 2020. How the racism baked into technology hurts teens. The Atlantic. (24 Oct. 2020) Retrieved from https://www.theatlantic.com/family/archive/2020/10/algorithmic-bias-especially-dangerous-teens/616793/ 

[7] Michele Gilman. 2020. AI algorithms intended to root out welfare fraud often end up punishing the poor instead. The Conversation, (14 Feb. 2020), Retrieved from https://theconversation.com/ai-algorithms-intended-to-root-out-welfare-fraud-often-end-up-punishing-the-poor-instead-131625

[8] Brent Hecht, Lauren Wilcox, Jeffrey P. Bigham, Johannes Schöning, Ehsan Hoque, Jason Ernst, Yonatan Bisk et al. 2021. It’s time to do something: mitigating the negative impacts of computing through a change to the peer review process. arXiv:2112.09544. Retrieved from https://arxiv.org/ftp/arxiv/papers/2112/2112.09544.pdf

[9] Khari Johnson. 2021. A move for ‘algorithmic reparation’ calls for racial justice in AI. Wired. (23 Dec. 2021), Retrieved from https://www.wired.com/story/move-algorithmic-reparation-calls-racial-justice-ai/ 

[10] S.C. Olhede and P.J. Wolfe. 2018. The growing ubiquity of algorithms in society: implications, impacts and innovations. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 376, 2128 (6 Aug. 2018), 20170364. DOI: https://doi.org/10.1098/rsta.2017.0364 

[11] Genevieve Smith and Ishita Rustagi. 2021. When good algorithms go sexist: why and how to advance AI gender equity. Stanford Social Innovation Review (31 March 2021). DOI: https://doi.org/10.48558/A179-B138

[12] Alicia Nicki Washington. 2020. When twice as good isn’t enough: the case for cultural competence in computing. Proceedings of the 51st ACM Technical Symposium on Computer Science Education. Association for Computing Machinery, New York, NY, USA, 213–219. DOI: https://doi.org/10.1145/3328778.3366792 

[13] Wikipedia. 2022. Disparate impact. Wikipedia: the Free Encyclopedia. Retrieved from  https://en.wikipedia.org/wiki/Disparate_impact