Date of Award

Spring 5-22-2023

Document Type

Undergraduate Honors Thesis

Degree Name

Bachelor of Science in Computer Science

Department

Computer Science

Advisor

Dr. Jennifer Tillman

Abstract

Individuals from marginalized backgrounds experience disparate healthcare outcomes due to algorithmic bias in healthcare technology. Algorithmic biases, the biases that arise from the set of steps used to solve or analyze a problem, become evident when people from marginalized communities use healthcare technology. For example, many pulse oximeters, the medical devices used to measure oxygen saturation in the blood, cannot produce accurate readings for people with darker skin tones. As a result, people with darker skin tones may not receive proper health care because their pulse oximetry data are inaccurate. This research aims to highlight the ethical implications of marginalized communities facing disparate healthcare outcomes and to offer suggestions for preventing algorithmic bias in healthcare. To do this, the paper first gives examples of algorithmic bias, then discusses the ethical implications of those biases, and finally provides solutions that may help prevent algorithmic bias. It is unethical that marginalized communities are misread, misdiagnosed, and mistreated because of algorithmic biases. Additionally, the technological healthcare industry must be diversified in order to prevent algorithmic biases from arising in its medical technologies.
