
As more human life is controlled or guided by computer algorithms, there is growing concern about various biases that these algorithms encode, and the real-world implications of such biases

Psychology

As more human life is controlled or guided by computer algorithms, there is growing concern about the various biases these algorithms encode and the real-world implications of those biases. For example, in 2015 a Black developer realized that Google's photo recognition software tagged pictures of him and his friends as gorillas.[1] Similarly, facial recognition software was found to struggle to read Black faces.[2] Other problems have arisen, including Facebook automatically suspending the accounts of Native Americans for having seemingly "fake" names,[3] Google Translate replacing gender-neutral pronouns with gendered pronouns in English according to sexist stereotypes,[4] and airport body scanners flagging transgender bodies as threats.[5] Some have labeled this phenomenon "data violence," noting that coding choices can "implicitly and explicitly lead to harmful or even fatal outcomes."[6] (A more recent article in the New York Times documented one such case, in which a faulty facial recognition system led to a Black man's arrest and detention for a crime he did not commit.[7] I strongly encourage you to read this particular article!)

Some software developers and commentators have claimed that complaints about data violence are overhyped. For instance, some have argued that these problematic results are simply unfortunate side effects of data analyses and statistical models that are, in other respects, highly accurate and useful. On this view, software developers are not necessarily doing anything wrong when they create algorithms that, for the most part, work very well, even if that software has unintended biases. They may concede that in some cases an algorithm ends up reflecting a broader social injustice, leading to biased results, such as when racial disparities in arrest rates affect the output of software used to predict criminal behavior.[8] But even then, developers sometimes argue, the problem lies not with the software itself but with the broader injustices for which the developers themselves are not responsible. Relatedly, some argue that the division of labor in software development makes it the responsibility of the architect of the larger project, and not necessarily of the individual engineers who work on it, to attend to the software's broader social implications. Or perhaps companies need an "in-house philosopher" to consider the messy ethical concerns for them.[9]

However, others find this response to be little more than an attempt to avoid responsibility for the way developers' own actions help reinforce and reproduce biases and injustices. Many instances of racist and sexist errors are due to developers' biases, stereotypes, and interests. Software engineers carry with them assumptions about what should be considered "normal" and about the range of cases they must account for; these assumptions can affect how software is programmed and the types of testing it undergoes. Furthermore, engineers may often overlook important "edge cases," or the problematic implications of doing so, because the tech industry is overwhelmingly white and male.[10] Given that these software problems disproportionately harm members of historically marginalized groups, there is a further concern that leaving developer diversity unaddressed, or viewing these failures as merely instances of poor engineering, will not fix the underlying problem.

1. Who is responsible for the kinds of "data violence" described in the case?

2. Can we consider software sexist or racist, even if it doesn't have intentions or attitudes?

3. What, if anything, should software companies do to address data violence?
