Technology reflects society’s bigotry

In today's world, it is almost impossible to avoid interacting with some form of technology on a daily basis. Problems arise when the people who create this technology don't reflect the demographics of the general population that will be using it.


Everyone has biases. It's impossible not to when we are raised in a racist, sexist, homophobic, and ableist society. For the creators of technology, those biases inevitably seep into their designs, whether it's a sensor that only detects light skin tones or an AI trained only on data that reinforces a white supremacist, patriarchal society.


These technological biases range from absurd and inconvenient to genuinely harmful. Some infrared sensors on soap and paper towel dispensers fail to recognize darker skin tones, so they never dispense anything. Digital cameras have asked whether someone blinked in photos of smiling Asian people. Point-of-sale systems that display the name on a customer's card can out transgender people who haven't legally changed their names. Risk assessment and predictive policing algorithms use already racially skewed arrest rates to predict rearrest. Police have used facial recognition software to identify and arrest Black Lives Matter protesters exercising their First Amendment rights.


Some of these issues can be mitigated by having more members of marginalized groups in the technology field. Another step toward eliminating these biases is hiring sensitivity testers, much as authors hire sensitivity readers, to make sure products work as intended for everyone who may use them. Really, that should be part of the quality assurance process. Other problems, particularly the policing algorithms, are extensions of already existing oppressive structures that will take far more work to dismantle.


Businesses that use these technologies play a role as well. Before installing automatic soap dispensers, they should confirm that the sensors work for every skin tone. As we shift toward contactless payment methods, staff should still ask for a name, or there should be an option for customers to input the name they want to be called. To keep offensive or prank names off an order, a simple text filter could screen out profanity and common joke names, as in the sketch below.
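As a rough illustration of how simple such a filter could be, here is a minimal Python sketch. The blocklists (PROFANITY, JOKE_NAMES) and the function name is_acceptable_name are hypothetical placeholders; a real system would rely on a maintained profanity list and fuzzier matching to catch creative spellings.

# Minimal sketch of the kind of text filter described above: it checks a
# customer-entered name against small, hypothetical blocklists.
# Placeholder entries; any production list would be far larger and maintained.
PROFANITY = {"damn", "hell"}
JOKE_NAMES = {"ben dover", "mike hunt", "seymour butts"}

def is_acceptable_name(name: str) -> bool:
    """Return True if the name passes the basic profanity/joke-name check."""
    # Lowercase and collapse extra whitespace so "Ben  Dover" matches "ben dover".
    normalized = " ".join(name.lower().split())
    if normalized in JOKE_NAMES:
        return False
    # Reject if any individual word is on the profanity list.
    return not any(word in PROFANITY for word in normalized.split())

if __name__ == "__main__":
    for candidate in ["Alex", "Ben Dover", "Jamie Lee"]:
        print(candidate, "->", "ok" if is_acceptable_name(candidate) else "rejected")

Even a filter this basic would catch the most common pranks without requiring staff to police names themselves, though it would still need a human fallback for false positives (real people do have unusual names).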


Developers and engineers should take more care to ensure their creations aren't perpetuating inequalities. There also need to be more members of marginalized groups in the field, since tech is usually a cishet white boys' club. And it is the responsibility of the general public to raise awareness of these issues as they come up and to help combat them by holding people accountable.
