In today's world, it is almost impossible not to interact with some form of technology daily. Problems arise when the people who create that technology don't reflect the demographics of the population that will use it.
Everyone has biases. It's impossible not to have them when we are raised in a society that is racist, sexist, homophobic, ableist, and more. For the creators of technology, those biases inevitably seep into their designs, whether it's a sensor that only detects light skin tones or an AI trained only on data that reinforces a white supremacist, patriarchal society.
These technological biases range from absurd and inconvenient to genuinely harmful. Some infrared sensors on soap and paper-towel dispensers don't recognize darker skin tones, so they fail to dispense at all. Digital cameras have asked whether someone blinked in photos of smiling Asian people. Stores whose point-of-sale systems display the name on a customer's card can out transgender people who haven't legally changed their names. Risk-assessment and predictive-policing algorithms use already racially skewed arrest rates to predict rearrest. Police have used facial recognition software to identify and arrest Black Lives Matter protesters exercising their First Amendment rights.
Some of these issues can be addressed by bringing more members of marginalized groups into the technology field. Another step toward eliminating these biases is hiring sensitivity testers, much as authors hire sensitivity readers, to make sure products work as intended for everyone who may use them. Really, this should be part of the quality assurance process. Other problems, particularly the policing algorithms, are extensions of existing oppressive structures and will take far more work to dismantle.
Businesses that use these technologies play a role as well. A business installing soap dispensers should make sure they work for everyone. As we shift toward contactless payment methods, staff should still ask for a name, or there should be an option for customers to enter the name they want to be called. To keep offensive or prank names off an order, a simple text filter could screen out profanity and common joke names.
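Such a filter could be sketched as follows. This is a minimal illustration, not a production system: the `BLOCKLIST` and `PROFANITY` sets here are hypothetical placeholders, and a real deployment would rely on a larger curated word list with human review of flagged names.

```python
import re

# Hypothetical example entries; a real list would be curated and much longer.
BLOCKLIST = {"seymour butts", "ben dover", "hugh jass"}
PROFANITY = {"damn", "hell"}

def is_acceptable_name(name: str) -> bool:
    """Return True if the entered name passes a simple blocklist check."""
    # Normalize: lowercase, strip punctuation, collapse whitespace.
    normalized = re.sub(r"[^a-z\s]", "", name.lower())
    normalized = re.sub(r"\s+", " ", normalized).strip()
    if normalized in BLOCKLIST:
        return False
    # Reject if any individual word is on the profanity list.
    return not any(word in PROFANITY for word in normalized.split())
```

Note that word-list filters are notoriously prone to false positives on legitimate names (the so-called Scunthorpe problem), which is another reason a flagged name should prompt a human check rather than an automatic rejection.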
Developers and engineers should take more care to ensure their creations aren't perpetuating inequalities. There need to be more members of marginalized groups in the field anyway, since tech is usually a cishet white boys' club. The general public also has a responsibility to raise awareness of these issues as they arise and to help combat them by holding people accountable.