image created by Margaux Peltat for the Chilled Cow YouTube channel
Time for #DEVDiscuss — right here on DEV 😎
Inspired by our ongoing #WeCoded celebration, tonight’s topic is...coding against bias!
It will come as a surprise to very few that algorithms, user interface design, and other aspects of the technology we use are just as biased as the people who make them — often unintentionally so.
For today's #DEVDiscuss, we want to uplift people, projects, and case studies that are coding against bias.
This is a rich topic, so we encourage you to think on it, do some research, and bring some thoughtful considerations to the conversation. Take all the time you need!
Questions:
- Who are some people or projects that are coding against bias?
- Do you know of any case studies to share?
- Have you ever used your technical skills to combat bias or other inequity?
- What are some things to keep in mind when coding against bias?
- Any triumphs, fails, or other stories you'd like to share on this topic?
Top comments (1)
Last year I developed the prototype for a piece of text analysis software aimed at highlighting bias in communications. It was a naive implementation that used pattern matching on the back end, but it worked as a tech demo. Other people are working on it now, and they hope to release it as open source by the end of the year, with a much more advanced ML back end.
I suppose this doesn't quite fit the "coding against bias" theme of the post, but it's been an interesting experience seeing how much of the language we use could be made more inclusive.
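For anyone curious what a naive pattern-matching approach like that might look like, here is a minimal sketch. The term list and the `flag_bias` function are hypothetical illustrations, not the actual project's code; a real tool would load a curated, community-reviewed dataset rather than hard-coding a handful of terms.

```python
import re

# Hypothetical example terms for illustration only; a real tool would
# draw on a much larger, community-maintained suggestion list.
SUGGESTIONS = {
    r"\bwhitelist\b": "allowlist",
    r"\bblacklist\b": "denylist",
    r"\bmanpower\b": "workforce",
}

def flag_bias(text):
    """Return (flagged term, suggestion, character offset) tuples,
    sorted by where each match appears in the text."""
    findings = []
    for pattern, suggestion in SUGGESTIONS.items():
        for match in re.finditer(pattern, text, flags=re.IGNORECASE):
            findings.append((match.group(0), suggestion, match.start()))
    return sorted(findings, key=lambda f: f[2])

print(flag_bias("Add the server to the whitelist and check manpower needs."))
```

Pattern matching like this is brittle (it can't tell context from coincidence), which is exactly why moving to an ML back end is the natural next step.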