In today's digital world, companies and websites embed algorithms that feed on user data to tailor each person's experience. The goal of these algorithms is to show users content that increases retention. As a result, users see a narrow slice of content that relates to them and are walled off from a wider variety of topics. The content users are suggested can also be shaped by inherent factors such as their location, race, or gender. This is known as algorithmic bias, and it creates disparities in what users see on social media. Black users in particular are nearly twice as likely as white users to see race-related posts (Pew Research Center). This puts Black users, especially youth, in an echo chamber on social media where they are mostly shown posts that either involve race relations or push Black stereotypes, which can hinder their development and warp not only their view of the world but their view of themselves as well.
Algorithms are learning programs that analyze data and correlate certain characteristics or occurrences in a data set with specific factors. For example, if an algorithm reviews the spending patterns of families of varying economic status in a residential area, it will likely correlate higher spending with higher wealth. These learned correlations are meant to help data scientists gain a deeper understanding of the factors contributing to a given problem, but algorithms can cause harm when they tie outcomes to discriminatory factors such as race or gender.
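The correlation step described above can be sketched in a few lines of code. This is a minimal illustration, not any company's actual system; the families, spending figures, and wealth figures are all hypothetical.

```python
# A minimal sketch of how an algorithm "learns" a correlation from data.
# All numbers below are hypothetical, for illustration only.

def pearson_correlation(xs, ys):
    """Pearson correlation coefficient between two equal-length lists.

    Values near +1 mean the two quantities rise together; values near -1
    mean one falls as the other rises; values near 0 mean no linear link.
    """
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical monthly spending and household wealth for six families.
spending = [1200, 1500, 2100, 2600, 3400, 4000]
wealth = [40_000, 55_000, 80_000, 95_000, 140_000, 170_000]

# A strong positive coefficient: the program "learns" that higher
# spending goes with higher wealth.
print(pearson_correlation(spending, wealth))
```

The danger the paragraph above describes arises when a column like `spending` is replaced, or silently accompanied, by a proxy for race or gender: the math works identically, so the program will correlate outcomes with that factor just as readily.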
Algorithmic bias doesn’t just harm minorities in the digital world; it also has repercussions in the real one. For example, many large US hospitals use algorithms to decide which procedures patients should receive based not just on the severity of their condition but also on their race and their predicted ability to afford treatment. This has led many Black patients to be denied medical care they need based on cost statistics alone. Algorithms have also been used by the judicial system to predict the likelihood that defendants will reoffend. Because of existing biases, Black defendants are much more likely to face longer sentences or to be labeled as future offenders based on race alone (The Atlantic).
Because of College Tribe’s devotion to young Black boys and their futures, we feel obligated to take action on this pressing matter. We are therefore launching Project algoRHYTHM, which aims to stop the spread of algorithmic bias. The project has three major components. First, a social media campaign will spread awareness of the issue and inform people of the threats it poses. Second, we will build a coalition of people and organizations that will work toward solutions to curb algorithmic bias. Third, we will build a forum and community page on the College Tribe website that will allow the coalition to communicate and organize events. This is only the beginning, and the project will grow to include more components as it progresses. Keep an eye out for news about Project algoRHYTHM.