
Racism in AI

May 17, 2020 · 7 min read

Racism in AI and the technology industry is real. It’s real and it’s f*cked. I will never understand, and I will never feel, what black people go through on a daily basis, because I’m a privileged white man. I will never have to worry about going to certain countries on holiday, I will never have to worry about hearing police sirens, and I will never have to worry about being able to get a job in technology. Why? Because I’m white. It’s not because I’m well behaved, or well educated, or have nothing to worry about. It’s because I’m white and not black. Do you know how messed up that is? What I promise is this: I will stand by the black community against all forms of racism, I will continue to learn and listen to how I can help make a difference, and I will speak out more on the racial injustice that occurs.

Do you want to know how many people code? 0.5%. How many people actively use the internet, and therefore technology? 59%. That means 0.5% of the world is responsible for the technology people use on a daily basis. And do you want to know how many of the executives running these tech companies are white? 83%. This is a number that is too high. This is a number that creates bias in decisions. This is a number that is a problem.

Let me show you a few examples that demonstrate the problem.

FaceApp

FaceApp developed a deep learning/computer vision app that alters a person’s face when they submit a photo of themselves. How it altered the face was based on a trained neural network model. Some features included showing the person as ‘hot’, old, as a man, or as a woman. Do you want to know what data they used to train their models? The most liked photos on Instagram. And what was the majority race across those photos? White. This meant that when a black person submitted a photo of themselves and applied the ‘hot’ filter, it would bleach their skin and make their nose more ‘European’. The team that created this model did not have enough diversity, in their dataset or in their team, and so never even considered the consequences. This is what bias looks like in AI. It doesn’t come from the technology. It comes from the people behind the technology not checking themselves and their ways of working.

[Image: FaceApp filter results]
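To make this concrete, here’s a toy sketch of the mechanism (my own illustration, not FaceApp’s actual pipeline): a classifier trained on data that is 95% one group performs fine on that group and quietly fails on the group it barely saw.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Toy 2-D "face features" with a group-specific offset; the label is
    # some attribute the app tries to predict from the photo.
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=n) > 2 * shift).astype(int)
    return X, y

# Training data mirrors "most liked Instagram photos": 95% group A, 5% group B.
X_a, y_a = make_group(950, shift=0.0)
X_b, y_b = make_group(50, shift=2.0)
model = LogisticRegression().fit(np.vstack([X_a, X_b]), np.concatenate([y_a, y_b]))

# Evaluate on balanced held-out sets: the model works for the group it
# mostly saw, and quietly fails for the group it barely saw.
for name, (X, y) in {"group A": make_group(1000, 0.0),
                     "group B": make_group(1000, 2.0)}.items():
    print(f"{name} accuracy: {model.score(X, y):.2f}")
```

The model isn’t ‘racist code’; it’s an honest mirror of a skewed dataset that nobody audited.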

Runaway Feedback Loops

This is a big, big problem within AI. A lot of police departments in the US use predictive policing algorithms. These basically ask: where should we go to stop someone who is about to commit a crime? Show us where a crime is about to be committed so we can get a jump start and reach the location before it happens. Do you see where the problem is going to arise? An algorithm feeds back the data it is given. So if certain police officers have engaged in racial profiling and have actively gone out to find, stop and arrest black people, the algorithm will more than likely suggest officers travel to black neighbourhoods to check for people committing crimes. So they will, they’ll arrest people, the data is fed back to the algorithm, and the loop continues. Again and again, targeting black communities.

Once more, this starts with the people behind the technology and their pure ignorance of the problem: not fact-checking the data, or calling out the racist people behind the data - the police officers. If you want to read more about predictive policing, please check out the paper.
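To see the loop in action, here’s a toy simulation (my own simplified model, not the one from the paper): two neighbourhoods with identical true crime rates, where all patrols are sent wherever the recorded numbers are highest, and only patrolled areas generate new records.

```python
import numpy as np

rng = np.random.default_rng(42)

true_crime_rate = np.array([0.1, 0.1])  # A and B are IDENTICAL in reality
arrests = np.array([5, 10])             # B starts over-policed in the records

for day in range(365):
    # The "predictive" step: send all 100 patrols to whichever
    # neighbourhood has the most recorded crime so far.
    patrols = np.zeros(2, dtype=int)
    patrols[np.argmax(arrests)] = 100
    # You can only record crime where you look, and those records are
    # fed straight back into tomorrow's prediction.
    arrests = arrests + rng.binomial(patrols, true_crime_rate)

print("recorded arrests after a year:", arrests)
# Neighbourhood B accumulates thousands of records while A stays frozen,
# even though both neighbourhoods commit crime at exactly the same rate.
```

Nothing in the loop ever checks the records against reality, which is exactly the ignorance called out above.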

Google Computer Vision Application

Most people are not aware of machine learning, and that’s the truth. When most people see an automatic tag of a dog on their image, they think someone has gone through the effort of manually adding that tag. Because of that fact alone, we have a responsibility. The image below is disgusting, but it is an important example of how a team that lacks diversity creates bias in the data and therefore bias in the software. Google really tried to blame this error on ‘the algorithm’s lack of common sense’. Come on now. Pathetic.

[Image: Google Photos automatic tagging error]

The Justice System

In the USA, a lot of judges have access to sentencing guideline software that recommends a kind of sentence for the individual in front of them. This software was also used to predict whether a criminal would re-offend. Let me demonstrate with two cases.

Case One. Brisha Borden was running late to pick up her god-sister from school when she noticed an unlocked bike and scooter. So Borden and her mate picked them up and tried to ride them down the street. Just as they were realising they were too big for the bike and scooter, the mum of the kid they belonged to came running over and exclaimed they were hers, and they dropped them instantly. But it was too late: a neighbour had already called the police, and they were charged with burglary and petty theft over the $83 worth of goods. Borden’s only previous charges were misdemeanors from when she was a juvenile.

Case Two. Vernon Prater was arrested for shoplifting $84 worth of tools from Home Depot. Prater had already been convicted of armed robbery and attempted armed robbery, for which he had served five years in jail, along with another sentence he served for a further armed robbery.

Yet when they were both booked, the software spat out the likelihood of each individual reoffending. Borden, who is black, was rated high risk. Prater, who is white, was rated low risk.

Two years later? Borden had not been charged with any other crimes, but Prater, the man rated low risk, was serving an 8-year jail sentence for breaking into a warehouse and stealing $8,000 worth of electrical equipment.

This is common across the US and influences judges’ decisions when sentencing an individual. Now DO NOT tell me there is no white privilege occurring and that this is not racist. There is racism in the justice system and it is not okay. To read more about the study of racism in the justice system through machine learning, ProPublica released this great study, which I have summarised.
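If you want to see what that analysis looks like in practice, the heart of ProPublica’s study boils down to comparing false positive rates between races: how often each group gets labelled high risk and then does not reoffend. Here’s a minimal sketch with made-up rows (the column names are my own illustration; ProPublica’s published COMPAS dataset uses similar fields).

```python
import pandas as pd

# Made-up toy rows; ProPublica's real COMPAS data has thousands of cases.
df = pd.DataFrame({
    "race":       ["Black", "Black", "Black", "White", "White", "White"],
    "high_risk":  [True,    True,    False,   False,   False,   True],
    "reoffended": [False,   True,    False,   False,   True,    True],
})

for race, group in df.groupby("race"):
    # False positive rate: labelled high risk but did NOT go on to reoffend.
    did_not_reoffend = group[~group["reoffended"]]
    fpr = did_not_reoffend["high_risk"].mean()
    print(f"{race}: false positive rate = {fpr:.2f}")
```

In ProPublica’s real data, black defendants were roughly twice as likely as white defendants to be falsely flagged as high risk.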

Final Words

Do not think this isn’t happening now. This is real, and it is out there in technology today. Do not wait until the problem comes up to start thinking about how you would react to it. Start thinking now. Start thinking about how you will handle it when it comes up, because it definitely will, and you have a responsibility to bring these issues to the forefront and make others aware of them. Start educating yourself now. Just because something does not affect you does not mean it doesn’t affect others. Act now and speak up, whilst continually learning and listening. We have to make sure this isn’t just a trend. This is here now, tomorrow and forever, and we will help black people in their fight against injustice.

Important Resources

Please check out this amazing website that shows ways you can help now, tomorrow and forever.

Black Lives Matter

Also, check out these amazing black-owned tech companies that are doing great things within the tech industry. I will be adding more as I come across them.

Black Young Professionals

People of Colour in Tech

Hustle Crew

Black Girls Who Code

Gym Streak

Disha Pages

Eden BodyWorks

Our Universe

EatOkra

