The big problem with recruiting and hiring is that it’s done by humans, and humans make mistakes and are subject to their own biases. So, the best solution is to create programs that evaluate candidates and choose the best person. Problem solved!
But the latest research in artificial intelligence shows us that even robots have bias problems.
Researchers used a neural network called CLIP to control a robotic arm. The arm then sorted blocks bearing people’s pictures into different categories. Easy enough, right? But when the researchers asked the robot to pick the doctors, janitors, murderers, and homemakers out of the face-bedecked blocks, the robot demonstrated “toxic stereotypes.”
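For the curious, here’s roughly how a system like CLIP makes that call: it converts an image and a set of text labels into vectors in a shared space, then picks the label whose vector sits closest to the image’s. A toy sketch with made-up numbers (real CLIP embeddings have hundreds of dimensions; these vectors are purely illustrative):

```python
import numpy as np

def cosine_similarity(a, b):
    """Measure how closely two embedding vectors point in the same direction."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Hypothetical 4-dimensional embeddings (real models use 512+ dimensions).
image_embedding = np.array([0.9, 0.1, 0.3, 0.2])
captions = {
    "a photo of a doctor":    np.array([0.8, 0.2, 0.4, 0.1]),
    "a photo of a janitor":   np.array([0.1, 0.9, 0.2, 0.5]),
    "a photo of a homemaker": np.array([0.3, 0.4, 0.8, 0.6]),
}

# Zero-shot classification: choose the caption closest to the image.
best = max(captions, key=lambda c: cosine_similarity(image_embedding, captions[c]))
print(best)  # → a photo of a doctor
```

The catch is where those vectors come from: they’re learned from enormous piles of internet images and captions, so if the internet pairs certain faces with certain jobs along race or gender lines, the “closest label” inherits that pattern.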
To keep reading, click here: Robots are Racist and Sexist: AI Isn’t Ready for Recruiting