
Social media is the perfect petri dish for bias. The solution is for tech companies to slow us down.

Like the drug-resistant superbug pictured here, our biases against other people can “migrate to other spaces,” psychologist Jennifer Eberhardt says.
Stanford psychology professor Jennifer Eberhardt, the author of Biased: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do, says Nextdoor reduced racial profiling by 75 percent by introducing a tiny bit of friction for users.

By Eric Johnson @HeyHeyESJ

Sep 3, 2019, 6:20am EDT

No matter how well-intentioned their creators were, tech products can encourage and amplify existing racial biases. And Stanford University psychology professor Jennifer Eberhardt says companies can take meaningful steps to reduce that effect — although it may come at the cost of the twitchy virality that has helped them grow so quickly.

“Bias can kind of migrate to different spaces,” Eberhardt said on the latest episode of Recode Decode with Kara Swisher. “All the problems that we have out in the world and in society make their way online. ... You’re kind of encouraged to respond to that without thinking and to respond quickly and all of that. That’s another condition under which bias is most likely to be triggered, is when you’re forced to make decisions fast.”

In her most recent book, Biased: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do, Eberhardt recounts how the local social network Nextdoor reduced racial profiling among its users by 75 percent: It introduced some friction.

“They realized, after talking to me and other people and consulting the literature, that if they wanted to fight racial profiling on the platform that they were going to have to slow people down,” she said. “It used to be the case that if you saw someone suspicious you could just hit the crime and safety tab and then you could shout out to all your neighbors, ‘Suspicious person!’ Oftentimes, the person who was suspicious was a black man, and in the cases where it was racial profiling, this person was doing nothing.”

Requiring those users to complete a three-item checklist — which included an educational definition of racial profiling — shifted the “cultural norm,” Eberhardt explained, away from “see something, say something” and toward “if you see something suspicious, say something specific.”

“Companies have a huge role to play here,” she said. “I think they have a responsibility in all of this because of the power they wield. I mean, they cannot just affect one life but many, many lives, millions of lives. That checklist changed the mindset of all of these people now who are on Nextdoor. I think recognizing that power and marshaling that power is a good thing.”