March 2, 2020

Lucy Bernholz, Stanford University | Stanford Women in Data Science (WiDS) Conference 2020

About This Video

Lucy Bernholz (@p2173), Senior Research Scholar, Stanford University, sits down with Sonia Tagare for WiDS 2020 at Stanford University.


Diverse teams help build less biased algorithms, says Stanford researcher

As powerful as the benefits of artificial intelligence are, using biased data and defective AI models can cause a lot of damage.

To address that growing issue, human values must be integrated into the entire data science process, according to Lucy Bernholz (pictured), senior research scholar and director of the Digital Civil Society Lab at Stanford University.

“[Values] shouldn’t be a separate topic of discussion,” she said. “We need this conversation about what we’re trying to build for, who we’re trying to protect, how we’re trying to recognize individual human agency, and that has to be built in throughout data science.”

Bernholz spoke with Sonia Tagare, host of theCUBE, SiliconANGLE Media's mobile livestreaming studio, during the Women in Data Science conference in Stanford, California. They discussed the importance of values in data science, why diverse teams are necessary to build and analyze algorithms, and the work being done by the Digital Civil Society Lab.

Breaking the bias cycle
All data is biased because it is people who collect it, according to Bernholz. "And we're building the biases into the data science and then exporting those tools into biased systems," she highlighted. "And guess what? Problems are getting worse. So, let's stop doing that."

When creating algorithms and analyzing them, data scientists need to make sure that they are considering all the different types of people in the data set and understanding those people in context, Bernholz explained.
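One concrete first step toward that kind of review is auditing who is actually represented in a data set before any model is trained. The following is a minimal sketch, not a method described in the interview; the group labels and reference shares are hypothetical placeholders:

```python
from collections import Counter

def subgroup_shares(records, key):
    """Return each subgroup's share of the dataset for a given attribute."""
    counts = Counter(r[key] for r in records)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

def flag_underrepresented(shares, reference, tolerance=0.05):
    """Flag groups whose dataset share falls short of their share in a
    reference population by more than `tolerance`."""
    return [g for g, ref in reference.items()
            if shares.get(g, 0.0) < ref - tolerance]

# Hypothetical training set: an 80/20 split drawn from a 50/50 population.
records = [{"group": "A"}] * 80 + [{"group": "B"}] * 20
shares = subgroup_shares(records, "group")
print(flag_underrepresented(shares, {"A": 0.5, "B": 0.5}))  # ['B']
```

A check like this only surfaces representation gaps; understanding flagged groups "in context," as Bernholz puts it, still requires human judgment about why the gap exists and what it means for the model's use.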

“We know perfectly well that women of color face a different environment than white men; they don’t walk through the world in the same way,” she explained. “And it’s ridiculous to assume that your shopping algorithm isn’t going to affect that difference that they experience in the real world.”

Diverse perspectives are also needed both among the people who build the algorithms and in the management of the companies that decide whether and how to use them, she added.

“We need a different set of teaching mechanisms where people are actually trained to consider from the beginning what’s the intended positive, what’s the intended negative, and what is some likely negatives, and then decide how far they go down that path,” Bernholz concluded.

Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s coverage of the Women in Data Science conference:

In This Video
Lucy Bernholz, Senior Research Scholar and Director, Digital Civil Society Lab, Stanford University