Screening Free For Vermonters: Documentary Explores Bias Coded In Algorithms, Technology

An image from the documentary Coded Bias shows a computer program mapping a white mask held in front of a Black woman's face. The film, on the shortcomings and biases programmed into commonplace technologies like algorithms, is screening free for Vermonters through March 8. (Image courtesy of Coded Bias)

Racial bias — implicit, subconscious or out in the open — is a serious human problem. So serious that it's been detected in an unexpected place: the world of artificial intelligence, computers and facial recognition technology. A documentary that's screening free for Vermonters through March 8 delves into the problem.

VPR's Mitch Wertlieb discussed the documentary Coded Bias with Traci Griffith, an associate professor of media studies, journalism and digital arts at St. Michael's College. She moderated a December 2020 live-streamed panel discussion (https://youtu.be/awpfq7R6p20?t=286) with Coded Bias producer/director Shalini Kantayya, cast member and NYU journalism professor Meredith Broussard and UVM philosophy professor Randall Harp. A transcript of their conversation has been edited for clarity.

Coded Bias is streaming free for Vermonters through March 8.

Mitch Wertlieb: Coded Bias follows an MIT Media Lab researcher [Joy Buolamwini] who has discovered that many facial recognition technologies fail more often on darker-skinned faces. What prompted her to look into this problem in the first place?

Traci Griffith: [Buolamwini] was working in the MIT Media Lab as a researcher, and part of her job entailed using facial recognition technology. And she is a Black woman. She figured out that it was failing to recognize her face, and she realized this was a problem with the technology. It was inherent in the technology.

How did she find out that it was having trouble reading her face?

Well, it's interesting. When you view the documentary, she appears in front of the screen and it fails to “read” her face. She then places a white mask — it's kind of like a mime mask, that's the best way I could describe it — but she places this mask in front of her face, and then the technology reads it. It's amazing when you see it actually happening on screen.

What surprises me so much about this is — I guess it was an assumption that I had — that when human biases are taken out of the equation, things like racial bias should not be an issue. But I'm guessing that these programs are created by humans with their own biases, and I'm wondering if that's what this researcher discovered. That old "garbage in, garbage out" kind of situation, when it comes to computers. Is it that simple, or is it more complex than that?

I think your assumption is very correct, in that you would feel that maybe the use of machines might even out the playing field. The problem is that artificial intelligence is man-made, right? It's man-made! And the vast majority of programmers, and those who are creating this technology, are white males. And so the technology is created through their lens.


Our inherent biases, even those we don't recognize, are then built into the process of the machine. And the machine-learned algorithms that are created by humans are just as biased as we are.
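To make that "garbage in, garbage out" dynamic concrete: below is a minimal, hypothetical Python sketch (not from the film, with made-up numbers) in which a detector is trained on data where one group supplies 95% of the examples. Nothing in the code mentions race, yet the learned "norm" fits the majority group, and the under-represented group's failure rate comes out several times higher.

    import random

    random.seed(0)

    def make_example(group):
        # Toy one-number "face feature"; the two groups have different
        # distributions, standing in for inputs a pipeline handles unevenly.
        center = 0.3 if group == "A" else 0.7
        return random.gauss(center, 0.15), group

    # Skewed training set: 950 examples from group A, only 50 from group B.
    train = [make_example("A") for _ in range(950)] + \
            [make_example("B") for _ in range(50)]

    # "Training" here just learns one global norm: the mean feature value.
    # Because group A dominates the data, the norm sits near group A.
    threshold = sum(x for x, _ in train) / len(train)

    def detect(x):
        # The detector only "sees" faces whose feature is near the norm.
        return abs(x - threshold) < 0.25

    # Evaluate on a balanced test set and report the miss rate per group.
    test = [make_example(g) for g in ("A", "B") for _ in range(1000)]
    for group in ("A", "B"):
        xs = [x for x, g in test if g == group]
        misses = sum(1 for x in xs if not detect(x))
        print(f"group {group}: miss rate = {misses / len(xs):.0%}")

On a typical run, group A misses about one face in ten while group B misses roughly four in five: a toy-scale version of the skew the film documents, produced entirely by who was and wasn't in the training data.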

What did this researcher do after she discovered this problem? Did she bring it to the attention of people at MIT?

Not just MIT. She actually brought it to the attention of Congress, because a lot of these systems are being used by our government.

Facial recognition technology is rampant across the United States. We are often being surveilled without our knowledge, or without our understanding of what exactly that could mean for us.


And so as we walk around town, as we walk around cities, there are biometric systems in place used for general surveillance, and we don't even recognize that it's happening. It doesn't require our knowledge. It doesn't require our consent. It doesn't require our participation. [Through] our simply being, walking around town, we are being surveilled. And these biometric systems, facial recognition in this situation, are being used to identify people: your whereabouts, when you're there, how often you go there, et cetera.

These surveillance systems know what we're doing. And they're being used largely by our government, mainly [by] police, in monitoring where people might go, or where they might be.

We've seen a lot of this after the Jan. 6 uprising at the Capitol. Facial recognition technology is being used to identify people who were there.

Now, some could say that's good. Some could say that's bad. But we need to consider the bias that is inherent in this technology if we're going to be using it for such situations.

One of the issues that Black Lives Matter protesters have continually brought up is, we need to be seen. We need to be seen as citizens of this country who have the same rights as white people. We’re not being seen.

And it seems to me, this problem here, with this facial recognition technology not even being able to acknowledge this Black woman's face, shows that this problem goes well beyond Black Lives Matter.

It absolutely does. And it's not just about being seen, but about being seen for who you are. Because we've also found that there's a lot of fallibility in this facial recognition technology. So even if you are seen, a number of problems come about for particular groups of people — and in this situation, Black people in particular. Those groups are not seen … [or if they are, they are seen as] "the inaccuracy." That's part of the issue as well. So you're seen, but you're not seen for who you actually are. You're misidentified. So it compounds the problem.


Yes, the Black Lives Matter movement is pushing the idea of being seen as being recognized, but it's also about being recognized for who you are and not misidentified. [So you don't] get the knock on the door from the police officer, because your face has been recognized as being someplace that you weren't, because you've been misidentified. And so it is about accuracy, it's about being seen, but it's also about being seen for who you are, in a way that recognizes you as the individual that you are.
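The two harms described here correspond to two standard error rates in a matching system: a false negative (the system fails to see you at all) and a false positive (it "sees" you as someone else). A brief hypothetical sketch, with invented counts, of how those rates would be tallied per group:

    # Hypothetical per-group tallies for a face-matching system.
    # fn = false negatives (not matched to your own record: "not seen")
    # fp = false positives (matched to someone else: "misidentified")
    results = {
        "group A": {"fn": 30, "fp": 10, "queries": 1000},
        "group B": {"fn": 110, "fp": 70, "queries": 1000},
    }

    for group, r in results.items():
        print(f"{group}: not seen = {r['fn'] / r['queries']:.1%}, "
              f"misidentified = {r['fp'] / r['queries']:.1%}")

When both rates are higher for one group, as in these invented numbers, the two problems compound exactly as described: you are less likely to be seen, and more likely to be seen as someone you are not.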

This film is absolutely amazing. I will tell you, there's so many aspects of it that you just don't even think about, or recognize. And it opens your eyes to some of the problems with this AI technology. It's very, very interesting.

Have questions, comments or tips? Send us a message or tweet Morning Edition host Mitch Wertlieb @mwertlieb.
