
© 2024 Vermont Public | 365 Troy Ave. Colchester, VT 05446


Software Developed At Dartmouth Looks To Stop The Spread Of Extremist Videos

Credit: BrianAJackson / iStock.com

Software developed at Dartmouth's Counter Extremism Project was modeled on Microsoft software aimed at preventing the dissemination of online child pornography; it extracts digital signatures from video and audio recordings.

A computer scientist at Dartmouth College has developed new software aimed at quickly identifying and stopping the spread of extremist videos online that are used to incite violent attacks. Dr. Hany Farid developed the software as part of Dartmouth's Counter Extremism Project.

The system works much like one developed with Microsoft to prevent the dissemination of child pornography online, which allowed social media companies to find, remove and report instances of that material.

"The way that technology works is that every image, every video, every audio recording has a distinct signature, very much like human DNA. There's a distinct signature that we are able to extract from the underlying medium. Even as that medium undergoes changes as it makes its way through the internet, we can identify it," Farid explained. "Working in partnership with the National Center for Missing & Exploited Children, we extracted digital signatures from known child pornography content that we know continually, year after year, gets distributed. We then compare the signatures to things being uploaded to Facebook, to Twitter, to Instagram, etc., and we simply ask, is this known bad content? And if it is, we will remove and report it."
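The robust signatures Farid describes can be illustrated with a toy perceptual hash. The actual CEP and Microsoft algorithms are not public, so the average-hash scheme below, along with its function names and match threshold, is an illustrative assumption, not the real system: it collapses a grayscale image into a 64-bit signature that tolerates small changes, then compares that signature against a blocklist of known-bad content.

```python
# Illustrative sketch only: a toy perceptual "signature" in the spirit
# of the robust hashing Farid describes. The real CEP/PhotoDNA
# algorithms are not public; this average-hash scheme is a stand-in.

def average_hash(pixels, size=8):
    """Collapse a grayscale image (list of rows of 0-255 ints) into a
    64-bit signature that survives small edits like re-compression."""
    h, w = len(pixels), len(pixels[0])
    # Downscale to size x size by block averaging.
    cells = []
    for i in range(size):
        for j in range(size):
            block = [pixels[y][x]
                     for y in range(i * h // size, (i + 1) * h // size)
                     for x in range(j * w // size, (j + 1) * w // size)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    # Each bit records whether a cell is brighter than the image mean.
    return sum(1 << k for k, v in enumerate(cells) if v >= mean)

def hamming(a, b):
    """Number of differing bits between two signatures."""
    return bin(a ^ b).count("1")

def is_known_bad(signature, blocklist, threshold=6):
    """Match an upload's signature against known-bad signatures,
    tolerating the small changes content picks up as it spreads."""
    return any(hamming(signature, bad) <= threshold for bad in blocklist)
```

An upload pipeline would compute `average_hash` for each incoming image and call `is_known_bad` against the shared blocklist; a small Hamming-distance tolerance is what lets the match survive re-encoding, resizing and similar transformations.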

Right now, most sites follow a manual process for removing content: they wait for a report of a violation or offensive content, which a human moderator then reviews and takes down.

"Once it's removed with a manual intervention and with the understanding that it violates terms of service, we can now remove it forever from a site." -Dr. Hany Farid, Counter Extremism Project

"They suffer from a whack-a-mole problem. It comes up, it stays up for a little while, it comes down, and then the poster posts it again and they have to wait a few weeks," Farid said. "So what we are saying is that if you have agreed that the content violates your terms of service and it has to come down, we will extract a signature from that image, audio or video recording and then any time the content that matches that signature comes up, we will not allow it up. Therefore once it's removed with a manual intervention and with the understanding that it violates terms of service, we can now remove it forever from a site."
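The workflow Farid outlines, a single manual takedown followed by automatic blocking of every later re-upload, could be sketched as follows. The class and method names here are hypothetical, and an exact SHA-256 digest stands in for the robust perceptual signature the real system uses (which also matches slightly altered copies):

```python
# Hedged sketch of the takedown-then-block workflow described above.
import hashlib

class UploadGate:
    def __init__(self):
        # Signatures of content a human has already ruled a violation.
        self.blocklist = set()

    def _signature(self, content: bytes) -> str:
        # Simplification: exact digest; the real system uses a robust
        # hash that also matches re-encoded or lightly edited copies.
        return hashlib.sha256(content).hexdigest()

    def record_takedown(self, content: bytes) -> None:
        """Called once, after a human reviewer confirms the content
        violates the terms of service."""
        self.blocklist.add(self._signature(content))

    def allow_upload(self, content: bytes) -> bool:
        """Every later upload is checked automatically, so known-bad
        content never goes live again."""
        return self._signature(content) not in self.blocklist
```

The point of the design is that human judgment happens exactly once, at `record_takedown`; everything afterward is an automatic signature lookup, which is what removes the whack-a-mole cycle.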

When the child pornography system was built, video was not yet as ubiquitous, so the software mostly tracked photographs. Now video and audio dominate, and the software extracts digital signatures from those recordings as well. The CEP is careful about how much technical detail it releases, but the system does analyze the actual contents, extracting distinct features that remain stable over the lifetime of the video.

"One of the very nice things about these signatures is that if I hand you a signature, which is just a bunch of numbers, you can't reconstruct the content that it comes from. That means we can safely ship those signatures around without worrying about the content leaking," Farid said.

Social media companies such as Facebook and Twitter have yet to commit to using the software; they were similarly hesitant when the child pornography software was released.

"This is not a First Amendment issue," Farid said. "Facebook, Twitter, YouTube, Google, they get to decide what goes on their network or not. What makes them nervous is the precedent this sets. That if we start taking down child pornography and now we start taking down extremist content, what's next down the line? It's a legitimate concern to have. We feel like this is a narrowly-defined area. We are saying, you are already agreeing to take down certain content. We are trying to make that fully automatic, extremely reliable and efficient."

The technology is undergoing final engineering before deployment and should be ready in the next month or two.

A graduate of NYU with a master's degree in journalism, Mitch has more than 20 years' experience in radio news. He got his start as news director at NYU's college station, and moved on to a news director (and part-time DJ) position at commercial radio station WMVY on Martha's Vineyard. But public radio was where Mitch wanted to be, and he eventually moved to Boston, where he worked for six years in a number of different capacities at member station WBUR, including senior producer, editor, and fill-in co-host of the nationally distributed Here and Now. Mitch has been a guest host of the national NPR sports program "Only A Game." He has also worked as an editor and producer for international news coverage with Monitor Radio in Boston.