Who’s Your Hero?

Even the most seasoned news reporters can be biased. So a group of Carleton computer science majors is developing a browser extension to detect bias in online news articles as their comps project.

Photo: Greg Mably

How it works: “The idea is to detect implicit bias by assigning ‘roles’ to figures in news articles,” says Quinn Mayville ’19 (Newton, Mass.). “Even articles that are presenting facts use narrative framing. We’re trying to recognize whether there’s a ‘hero,’ ‘villain,’ or ‘victim’ in the article. For example, a Fox News article about Donald Trump might frame him as a hero whereas a similar story in the New York Times might portray him as a villain. We’re using natural language processing techniques to compare which words are used around people and entities in the article to classify them into those three categories.”
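The core idea of classifying an entity by the words that appear near it can be sketched in a few lines of Python. This is only an illustration of the word-context approach Mayville describes, not the team's actual algorithm; the role lexicons and the `classify_entity` function are hypothetical, and a real system would use richer NLP features than simple word matching.

```python
import re
from collections import Counter

# Hypothetical seed lexicons for illustration only -- the students' project
# derives role signals with more sophisticated NLP techniques.
ROLE_LEXICONS = {
    "hero": {"praised", "saved", "defended", "championed", "rescued"},
    "villain": {"attacked", "blamed", "accused", "threatened", "undermined"},
    "victim": {"suffered", "harmed", "targeted", "displaced", "injured"},
}

def classify_entity(text, entity, window=5):
    """Guess an entity's narrative role from words near its mentions.

    Tokenizes the article text, finds each mention of the entity, and
    counts lexicon hits within `window` tokens on either side. Returns
    the highest-scoring role, or None if no lexicon word appears nearby.
    """
    tokens = re.findall(r"[A-Za-z']+", text.lower())
    target = entity.lower()
    scores = Counter()
    for i, tok in enumerate(tokens):
        if tok == target:
            context = tokens[max(0, i - window): i + window + 1]
            for role, lexicon in ROLE_LEXICONS.items():
                scores[role] += sum(1 for w in context if w in lexicon)
    if sum(scores.values()) == 0:
        return None  # no role signal near this entity
    return scores.most_common(1)[0][0]
```

For example, `classify_entity("Critics accused Smith after regulators blamed Smith for the losses.", "Smith")` returns `"villain"`, because the words surrounding the two mentions of "Smith" match the villain lexicon. The same sentence rewritten around praise and rescue would push the score toward "hero."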

Why it matters: “We have a duty as media consumers to try to ingest as many reasonable perspectives as possible in order to adequately form our own opinions and beliefs,” says Tianna Avery ’19 (Hampton, Va.). “Our project could make people more conscious of how the articles they’re reading are biased, and it implicitly invites them to seek out different views of the stories or entities being discussed.”

Biggest challenge: “We’re implementing a paper published by computer scientists at Northwestern,” says Mayville. “Following someone else’s paper has been a bit bumpy. We’re reworking their algorithm a bit to create better outputs.”

Do unbiased news stories exist? “I’ve never seen an exceptional example of one,” says Avery. “It’s possible, but unusual.”
