Angie Drobnic Holan – Nieman Lab
Thu, 27 Apr 2023

Searching for gold: Making sense of academic research about journalism

Do academics know secrets about journalism that working reporters and editors don’t know?

For curious journalists like me, spending time reading academic research about journalism and democracy reveals a mixed picture.

There’s plenty of research to show that journalism is still a critical part of an engaged society. Decades of evidence-based studies show a correlation between news consumption and political engagement: people who read more news tend to vote more regularly and engage more in their own communities.

Newer academic studies tend to look at very specific practices around types of journalism and find insights particular to certain beats or coverage areas — and there’s quite a lot of it. Just a few examples include how journalists use empathy in covering homelessness, whether fact-checking changes false beliefs, and how audiences react to watching coverage of terrorism.

But keeping track of all that academic research across subject areas is no easy task. That’s where professors Mark Coddington and Seth Lewis have stepped in with an email newsletter (hosted on Substack) that aims to showcase the most compelling research published each month. The newsletter is called RQ1, and Nieman Lab republishes it.

Coddington and Lewis are both former journalists who became academics. (For several years, Coddington wrote the “Week in Review” column for Nieman Lab.) They now study their former colleagues amid a changing digital news environment, tackling issues of data journalism, social media, news engagement and news aggregation. (Coddington is at Washington and Lee University, while Lewis is at University of Oregon.)

“We’ve had trouble ourselves keeping up with the constant flow of new research on news and journalism, and we want to help you keep up with it as we try to wade through it as well,” they write in the newsletter.

As editor-in-chief of PolitiFact, I have a high interest in keeping up with academic research on fact-checking, and as a Nieman Fellow I’ve been studying research about the connection between journalism and democracy, so I reached out to Coddington and Lewis with a few questions. This interview has been edited for length and clarity.

Angie Drobnic Holan: When you’re putting together the newsletter each month, is there just a gusher of research to go through? And have you noticed changes in the research over the years?

Mark Coddington: I feel at times overwhelmed by the gusher of research that is out there. Almost every major journal that regularly publishes journalism research sends out email alerts when a new study appears, so I subscribe to all of those. And then there are others that I check regularly as well. Any new research goes into a spreadsheet, and that spreadsheet runs to about 75 to 80 articles a month. And that’s a lot of research — a lot. For the newsletter, we select the ones that we think would be of most interest to journalists or researchers.

Seth Lewis: The study of communication has been around for about 100 years, but the focused study of journalism in this field that we now call journalism studies is really only about two decades old. And in fact, that began with the founding of the journals Journalism and Journalism Studies, which both appeared in 2000. The journal Journalism Practice came out in 2007, and then Digital Journalism was launched in 2013…So there has been a real flourishing of research about news in the last two decades, which, of course, kind of ironically tracks the period in which newspapers have contracted. The news industry has seen its fortunes crumble in the last couple of decades, while the space and attention given to research about journalism have grown dramatically.

Holan: What are the areas currently in journalism research that are really robust and productive?

Coddington: One of those areas is sociology of journalism, especially the practice of journalism during this time of immense change. Since the late 2000s or so, a lot of strong research looks at how journalists do their jobs, and how it has changed in so many different areas. Researchers have studied the values journalists bring to their work, and how the values changed. A lot of these are practice-oriented sociological questions.

Holan: Do you think it’s helpful for working journalists to read this research?

Lewis: When I worked at the Miami Herald, I remember that sometimes I would wander over to different parts of the newsroom, and near the executive editor’s office there was a coffee table with various reading materials, probably for people who were waiting to meet with the editor. And on that coffee table was a copy of Newspaper Research Journal, which is another journal that covers research about news. And I remember, as a journalist, picking this up and flipping through it and thinking, “What is the purpose of this research? None of this seems very relevant to what we do.” It was a flippant response, and now it’s sort of ironic that I do research about news. But there is research about journalism that, depending on how it’s framed and conducted, can feel pretty detached from the actual working realities of journalism. As journalism research has become more established academically, it’s tended toward specialization and some degree of jargon and terminology that’s opaque.

But strong research does exist, and it has a lot of relevance for journalists. And nowadays, given all of the kinds of networks and social media and email alerts that exist, the opportunities for journalists to come into contact with that good research and find value from it are much greater than ever before.

Coddington: I think it’s partly a question of the level of engagement. As far as deep engagement with journalism research, I’m not sure that’s the best investment of time for an incredibly busy journalist. Because it’s hard for me, on top of my job that actually includes this, to deeply engage with and read and fully understand multiple news studies a month — and to actually understand what they’re saying and how they’re engaging with other areas of research. That’s beyond what a journalist should reasonably be expected to do, and I’m not sure it’s the best investment of their time, because it takes a long time to really thoroughly read and understand an academic study.

But I think some familiarity with research in the field is helpful for journalists to just understand and think a little bit more deeply about what they’re doing.

If you can get an introduction to at least some of the ideas of how people have thought about how journalists do their jobs, it can really help you think from a different angle of what is actually going on in your job, and potentially how to do it better.

Holan: When you write the RQ1 newsletter, what audience do you have in mind? Is it just journalists, or nonjournalists as well?

Coddington: When we started, my intended audience was journalists, but it was also busy academics who want to keep up with research but simply don’t have time. I also thought of it as written for first-year graduate students. That is still, in my head, sort of my happy medium, because somebody in their first month of a master’s program is still learning about this stuff.

Lewis: I also imagine that we might be able to reach people who are interested in news and journalism, even if they’re not actually working journalists. There are people who find news fascinating and interesting, or people who just like to be informed about what’s happening in the world of journalism, because they find it an intriguing space. We want to make sure that the really good stuff rises to the top and gets the notice that it deserves.

Social media has changed the game, and academics have used Twitter as a key medium to talk about their work — to get it noticed, not only by fellow researchers, but also by journalists. But we’ve also seen ways in which these social networks are kind of uneven and problematic. Many academics have pulled back on their use of Twitter. And so there’s a sense that email is the ultimate common denominator. An email newsletter is something that everybody can easily tap into.

Holan: I see a lot of research about journalism coming from a lot of different academic fields, from computer scientists or librarians or philosophers. It can be research that crosses a lot of academic borders. Do you see that?

Lewis: I would say that journalism has become more interesting precisely because its fortunes have become more uncertain. It’s the inherent instability in the space that makes it so fascinating to many researchers. Whether they’re coming from sociology, political science, economics, or computer science, each of them can find in this a highly dynamic space where there’s a lot of uncertainty as to what it’s going to look like in five years or 10 years, and what will happen to legacy players compared to emerging upstarts, and what will be the knock-on effects of losing newspapers in communities, and what the loss of news media means for declines in civic participation, and so on. I think there’s a growing interest across fields in looking at the changing dynamics of journalism as a way to examine larger patterns in society.

Coddington: Fundamentally, it can be easy for academics in journalism studies to forget that journalism is actually an object of study rather than an academic field in itself. There is a field of journalism studies, but fundamentally, that’s not an academic discipline, like sociology or anthropology, or philosophy, or something like that. Journalism is an object of study. And I think the more disciplinary lenses through which we can look at it, the better. And yes, most often it’s been looked at through a social scientific lens that is housed within communication as a field. But it’s equally legitimate to study it through an economic lens, or a political science lens, or a historical lens.

Holan: Some journalists are starting to do more research on themselves. I work in fact-checking journalism, and many fact-checking newsrooms have put out their own studies on how they see their field developing and what effects fact-checking produces. It might not be considered scholarly, but it is serious research.

Coddington: You asked earlier whether journalists should know about academic research, and I would say that if somebody is going into fact-checking, do they need to read all the research on fact-checking? No, that would take too long to read. You should just focus on being a better fact-checker. But, should you read Lucas Graves’ book, Deciding What’s True? Yes, you absolutely should read that book, if you are going to go into fact-checking in any form. It will help you think so much better about what you’re doing.

Holan: I keep running into sociologist Michael Schudson’s work every time I work on any project about journalism and democracy. His book The Sociology of News influenced me a lot. What books have shaped you?

Coddington: I think every journalism scholar has a book that they either read as a journalist, if they were a journalist, or early on in graduate school — a book that kicked open the door to a new way of thinking, and that they would probably recommend to every journalist. For me, it’s Gaye Tuchman’s Making News: A Study in the Construction of Reality, from 1978. Her work has influenced, in some form, almost every paper I write; I have to cite her. She’s a sociologist, and the way that she thought about how journalists know what they know, and how they put that all together within the professional environment that they live in, on a day-to-day basis…It just felt like a new way of thinking, one that honestly colors and informs so much of the way we talk about the way journalists do their jobs, whether people have read the book or not.

Lewis: For me, it wasn’t so much a book as it was blogging. In particular, it was Jay Rosen’s PressThink blog. I was working as a journalist, but when I had various breaks and downtime, I found that I was gravitating more and more to PressThink, around 2004 to 2005. He was in a sense kind of doing public scholarship through that blog. He was writing about news, although not in a research-driven way, but he was bringing a critical evaluative lens to it that I found really fascinating. It was prompting me to ask questions about the work I was doing, and about how those questions could be explored more fully. When Jay Rosen talked about people formerly known as the audience, as he famously did in 2006, that concept really resonated with me, in a way that ended up informing some of my early research into participatory journalism.

But I also remember when I decided to go back and do a Ph.D., I asked someone what I should read in preparation, and they recommended Herbert Gans’s book, Deciding What’s News, from 1979. That and Tuchman’s book stand as these two pillars of journalism research from the 20th century that still have such a shaping influence on the way we study the sociology of news today.

I do think there is real value in finding those important books that bring together the research on a given topic, either as one of the first key things written about the topic, or because it summarizes a lot of existing research. As an example, my friend and collaborator, Sue Robinson, has a book coming out this year called How Journalists Engage: A Theory of Trust Building, Identities, and Care. It will be a book that tells the story of engagement in journalism, which has been one of the really robust areas of research over the past five to 10 years. And so she’ll both synthesize what has been done and bring her own new original research to it. That’s the kind of book that a journalist would benefit from reading at least a couple of chapters of. They would get a lot out of that, as opposed to trying to skim and summarize 40 or 50 articles.

Holan: Final question: Why do you call the newsletter RQ1?

Coddington: When writing research papers, RQ1 is the shorthand for the first research question. When you have multiple research questions, you shorten them to RQ1, RQ2, RQ3, and hypotheses are H1, H2, H3. So it is a bit of academic shorthand that almost any academic in our field would get. And for anybody else, at least it wouldn’t turn them off.

Lewis: I think it’s appropriate we call it RQ1 and not H1, because in the field of journalism research, we tend to ask research questions rather than pose hypotheses. Hypotheses work well for studies of things that are well-established, where things feel stable and you’re looking for incremental forms of change. But the study of journalism tends to involve more exploratory, inductive forms of qualitative analysis. That generally begins with research questions as opposed to hypotheses. And that really speaks to the nature of this work right now, that the future of journalism is very much in flux. It’s very much this open-ended question. Our purpose is to point to the research questions that are being asked and answered, and to gesture to more questions yet to be explored.

Angie Drobnic Holan is editor-in-chief of PolitiFact and a 2023 Nieman Fellow.

Inside the Star Chamber: How PolitiFact tries to find truth in a world of make-believe
Tue, 21 Aug 2012

PolitiFact editor Bill Adair in the Star Chamber

WASHINGTON — PolitiFact’s “Star Chamber” is like Air Force One: It’s not an actual room, just the name of wherever Bill Adair happens to be sitting when it’s time to break out the Truth-O-Meter and pass judgment on the words of politicians. Today it’s his office.

Three judges preside, usually the same three: Adair, Washington bureau chief of the Tampa Bay (née St. Petersburg) Times; Angie Drobnic Holan, his deputy; and Amy Hollyfield, his boss.

For this ruling — one of four I witnessed over two days last month — Holan and Hollyfield are on the phone. Staff writer Louis Jacobson is sitting in. Jacobson was assigned to check this claim from Rep. Jeff Duncan (R-S.C.): “83 percent of doctors have considered leaving the profession because of ObamaCare.”

After conducting five interviews, poring over survey data, and filing 1,100 words, Jacobson is recommending False. Hollyfield wants to at least consider something stronger:

Hollyfield: Is there any movement for a Pants on Fire?

Adair: I thought about it, but I didn’t feel like it was far enough off to be a Pants on Fire. What did you think, Lou?

Jacobson: I would agree. Basically it was a case I think of his staff blindly taking basically what was in Drudge and Daily Caller. Should they have been more diligent about checking the fine print of the poll? Yes, they should have. Were they being really reckless in what they did? No. It was pretty garden-variety sloppiness, I would say. I don’t think it rises to the level of flagrancy that I would think of a Pants on Fire.

Adair: It’s just not quite ridiculous. It’s definitely false, but I don’t think it’s ridiculous.

This scene has played out 6,000 times before, but not in public view. Like the original Court of Star Chamber, PolitiFact’s Truth-O-Meter rulings have always been secret. The Star Chamber was a symbol of Tudor power, a 15th-century invention of Henry VII to try people he didn’t much care for. While the history is fuzzy, Wikipedia’s synopsis fits the chamber’s present-day reputation: “Court sessions were held in secret, with no indictments, no right of appeal, no juries, and no witnesses.”

PolitiFact turns five on Wednesday. Adair founded the site to cover the 2008 election, but the inspiration came one cycle earlier, when a turncoat Democrat named Zell Miller told tens of thousands of Republicans that Sen. John Kerry had voted to weaken the U.S. military. “Miller was really distorting his record,” Adair says, “and yet I didn’t do anything about it.”

The team won a Pulitzer Prize for the election coverage. The site’s basic idea — rate the veracity of political statements on a six-point scale — has modernized and mainstreamed the old art of fact-checking. The PolitiFact national team just hired its fourth full-time fact checker, and 36 journalists work for PolitiFact’s 11 licensed state sites. This week PolitiFact launches its second, free mobile app for iPhone and Android, “Settle It!,” which provides a clever keyword-based interface to help resolve arguments at the dinner table. (PolitiFact’s original mobile app, at $1.99, has sold more than 24,000 copies.) The site attracts about 100,000 pageviews per day, Adair told me, and that number will certainly rise as the election draws closer and politicians get weirder.

PolitiFact's "I Brake for Pants on Fire" bumper sticker

If your job is to call people liars, and you’re on a roll doing it, you can expect a steady barrage of criticism. PolitiFact has been under fire practically as long as it has existed, but things intensified earlier this year, when Rachel Maddow criticized PolitiFact for, in her view, botching a series of rulings.

In public, Adair responded coolly: “We don’t expect our readers to agree with every ruling we make,” is his refrain. In private, the criticism struck a nerve.

“I think the criticism in January and February, added to some of the criticism we’ve gotten from conservatives over the months, persuaded us that we needed to make some improvements in our process,” Adair told me. “We directed our reporters to slow down and not try to rush fact-checks. We directed all of our reporters and editors to make sure that [they’re] clear in the ruling statement.”

Adair made a series of small changes to tighten up the journalism. And for the first time he invited a reporter — me — to watch the truth sausage get made.

The paradox of fact-checking

To understand fact-checking is to accept a paradox: “Words matter,” as PolitiFact’s core principles go, and “context matters.”

Consider this incident recently all over the news: Harry Reid says some guy told him Mitt Romney didn’t pay taxes for 10 years. It’s probably true. Some guy probably did say that to Harry Reid. But we can’t know for sure. To evaluate that statement is almost impossible without cooperative witnesses to the conversation.

Now, is Reid’s implication true? We can’t know that, either, not until someone produces evidence. So how does a fact checker handle this claim?

The Truth-O-Meter gave Reid its harshest ruling, “Pants on Fire,” a PolitiFact trademark reserved for claims it considers not only false but absurd. In the Star Chamber, judges ruled that Reid had no evidence to back up his claim.

“It is now possible to get called a liar by PolitiFact for saying something true,” complained James Poniewozik and others. But True certainly would not have sufficed here, and neither would Half True.

Maybe the Truth-O-Meter needs an “Unsubstantiated” rating. They considered it, but decided against it, Adair told me, “because of fears that we’d end up rating many, many things ‘unsubstantiated.'”

Whereas truth is complicated, elastic, subjective… the Truth-O-Meter is simple, fixed, unambiguous. In a way, this overly simplistic device embodies the problem PolitiFact is trying to solve.

“The fundamental irony is that the same technological changes and changes in the media system that make organizations like PolitiFact and FactCheck.org possible also make their work less effective, in that we do have this highly fragmented media environment,” said Lucas Graves, who recently defended his dissertation on fact-checking at Columbia University.

So the Truth-O-Meter is the ultimate webby invention: bite-sized, viral-ready. Whether that Pants on Fire for Reid was warranted or not, 4,300 shares on Facebook is pretty good. PolitiFact is not the only fact checker in town, but the Truth-O-Meter is everywhere; the same simplicity in its rating system that opens it to so much criticism also helps it spread, tweet by tweet.

“PolitiFact exists to be cited. It exists to be quoted,” Graves said. “Every Truth-O-Meter piece packages really easily and neatly into a five-minute broadcast segment for CNN or for MSNBC.” (In fact, Adair told me, he has appeared on CNN alone at least 300 times.)

PolitiFact political cartoon

Stories get “chambered,” in PolitiFact parlance, 10-15 times a week. Adair begins by reading the ruling statement — that is, the precise phrase or claim being evaluated — aloud. Then — and this is new, post-criticism — Adair asks four questions, highlighted in bold. (“Sounds like something from Passover, but the four questions really helps get us focused,” he says.) Let’s return to the ruling we started with, from the beginning:

Adair: We are ready to rule on the Jeff Duncan item. So the ruling statement is: “83 percent of doctors have considered leaving the profession because of ObamaCare.” Lou is recommending a False. Let’s go through the questions.

Is the claim literally true?

Adair: No.

Jacobson: No, using Obamacare.

Is the claim open to interpretation? Is there another way to read the claim?

Jacobson: I don’t think so.

Adair: I don’t think so.

Does the speaker prove the claim to be true?

Adair: No. Did you get in touch with Duncan?

Jacobson: Yes, and his office declined to speak. Politely declined.

Did we check to see how we handled similar claims in the past?

Adair: Yes, we looked at the — and this didn’t actually get included in the item…

Jacobson: The Glenn Beck item.

Adair: Was it Glenn Beck?

Jacobson: Two years ago.

Adair: I thought it was the editorial in the Financial Times or whatever. What was that?

Jacobson: Well, Beck was quoted citing a poll by Investor’s Business Daily.

Adair: Investor’s Business Daily, right.

Jacobson: We gave that a False too, I think. But similar issues, basically.

Adair: Okay. So we have checked how we handled similar things in the past. Lou is recommending a False. How do we feel about False?

Holan: I feel good.

Hollyfield: Yup.

Adair: Good. All right, not a lot of discussion on this one!

Then, after briefly considering Pants on Fire, they agree on False.

Another change in the last year has created a lot of grief for PolitiFact: Fact checkers now lean more heavily on context when politicians appear to take credit or assign blame. Which brings us to Rachel Maddow’s complaint. In his 2012 State of the Union address, President Obama said:

In the last 22 months, businesses have created more than 3 million jobs. Last year, they created the most jobs since 2005.

PolitiFact rated that Half True, saying an executive can only take so much credit for job creation. But did he take credit? Would the claim have been 100 percent true if not for the speaker? Under criticism, PolitiFact revised the ruling up to Mostly True. Maddow was not satisfied:

You are a mess! You are fired! You are undermining the definition of the word “fact” in the English language by pretending to it in your name. The English language wants its word back. You are an embarrassment. You sully the reputation of anyone who cites you as an authority on “factishness,” let alone fact. You are fired.

Maddow (in addition to many, many liberals) was already mad about PolitiFact’s pick for 2011 Lie of the Year, that Republicans had voted, through the Ryan budget, to end Medicare. Of course, her criticism then was that PolitiFact was too literal.

“Forget about right or wrong,” Graves said. “There’s no right answer if you define ‘right’ as coming up with a ruling that everybody will agree with, especially when it comes to the question of interpreting things literally or taking an account out of context.” Damned if they do, damned if they don’t.

Graves, who identifies himself as falling “pretty left” on the spectrum, has observed PolitiFact twice: for a week last year and again for a three-day training session with one of PolitiFact’s state sites.

“One of the things that comes through clearest when you spend time with fact checkers…is that they have a very healthy sense that these are imperfect judgments that they’re making, but at the same time they’re going to strive to do them as fairly as possible. It’s a human endeavor. And like all human endeavors, it’s not infallible.”

A real live Truth-O-Meter

The truth is that fact-checking, and fact checkers, are kinda boring. What I witnessed was fair and fastidious; methodical, not mercurial. (The other three rulings I watched were just as uneventful.) I could uncover no evidence of PolitiFact’s evil scheme to slander either Republicans or Democrats. Adair says he’s a registered independent. He won’t tell me which candidate he voted for in the last election, and he protects his staff members’ privacy in the voting booth. In Virginia, where he lives, Adair abstains from open primary elections. Revealing his own politics would “suggest a bias that I don’t think is there,” Adair says.

“In a hyper-partisan world, that information would get distorted, and it would obscure the reality, which is that I think political journalists do a good job of leaving their personal beliefs at home and doing impartial journalism,” he says.

Does all of this effort make a dent in the net truth of the universe? Is moving from he-said-she-said to some form of judgment, simplified as it may be, “working”? Last month, David Brooks wrote:

A few years ago, newspapers and nonprofits set up fact-checking squads, rating campaign statements with Pinocchios and such. The hope was that if nonpartisan outfits exposed campaign deception, the campaigns would be too ashamed to lie so much.

This hope was naive. As John Dickerson of Slate has said, the campaigns want the Pinocchios. They want to show how tough they are.

“I don’t think we were naive. I’ve always said anyone who imagines we can change the behavior of candidates is bound to be disappointed,” said Brooks Jackson, director of FactCheck.org. He was a pioneer of modern political fact-checking for CNN in the 1990s. “I suspect it is a fact that the junior woodchucks on the campaign staffs have now perversely come to value our criticism as some sort of merit badge, as though lying is a virtue, and a recognized lie is a bigger virtue.”

Rarely is there a high political cost to lying. All the explainers in the world couldn’t completely blunt the impact of the Swift Boat Veterans for Truth’s campaign to denigrate John Kerry’s military service. More recently, in July, the Democratic Congressional Campaign Committee claimed Chinese prostitution money helped finance the campaign of a Republican congressman in Ohio. PolitiFact rated it Pants on Fire.

That didn’t stop the DCCC from rolling out identical claims in Wisconsin and Tennessee. The DCCC eventually apologized. But which made more of an impression on voters, the original lie or the eventual apology from an amorphous nationwide organization?

Brendan Nyhan, a political science professor at Dartmouth College, has done a lot of research on the effects of fact-checking on the public. As he wrote for CJR:

It is true that corrective information may not change readers’ minds. My research with Georgia State’s Jason Reifler finds that corrections frequently fail to reduce misperceptions among the most vulnerable ideological group and can even make them worse (PDF). Other research has reached similarly discouraging conclusions — at this point, we know much more about what journalists should not do than how they can respond effectively to false statements (PDF).

If the objective of fact-checking is to get politicians to stop lying, then no, fact-checking is not working. “My goal is not to get politicians to stop lying,” is another of Adair’s refrains. “Our goal is…to give people the information they need to make decisions.”

Unlike The Washington Post’s Glenn Kessler, who awards Pinocchios for lies, or PolitiFact, which rates claims on a Truth-O-Meter, Jackson’s FactCheck.org doesn’t reduce its findings to a simple measurement. “I think you are telling people we can tell the difference between something that is 45 percent true and 57 percent true — and some negative number,” he said, referring to Pants on Fire. “There isn’t any scientific objective way to measure the degree of mendacity to any particular statement.”

“I think it’s fascinating that they chose to call it a Truth-O-Meter instead of a Truth Meter,” Graves said. Truth-O-Meter sounds like a kitchen gadget, or a toy. “That ‘O’ is sort of acknowledging that this is a human endeavor. There’s no such thing as a machine for perfectly and accurately making judgments of truth.”

Update: This story was edited after publication to clarify details of the “Star Chamber” process. Political cartoon by Chip Bok used with permission.
