
Media Bias Is Real, Finds UCLA Political Scientist

Discussion in 'Political Zone' started by Doomsday101, Nov 2, 2007.

  1. Doomsday101

    Doomsday101 Well-Known Member

    78,530 Messages
    3,832 Likes Received
    While the editorial page of The Wall Street Journal is conservative, the newspaper's news pages are liberal, even more liberal than The New York Times. The Drudge Report may have a right-wing reputation, but it leans left. Coverage by public television and radio is conservative compared to the rest of the mainstream media. Meanwhile, almost all major media outlets tilt to the left.

    These are just a few of the surprising findings from a UCLA-led study, which is believed to be the first successful attempt at objectively quantifying bias in a range of media outlets and ranking them accordingly.

    "I suspected that many media outlets would tilt to the left because surveys have shown that reporters tend to vote more Democrat than Republican," said Tim Groseclose, a UCLA political scientist and the study's lead author. "But I was surprised at just how pronounced the distinctions are."

    "Overall, the major media outlets are quite moderate compared to members of Congress, but even so, there is a quantifiable and significant bias in that nearly all of them lean to the left," said co‑author Jeffrey Milyo, University of Missouri economist and public policy scholar.

    The results appear in the latest issue of the Quarterly Journal of Economics, which will become available in mid-December.

    Groseclose and Milyo based their research on a standard gauge of a lawmaker's support for liberal causes. Americans for Democratic Action (ADA) tracks the percentage of times that each lawmaker votes on the liberal side of an issue. Based on these votes, the ADA assigns a numerical score to each lawmaker, where "100" is the most liberal and "0" is the most conservative. After adjustments to compensate for disproportionate representation that the Senate gives to low‑population states and the lack of representation for the District of Columbia, the average ADA score in Congress (50.1) was assumed to represent the political position of the average U.S. voter.

    Groseclose and Milyo then directed 21 research assistants — most of them college students — to scour U.S. media coverage of the past 10 years. They tallied the number of times each media outlet referred to think tanks and policy groups, such as the left-leaning NAACP or the right-leaning Heritage Foundation.

    Next, they did the same exercise with speeches of U.S. lawmakers. If a media outlet displayed a citation pattern similar to that of a lawmaker, then Groseclose and Milyo's method assigned both a similar ADA score.
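
    To make the mechanics concrete, here is a minimal sketch in Python with made-up citation counts. It is not the paper's actual estimator (Groseclose and Milyo fit a maximum-likelihood model); it simply matches an outlet's mix of think-tank citations to the lawmaker whose mix looks most similar and borrows that lawmaker's ADA score.

    # Toy illustration of the citation-matching idea (hypothetical counts).
    def normalize(counts):
        total = sum(counts.values())
        return {group: n / total for group, n in counts.items()}

    def distance(mix_a, mix_b):
        groups = set(mix_a) | set(mix_b)
        return sum(abs(mix_a.get(g, 0) - mix_b.get(g, 0)) for g in groups)

    # Hypothetical citation counts for two lawmakers and one media outlet.
    lawmakers = [
        {"ada": 85, "citations": {"Brookings": 30, "NAACP": 20, "Heritage": 5}},
        {"ada": 20, "citations": {"Brookings": 10, "NAACP": 3, "Heritage": 40}},
    ]
    outlet_citations = {"Brookings": 25, "NAACP": 15, "Heritage": 8}

    outlet_mix = normalize(outlet_citations)
    closest = min(lawmakers, key=lambda lw: distance(outlet_mix, normalize(lw["citations"])))
    print("Imputed ADA-style score for the outlet:", closest["ada"])  # 85 in this toy example

    In the actual study the comparison runs over dozens of groups and hundreds of lawmakers, but the intuition is the same: an outlet inherits the score of the lawmakers whose citation habits it most resembles.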

    "A media person would have never done this study," said Groseclose, a UCLA political science professor, whose research and teaching focuses on the U.S. Congress. "It takes a Congress scholar even to think of using ADA scores as a measure. And I don't think many media scholars would have considered comparing news stories to congressional speeches."

    Of the 20 major media outlets studied, 18 scored left of center, with CBS' "Evening News," The New York Times and the Los Angeles Times ranking second, third and fourth most liberal behind the news pages of The Wall Street Journal.

    Only Fox News' "Special Report With Brit Hume" and The Washington Times scored right of the average U.S. voter.

    The most centrist outlet proved to be the "NewsHour With Jim Lehrer." CNN's "NewsNight With Aaron Brown" and ABC's "Good Morning America" were a close second and third.

    "Our estimates for these outlets, we feel, give particular credibility to our efforts, as three of the four moderators for the 2004 presidential and vice-presidential debates came from these three news outlets — Jim Lehrer, Charlie Gibson and Gwen Ifill," Groseclose said. "If these newscasters weren't centrist, staffers for one of the campaign teams would have objected and insisted on other moderators."

    The fourth most centrist outlet was "Special Report With Brit Hume" on Fox News, which often is cited by liberals as an egregious example of a right-wing outlet. While this news program proved to be right of center, the study found ABC's "World News Tonight" and NBC's "Nightly News" to be left of center. All three outlets were approximately equidistant from the center, the report found.

    "If viewers spent an equal amount of time watching Fox's 'Special Report' as ABC's 'World News' and NBC's 'Nightly News,' then they would receive a nearly perfectly balanced version of the news," said Milyo, an associate professor of economics and public affairs at the University of Missouri at Columbia.

    Five news outlets — "NewsHour With Jim Lehrer," ABC's "Good Morning America," CNN's "NewsNight With Aaron Brown," Fox News' "Special Report With Brit Hume" and the Drudge Report — were in a statistical dead heat in the race for the most centrist news outlet. Of the print media, USA Today was the most centrist.

    An additional feature of the study shows how each outlet compares in political orientation with actual lawmakers. The news pages of The Wall Street Journal scored a little to the left of the average American Democrat, as determined by the average ADA score of all Democrats in Congress (85 versus 84). With scores in the mid-70s, CBS' "Evening News" and The New York Times looked similar to Sen. Joe Lieberman, D-Conn., who has an ADA score of 74.

    Most of the outlets were less liberal than Lieberman but more liberal than former Sen. John Breaux, D-La. Those media outlets included the Drudge Report, ABC's "World News Tonight," NBC's "Nightly News," USA Today, NBC's "Today Show," Time magazine, U.S. News & World Report, Newsweek, NPR's "Morning Edition," CBS' "Early Show" and The Washington Post.

    Since Groseclose and Milyo were more concerned with bias in news reporting than opinion pieces, which are designed to stake a political position, they omitted editorials and Op‑Eds from their tallies. This is one reason their study finds The Wall Street Journal more liberal than conventional wisdom asserts.

    Another finding that contradicted conventional wisdom was that the Drudge Report was slightly left of center.

    "One thing people should keep in mind is that our data for the Drudge Report was based almost entirely on the articles that the Drudge Report lists on other Web sites," said Groseclose. "Very little was based on the stories that Matt Drudge himself wrote. The fact that the Drudge Report appears left of center is merely a reflection of the overall bias of the media."

    Yet another finding that contradicted conventional wisdom relates to National Public Radio, often cited by conservatives as an egregious example of a liberal news outlet. But according to the UCLA-University of Missouri study, it ranked eighth most liberal of the 20 that the study examined.

    "By our estimate, NPR hardly differs from the average mainstream news outlet," Groseclose said. "Its score is approximately equal to those of Time, Newsweek and U.S. News & World Report and its score is slightly more conservative than The Washington Post's. If anything, government‑funded outlets in our sample have a slightly lower average ADA score (61), than the private outlets in our sample (62.8)."

    The researchers took numerous steps to safeguard against bias — or the appearance of same — in the work, which took close to three years to complete. They went to great lengths to ensure that as many research assistants supported Democratic candidate Al Gore in the 2000 election as supported President George Bush. They also sought no outside funding, a rarity in scholarly research.

    "No matter the results, we feared our findings would've been suspect if we'd received support from any group that could be perceived as right- or left-leaning, so we consciously decided to fund this project only with our own salaries and research funds that our own universities provided," Groseclose said.

    The results break new ground.

    "Past researchers have been able to say whether an outlet is conservative or liberal, but no one has ever compared media outlets to lawmakers," Groseclose said. "Our work gives a precise characterization of the bias and relates it to known commodity — politicians."

    -UCLA-

  2. 03EBZ06

    03EBZ06 Need2Speed

    7,979 Messages
    411 Likes Received
    It's interesting that UCLA is reporting this bias, since UCLA is a very liberal college known for professors spewing liberal agendas, providing one-sided views and ridiculing students who don't agree with those views.

    Since my daughter is attending UCLA and her political views have shifted far to the left, I'd like to see UCLA conduct a survey of the political views of its staff.

    But yeah, I agree with the article, it's very transparent to me.
  3. jterrell

    jterrell Penguinite

    19,903 Messages
    1,501 Likes Received
    umm, 1. I didn't see a link.

    2. This is VERY VERY DATED!!!
  4. jterrell

    jterrell Penguinite

    19,903 Messages
    1,501 Likes Received
    http://www.brendan-nyhan.com/blog/2005/12/the_problems_wi.html


    December 22, 2005
    The problems with the Groseclose/Milyo study of media bias

    UCLA political scientist Tim Groseclose and Missouri economist Jeff Milyo have published a study (PDF) alleging liberal media bias that is receiving a lot of attention, including a link on Drudge. But you should be wary of trusting its conclusions for reasons that I tried to explain to Groseclose after he presented the paper at Duke in fall 2003.

    First, here's a summary of the study's methodology from the UCLA press release:

    Groseclose and Milyo based their research on a standard gauge of a lawmaker's support for liberal causes. Americans for Democratic Action (ADA) tracks the percentage of times that each lawmaker votes on the liberal side of an issue. Based on these votes, the ADA assigns a numerical score to each lawmaker, where "100" is the most liberal and "0" is the most conservative. After adjustments to compensate for disproportionate representation that the Senate gives to low-population states and the lack of representation for the District of Columbia, the average ADA score in Congress (50.1) was assumed to represent the political position of the average U.S. voter.

    Groseclose and Milyo then directed 21 research assistants -- most of them college students -- to scour U.S. media coverage of the past 10 years. They tallied the number of times each media outlet referred to think tanks and policy groups, such as the left-leaning NAACP or the right-leaning Heritage Foundation.

    Next, they did the same exercise with speeches of U.S. lawmakers. If a media outlet displayed a citation pattern similar to that of a lawmaker, then Groseclose and Milyo's method assigned both a similar ADA score.

    In short, the underlying assumption is that, if the press is unbiased, then media outlets will cite think tanks in news reporting in a fashion that is "balanced" with respect to the scores assigned to the groups based on Congressional citations. Any deviation from the mean ADA score of Congress is defined as "bias." But is that a fair assumption?

    In particular, the paper's methodology doesn't allow for two important potential differences between the processes generating news citations and floor speech citations:

    (1) Technocratic centrist to liberal organizations like Brookings and the Center on Budget and Policy Priorities tend to have more credentialed experts with peer-reviewed publications than their conservative counterparts. This may result in a greater number of citations by the press, which seeks out expert perspectives on the news, but not more citations by members of Congress, who generally seek out views that reinforce their own.

    To illustrate, assume that there are two kinds of political stories. In the first, the press interviews think tank experts about policy debates in a "he said, she said" framework. We'll stipulate that these stories have an identical distribution of citations to that of Congress. But let's also assume the press is expected to consult technical experts or scholarship about trends or recent developments they are reporting on according to the norms of journalism, and that under such norms these experts, when they are cited, are not always "balanced" by an opposing expert. As a result, "expert" think tank citations are less frequently set off with a balancing quote or argument from the other side.

    It follows from these premises that if there are more generally recognized technical experts on the center-to-left side of the spectrum, then a study using the Groseclose/Milyo methodology would place the media on the Democratic side of the Congressional mean even if members of the press randomly chose technical experts to cite.

    (2) The Groseclose/Milyo methodology doesn't allow for differential rates of productivity in producing work of interest to the media or Congress between organizations. To the extent that a think tank is better at marketing itself to the press than Congress (or vice versa), it could skew the results. For instance, the Heritage Foundation is extremely close to conservative members of Congress and has an elaborate operation designed to put material into their hands. But the fact that these members end up citing Heritage more than the press does is not ipso facto proof that the media is liberal.

    In fact, there are a number of stories you can tell about why a media/Congress discrepancy in think tank citation would not necessarily imply ideological bias on the part of members of the elite media (including those listed above) and if any of them are true, the argument as stated does not hold.
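
    To see how point (1) alone could produce an apparent tilt, here is a rough simulation with invented numbers (the 70/30 expert-pool split and the citation counts are assumptions, not figures from the study or the blog post): reporters cite think tanks in the same 50/50 mix as Congress for "he said, she said" stories, and additionally quote experts drawn at random from a pool in which most recognized experts sit at center-to-left groups.

    import random
    random.seed(0)

    # Hypothetical setup following point (1) above, not the study's data.
    N_BALANCED = 1000              # citations mirroring the congressional mix
    N_EXPERT = 400                 # additional "expert" citations by the press
    EXPERT_POOL_LEFT_SHARE = 0.7   # assumed share of recognized experts at left-leaning groups
    congress_left_share = 0.5      # Congress cites left- and right-leaning groups equally

    press_citations = []
    for _ in range(N_BALANCED):
        press_citations.append("left" if random.random() < congress_left_share else "right")
    for _ in range(N_EXPERT):
        press_citations.append("left" if random.random() < EXPERT_POOL_LEFT_SHARE else "right")

    press_left_share = press_citations.count("left") / len(press_citations)
    print(f"Congress cites left-leaning groups {congress_left_share:.0%} of the time")
    print(f"Press cites left-leaning groups {press_left_share:.0%} of the time")

    Under these assumptions the press's citation mix drifts left of the congressional mean even though no reporter in the simulation prefers liberal sources, which is exactly the gap between "different citation mix" and "ideological bias" that the critique is pointing at.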

    Here is Groseclose/Milyo's response to the above criticisms:

    More problematic is a concern that congressional citations and media citations do not follow the same data generating process. For instance, suppose that a factor besides ideology affects the probability that a legislator or reporter will cite a think tank, and suppose that this factor affects reporters and legislators differently. Indeed, John Lott and Kevin Hassett have invoked a form of this claim to argue that our results are biased toward making the media appear more conservative than they really are. They note:

    For example, Lott [2003, Chapter 2] shows that the New York Times' stories on gun regulations consistently interview academics who favor gun control, but uses gun dealers or the National Rifle Association to provide the other side ... In this case, this bias makes [Groseclose and Milyo's measure of] the New York Times look more conservative than is likely accurate. [2004, p. 8]"

    However, it is possible, and perhaps likely, that members of Congress practice the same tendency that Lott and Hassett have identified with reporters—that is, to cite academics when they make an anti-gun argument and to cite, say, the NRA when they make a pro-gun argument. If so, then our method will have no bias. On the other hand, if members of Congress do not practice the same tendency as journalists, then this can cause a bias to our method. But even here, it is not clear in which direction it will occur. For instance, it is possible that members of Congress have a greater (lesser) tendency than journalists to cite such academics. If so, then this will cause our method to make media outlets appear more liberal (conservative) than they really are.

    In fact, the criticism we have heard most frequently is a form of this concern, but it is usually stated in a way that suggests the bias is in the opposite direction. Here is a typical variant: “It is possible that (i) journalists care about the ‘quality’ of a think tank more than legislators do (e.g. suppose journalists prefer to cite a think tank with a reputation for serious scholarship instead of another group that is known more for its activism); and (ii) the liberal think tanks in the sample tend to be of higher quality than the conservative think tanks.” If statements (i) and (ii) are true, then our method will indeed make media outlets appear more liberal than they really are. That is, the media will cite liberal think tanks more, not because they prefer to cite liberal think tanks, but because they prefer to cite high-quality think tanks. On the other hand, if one statement is true and the other is false, then our method will make media outlets appear more conservative than they really are. (E.g. suppose journalists care about quality more than legislators, but suppose that the conservative groups in our sample tend to be of higher quality than the liberal groups. Then the media will tend to cite the conservative groups disproportionately, but not because the media are conservative, rather because they have a taste for quality.) Finally, if neither statement is true, then our method will make media outlets appear more liberal than they really are. Note that there are four possibilities by which statements (i) and (ii) can be true or false. Two lead to a liberal bias and two lead to a conservative bias.

    This criticism, in fact, is similar to an omitted-variable bias that can plague any regression. Like the regression case, however, if the omitted variable (e.g., the quality of the think tank) is not correlated with the independent variable of interest (e.g., the ideology of the think tank), then this will not cause a bias. In the Appendix we examine this criticism further by introducing three variables that measure the extent to which a think tank’s main goals are scholarly ones, as opposed to activist ones. That is, these variables are possible measures of the "quality" of a think tank. When we include these measures as controls in our likelihood function, our estimated ADA ratings do not change significantly. E.g., when we include the measures, the average score of the 20 news outlets that we examine shifts less than three points. Further, we cannot reject the hypothesis that the new estimates are identical to the estimates that we obtain when we do not include the controls.
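
    The omitted-variable point can be sanity-checked with a toy regression (a sketch using simulated data and plain numpy, not the paper's likelihood model): when the omitted "quality" variable is uncorrelated with ideology, dropping it barely moves the estimated ideology effect; when the two are correlated, the estimate is distorted.

    import numpy as np
    rng = np.random.default_rng(0)

    def ideology_slope(correlated, n=5000):
        """OLS slope on ideology when 'quality' is left out of the regression."""
        ideology = rng.normal(size=n)
        noise = rng.normal(size=n)
        quality = 0.8 * ideology + noise if correlated else noise
        citations = 1.0 * ideology + 2.0 * quality + rng.normal(size=n)
        X = np.column_stack([np.ones(n), ideology])   # quality deliberately omitted
        beta, *_ = np.linalg.lstsq(X, citations, rcond=None)
        return beta[1]

    print("omitted variable uncorrelated with ideology:", round(ideology_slope(False), 2))  # ~1.0 (true value)
    print("omitted variable correlated with ideology:  ", round(ideology_slope(True), 2))   # ~2.6 (biased)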

    Here is the portion of the appendix they refer to:

    Columns 5-9 of Table A2 address the concern that our main analysis does not control for the “quality” of a think tank or policy group. To account for this possibility, we constructed three variables that indicate whether a think tank or policy group is more likely to produce quality scholarship. The first variable, closed membership, is coded as a 0 if the web site of the group asks visitors to join the group. For instance, more activist groups--such as the NAACP, NRA, and ACLU--have links on their web site that give instructions for a visitor to join the group; while the more scholarly groups—such as the Brookings Institution, the RAND Corporation, the Urban Institute, and the Hoover Institution—do not. Another variable, staff called fellows, is coded as 1 if any staff members on the group’s website are given one of the following titles: fellow (including research fellow or senior fellow), researcher, economist, or analyst.

    Both variables seem to capture the conventional wisdom about which think tanks are known for quality scholarship. For instance, of the top-25 most-cited groups in Table I, the following had both closed membership and staff called fellows: Brookings, Center for Strategic and International Studies, Council on Foreign Relations, AEI, RAND, Carnegie Endowment for Intl. Peace, Cato, Institute for International Economics, Urban Institute, Family Research Council, and Center on Budget and Policy Priorities. Meanwhile, the following groups, which most would agree are more commonly known for activism than high-quality scholarship, had neither closed membership nor staff called fellows: ACLU, NAACP, Sierra Club, NRA, AARP, Common Cause, Christian Coalition, NOW, and Federation of American Scientists.

    The third variable that we constructed is off K street. It is coded as a 1 if and only if the headquarters of the think tank or policy group is not located on Washington D.C.’s K Street, the famous street for lobbying firms.
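
    In practice those controls are just three 0/1 indicators per group. A sketch of the coding, using the rules quoted above, might look like the following; the membership and fellows values for the four example groups follow the appendix's own lists, while the K Street values are filled in for illustration only.

    # Sketch of the appendix's three binary "quality" indicators, coded per the
    # rules quoted above.  Membership/fellows values follow the groups the
    # appendix names; the off-K-Street values are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class Group:
        name: str
        closed_membership: int     # 1 if the site does NOT ask visitors to join
        staff_called_fellows: int  # 1 if staff titles include fellow/researcher/economist/analyst
        off_k_street: int          # 1 if headquarters is not on K Street

    groups = [
        Group("Brookings Institution", 1, 1, 1),
        Group("RAND Corporation",      1, 1, 1),
        Group("NAACP",                 0, 0, 1),
        Group("NRA",                   0, 0, 1),
    ]

    for g in groups:
        print(f"{g.name:25s} closed={g.closed_membership} fellows={g.staff_called_fellows} off_k={g.off_k_street}")

    The objection in the next paragraph is essentially that flags this easy to satisfy are a poor proxy for scholarly quality.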

    The problem, however, is that conservative think tanks have consciously aped the tropes of the center-left establishment (such as fellows and closed memberships) while discarding their commitment to technocratic scholarship. Thus, the fact that the American Enterprise Institute and the Family Research Council are included in these categories means that the variables do not adequately address the criticism. Similarly, the Heritage Foundation, the prototypical faux-technocratic think tank, has fellows as well. Counts of scholarly citations or staff with Ph.D.'s would have been far better metrics. (As for the K Street variable, it is simply bizarre -- many lobbying shops are a block or two away, as Media Matters points out.)

    Others have also objected to the study's methodology. Here are the strongest portions of two critiques that have appeared in the last several days:

    Dow Jones & Co. in a letter to Romenesko responding to the study's classification of the Wall Street Journal news pages as liberal:

    "[T]he reader of this report has to travel all the way Table III on page 57 to discover that the researchers' "study" of the content of The Wall Street Journal covers exactly FOUR MONTHS in 2002, while the period examined for CBS News covers more than 12 years, and National Public Radio’s content is examined for more than 11 years. This huge analytical flaw results in an assessment based on comparative citings during vastly differing time periods, when the relative newsworthiness of various institutions could vary widely. Thus, Time magazine is “studied” for about two years, while U.S. News and World Report is examined for eight years. Indeed, the periods of time covered for the Journal, the Washington Post and the Washington Times are so brief that as to suggest that they were simply thrown into the mix as an afterthought. Yet the researchers provide those findings the same weight as all the others, without bothering to explain that in any meaningful way to the study’s readers."

    Media Matters:

    We leave to the reader the judgment on whether anyone could take seriously a coding scheme in which RAND is considered substantially more "liberal" than the ACLU. But this is not the only problem with Groseclose and Milyo's study; they lump together advocacy groups and think tanks that perform dramatically different functions. For instance, according to their data, the National Association for the Advancement of Colored People (NAACP) is the third most-quoted group on the list. But stories about race relations that include a quote from an NAACP representative are unlikely to be "balanced" with quotes from another group on their list. Their quotes will often be balanced by quotes from an individual, depending on the nature of the story; however, because there are no pro-racism groups of any legitimacy (or on Groseclose and Milyo's list), such stories will be coded as having a "liberal bias." On the other hand, a quote from an NRA spokesperson can and often will be balanced with one from another organization on Groseclose and Milyo's list, Handgun Control, Inc...

    It is not hard to imagine perfectly balanced news stories that Groseclose and Milyo would score as biased in one direction or the other, given the study's methodology. For instance, an article that quoted a member of Congress taking one side of an issue, and then quoted a think tank scholar taking the other side, would be coded as "biased" in the direction of whichever side was represented by the think tank scholar. Since Groseclose and Milyo's measure of "bias" is restricted to citations of think tank and advocacy groups, this kind of miscategorization is inevitable.
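
    The miscategorization described above is easy to mock up (an invented story with invented scores, counted only on think-tank and advocacy-group citations as in the study): a piece that balances a member of Congress on one side against a think-tank scholar on the other still registers as tilted toward whichever side the think tank represents, because the lawmaker's quote never enters the tally.

    # Mock-up of the scoring problem described above (invented story, invented
    # ADA-style group scores).  Only citations of listed groups are counted, so
    # the member of Congress quoted on the other side contributes nothing.
    GROUP_SCORES = {"Brookings Institution": 75, "Heritage Foundation": 20}

    story_sources = [
        {"name": "Rep. Smith",            "kind": "lawmaker",   "stance": "against the bill"},  # hypothetical lawmaker
        {"name": "Brookings Institution", "kind": "think tank", "stance": "for the bill"},
    ]

    counted = [s for s in story_sources if s["name"] in GROUP_SCORES]
    story_score = sum(GROUP_SCORES[s["name"]] for s in counted) / len(counted)
    print(f"sources quoted: {len(story_sources)}, sources counted: {len(counted)}")
    print(f"imputed story score: {story_score}")  # 75.0 -> coded as leaning liberal despite the balancing quote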

    Groseclose and Milyo's discussion of the idea of bias assumes that if a reporter quotes a source, then the opinion expressed by that source is an accurate measure of the reporter's beliefs -- an assumption that most, if not all, reporters across the ideological spectrum would find utterly ridiculous. A Pentagon reporter must often quote Defense Secretary Donald H. Rumsfeld; however, the reporter's inclusion of a Rumsfeld quotation does not indicate that Rumsfeld's opinion mirrors the personal opinion of the reporter.

    ...The authors also display a remarkable ignorance of previous work on the subject of media bias. In their section titled "Some Previous Studies of Media Bias," they name only three studies that address the issue at more than a theoretical level. All three studies are, to put it kindly, questionable...

    Although the authors seem completely unaware of it, in reality there have been dozens of rigorous quantitative studies on media bias and hundreds of studies that address the issue in some way. One place the authors might have looked had they chosen to conduct an actual literature review would have been a 2000 meta-analysis published in the Journal of Communication (the flagship journal of the International Communication Association, the premier association of media scholars). The abstract of the study, titled "Media bias in presidential elections: a meta-analysis," reads as follows:

    A meta-analysis considered 59 quantitative studies containing data concerned with partisan media bias in presidential election campaigns since 1948. Types of bias considered were gatekeeping bias, which is the preference for selecting stories from one party or the other; coverage bias, which considers the relative amounts of coverage each party receives; and statement bias, which focuses on the favorability of coverage toward one party or the other. On the whole, no significant biases were found for the newspaper industry. Biases in newsmagazines were virtually zero as well. However, meta-analysis of studies of television network news showed small, measurable, but probably insubstantial coverage and statement biases.

    Standard scholarly practice dictates the assembly of a literature review as part of any published study, and meta-analyses, as they gather together the findings of multiple studies, are particularly critical to literature reviews. That Groseclose and Milyo overlooked not only the Journal of Communication meta-analysis, but also the 59 studies it surveyed, raises questions about the seriousness with which they conducted this study.

    Indeed, they seem to be unaware that an academic discipline of media studies even exists. Their bibliography includes works by right-wing media critics such as Media Research Center founder and president L. Brent Bozell III and Accuracy in Media founder Reed Irvine (now deceased), as well as an article from the right-wing website WorldNetDaily. But Groseclose and Milyo failed to cite a single entry from any of the dozens of respected scholarly journals of communication and media studies in which media bias is a relatively frequent topic of inquiry -- nothing from Journal of Communication, Communication Research, Journalism and Mass Communication Quarterly, Journal of Broadcasting & Electronic Media, Political Communication, or any other media studies journal.
  5. silverbear

    silverbear Semi-Official Loose Cannon

    24,188 Messages
    0 Likes Received
    As soon as I read the assertion that the Drudge Report was "centrist", I knew this so-called study was a fargin' joke...
