Con out of Econometrics (JEP)

The latest issue of the Journal of Economic Perspectives looks impressive.
Just from the titles alone, this many articles caught my interest.

Joshua D. Angrist and Jörn-Steffen Pischke
The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics

Edward E. Leamer
Tantalus on the Road to Asymptopia

Michael P. Keane
A Structural Perspective on the Experimentalist School

Christopher A. Sims
But Economics Is Not an Experimental Science

Aviv Nevo and Michael D. Whinston
Taking the Dogma out of Econometrics: Structural Modeling and Credible Inference

James H. Stock
The Other Transformation in Econometric Practice: Robust Tools for Inference

Liran Einav and Jonathan Levin
Empirical Industrial Organization: A Progress Report

Nathan Nunn and Nancy Qian
The Columbian Exchange: A History of Disease, Food, and Ideas

The abstract of each article:

The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics
Joshua D. Angrist and Jörn-Steffen Pischke
Since Edward Leamer's memorable 1983 paper, "Let's Take the Con out of Econometrics," empirical microeconomics has experienced a credibility revolution. While Leamer's suggested remedy, sensitivity analysis, has played a role in this, we argue that the primary engine driving improvement has been a focus on the quality of empirical research designs. The advantages of a good research design are perhaps most easily apparent in research using random assignment. We begin with an overview of Leamer's 1983 critique and his proposed remedies. We then turn to the key factors we see contributing to improved empirical work, including the availability of more and better data, along with advances in theoretical econometric understanding, but especially the fact that research design has moved front and center in much of empirical micro. We offer a brief digression into macroeconomics and industrial organization, where progress -- by our lights -- is less dramatic, although there is work in both fields that we find encouraging. Finally, we discuss the view that the design pendulum has swung too far. Critics of design-driven studies argue that in pursuit of clean and credible research designs, researchers seek good answers instead of good questions. We briefly respond to this concern, which worries us little.
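
(A minimal sketch of what a design-based estimate looks like in practice, on simulated data: under random assignment, regressing the outcome on a treatment dummy recovers the average treatment effect. The variable names and the statsmodels-based code are my own illustration, not anything from the paper.)

# Toy illustration: with random assignment, OLS on a treatment dummy
# estimates the average treatment effect (true effect set to 2.0 here).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
treated = rng.integers(0, 2, size=n)            # random assignment
y = 1.0 + 2.0 * treated + rng.normal(size=n)    # outcome, true effect 2.0

X = sm.add_constant(treated.astype(float))
fit = sm.OLS(y, X).fit()
print(fit.params[1], fit.bse[1])                # estimated effect and its SE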

Tantalus on the Road to Asymptopia
Edward E. Leamer
My first reaction to "The Credibility Revolution in Empirical Economics," authored by Joshua D. Angrist and Jörn-Steffen Pischke, was: Wow! This paper makes a stunningly good case for relying on purposefully randomized or accidentally randomized experiments to relieve the doubts that afflict inferences from nonexperimental data. On further reflection, I realized that I may have been overcome with irrational exuberance. Moreover, with this great honor bestowed on my "con" article, I couldn't easily throw this child of mine overboard. As Angrist and Pischke persuasively argue, either purposefully randomized experiments or accidentally randomized "natural" experiments can be extremely helpful, but Angrist and Pischke seem to me to overstate the potential benefits of the approach. I begin with some thoughts about the inevitable limits of randomization, and the need for sensitivity analysis in this area, as in all areas of applied empirical work. I argue that the recent financial catastrophe is a powerful illustration of the fact that extrapolating from natural experiments will inevitably be hazardous. I discuss how the difficulties of applied econometric work cannot be evaded with econometric innovations, offering as examples some under-recognized difficulties with instrumental variables and robust standard errors. I conclude with comments about the shortcomings of an experimentalist paradigm as applied to macroeconomics, and some warnings about the willingness of applied economists to apply push-button methodologies without sufficient hard thought regarding their applicability and shortcomings.
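
(A toy version of the sensitivity analysis Leamer keeps calling for: re-estimate the coefficient of interest under every subset of candidate controls and report how far the estimate moves. The simulated data, variable names, and use of statsmodels are my own illustration, not anything from the article.)

# Leamer-style sensitivity check: how much does the coefficient on x move
# as we vary which candidate controls enter the regression?
from itertools import combinations

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
z1, z2, z3 = rng.normal(size=(3, n))            # candidate controls
x = 0.5 * z1 + rng.normal(size=n)               # regressor of interest
y = 1.0 * x + 0.8 * z1 - 0.3 * z2 + rng.normal(size=n)   # true effect of x is 1.0

controls = {"z1": z1, "z2": z2, "z3": z3}
estimates = []
for k in range(len(controls) + 1):
    for subset in combinations(controls, k):
        cols = [x] + [controls[name] for name in subset]
        X = sm.add_constant(np.column_stack(cols))
        estimates.append(sm.OLS(y, X).fit().params[1])   # coefficient on x
print(f"coefficient on x ranges from {min(estimates):.2f} to {max(estimates):.2f}")

If the range is wide, the inference is fragile to the choice of controls; Leamer's point, as I read it, is that this fragility should be reported rather than hidden.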

A Structural Perspective on the Experimentalist School
Michael P. Keane
What has always bothered me about the "experimentalist" school is the false sense of certainty it conveys. My view, like Leamer's, is that there is no way to escape the role of assumptions in statistical work, so our conclusions will always be contingent. Hence, we should be circumspect about our degree of knowledge. I present some lessons for economics from the field of marketing, a field where broad consensus has been reached on many key issues over the past twenty years. In marketing, 1) the structural paradigm is dominant, 2) the data are a lot better than in some fields of economics, and 3) there is great emphasis on external validation. Of course, good data always helps. I emphasize that the ability to do controlled experiments does not obviate the need for theory, and finally I address different approaches to model validation.


But Economics Is Not an Experimental Science
Christopher A. Sims
The fact is, economics is not an experimental science and cannot be. "Natural" experiments and "quasi" experiments are not in fact experiments. They are rhetorical devices that are often invoked to avoid having to confront real econometric difficulties. Natural, quasi-, and computational experiments, as well as regression discontinuity design, can all, when well applied, be useful, but none are panaceas. The essay by Angrist and Pischke, in its enthusiasm for some real accomplishments in certain subfields of economics, makes overbroad claims for its favored methodologies. What the essay says about macroeconomics is mainly nonsense. Consequently, I devote the central part of my comment to describing the main developments that have helped take some of the con out of macroeconomics. Recent enthusiasm for single-equation, linear, instrumental variables approaches in applied microeconomics has led many in these fields to avoid undertaking research that would require them to think formally and carefully about the central issues of nonexperimental inference -- what I see and many see as the core of econometrics. Providing empirically grounded policy advice necessarily involves confronting these difficult central issues.

Taking the Dogma out of Econometrics: Structural Modeling and Credible Inference
Aviv Nevo and Michael D. Whinston
Without a doubt, there has been a "credibility revolution" in applied econometrics. One contributing development has been the improvement and increased use in data analysis of "structural methods"; that is, the use of models based in economic theory. Structural modeling attempts to use data to identify the parameters of an underlying economic model, based on models of individual choice or aggregate relations derived from them. Structural estimation has a long tradition in economics, but better and larger data sets, more powerful computers, improved modeling methods, faster computational techniques, and new econometric methods such as those mentioned above have allowed researchers to make significant improvements. While Angrist and Pischke extol the successes of empirical work that estimates "treatment effects" based on actual or quasi-experiments, they are much less sanguine about structural analysis and hold industrial organization up as an example where "progress is less dramatic." Indeed, reading their article one comes away with the impression that there is only a single way to conduct credible empirical analysis. This seems to us a very narrow and dogmatic approach to empirical work; credible analysis can come in many guises, both structural and nonstructural, and for some questions structural analysis offers important advantages. In this comment, we address the criticism of structural analysis and its use in industrial organization, and consider why empirical analysis in industrial organization differs in such striking ways from that in fields such as labor, which have recently emphasized the methods favored by Angrist and Pischke.
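
(To make "identify the parameters of an underlying economic model" concrete, here is about the simplest toy structural exercise I can think of: a binary logit purchase decision whose utility parameters, including the price coefficient, are recovered by maximum likelihood. Everything below is simulated and my own illustration; real structural work in industrial organization is far richer than this.)

# Toy structural model: consumers buy if utility b0 + b1*price + eps > 0,
# with logistic eps; maximum likelihood recovers (b0, b1) from choice data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 5000
price = rng.uniform(1.0, 5.0, size=n)
utility = 2.0 - 1.0 * price + rng.logistic(size=n)   # true b0 = 2.0, b1 = -1.0
buy = (utility > 0).astype(int)

X = sm.add_constant(price)
fit = sm.Logit(buy, X).fit(disp=False)
print(fit.params)   # should be close to [2.0, -1.0]

The payoff of the structural route is that the recovered utility parameters can, at least in principle, be used to simulate counterfactuals such as demand at prices never observed in the data.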

The Other Transformation in Econometric Practice: Robust Tools for Inference
James H. Stock
Angrist and Pischke highlight one aspect of the research that has positively transformed econometric practice and teaching. They emphasize the rise of experiments and quasi-experiments as credible sources of identification in microeconometric studies, which they usefully term "design-based research." But in so doing, they miss an important part of the story: a second research strand aimed at developing tools for inference that are robust to subsidiary modeling assumptions. My first aim in these remarks therefore is to highlight some key developments in this area. I then turn to Angrist and Pischke's call for adopting experiments and quasi-experiments in macroeconometrics; while sympathetic, I suspect the scope for such studies is limited. I conclude with some observations on the current debate about whether experimental methods have gone too far in abandoning economic theory.
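
(An example of the sort of robust tool Stock has in mind: heteroskedasticity-robust standard errors, which leave the point estimates untouched and change only the inference. Simulated data; statsmodels' HC1 covariance option is just one common choice, not something prescribed in the article.)

# Classical vs. heteroskedasticity-robust (HC1) standard errors for the same
# OLS fit; the coefficients are identical, only the standard errors differ.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 1000
x = rng.normal(size=n)
y = 1.0 + 0.5 * x + np.abs(x) * rng.normal(size=n)   # heteroskedastic errors

X = sm.add_constant(x)
classical = sm.OLS(y, X).fit()
robust = sm.OLS(y, X).fit(cov_type="HC1")
print("classical SE:", classical.bse[1])
print("robust SE:   ", robust.bse[1])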

Empirical Industrial Organization: A Progress Report
Liran Einav and Jonathan Levin
The field of industrial organization has made dramatic advances over the last few decades in developing empirical methods for analyzing imperfect competition and the organization of markets. These new methods have diffused widely: into merger reviews and antitrust litigation, regulatory decision making, price setting by retailers, the design of auctions and marketplaces, and into neighboring fields in economics, marketing, and engineering. Increasing access to firm-level data and in some cases the ability to cooperate with firms or governments in experimental research designs is offering new settings and opportunities to apply these ideas in empirical work. This essay begins with a sketch of how the field has evolved to its current state, in particular how the field's emphasis has shifted over time from attempts to relate aggregate measures across industries toward more focused studies of individual industries. The second and primary part of the essay describes several active areas of inquiry. We also discuss some of the impacts of this research and specify topics where research efforts have been more or less successful. The last section steps back to offer a broader perspective. We address some current debates about research emphasis in the field, and more broadly about empirical methods, and offer some thoughts on where future research might go.

The Columbian Exchange: A History of Disease, Food, and Ideas
Nathan Nunn and Nancy Qian
This paper provides an overview of the long-term impacts of the Columbian Exchange -- that is, the exchange of diseases, ideas, food crops, technologies, populations, and cultures between the New World and the Old World after Christopher Columbus' voyage to the Americas in 1492. We focus on the aspects of the exchange that have been most neglected by economic studies; namely the transfer of diseases, food crops, and knowledge between the two Worlds. We pay particular attention to the effects of the exchange on the Old World.

I should go read these right away.