The NBER Reporter Fall 2002: News
New Directors and Officers Elected by NBER Board
Economic Fluctuations and Growth
Twenty-Third NBER Summer Institute Held in 2002
At its annual meeting in September, the NBER's Board of Directors elected a new Chair and Vice Chair, Michael H. Moskow and Elizabeth E. Bailey. Moskow is President of the Federal Reserve Bank of Chicago and first joined the NBER's Board in 1979. Bailey is a professor at the Wharton School of the University of Pennsylvania; she has been on the NBER's Board since 1995.
Two new at-large Board members were elected as well: Laurence H. Meyer and Jacob A. Frenkel. Meyer is a former governor of the Federal Reserve System and the co-founder of Laurence H. Meyer and Associates, an economic forecasting firm. He holds a B.A. from Yale University and a Ph.D. in economics from MIT, and taught economics at Washington University in St. Louis from 1969 to 1996.
Frenkel is currently Chairman of Merrill Lynch International and Chairman and CEO of the Group of Thirty. Born in Tel Aviv, Israel, he holds a B.A. from Hebrew University and a Ph.D. from the University of Chicago. Frenkel taught at the University of Chicago and Tel Aviv University for many years before serving as Economic Counsellor and Director of Research of the International Monetary Fund from 1987 to 1991. He then served as Governor of the Bank of Israel from 1991 to 2000, and joined Merrill Lynch in 2000.
Roughly one hundred academic macroeconomists from all over the world gathered in Cambridge on July 20 to attend the summer research meeting of NBER's Program on Economic Fluctuations and Growth. The meeting was organized by Lawrence Christiano, NBER and Northwestern University, and James Stock, NBER and Harvard University. These papers were discussed:
Galí, Gertler, and López-Salido present a simple, theory-based measure of the variations in aggregate economic efficiency associated with business fluctuations. They decompose this indicator, which they refer to as "the gap," into two constituent parts -- a price markup and a wage markup -- and show that the latter accounts for the bulk of the fluctuations in their gap measure. They also demonstrate the connection between their gap measure and the gap between output and its natural level, a more traditional indicator of aggregate inefficiency. Finally, they derive a measure of the welfare costs of business cycles that is related directly to their gap variable. Their welfare measure corresponds to the inefficient component of economic fluctuations, and thus should be interpreted as a lower bound to the costs of the latter. When applied to postwar U.S. data, for some plausible parametrizations, their measure indicates non-negligible welfare losses of gap fluctuations. The results, however, hinge critically on some key parameters, including the intertemporal elasticity of labor supply.
Chari, Kehoe, and McGrattan propose a simple method for guiding researchers in developing quantitative models of economic fluctuations. They show that a large class of models, including models with various frictions, is equivalent to a prototype growth model with time-varying wedges that, at face value, look like time-varying productivity, labor taxes, and capital income taxes. They label the time-varying wedges as efficiency wedges, labor wedges, and investment wedges. They then use data to measure these wedges and feed them back into the prototype growth model. After assessing the fraction of fluctuations accounted for by these wedges during the great depressions of the 1930s in the United States, Germany, and Canada, they find that the efficiency and labor wedges together account for essentially all of the declines and subsequent recoveries. Investment wedges play, at best, a minor role.
Traditional Q theory relates a firm's investment to its value of Q at all frequencies; weekly or even daily fluctuations in Q should be just as informative for investment decisions as quarterly or annual data. Abel and Eberly develop a model in which investment is more responsive to Q at long horizons than at short horizons; at short horizons, investment is most responsive to cash flow. These effects arise because a firm's value depends both on its existing capital and on its available technologies, even if the latter are not yet installed. In contrast, the firm's current investment depends only on the currently installed technology. Thus the value of the firm, and hence Tobin's Q, is "too forward-looking" relative to the investment decision. Cash flow, on the other hand, reflects only current technology and demand. The excessively forward-looking information in Tobin's Q, while extraneous to high-frequency investment decisions, does predict future adoptions of the frontier technology. In this way, Q is a better predictor of long-run investment than of short-run investment; short-run investment is better predicted by the firm's cash flow.
Caballero and Hammour propose a framework for understanding recurrent historical episodes of vigorous economic expansion accompanied by extreme asset valuations, as occurred in Japan in the 1980s and the United States in the 1990s. They interpret this phenomenon as a high-valuation equilibrium with a low effective cost of capital based on optimism about the future availability of funds for investment. Key to the sustainability of such an equilibrium is a feedback from increased growth to an increase in the supply of funding. They show that such a feedback naturally arises when the expansion is concentrated in a "new economy" sector and when it is supported by sustained fiscal surpluses -- which together would constitute an integral part, as cause and consequence, of a "speculative growth" equilibrium. The high-valuation equilibrium they analyze may take the form of a stock market bubble. In contrast to classical bubbles on non-productive assets, bubbles in their model encourage real investment, boost long-run savings, and may appear in dynamically efficient economies.
Francis and Ramey re-examine the recent evidence that technology shocks do not produce business cycle patterns in the data. They first extend Galí's (1999) work, which uses long-run restrictions to identify technology shocks, by examining whether the identified shocks plausibly can be interpreted as technology shocks. They do this in three ways. First, they derive additional long-run restrictions and use them as tests of overidentification. Second, they compare the qualitative implications from the model with the impulse responses of variables such as wages and consumption. Third, they test whether some standard "exogenous" variables predict the shock variables. They find that oil shocks, military build-ups, and "Romer dates" do not predict the shock labeled "technology." They then show ways in which a standard DGE model can be modified to fit Galí's finding that a positive technology shock leads to lower labor input. Finally, they re-examine the properties of the other key shock to the system.
Burstein, Eichenbaum, and Rebelo study the behavior of inflation after nine large post-1990 contractionary devaluations. A salient feature of the data is that inflation is low relative to the rate of devaluation. The authors argue that distribution costs, together with substitution away from imports toward lower-quality local goods, can account quantitatively for the post-devaluation behavior of prices.
In the summer of 2002, the NBER held its twenty-third annual Summer Institute. More than 1200 economists from universities and organizations throughout the world attended. The papers presented at dozens of different sessions during the four-week Summer Institute covered a wide variety of topics. A complete agenda and many of the papers presented at the various sessions are available on the NBER's web site by clicking Summer Institute 2002 on our conference page, www.nber.org/confer.