Weather, storage and an old climate impact debate

This somewhat technical post is a belated followup to a comment I wrote with Tony Fisher, Michael Hanemann and Wolfram Schlenker, which was finally published last year in the American Economic Review.  I probably should have done this a long time ago, but I needed to do a little programming.  And I've basically been slammed nonstop.

First the back story:  The comment re-examines a paper by Deschênes and Greenstone (DG) that supposedly estimates a lower bound on the effects of climate change by relating county-level farm profits to weather.  They argue that year-to-year variation in weather is random---a fair proposition---and control for unobserved differences across counties using fixed effects.  This is all pretty standard technique.

The overarching argument was that with climate change, farmers could adapt (adjust their farming practices) in ways they cannot with weather, so the climate effect on farm profits would be more favorable than their estimated weather effect.

Now, bad physical outcomes in agriculture can actually be good for farmers' profits, since demand for most agricultural commodities is pretty inelastic (the demand curve is steep): prices rise sharply as quantities go down.  So, to control for the price effects they include year fixed effects.  And since farmers grow different crops in different parts of the country and there can be local price anomalies, they go further and use state-by-year fixed effects so as to squarely focus on quantity effects in all locations.

Our comment pointed out a few problems:  (1) there were some data errors, like missing temperature data apparently coded as zeros, and much of the Midwest and most of Iowa dropped from the sample without explanation; (2) in making climate predictions they applied state-level estimates to county-level baseline coefficients, in effect making climate predictions that regress to the state mean (e.g., Death Valley and Mt. Whitney have different baselines but the same predicted future); (3) all those fixed effects wash out over 99 percent of the weather variation, leaving little but data errors for estimation; (4) the standard errors didn't appropriately account for the panel nature of the spatially correlated errors.

These data and econometric issues got the most attention.  Correct these things and the results change a lot.  See the comment for details.

But, to our minds, there is a deeper problem with the whole approach.  Their measure of profits was really no such thing, at least not in an economic sense: it was reported sales minus a crude estimate of current expenditures.  The critical thing here is that farmers often do not sell what they produce.  About half the country's grain inventories are held on farm.  Farms also hold inventory in the form of capital and livestock, which can be held, divested or slaughtered.  Thus, effects of weather in one year may not show up in profits measured in that year.  And since inventories tend to be accumulated in plentiful times and divested in bad times, these inventory adjustments are going to be correlated with the weather and cause bias.

Although DG did not consider this point originally, they admitted it was a good one, but argued they had a simple solution: just include the lags of weather in the regression. When they attempted this, they found lagged weather was not significant, and thus concluded that this issue was unimportant.  This argument is presented in their reply to our comment.

We were skeptical about their proposed solution to the storage issue.  And so, one day long ago, I proposed to Michael Greenstone that we test his proposed solution: we could solve a competitive storage model, assume farmers store as a competitive market would, and then simulate prices and quantities that vary randomly with the weather.  Then we could regress sales (consumption × price) against our constructed weather and lags of weather, plus price controls.  If the lags worked in this instance, where we knew the underlying physical structure, then they might work in reality.

Greenstone didn't like this idea, and we had limited space in the comment, so the storage stuff took a minimalist back seat. Hence this belated post.

So I recently coded a toy storage model in R, which is nice because anyone can download and run this thing  (R is free).  Also, this was part of a problem set I gave to my PhD students, so I had to do it anyway.

Here's the basic set up:

y    is production, which varies randomly (like the weather).
q    is consumption, or what's actually sold in a year.
p    is the market price, which varies inversely with q (the demand curve).
z    is the amount of the commodity on hand (y plus carryover from last year).

The point of the model is to figure out how much production to put in or take out of storage.  This requires numerical analysis (thus, the R code).  Dynamic equilibrium occurs when there is no arbitrage: where it's impossible to make money by storing more or storing less.
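This isn't my actual code (that's linked below), but a minimal sketch of the kind of solver involved might look like the following in R.  It iterates on the equilibrium price as a function of supply on hand, z, over a grid, imposing the no-arbitrage condition at each point.  The demand curve, interest rate, storage cost and distribution of production are all placeholder assumptions, not the parameters behind the results reported later.

## Sketch of a competitive storage model solver (placeholder parameters)
set.seed(1)
beta  <- 1/1.05                          # discount factor (assumed 5% interest rate)
k     <- 0.1                             # per-unit storage cost (assumed)
Pinv  <- function(q) 20 - q              # inverse demand curve (assumed linear)
ydraw <- rnorm(500, mean = 10, sd = 1)   # draws of random production ("weather")

zgrid <- seq(5, 25, length.out = 100)    # grid for supply on hand, z
pfun  <- Pinv(zgrid)                     # initial guess: everything is consumed

for (it in 1:400) {
  # expected discounted price next period, net of storage cost, given carryout s
  Epnext <- function(s) beta * mean(approx(zgrid, pfun, ydraw + s, rule = 2)$y) - k
  pnew <- sapply(zgrid, function(z) {
    # no arbitrage: if selling everything beats storing, store nothing;
    # otherwise store s > 0 until Pinv(z - s) equals Epnext(s)
    if (Pinv(z) >= Epnext(0)) return(Pinv(z))
    s <- uniroot(function(s) Pinv(z - s) - Epnext(s), c(0, z))$root
    Pinv(z - s)
  })
  if (max(abs(pnew - pfun)) < 1e-5) break  # stop once the price function converges
  pfun <- pnew
}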

Once we've solved the model, which basically gives q and p as functions of z, we can simulate y with random draws and develop a path of q and p.  I chose a demand curve, interest rate and storage cost that can give rise to a fair amount of price variability and autocorrelation, which happens to fit the facts.  The code is here.
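Continuing the sketch above (same placeholder demand curve and solved price function pfun, not my actual code), the simulation step might look like:

## Simulate production, prices and consumption from the solved model
nyears <- 1000
y <- rnorm(nyears, mean = 10, sd = 1)      # random production, i.e. the "weather"
z <- p <- q <- numeric(nyears)
carry <- 0                                 # carryover entering the first year
for (t in 1:nyears) {
  z[t]  <- y[t] + carry                            # supply on hand
  p[t]  <- approx(zgrid, pfun, z[t], rule = 2)$y   # equilibrium price given z
  q[t]  <- 20 - p[t]                               # consumption, read off the demand curve
  carry <- max(z[t] - q[t], 0)                     # carryout (guard against interpolation error)
}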

Now, given our simulated y, q and p, we might estimate:

(1)   q_t = a + b0  y_t + b1 y_{t-1} + b2 y_{t-2} + b3 y_{t-3} +  ... + error

(the ... means additional lags, as many as you like.  I use five.)

This expression makes sense to me, and might have been what DG had in mind: quantity in any one year is a function of this year's weather and the weather of a reasonable number of past years, all of which affect the quantity sold today via storage.  For the regression to fully capture the true effect of weather, the sum of the b# coefficients should be one.
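Within the sketch, estimating (1) is just OLS of simulated consumption on current and lagged production.  The lag helper and variable names below are my own, chosen to mirror the output tables that follow:

## Equation (1): consumption on current and lagged production
lagk <- function(x, k) c(rep(NA, k), head(x, -k))   # simple k-period lag
d <- data.frame(q = q, y = y, p = p,
                l.y  = lagk(y, 1), l2.y = lagk(y, 2), l3.y = lagk(y, 3),
                l4.y = lagk(y, 4), l5.y = lagk(y, 5))
summary(lm(q ~ y + l.y + l2.y + l3.y + l4.y + l5.y, data = d))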

Alternatively we might estimate:

(2)   p_t q_t = a + b0  y_t + b1 y_{t-1} + b2 y_{t-2} + b3 y_{t-3} +  ... + error

This is almost like DG's profit regression, as costs of production in this toy model are zero, so "profit" is just total sales.  But DG wanted to control for price effects in order to isolate the pure weather effect on quantity, since in the above relationship the sum of the b# coefficients is likely negative.  So, to do something akin to DG within the context of this toy model, we need to control for price.  That might be something like:

(3)  p_t q_t = a + b0  y_t + b1 y_{t-1} + b2 y_{t-2} + b3 y_{t-3} +  ... + c p_t + error

Or, if you want to be a little more careful, recognizing that the relationship is nonlinear, we might control for p_t more flexibly and use a polynomial.  Note that we cannot use fixed effects like DG do because this isn't a panel; I'll come back to this later.  In any case, with better controls we get:
 
(4)   p_t q_t = a + b0  y_t + b1 y_{t-1} + b2 y_{t-2} + b3 y_{t-3} +  ... + c1 p_t  + c2 p_t^2 + c3 p_t^3 +  error
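In the same sketch, equations (2) through (4) differ only in the dependent variable and the price controls:

## Equations (2)-(4): sales on current and lagged production, with price controls
d$pq <- d$p * d$q      # "profit" is just sales in the toy model (costs are zero)
summary(lm(pq ~ y + l.y + l2.y + l3.y + l4.y + l5.y, data = d))               # eq. (2)
summary(lm(pq ~ y + l.y + l2.y + l3.y + l4.y + l5.y + p, data = d))           # eq. (3)
summary(lm(pq ~ y + l.y + l2.y + l3.y + l4.y + l5.y + poly(p, 3), data = d))  # eq. (4)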

At this point you should be worrying about having p_t on both the right and left side.  More on this in a moment.  First, let's take a look at the results:

Equation 1:
            Estimate Std. Error t value Pr(>|t|)
(Intercept)     1.68       1.32    1.28     0.20
y               0.39       0.03   15.62     0.00
l.y             0.23       0.03    9.17     0.00
l2.y            0.10       0.03    3.83     0.00
l3.y            0.07       0.03    2.66     0.01
l4.y            0.07       0.03    2.69     0.01
l5.y            0.06       0.03    2.34     0.02


The sum of the y coefficients is 0.86.  I'm sure if you put in enough lags they would sum to 1. You shouldn't take the Std. Error or t-stats seriously for this or any of the other regressions, but that doesn't really matter for the points I want to make. Also, if you run the code, the exact results will differ because it will take a different random draw of y's, but the flavor will be the same.

Equation 2:
            Estimate Std. Error t value Pr(>|t|)
(Intercept)  4985.23     166.91   29.87        0
y             -72.15       3.19  -22.63        0
l.y           -43.67       3.20  -13.64        0
l2.y          -22.52       3.21   -7.03        0
l3.y          -15.61       3.21   -4.87        0
l4.y          -13.58       3.19   -4.26        0
l5.y          -12.26       3.19   -3.85        0


All the coefficients are negative.  As we expected, good physical outcomes for y mean bad news for profits, since prices fall through the floor.  If you know a little about the history of agriculture, this seems about right.  So, let's "control" for price.

Equation 3:
            Estimate Std. Error t value Pr(>|t|)
(Intercept)  2373.15     167.51   14.17        0
y             -28.12       2.91   -9.66        0
l.y           -17.72       2.10   -8.43        0
l2.y          -11.67       1.63   -7.17        0
l3.y           -8.07       1.57   -5.16        0
l4.y           -5.99       1.56   -3.84        0
l5.y           -5.68       1.54   -3.68        0
p               7.84       0.44   17.65        0


Oh, good, the coefficients are less negative.  But we still seem to have a problem.  So, let's improve our control for price by making it a 3rd order polynomial:

Equation 4:
            Estimate Std. Error       t value Pr(>|t|)
(Intercept)  1405.32          0  1.204123e+15     0.00
y               0.00          0  2.000000e-02     0.98
l.y             0.00          0  3.000000e-02     0.98
l2.y            0.00          0  6.200000e-01     0.53
l3.y            0.00          0 -3.200000e-01     0.75
l4.y            0.00          0 -9.500000e-01     0.34
l5.y            0.00          0 -2.410000e+00     0.02
poly(p, 3)1  2914.65          0  3.588634e+15     0.00
poly(p, 3)2  -716.53          0 -1.795882e+15     0.00
poly(p, 3)3     0.00          0  1.640000e+00     0.10


The y coefficients are now almost precisely zero. 

By DG's interpretation, we would say that weather has no effect on profit outcomes, and thus that climate change is likely to have little influence on US agriculture.  Except in this simulation we know the underlying physical reality: one unit of y ultimately has a one-unit effect on output.  DG's interpretation is clearly wrong.

What's going on here? 

The problem comes from the attempt to "control" for price.  Price, after all, is a key (the key?) consequence of the weather.  Storage theory predicts that prices incorporate all past production shocks, whether caused by weather or something else, so in controlling for price we remove all weather effects on quantities.  DG are ultimately mixing up cause and effect, in their case with a zillion fixed effects.  One should take care in adding "controls" that might actually be effects, especially when you supposedly have a random source of variation.  David Freedman, the late statistician who famously critiqued regression analysis in the social sciences and provided inspiration to the modern empirical revolution in economics, often emphasized this point.

Now, some might argue that the above analysis is just a single crop, so it doesn't apply to DG's panel data.  I'd argue that if you can't make it work in a simpler case, it's unlikely to work in a case that's more complicated.  More pointedly, this angle poses a catch-22 for the identification strategy: if inclusion of state-by-year fixed effects does not absorb all historic weather shocks, then the weather shocks must have been crop- or substate-specific, in which case there is bias from endogenous price movements even after the inclusion of these fixed effects.  On the other hand, if enough fixed effects are included to account for all endogenous price movements, then lagged weather by definition adds no additional information and should not be significant in the regression.  Prices are a sufficient statistic for all past and current shocks.

All of this is to show that the whole DG approach has problems.  However, I think the idea of using lagged weather is a good one if combined with a somewhat different approach.  We might, for example, relate all manner of endogenous outcomes (prices, quantities, and whatever else) to current and past weather.  This is the correct "reduced form."  From these relationships, combined with some minimalist economic structure, we might learn all kinds of interesting and useful things, and not just about climate change.  This observation, in my view, is the overarching contribution of my new article with Wolfram Schlenker in the AER.

I think there is a deeper lesson in this whole episode that gets at a broader conversation in the discipline about data-driven applied microeconomics over the last 20 years.  Following Angrist, Ashenfelter, Card and Krueger, among others, everyone's doing experiments and natural experiments.  A lot of this stuff has led to some interesting and useful discoveries.  And it's helped to weed out some applied econometric silliness.

Unfortunately, somewhere along the way, some folks lost sight of basic theory.   In many contexts we do need to attach our reduced forms to some theoretical structure in order to interpret them.  For example, bad weather causing profits to go up in agriculture actually makes sense, and indicates something bad for consumers and for society as a whole.

And in some contexts a little theory might help us remember what is and isn't exogenous.

Comments

  1. I don't know how much it matters for your argument, but year to year variation in weather is not random (Google "regime shift" and you will find a decent Wikipedia entry). If economists are going to play in other people's territory, they should learn the territory first.

  2. Anonymous:
    1. In context, we mean to assume that any association between year-to-year weather and crop outcomes can be safely attributed as causation going from weather to crop outcomes and not the other way around.
    2. This work is squarely economics; however, I would guess the parties involved have a decent understanding of the associated climate science.
    3. I often become frustrated by scientists who blithely make big economic and policy claims without knowing the first thing about economics and policy. So, I try to understand the associated science as well as possible.
    4. If you're going to make flyby attacks, please be professional about it and leave your name.

  3. Enjoyed reading your AER paper, Michael. Here is a potentially different story to explain why you got almost-zero coefficients in equation (4). By definition, q and p are perfectly linearly related, and regressing one on the other will get the exact point estimate even without standard errors. So putting p in any regression of q will dominate the other variables. That is essentially why you got zero coefficients in (4): you don't need any other variable to explain q. Here (4) can be considered as regressing q on y/p and p (plus 1's and p^2; the coefficient on p^2 is almost zero based on the poly 3 term in (4), since there is no nonlinear relationship there). The coefficients in (2) are negative because p*q is always negative as a result of any change. The coefficients in (3) are negative and smaller because it is still (2) while keeping p constant, i.e. the variation in pq is downscaled, but the sign is still the same. All in all, I am wondering: if you relax the direct linear relationship between p and q and add some noise to it, what happens to your conclusion? Will other variables like y always turn up almost-zero coefficients? Probably not, no matter whether p is caused by y or not. Of course, if p is (partially) caused by y, we cannot include p when estimating the effect of y on q, especially when y is really exogenous.

  4. Changgui: I agree! (well, mainly). I didn't describe it the way you do, but I think we're saying the same thing. p is endogenous, indeed the result of current and past shocks y, as is q. Controlling for p therefore confounds the reduced form relationship between p*q and y. The fit, with a cubic in p, is very nearly perfect.

    The thing is, this is in effect exactly what DG have done. Now, in their case it may not be exact, because there are other kinds of shifts going on, and folks besides farmers store stuff. But the same endogeneity and bias toward zero occurs.

    Replies
    1. I agree with what you've said and with what DG shouldn't have done. I am just puzzled that the coef. of y will (tend to) be zero in any regression when you also control for a variable (p) that is perfectly linearly correlated with the dependent variable (q), no matter whether p is caused by y or not. I am just wondering, in reality, how do we separate a real causal link from y to p from the case where p and y just coincidentally move together? Thanks for the reply, Michael.

    2. Hi Changgui,

      This is the essence of the endogeneity problem.
      Both p and q are determined by past and current weather.

      It is okay to regress p on y, q on y, and even p*q on y, but it's bad to put any of the endogenous things on the right hand side.

      The more subtle thing is that fixed effects can, in effect, be endogenous. They can wash out not just confounding effects, but also endogenous responses to your exogenous variable, making your exogenous variable look insignificant when it actually matters. By the same token, such "controls" might bias a coefficient upwards in other contexts.

      Just to be clear: I think fixed effects can be very useful. But I think we overuse them in many contexts, especially when there is feedback between observation units, either over space or over time. That kind of intertemporal and spatial feedback is typical in economics--actually, it's central to the whole idea of the 'invisible hand'. Yet many economists now commonly treat every panel data set as if we have repeated observations in some kind of grand experiment. I may even be guilty of this in some contexts. Anyway, I think we need to be careful with how we use these tools.

      I should also emphasize again that the idea of using lagged weather shocks, not just current shocks, is a good one, and something that might be exploited for many other studies. I wouldn't have had that epiphany if it weren't for DG's reply.

