Thursday, December 3, 2015

Renewable energy not as costly as some think

The other day Marshall and Sol took on Bjorn Lomborg for ignoring the benefits of curbing greenhouse gas emissions.  Indeed.  But Bjorn, among others, is also notorious for exaggerating costs.  The fact is that most serious estimates of the cost of reducing emissions are fairly low, and there is good reason to believe cost estimates are too high, for the simple reason that analysts cannot measure or imagine all the ways we might curb emissions.  Anything analysts cannot model translates into cost exaggeration.

Hawai`i is a good case in point.  Since moving to Hawai`i I've started digging into energy, in large part because the situation in Hawai`i is so interesting.  Here we make electricity mainly from oil, which is super expensive.  We are also rich in sun and wind.  Add these facts to Federal and state subsidies and it spells a remarkable energy revolution.  Actually, renewables are now cost effective even without subsidies.

In the video below Matthias Fripp, who I'm lucky to be working with now, explains how we can achieve 100% renewable energy by 2030 using current technology at a cost that is roughly comparable to our conventional coal and oil system. In all likelihood, with solar and battery costs continuing to fall, this goal could be achieved for a cost that's far less.  And all of this assumes zero subsidies.

One key ingredient:  We need to shift electric loads toward the supply of renewables, and we could probably do this with a combination of smart variable pricing and smart machines that could help us shift loads.  More electric cars could help, too.  I'm sure some could argue with some of the assumptions, but it's hard to see how this could be wildly unreasonable.





Wednesday, October 21, 2015

Paul Krugman on Food Economics



Paul Krugman doesn't typically write about food, so I was a little surprised to see this.  Still, I think he got most things right, at least by my way of thinking.  Among the interesting things he discussed:



1. The importance of behavioral economics in healthy food choices
2. That it's hard to know how many actual farmers are out there, but it's a very small number.
3. That we could clean up farming a lot by pricing externalities [also see], or out-right banning of the most heinous practices, but that doesn't mean we're going to go back to the small farms of the pre-industrial era, or anything close to it.
4. Food labels probably don't do all that we might like them to do (see point 1).
5. How food issues seem to align with Red/Blue politics just a little too much

There's enough here to offend or ingratiate most everyone's preconceived ideas in some small way, but it's mostly on the mark, I think.

Saturday, October 17, 2015

Angus Deaton and Commodity Prices

Angus Deaton just won the Nobel Prize in economics.  He's a brilliant, famous economist who is known for many contributions.  In graduate school I discovered a bunch of his papers and studied them carefully. He is a clear and meticulous writer, which made it easy for me to learn a lot of technical machinery, like stochastic dynamic programming.  His care and creativity in statistical matters, and in linking data to theory, was especially inspiring. His papers with Christina Paxson inspired me to think long and hard about all the different ways economists might exploit weather as an instrument for identifying important economic phenomena.

One important set of contributions about which I've seen little mention concerns a body of work on commodity prices that he did in collaboration with Guy Laroque.  This is really important research, and I think that many of those who do agricultural economics and climate change might have missed some of its implications, if they are aware of it at all.

Deaton lays out his work on commodity prices like he does in a lot of his papers: he sets out to test a core theory, insists on using only the most reliable data, and then pushes the data and theory hard to see if they can be reconciled with each other.  He ultimately concludes that, while theory can broadly characterize price behavior, there is a critical paradox in the data that the theory cannot reconcile: too much autocorrelation in prices.  (ASIDE 1)

These papers are quite technical and the concluding autocorrelation puzzle is likely to put most economists, and surely all non-economists, into a deep slumber.  Who cares?

Undoubtedly, a lot of the fascination with these papers was about technique.  They were written in the generation following the discovery of GMM (generalized method of moments) as a way to estimate models centered on rational expectations, models in which iid errors can have an at least somewhat tangible interpretation as unpredictable "expectation errors."

One thing I always found interesting and useful in these papers was something that Deaton and Laroque take entirely for granted.  They show that the behavior of commodity prices themselves, without any other data, indicates that their supply and demand are extremely inelastic.  (ASIDE 2)  For if they weren't, prices would not be as volatile as they are, as autocorrelated as they are, or stored as prevalently as they are.  Deaton writes as much in a number of places, but states this as if it's entirely obvious and not of critical concern. (ASIDE 3)

But here's the thing: the elasticities of supply and demand are really what's critical for thinking about implications of policies, especially those that can affect supply or demand on a large scale, like ethanol mandates and climate change.  Anyone who read and digested Deaton and Laroque and knew the stylized facts about corn prices knew that the ethanol subsidies and mandates were going to cause food prices to spike, maybe a whole lot. But no one doing policy analysis in those areas paid any attention.

Estimated elasticities regularly published in the AJAE for food commodities are typically orders of magnitude larger than are possible given what is plainly clear in price behavior, and the authors typically appear oblivious to the paradox.   Also, if you like to think carefully about identification, it's easy to be skeptical of the larger estimated elasticities.  Sorry aggies--I'm knocking you pretty hard here and I think it's deserved.  I gather there are similar problems in other corners of the literature, say mineral and energy economics.

And it turns out that the autocorrelation puzzle may not be as large a puzzle as Deaton and Laroque let on.  For one, a little refinement of their technique can give rise to greater price autocorrelation, a refinement that also implies even more inelastic supply and demand.  Another simple way to reconcile theory and data is to allow for so-called "convenience yields," which basically amount to negative storage costs when inventories are low.  Negative storage costs don't make sense on their face, but might actually reflect the fact that in any one location---where stores are actually held---prices, or the marginal value of commodities, can be a lot more volatile than posted prices in a market.  Similar puzzles of positive storage when spot prices exceed futures prices can be explained the same way---there might be a lot of uncertain variability in time and space that sits between stored inventories and futures deliveries.

The graph below, from this recent paper by Gouel and Legrand, uses updated techniques to show how inelastic demand needs to be to obtain high price autocorrelation, as we observe in the data (typically well above 0.6).



Adding bells and whistles to the basic theory easily reconciles the puzzle, but only strengthens the conclusion that demand and supply of commodities are extremely steep.  And that basic conclusion should make people a little more thoughtful when it comes to thinking about implications of policies and about the potential impacts of climate change, which could greatly disrupt supply of food commodities.
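To see qualitatively why storage plus very inelastic demand produces prices that are volatile, persistent, and spiky, here is a toy simulation.  It uses a crude, hand-picked storage rule rather than the competitive-equilibrium storage function that Deaton and Laroque actually solve for, and every parameter value is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_prices(elasticity, store_frac, T=5000):
    """Prices from iid harvests, isoelastic demand, and a heuristic
    storage rule: carry forward a fraction of availability above the
    mean harvest.  (A stand-in for the equilibrium storage function.)"""
    stocks = 0.0
    prices = np.empty(T)
    for t in range(T):
        harvest = rng.lognormal(mean=0.0, sigma=0.1)
        avail = harvest + stocks
        stocks = store_frac * max(avail - 1.0, 0.0)  # store only surpluses
        consumption = avail - stocks
        prices[t] = consumption ** (-1.0 / elasticity)  # inverse demand
    return prices

def autocorr(x):
    """Lag-1 sample autocorrelation."""
    x = x - x.mean()
    return float(x[:-1] @ x[1:] / (x @ x))

# Very inelastic demand (elasticity 0.1): small harvest shocks become
# big price swings.  Without storage, iid harvests give essentially iid
# prices; add storage and the same shocks generate persistent prices.
print(autocorr(simulate_prices(0.1, store_frac=0.0)))  # near zero
print(autocorr(simulate_prices(0.1, store_frac=0.5)))  # clearly positive
```

The same two ingredients also deliver the asymmetry seen in real commodity data: stockouts after bad harvests produce sharp upward spikes that storage cannot buffer.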



ASIDE 1:  Papers like Deaton's differ from most of the work that fills up the journals these days.  Today we see a lot more empirical work than in the past, but most of it is nearly atheoretical, at least relative to Deaton's.  It most typically follows what David Card describes as "the design-based approach."  I don't think that's a bad change.  But I think there's a lot of value in the kind of work Deaton did, too.

ASIDE 2:  The canonical model that Deaton and Laroque use, like much of the subsequent literature, has a perfectly inelastic supply curve that shifts randomly with the weather and a fixed demand curve.  This is because, using only prices, one cannot separately identify demand and supply.  Thus, the estimated demand elasticity embodies both demand and supply.  And since that one elasticity is clearly very small, the sum of the two elasticities must be very small as well.

ASIDE 3:  Lots of people draw conclusions lightly, as if they are obvious, without really thinking carefully about what lies beneath.  Not Angus Deaton.

Tuesday, August 4, 2015

Answering Matthew Kahn's questions about climate adaptation

Matt has taken the bait and asked me five good questions about my snarky, contrarian post on climate adaptation.  Here are his questions and my answers.

Question 1.  This paper will be published soon by the JPE. Costinot, Arnaud, Dave Donaldson, and Cory B. Smith. Evolving comparative advantage and the impact of climate change in agricultural markets: Evidence from 1.7 million fields around the world. No. w20079. National Bureau of Economic Research, 2014.  http://www10.iadb.org/intal/intalcdi/PE/2014/14183.pdf

It strongly suggests that adaptation will play a key role protecting us. Which parts of their argument do you reject and why?

Answer:  This looks like a solid paper, much more serious than the average paper I get to review, and I have not yet studied it.  I'm slow, so it would take me a while to unpack all the details and study the data and model.  But from a quick look, I think there are a couple of points I can make right now.

First, and most importantly, I think we need to be clear about the differences between (i) adaptation; (ii) price response and trade; (iii) innovation that would happen anyway; (iv) climate-change-induced innovation; and (v) price-induced innovation.  I'm pretty sure this paper is mainly about (ii), not about adaptation as conventionally defined within the literature, although there appears to be some adaptation too.  I need to study it much more to get a sense of the magnitudes of the different elasticities they estimate, and whether I think they are plausible given the data.

To be clear: I think adaptation, as conventionally defined, pertains to changing production behavior in response to a changing climate while holding all other factors (like prices, trade, technology, etc.) constant.  My annoyance is chiefly that people are mixing up these concepts.  My second annoyance is that too many are perpetually optimistic--some economists wear it like a badge--and I don't think evidence or history necessarily backs up that optimism.


Question 2. If farmers know that they face uncertain risks due to climate change, what portfolio choices can they engage in to reduce the variability of their earnings? What futures markets exist to allow for hedging? If a risk averse economic agent knows "that he does not know" what ambiguous risks she faces, she will invest in options to protect herself. Does your empirical work capture this medium term investment rational plan? Or do you embrace the Berkeley behavioral view of economic agents as myopic?

Some farmers have subsidized crop insurance (nearly all in the U.S. do). But I don't think insurance much affects production choices at all. Futures markets seem to “work” pretty well and could be influenced by anticipated climate change.  We actually use a full-blown rational expectations model to estimate how much they might be affected by anticipated climate change right now: about 2% higher than they otherwise would be.  

Do I think people are myopic? Very often, yes.  Do I think markets are myopic?  By and large, no, but maybe sometimes.  I believe less in bubbles than Robert Shiller, even though I'm a great admirer of his work.  Especially for commodity markets (if not the macroeconomy), I think rational expectations models are a good baseline for thinking about commodity prices, very much including food commodity prices.  And I think rational expectations models can have other useful purposes, too.  I actually do think the Lucas enterprise has created some useful tools, even if I find the RBC center of macro more than a bit delusional.

I think climate and anticipated climate change will affect output (for good and bad), which will affect prices, and that prices will affect what farmers plant, where they plant it, and trade.  But none of this, I would argue, is what economists conventionally refer to as adaptation.  A little more on response to prices below...

Again, my beef with the field right now is that we are too blasé about the miracle of adaptation.  It's easy to tell horror stories that the data cannot refute.  Much of the economist tribe won't look there--it feels taboo.  The JPE won't publish such an article. We have blinders on when uncertainty is our greatest enemy.


Question 3. If specific farmers at specific locations suffer, why won't farming move to a new area with a new comparative advantage? How has your work made progress on the "extensive margin" of where we grow our food in the future?

The vast majority of arable land is already cropped.  That which isn’t is in extremely remote and/or politically difficult parts of Africa.  Yes, there will be substitution and shifting of land.  But these shifts will come about because of climate-induced changes in productivity.  In other words, first-order intensive margin effects will drive second-order extensive margin effects.  The second order effects—some land will move into production, some out--will roughly amount to zero.  That’s what the envelope theorem says.  To a first approximation, adaptation in response to climate change will have zero aggregate effect, not just with respect to crop choice, but with respect to other management decisions as well.  I think Nordhaus himself made this point a long time ago. 
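The envelope-theorem logic can be made explicit.  Let $f(a, c)$ be farm profit as a function of management choices $a$ (crop choice, timing, location) and climate $c$, and let $a^*(c)$ denote the optimal choices.  The effect of a marginal climate change on optimized profit $W(c) = f(a^*(c), c)$ is then

```latex
\frac{dW}{dc}
= \underbrace{\frac{\partial f}{\partial c}\bigl(a^*(c), c\bigr)}_{\text{direct climate impact}}
+ \underbrace{\frac{\partial f}{\partial a}\bigl(a^*(c), c\bigr)\,\frac{da^*}{dc}}_{=\,0\ \text{at an optimum}}
```

Because $\partial f/\partial a = 0$ at an interior optimum, the re-optimization (adaptation) term vanishes to first order.  That is the precise sense in which adaptation is a second-order effect.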

However, there will also be intensive and extensive margin responses to prices.  Those will be larger than zero.  But I think the stylized facts about commodity prices (from the rational expectations commodity price model, plus other evidence) tell us that supply and demand are extremely inelastic.


Question 4. The key agriculture issue in adapting to climate change appears to be reducing international trade barriers and improving storage and reducing geographic trade costs. Are you pessimistic on each of these margins? Container ships and refrigeration keep getting better, don't they?

I think storage will improve, because almost anyone can do it, and there’s a healthy profit motive.  It’s a great diversification strategy for deep-pocketed investors. I think many are already into this game and more will get into it soon.  Greater storage should quell a good share of the greater volatility, but it actually causes average prices to rise, because there will be more spoilage.  But I’m very “optimistic” if you will, about the storage response.  I worry some that the storage response will be too great.

But I'm pretty agnostic to pessimistic about everything else.  Look what happened in earlier food price spikes.  Many countries started banning exports.  It created chaos and a huge "bubble" (not sure if it was truly rational or not) in rice prices.  Wheat prices, particularly in the Middle East, shot up much more than world prices because governments could no longer maintain the subsidized price floors. As times get tougher, I worry that politics and conflict could turn crazy.  It's the crazy that scares me.  We've had a taste of this, no?  The Middle East looks much less stable after the food price spikes than before. I don't know how much food prices are to blame, but I think they are a plausible causal factor.


Question 5. With regards to my Climatopolis work, recall that my focus is the urbanized world. The majority of the world already live in cities and cities have a comparative advantage in adapting to climate conditions due to air conditioning, higher income and public goods investments to increase safety.

To be fair: I'm probably picking on the wrong straw man.  What's bothering me these days has much less to do with your book and more to do with the papers that come across my desk every day.  I think people are being sloppy and a bit closed-minded, and yes, perhaps even tribal.  I would agree that adaptation in rich countries is easier.  Max Auffhammer has a nice new working paper on air conditioning in California: people will use air conditioning more, and some who don't currently have air conditioners will install them--that's adaptation.  This kind of adaptation will surely happen, and is surely good for people but bad for energy conservation.  It's a really neat study backed by billions of billing records.   But the adaptation number--an upper-bound estimate--is small.

I thought of you and your book because people at the AAEA meetings were making some of the same arguments you made, and because you're a much bigger fish than most of the folks in my little pond.  Also, I think your book embodies many economists' perhaps correct, but perhaps gravely naïve, what-me-worry attitude.

Why Science for Climate Adaptation is Difficult

Matthew Kahn, author of the cheeky book Climatopolis: How Our Cities will Thrive in the Hotter Future, likes to compliment our research (Schlenker and Roberts, 2009) on potential climate impacts to agriculture by saying it will cause valuable innovation that will prevent its dismal predictions from ever occurring. 

Matt has a point, one that has been made many times in other contexts by economists with Chicago School roots.  But in Matt's case (and most of the others), it feels more like a third stage of denial than a serious academic argument.

It’s not just Matt.  Today, the serious climate economist (or Serious?) is supposed to write about adaptation.  It feels taboo to suggest that adaptation is difficult.  Yet, the conventional wisdom here is almost surely wrong.  Everyone seems to ignore or miscomprehend basic microeconomic theory: adaptation is a second or higher-order effect, probably as ignorable as it is unpredictable. 

While the theory is clear, the evidence needs to be judged on a case-by-case basis. But it seems to me that much of the research so far is either flawed or doesn't measure adaptation at all.  Instead it confounds adaptation--changes in farming and other activities due to changes in climate--with something else, like technological change that would have happened anyway, response to prices, population growth, or other factors.

For example, some farmers may be planting earlier or later due to climate change.   They may also be planting different crops in a few places. But farmers are also changing what, when and where they plant due to innovation of new varieties that would have come about even if Spring weren’t coming a little earlier.  The effects of climate change on farm practices are actually mixed, and in the big picture, look very small to me, at least so far.

The other week at the AAEA meetings in San Francisco, our recent guest blogger Jesse Tack reminded me of Matt's optimistic views, and in the course of our ensuing conversations about some of his current research, it occurred to me just why crop science surrounding climate-related factors is so difficult. The reason goes back to the struggles of early modern crop science and the birth of modern statistics and hypothesis testing, which probably ushered in the Green Revolution.

How's all that?  Well, modern statistical inference and experimental design have some earlier roots, but most of it can be traced to two works, Statistical Methods for Research Workers and The Arrangement of Field Experiments, both written by Ronald Fisher in the 1920s. Fisher developed his ideas while working at Rothamsted, one of the oldest crop experiment stations in the world.  In 1919 he was hired to study the vast amount of data collected since the 1840s, and he concluded that the data were essentially useless: all manner of events affecting crop yields (mostly weather) had hopelessly confounded the many experiments, which were unrandomized and uncontrolled. It was impossible to separate signal from noise. Drawing scientific inferences and quantifying uncertainties would require randomized controlled trials, and some new mathematics, which Fisher then developed.  Fisher's statistical techniques, combined with his novel experimental designs, essentially invented modern experimental science. It's no surprise, then, that productivity growth in agriculture accelerated a decade or two later.

So what does this have to do with adaptation?  Well, the crux of adaptation involves higher-order effects: the interaction of crop varieties, practices and weather.  It's not about whether strain X has a typically higher yield than strain Y.  It's about the relative performance of strain X and strain Y across a wide range of weather conditions.

Much like in the early days of modern science, this can be very hard to measure because there's so much variability in the weather and other factors. Scientists cannot easily intervene to control temperature and CO2 the way they can varieties and crop practices.  And when they do, other experimental conditions (like soil moisture) are usually carefully controlled so that no water or pest stresses occur.  Since these other factors are also likely influenced by warming temperatures (like VPD-induced drought, also here), it's not really clear whether these experiments tell us what we need to know about the effects of climate change.

(An experiment with controlled temperatures and CO2 concentrations)

Then, of course, there is the curse of dimensionality.  Measuring interactions of practices, temperature, and CO2 requires experimentation on a truly grand scale.   If we constrain ourselves to actual weather events, in most parts of the world we have only one crop per year, so the data accumulate slowly and noisily, and discerning cause and effect is basically impossible. In the end, it's not much different from Ronald Fisher trying to discern truth from his pre-1919 experiment station data that lacked randomly assigned treatments and controls.
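Simple counting shows how quickly the experimental burden explodes.  Even modest grids over the factors involved (all the numbers below are invented for illustration) imply thousands of plot-years at a single site:

```python
# A full factorial over varieties, temperature, CO2, and practices,
# with replication, at one crop per plot per year.
varieties, temperatures, co2_levels, practices, replicates = 20, 5, 3, 4, 10
plots = varieties * temperatures * co2_levels * practices * replicates
print(plots)  # 12000 plot-years -- before adding sites, soils, or moisture
```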

I would venture to guess that these challenges in the agricultural realm likely apply to other areas as well.

So, given the challenges, the high cost, and the basic microeconomic prediction that adaptation is a small deal anyway, how much should we actually spend on adaptation versus prevention?

Wednesday, April 1, 2015

Discounting Climate Change Under Secular Stagnation


Ben Bernanke, recent former Chair of the Federal Reserve, has a new blog.  And he's writing about low interest rates and so-called secular stagnation, a pre-WWII phrase recently resurrected by Larry Summers.

The topic is dismal--hey, they're economists! But for those in the field it's a real hoot to see these titans of economic thought relieved of their official government duties and able to write openly about what they really think.

These two share many views, but Ben has a less dismal outlook than Larry.  Larry thinks we're stuck in a low-growth equilibrium, and low or even negative interest rates are here to stay without large, persistent fiscal stimulus.  Ben thinks this situation is temporary, if long lived.  He writes:
I generally agree with the recent critique of secular stagnation by Jim Hamilton, Ethan Harris, Jan Hatzius, and Kenneth West. In particular, they take issue with Larry’s claim that we have never seen full employment during the past several decades without the presence of a financial bubble. They note that the bubble in tech stocks came very late in the boom of the 1990s, and they provide estimates to show that the positive effects of the housing bubble of the 2000’s on consumer demand were largely offset by other special factors, including the negative effects of the sharp increase in world oil prices and the drain on demand created by a trade deficit equal to 6 percent of US output. They argue that recent slow growth is likely due less to secular stagnation than to temporary “headwinds” that are already in the process of dissipating. During my time as Fed chairman I frequently cited the economic headwinds arising from the aftermath of the financial crisis on credit conditions; the slow recovery of housing; and restrictive fiscal policies at both the federal and the state and local levels (for example, see my August and November 2012 speeches.)
These are good points. But then Larry has a compelling response, too.  I particularly agree with Larry about the basic economic plausibility of  persistent equilibrium real interest rates that are well below zero.  He writes:
Do Real Rates below Zero Make Economic Sense? Ben suggests not– citing my uncle Paul Samuelson’s famous observation that at a permanently zero or subzero real interest rate it would make sense to invest any amount to level a hill for the resulting saving in transportation costs.  Ben grudgingly acknowledges that there are many theoretical mechanisms that could give rise to zero rates. To name a few: credit markets do not work perfectly, property rights are not secure over infinite horizons, property taxes that are explicit or implicit, liquidity service yields on debt, and investors with finite horizons.
Institutional uncertainty seems like a big deal that can't be ignored when thinking about long-run growth and real interest rates (these are closely connected).  People are pessimistic about growth these days, for seemingly pretty good reasons.  Institutional collapse may be unlikely, but far from impossible.  Look at history.  If we think negative growth is possible, savings are concentrated at the top of the wealth distribution, and people are loss averse, it's not hard to get negative interest rates.

Still, I kind of think we'd snap out of this if we had a bit more fiscal stimulus throughout the developed world, combined with a slightly higher inflation target--say 3 or 4 percent.  But keep in mind I'm just an armchair macro guy.

The point I want to make is that these low interest rates, and the possibility of secular stagnation, greatly affect the calculus surrounding optimal investments to curb climate change.  The titans of environmental economics--Weitzman, Nordhaus, and Pindyck--have been arguing about the discount rate we should use to weigh distant future benefits against near-term costs of abating greenhouse gas emissions.  They're arguing about this because the right price for emissions is all about the discount rate.  Everything else is chump change by comparison.

Nordhaus and Pindyck argue that we should use a higher discount rate and a low price on greenhouse gas emissions.  Basically, they claim that curbing greenhouse gas emissions involves a huge transfer of wealth from the current, relatively poor generation to a future, supremely rich one.  And a lot of that conclusion comes from assuming 2%+ baseline growth forever. Weitzman counters that there's a small chance climate change will be truly devastating, causing losses so great that the future may not be as well off as we expect.  Paul Krugman has a great summary of this debate.

Anyway, it always bothered me that Nordhaus and Pindyck had so much optimism built into baseline projections.  Today's low interest rates and the secular stagnation hypothesis paint a different picture.  Quite aside from climate change, growth and real rates look lower than the 2% baseline many assume, and a lot more uncertain.  And that means Weitzman-like discount rates (near zero) make sense even without fat-tailed uncertainty about climate change impacts.
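The stakes are easy to see with a little arithmetic: the present value of a far-future damage differs by an order of magnitude or more across the rates in this debate.  (The damage figure and horizon below are invented purely for illustration, not taken from any model.)

```python
# Present value of a hypothetical $100 trillion climate damage arriving
# 100 years from now, under discount rates spanning the debate.
damage, horizon = 100e12, 100
for rate in (0.001, 0.01, 0.04, 0.06):
    pv = damage / (1 + rate) ** horizon
    print(f"r = {rate:.1%}: PV = ${pv / 1e12:.1f} trillion")
# Near-zero (Weitzman-like) rates make the damage loom enormous today;
# 4-6% rates shrink it to chump change.
```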

Monday, March 16, 2015

The Limits of Econometrics, and Roots of Modern Applied Micro

I just stumbled upon this article by David Freedman, published posthumously.  A lot of it looks familiar to me: you can find pieces of it in David Freedman's book, Statistical Models: Theory and Practice, which I highly recommend.

Perhaps I've blogged about this before, but I rather suspect that David Freedman's critical writings on the use and misuse of regression analysis formed the basis of so-called "applied micro," which grew out of Princeton University and the work of Ashenfelter, Card, Krueger, Angrist, and others.  An occasional citation will clue careful readers in to this connection, particularly the teaching of natural experiments and David Freedman's canonical example:  Snow on Cholera.  Some modern references to Snow cite Freedman; many do not.  But I'm pretty sure it was in fact Freedman who dug this seminal work out of the dustbin of history and used it to inspire an invigorated new empiricism that searches for natural experiments and rigorously tests maintained assumptions in regression models of observational data.

The one thing about Freedman that can be a little frustrating is his terseness.  There's a lot he doesn't say but only implies.  Still, he says enough to lay bare the nakedness of the heroic assumptions made in much applied econometrics.  The sins remain plentiful, even among his descendants who practice applied micro today.

Nevertheless, I think empiricism today is much better than it was twenty years ago.  I think David Freedman deserves much of the credit for that.  And I think he's still worth reading and re-reading, if we want to improve honesty in applied econometrics.


Tuesday, March 10, 2015

Buying Conservation--For the Right Price


Erica Goode has an inspiring article about the benefits of conservation tillage, which has been gaining favor among farmers.  No-till farming can improve yields, lower costs, and improve the environment.  Just the kind of thing we all want to hear--everybody wins!

One important thing Goode doesn't mention: USDA has been subsidizing conservation tillage, and these subsidies have probably played an important role in spreading adoption.

Subsidizing conservation practices like no-till can be a little tricky.  After all, while this kind of thing has positive externalities, farmers presumably reap rewards too.  There are costs involved with undertaking something new. But once the practice is adopted and proven, there would seem to be little need for further subsidies.  The problem is that it can be difficult to take subsidies away once they've been established.

In practice, the costs and benefits of no-till and other conservation practices vary.  Some of this has to do with the type of land.  No-till can be excellent for land in the Midwest with thick topsoil.  In the South, where topsoil is thin, maybe not so much.  So, for some farmers conservation practices are worthwhile; for others, the hassle may not be worth the elusive future benefits.  Ideally, policy would provide subsidies to the latter, not the former.  But how do policy makers differentiate?  In practice, they don't; everybody gets the subsidies.

Can we do better? Together with some old colleagues at USDA, I've been thinking about this question for a long time, and we recently released a report (PDF) summarizing some of the most essential ideas (here's the Amber Waves short take).

In short, yes, we can do better.  The basic idea involves a form of price discrimination, implemented through augmented signup rules.  Signups for conservation programs operate like an auction: farmers submit offers for enrollment, offers are ranked nationally, and the best offers are selected.  The problem is that, when farmers compete on a national scale, farmers happy to adopt no-till without any subsidy at all are pitted against farmers for whom the private benefits of conservation tillage are dubious.  A lot of the subsidies probably end up going to farmers who would do it anyway.

Alternatively, signups could impose a degree of local competition, such that the worst offers for any set of observable characteristics--say farms in a crop district with land of similar quality--would be rejected regardless of their national-level standing.  This kind of local competition would garner more competitive offers from no-till farmers who would use the practice even without subsidies.
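To make the selection rule concrete, here's a stylized sketch in R.  Everything in it is hypothetical (made-up districts, random offer scores, an arbitrary budget); it only illustrates how rejecting the locally worst offer changes the selected set relative to a pure national ranking:

```r
# Stylized offer selection: national ranking vs. an augmented rule that
# rejects the worst offer in each district regardless of national standing.
set.seed(7)
offers <- data.frame(
  district = rep(c("A", "B"), each = 5),  # hypothetical crop districts
  score    = runif(10, 0, 100)            # offer attractiveness; higher = better
)
n_select <- 6  # hypothetical budget: enroll 6 offers

# Pure national ranking: take the best offers overall.
national_pick <- order(offers$score, decreasing = TRUE)[1:n_select]

# Augmented rule: drop each district's worst offer outright, then rank
# the survivors nationally.
worst_in_district <- ave(offers$score, offers$district, FUN = min)
eligible <- which(offers$score > worst_in_district)
augmented_pick <- eligible[order(offers$score[eligible], decreasing = TRUE)][1:n_select]
```

The point of the augmented rule isn't the mechanical difference in the selected set; it's that the threat of local rejection should induce more competitive offers in the first place.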

It's difficult to tell how much more conservation we could buy for the taxpayer's buck using these techniques.  We can't really know without testing the mechanisms on real signups. This is where real policy experiments could add a lot of value.  Will USDA give it a try? Only time will tell...



Saturday, January 31, 2015

Asymmetric Delusions and Pragmatism


Sometimes I joke that to conservatives, the solution to every problem is to cut taxes; to liberals, the solution to every problem is to eat local.  Of course, purported panaceas on both the left and right are snake oil, even if peddled by true believers.

So, the other day I picked on my own tribe: foodies.  To be frank, I have a love-hate relationship with the movement.  It has thin and oftentimes paranoid underpinnings.  But while a lot of things advocated by the movement are illogical or scientifically baseless (like the dangers of GMOs), the movement also strikes me as mostly harmless, and sometimes even beneficial.  There are some important and disastrous exceptions, like the movement to kill golden rice, which could save the lives of millions and save millions more from blindness.

But for the most part, the movement to "eat local," and all its offspring, doesn't strike me as particularly harmful.   Here in Hawai'i there are delusional ideas that we ought to stop importing food and go back to the traditional ways of living off taro.  Obviously this isn't going to happen.  Even though GMOs were banned on some of our islands, exceptions were made for the key crops actually grown.  Still, the movement does seem to be strong enough to help protect cultural heritage, an important public good.  It also cultivates a food culture that breeds great restaurants and fresh local produce.  After all, wasn't the whole movement inspired by my former Berkeley neighbor, restaurateur extraordinaire Alice Waters? (I never met her, but lived two doors down, in an in-law studio, during grad school.)  The movement isn't going to save the planet or feed the world, but it sure makes my privileged little world a lot nicer and a little healthier.

What about delusions on the right?  Front and center would be climate change denialism.  Not far behind would be anti-Keynesianism, or the austerity movement.  As Paul Krugman reminds us every day, quite persuasively in my view, these delusions have hardly been harmless.  And in contrast to foodies, the radical right has a whole lot of power, controlling both houses of Congress and rich backing by Wall Street, the Koch brothers and friends.

So, while reasoning and herd behavior on both extremes seems equally delusional at times, the delusional right strikes me as more destructive and much more powerful.

But then, speaking honestly, I identify more with the left than the right.  So am I being too soft on the lefties?  I don't think so, but feel free to weigh in. I think lefty culture tends to be a bit more introspective than the right's.  The right doesn't tend to tease their own quite like I did the other day, not without swift excommunication.  And I think lefties are a little more susceptible to evidence and persuasion.  They also possess a visceral independence and eschew the party line.  Maybe that's what I was doing the other day.

Given the real political asymmetry here, I do feel a little guilty for picking on foodies.  But not too much.  I don't really want to go out and do a lot of research on the topic of food waste to prove how silly this stuff is, or to prove that while GMOs can have some undesirable side effects they aren't Frankenfoods. For the most part this is a fight that isn't worth fighting.  I figure it's better to focus on the things that really do matter.

But I figured it was worth a blog post, because I think we all do better when we eschew our innate tendency toward tribalism, try to figure out what's really going on, and find the most pragmatic solutions to real problems.

Wednesday, January 28, 2015

Food Waste Delusions



A couple months ago the New York Times convened a conference "Food for Tomorrow: Farm Better. Eat Better. Feed the World."  Keynotes predictably included Mark Bittman and Michael Pollan.  It featured many food movement activists, famous chefs, and a whole lot of journalists. Folks talked about how we need to farm more sustainably, waste less food, eat more healthfully and get policies in place that stop subsidizing unhealthy food and instead subsidize healthy food like broccoli.

Sounds good, yes? If you're reading this, I gather you're familiar with the usual refrain of the food movement.  They rail against GMOs, large farms, processed foods, horrid conditions in confined livestock operations, and so on.  They rally in favor of small, diversified local farms that grow food organically and raise free-range, antibiotic-free livestock.  These are yuppies who, like me, like to shop at Whole Foods and frequent farmers' markets.

This has been a remarkably successful movement.  I love how easy it has become to find healthy good eats, bread with whole grains and less sugar, and the incredible variety and quality of fresh herbs, fruits, vegetables and meat.  Whole Paycheck Foods Market has proliferated and profited wildly.  Even Walmart is getting into the organic business, putting some competitive pressure on Whole Foods. (Shhhh! --organic isn't necessarily what people might think it is.)

This is all great stuff for rich people like us. And, of course, profits.  It's good for Bittman's and Pollan's book sales and speaking engagements.  But is any of this really helping to change the way food is produced and consumed by the world's 99%?  Is it making the world greener or more sustainable?  Will any of it help to feed the world in the face of climate change?

Um, no.  

Sadly, there were few experts in attendance who could shed scientific or pragmatic light on the issues.  And not a single economist or true policy wonk in sight. Come on guys, couldn't you have at least invited Ezra Klein or Brad Plumer?  These foodie journalists at least have some sense of incentives and policy. Better, of course, would be to have some real agricultural economists who actually know something about large-scale food production and policies around the world. Yeah, I know: BORING!

About agricultural policies: there are a lot of really bad ones, and replacing them with good policies might help.  But a lot less than you might think from listening to foodies.  And, um, we do subsidize broccoli and other vegetables, fruits, and nuts.  Just look at the water projects in the West.

Let me briefly take on one issue du jour: food waste.  We throw away a heck of a lot of food in this country, even more than in other developed countries.  Why?  I'd argue that it's because food is incredibly cheap in this country relative to our incomes.  We are the world's bread basket.  No place can match California's productivity in fruit, vegetables and nuts.  And no place can match the Midwest's productivity in grains and legumes.  All of this comes from a remarkable coincidence of climate, geography and soils, combined with sophisticated technology and gigantic (subsidized) canal and irrigation systems in the West.

Oh, we're fairly rich too.  

Put these two things together and, despite our waste, we consume more while spending less on food than any other country.  Isn't that a good thing?  Europeans presumably waste (a little) less because food is more scarce there, so people are more careful and less picky about what they eat. Maybe it isn't a coincidence that they're skinnier, too.

What to do? 

First, it's important to realize that there are benefits to food waste.  It basically means we get to eat very high quality food and can almost always find what we want where and when we want it.  That quality and convenience comes at a cost of waste.  That's what people are willing to pay for.  

If anything, foodism probably accentuates the preference for high quality, which in turn probably increases waste.  The food I see Mark Bittman prepare is absolutely lovely, and that's what I want.  Don't you?

Second, let's suppose we implemented a policy that would somehow eliminate a large portion of the waste.  What would happen?  Well, this would increase the supply of food even more.  And since we have so much already, and demand for food is very inelastic, prices would fall even lower than they are already.  And the temptation to substitute toward higher quality--and thus waste more food--would be greater still.
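A back-of-the-envelope calculation shows why.  The numbers here are purely illustrative (I'm assuming an own-price elasticity, not citing one):

```r
# With inelastic demand, a small supply increase requires a large price
# decline to clear the market: %dQ = elasticity * %dP.
elasticity      <- -0.2   # assumed own-price elasticity of food demand
supply_increase <- 0.02   # suppose waste reduction adds 2% more food
price_change    <- supply_increase / elasticity
price_change    # -0.1: roughly a 10% price decline
```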

Could the right policies help?  Well, maybe.  A little. The important thing here is to have a goal besides simply eliminating waste.  Waste itself isn't the problem. It's not an externality like pollution.  That goal might be providing food for homeless or low-income families.  Modest incentive payments plus tax breaks might entice more restaurants, grocery stores and others to give food that would otherwise be thrown out to people who would benefit from it.  This kind of thing happens already, and it probably could be done on a larger scale. Even so, we're still going to have a lot of waste, and that's not all bad.

What about correcting the bad policies already in place?  Well, water projects in the West are mainly sunk costs.  That happened a long time ago, and water rights, as twisted as they may be, are more or less cemented in complex legal history.   Today, traditional commodity support mostly takes the form of subsidized crop insurance, which is likely causing some problems.  The biggest distortions could likely be corrected with simple, thoughtful policy tweaks, like charging higher insurance premiums to farmers who plant corn after corn instead of corn after soybeans.  But mostly the program just hands cash (unjustly, perhaps) to farmers and landowners.  Politicians ceasing to hand cash to farmers is about as likely as Senator James Inhofe embracing a huge carbon tax.  Not gonna happen.

But don't worry too much.  If food really does get scarce and prices spike, waste will diminish, because poorer hungry people will be less picky about what they eat.

Sorry for being so hard on the foodies.  While their hearts and forks are in the right places, obviously I think most everything they say and write is naive.  Still, I think the movement might actually do some good.  I like to see people interested in food and paying more attention to agriculture.  Of course I like all the good eats.  And I think there are some almost reasonable things being said about what's healthy and not (sugar and too much red meat are bad), even if what's healthy has little to do with any coherent strategy for improving environmental quality or feeding the world.

But perhaps the way to change things is to first get everyone's attention, and I think foodies are doing that better than I ever could.

Saturday, January 17, 2015

The Hottest Year on Record, But Not in the Corn Belt

Here's Justin Gillis in his usual fine reporting of climate issues, and the map below from NOAA, via the New York Times.


Note the "warming hole" over the Eastern U.S., especially the upper Midwest, the all-important corn belt region.  We had a bumper crop this year, and that's because while most of the world was remarkably warm, the corn belt was remarkably cool, especially in summer.

Should we expect the good fortune to continue?  I honestly don't know...

Saturday, January 10, 2015

Searching for critical thresholds in temperature effects

Update 2: Okay, I think it's fixed.
Update: I just realized the code posted badly.  I don't know why.  It looks good in the cross-post at G-FEED.  I'll try to fix.



If Google Scholar is any guide, my 2009 paper with Wolfram Schlenker on the nonlinear effects of temperature on crop outcomes has had more impact than anything else I've been involved with.

A funny thing about that paper: many reference it, and often claim to be using techniques that follow it.  But in the end, as far as I can tell, very few seem to have actually read through the finer details of the paper or tried to implement the techniques in other settings.  Granted, people have done similar things that seem inspired by the paper, but not quite the same.  Either our explication was too ambiguous or people don't have the patience to fully carry out the technique, so they take shortcuts.  Here I'm going to try to make it easier for folks to do the real thing.

So, how does one go about estimating the relationship plotted in the graph above?

Here's the essential idea:  averaging temperatures over time or space can dilute or obscure the effect of extremes.  Still, we need to aggregate, because outcomes are not measured continuously over time and space.  In agriculture, we have annual yields at the county or larger geographic level.  So, there are two essential pieces: (1) estimating the full distribution of temperatures of exposure (crops, people, or whatever) and (2) fitting a curve through the whole distribution.

The first step involves constructing the distribution of weather. This was most of the hard work in that paper, but it has since become easier, in part because finely gridded daily weather is available (see PRISM) and in part because Wolfram has made some STATA code available.  Here I'm going to supplement Wolfram's code with a little bit of R code.  Maybe the other G-FEEDers can chime in and explain how to do this stuff more easily.

First step:  find some daily, gridded weather data.  The finer the scale, the better.  But keep in mind that data errors can cause serious attenuation bias.  For the lower 48 since 1981, the PRISM data above are very good.  Otherwise, you might have to do your own interpolation between weather stations.  If you do this, you'll want to take some care in dealing with moving weather stations, elevation and microclimatic variations.  Even better, cross-validate interpolation techniques by leaving out one weather station at a time and seeing how well the method predicts it. Knowing the size of the measurement error can also help correct the resulting bias.  Almost no one does this, probably because it's very time consuming... Measurement error in weather data creates very serious problems (see here and here).
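Here's a minimal sketch of that leave-one-station-out idea, using made-up station locations and simple inverse-distance weighting (a stand-in for whatever interpolation method you actually use):

```r
# Leave-one-station-out cross-validation of an interpolator.
set.seed(1)
stations <- data.frame(x = runif(20), y = runif(20))
stations$temp <- 20 + 5 * stations$x + rnorm(20, sd = 0.5)  # fake temperatures

# Inverse-distance-weighted prediction at (x0, y0) from stations (x, y, z).
idw <- function(x0, y0, x, y, z, p = 2) {
  w <- 1 / ((x - x0)^2 + (y - y0)^2)^(p / 2)
  sum(w * z) / sum(w)
}

# Predict each station from all the others, then summarize the error.
pred <- sapply(seq_len(nrow(stations)), function(i) {
  with(stations[-i, ], idw(stations$x[i], stations$y[i], x, y, temp))
})
rmse <- sqrt(mean((pred - stations$temp)^2))
rmse  # an estimate of the interpolation error
```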

Second step:  estimate the distribution of temperatures over time and space from the gridded daily weather.  There are a few ways of doing this.  We've typically fit a sine curve between the minimum and maximum temperatures to approximate the time at each degree in each day in each grid, and then aggregate over grids in a county and over all days in the growing season.  Here are a couple R functions to help you do this:

# This function estimates time (in days) when temperature is
# between t0 and t1 using sine curve interpolation.  tMin and
# tMax are vectors of day minimum and maximum temperatures over
# range of interest.  The sum of time in the interval is returned.
# noGrids is number of grids in area aggregated, each of which 
# should have exactly the same number of days in tMin and tMax
 
days.in.range <- function( t0, t1, tMin, tMax, noGrids )  {
  n   =  length(tMin)
  t0  =  rep(t0, n)
  t1  =  rep(t1, n)
  t0[t0 < tMin]   =  tMin[t0 < tMin]
  t1[t1 > tMax]  =  tMax[t1 > tMax]
  u  =  function(z, ind) (z[ind] - tMin[ind])/(tMax[ind] - tMin[ind])  
  outside  =  t0 > tMax | t1 < tMin
  inside  =  !outside
  time.at.range  =  ( 2/pi )*( asin(u(t1,inside)) - asin(u(t0,inside)) ) 
  return( sum(time.at.range)/noGrids ) 
}
 
# This function calculates all 1-degree temperature intervals for 
# a given row (fips-year combination).  Note that nested objects
# must be defined in the outer environment.
aFipsYear  =  function(z){
  afips   = Trows$fips[z]
  ayear    = Trows$year[z]
  tempDat  = w[ w$fips == afips & w$year==ayear, ]
  Tvect = c()
  for ( k in 1:nT ) Tvect[k] = days.in.range(
              t0   = T[k]-0.5, 
              t1   = T[k]+0.5, 
              tMin = tempDat$tMin, 
              tMax = tempDat$tMax,
              noGrids = length( unique(tempDat$gridNumber) )
              )
  Tvect
}

The first function estimates time in a temperature interval using the sine curve method.  The second function calls the first, looping through a bunch of 1-degree temperature intervals defined outside the function.  A nice thing about R is that you can be sloppy and write functions like this that use objects defined outside the function's environment. A nice thing about writing the function this way is that it's amenable to easy parallel processing (see the 'foreach' and 'doParallel' packages).
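As a quick sanity check on the first function (its body is copied from above so this snippet runs on its own): a day that swings from 10 to 30 should spend exactly one day in the interval [10, 30], and complementary sub-intervals should partition the day:

```r
# Copy of days.in.range from above, so this check is self-contained.
days.in.range <- function(t0, t1, tMin, tMax, noGrids) {
  n  <- length(tMin)
  t0 <- rep(t0, n)
  t1 <- rep(t1, n)
  t0[t0 < tMin] <- tMin[t0 < tMin]
  t1[t1 > tMax] <- tMax[t1 > tMax]
  u <- function(z, ind) (z[ind] - tMin[ind]) / (tMax[ind] - tMin[ind])
  inside <- !(t0 > tMax | t1 < tMin)
  time.at.range <- (2 / pi) * (asin(u(t1, inside)) - asin(u(t0, inside)))
  sum(time.at.range) / noGrids
}

days.in.range(10, 30, tMin = 10, tMax = 30, noGrids = 1)  # 1: the whole day
# Complementary intervals partition the day:
days.in.range(10, 20, tMin = 10, tMax = 30, noGrids = 1) +
  days.in.range(20, 30, tMin = 10, tMax = 30, noGrids = 1)  # 1
```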

Here are the objects defined outside the second function:

w       # weather data that includes a "fips" county ID, "gridNumber", "tMin" and "tMax".
        #   rows of w span all days, fips, years and grids being aggregated
 
tempDat #  pulls the particular fips/year of w being aggregated.
Trows   # = expand.grid( fips.index, year.index ), rows span the aggregated data set
T       # a vector of integer temperatures.  I'm approximating the distribution with 
        #   the time in each degree in the index T

To build a dataset call the second function above for each fips-year in Trows and rbind the results.
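In code, that assembly step looks something like this (aFipsYear is stubbed out with random numbers here so the pattern runs on its own; in practice it's the function above):

```r
# One row of temperature-bin exposure per fips-year.
Trows <- expand.grid(fips = c(19001, 19003), year = 2000:2001)

# Stub standing in for the aFipsYear function above: 46 bins for 0-45 C.
aFipsYear <- function(z) runif(46)

TemperatureData <- cbind(
  Trows,
  do.call(rbind, lapply(seq_len(nrow(Trows)), aFipsYear))
)
dim(TemperatureData)  # 4 fips-years x (2 id columns + 46 temperature bins)
```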

Third step:  to estimate a smooth function through the whole distribution of temperatures, you simply choose your functional form, linearize it, and then cross-multiply the design matrix with the temperature distribution.  For example, suppose you want to fit a cubic polynomial to temperature bins that run from 0 to 45 C.  The design matrix would be:

D = [  0     0      0
       1     1      1
       2     4      8
       ...
      45  2025  91125 ]

These days, you might want to do something fancier than a basic polynomial, say a spline. It's up to you.  I really like restricted cubic splines, although they can oversmooth around sharp kinks, which we may have in this case. We have found piecewise linear works best for predicting out of sample (hence all of our references to degree days).  If you want something really flexible, just make D an identity matrix, which effectively gives a dummy variable for each temperature bin (the step function in the figure).  Whatever you choose, you will have a (T x K) design matrix, with K being the number of parameters in your functional form and T = 46 (in this case) temperature bins.

To get your covariates for your regression, simply cross multiply D by your frequency distribution.  Here's a simple example with restricted cubic splines:


library(Hmisc)
DMat = rcspline.eval(0:45, inclx=TRUE)  # inclx=TRUE keeps the linear term in the spline basis
XMat = as.matrix(TemperatureData[,3:48])%*%DMat
fit  = lm(yield~XMat, data=regData)
summary(fit)

Note that regData has the crop outcomes.  Also note that we generally include other covariates, like total precipitation during the season,  county fixed effects, time trends, etc.  All of that is pretty standard.  I'm leaving that out to focus on the nonlinear temperature bit. 

Anyway, I think this is a cool and fairly simple technique, even if some of the data management can be cumbersome.  I hope more people use it instead of just fitting to shares of days with each maximum or mean temperature, which is what most people following our work tend to do.  

In the end, all of this detail probably doesn't make a huge difference for predictions.  But it can make estimates more precise and confidence intervals narrower.  And I think that precision also helps in pinning down mechanisms.  For example, I think it helped us figure out that VPD and associated drought was a key factor underlying observed effects of extreme heat.
