Thursday, August 30, 2012

Media Release: Where is it Cheapest to Cut Carbon?

We have a media release out about our Australian Journal of Agricultural and Resource Economics article "Where in the world is it cheapest to cut carbon emissions?"

I actually liked an earlier version of this, where we extended the "low-hanging fruit" analogy, which really encapsulates our idea:

"Dr Pezzey said that cutting emissions could be compared to picking apples: “We’d expect there to be more ‘low-hanging fruit’ - low cost or easy options for cutting emissions - in countries that have had less aggressive energy efficiency policies and are therefore more emissions-intensive like the US or Australia.” he said.

“But the more fruit there is below a given height, the bigger the total crop is. A common global carbon price is like an agreement on how high up the tree each country should go to harvest emission cuts. So the total cost of such a policy should be higher in an emissions-intensive country, as it requires a larger total cut in its emissions.”"

But it's hard to get too many ideas into a very short piece.
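
For anyone who wants the analogy in rough numbers, here is a minimal back-of-the-envelope sketch with made-up linear marginal abatement cost (MAC) curves; the numbers are purely illustrative and are not taken from the paper or the release:

# Toy illustration: two countries facing a common carbon price.
# Marginal abatement cost is assumed linear, MAC(a) = slope * a, so at price p
# a country abates a* = p / slope and its total cost is the area under the MAC
# curve, p**2 / (2 * slope). A flatter curve (more "low-hanging fruit") means
# more abatement and, under a common price, a larger total bill.

def abatement_and_cost(price, mac_slope):
    """Abatement and total cost under a common carbon price with a linear MAC."""
    abatement = price / mac_slope                   # abate until MAC equals the price
    total_cost = 0.5 * mac_slope * abatement ** 2   # area under the MAC curve
    return abatement, total_cost

carbon_price = 30.0  # hypothetical common price per tonne
for country, slope in [("emissions-intensive (flat MAC)", 0.5),
                       ("energy-efficient (steep MAC)", 2.0)]:
    a, c = abatement_and_cost(carbon_price, slope)
    print(f"{country}: abates {a:.0f} units at a total cost of {c:.0f}")

The flat-curve country cuts four times as much and ends up with the larger total cost, even though each individual cut is cheaper at the margin, which is the point of the apple-picking analogy.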

Saturday, August 25, 2012

What Happened to the US Sulfur Emissions Market?

An interesting paper from Schmalensee and Stavins on the market for sulfur emissions in the US. This market is often cited as an example of the success of environmental economics, and specifically of cap and trade, and I have used it that way in my teaching in the past. But what has happened more recently? The authors describe it as "ironic". After a period of relative stability that was often pointed to as a great success, the price of permits first sky-rocketed and then collapsed:



Essentially, the George W. Bush administration tried to tighten the cap on emissions significantly, pushing the price up. In reaction, the EPA said it would reconsider the new rules, which started the collapse in price, and then the courts overturned the regulations. A couple of years ago the Obama administration proposed state-based emissions caps with only limited interstate trading, and the market stopped trading entirely.

Schmalensee and Stavins still think that the program was a success, if an "ironic" one. I think this shows just how volatile emissions trading programs are, and how likely it is that changes made in response to price spikes will threaten or destroy the program. I think that the volatility of the European Union carbon trading market tells a similar story and increasingly shows that carbon taxes are the way to go. I used to be in favor of emissions trading because of the supposed certainty it would give on emissions reductions, but real-world experience shows that this could be a case of the perfect being the enemy of the possible.

Thursday, August 23, 2012

Ingelfinger Rule

Until I read this blogpost I didn't know this had a name: "The Ingelfinger Rule". Apparently, Franz Ingelfinger was the editor of the New England Journal of Medicine. In 1969 he ruled that authors could not be published in his journal if they had engaged in pre-publication or even media publicity around their findings before their article appeared in print. This was for purely monopolistic reasons. But it is a rule that has become very entrenched across the biomedical and broader biological academic journal world. It has had no influence at all in the world of economics, where prepublication of working papers is the norm. Occasionally, economists engaged in interdisciplinary work find that they can't get published because they weren't aware of these very different cultural rules.

Journal Selectivity, Impact Factor, and Citations

Following the debate on Elsevier and open access publication, I have noticed increasing rhetoric about journal impact factors. A journal impact factor is the mean number of citations received in a given year by the articles the journal published over a number of preceding years (two years in the standard measure). It has long been known that this is a rather imperfect indicator of the citation potential of an article published in that journal, as the distribution of citations received by articles in any given journal is very dispersed and skewed. A few star articles often get the majority of citations and many more articles are cited little or not at all.
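
To make the skewness point concrete, here is a minimal sketch with invented citation counts for the articles one journal published over the standard two-year window; the numbers are made up purely for illustration:

import statistics

# Invented citation counts for the articles a journal published in the two
# preceding years, i.e. the window used for the standard two-year impact factor.
citations = [0, 0, 0, 1, 1, 1, 2, 2, 3, 3, 4, 5, 6, 8, 12, 15, 25, 40, 75, 120]

impact_factor = sum(citations) / len(citations)   # the impact factor is just this mean
median_citations = statistics.median(citations)   # what the typical article gets

print(f"Impact factor (mean citations): {impact_factor:.1f}")
print(f"Median article:                 {median_citations:.1f}")

The mean comes out several times higher than the median, because the handful of star papers drag it up.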

There is a correlation between a journal's impact factor and the number of citations individual articles in the journal receive, though it is low. Obviously, it is better to evaluate papers by the citations they get themselves. The implicit assumption is that citations accrue very slowly in the social sciences, and especially in economics, so that citations are not useful for evaluating recent research; hence either impact factors or costly secondary peer review are used instead.

We are having an internal debate at the Crawford School about whether to use metrics, and which metrics to use, in allocating research funding within the school. So I have been wondering: if impact factors are only weak predictors of citations, are they more strongly correlated with something else? That something else could be the selectivity of journals, a.k.a. the rejection rate. It seems very likely that there is a strong correlation between rejection rate and impact factor.

The problem in testing this is that there seems to be only limited data on rejection rates. I did a very quick search and came up with some information. de Marchi and Rocchi found a significant partial correlation (p=0.04) between rejection rate and impact factor for a sample of 72 journals from all disciplines that responded to their survey. For the ecology journals that Aarssen et al. looked at, the correlation between rejection rate and impact factor was 0.687, though there are some low-impact journals with quite high rejection rates:



The authors argue that this may be because these low-IF journals are flooded with low-quality papers that they have to reject. A high rejection rate among a pool of low-quality submissions isn't the same thing as a high rejection rate among a pool of higher-quality submissions. Still, IF does seem to be a better predictor of journal selectivity than of the citations an individual article will receive, and on the whole we might expect papers that have passed a more selective review process to be of higher quality.
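
Out of curiosity, here is a toy simulation of the distinction being drawn here; all the numbers and functional forms are invented purely for illustration and are not estimated from any of the studies mentioned:

import random

def pearson(x, y):
    """Simple Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

random.seed(1)

# Invented journals: selectivity (the rejection rate) loosely drives the
# impact factor at the journal level...
journals = []
for _ in range(50):
    rejection_rate = random.uniform(0.2, 0.95)
    impact_factor = max(0.1, 0.5 + 4 * rejection_rate + random.gauss(0, 0.5))
    journals.append((rejection_rate, impact_factor))

journal_level = pearson([r for r, _ in journals], [f for _, f in journals])

# ...but citations to individual articles are drawn from a highly skewed
# distribution, so the journal's IF is a weak predictor of any one article.
article_ifs, article_cites = [], []
for _, impact_factor in journals:
    for _ in range(40):
        article_ifs.append(impact_factor)
        article_cites.append(random.lognormvariate(0, 1.2) * impact_factor)

article_level = pearson(article_ifs, article_cites)

print(f"Journal level: corr(rejection rate, IF)        = {journal_level:.2f}")
print(f"Article level: corr(journal IF, own citations) = {article_level:.2f}")

In this toy setup the journal-level correlation comes out strong while the article-level one stays weak, which is all the point amounts to: selectivity is a property of journals, while citations are a property of individual papers.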


Friday, August 17, 2012

When to Put Out a Working Paper?

In the past I usually put out a working paper version of my research at the same time as submitting the paper to a journal. I assumed that if I thought the paper was ready to be refereed then it was ready to be read by a broader audience too. When I was looking for a job I thought that it was important to get stuff onto my CV as fast as possible. I still usually sent papers, and especially single-authored papers, to colleagues for comment before getting to this stage. But some discussions with colleagues, reading blogs, and noting that some working papers are in fact more or less identical to the published version have made me rethink this practice.

Some colleagues reported that they either wait for the first round of referee comments from a journal before putting up a working paper or wait for final acceptance at a journal.

So I have recently been following a mix of these two strategies, which is one reason why I haven't put up many working papers lately. But I'm still not sure what the best strategy is.

Thursday, August 16, 2012

ResearchGate 3.0

I've blogged a couple of times about ResearchGate and set up a profile there earlier in the year. At the time, I wasn't impressed with the coverage of the social sciences. Recently, I've noticed a flurry of activity on ResearchGate, with a number of colleagues putting up profiles and publications. You can see diffusion in action through the social network. They have also improved coverage of the social sciences, including assigning points based on impact factor to journals listed in the Social Science Citation Index. Of course, it would make more sense to use citations rather than impact factors, but the latter are much easier to obtain...

So far, I am still finding it less useful than academia.edu or even LinkedIn. I don't really find any of them terribly useful, though occasionally I have been led to an interesting paper...

Wednesday, August 15, 2012

CAMA Moves to Crawford School

As of today, CAMA has moved to the Crawford School from the ANU College of Business and Economics. Warwick McKibbin has already moved into an office just down the corridor from me. I think this is a great development for the Crawford School: it will increase our strength in macroeconomics and finance as well as concentrate all ANU environmental and energy economists in Crawford. Bringing CAMA and its network to Crawford should also strengthen our research profile and ranking. Several of us are already research associates in CAMA and I am looking forward to increased collaboration opportunities.

Sunday, August 5, 2012

RePEc Citation Profiles

RePEc's coverage of citations is improving rapidly. It seems Elsevier has agreed to allow RePEc to use the citations in Elsevier papers. This has increased the number of my citations in RePEc dramatically, as a lot of the citations to my work appear in Energy Economics and Ecological Economics. I just found that you can now get a citation profile, complete with graphs, for anyone registered on RePEc. This joins the profiles available from Google, Scopus, Web of Science, and Microsoft. Of course, it is not as comprehensive as the first three of those services, but it is probably a lot more accurate than the last.