David Stern's Blog on Energy, the Environment, Economics, and the Science of Science
Friday, December 31, 2010
iPad Pricing
Apple charges AUD 130 more for the 32GB iPad than for the 16GB iPad, yet a 16GB flash memory card can be bought for as little as AUD 45. The premium in the US is USD 100. Interestingly, the premium for a 64GB iPad relative to a 32GB iPad is also AUD 130/USD 100, while the lowest price I found for a 32GB flash drive was AUD 89. It looks like Apple makes much more profit on the higher-memory versions than on the base model, and that this is a form of price discrimination enabled by the fact that you cannot add flash memory to the iPad yourself. I think this can be analyzed as a case of "tying": Apple requires that you buy the extra memory from it, much as a movie theatre bans food apart from what it sells itself. This would be a good case study or exercise for a microeconomics course.
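A back-of-envelope sketch in Python makes the point, using the Australian prices above. Since the retail card price surely overstates Apple's component cost, the implied margin is a lower bound:

```python
# Implied margin on Apple's 16GB-to-32GB upgrade, using the AUD prices above.
# The AUD 45 retail card is an upper bound on Apple's cost of 16GB of flash,
# so the margin computed here is conservative.
upgrade_premium = 130    # AUD: 32GB iPad price minus 16GB iPad price
retail_flash_16gb = 45   # AUD: cheapest 16GB memory card found at retail

print(f"Implied margin: at least AUD {upgrade_premium - retail_flash_16gb}")
print(f"Markup over retail flash: {upgrade_premium / retail_flash_16gb:.1f}x")
```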
More on iPad pricing from the Economist.
Thursday, December 30, 2010
Flash Drives vs. Memory Cards
Since writing my previous post about memory devices, I tested a (probably class 2) SanDisk SD card from a digital camera against my Lexar flash drive using the free XBench software. I used my MacBook Pro for the tests, just plugging the SanDisk card into the SD slot on the laptop. Here are the results. First the flash drive:
And here is the memory card:
Data can be read faster off this flash drive than off the memory card, but writing to the memory card is faster than to the flash drive. This greater uniformity of read-write speeds seems to be a feature of memory cards. These speeds are reflected in real-world use: it takes a lot longer to copy files to the flash drive than to copy them from it. The MacBook's hard drive is mostly faster than either portable device:
Tuesday, December 28, 2010
SD Memory Cards
In the last couple of years I have used a USB flash drive like this:
as my primary computer data storage, and I've used the hard drives on my laptop and office computer as data back-up and as the location for the operating system and applications. This means that I can easily transport all my data from office to home and back without having to copy heaps of files back and forth and remember which ones I've updated, and without having a hard drive dangling off my laptop. The applications etc. are the same in both locations and all the data goes with me. The Lexar 16GB flash drive pictured is fast and almost indestructible. But I'm rapidly running out of space and they don't make a 32GB model, and I can't find another small, fast, and robust drive with a 32GB capacity. So I'm thinking of using something like this in future instead:
It's an SD card, more commonly used in cameras etc. MacBooks and iMacs both have SD card slots, and according to Apple there should be no problem even installing the operating system on such a card. For computers without a slot, card readers can be bought very cheaply.
So are there any drawbacks to this idea?
Monday, December 27, 2010
Marginal Cost Curve for Crude Oil
Nice figure of the marginal cost curve for crude oil:
It's included in a post on the Oil Drum by David Murphy. Of course, reality is a bit more complicated than that and Murphy's article doesn't say that this is a marginal cost curve, but it does give a rough idea. Krugman is also on board for peak oil.
Getting Around the Great Firewall
As you can see, because Blogger is blocked in China, Stochastic Trend gets no visits from there:
I am sure some people visit using a VPN, but I've come up with an alternative solution: once a month I could post the source code of my blog to my website. I don't know whether the images, which are all still hosted by Google, will be visible, and all the links in the sidebar that go back to Blogger will be blocked, but at least the text will be accessible.
If you are in China and reading this, it would be great to hear from you!
Friday, December 17, 2010
Marginal CO2 Abatement Cost Curves from EMF22
These are my first estimates of the marginal abatement cost curves for the four main regions, based on the results of the EMF22 exercise. Here I have flipped the graph back 90 degrees again. This is the private marginal cost of abating fossil and industrial emissions of CO2, using market exchange rates. The EU is the most expensive region for small cuts in emissions and India the cheapest, but for extreme cuts India is the most expensive. These results will look different if we measure cost differently. Note also that this meta-analysis finds very high carbon prices for large cuts in emissions; the typical numbers thrown around of $20 per tonne only apply to very small cuts. But the cost of a given emissions reduction declines over time as technology progresses.
Thursday, December 16, 2010
Marginal CO2 Abatement Cost Curve for the US
So, I turned the graph 90 degrees and replaced tonnes of abatement by percent and got this:
The percentage is emissions relative to business as usual: 100% means there is no abatement and 0% means there is 100% abatement. Yes, some models end up with negative emissions. These are just fossil-fuel/industrial emissions of CO2, and the models assume that we will be burning biomass and sequestering the carbon. This looks like a logistic curve, especially if we drop the negative emissions data.
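If the relationship really is logistic, it is straightforward to fit. A sketch, with invented placeholder points standing in for the EMF22 output (after dropping the negative-emissions observations):

```python
# Fitting a declining logistic curve to (carbon price, % of BAU emissions).
# The data points below are invented placeholders, not the EMF22 results.
import numpy as np
from scipy.optimize import curve_fit

def logistic(price, k, p0):
    """Emissions as a % of business as usual, declining in the carbon price."""
    return 100.0 / (1.0 + np.exp(k * (price - p0)))

price = np.array([0.0, 50.0, 100.0, 200.0, 400.0, 800.0])  # $/t CO2 (invented)
emissions = np.array([92.0, 88.0, 82.0, 62.0, 18.0, 1.0])  # % of BAU (invented)

(k_hat, mid_hat), _ = curve_fit(logistic, price, emissions, p0=[0.01, 250.0])
print(f"steepness = {k_hat:.4f}, midpoint price = ${mid_hat:.0f}/t CO2")
```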
Wednesday, December 15, 2010
IEA Data No Longer Available at NLA
I'm reliably told that access to online IEA data at the National Library of Australia is no longer available. Maybe ANU ought to think about subscribing given how much climate research is going on at the university?
Tuesday, December 14, 2010
Marginal Abatement Cost Curve for China
The chart plots the carbon price in US dollars (market exchange rate) against Gt of CO2 abated for China, using the results from seven of the EMF22 models. This is not really a cost curve as it includes data for different time periods and model scenarios. I've been struggling to model this data on and off over the last 10 days or so, but this is actually the first time I've plotted it like this. It's always helpful, I think, to look at your data.
Fitting a straight line to this chart would imply that carbon prices are an exponential function of abatement. This is not a bad fit but something a bit more non-linear would probably work better. Another way of representing the data is to plot the carbon price against the percentage reduction in emissions:
Again, this appears to be roughly linear but a nonlinear curve would fit better.
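For reference, the straight-line reading corresponds to a log-linear regression; a sketch with invented numbers in place of the EMF22 points:

```python
# A straight line on this chart implies ln(price) is linear in abatement,
# i.e. price = exp(a) * exp(b * abatement). Data below are invented.
import numpy as np

abatement = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0])        # Gt CO2 (invented)
price = np.array([5.0, 12.0, 30.0, 150.0, 600.0, 2500.0])   # $/t (invented)

b, a = np.polyfit(abatement, np.log(price), 1)   # slope, intercept
print(f"price is roughly {np.exp(a):.1f} * exp({b:.2f} * abatement)")
```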
Monday, December 13, 2010
Polar View
Some nice maps from GISS showing temperature anomalies for November in the Arctic and Antarctic:
It's clear from these that, though temperatures were low in NW Europe, they were much higher than normal across Arctic Canada and Russia. For more information, visit Climate Progress.
Sunday, December 12, 2010
The Green Paradox
The green paradox is the idea that a policy intended to reduce global warming could instead accelerate the use of fossil fuels, because resource owners extract more while their fossil fuels are still valuable. Of course, an actual cap on emissions should avoid the green paradox, but some other policies might lead to one in theory. Quentin Grafton, Tom Kompas, and Ngo Van Long have written a paper titled "Do Biofuel Subsidies Reduce Greenhouse Gas Emissions?" that argues that subsidies to increase biofuels could plausibly accelerate the use of fossil fuels and, therefore, climate change.
Importantly, the paper assumes that the production of biofuels does not itself use fossil fuels. In fact, biofuel production requires large inputs of fossil fuels, to the point that there is controversy over whether some biofuels yield any net energy at all. So Grafton et al.'s paper gives an additional reason why encouraging biofuel production might be a bad idea, besides biofuels' low energy return on investment and their displacement of food production.
Grafton et al. assume that extraction costs of fossil fuels are constant and that biofuels and fossil fuels are perfect substitutes. The supply of biofuels is increasing in the net price received by producers (market price plus subsidy). Given these assumptions, the price of fossil fuels rises at the rate of interest. This results in a tractable problem where at some point in time the fossil fuel price rises high enough that all fuel demand is supplied by biofuels. Fossil fuel producers must, therefore, extract all their resources by that time. Under certain conditions, the greater the subsidy to biofuel production, the sooner that time comes and the faster fossil fuels are extracted. In other words, the supply response from fossil fuel producers overwhelms the demand-side substitution towards biofuels.
Making the reasonable assumption that overall demand for energy does not fall to zero at any finite price, they show that the green paradox is plausible for reasonable parameter values under both competitive and monopoly extraction of fossil fuels.
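A minimal numerical sketch of the mechanism, under assumptions in the spirit of the paper (zero extraction cost, competitive producers so the price rises at the rate of interest, and a takeover price at which biofuels capture the whole market, which the subsidy lowers). All parameter values are invented:

```python
# Hotelling-style sketch of the green paradox. All numbers are invented.
# The fossil fuel price rises at the interest rate up to the date T at which
# biofuels take over; a bigger biofuel subsidy lowers the takeover price,
# which pulls T forward and so accelerates extraction.
import numpy as np
from scipy.optimize import brentq

r, stock = 0.05, 1000.0                   # interest rate, fossil fuel stock

def demand(p):                            # linear demand curve (invented)
    return np.maximum(100.0 - p, 0.0)

def cumulative_extraction(T, p_takeover):
    t = np.linspace(0.0, T, 2000)
    p = p_takeover * np.exp(-r * (T - t))   # price path ending at p_takeover
    return np.trapz(demand(p), t)

def exhaustion_date(subsidy):
    p_takeover = 80.0 - subsidy           # subsidy lowers the takeover price
    return brentq(lambda T: cumulative_extraction(T, p_takeover) - stock,
                  1.0, 500.0)

for s in (0.0, 10.0, 20.0):
    print(f"subsidy {s:4.0f}: fossil fuel exhausted by T = {exhaustion_date(s):.1f}")
```

The larger the subsidy, the earlier the exhaustion date, which is the paradox in one line of output.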
Linking Book Chapters to Your Website
With Google Books it's now possible to provide links to your book chapters online without posting your own pdf of the chapter. For example:
Stern D. I. (2004) The environmental Kuznets curve, in: P. Safonov and J. Proops (eds.) Modelling in Ecological Economics, Edward Elgar, Cheltenham.
I already provide links from my publication list to RePEc for all my journal articles in RePEc (one way to increase RePEc hits) and to the DOI for articles not in RePEc. There are only a couple of pages missing from the version of my chapter on Google Books.
Paul Burke's Graduation
I went along to Paul Burke's graduation as Doctor of Philosophy at ANU on Friday. As you can see I was in "civilian clothing" and sat in the audience. Here the hooding ceremony is performed by the Chancellor of the University (Gareth Evans) rather than by the graduate's PhD adviser,* as is the practice in the US. Actually, I hadn't been to a graduation in Australia before and there were some other interesting traditions involving a mace and "beadle's staves". As you can see from the picture there are some differences in academic dress between the US and the Commonwealth countries too.
Anyway, congratulations to Paul!
* I was on Paul's committee but not his primary adviser.
Thursday, December 9, 2010
OCEAN-OIL
Boston University, Louisiana State University, and the National Council for Science and the Environment (NCSE) have created a resource that will allow you to explore questions regarding the causes, magnitude and consequences of the Deepwater Horizon disaster, as well as to contribute your own expertise.
The Online Clearinghouse for Education & Networking: Oil Interdisciplinary Learning (OCEAN-OIL) is an open-access, peer-reviewed electronic education resource about the Deepwater Horizon disaster. OCEAN-OIL is funded by the National Science Foundation.
The OCEAN-OIL website is seamlessly integrated into the Encyclopedia of Earth, which is "a free, peer-reviewed, searchable collection of content about the Earth, its natural environments, and their interaction with society, written by expert scholars and educators" (EoE Editor-in-Chief Cutler Cleveland).
In order to contribute your expertise to this initiative, follow the procedures outlined here.
For more information contact Mallory Nomack.
NCSE is holding a special one-day symposium on the Gulf of Mexico, on what is necessary for ecological and economic recovery, as well as the broader issues of offshore oil drilling. Former EPA Administrator William Reilly and Senator Bob Graham, co-chairs of the National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling, will keynote the symposium, which will kick off NCSE's 11th National Conference on Science, Policy and the Environment: Our Changing Oceans, January 19-21, 2011 in Washington, DC.
"Elasticities of Substitution and Complementarity" to be published in Journal of Productivity Analysis
My paper Elasticities of Substitution and Complementarity has been accepted by the Journal of Productivity Analysis. The paper surveys the various definitions of the "elasticity of substitution" and puts them into a framework that explains their relationships and purposes. It includes the new definition of the Hicks Elasticity of Substitution based on the input distance function, as well as definitions of elasticities based on revenue functions and a discussion of Pigou's work on the topic, which went uncited for decades.
Wednesday, December 8, 2010
Box Plots in Excel
Strangely, Excel does not have a chart type for box plots. In general, its graphing capabilities are not that great. You can make an approximation to a box plot using some obscure options which are quite hidden away. I used these instructions to come up with this:
Not quite as pretty as the example in Wikipedia, but probably good enough.
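For comparison, outside Excel this is a one-liner; a sketch in Python with matplotlib, using made-up data:

```python
# Box plots without the Excel workarounds. The data are invented.
import matplotlib.pyplot as plt

data = [[2.1, 2.5, 2.8, 3.0, 3.3, 4.1],   # hypothetical series A
        [1.8, 2.2, 2.9, 3.5, 3.9, 5.0]]   # hypothetical series B

fig, ax = plt.subplots()
ax.boxplot(data, labels=["A", "B"])   # box, whiskers, and median by default
ax.set_ylabel("value")
plt.show()
```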
Google E-Books
I got an e-mail from Edward Elgar Publishing this morning announcing that they are joining Google's e-books initiative, which was launched yesterday and is so far only available in the US. It seems to be yet another new format, but one that will be available on many devices (unlike Amazon's Kindle format).
Monday, December 6, 2010
Proposal Tips
Actually, the article is titled "How to Fail in Grant-Writing". It's kind of funny. Well, some of it is. I'm re-working my (unsubmitted) ARC proposal from last year right now...
Sunday, December 5, 2010
IPCC Position Available
The IPCC is looking for a Programme Manager, Communications and Media Relations, based in Geneva.
CCEP Debuts on RePEc at 17th in Australia
The Centre for Climate Economics and Policy, which was just recently launched, enters the RePEc ranking for Australian economics institutions at 17th (roughly the top 14%). CCEP is a network of researchers working on climate issues, directed by Frank Jotzo of the Crawford School at ANU. We also have a working paper series on RePEc (which I am administering), and a conference/workshop is planned for early next year.
CCEP is ranked equally with the National Centre for Social and Economic Modelling at the University of Canberra and one place below long-established economics departments, including the Arndt-Corden Department of Economics at ANU,* and research groups such as John Quiggin's at the University of Queensland.
We also rank 51st in the world for energy economics and 58th for environmental economics.
* Arndt-Corden is now part of the Crawford School at ANU. The other departments ranked 16th are also subdivisions of higher ranked academic units.
Saturday, December 4, 2010
What is Business as Usual for China and India?
My paper with Frank Jotzo in Energy Policy argued that while India's goal of cutting emissions intensity by 25% between 2005 and 2020 was likely to be similar to the business as usual reduction in emissions, China's goal was much more ambitious. China aims to reduce emissions intensity by 40-45% over this time frame, while we estimated it would decline by 24% under business as usual.
By contrast, many commentators argued that China's goal was just business as usual. This was because China's strong policies to reduce energy and carbon intensity were already included in standard scenarios.
I am now revising my paper with Ross Lambie on where it is cheapest to cut carbon emissions. We will use the results of the 22nd Energy Modelling Forum (EMF22) in this revised version. So I was curious what the business as usual scenarios developed by the participating models said about China and India in the 2000-2020 period:
On average they predict a 25% reduction in emissions intensity in China from 2000 to 2010, increasing to a 27% reduction in 2010-2020. We estimated 1% and 15% reductions in these periods under BAU. There is no way that China will end up with a 25% reduction from 2000-2010. Emissions intensity rose from 2000 to 2005 and China is struggling to achieve its goal of reducing energy intensity by 20% from 2005 to 2010. Our estimates for India are pretty close to the EMF averages. The results also show a large variation in the scenarios. There is a lot of uncertainty about what is BAU.
China might achieve a 27% reduction in emissions intensity relative to 2010 by 2020 (the average given by the EMF22 models above). But it will be the result of policy action, not business as usual.
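Intensity cuts across periods compound multiplicatively rather than adding; a quick check with the EMF22 averages quoted above:

```python
# Successive emissions-intensity cuts compound multiplicatively: 25% over
# 2000-2010 followed by 27% over 2010-2020 is a 45% cut over 2000-2020, not 52%.
def compound(*cuts):
    """Combine successive fractional intensity cuts into one overall cut."""
    level = 1.0
    for cut in cuts:
        level *= 1.0 - cut
    return 1.0 - level

print(f"2000-2020 reduction: {compound(0.25, 0.27):.1%}")   # about 45%
```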
Wednesday, December 1, 2010
CCEP Working Papers Now on RePEc
You can now download CCEP Working Papers from RePEc. Since launching the series with six papers we have added a further paper by Leo Dobes: "Notes on Applying ‘Real Options’ to Climate Change Adaptation Measures, with Examples from Vietnam".
The Role of Energy in Economic Growth
A few months ago I serialized a paper I was revising on the role of energy in economic growth. I didn't include all the material in the paper and it wasn't serialized in order. The paper will be appearing next year in Ecological Economics Reviews, an annual volume of the Annals of the New York Academy of Sciences.
A working paper version of the paper is now available in the new working paper series of the Centre for Climate Economics and Policy.* The paper is an extensively revised and updated version of my 2004 working paper on the topic. That turned out to be my most downloaded and most cited working paper but only saw publication in a shortened form in the Encyclopedia of Energy. So I thought an update with journal publication would be a good idea! The main novelty in the new paper is the synthesis section which presents a very simple unified growth theory model. I have another working paper that will appear shortly that provides much more detail on that topic, so will reserve comment until then.
*The paper is also available on SSRN as my first contribution to the USSEE/IAEE Working Paper Series.
Tuesday, November 30, 2010
Greg Combet
Today Greg Combet, Minister for Climate Change and Energy Efficiency, spoke at the Crawford School on "Australia in a climate changed world – Moving forward to Cancún and beyond". Frank Jotzo and Carolyn Hendricks gave follow-up presentations on international and domestic aspects of the current climate change policy debate.
The Australian is emphasizing his reiteration of the 5% unconditional target and the strong conditions he put on Australia adopting the 15% target. He said that the latter would require verifiable restraint in emissions from developing countries and action by developed countries similar to Australia's. The US at the moment doesn't seem likely to take such action, and China is resisting the kind of verifiability he probably wants. He said the 25% target could be adopted only with a legally binding international agreement to limit greenhouse gas concentrations to 450 ppm or less.
He seems to be a pretty good speaker (better than some politicians I've encountered) and laid things out in a very clear fashion. He also answered extensive questions in a very forthright manner. I was also impressed by his emphasis on carbon pricing as the main tool throughout his speech and answers, and by the blame he placed on the Greens for not passing the CPRS. He said they were now being given a second chance. He didn't understand their opposition, given that the level of abatement can always be raised in the future by reducing the cap. Given the nature of the Australian economy, he didn't think lobbyists had an undue influence on the design of the CPRS (there was less compensation than proposed in the US and implemented in Europe, from my understanding, so I'd agree on that). He also claimed that the 5% emissions reduction proposed by Australia was a greater percentage cut in per capita emissions than Europe's 20% cut.
Saturday, November 27, 2010
Highly-Cited Papers are More Likely to Cite Highly-Cited Papers
An interesting paper in PLOS ONE analyzed all papers published in 2003 that are included in the intersection of the Scopus and ISI databases. The authors find that the papers most cited in the following five years devote a larger share of their references to other highly cited papers than lower-ranked papers do. This is the figure they give for the life sciences:
The black curve shows that more than 50% of the references in the 1% most cited papers were to papers that were also in the 1% most cited category. However, the green curve shows that less than 20% of the references in papers in the bottom half of the citation distribution were to papers in the 1% most cited group. The effects were less dramatic in the physical and social sciences.
This isn't so surprising in retrospect but it's nice to see the data. The authors claim that this shows that innovative researchers "stand on the shoulders of giants" as Newton said.
At least a couple of other things could be going on:
1. Papers in small subfields or on speciality topics, which won't get a huge number of citations, cite other papers in their subfield or topic which also aren't highly cited. Think of economic history within economics, for example: no economic history journal has a high impact factor.
2. Top researchers are better at deciding which papers are important and are worthy of citation than are weaker researchers.
Also from PLOS ONE: referees suggested by authors rate papers better than referees suggested by editors and open access papers get cited more.
A Clarification, 14 Years On...
Maybe it is a bit late for this, but quite a few papers I get sent for review cite our 1996 paper in World Development as a rationale for estimating an environmental Kuznets curve model for a single country. We wrote:
"We believe a more fruitful approach to the analysis of the relationship between economic growth and environmental impact would be the examination of the historical experience of individual countries, using econometric and also qualitative historical analysis." (p. 1159)
As the preceding sentence in the paper ("they will need to take the form of structural models, rather than reduced form equations of the EKC type") makes clear, we didn't think that EKC models were very useful (though I know that "structural models" is rather vague). To the degree that they are useful, they need to be estimated with a representative sample of countries, and getting consistent estimates turns out not to be so straightforward.*
What we were thinking of in terms of individual countries is a proper analysis of what policies lead to reductions in pollution and what factors drive the adoption of those policies. For example, why was Japan one of the first countries to sharply cut sulfur emissions? What can we learn from this as we address our current issues with climate change or biodiversity protection? There is still plenty of scope for research of that sort.
* The simple between estimator seems to work quite well in this paper but that's not always the case.
"We believe a more fruitful approach to the analysis of the relationship between economic growth and environmental impact would be the examination of the historical experience of individual countries, using econometric and also qualitative historical analysis." (p. 1159)
As should be obvious from the preceding sentence "they will need to take the form of structural models, rather than reduced form equations of the EKC type" it's clear we didn't think that EKC models were very useful (though I know that "structural models" is rather vague). To the degree that they are useful they need to be estimated with a representative sample of countries and getting consistent estimates turns out to not be so straightforward.*
What we were thinking of in terms of individual countries is a proper analysis of what policies lead to reductions in pollution and what factors drove the adoption of those policies. For example, why was Japan one of the first countries to sharply cut sulfur emissions? What can we learn from this as we address our current issues with climate change or biodiversity protection? There is still plenty of scope for research of that sort.
* The simple between estimator seems to work quite well in this paper but that's not always the case.
Sunday, November 21, 2010
EEN Symposium on Monday
The EEN Symposium at the Crawford School starts on Monday. I'm giving my presentation at 1:30pm. The slides are here. For more details on the Symposium please visit the Crawford School website.
Thursday, November 18, 2010
Call for Papers
The Journal of Industrial Ecology has a call for papers on the topic of "Greening Growing Giants". Quoting from the call:
"Questions relevant to this special issue include but are not limited to: What quantities of resources will be required globally in the near future, given the current dynamics of per capita resource use in developing countries? What role does the demand in industrial countries, and international trade, play in raising consumption levels in emerging economies? How much CO2 and/or pollutants will be produced through the resource use? What will be the technological potential for the reduction of resource demand and emissions? What are alternative development paths with low materials consumption and low emissions? What is the role of innovative practices at local level, especially in cities, in achieving alternative, more sustainable development pathways? What kind of policies, tools and practices are effective in achieving alternative development pathways?"
Deadline is 30 April 2011.
"Questions relevant to this special issue include but are not limited to: What quantities of resources will be required globally in the near future, given the current dynamics of per capita resource use in developing countries? What role does the demand in industrial countries, and international trade, play in raising consumption levels in emerging economies? How much CO2 and/or pollutants will be produced through the resource use? What will be the technological potential for the reduction of resource demand and emissions? What are alternative development paths with low materials consumption and low emissions? What is the role of innovative practices at local level, especially in cities, in achieving alternative, more sustainable development pathways? What kind of policies, tools and practices are effective in achieving alternative development pathways?"
Deadline is 30 April 2011.
Wednesday, November 17, 2010
Madsen et al.: Four centuries of British economic growth: The roles of technology and population
In a paper forthcoming in the Journal of Economic Growth,* Jakob Madsen et al. test the ability of alternative endogenous growth theories to explain the British Industrial Revolution. They conclude that Schumpeterian growth theory can explain the data while "semi-endogenous growth theory" cannot. Madsen recently won an ARC fellowship to pursue this research further.
Interestingly, when the change in coal production is added to the regression for labor productivity growth as a control variable, its effect is found to be insignificant. Naturally, I find this surprising. What is being measured, though, is the effect of energy use controlling for a bunch of innovation variables, and the expansion of coal use required extensive innovation. So maybe this isn't so surprising. Also, the data are annual first differences, which will likely reduce the size of the effect found.
*A free version of the paper is available here.
Tuesday, November 16, 2010
Did Incomes Grow in Pre-Industrial England?
This is a current point of contention among economic historians. Gregory Clark thinks that in the long run they did not, though they fluctuated considerably. In the wake of the Black Death incomes were high due to the increase in land per worker, and they subsequently fell as population grew, until eventually rising again as the industrial revolution approached. I think this much is agreed; the question is how high incomes were in the late 14th century. Clark thinks they were just as high as at the beginning of the 19th century. Gunnar Persson deems this "The Malthus Delusion". As an interesting aside, I was amazed at how popular the name John was in 14th century Essex:
I would have thought it was popular, but not this popular.
Recent Papers of Interest in Ecological Economics:
van den Bergh, J. C. J. M., Environment versus growth — A criticism of “degrowth” and a plea for “a-growth”, Ecological Economics.
van den Bergh makes a much more extensive version of the main argument I made in my review of Tim Jackson's book Prosperity without Growth. There are no policy levers that can directly stop growth, and stopping growth might not be what is needed to solve environmental problems anyway. It makes much more sense to implement direct policies on resource use, environmental quality etc. van den Bergh calls this "a-growth": forget about growth per se, as well as de-growth, as policy targets, and aim at achieving the things we actually want to achieve.
Henriques, S. T. and A. Kander, The modest environmental relief resulting from the transition to a service economy, Ecological Economics.
This paper expands Kander's previous study of dematerialization in Sweden to a group of 13 countries. That study showed that it was an illusion that a shift to the service sector had helped dematerialize the economy. Rather, rapid productivity gains in the industrial sector had both reduced energy use and reduced the share of manufacturing in GDP, due to the fall in the price of manufactured goods relative to services. (The trend of rising service prices relative to manufacturing prices due to productivity gains in manufacturing is known as Baumol's disease.) They explain the new study in the abstract:
"A service transition is supposed to lead to the decline of energy intensity (energy/GDP). We argue that this interpretation is overly optimistic because the shift to a service economy is somewhat of an illusion in terms of real production. Several recent studies of structural effects on energy intensity have made the error of using sector shares in current prices, combined with GDP in constant prices, which is inconsistent and ignores the different behaviour of prices across sectors. We use the more correct method of sector shares in constant prices, and make an attempt to single out the effect from the real service transition by using two complementary methods: shift share analyses in current and constant prices, and Logarithmic Mean Divisia Index (LMDI) for 10 developed and 3 emerging economies. A service transition is rather modest in real terms. The major driver of the decline in energy intensity rests within the manufacturing sector. Meanwhile, the transition to a service sector had a small downward impact on energy intensity in 7 of the developed countries (and no impact in the others). For emerging economies like Brazil, Mexico and India, it is the residential sector that drives energy intensity down because of the declining share of this sector as the formal economy grows, and as a consequence of switching to more efficient fuels."
Article Level Metrics
The Public Library of Science is providing an Excel file with details of citations and page views for all 18,000+ articles that it has published so far. I'm not sure what to do with it, apart from noting that the most viewed article is "Why Most Published Research Findings Are False". It's only the 7th most cited article, though; the winner by that metric is "Human MicroRNA Targets". The correlation between Scopus citations and page views is only 0.43. Of course, RePEc provides similar data on abstract views, downloads, and citations, but it would be hard or impossible to line the two datasets up.
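Computing that correlation from the file takes a few lines in pandas; the file name and column headers below are my guesses, so substitute the actual headers from the spreadsheet:

```python
# Citation/page-view correlation from the PLOS article-level metrics file.
# The file name and column headers are hypothetical placeholders.
import pandas as pd

df = pd.read_excel("plos_article_level_metrics.xls")
corr = df["Scopus citations"].corr(df["Total page views"])
print(f"Pearson correlation: {corr:.2f}")   # the post quotes about 0.43
```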
Thursday, November 4, 2010
ARC Releases Proposed Changes to Discovery Program
They are looking for feedback on the proposals, but are planning to put them in place in time for the next round of Discovery applications in around four months' time. The main changes, from what I gather, are:
1. A new separate fellowship for early career researchers (ECRs).
2. The regular Discovery scheme will no longer take any special note of ECR applications.
3. "there will be revised application processes to reduce complexity, remove current restrictions on teaching relief and enable research-based creative practice proposals to be eligible."
4. All Discovery grants will be limited to three years full time ARC funding.
5. The current professorial fellowships will be replaced by a greater number (up to 70) of 2-3 year fellowships targeted at mid to late career academics.
6. "There will be greater emphasis on the assessment of the research proposal." I guess that means less emphasis on track records?
Saturday, October 30, 2010
Seminar 9th November
Just a reminder that next Tuesday, 9th November, at 2:00pm I will be giving a seminar at the Arndt-Corden Division of Economics on the role of energy in long-run economic growth. I'll be happy to send you a copy of my paper if you e-mail me. We hope to get it up on the web in a formal working paper series soon, at which point I will blog about it. Abstract and details are here.
Thursday, October 28, 2010
CCEP Working Papers Launches
Yesterday, CCEP: Centre for Climate Economics and Policy at ANU was launched. The Centre has around 30 associates. The majority are from outside ANU and several are from outside Australia too. We already have the first working papers up on the website.
Tuesday, October 26, 2010
EEN Symposium: 22-24 November
The EEN Symposium will take place between 22nd and 24th November at the Crawford School at ANU. The symposium will showcase the results of the research carried out by the Environmental Economics Research Hub as well as presentations of 14 invited papers from outside researchers. My presentation is at 1:30pm on Monday, 22nd November.
Registration is free! You can find the full program here.
Monday, October 25, 2010
ARC Funding Outcomes Announced
The Australian Research Council announced its decisions on applications for Discovery and Linkage grants today. Congratulations to Frank Jotzo and Peter Wood on the success of their application. Also, congratulations to my colleague in the Arndt-Corden Department of Economics, Prof. Athukorala, whose proposal with Peter Robertson, "Sustaining India's economic transformation: challenges, prospects and implications for Australia and the Pacific region", was also funded. Also of interest given the topics of this blog:
Jakob Madsen won an Australian Professorial Fellowship for "The great divergence, long-run growth and unified theories of economic growth."
David Pannell, John Rolfe, Michael Burton, and Jessica Meeuwig received funding for their Linkage project: "Do scientist and public preferences diverge? Analysing expert and public preferences for environmental and social outcomes for the Swan River."
Congratulations!
Sunday, October 24, 2010
Porsche Develops Hybrid Technology
I blogged about the development of hybrid cars by BMW and Mercedes when I was visiting Munich. My point was that fuel economy standards were forcing luxury car makers to adopt hybrid technology. I saw this as a route to wider adoption of hybrid technology in mass-market cars. Non-luxury hybrids seem so far to appeal only to "green consumers" willing to pay a premium for lower fuel consumption. Anyway, the New York Times has an article on new hybrid systems developed by Porsche.
One is a hybrid version of the Cayenne SUV with a 35kW electric motor and a 250 kW petrol engine. Urban fuel economy is improved from 16 mpg to 21 mpg. Highway fuel economy only improves by 10%. Then there is a racing car that uses a flywheel to store energy from braking which can then be used to power a generator. It has two 60 kW electric motors driving the front wheels in addition to the 360 kW petrol engine. The third system is a concept car that has a 375 kW petrol engine and front and rear electric motors that can produce a total of 164 kW. For comparison, the current standard V6 Ford Falcon has a 195 kW engine.
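As an aside, the urban improvement is bigger than the mpg numbers suggest once converted to fuel per distance, which is what tracks fuel actually saved (235.215 is the standard US mpg to L/100km conversion factor):

```python
# The quoted urban fuel economy gain, in percentage and fuel-per-distance terms.
urban_before, urban_after = 16, 21               # mpg, from the figures above
print(f"urban mpg gain: {urban_after / urban_before - 1:.0%}")   # about 31%

def l_per_100km(mpg):                            # US mpg to litres per 100 km
    return 235.215 / mpg

saved = l_per_100km(urban_before) - l_per_100km(urban_after)
print(f"fuel saved in town: {saved:.1f} L/100km")   # about 3.5 L/100km
```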
There is an increasing diversity of body plans out there, which is a positive sign in the development of a new technology.
Friday, October 22, 2010
Tests for Non-Linear Cointegration
It took more than 25 years from the discovery of cointegration for someone to come up with general tests of cointegration in nonlinear regression models. Choi and Saikkonen published a paper on the topic in the June issue of Econometric Theory. One place where this might be relevant is, of course, the environmental Kuznets curve, where Martin Wagner argued that standard cointegration methods could not be applied to a model that included powers of the explanatory variables. Wagner has a paper with Seung Hyun Hong on just this topic. But a lot of standard models, such as the translog consumer demand model, involve non-linear functions. So this is a very useful advance.
Thursday, October 21, 2010
Survey Paper on Estimating Consumer Demand Systems
If you are looking for a nice survey paper on estimating static consumer demand systems (I was), Apostolos Serletis and William Barnett put one out a couple of years ago in the Journal of Econometrics. It's a nicely organized paper that should be understandable to anyone who's done the basic graduate-level microeconomics and econometrics courses. In other words, it is really approachable compared to most papers published in the Journal of Econometrics :)
The beginning of the article reviews basic neoclassical consumer demand theory. Following that, parametric approaches to estimating demand systems are covered: the Rotterdam model, flexible functional forms such as the translog and AIDS, and "semi-non-parametric" forms such as the Fourier approach. Next on the agenda are sections on revealed preference and on Engel curves. The final sections cover estimation issues, theoretical regularity (to what degree estimated demand functions meet the restrictions of neoclassical theory) and econometric regularity (mainly a discussion of non-stationarity, i.e. data with stochastic trends).
The only additional issue I would want the paper to cover, of course, is the question of how to interpret the estimated elasticities - are they estimates of short-run or of long-run elasticities - and how reliable the estimates from any single study are. The former is affected largely by the type (time series, cross section etc.) and properties (stationary, non-stationary etc.) of the data and the latter by sample size as I discuss for industrial interfuel substitution elasticities in my forthcoming paper and in my recent paper on estimating the emissions-income elasticity.
Why You Should Have a Blog
(if you are an academic)
The vast majority of hits on my website that originate with Google are from people entering keywords closely related to my name. By contrast, Google Analytics shows that almost no-one looking for my name arrives at my blog. So the blog attracts an audience that would be unlikely to arrive at my website and check out my publications on the topics I write about here. In the past I did have a bit more content on my website, but the blog now has more than 300 articles for search engines to hit, on a wide range of topics. So I think it has been pretty successful in getting my opinions and expertise out to people who are interested, and much more successful than I think the website alone would ever have been. There isn't as much crossover of visitors from one site to the other as I would have liked, but then most of the links about my research on both sites go to RePEc etc. rather than to the sister site.
Wednesday, October 20, 2010
Index Numbers and Consistency in Aggregation: Part II
This post gets even more technical than the last. I'm just blogging about what I'm reading in the course of my research. I read a whole bunch more papers on index numbers, which got more and more technical. The bottom line is that for most applications the chain Fisher index is an appropriate index to use.
An index is superlative if it is exact for an aggregator function (e.g. a production function) that can provide a second-order differential approximation to an arbitrary twice differentiable, linearly homogeneous function. A second-order differential approximation is one where the level, first derivatives, and second derivatives of the two functions are equal at the point of approximation.
Diewert (1978) shows that the Vartia I index differentially approximates a superlative index as long as prices and quantities are strictly positive, with the approximation taken at a point where prices and quantities do not change between the periods. What this means is that for relatively small changes in prices and quantities the Vartia I index will give very similar results to superlative indices like the Törnqvist index and the Fisher index.
The nature of superlative indices themselves means that “chained” indices are always preferable to non-chained indices. A chain index is one where the index is computed for each year (or whatever is the smallest available gap between datapoints) and the product of those annual indices is used as the time series of the index over time even if we only want the change over a much longer period.
Diewert (1978) goes on to show that chained superlative indices will yield close to consistent aggregation for relatively small changes in prices and quantities. Diewert (1978) also shows in an empirical appendix that chained Vartia, Törnqvist, and Fisher indices produce almost identical results, and that two-stage aggregation produces almost the same results as one-stage aggregation for the Törnqvist and Fisher indices.
An additional advantage of the Fisher index over logarithmic indices, such as the discrete Divisia index (also known as the Törnqvist index), is that it can easily handle the introduction of new goods, as zero values pose no problem for it. One way to deal with new goods for the Törnqvist or Vartia I indices is to compute the price index assuming that the price of a new input was the same before its introduction as in the year of its introduction, and then find the quantity index as the ratio of total value to the price index.
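A sketch of a chained Fisher price index illustrating the point about new goods; the numbers are invented, and the third good has zero quantity before its introduction:

```python
# Chained Fisher price index. Zeros are harmless: a good with zero quantity
# before its introduction simply drops out of the inner products.
# Data are invented; rows are years, columns are goods.
import numpy as np

prices = np.array([[1.0, 2.0, 3.0],
                   [1.1, 2.1, 3.0],
                   [1.2, 2.3, 2.8]])
quantities = np.array([[10.0, 5.0, 0.0],   # third good not yet introduced
                       [11.0, 5.0, 2.0],
                       [12.0, 6.0, 3.0]])

def fisher_link(p0, p1, x0, x1):
    """One-period Fisher index: geometric mean of Laspeyres and Paasche."""
    laspeyres = (p1 @ x0) / (p0 @ x0)
    paasche = (p1 @ x1) / (p0 @ x1)
    return np.sqrt(laspeyres * paasche)

# Chain the year-on-year links into a level series starting at 1.
index = np.cumprod([1.0] + [
    fisher_link(prices[t], prices[t + 1], quantities[t], quantities[t + 1])
    for t in range(len(prices) - 1)])
print(index)
```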
References
Diewert, W. E. (1978) Superlative index numbers and consistency in aggregation, Econometrica 46(4): 883-900.
Tuesday, October 19, 2010
Index Numbers and Consistency in Aggregation: Part I
There are many formulae for the index numbers used in economics to compute price and quantity indices, such as a consumer price index or a volume index of imports. The Laspeyres, Paasche, Divisia, and Fisher indices are the best known. A body of theory examines the criteria that can be used to decide which formula to use in a particular application. One important property is consistent aggregation. Say that consumers purchase four categories of goods and services: education, health care, food, and clothing. First compute a price index for goods, using the quantities and prices of food and clothing, and one for services, using the quantities and prices of education and health care; then compute a consumer price index from the resulting goods and services price indices. If this index is the same as a consumer price index computed directly from the four original commodities, the index formula is said to exhibit consistent aggregation.
Another important property is "Fisher's factor reversal test". Compute a price index for a group of commodities relative to a base year, as well as the corresponding quantity index. If the product of these two indices equals the ratio of total value or cost in the second period relative to the first, the index formula passes Fisher's factor reversal test.
Vartia (1976) proposed a formula that passes both these tests, dubbed the Vartia I index. The Vartia I index for the change in prices between period 0 and period 1 is:

$$\ln P(p^1, x^1, p^0, x^0) = \sum_i \frac{L\left(p_i^1 x_i^1,\, p_i^0 x_i^0\right)}{L\left(\sum_j p_j^1 x_j^1,\, \sum_j p_j^0 x_j^0\right)} \ln \frac{p_i^1}{p_i^0}$$

where superscripts refer to the two time periods, the $p_i$ are the prices and the $x_i$ the quantities of each of the commodities indexed by $i$, and $p$ and $x$ are the vectors of prices and quantities. $L()$ is the logarithmic mean function, defined by:

$$L(a, b) = \frac{a - b}{\ln a - \ln b} \;\; (a \neq b), \qquad L(a, a) = a$$
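To illustrate both properties numerically, here is a small Python sketch (my own toy numbers, not from Vartia's or Diewert's papers) implementing the formula above via the logarithmic mean. It checks that the price and quantity indices multiply exactly to the value ratio (factor reversal) and that computing the index in two stages over subgroups reproduces the single-stage result (consistent aggregation).

import numpy as np

def logmean(a, b):
    # logarithmic mean: L(a, b) = (a - b) / (ln a - ln b), with L(a, a) = a
    a, b = np.asarray(a, float), np.asarray(b, float)
    return np.where(np.isclose(a, b), a, (a - b) / (np.log(a) - np.log(b)))

def vartia1(rel, v0, v1):
    # Vartia I log-change index: log relatives weighted by
    # L(v1_i, v0_i) / L(sum of v1, sum of v0)
    w = logmean(v1, v0) / logmean(np.sum(v1), np.sum(v0))
    return np.sum(w * np.log(rel))

# four commodities, two periods (invented data)
p0 = np.array([1.0, 2.0, 3.0, 4.0]); p1 = np.array([1.1, 2.1, 3.3, 4.2])
x0 = np.array([10.0, 8.0, 6.0, 4.0]); x1 = np.array([9.0, 8.5, 6.2, 3.8])
v0, v1 = p0 * x0, p1 * x1  # commodity values in each period

logP = vartia1(p1 / p0, v0, v1)  # price index
logQ = vartia1(x1 / x0, v0, v1)  # quantity index uses the same weights

# factor reversal test: P * Q equals the value ratio exactly
print(np.exp(logP + logQ), np.sum(v1) / np.sum(v0))

# consistent aggregation: the two-stage index equals the one-stage index
groups = [slice(0, 2), slice(2, 4)]
logP_g = np.array([vartia1(p1[g] / p0[g], v0[g], v1[g]) for g in groups])
V0_g = np.array([np.sum(v0[g]) for g in groups])
V1_g = np.array([np.sum(v1[g]) for g in groups])
print(np.exp(vartia1(np.exp(logP_g), V0_g, V1_g)), np.exp(logP))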
But Vartia's index isn't perfect. Another desirable property of index functions is that a quantity index for the ratio of aggregate quantities in the second period relative to the first should equal the ratio of the values of a production function evaluated at the input quantities of the second period and the first.* An index with this property is called an exact index.
Diewert (1978) shows that Vartia's index is exact only for the Cobb-Douglas production function. This is disappointing, as the Cobb-Douglas function imposes an elasticity of substitution of one between each pair of inputs rather than letting the data speak, which is quite restrictive. So maybe Vartia's index isn't as ideal as he thought?
References
Diewert, W. E. (1978) Superlative index numbers and consistency in aggregation, Econometrica 46(4): 883-900.
Vartia, Y. O. (1976) Ideal log-change index numbers, Scandinavian Journal of Statistics 3: 121-126.
* Similar relationships exist for the price index and the unit cost function and for utility functions etc.
Monday, October 18, 2010
Launch of Centre for Climate Economics and Policy
The launch of the new Centre for Climate Economics and Policy at ANU will be on 27th October following the Asian Climate Change Policy Forum. The new centre will be directed by Frank Jotzo. There will be a working paper series, which will take over from the EERH Working Papers in the area of climate change.
Writing and Publishing Tips from Nature
Very good advice (almost all of which I follow myself) from Nature on writing and publishing.
My only caveat is that in economics there is a real trade-off between getting published in reasonable time and getting published in the top journals. The top journals have very slow review processes and very high rejection rates. Not all of them use the "desk reject" system used by top natural science journals like Nature and Science, though some do.* If top journals take a year to review a paper and accept fewer than 10% of submissions, versus three months and acceptance rates of 30-50% at typical lower-ranked journals, it is a real question which it makes sense to submit to. This is especially the case for people on the job market who want to get some publications onto their CV quickly, but also for authors of policy-oriented articles who need to publish before the issues change significantly. If you just want to get your paper to its audience, it can make sense to send it to second-tier journals (those ranked by the ARC as A journals). It will be cataloged in the Web of Science and Scopus and regarded as a reliable paper by most potential readers. But it won't help as much in getting a job or promotion as a paper in a top-ranked journal,** and some readers might think it a less reliable source and, therefore, be less likely to read and cite it. If you have a paper that you think might be publishable in a top journal, check whether that journal has a desk-reject policy.
* At Ecological Economics we do use the desk-rejection system (and we're a "second-tier" journal). Papers that are either very weak or unrelated to the topics we are interested in publishing on will likely be desk rejected. At the moment only a minority of papers are rejected without being sent to referees. Often an associate editor like me decides whether to reject the paper.
** At the most elite departments in the US (maybe UK?) articles in a second-tier journal could be a negative, especially for new PhDs. If the only thing on your CV is a second-tier article, search committees are likely to downgrade their evaluation of you. It could be better to have no publications and a PhD from a top program. If you are coming from a low-ranked program you should get publications on your CV somewhat irrespective of quality.
Sunday, October 17, 2010
Causes of the Demographic Transition
Oded Galor has made many contributions to growth theory and population economics and the connections between them. A new working paper examines various economic theories of the causes behind the demographic transition. In Galor's terminology the demographic transition refers specifically to the decline in fertility rates and population growth. This is the third phase in the conventional demographic transition model.
Galor gives the background of each theory, the facts it predicts, and whether those facts match reality. It is a very useful survey. The hypotheses he rejects on the evidence are that the fall in fertility was caused by:
1. Rising income in the early Industrial Revolution - a theory he associates with Gary Becker.
2. The decline in infant and child mortality.
3. The development of capital markets, which reduced the need to have children to support oneself in old age - the old-age security hypothesis.
The ones for which he finds support are (not surprisingly given his previous work on these topics):
1. The rise in demand for human capital resulting in a trade-off between child quantity and child "quality".
2. The decline in the gender gap in human capital and wages.
Energy Efficiency Report Part II
Reading through the report, I find that it comes to conclusions similar to mine on Australia's track record on energy efficiency. The main goal is a 30% reduction in Australia's energy intensity by 2020, which implies a reduction of 2.6% per year. Since 1980, energy intensity has declined by 1.3% per year, so the target is fairly ambitious in seeking to double the historical rate.
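For readers who want to check the arithmetic (the base year here is my assumption, as I haven't verified the report's exact baseline): a 30% fall in energy intensity at a constant annual rate $r$ over $T$ years requires

$$(1 - r)^T = 0.7 \quad\Rightarrow\quad r = 1 - 0.7^{1/T},$$

so a mid-2000s base year ($T \approx 13$-$14$) gives $r \approx 2.5$-$2.7\%$ per year, consistent with the quoted 2.6%.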
The centrepiece policy recommendation is to broaden the energy efficiency schemes that currently exist in NSW, Victoria, and South Australia into a national energy efficiency certificate scheme. Credits would be generated by investments that increase energy efficiency and could then be sold to energy suppliers, who would be obligated to improve the energy efficiency of their customer base. An interesting feature of this proposal is that it reduces the "split incentives" faced by renters and landlords. Landlords are often reluctant to improve energy efficiency because they won't gain the benefits of the energy cost savings, while renters are ill-informed about the energy costs of alternative rental properties and so don't choose where to live and how much rent to pay on that basis. In very tight rental markets there is often little choice about where you can live anyway.* Under the certificate scheme the landlord can sell the credit while the renters gain from the cost savings. Under an energy tax the incentives remain as asymmetric as they are now.
A problem with such schemes is that they seem to ignore the rebound effect. However, rebound effects are usually much less than 100%.
* See search markets
Saturday, October 16, 2010
Report of the Prime Minister's Task Force on Energy Efficiency
The report of this group, commissioned during Kevin Rudd's period as prime minister, was released about a week ago. I gave a presentation to some members of the team earlier this year on my work comparing Australian energy efficiency to that of other countries, so I was particularly interested to see what they came up with. One interesting point for academic economists is that the extensive reference list in the report includes hardly any references to the academic energy economics literature. There are no references to journals like Energy Policy or Energy Economics. There are several to a special issue of the journal Energy Efficiency, which dealt with the energy efficiency certificates that the Task Force came out in favor of. I wonder whether this is because what is published in these journals is too esoteric or irrelevant, or because the group just didn't have time to look through that material. It's got to be a bit depressing for people who publish on energy policy to see a review that barely looks at most of what has been published academically on the topic. Also, the advisory group to the task force included representatives of industry and NGOs but no academic researchers.
I'll have more on the report soon. Henry Ergas doesn't like it.
Wednesday, October 13, 2010
John Ioannidis
For those of you interested in meta-analysis, the Atlantic has an interesting article on John Ioannidis (needs a subscription, I think?). I've written previously about him and his paper "Why Most Published Research Findings Are False". This article gives more color about him and his research group.
On a related note, you can now get my article on meta-analysis of interfuel substitution for free at Journal of Economic Surveys. This seems an odd publication model to me. Why give away the paper for free before assigning it a journal issue and page numbers and then put it behind the paywall? The New York Times used to follow a similar online model where archived material cost money but the current issue was free online. But in that case people looking for old articles wanted very specific information and were probably more willing to pay for it than someone looking for current news who could just go to another website.
Monday, October 11, 2010
Tips for Choosing a Title for a Paper
I'm having a harder time than usual in deciding on a title for our latest paper. For some reason none of the alternatives I have seem good. So I looked on the web for some ideas and the following seem to be the key useful ones:
1. Make sure the main keywords are in your title. You may think that your abstract will handle that; Google Scholar thinks otherwise.
2. Shorter is better than longer, subject to condition 1.
3. Active language is better than passive. One site I encountered favored the opposite, resulting in really boring titles.
4. Put the most important idea first in the title - this was not something I had really thought of. That means I have to choose the most important idea :)
Now I have even more potential titles than before!
Sunday, October 10, 2010
The Story of Climate Change Legislation in the Obama Administration
Peter Wood pointed out this story in the New Yorker, which chronicles the history of the so far failed attempts by the Obama Administration and various senators to legislate on climate change policy.
Saturday, October 9, 2010
Is the Drought Over?
As you can see from the above graph, after a long period of standing at about 50% capacity, Canberra's dams are now at around 80% capacity. All the dams in the western catchment in the Brindabella-Namadgi mountains are near capacity. Googong to the east, which accounts for half the total capacity, is only at 60%, but that too is an improvement. As we flew into Canberra on Thursday we could see that the eastern half of Lake George is now full of water. That's the most I've ever seen. I suppose that is about a 1m depth. The historic shoreline is at 2m depth. The prehistoric shoreline, reached within the last 1,000 years, is at 17m! So far things seem to be following the prediction in one of my first blogposts.
Image of Lake George in August from Wikipedia
Thursday, October 7, 2010
Back Home
I'm finally back from my trip to Europe (mainly work) and Asia (mainly family visit/vacation). For an Australian, the only country we visited that seemed expensive overall was Denmark. Sweden no longer seems the terribly expensive country it once was, though the Big Mac index disagrees. Thailand, our last stop, is of course way cheap, but I noticed that drinks at Starbucks don't cost much less than in the US. In general, restaurant meals ranged from 1/6 (food court in a cheap mall) to 1/3 (waiter-service restaurant, though some are more like 1/5-1/4) of Australian prices for the same quality of service. Blogging might continue to be sparse until I am fully up to speed here again. The downside of going on vacation in careers like academia is that the work doesn't go away; it just piles up for when you get back.
Tuesday, September 14, 2010
LEGS
I attended a meeting of the LEGS Platform, which is funding my visit to Lund. LEGS stands for Long-term Energy Growth and Sustainability.
As the website says: "It [is] a new platform for inter-faculty and interdisciplinary knowledge creation and exchange at LUSEM (Lund University School of Economics and Management). It was founded March 2009 on the basis of a strategic decision by the vice chancellor of Lund University. The focus is on economic aspects of energy systems, which face the challenge of accomplishing large technological shifts in order to mitigate harmful climate change. To encourage shifts towards sustainability it is necessary not only to promote technical innovations, but also to learn more about firms under competitive pressure and new institutional settings."
The program is funded at SEK 1 million per year and is also a basis for seeking further grants. They are interested in bringing more visitors to Lund. We discussed various ways of raising visibility, including conferences and workshops, and I suggested starting a working paper series.
Friday, September 10, 2010
Generosity
Peter Martin reports on a generosity index. It combines data on charitable giving (which is highest in the US) with volunteering of time and willingness to help strangers. In combination, Australia comes out on top. It's also worth checking out some of the countries with very low scores. One of these is China. We often hear about a lack of trust in China, and these numbers bear that out.
Wednesday, September 8, 2010
Another Push Towards a Directly Elected Executive President?
Back in June when Rudd was ousted, I suggested that in the long term that move could help push Australia towards adopting an executive presidency once the Republic was back on the agenda. Yesterday's emergence of a Gillard Labor government with a one-seat majority, backed by two relatively conservative independent MPs, is another nudge in that direction, I think. People commenting in the media over this period have tended to go on about the Australian people electing the prime minister and government, when of course the system does no such thing. But when there is a chance to completely reform the system, direct election of the prime minister or president may be exactly what happens.
Tuesday, September 7, 2010
Endogenous Price of Lighting Services
Another potential issue with the Tsao and Waide and Tsao et al. work on the demand for lighting is that the price of lighting services is to some degree endogenous. If so, estimates of a model of the sort suggested in my previous post will be subject to simultaneity bias. So why might the price be endogenous?
First, there are the usual reasons: we are not separating shifts of the supply curve from shifts of the demand curve at all. Demand for lighting might affect the prices of the energy sources used to supply it and the prices of lamps, though lighting accounts for only a few percent of energy use, so the effect on energy prices might not be that important. A more fundamental problem is that in many periods more than one lighting technology has been available; at the moment there are incandescent bulbs, fluorescent lights, emerging solid-state lighting, etc. The efficiency with which energy is converted to light depends on the mix of these technologies, so the cost of "lighting services" as defined by Tsao et al. depends on that mix. But the mix of technologies is chosen by consumers. That seems to be a problem, I think.
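To illustrate the worry, here is a stylized Python simulation (entirely hypothetical numbers and functional forms, not taken from any of these papers). Households with stronger demand for light shift toward more efficient technology, which lowers their effective cost of lighting services, so the cost regressor is correlated with the demand shock and OLS misstates the price elasticity:

import numpy as np

rng = np.random.default_rng(1)
n = 5000
income = rng.normal(10.0, 0.5, n)    # log GDP per capita (hypothetical)
u = rng.normal(0.0, 0.3, n)          # unobserved demand shock
cost_exog = rng.normal(1.0, 0.4, n)  # exogenous component of log cost
# high-demand households adopt more efficient technology, lowering their
# effective cost of lighting services: cost responds negatively to u
log_cost = cost_exog - 0.5 * u
log_light = income - log_cost + u    # true price elasticity is -1

X = np.column_stack([np.ones(n), income, log_cost])
beta = np.linalg.lstsq(X, log_light, rcond=None)[0]
print(beta)  # the log_cost coefficient comes out well below -1 (biased)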
Sunday, September 5, 2010
Tsao & Waide: "The World’s Appetite for Light"
Following up yesterday's post, I looked at the study that provided the background for the paper discussed there:
Jeffrey Y. Tsao and Paul Waide (2010) The World's Appetite for Light: Empirical Data and Trends Spanning Three Centuries and Six Continents, LEUKOS 6(4): 259-281.
This paper gives a figure of per capita light consumption against GDP divided by the cost of light.
I drew the thick black line on the chart, which represents an alternative curve fit. I know it is very crude, but I couldn't draw a nice smooth curve with my software. I'm not saying that this is a best-fit curve, just that it is a vaguely plausible alternative. If something like this model were fitted, the implied rebound effect would be less than 100%. Tsao and Waide do not test alternatives to their linear model, which assumes an income elasticity of plus one and a price elasticity of minus one for lighting demand. I think more research in this area is definitely warranted, though I applaud Tsao and Waide's pioneering attempt to bring together different sources of data and begin the empirical analysis.
Whatever the truth, the Economist's proposal to just stick with incandescent lighting is wrong. Even if there were no energy savings from introducing solid-state lighting, as Tsao et al. point out, people would have more lighting services for a given energy input. There are environmental impacts to having too much outdoor light, but these should be addressed separately, not by halting technological progress. If regulation limited the amount of outdoor light, then these innovations would result in energy savings.
Saturday, September 4, 2010
Will the Adoption of Solid State Lighting Lead to an Increase in Energy Use?
The Economist discusses an article in Journal of Physics D: Applied Physics by Tsao et al. on the effects of the adoption of solid-state (i.e. LED) lighting (SSL) on global energy use. The Economist argues that it would be better to keep incandescent bulbs as a result. This seems a bit crazy. The literature on the rebound effect suggests that for energy-saving innovations for consumers in developing countries the rebound effect is typically of the order of 30%. In other words, the net energy savings are around 70% of the amount of energy nominally saved by the innovation. Joshua Gans comments on the article, taking a direction inspired by the Schumpeterian endogenous growth literature, in which new innovations are sold by monopolist innovators.
To understand why the authors posit such a large rebound effect, I took a look at the original article. The first key leg of their argument is a very simple model fitted to historical data on lighting use.
Most of the early data rely on the work of Fouquet and Pearson. The model treats light consumption as a function of two variables: GDP and the cost of lighting. The elasticity of demand with respect to the cost of lighting is minus one and the income elasticity is plus one. Based on these data the model looks pretty plausible. It would be nice, though, to see the data in per capita terms, or to see the lighting intensity of GDP plotted against the cost of lighting, to get a better idea of how robust it is. How the cost of lighting is computed will be very critical too; I'll look at that in a subsequent blogpost.
If the demand elasticity is minus one, then any reduction in the cost of lighting will be exactly offset by increased consumption of lighting services. Both elasticities seem high in absolute value for developed economies. So, if you are going to make big predictions about the future, it would be nice at least to test a model that allows the elasticities to vary with income level against one that does not.
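A compact way to see the exact-offset claim (my own restatement, using the functional form that these unit elasticities imply): write demand for lighting services as $L = \beta Y / C$, where $Y$ is GDP and $C$ is the cost per unit of lighting services. If $C = p_E/\eta$, with $p_E$ the energy price and $\eta$ the efficiency of converting energy into light, then energy use is

$$E = \frac{L}{\eta} = \beta \frac{Y}{C\,\eta} = \beta \frac{Y}{p_E},$$

which is independent of $\eta$: any efficiency gain is fully offset by extra consumption of lighting services, i.e. a 100% rebound.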
Solid-state lighting will certainly reduce the energy costs of lighting, but this is achieved partly by substituting capital for energy: at the moment, LED lights are expensive. This means that currently the total cost of lighting falls by less than energy use does. Therefore, even if the price elasticity of demand were minus one, adoption of solid-state lighting would reduce energy use (ignoring the indirect energy costs of capital). The authors argue, of course, that these capital costs will fall rapidly. Historically, they argue, capital costs have typically been about 1/3 of the energy costs of lighting, and they assume that by 2030 the capital costs of SSL will return to that same ratio, so that there is no capital-energy substitution in the adoption of SSL.