Let me state up front that (old wind-bag or not) I am not in any way "anti" 
wind energy, nor an advocate for nuclear. Actually I would like to be more 
pro-wind, but the numbers keep getting in the way. 

IOW, I am a strong advocate for trying to get a true picture of the comparative 
cost situation, since I consider both of these potential solutions to our 
energy crisis as "green" and both are *highly preferable* to burning coal, 
natural gas or any fossil fuel.

One interesting point about comparative load factors - one that can make 
cross-comparison of wind vs. nuclear "challenging" - is that from the 
mid-nineties on, roughly the time that computer controls were widely 
implemented in the nuclear industry and demand began to peak for all energy, 
the average load factor for nuclear has made rather dramatic year-to-year 
gains. It should also be noted that the older wind turbines were not as 
efficient as today's. If you compare "old vs. new," or "budgeted" instead 
of "actual," you can put a lot of "spin" on the numbers (pun intended).

And only recently have reliable actual results from the larger wind farms 
become available without some glossing over of the problems of mechanical 
failure - which were severe up through 2000. Here is the story for nuclear:

"Analysis of Load Factors at Nuclear Power Plants" by Michael T. Maloney is one 
of several articles that have looked at this - followed by a "truth" site 
about wind costing:

http://www.truthaboutenergy.com/Wind.htm

It is a "truth" site because, in contrast to the wind advocacy groups - which 
this site claims are trying to present a distorted "social engineering" picture 
of what wind energy "should cost" - they strive (or claim to strive) to find 
actual costs, as opposed to budgeted costs.

The world-wide historical experience for the past half century in nuclear load 
factor is 69.4% for reactors currently operating, and 68.3% for all 
commercial reactors over all time. Often one will see 70% used as the average 
in planning.

However, in 2002 all reactors currently operating in the world hit an average 
of 85%. Since this is an average, it includes downtime for refueling, and since 
most of these reactors are older, it is a rather meaningful indicator that it 
is now high time to use the newer figures in planning - when we want to compare 
true costs vs. wind or solar.

This is a rather spectacular difference, since going from 70% (if 70% was 
used in the planning stages) to 85% is not merely an improvement of 15 
percentage points toward a goal of full optimization (which is impossible due 
to refueling) but a comparative increase of actual over planned of 15/70, or 
21+ percent. The emphasis is on *actual* as opposed to "budgeted" or 
"nameplate."
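The 15/70 arithmetic above can be checked in a few lines of Python (both 
figures are taken from the text; the 70% planning number is the stated 
assumption, not a measured value):

```python
# Compare the "planned" vs. "actual" nuclear load factors from the text:
# 70% as the typical planning assumption, 85% as the 2002 world-wide actual.
planned = 0.70
actual = 0.85

# Absolute gain, in load-factor points:
absolute_gain = actual - planned          # 0.15, i.e. 15 points

# Relative gain of actual over planned:
relative_gain = absolute_gain / planned   # 0.15 / 0.70, about 21.4%

print(f"absolute gain: {absolute_gain:.0%}")   # prints "absolute gain: 15%"
print(f"relative gain: {relative_gain:.1%}")   # prints "relative gain: 21.4%"
```

Which is where the "21+ percent" figure comes from: the 15-point gain is 
measured against the 70% baseline, not against 100%.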

So let's say we use the 85% number, since it is actual. What is the actual 
number for wind energy? Best I can tell, it is not known and is very 
site-dependent.

The maximum appears to be 27+%; I have never seen a higher reported actual 
number for a one-year average at any site. In Italy, the government reports 
actual at 19% for last year. In California, where the foothills are 
extraordinarily windy and you have the largest wind farms in the USA, you 
can see from the table a quarter of the way down this page that the actual 
figure is 22.2%:

http://www.truthaboutenergy.com/Wind.htm

Bottom line: when you compare **actual load factor** for recent years of wind 
energy vs. nuclear energy -- there is generally a 4:1 advantage for nuclear in 
the load-factor category.

Like it or not - there is no better way to state it than a four-to-one 
difference in load factor as things stand now, in terms of *actual* performance 
based on recent yearly results - so why fight it with meaningless "spin"?

Jones
