Will Nuclear Fusion Fill the Gap Left by Peak Oil?

[editor's note, by Chris Vernon] This is a guest post by TOD member Nick Rouse.

On the 12th of December 2006 the UK Magnetics Society held its annual commemorative event: an afternoon seminar followed by the Ewing Lecture. This year the focus was on magnetic fusion energy.

Nuclear fusion has evoked opinions in the various energy blogs ranging from “sixty years of failure and a certain dead end”, to “the reason why we do not need to worry about peak oil”. This event was a good opportunity to gain a clearer view of what part, if any, fusion energy could play in filling the gap as oil and then gas production peak and decline.

After many years of half-hearted support there is now a surge of backing for fusion energy. Many will have heard about the agreement to build the International Tokamak Experimental Reactor (ITER). Less well publicised have been the European Fast Track programme and the bilateral agreement between the EU and Japan called ‘the Broader Approach’ which, amongst other things, will lead to DEMO, the first fusion reactor to generate electrical power at full scale. From the UK side this new-found enthusiasm has been in large part due to Sir David King, the Chief Scientific Adviser to the government, who may not fully accept the imminence of peak oil, but does see an energy crisis looming and has become convinced that the possibility of fusion energy is promising enough to warrant substantial investment.

The four speakers in the seminar were senior members of the staff of the Culham Division of the United Kingdom Atomic Energy Authority (UKAEA). The lecture was given by Dr. Frank Briscoe, operations director at Culham. Culham has been the centre of fusion research in the UK since 1960 when the first large scale UK fusion experiment, ZETA was transferred there from Harwell. ZETA was still there, but no longer working, when I first visited Culham from Harwell as a student in 1964. I have retained a personal interest in fusion since then.

Of particular interest to this forum were the Ewing lecture itself, entitled ‘Magnetic Fusion Energy: Progress and the Remaining Challenges’ and the seminar presentation by Dr. Derek Stork entitled ‘Scientific and Engineering Challenges of a DEMO Fusion Reactor’. Since the various contributions overlapped somewhat and some of the material was of specialised interest to those involved in magnetics, I have combined those parts of the contributions that I hope are of interest to this forum.

Dr. Briscoe started his lecture with a brief summary of how nuclear fusion works.

Fusion Basics

Atomic energy, both fission and fusion, exploits the fact that atoms weigh less than the sum of their parts. The difference is the binding energy: it is released when an atom forms from its constituent protons, neutrons and electrons, and would have to be expended to rip the atom apart again. The energy released in binding relates directly to the drop in mass of the atom through Einstein’s equation E = mc². This loss in mass is called the mass defect and varies between the elements. Energy can be released either by splitting very heavy atoms in two (fission) or by joining light atoms together (fusion).
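To make the idea concrete, here is a small Python sketch using standard reference masses (my own values, not figures from the lecture): a helium-4 atom weighs about 0.75% less than its separated protons, neutrons and electrons.

```python
# Mass defect of helium-4: the atom weighs less than the sum of its
# parts, and the difference is the binding energy via E = mc^2.
# Masses are standard reference values in atomic mass units (u).
M_PROTON   = 1.007276  # u
M_NEUTRON  = 1.008665  # u
M_ELECTRON = 0.000549  # u
M_HELIUM4  = 4.002602  # atomic mass of helium-4, u

constituents = 2 * (M_PROTON + M_NEUTRON + M_ELECTRON)
defect_percent = (1 - M_HELIUM4 / constituents) * 100
print(f"Helium-4 weighs {defect_percent:.2f}% less than its parts")
```

Elements near iron have the largest defect per nucleon, which is why both fusing light nuclei and splitting heavy ones release energy.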

Mass Defect
Mass defect: The mass of an atom of each element expressed as a percentage
of the mass of the protons, neutrons and electrons that constitute the atom.
The three isotopes of hydrogen plus the most common or stable isotope of
the other elements are shown. Click to Enlarge

The problem with trying to fuse atoms is that, although there is a very strong attraction between the protons and neutrons in a nucleus when they are very close (the strong nuclear force), this force drops away very rapidly with distance, so that at slightly greater separations the electrostatic repulsion between the positively charged protons dominates. To fuse the nuclei of two atoms, they have to be forced together hard enough to overcome this repulsion until they are close enough for the net force to become attractive.

Many different fusion reactions, and many ways of bringing the atoms together, have been considered. Far and away the front runner, in terms of progress towards a large-scale commercial electrical power plant, is the reaction between the hydrogen isotopes deuterium and tritium, heated to a temperature at which thermal collisions between the nuclei carry enough energy to overcome the electrostatic repulsion, with magnetic forces used to confine the reactants. The temperature required is of the order of a hundred million degrees. At such a temperature the reactants are completely dissociated into a cloud of nuclei and electrons called a plasma. The plasma is far too hot to confine with material walls, but because the electrons and nuclei are charged and moving they can be deflected by magnetic forces and, with a suitably shaped magnetic field, confined.

The fusion reaction is deuterium plus tritium gives helium plus a neutron:

²H + ³H → ⁴He + n

20% of the energy released by the reaction is carried off by the helium ion as kinetic energy (3.5 MeV per ion). Since the helium is ionised it is charged, and so is confined by the magnetic field. In colliding with the rest of the plasma it gives up its kinetic energy, heating the plasma. If the condition called ignition can be reached, this self-heating will be the only source of heat needed to maintain the plasma temperature.

The other 80% of the reaction energy is carried off as kinetic energy of the neutrons (14.1 MeV per neutron). Since the neutron is not charged it escapes the plasma. In a reactor designed to generate power, a ‘blanket’ surrounds the plasma, and the collisions the neutrons make with the material of the blanket transfer the neutrons’ kinetic energy to the blanket, heating it up. This heat, plus a little gained from absorbing the hard ultraviolet/soft gamma radiation emitted by the plasma, is transferred out of the chamber by a gas or liquid and used to raise steam, drive a turbine and turn an alternator to generate electricity.
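The 20/80 split follows from momentum conservation: the reactants are nearly at rest, so the helium ion and neutron fly apart with equal and opposite momenta, and each carries kinetic energy in inverse proportion to its mass. A quick Python check (the 17.6 MeV total and the masses are standard reference values, not taken from the lecture):

```python
# Kinetic-energy split of the 17.6 MeV released per D-T reaction.
# Equal and opposite momenta plus E = p^2/2m gives energies
# inversely proportional to the product masses.
E_TOTAL = 17.6              # MeV released per reaction
M_HE, M_N = 4.0026, 1.0087  # helium-4 and neutron masses, u

e_neutron = E_TOTAL * M_HE / (M_HE + M_N)  # lighter particle gets more
e_helium  = E_TOTAL * M_N  / (M_HE + M_N)
print(f"neutron: {e_neutron:.1f} MeV, helium: {e_helium:.1f} MeV")
```

This reproduces the 14.1 MeV and 3.5 MeV figures quoted above.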

Deuterium exists in enormous quantities in sea water, where it forms 0.03% by weight of the hydrogen, but tritium exists naturally only in tiny amounts generated by cosmic rays, and decays with a half-life of 12.3 years, so that there is only about 3.6 kg of naturally generated tritium at any one time, distributed all around the planet. All other tritium has to be made artificially. In a commercial reactor the blanket will also perform the function of creating tritium: it will contain lithium in some form, which will react with the neutrons bombarding it to form tritium by the reaction:

n + ⁶Li → ³H + ⁴He

This tritium will be collected to fuel the continuing fusion reaction. Lithium 6 is a stable isotope and forms 7.5% of natural lithium. Some blanket designs require enrichment of this isotope. The tritium breeding reaction is exothermic and increases the heat production by some 20%.

At first sight these reactions appear to combine to give an overall reaction of:

²H + ⁶Li → 2 ⁴He

However this neat cancellation requires that every neutron from the fusion reaction reacts with a lithium atom to form a tritium atom, and that every tritium atom so generated is collected, fed back into the reactor and takes part in a neutron-generating fusion reaction. In practice there are many loss mechanisms around the loop, which mean that this path alone will not provide sufficient tritium to maintain operation. To augment the tritium production a neutron multiplier is added to the lithium; the main candidates are beryllium and lead. Experiments have shown that tritium self-sufficiency is possible but difficult, and generating much in the way of a surplus is very difficult. Dr. Stork indicated that a tritium breeding ratio of about 1.1 was all that was expected of the designs being considered. This implies that for every 10 kg of tritium fed in as fuel for the fusion reaction, 11 kg of tritium will be recovered from the blanket. The magnitude of the excess tritium available is probably the limiting factor on how fast fusion energy can spread once a prototype commercial fusion reactor has been demonstrated, as discussed below.

Magnetic Confinement

As part of his seminar presentation, Dr Tom Todd gave a fascinating review of the very varied magnetic confinement systems that have been, and are being, experimented with, but again by far and away the front runner in the race to practical power systems is the toroidal system first developed by the Russians and called the ‘tokamak’, from a Russian acronym. The biggest and most successful tokamak so far is the European-run JET system hosted in the UK at Culham. This produced a peak of 16MW of fusion power in 1997, but many other tokamaks have been built across the world.

Critics of fusion power have categorised these experiments as so many failures because none of them has produced more fusion power out than was put in. In truth none of them was built with the intention of achieving energy break-even. All were intended to understand and develop ways of controlling the seething monster that is a dense plasma at a temperature of many millions of degrees. For the most part these experiments have, after a fair bit of modification and adjustment, reached the sort of performance hoped for. Our ability to contain the plasma at high temperature and high density, for long enough to allow a sufficient degree of reaction to take place, has increased by four orders of magnitude over 40 years.

Lawson Criteria
This shows the progress over the years at confining a hot plasma. The fusion product
is the plasma density in particles/m³ times the time in seconds that the plasma
can be held in these conditions, times the ion temperature in kelvin.
The requirement for this to be at least 3 × 10²⁸ for ignition to occur in a
deuterium/tritium plasma is one of the Lawson criteria formulated by J.D. Lawson in 1955.
The best results of JET and JT-60U are close to energy break-even, Q = 1. Click to Enlarge
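As a sanity check on the 3 × 10²⁸ threshold: the D-T ignition triple product is commonly quoted in the literature as about 3 × 10²¹ keV·s/m³ (a standard value, not taken from the lecture), and converting the temperature from keV to kelvin should land in the same place:

```python
# Convert the commonly quoted D-T ignition triple product from
# keV s/m^3 to K s/m^3, the units used in the figure above.
KELVIN_PER_KEV = 1.1605e7   # temperature equivalent of 1 keV, in K

triple_product_kev = 3e21   # n * T * tau in keV s / m^3
triple_product_kelvin = triple_product_kev * KELVIN_PER_KEV
print(f"{triple_product_kelvin:.2e} K s/m^3")
```

The result, about 3.5 × 10²⁸ K·s/m³, agrees with the figure's threshold to within rounding.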

There has also been great progress in predicting the performance of a plasma device as computer power has increased, so there are now fewer surprises in new experiments.

Fusion Confinement Time
Predicted and measured confinement time for 13 different fusion devices under a great
variety of conditions plus indications of where ITER and a commercial power reactor are
expected to operate by scaling the results of existing machines. Click to enlarge

Although there is still more work to do on plasma control, we have now reached the stage where we can be reasonably confident that just scaling up the size of the reactor will produce substantially more power from fusion than is put in to create the reaction. Such a reactor has been designed in detail and the major parts have already been prototyped. On 21 November 2006, after years of delay, an agreement was signed to build ITER at Cadarache in France financed by China, the EU, India, Japan, Russia, South Korea, and the USA. Together these partners represent over half the population of the planet.

JET and ITER: The two reactors are shown in cutaway diagrams. Human figures give the scale. Click to enlarge.

Power will be fed into the ITER plasma in three main ways: by transformer action, driving up to 15 million amps through the plasma; by neutral beams of high-energy deuterium and tritium fired into the plasma; and by radio-frequency energy fed in from antenna patches in the walls to excite resonances in the plasma. Transformer action is very efficient but necessarily pulsed; the other two forms of heating are less efficient but can be continuous. ITER is expected to generate 500MW of fusion power with less than a tenth of that as input power (Q>10) and hold that output for 400 seconds. It should also sustain 500MW output for an hour with input power of no more than one fifth of the output (Q>5). Although it is not a stated aim, there is the hope that it might achieve what is called ignition, where enough of the fusion energy remains in the plasma to keep the reaction going without any external input energy (Q = infinity). This will require higher plasma densities than operation with external energy input.
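In these terms the gain Q is simply fusion power out divided by external heating power in, so the two ITER targets imply heating powers of at most 50 MW and 100 MW respectively (figures implied by the stated Q values, not given directly in the talk):

```python
# ITER's target fusion gains. Q is fusion power out over external
# heating power in; ignition corresponds to Q -> infinity.
def fusion_gain(p_fusion_mw: float, p_heating_mw: float) -> float:
    return p_fusion_mw / p_heating_mw

print(fusion_gain(500, 50))    # 400-second target: Q = 10
print(fusion_gain(500, 100))   # hour-long target:  Q = 5
```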

Although there seems to be reasonable confidence that ITER will come at least close to its target in plasma performance, this is just the start of the challenges that must be met to build a commercial electrical power generating station.

If all goes well ITER will produce the first plasma before the end of 2016, but, in order to speed the development of commercial fusion power, a ‘Fast Track’ strategy is being adopted and in addition to the ITER agreement there has been a bilateral agreement between the EU and Japan called ‘the Broader Approach’. Studies of the DEMO reactor to follow ITER are part of this agreement.

Beyond ITER

A major hurdle to be jumped is the design of the breeding blanket that lines the inside of the reaction chamber, and the selection of suitable materials for it. This blanket serves three purposes: to convert the energy given off by the fusion reaction to heat, to breed more tritium to fuel the reaction, and to protect the superconducting coils and chamber wall from neutron irradiation. A variety of blanket designs have been proposed and all have some problematical features. ITER will not have a full tritium breeding blanket (certainly not in the early years). Most of the reaction chamber will be lined with a simple cooled neutron- and heat-absorbing blanket to stop the reactor overheating. There will, however, be space to fit and remove small areas of tritium breeding blanket, and it is proposed to try a variety of different designs in turn.

Test Blanket Modules
Montage of some of the proposed Test Blanket Modules. Click to enlarge.

DEMO will have a full breeding blanket to achieve tritium self-sufficiency. The materials used to make the breeding blanket, and particularly the first wall facing the plasma, need to survive an extremely severe combination of conditions while retaining adequate strength and other mechanical properties. The heat flux on the first wall of the blanket will be 0.1 to 0.3 MW/m² in ITER, rising to 0.5 MW/m² in DEMO. This DEMO figure is about twice that of a PWR-type fission reactor and almost the same as a fast breeder fission reactor. The flux of energetic neutrons means that over about 5 years every atom in the first wall will have been knocked out of place an average of about 3 times in ITER and 50-80 times in DEMO, and perhaps twice that in a full-scale commercial reactor. Each displacement will shift the atom several tens of crystal lattice spacings from its original site. Atomic transmutations caused by the neutron flux will leave hydrogen and helium embedded in the wall; for DEMO this will amount to 500-800 parts per million by atom count (appm) of helium and 2300 to 3600 appm of hydrogen. For ITER it will be acceptable for the blanket to be well cooled and kept at a fairly low temperature, but in a reactor generating electricity by a conventional steam cycle, high thermal efficiency demands that the steam, and hence the blanket coolant, run at as high a temperature as possible. It is expected that the blanket structure will operate at 500°C to 800°C.

This combination of requirements means there is almost no chance of a breeder blanket that can survive the full life of the reactor. After a few years the material properties of the blanket structure will have degraded so much that it will have to be replaced. The inside of the chamber will be far too radioactive for a person to go in there, so a remote handling arm will have to reach in through one of the ports, bending around the central pillar where required, remove the old blanket section by section and replace it, section by section, with a new blanket, disconnecting and reconnecting the pipework (probably by cutting and re-welding) without spillage. The sections are likely to weigh several tonnes. The blanket sections will have to fit fairly tightly to protect all of the chamber wall and coils, but the extreme service conditions mean that they will be significantly distorted at the end of their service life. They must not jam in place or the long articulated arm will not be able to pull them out. There is reasonable confidence that a blanket of some sort can be built to operate for some length of time, but the economics of a future power station will depend heavily on how hot the blanket can run, how long it can survive before replacement and how fast it can be replaced. The remote handling arm is itself a major engineering challenge.

The helium generated in the plasma by the fusion reaction, and any other contaminants such as material coming off the structure under severe bombardment, need to be removed from the plasma continuously to allow the reaction to continue. To this end, at the bottom of the reaction chamber there is a divertor structure where the magnetic field is reduced so that a small fraction of the plasma separates and is allowed to cool as it circulates, to the point where it recombines to form neutral atoms before colliding with the divertor plates. The gases can then be pumped out and the hydrogen isotopes separated for re-injection. Although the plasma has been cooled before hitting the divertor plates, the heat flux on them is still enormous: in the DEMO reactor it will be about 15 MW/m². This represents about 15% of the energy generated by the fusion reaction, and it will be taken away by a coolant (probably helium) and used to generate electricity together with the heat from the blanket. 15 MW/m² is about 20% of the power density at the surface of the sun. There is even less chance of the divertor plates lasting the lifetime of the reactor; in fact their lifetime may be as short as two years, and they too will need to be replaced by remote handling, in sections usually called cassettes.
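The solar comparison can be checked with the Stefan-Boltzmann law: taking the Sun's effective surface temperature as about 5778 K (a standard reference value, not from the talk) gives a radiant power density of roughly 63 MW/m²:

```python
# Radiant power density at the Sun's surface via the Stefan-Boltzmann
# law, compared with the DEMO divertor heat flux.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T_SUN = 5778      # effective surface temperature of the Sun, K

sun_flux_mw = SIGMA * T_SUN**4 / 1e6   # ~63 MW/m^2
divertor_flux_mw = 15.0
print(f"Sun: {sun_flux_mw:.0f} MW/m^2; "
      f"divertor/Sun = {divertor_flux_mw / sun_flux_mw:.0%}")
```

The ratio comes out at roughly a quarter, in the same ballpark as the figure quoted above.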

As well as all the other requirements that must be met by the materials of the blanket and divertor, it is important that the amount of radioactivity induced in them by neutron bombardment is minimised. This precludes the use of some elements that might otherwise be useful for alloying. Theoretically no net radioactive materials are produced by the main reactions of the reactor, since the tritium is recycled. However there will be radioactive waste from side effects such as neutron activation and tritium embedded in the structure. This affects both the parts replaced during the reactor's lifetime and the reactor itself when it is decommissioned. Predicting the level of radioactivity of the waste is difficult, and impurity levels will have a strong influence. However the radioactivity has been estimated to be two orders of magnitude less than that of a fission reactor and to be short-lived, so that after 100 years the level of radiotoxicity will be less than the waste from an equivalent-sized coal-fired power station. Keeping the radiotoxicity low will require the tritium recovery and recycling to be achieved with extremely low leakage.

Fusion Radiotoxicity
Decay with time of radiotoxicity. Click to enlarge.

The development and testing of materials to meet these very onerous requirements is crucial to the speed of deployment of fusion energy. Because ITER will not produce a plasma for almost 10 years at best, and even then will not produce a neutron flux anywhere near as intense as DEMO's, and that only intermittently, it has been decided to build a special facility to reproduce, over a small area, the conditions that DEMO and following reactors will have to face. This will be called the International Fusion Materials Irradiation Facility (IFMIF). It will be developed and tested in Japan as part of the Broader Approach agreement, although the final site for the installation has not yet been agreed. At IFMIF two 40 MeV linear accelerators will provide 250 mA of deuterons, which will be targeted on flowing liquid lithium to produce neutrons with an energy spectrum up to 14 MeV, matching that expected in DEMO. The flux will be sufficient to produce 20 displacements per atom per year in the test samples. Further steps will be taken to speed up development: the JET reactor at Culham will have its present carbon chamber lining converted back to a metal wall to provide test data on this material for ITER and DEMO, and the JT-60 tokamak will be upgraded with superconducting coils to act as a satellite machine to ITER. A smaller UK tokamak, MAST, which has a toroidal plasma aspect ratio squeezed so tight that it is like a spherical apple with the core cut out, will also provide input to the DEMO design.

Dr. Briscoe summarised the challenges ahead with the following table:

ITEM                                              Existing  ITER  IFMIF  DEMO Ph.1  DEMO Ph.2  Power plant
plasma disruption avoidance                           2       3      -       C          R          R
steady-state operation                                1       3      -       3          r          r
divertor performance                                  2       3      -       R          R          R
burning plasma at Q>10                                -       3      -       R          R          R
power plant plasma performance                        1       3      -       C          R          R
tritium self-sufficiency                              -       1      -       3          R          R
materials characterisation                            -       -      3       R          R          R
plasma-facing surface lifetime                        1       2      -       2          3          R
facing wall/blanket/divertor materials lifetime       -       1      2       2          3          R
facing wall/blanket components lifetime               -       1      1       1          3          R
neutral beam/radio frequency heating systems          1       3      -       R          R          R
electricity generation at high availability           -       -      -       1          3          R
superconducting machine                               2       3      -       R          R          R
tritium issues                                        1       3      -       R          R          R
remote handling                                       2       3      -       R          R          R

1 = Will help to resolve the issue
2 = May resolve the issue
3 = Should resolve the issue
C = Confirmation of resolution needed
r = Solution is desirable
R = Solution is a requirement


A timetable has been proposed for the overlapping development of the various proposed devices. It assumes that the only obstacles to its implementation are technical ones and comes with many caveats, but it sees the first commercial power station operational in 2048. Even if this very compressed timetable is met, that does not signify the widespread availability of fusion power: there is a limit, set by the supplies of tritium, to the rate at which the number of fusion power stations can be multiplied.

Fusion Timetable
The Fast Track timetable. Click to enlarge.

Tritium Supplies

The large-scale adoption of fusion energy will see tritium used on a scale vastly greater than ever before. Something like 220 kg of tritium will be consumed per year for every 1 GW of continuous electrical generation, assuming that 4 GW thermal will generate 2 GW electrical, of which 1 GW is used to provide all the inputs to the system, leaving 1 GW of output power. At present, world-wide electrical consumption averages a continuous 1700 GW.
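The 220 kg figure can be reproduced from first principles: each D-T reaction consumes one tritium atom and releases 17.6 MeV, so the thermal power fixes the burn rate. A rough Python check (treating the whole 4 GW thermal as fusion power is my simplification):

```python
# Tritium burned per year by ~4 GW of fusion power.
MEV_TO_J = 1.602e-13               # joules per MeV
E_REACTION_J = 17.6 * MEV_TO_J     # energy per D-T reaction
M_TRITIUM_KG = 3.016 * 1.6605e-27  # mass of one tritium atom, kg
SECONDS_PER_YEAR = 3.156e7

p_fusion_w = 4e9  # thermal power per 1 GW of net electrical output
reactions_per_second = p_fusion_w / E_REACTION_J
burn_kg_per_year = reactions_per_second * M_TRITIUM_KG * SECONDS_PER_YEAR
print(f"{burn_kg_per_year:.0f} kg of tritium per year")  # ~224 kg
```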

Nearly all the world's supply of non-military tritium comes from the heavy water used to moderate CANDU reactors, and some of these will be closing down in the near future. The supply accumulated over 40 years of CANDU operation will peak in 2027 at 27kg.

Tritium Supply
The world’s commercially available supply of tritium before any is removed by the fusion programme. Click to enlarge.

Military reactors designed for tritium production produced only a few kilograms a year at a cost of about $200M/kg. Tritium increases the yield of thermonuclear warheads (H-bombs). It is thought that about 4g is used in each warhead, added in a container just before launch so that decay of the tritium does not limit the shelf life of the weapon. There have been some hints that the latest warheads being designed will not use tritium. The US had a number of military reactors at its Savannah River site especially designed for tritium production, but the last of these was closed down in 1988. It is believed that over 220kg of tritium was produced there over the years, but that there was only about 73kg in 1995, which will by now have decayed to about 37kg. It is unlikely the US military will release any of this for civil fusion power. One of the speakers said he believed that at one time the Russians had mentioned the possibility of releasing some of their supply, but he had no further details. Other civil fission reactors could produce small amounts by placing lithium inside the reactor, but some back-of-envelope calculations that I did for a comment on a previous post show that it would take at least 60 tonnes of unenriched uranium to produce 1kg of tritium in a standard reactor, and it may be much more. Specially designed accelerators are theoretically capable of producing tritium, and have been considered for military needs, but one generating a few kilograms per year was estimated to cost $4.8 to 6.1 billion in 1991 prices and would produce vast quantities of radioactive waste.
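The decay arithmetic is easy to verify: with tritium's 12.3-year half-life, 73 kg in 1995 falls to about 37 kg by the time of writing (taken here as 2007):

```python
# Radioactive decay of the 1995 US tritium stockpile.
HALF_LIFE_YEARS = 12.3

def tritium_remaining(initial_kg: float, years: float) -> float:
    return initial_kg * 0.5 ** (years / HALF_LIFE_YEARS)

print(f"{tritium_remaining(73, 2007 - 1995):.0f} kg")  # ~37 kg
```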

Since ITER will produce only a tiny proportion of the tritium it uses (at least in the early years), because the experimental test blanket modules will cover only a small area of the chamber, ITER alone will severely deplete the world's tritium stock. If DEMO overlaps heavily in time with ITER, the tritium supply will be very critical and it will be important to get the full blanket and tritium recovery system going as soon as possible if the programme is not to be delayed.

If fusion reactors are to proliferate, then probably near the start, and certainly after the first few, each new reactor will be relying on the small surplus tritium production from existing reactors to provide the start-up charge of tritium. This is likely to be some tens of kilograms. When I asked him, Dr Briscoe said that it will probably take two-and-a-half to three years from the start of one reactor for it to supply enough surplus tritium to start up another. The estimate he gave was that even if the only obstacles were technical ones it will be 2100 before fusion can supply more than 30% of Europe’s electricity.
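Taken at face value, Dr Briscoe's figure implies a fleet that at best doubles every three years or so, since each reactor can breed a start-up charge for one successor in that time. A toy Python model shows this exponential ceiling; in practice construction capacity, capital and the ITER/DEMO sequence would keep real growth far below it (the model and its 2048 start date are illustrative assumptions of mine):

```python
# Toy model: tritium-limited fleet growth with a three-year doubling
# time, starting from one commercial reactor in 2048.
DOUBLING_TIME_YEARS = 3   # from the ~2.5-3 year surplus estimate
FIRST_PLANT_YEAR = 2048   # first commercial station in the timetable

def fleet_size(year: int) -> int:
    doublings = (year - FIRST_PLANT_YEAR) // DOUBLING_TIME_YEARS
    return 2 ** doublings

for year in (2048, 2060, 2075, 2100):
    print(year, fleet_size(year))
```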

The Energy Gap

All but the most dewy-eyed optimists foresee the production of conventional oil severely curtailed by then. If we are to get to 2100 without major economic collapse, or without using so much coal, tar sands and oil shale that carbon dioxide rises to disastrous levels, we will have to have adopted other energy sources on a major scale, as well as substantially cutting our energy consumption and finding some way of replacing liquid hydrocarbons for transport. Fusion will then be competing in a very different environment.

Cost estimates at this stage are obviously uncertain in the extreme, but most put the cost of generated electricity as comparable, in today's prices, with today's fossil-fuel-generated power. Most of the cost is the capital cost of the plant, amortised over its life. Running costs, periodic replacements and decommissioning come next, with fuel costs less than 1% of these and not likely to rise through depletion of the richer ores, as would be the case with very widespread use of thermal fission plants. One estimate puts the capital cost at €14/W electrical for DEMO, falling to €4/W for commercial plants in serial production. This link allows you to play with the assumptions and produce an electricity price.

These prices should be compared with today's fission and coal plants at €3/W and €1.5/W respectively for the plant alone. However, the capital costs of coal plants do not include costs to mitigate environmental damage. Wind energy capital costs are now about €1.5/W, but wind rarely has a load factor of more than 30%, whereas fusion plants after initial settling-in could reach an 85% load factor. Correspondingly more rated wind power capacity would need to be installed to meet the same electrical demand. If wind power were to supply more than 30% of the total, extensive power storage would be needed and transmission costs would be greater, as many of the turbines would be scattered across areas remote from demand.
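The load-factor point can be made concrete: to deliver the same average power, wind at a 30% load factor needs nearly three times the installed rated capacity that fusion at 85% would, which must be weighed against its much lower cost per rated watt. A trivial Python illustration:

```python
# Installed (rated) capacity needed to meet a given average demand.
def installed_capacity_gw(average_demand_gw: float,
                          load_factor: float) -> float:
    return average_demand_gw / load_factor

demand_gw = 1.0
print(f"fusion (85%): {installed_capacity_gw(demand_gw, 0.85):.2f} GW")
print(f"wind   (30%): {installed_capacity_gw(demand_gw, 0.30):.2f} GW")
```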

The carbon dioxide emissions associated with fusion are again very difficult to estimate at this distance, but will also be dominated by those generated in building the reactor, its associated plant and the periodically replaceable parts. The carbon dioxide generated in fuel production and preparation will be constant and low in comparison.

Fusion reactors are inherently proof against melt-down or nuclear explosion: there simply is not enough fuel in the reactor at any one time to cause one. Any failure of coolant, magnet supply or other major system will cause the reaction to die in milliseconds, and there is not enough heat in the system, even without coolant, to breach the containment vessel. According to this report, even in the worst credible accident, with the release of all the tritium on site, there would be no need to evacuate anybody beyond the site boundary. Fusion plants are also a much less tempting target for attack by violent political or religious groups than thermal fission reactors, still less fast breeder reactors.

If a large part of the energy gap world-wide were to be made up by fission reactors using the present system of thermal reactors with once-through use of enriched uranium, we would rapidly be reduced to using ever lower grades of ore requiring more energy input. Some estimates put the point at which there is no energy gain at 0.02% uranium content, and suggest that if nuclear energy were expanded from the present 16% to 50% of electrical generation we would be reduced to using such ores within 50 years. There has been much argument over such estimates, but in the long term it is clear that if we are to rely on fission for a major part of our electricity, let alone our total energy, we will need to move to fast breeder reactors or thorium reactors. The long development time of fusion energy may seem disheartening, but that of fast breeders is not much better.

In 1946, the year I was born, the Nobel prize winner, Sir George Thomson, working at Imperial College London applied for a patent for using a gas discharge to generate controlled thermonuclear fusion. If the proposed timetable is kept to and if I live to be 102 I may just, in my dotage, see his dream brought to commercial reality. I do not know which of the two is more improbable.

Very thorough and comprehensive overview. Well done!


Not this side of a century.

First you have to achieve more than breakeven: you need a plant whose heat engine can produce enough energy to run the confinement apparatus and still have power left over.

Second you have to compete with fission, which has enough fuel to last thousands of years just for light water reactors. In breeder reactor regimes fuel costs are negligible; in spite of being more expensive and difficult to implement than light water reactors, breeders are still worlds simpler than any conceptual fusion power plant.

Someday I'm sure it will be useful, but in the far far future.

Second you have to compete with fission...

I was talking to Bob Hirsch, famed for his DOE paper looking at peak oil mitigation timeframes, at the ASPO conference last year. He's totally dismissive of fusion power. He should know what he's talking about seeing as his bio includes:

Director, Division of Magnetic Fusion Energy Research, U.S. Energy Research and Development Administration. During the 1970s, he ran the US fusion energy program, including initiation of the Tokamak fusion test reactor.


The reasons he gave were cost (fusion will always be more costly than fission), complexity (always more complex than fission) and waste (although fusion doesn't produce the very long-lived waste products, hundreds of years is just as bad as thousands in commercial terms). Nick's report, though, suggests fusion's waste quantities are significantly less than fission's even over relatively short time scales.

Hirsch is a fan of fission - suggesting over a thousand years' supply of fissionable material, including fast breeders, which he said were cheaper and easier than fusion. While not doubting fusion could/will work technically (although he did say that the current research was going in the wrong direction), his problem with it is that it'll just never be competitive with fission.

On a related note I've realised a lot of older scientists who are peak oil aware are also supporters of fission as a significant part of the solution. My theory for why this is the case is that many of this generation of scientists became aware of fossil fuel depletion issues in the seventies - when the atom was still highly regarded, especially amongst the scientific community. The drawbacks that have become apparent over the last 20 years were not fully recognised then, and the first positive impressions of nuclear power stuck!

Thanks to CV & others
Posts like this are why I read TOD each day.

I think that many younger environmentalists do not realize that solutions have been identified for many of the technical problems with fission, such as proliferation, long-lived wastes and runaway reactions. Their unwillingness to consider it as part of our energy solution might have made sense before we understood the implications of peak oil. But now we know that fission, and especially the potential of thorium, is just about the only good large-scale energy source that will be plentiful for the next 100 years.

Maybe a review similar to the present one is in order, to cover the state of affairs regarding fission. I hear about various sorts of technology, like breeder reactors or thorium reactors or actinide burners. But how far have any of these technologies been pushed - are there working prototypes? What barriers remain to profitable use?

What always interests me is not a blank reassurance like "That problem has been solved," but rather a survey of the current research frontier. No problem is ever solved so finally that somebody somewhere isn't trying to find a better solution. For example, the waste problem. I hear some really stupid comments from folks that do not understand that the problem is rarely the danger from standing next to a radioactive source - after all, shielding is cheap. The problem is, what happens when the radioactive material gets into the air or water or soil and from there enters a person's body. Furthermore, radiation tends to degrade whatever material is used to contain the waste, so building a containment vessel that can last tens of thousands of years is much harder than it would be if the contents were not radioactive.

Some folks are apparently reassured by opaque promises that all the problems have been solved, or will soon be solved, or will be solved anyway by the fantastic technology that our great grandchildren will surely have developed to fix the problems we are leaving to them. I find it much more informative to learn about the current research frontier. There are surely folks around who are working to design better containment vessels for radioactive waste. What sorts of issues are they struggling with? If I know where the boundaries are, I can get a good idea of the size of the country.

Weapons proliferation is hardly a technical problem. The idea that such a problem can be "solved" is quite strange. The USA seems to be moving away from an approach of international cooperation to one of strong arm domination. For example, the recent treaty with India is quite strange. It sure looks like it is OK to develop nuclear weapons as long as you make a deal with the USA like maybe don't build a gas pipeline to Iran, or what was that deal all about anyway. "The dominant player stays strong enough to be able to impose its will on all others" - is that the kind of solution you envision?

But the problem gets really thick. If the current direction of the USA continues, where government and industry can use secret police and martial law to concentrate power and profit, then it seems unlikely that any very effective solutions for problems like waste management will get implemented. We have here a classical recipe for corruption. There are always lots of opportunities to increase profits by cutting corners and sacrificing safety.

So the big pattern I see developing is:

danger of weapons proliferation -> concentration of power -> cutting corners on safety

I would really like to see a review of the state of affairs with fission technology that addresses the issues at least at this depth.

Another issue I rarely see addressed in the realm of problems with nuclear power is the issue of mining. If fission power is hugely ramped up, mining waste and pollution issues will become correspondingly huge. As in many other such issues, there are sensible approaches that can mitigate the problems. The big question is, will these sensible approaches be implemented? Our poor past record with other energy sources, primarily those that require actual mining such as coal and tar sands, is not at all encouraging.

Not a lot of mining needed. The amount of uranium it would take to power the country for a year would fit into a few semi trailers. Even with low-grade ore, that's not a lot. Of course, without breeders, multiply that by a hundred or so, but it still isn't much - maybe one medium-sized mine. Certainly nothing like the scale of our mining for almost any other mineral imaginable (copper, cobalt, coal, you name it).
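A rough sanity check of the "few semi trailers" claim is possible, assuming breeder-style utilization in which essentially all the uranium is eventually fissioned. The figures below (US electricity demand, thermal efficiency, energy per gram fissioned) are round illustrative assumptions, not numbers from the comment:

```python
# Back-of-envelope check of the "few semi trailers" claim, assuming
# breeder-style utilization where essentially all the uranium is fissioned.
# All input figures are round illustrative assumptions.

US_ELECTRICITY_TWH = 4000          # rough annual US electricity demand, TWh(e)
THERMAL_EFFICIENCY = 0.33          # heat-to-electricity conversion
MWD_TH_PER_GRAM = 1.0              # ~1 MWd of heat per gram of heavy metal fissioned

mwh_e_per_gram = MWD_TH_PER_GRAM * 24 * THERMAL_EFFICIENCY   # ~8 MWh(e) per gram
grams_fissioned = US_ELECTRICITY_TWH * 1e6 / mwh_e_per_gram  # TWh -> MWh
tonnes = grams_fissioned / 1e6
volume_m3 = tonnes / 19.0          # uranium density ~19 t/m^3

print(f"{tonnes:.0f} t fissioned per year, ~{volume_m3:.0f} m^3")
```

On these assumptions the answer comes out around 500 tonnes and under 30 cubic metres a year: a couple of dozen legal truckloads by weight, far less by volume, so the claim is the right order of magnitude for the breeder case.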

I read some time ago about Swedish trials for containing nuclear waste. IIRC they were using layers of stainless steel with a copper layer outermost, burying it a few hundred meters deep in granite in an area not prone to earthquakes. Each copper container would have its own niche cut out in the granite, surrounded by a layer of clay that swells in contact with water and seals the container from groundwater - bentonite, I believe, has that quality. More I can't remember.

I wouldn't be surprised if Magnus Redin knew more about this particular project :)

Bait taken. :)

The containers will have an inner structure of cast iron with a bolted-on lid and channels into which dried used fuel elements and other highly active core components will be inserted. This iron structure provides mechanical strength.

The inner cast iron structure will then be inserted into a 50 mm thick copper shell, and a lid is friction stir welded on to hermetically seal the container.

The containers are then to be stored embedded in bentonite clay (which indeed swells when wet) at a depth of about 500 m in crystalline bedrock. They will either be stored individually in vertical holes bored into the floor of a tunnel, or several to a horizontal hole drilled between two tunnels. The latter method requires excavation of a smaller rock volume.

There has been geological research for this for about 20 years; the most interesting find is probably microbiological activity in the rock cracks.

Production methods for the inserts and copper shells have been developed; friction stir welding proved better than electron beam welding. Copper forgings of this size were something new. Capsule handling has been tested, and capsules with simulated decay heat have been stored and retrieved. A plant for filling the containers is being designed; I don't know if the design is finished.

All of the research, the interim used-fuel storage and so on is paid for by a fund financed by a small fee on every kWh produced. The same fund will also cover the dismantling of the reactors when they are worn out.

The research and the solution are shared with Finland, which has the same kind of bedrock.

Here is a link to an official page with lots of information:

The final site for the storage is being decided in competition between two municipalities. It's expected that final storage will commence in 2018; this means that all the major investments in facilities will be made while the power plants are running, which I find wise should world finance burp. A cute bonus idea is to build a railroad with the excavated rock if the site is close to Oskarshamn.

I find this solution good enough for me; it's probably over-engineered.

Thanks for confirming most of what I said, and for the swift reply. 500 m is pretty deep. It does seem like this scheme will keep the material safe for perhaps several glaciations and interglacials - that is, if Scandinavia will be glaciated again before the nuclides decay; who knows if the cycle is broken. I will now go back to reading "The Prize".

I expect that far before the end of this century, all of what we now call "nuclear waste" will be taken out and recycled. The actinides will be burnt in specially designed reactors and the remaining fissile material will be used and enriched, and only the small fraction of remaining undecayed isotopes will be returned to the storages.

And of course our kids will be amazed at the stupidity of their parents and grandparents, doing what they do now...

I expect that far before the end of this century, all of what we now call "nuclear waste" will be taken out and recycled.

I beg to differ. Reprocessing of this kind has proven to be both unnecessary and uneconomical. Even burying spent fuel is not the most economical approach; it's cheaper to just seal the stuff in armored casks and guard them.

About the mining: a Japanese group, some years ago, came up with a polyamidoxime polymer (obtained basically by treating ordinary acrylic polymer with hydroxylamine in hot methanol). This polymer selectively adsorbs uranium from seawater. Suspended in the ocean in a natural current, it adsorbs 1% of its weight in uranium over a period of months, which is not bad when you consider the concentration of uranium in the water is around 3 ppb. It can then be washed with dilute acid to liberate the uranium and reused.

The group estimated the cost of the uranium obtained to be a few times the current spot market price. There would be no mining waste, since the uranium is already liberated in the environment, as are all the decay products like radium and radon. At their estimated cost, reprocessing and construction of breeder reactors could be delayed for centuries, even if the world goes over to mostly nuclear energy as its primary energy source.
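The scale implied by these figures can be sketched. Using the comment's uptake figure (about 1% of the polymer's own weight in uranium per soak of a few months), and assuming a standard round number of roughly 200 t of natural uranium per GWe-year for a once-through light-water reactor and a three-month soak cycle (both assumptions, not figures from the comment):

```python
# Rough scale estimate for seawater uranium harvesting. The 1% uptake per
# cycle is from the comment above; the 200 t/GWe-yr fuel demand and the
# 3-month cycle length are assumed round figures.

U_PER_GWE_YEAR_T = 200      # t natural uranium per GWe-year (once-through LWR)
UPTAKE_FRACTION = 0.01      # uranium adsorbed per cycle, as fraction of polymer mass
SOAK_MONTHS = 3             # assumed cycle length
cycles_per_year = 12 / SOAK_MONTHS

u_per_cycle_t = U_PER_GWE_YEAR_T / cycles_per_year   # tonnes of uranium per soak
polymer_t = u_per_cycle_t / UPTAKE_FRACTION          # polymer inventory at sea

print(f"~{polymer_t:.0f} t of adsorbent deployed per GWe")
```

On these assumptions, a few thousand tonnes of polymer would have to be kept at sea per reactor - a large but not absurd structure, which is why sharing support structure with offshore installations comes up below.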

Seawater uranium extraction deserves more attention than it has been receiving, since it could render some other large government energy research expenditures (like breeder reactors, advanced nuclear fuel cycles, or DT fusion) superfluous for the foreseeable future. The primary cost of seawater extraction is the capital cost of the support structure for the adsorbent, so combining this with offshore wind might be a good idea (they could share structural elements).

I beg to differ. Reprocessing of this kind has proven to be both unnecessary and uneconomical. Even burying spent fuel is not the most economical approach; it's cheaper to just seal the stuff in armored casks and guard them.

This has only been market tested with aqueous methods, which certainly aren't low on capital and labor and are in a sense designed for plutonium extraction. Of course it's been more expensive than it will ever be worth. I fully expect that utilizing pyroprocessing methods we'll at least do uranium and fission product extraction sometime this century, as long as we avoid the trap of trying to do MOX fuel nonsense. Now maybe the actinides can be burnt someday in a fast neutron reactor of some sort for profit, or maybe they can't, but there is potential profit to be made with non-aqueous methods on the unburnt uranium, xenon, and fission platinoids.

And then we can't discount the political machines that make unnecessary and uneconomical things happen anyway.

And seawater uranium extraction doesn't deserve any attention at all, because we'll have so much uranium from more conventional ores that it will never compete.

I continue to disagree. As it stands right now, reprocessing would be uneconomical even if it were free. The plutonium has negative value, costing more to fabricate into fuel elements than it saves in enriched uranium. This will be true of any reactor with Pu in the fuel elements, since the cost driver (the intense alpha activity of the Pu) will be the same.

I consider homogeneous reactor systems, like molten salt reactors, to be nonstarters for practical reasons. No reactor operator wants a reactor in which the entire primary loop is intensely radioactive. Nor do they want reactors that have to include sophisticated chemical processing equipment for online reprocessing.

And seawater uranium extraction doesn't deserve any attention at all, because we'll have so much uranium from more conventional ores that it will never compete.

If so, that would be another reason to not go with reprocessing or breeding.

I continue to disagree. As it stands right now, reprocessing would be uneconomical even if it were free. The plutonium has negative value, costing more to fabricate into fuel elements than it saves in enriched uranium. This will be true of any reactor with Pu in the fuel elements, since the cost driver (the intense alpha activity of the Pu) will be the same.

Where did I talk about Pu? As it stands, reprocessing just the uranium would be valuable, along with fission platinoids, xenon, and other marketable fission products. Dump the transuranic actinides separately.

No reactor operator wants a reactor in which the entire primary loop is intensely radioactive. Nor do they want reactors that have to include sophisticated chemical processing equipment for online reprocessing.

It's certainly a separate business model from LWRs. But the benefits of no fuel fabrication, low fissile load, and an extremely small waste stream are there. And while fuel costs are a small component of the cost of nuclear power, they aren't negligible.

Given that MSRs have never been market tested, suggesting that they're a nonstarter because of a different business model is a bit premature.

I am glad you are open to a debate. Waste is less of a problem if you burn up all the long-lived waste so that what remains has a half-life of only a few hundred years. That technology has been proven. Proliferation is partially a technical problem in that some reactors and fuel cycles create more material that can be made into bombs than others. If the result mixes fissionable and non-fissionable isotopes in a ratio that is not weapons grade, then the proliferator needs to build an isotope separation process, which is extremely expensive and difficult. And regarding mining waste, the volume of fission fuel is very small compared to, say, oil sands.

I am not saying that the problems have all been eliminated but that we have learned an enormous amount in some 60 years of experience. In peak oil and gas, we face a problem of enormous magnitude. Wind and solar are potentially good energy sources but they are very diffuse and intermittent. We will need to exploit every resource that we can find but we still may not avoid a catastrophe.

Waste is less of a problem if you burn up all the long-lived waste so that what remains has a half-life of only a few hundred years. That technology has been proven.

I would like to know what kind of proof you are talking about. I have done a little bit of googling and only turned up speculative designs.

There are two issues of course:

1) Given some complex mix of chemicals in various isotopes, one can probably devise a system to separate out the various components and then use neutron beams of the right energy or whatever to induce nuclear transformations.

2) Is there a way to do something like this but in a cost-effective way?

So I would really like to hear about working systems to eliminate long-life radioactive products from spent nuclear fuel.

Really, "proof" is a mathematical concept. It has some relevance to something like physics, less so to engineering, and almost none in the real world. A system that seems to be working on one day can turn out to be a miserable failure the next.

So I would really like to hear about working systems to eliminate long-life radioactive products from spent nuclear fuel.

Sure you can do that with fast neutron incinerator reactors. A liquid chloride fast neutron reactor fed actinides and other transuranics can breed thorium into U233 for liquid fluoride reactors, or it can just dump the neutron surplus into conversion of long lived fission products into stable isotopes.

A better question though is why bother? These things are pretty easily contained and monitored over at least a century, and one can very reasonably assume we'll have better techniques for managing waste by then. Chemical waste is toxic forever, but we don't have giant programs to convert lead into iron.

We'll do it if it makes sense, and if not, just stick it in an empty lot.

You haven't looked very hard.

Take fuel rods, melt them down, mix with liquid salt. Put in an electrode, run current, pull out a giant lump of metal. This is all the actinides. Melt down, make new fuel rods, you're done. Pour the salt into a drum, seal it, come back in 300 years and it's not dangerous anymore. Simple, easy, hardly the rocket science that the eco-dweebs would have you believe.

The only complication is that after you do this a few times, the resulting fuel needs to go into a fast breeder reactor, because thermal neutrons won't fission some of the heavy actinides. That being said, breeders are old hat; the French ran Phénix from the 1970s and the larger Superphénix later, though the latter closed in the late 1990s. That is hardly the only breeder reactor around.

In any case, it was eventually closed because eco-terrorists hated it (they even fired rockets at it) - god forbid we get a reliable and non-polluting source of power, don't you know - and because it was pretty costly. It didn't make much sense in an age of cheap uranium, but it was hardly a technical difficulty even using 1970s technology.

In any case, these various technologies are stupidly simple, a fission reactor is little more than a big pile of metal with some pipes for heating water, and they've been proven a hundred times over through the course of the last 40 years or so.

The issue with this is that there isn't much research to do with fission. Most of the research is related to making fuel rods that can last a really long time between changes, as that saves money and is a pretty hard problem. Beyond that, reactors were easy even in 1960; it's just a very simple technology. Of all the current reactor designs floating around, there's not the slightest chance that they won't work exactly as expected, modulo cost overruns in construction and excessive maintenance costs. There just isn't much research to do here; all the problems are solved, and none of them were hard to begin with.

1) Waste management. The industry doesn't really care much, because handling any volume of waste just isn't that expensive, but if you care, google pyrometallurgy for a better way to do it. There's not the slightest chance this won't work exactly as advertised; it's been tested and everything. The only question is how cheap it will be, and why bother. I think we should bother, but I'm not calling the shots. It makes the waste safe in about 300 years (as compared to 100 or 200 for a fusion reactor). Most reprocessing has historically been done with PUREX, as that's what the military used to make weapons, so why not use the same process? Once again, it works, and no civilian systems are terribly concerned with reprocessing waste, so why bother.

2) Breeder reactors. This was old hat in the 1960s; look up Superphénix. They cost more though, so people never bothered, because uranium is cheap. Some were built and ran for decades; the environmentalists always hated them, and they were generally shut down because uranium is cheap, so what's the point.

3) Proliferation. The US is already a nuclear power; I can't fathom how us using more nuclear power allows Bangladesh to get nukes.

The nuclear industry had a decades long record of arrogance, overconfidence, lies and secrecy. Not to mention a series of very serious disasters which were 'impossible' (Windscale, Chernobyl, TMI) and a legacy of radioactive waste and sites which will take decades or centuries to deal with (most notably on the military side, of course).

Which is not to say that big improvements haven't been made, eg in US reactor operation and safety post the 1979 Three Mile Island disaster. Or that in an age where global warming is the greatest threat, that it doesn't have a role.

It was an industry conceived in an age of great techno-optimism, when it was inconceivable that human action could permanently degrade the environment, and the opportunities for human progress were endless. Massive government subsidies were poured in to bring the industry to life.

Today we are vastly more cognisant of the risks of unintended consequences, of the biases of large government agencies and government-industrial complexes, and of the tendencies towards secrecy and denial of large bureaucracies.

In a curious mirror of global warming, we don't trust authorities as much any more. Just as people are not willing to trust any number of authoritative scientists on the reality of global warming, so they are not willing to trust scientists and engineers on the safety and efficacy of nuclear power.

Call it a post modern age. We no longer believe that there are truths, separate from their societal and social context and meanings. It is argued that global warming is a giant conspiracy by scientists to enhance their own position and funding, and weaken the United States. That is an argument entirely founded on views about how society works as a social and political construct.

This, above all, is the view of the world that a group of French philosophers has left us with.

I have always felt that it was ridiculous to place power metals in a deep mine and predict that they won't be touched for geologic ages. It's no different than locking up the Pharaohs' gold in a pyramid. For eternity.

The real problem with actinide stores is that some men will eventually purposely mine it to obtain the power metals.

Reprocessing and burning up the actinides is the only way to ensure that there is nothing in the waste depository that anyone would want.

Although today, I understand, archaeologists actually open the ancient cesspools to explore diets by inspecting the fecal remains. So even saying that won't happen is a risky prediction.

But the attraction of going after power fissionables will ensure that man WILL disturb the waste depository.

I don't think we will continue to use fission long term after fusion becomes practical, but some facilities will eventually be created specifically to consume the power actinides and remove them forever.

I used to be very much against fission reactors because of safety concerns. I wouldn't make that argument any longer because of the safety record of fission reactors. If we vastly expanded their use, there would be very serious accidents, but there is not a single technology which is perfectly safe and has a clean environmental bill. So it becomes a judgement call. That is a call we have to make collectively.

I do see a very problematic future for fission because of local politics, though; after all, who in their right mind wants one of these next door, even if the risks are small and understood? Moreover, federal politics has shown itself incapable of supplying fission technology with the needed regulatory framework and long-term infrastructure for reprocessing and fuel recycling for breeders. Thorium reactors might be a way out, but they seem to have a couple of decades of R&D ahead, still.

And finally... until the total cost of fission including waste storage is known, we can only assume that electricity from reactors is not significantly cheaper than electricity from renewables. But this would probably change if waste heat recycling (for industrial processes and heating) could be included into the equation. I don't know how serious R&D efforts in the US are to look into that, though.

And finally... until the total cost of fission including waste storage is known, we can only assume that electricity from reactors is not significantly cheaper than electricity from renewables.

The cost of waste storage is nearly all upfront; the rest is nearly zero because of discounting. Now the cost of geologic repositories might be huge, but they're political objects that are entirely unnecessary. Just store the waste in an empty lot for the next century and revisit the issue then. There will probably even be a market for it, with all the unused fuel and fission platinoids in it.
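The "nearly zero because of discounting" point can be made concrete with a present-value sketch. The 3% real discount rate and the $1bn nominal repository bill are illustrative assumptions, not figures from the comment:

```python
# Why deferred storage costs look small in present-value terms.
# Discount rate and nominal cost are illustrative assumptions.

def present_value(cost, rate, years):
    """Discount a future cost back to today at a constant real rate."""
    return cost / (1 + rate) ** years

nominal = 1_000_000_000          # hypothetical $1bn repository bill
pv = present_value(nominal, 0.03, 100)
print(f"${pv/1e6:.0f}M today")   # a $1bn bill 100 years out discounts to ~$52M
```

Whether it is legitimate to discount obligations to future generations this way is exactly what the replies below dispute; the arithmetic only shows why the accountants are unmoved.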

Right. Let your great-grandkids clean up after you.

I want my MTV!

Sure, just like we're dealing with the massive social ills caused by the lead pipes of the Roman empire.

Right. Let your great-grandkids clean up after you.

Right, do unnecessary work now and let your great-grandkids clean up the incurred public debt.

The 'economic pollution' of unnecessary government spending is a worse problem than the nominal cost imposed on future generations by interim storage of spent nuclear fuel.

Conventional reactors can also be retrofitted to breed/convert thorium to uranium. I worked on the successful thorium light water breeder reactor project back in the late 1970s. The Canadian CANDU reactor also offers lots of possibilities for using thorium and for minimizing long-lived nuclear waste. We have the technology. We just need to have realistic economic incentives against generating both CO2 from fossil fuels and long-lived nuclear waste from conventional reactors. See www.thoriumpower.com for some more info on one thorium fuel cycle approach.

The problem with any breeding configuration in solid-fuel reactors is that it implies an entirely separate reprocessing and fuel-fabrication regime that has to deal with fairly hot fuel elements, unlike fresh uranium before it goes into the reactor. This vastly magnifies cost and probably isn't worth it. We're more likely to reap benefits from fluid-fuel reactors or once-through light water reactors.

Now reprocessing with molten salts might offer some advantages, but all large scale reprocessing plants today use aqueous methods which are just in every way horrible. They are ideal for doing plutonium extraction for weapons production, but not so much for reactor fuel.

One good thing with small and medium size liquid fuel reactors that run at a high temperature is that they can deliver both thermally manufactured hydrogen and hot process steam to oil refineries. That ought to be a good way to make heavy oil and old refinery infrastructure last longer.

The problem with any breeding configuration in solid-fuel reactors is that it implies an entirely separate reprocessing and fuel fabrication

This omits the possibility of a system that could breed and consume fuel in-situ, without reprocessing. This would require fuel elements capable of achieving high burnup, but metal fuel elements have that property.

The problem would be keeping the reactivity within bounds as the fuel evolved. This could be done either by careful and frequent rearrangement of discrete fuel elements (a thorium-uranium near-breeding scheme in CANDU did this) or by use of an accelerator-driven reactor that can continue to operate as k declines (just turn up the accelerator; time share the beam between multiple cores so the accelerator's capacity remains fully utilized.)
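The "just turn up the accelerator" trade-off can be quantified with the standard sub-critical source multiplication relation: fission power is proportional to S·k/(1−k), where S is the external neutron source rate and k the effective multiplication factor. The specific k values below are illustrative assumptions:

```python
# Sub-critical source multiplication: fission power scales as S * k / (1 - k).
# As burnup pushes k down, the accelerator source S must rise to hold power.
# The k values used here are illustrative, not from the comment.

def source_boost(k_start, k_now):
    """Factor by which the source must grow to keep fission power constant."""
    gain = lambda k: k / (1.0 - k)
    return gain(k_start) / gain(k_now)

print(f"{source_boost(0.98, 0.95):.1f}x")   # k falling 0.98 -> 0.95 needs ~2.6x the beam
```

This is why the time-sharing idea above matters: a beam sized for end-of-cycle cores is badly under-used on fresh ones unless it serves several cores at once.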

I support fusion research even given the low probability of success because a success would be very important.

However, I think that equal level of commitment ought to be given to accelerator-based fission or other advanced fission cycles, because the fundamental physics is easier and there is less major technological development necessary.

Accelerator based fission (and other fast neutron schemes) can burn up the long-lived radioactive actinides which are the problem for waste disposal.

Accelerator based fission also has no meltdown/chain-reaction problem, similarly to fusion, removing the external inputs stops the reaction.

The fundamental advantage of fission over fusion has persisted since the beginning.

In the fission reaction you shoot a neutral particle at the nucleus. There is no interaction until it gets close enough for the strong force.

In a fusion reaction you have to squeeze two oppositely charged, and hence repelling, particles very closely together. If you just slightly miss---which almost all collisions do---you end up increasing chaos and driving towards thermodynamic equilibrium thereby reducing the reaction rate.
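The scale of that repulsion is easy to sketch. Taking the standard constant e²/(4πε₀) = 1.44 MeV·fm and an assumed round figure of ~3 fm for the separation at which the strong force takes over:

```python
# Order-of-magnitude Coulomb barrier for D-T fusion. The ~3 fm "touching"
# separation is an assumed round figure; the Coulomb constant is standard.

COULOMB_CONST_MEV_FM = 1.44   # e^2 / (4*pi*eps0) in MeV*fm
Z1, Z2 = 1, 1                 # deuterium and tritium are both hydrogen, Z = 1
SEPARATION_FM = 3.0           # assumed distance where the strong force takes over

barrier_mev = COULOMB_CONST_MEV_FM * Z1 * Z2 / SEPARATION_FM
barrier_kev = barrier_mev * 1000
print(f"~{barrier_kev:.0f} keV barrier vs ~10-20 keV typical plasma temperatures")
```

Quantum tunnelling and the high-energy tail of the thermal distribution are what let reactions proceed at temperatures far below the barrier, but the mismatch of roughly a factor of thirty is the root of the confinement problem - whereas a fission neutron sees no barrier at all.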

We knew about fusion reactions in the early 1930's and fission reactions in the late 1930's. There was a major industrial scale fission plant a few years later and full commercial operation just a little while after that.

We're not even close to Hanford with fusion.

I think advanced, well designed fission reactors are the way to go.

Accelerator based fission also has no meltdown/chain-reaction problem, similarly to fusion, removing the external inputs stops the reaction.

The only thing accelerator based fission buys you is extra control on criticality excursions. A 'meltdown' is still possible as much of the heat present is heat from decay chains. A better thing to do is just design a safe critical reactor for the hard epithermal spectrum for actinide incineration, and liquid chloride reactors are the way to go with that.

Or you can use modern light water reactors and stick the waste in casks. It's not like we're short of space for them.

The accelerator-driven thorium reactor proposed by Carlo Rubbia, Director General of CERN 1989-93, is safe against runaway. It is inherently sub-critical: only one spare neutron is produced per fission event, and even a tiny loss of neutrons, of which there must be some, will lead to the reaction dying away almost instantly without neutron input. Only the input from the accelerator keeps the reaction going, and the reaction rate cannot be faster than that determined by the maximum accelerator output and the fixed multiplication factor. Although there will still be a lot of after-heat from the fission reactions, it is much less than from a normal reactor, and his design uses passive convection of vast amounts of liquid lead in a deep chamber dug into the ground. Although it is not impossible to imagine a loss-of-coolant accident and melt-down (a very violent earthquake), with the correct choice of site it is vastly less likely than the already small chance of such an accident in a PWR, which requires pumped cooling.

The additional advantages of there being vastly more thorium-232 (almost all of natural thorium) than uranium-235 (0.7% of natural uranium) in the world, the massively lower production of long-lived actinides plus the possibility of transmuting them to shorter-lived isotopes, the lack of any need for enrichment, and the far lower opportunity for diverting material to weapons use, either by the operator or through raids by violent political or religious groups, make it an attractive option if we choose the widespread adoption of fission power generation.
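The "dying away almost instantly" behaviour can be sketched: with the beam off, the prompt neutron population shrinks by a factor k each generation. The k value and the prompt-neutron generation time below are assumed illustrative figures for a fast lead-cooled system (and note this says nothing about delayed neutrons or decay heat, which is why the after-heat caveat above still matters):

```python
import math

# Die-away of the prompt chain in a sub-critical core once the beam stops.
# Both input figures are assumed illustrative values, not from the comment.

K_EFF = 0.98          # assumed sub-critical multiplication factor
GEN_TIME_S = 5e-7     # assumed prompt-neutron generation time, fast spectrum

# generations for the prompt population to fall by a factor of a million
generations = math.log(1e6) / -math.log(K_EFF)
die_away_s = generations * GEN_TIME_S
print(f"~{generations:.0f} generations, ~{die_away_s*1e3:.2f} ms")
```

On these assumptions the chain reaction collapses in well under a millisecond, which is the quantitative content of "almost instantly".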

The accelerator-driven thorium reactor proposed by Carlo Rubbia, Director General of CERN 1989-93, is safe against runaway.

So is any critical reactor with passive safety features and negative reactivity coefficients, and you don't have to buy an accelerator with the package. Rubbia is an accelerator guy and sees everything as a nail to be solved with his favorite hammer. Accelerator-driven systems are a bad, expensive idea. An interesting one, but they're dumb.

The additional advantages of there being vastly more thorium-232

You can use thorium in critical reactors that are far more economical than any ADS.


Careful here, you aren't really talking about the same thing as the parent. The parent is noting that even a sub-critical reactor can melt down, potentially, and he's right. Energy generation doesn't stop when fission stops, but rather it slows gradually over the next few days, much of the energy comes from the rapid radioactive decay of the fission fragments.

In any case, modern reactors are safe. There is a 100% probability that coal will kill at least 300,000 people in the US alone this year, and every other year. The worst nuclear disaster in history killed maybe 100 people, and no matter how many ways you try to add to that tally, it is still far less than any given year of coal in a fairly large country. Even if you go all out and try to push the nuclear tally into the thousands, or tens of thousands, you aren't going to be able to claim (with a straight face) that it's on the same order of magnitude as coal's millions per year worldwide. Also, fossil fuels release more radioactivity than even nuclear disasters, to say nothing of lead and mercury.

In any reasonable comparison (number of people killed or sickened, etc.) civilian nuclear power is about the best out there, possibly as safe as or safer than some (if not all) renewables. Hydroelectric dams burst, mechanics fall off windmills, and those solar panels don't just make themselves, you know; hydrofluoric acid is no fun to be splashed with.

300,000 per year at a death rate of 8.26 per thousand in a population of 300M would mean about 12% of all deaths are caused by coal. This seems way too high. Also, coal at the moment provides more energy than nuclear fission. Fission deaths should include uranium mining, which is not without hazard and will increase as ore grades sink and more has to be mined for a given output. This somewhat closes the gap with coal, but I agree coal is more dangerous, and it is not free from radioactivity, as my graph shows.
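The division above is easy to sanity-check. A minimal sketch using the figures quoted in this exchange (the 300,000/year coal-deaths number is the parent comment's claim, not sourced data):

```python
# Quick check of the arithmetic: a crude death rate of 8.26 per 1000 in a
# population of 300 million gives the total annual deaths; the claimed
# 300,000 coal deaths are then taken as a share of that total.
population = 300e6
crude_death_rate = 8.26 / 1000        # deaths per person per year
claimed_coal_deaths = 300_000

total_deaths = population * crude_death_rate       # ~2.48 million deaths/year
coal_share = claimed_coal_deaths / total_deaths    # ~0.12, i.e. about 12%

print(f"Total deaths: {total_deaths:,.0f}")
print(f"Coal share of all deaths: {coal_share:.1%}")
```

So the claimed figure would make coal responsible for roughly one death in eight in the US, which is why it looks high.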

As regards meltdown of a thorium accelerator-driven reactor, I did accept that meltdown is a remote possibility and specifically mentioned after-heat, but in the design I mentioned, all that is needed to prevent meltdown is that the 10,000 tonnes of lead stays in the pit. No playing with the controls, failure of auxiliary equipment or fracture of piping or the containment vessel will lead to a meltdown. None of these would even lead to the temperature rising. It is unlikely that even a direct hit by an aircraft would do it.

My point is that in a modern design for a critical reactor, you're not going to have any chance of meltdown either. You aren't getting added safety worth the enormous infrastructure cost.

A better thing to do is just design a safe critical reactor for the hard epithermal spectrum for actinide incineration, and liquid chloride reactors are the way to go for that.

Does going to hard spectra for actinide burnup (to reduce half life of wastes) cause additional safety issues?

I thought that was the principal motivation for a spallation system, that you can get both fast and safe and with reasonably robust lack of complicated plumbing/chemical loops.

Anyway, as policy I would advocate actinide-burning extremely safe fission (accelerator as one potential to explore, not exclusively) as a major addition to fusion.

I also believe that existing light water reactors are still so much better than coal that we ought to use them regardless of the cost increase, but enough people don't seem to agree. I guess the nuclear option will have to present huge advantages, with answers for nearly all the traditional nuclear problems (not just relative superiority to coal), before it is accepted.

Why else the work on fusion if that weren't the approximate consensus?

Does going to hard spectra for actinide burnup (to reduce half life of wastes) cause additional safety issues?

Yes. You have a much smaller delayed-neutron fraction, so reactivity transients are faster and the reactor is harder to control.

I thought that was the principal motivation for a spallation system, that you can get both fast and safe and with reasonably robust lack of complicated plumbing/chemical loops.

You're trading one set of complications for much more complication. That accelerator won't be cheap. In any case, the accelerator system will either use a solid fuel system, where you have to have a reprocessing regime for fuel fabrication (not good, much more complicated), or you'll already be using some liquid chloride fluid-fuel regime that has all the plumbing you seek to avoid.

But in a fluid-fuel reactor, even a criticality excursion isn't _that_ bad. The fuel can get really hot, but the void coefficient is negative, as is the temperature coefficient, so it can't go on a runaway excursion. The worst that happens is bubbles in the fuel and melting of the freeze plug, and the fuel drains into dump tanks, which is an easy fix in a well-designed fluid-fuel reactor.

Anyway, as policy I would advocate actinide-burning extremely safe fission (accelerator as one potential to explore, not exclusively) as a major addition to fusion.

Critical thermal thorium reactors with liquid fluoride fuels are my preferred next-generation reactors; they don't produce long-lived actinides in the waste stream, and the actinide component of the fuel is small enough that you can just burn it in the reactor. I think we should start building hundreds of mature LWRs today, and fast-track liquid fluoride reactors for the next generation.

Why else the work on fusion if that weren't the approximate consensus?

Because fusion is cool. That doesn't make it a good solution to energy problems, but it is very cool engineering. So was the Apollo program, and that was just as useless.

So a summary of this is: fusion is hot, it's almost working (the next generation of plant will probably generate power), but the technical start-up costs (especially tritium) will prevent large-scale adoption for the next 100 years, and as such there is no chance it will save us from peak oil?

Anyway, a well-written article explaining the pros and cons of the current state of the industry more clearly than anything I've read in a while.

A great summary with some extra insights for me, thank you!

There is another obstacle, which I would call socio-technical:

A fusion plant of the proposed design relies on a very, very high level of industrial infrastructure and knowledge.

To name just one example: The production of the required tons of super-pure materials for the blanket currently poses huge problems even for the most advanced nations in the world.

So it is not just a matter of devising and licensing the proper blueprints for everyone to be able to build such a plant. The necessary level of sophistication and therefore complexity of the supporting infrastructure and its management processes might well turn out to be beyond the achievable, particularly in societies under serious energy depletion pressure.

Look at how difficult it is for developing countries to even build a "simple" nuclear fission reactor - usually, they cannot do this on their own, and some cannot do it at all, even though all principles and even medium-scale blueprints must be considered "public domain" by now.

Re spending money on fusion: Sure, there might be some usable output eventually. But as with all investments, you have to look at the opportunity costs: How does the investment in fusion compare with other alternative energy investments in terms of effectiveness, timeline, amortisation etc.

If you took the money and financed low-tech combined heat and power plants with district heating where population density is high enough (all cities!), or large-scale solar thermal district heating or at least block heating, the numbers would make much more sense and the solutions can be deployed today.

Tech people often have the tendency to dismiss the pragmatic, alledgedly imperfect solution and rather support the research for an imagined far-off wunderwaffe as the solution to all problems. I think this is a classic fallacy.



These are research and development efforts, not technology that can be fielded today.
As such there is no guarantee that it will work out, but if we don't try we won't find out what is possible. Since the effort is at the edge of what our technology can handle, even a failure at least hones our collective skills in materials science, etc.

It would be dumb to bet all the development budgets on fusion, and crazy to use the "deploy today's technology" budgets, but if we leave no money for this kind of research we lower the chance that things will be a lot better in the medium-term future. The same goes for breeder reactor research, novel solar cell research, etc.

And we should definitely go ahead with this even if the doomers are right and things go to hell regardless of what we do. Then future generations can say that during the golden age they lit and held a sun for a short while before losing it. Not a bad pyramid to leave as heritage; done once, it might someday be repeated in a better way.

Well, money can only be spent once. If the cost of fusion research were just a couple of million per year, I would agree with you. But completely in line with the law of diminishing returns, the intellectual and financial effort is huge.

It is a bit like playing the lottery: to have any sizeable probability of winning, you have to spend a great deal of money and still have no guarantee.

It is also a bit like dismissing a readily available and effective, but painful cure for some short-term deadly illness because you want to put the money into research for a less painful cure that might be developed one day. Oh, and you also withdraw the doctors from the sick, because they are needed to do the research.

Sorry, but I can't follow you at all.


My standard argument is that we should do what we can do right now, since we cannot depend on long-term breakthroughs; good things will probably happen, but we cannot be sure about what and when.

Unfortunately I don't usually insert a "mostly", since we also should do research and development for the long term, and to learn more about how the physical world works. I would, for instance, be happy to see a Hubble 2 fly and have us find out more about things distant enough in space and time to never, ever be of any utility for energy or for feeding a physical need. That's an example of very expensive science, the kind of project we can only do a few dozen of at a time, most of them with international cooperation.

Doctors are withdrawn from the sick all the time to do research. But it would be smarter to minimise the growing paperwork, not the research. When arguing about resources for expensive research, energy or otherwise, versus good ideas for energy investments, I would rather argue about good ideas for energy investments versus wasteful lifestyle habits.

But this is slightly religious for me. What is the point of resource abundance if we don't gather new knowledge and ideas? If I could have ten million dollars or lots of insight into how the world works, I would not hesitate in my decision. If, for instance, the USA dismantled all research and creation of new culture but managed to keep a large middle class happily fat while doing zero-cost recycling of old culture, I would regard your existence as pointless from my selfish point of view. OK, not completely pointless, since I like to see happy people. ;-)

I am also fascinated by the achievements of modern science, like deep-space observation, but my fascination tends to be curbed once I think of all the lost opportunities to spend the money on sustainability. Likewise, my fascination with modern weapon systems like, say, the Eurofighter gets crushed as soon as I realise it's all for killing people.

Spending money on deep-space science or, if you will, on fusion research that might bear fruit in 50-100 years, while we are facing a multi-crisis threatening the very existence of humankind on any meaningful level, is like fiddling while Rome burns, or the band playing on while the Titanic sinks.

Until we do reach a sustainable equilibrium, I consider the sums spent on trying to find out how the physical world works on sub-particle levels or in its cosmological context rather obscene.

Re your spiritual point: I consider myself an atheist and have been described as a rather too intellectual and knowledge-oriented person (as opposed to a more empathic, intuitive one). Nonetheless, I feel that the meaning of life can only be true happiness, not in the consumerism/incentives/competition sense but in the fundamental de Mello sense: being connected to reality, being aware, and feeling love for the world and the people around you. This type of happiness would almost automatically lead to a more sustainable world.

Gathering knowledge is fine, as long as it does not preclude solutions for the existential problems of humankind.



I am perhaps not as nice a person as you. I use some of my time to think about how weapons could be more efficient, and I appreciate that what is left of the Swedish military after the cold-war draw-down has very good weapons; I figure some of them could be even better. But I regard violence as a last resort when everything else fails. The biggest problem with violence is that it works for creating and maintaining power; if violence never were a solution, it would not be a problem. It is of course no solution for having a nice time: the use of violence shatters the pleasant, relaxed naivety, the sense of not feeling hunted, that I and many others enjoy.

Btw, scrap the Eurofighter and buy less expensive Gripen instead. ;-)

Perhaps more seriously, the difference in R&D productivity between countries shows that we probably can do more globally. I could find examples in the weapons area, but why not take cars instead: tiny Volvo Cars, owned by Ford, and tiny Saab, owned by GM, have so far managed to be development-efficient enough not to be dismantled by their financially hurting US owners, but rather used as technology development centres.

That knowledge gathering, even expensive knowledge gathering, should preclude solutions to the physical and existential problems of humankind is odd when we live in the era of maximum oil production and maximum fossil fuel production. We have never had so many physical resources available, and we can now physically make many investments in parallel. The big resource problem is probably to balance investment against consumption: a double ice cream today, or small ice creams for a very long time and the possibility of developing new tastes?

It is of course true that we live on only one earth and that a lot of things are going badly, but human culture is not homogeneous in its strivings and priorities. We will probably never have a global prioritisation list, and that is OK, since we need competition and multiple solutions to get around the problem that one human, or one organisation of humans, has a limited capacity for knowledge and good decision-making. But things will be bad for a lot of people if many people don't think about long-term problems and their solutions.

This line of thought is heading out into the blue; I would probably be wise to leave it and get on with my life. :-/

I do appreciate the exchange!

(But have to go home now... ;-))



PS: And yes, I think I will rather buy the Gripen than the Eurofighter - next time, definitely! :-)


...while we are facing a multicrisis threatening the very existance of humankind on any meaningful level,

is wholly egocentric. The first and second worlds may be impacted severely by PO, but there are many peoples who use neither oil nor natural gas and therefore they will not be affected.

That's funny, because even the Third World needs fossil fuels, and in a more existential way. If they lose even a fraction of what they have now (for whatever reason), people will die, not just experience a minor discomfort like the First and Second World.

Right, there are some indigenous tribes that live without fossil fuels.

But even they will be very much affected by the catastrophic climate change that will be accelerated by not-so-clean "substitution" fuels such as coal-to-liquids, for example.

This is one world. All crises are interlinked.

And even if the argument were egocentric, the "ego" still consists of about half this planet's population!



Very nice review, indeed. I will add: don't forget that peak oil is only the first of the coming fossil-fuel peaks. After it will come peak gas, and then peak coal. Now consider the amount of fossil fuel used to build a reactor, including the mining and processing of metals, the building of the reactor, the transportation of materials, and the handling of radioactive wastes. The real cost of nuclear reactors in the far future will be the current one plus the cost of replacing all that fossil energy with sustainable forms, i.e. electric power, hydrogen or biofuels (of course this also holds for fission reactors, even breeders). By how much do we have to multiply the current estimates? It is not even obvious that the EROEI will be > 1.

It is not even obvious that the EROEI will be > 1.

It's obvious to anyone who measures the energy inputs.

It seems equally obvious to anyone that, by a very long shot, not all the energy inputs for a working production model system are known, and won't be known until we actually get there.

You're kidding, right? Stop and think about this for a minute. A huge reactor that will make energy from water for several years, gigawatts continuously. I don't know what you think they're going to be doing to build this thing, but unless it requires lighting a fire in a coal mine, the EROEI (I shudder to even use that useless term, but will do so here) is likely to be in the dozens at least, possibly the thousands.

"I don't know what you think they're going to be doing to build this thing, but unless it requires lighting a fire in a coal mine"
Did you read, or even skim, the article? There are a LOT more complex supporting and sustaining problems than simply putting water in a huge reactor. Lighting a fire in a coal mine? No. Containing and sustaining a piece of the Sun on the Earth's surface? A little closer.

I believe the point is that we don't know the EROEIs of the energy inputs of the future, though we can reasonably expect them to be less than those that exist today, which to all appearances are less than yesterday's.

This gets to the central problem of all the proposed 'alternatives': what happens to the EROEI of any alternative as the hydrocarbon platform decays (declining EROEI of hydrocarbons) and ultimately collapses (too few hydrocarbons to maintain the platform)?

It also points to the profound wisdom that reduction in energy consumption is the surest way to sustain the project of civilisation.

Until today I supported continued public financing (where is the fabled private sector?) of research into fusion. Now I wonder if it isn't another unaffordable luxury.

The simplest thing to do is look at the most obvious alternative, nuclear fission, and its energy inputs.

Nearly all of them in light water reactors are in enrichment, but you don't need enrichment if you use CANDU or breeder reactors.

And for uranium itself:


I'll be impressed when I see uranium mining uranium, uranium building reactors and peripheries including such things as the required education and training facilities, uranium building and maintaining the electricity distribution network...

I rather use electricity. :-)

A great summary, and thank you!

As long as I have been looking at this stuff (bizarrely, since I was 7, but that's a long story) commercial fusion power has always been '50 years out'.

I suspect it will stay '50 years out'.

It's still worth pursuing, the option value of controlled nuclear fusion is so large that it's worth having.

The big concern, as always, is that it squeezes R&D budgets which might be used for much higher and sooner return activities: eg energy conservation. Nuclear fusion is like high velocity atom smashers or manned space flight: it eats government science budgets.

Mainly it pays the salaries of otherwise useless physicists and nuclear engineers. The description of ITER makes it sound like another Rube Goldberg machine, including the free lunch (it generates its own input tritium).
Another problem is explosion: yes, I know somebody above said this was not possible, but there would be a substantial amount of very hot material that could react with normal materials, creating an old-fashioned explosion. Oops.

It's come a very long way, but it doesn't move quickly. There is little doubt that if we made a commercial-sized reactor (energy leaks in proportion to surface area but is generated in proportion to volume, so bigger is better) it would generate more energy than it uses; even the prototypes come close to that already. The thing that will really be 50 years out is making it do so at anything approaching a reasonable price.

The most likely scenario is that for a VERY long time fission will be much cheaper. Fusion might still be very nice for space probes or something, though. I don't think there are many doubts that it will work now; at what cost is the question.

Fusion could produce immediate results if we detonate H-Bombs to cripple the world economy. Instantaneous Demand Destruction.

Did you consider applying for open positions at Al Qaeda? They are working on that plan as we speak. But all they would need is probably one tiny fission device detonated in a major US city. The resulting economic bust would set the US back by a decade, easily. Half a dozen devices and the US would be dead in the water, not so much because of the immediate damage but because everyone would move out of the cities. The restructuring cost that would follow from that stampede is hard to imagine.

If one is to believe the London Independent of Sunday, January 7th, 2007, the first city to be nuked this century will be Tehran by those wonderful people, the Israelis, the close allies of the only other state to use atomic weapons.


Seems it takes half a century for unproven technology to get relegated. Tonight's TV news had real footage of bicycles and biomass but CGI of a zero emission coal plant. The new Chevrolet Volt is all the rage but doesn't have a battery yet. I think concepts should show practical results within five years or should not figure in energy scenarios. Think of it as picking up shipwreck survivors in a lifeboat who say 'no need to row because the current will take us there'.

A zero emission coal plant is as science fiction as a commercial nuclear fusion plant.

A modern light water reactor is a commercial product. Write a big enough bank wire, and you get one.

"I think concepts should show practical results within five years or should not figure in energy scenarios." 5 years yes, 50 years, less so.

An excellent piece. It should be clear that a medium-future fusion/hydrogen economy is not a possibility. In summary:

* It will be over 40 years before the first commercial fusion reactor comes on line

* Electricity from fusion reactors - at least initially - will be hugely expensive

* It will be 2100 before fusion can supply more than 30% of Europe’s electricity - a timescale totally irrelevant to peak oil and even peak gas

* A fondly-imagined future of hydrogen-fuelled cars running on fuel produced by fusion-powered electrolysis would require a massive increase in electricity production. It is not going to happen. The only way it would happen is with a huge number of fission (conventional nuclear) stations - which seems impossible in terms of political acceptability and known uranium resources.

Probably, by the time the first demo reactor is being planned, the world economy will be too far gone (at best on the road toward sustainability, rather than plain poverty/anarchy) for it ever to be built.

So we face either a phase-out of what Kunstler calls "easy motoring" and all in our current society that goes with it in the next couple of decades, or go for a massive CTL (coal to liquids) programme that will postpone the inevitable for a few years and lead to a climate catastrophe.

All of this would be true if not for two simple facts: conservation and renewables. Put them together and what you get is actually an economic boom, not a bust, and motoring without excessive global warming. Motoring in compact electric vehicles and hybrids, that is, not in SUVs.

Agreed, it's a solution beyond the horizon of peak oil. If it wasn't hugely expensive (the real hurdle, imo), then we'd likely be able to speed things up by having more fission reactors to make more tritium. Yet another reason to use fission right now. It's cheap, green (though the greens hate it), and can produce the tritium that we could use to ramp up fusion, if we were so inclined.

Said another way: imagine that 150 years ago people had needed an international collaboration to drill a first prototype well (a single example) intended to demonstrate the possibility of getting some oil from the earth after 15 years, with an expectation that oil could start to be produced after 50 years and could be of some importance after 100 years... would you have believed that oil would be the energy basis of a new, extraordinary industrial world? ;)

That's a good point. But back then we had a coal-based economy that actually kept going and growing for several more decades. The problem is, how do we now get from here (2007) to there (well into the second half of this century) and keep running our society on the same path of ever-increasing monetary wealth and energy use?

And there is also the cost, which is money not then available for development of other energy technologies, some of which might actually produce something before mid-century. At: www.jet.efda.org/pages/content/news/2005/050628iterincadarache.pdf

construction cost is estimated at “4.57B€ (at 2000 prices)”, while on Wikipedia: http://en.wikipedia.org/wiki/ITER

“the program is anticipated to last for 30 years—10 years for construction, and 20 years of operation—and cost approximately €10 billion (US$12.1 billion)”.

You could maybe double that for cost overruns and associated fusion research work. Then double that again for the pilot plant if it ever gets built. So that’s $50 billion before you produce a Watt – if you ever do. That could buy a lot of railcars, trams and insulation foam.

Actually you couldn't buy much insulation or railcars with this amount.

The US$12.1 billion equates to about £6.4billion Sterling.

This works out at about £213 million per year for 30 years, and if you divide this amount by say, the 7 main participating countries (US, Japan, China, France, UK, Germany, Canada) then this is only £30.4 million per country per year.

Which is a drop in the ocean compared to other government spending and probably isn't even a small fraction of the current UK annual rail subsidy.

In fact with a population of approx 60 million, the UK probably has about 30 million active taxpaying adults.

This means that in effect, fusion research is only costing the average UK taxpayer approx £1 per year.

I can't buy many solar panels for a pound. So it looks like a damn fine investment from where I'm standing.
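The per-taxpayer estimate in the comment above can be reproduced in a few lines (the exchange rate, seven-country split and 30-million-taxpayer count are the commenter's rough figures, not official data):

```python
# ITER cost spread over time, countries and taxpayers, as estimated above.
total_cost_gbp = 6.4e9        # ~US$12.1bn converted at roughly $1.9/£
years = 30                    # 10 years construction + 20 years operation
countries = 7                 # US, Japan, China, France, UK, Germany, Canada
uk_taxpayers = 30e6           # rough count of active UK taxpayers

per_year = total_cost_gbp / years                        # ~£213m/year in total
per_country_per_year = per_year / countries              # ~£30.5m per country
per_taxpayer_per_year = per_country_per_year / uk_taxpayers  # ~£1 per taxpayer

print(f"£{per_year/1e6:.0f}m/year total, £{per_country_per_year/1e6:.1f}m "
      f"per country, £{per_taxpayer_per_year:.2f} per UK taxpayer per year")
```

(ITER's cost-sharing is not actually an equal seven-way split, so this is the comment's simplification, but the order of magnitude stands.)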


Spread out over 30 years that's a fair comment. But a few hundred million invested NOW in subsidising domestic energy efficiency and insulation would save a lot of future energy demand and keep working for decades. The UK's biggest - almost immediate - problem is in the next decade when North Sea oil and gas are running down, old nuclear stations closing and before the next generation of fission stations are ready. We are trying to find a supply-side solution when we should be tackling the demand side.

Actually I disagree.

I believe that we should be tackling supply side while we still have time. I believe that over the next 15 years we should concentrate on a high priority programme to replace at least 7 - 9 GW of installed nuclear capacity.

In the same timeframe we could also feasibly commission approx 6 GW of carbon sequestering coal, and if we have the political will we could also develop carbon sequestering CTL facilities as well.

Despite all of this we’d still be short of energy, but this will be dealt with by pricing mechanisms. And I don’t expect it to be a pretty scenario.

Your assertion that we should look to the demand side of things is a fallacy. If we distract ourselves by trying to save a paltry 20% of electricity consumption, we will still find ourselves short of both genset capacity and the fuel to meet consumption. (NOTE: installed capacity and consumption are two very separate things.)

We need to look at replacing & increasing genset capacity. And we should have started 5 years ago.

Supply side is where the action will be.


Throw in 20GW of wind, and you are very close to a viable electricity strategy for the UK. 20-25GW of wind is doable on a 20 year view (roughly 50/50 onshore and offshore).

There might be up to 5GW of Combined Heat and Power, biomass etc.

(one should also be looking to save up to 20% of the UK's electricity consumption via active demand management-- Ontario and California have both been quite creative in this area. If the utilities are able to turn off major appliances from 4-6.30pm, it has a big impact at least from a greenhouse gas perspective)

(8GW of nuclear capacity (which would be about 6 3rd gen reactors) is probably stretching it in 15 years, but is certainly doable in 20-- actually I could see 10GW in 20 years, it will take a long time to get the construction programme going).

The real problem will be economies of scale and construction. The UK's fragmented energy market makes the individual risk to any one player of 'going nuke' very high, and then the learning doesn't necessarily carry across to another operator.

I work for a large utility and we already have active demand management for peak loading.

Most large (discretionary) users are already on the same system, so we won't be able to save that much as you can't save a saving twice if you follow me.


Roughly 2/3rds of electricity demand is commercial or residential: more in some UK electricity distribution regions.

I don't believe we have fully cracked what can be done in demand management with those sectors. Everything from shutting off fridges and air conditioners (for 30 minute segments) to washing machines and dryers, for example.

Ontario has done a lot of work on this. My parents have a tariff that allows the utility to shut off their heating (and air con I think) for 30 minutes at peak periods. I don't know of any such schemes in the UK.

I do believe that there is probably relatively little to be tackled on the industrial side, with big users.

Is that 20GW of peak or continuous?

The terminology around renewables is always a bit problematic. 20GW wind peak is "only" something like 6GW average at close to 30% utilisation (UK offshore might be windier than I imagine, though). That is not bad, but not exactly enough to move mountains/EVs either. To move 20 million EVs with 10kWh/day consumption would require over 8GW of continuous generation alone. And that is probably on the low end of what is needed. What is the oil equivalent for the UK in terms of electrical power?

I would generally prefer we went from peak to average when discussing renewables, but that might not happen any time soon. In any case, I try to do that whenever I give a solar example: IMHO it is more informative to say that one m^2 of PV generates something like 20W on average: 120W peak from the panels into the grid and roughly one sixth of that on average for 4h/day of full sun in the Southwest of the US. The same system in the UK might only generate 12-15W/m^2, of course.
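The peak-versus-average arithmetic in this exchange can be spelled out in a short sketch. All of the inputs (30% wind utilisation, 20 million EVs at 10kWh/day, 120W/m² peak solar with ~4 hours of full sun) are the commenters' round-number assumptions:

```python
# Wind: 20 GW nameplate capacity at roughly 30% utilisation.
wind_avg_gw = 20 * 0.30                 # ~6 GW average output

# EVs: 20 million vehicles at 10 kWh/day each, spread over 24 hours.
# 2e8 kWh/day -> 200 GWh/day -> divide by 24 h for average GW.
ev_avg_gw = 20e6 * 10 / 1e6 / 24        # ~8.3 GW continuous

# Solar: 120 W/m² peak, equivalent to ~4 h/day of full sun.
solar_avg_w = 120 * 4 / 24              # ~20 W/m² average

print(f"wind {wind_avg_gw:.1f} GW avg, EVs {ev_avg_gw:.1f} GW avg, "
      f"solar {solar_avg_w:.0f} W/m^2 avg")
```

So the proposed wind fleet's average output (~6GW) would not quite cover the hypothetical EV fleet's average draw (~8.3GW), which is the commenter's point.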

20GW is capacity.

If you want to work out the contribution to total UK electricity consumption, you have to work in Gigawatt hours, not GW.

20GW capacity means that at some points, the UK will be 50% powered by wind power, and at some points, very little.

At 30% LF, that would give you about 52,560 GWhr pa. Or about 1/7th of UK current demand (and 1/8th by the time you had built all this). Neglecting transmission loss.

Whether 20% of total demand is feasible, I don't know.
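The load-factor sum above is easy to reproduce. The implied total UK demand is an inference from the comment's "about 1/7th" figure, not a quoted statistic:

```python
# 20 GW of wind capacity at a 30% load factor, over a year of 8760 hours.
capacity_gw = 20
load_factor = 0.30
hours_per_year = 8760

annual_gwh = capacity_gw * load_factor * hours_per_year   # 52,560 GWh pa

# If that is "about 1/7th" of UK demand, the implied annual total is:
implied_demand_twh = annual_gwh * 7 / 1000                # ~368 TWh/year

print(f"{annual_gwh:,.0f} GWh pa, implying ~{implied_demand_twh:.0f} TWh UK demand")
```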

Personally I would prefer to stick with annually averaged power instead of energy (a GWh would be 3.6e12J - as handy a unit as calories IMHO). But then... I am not from the energy business, so I don't like to use BTUs, either, and they are quite common. You are right, though, if the convention is GWh, so be it.

Is UK demand really that "low"? I would have thought it was more. I am probably misled by the size of the US energy budget... But I get it, 20GW is a serious contribution and probably beyond the limits of what the grid can take without getting into trouble with base load and peak load ratios. On the other hand, having 8,000 2.5GW or 4,000 5GW turbines around is no small feat, either.

That was meant to be 2.5MW and 5MW turbines, of course. I wouldn't want to be anywhere close to a 5GW model. How big would that have to be? If I am not mistaken we are talking a 2500m blade diameter and enough turbulence to suck you off the ground as the blade goes around once every few minutes? I guess that is not going to happen...

This is one of two 5MW turbines installed by REpower last year in the Scottish North Sea, in the Moray Firth:

It's in 44m of water, the hub height is 90-100m and the rotor diameter is 126m.
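For the hypothetical 5GW single turbine joked about a few comments up: assuming rated power scales with swept rotor area (a rough first-order assumption, ignoring everything structural), scaling from this real 126m/5MW machine gives:

```python
import math

# Scale a 5MW/126m rotor (figures from the comment above) to a hypothetical 5GW machine,
# assuming power ~ swept area ~ diameter^2, so diameter scales with sqrt(power).
ref_power_mw = 5.0
ref_diameter_m = 126.0
target_power_mw = 5000.0        # a hypothetical single 5 GW turbine

scale = math.sqrt(target_power_mw / ref_power_mw)
diameter_m = ref_diameter_m * scale

print(f"rotor diameter ~ {diameter_m/1000:.1f} km")
```

Roughly a 4km rotor, so the earlier 2500m guess was, if anything, optimistic.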

It seems to me that the major actor in any new UK nukes will be EDF. The most logical thing would be to give them the entire program for a new generation of plants, solving the problems of scale and construction (and of security and maintenance, which are a killer for operating costs).

But almost certainly, for ideological reasons, the UK government will insist stupidly on competition. So you'll get a scattering of other constructors: Germans, US/Japan (Westinghouse), perhaps the Swedes?

Another way of looking at this is that we are not investing nearly enough in energy. You are right, £1 per taxpayer does not buy much, but an annual investment of £100 over thirty years will buy a significant fraction of your energy needs in terms of renewables. In the case of the UK I would probably not invest as much in PV as I would in insulation and wind energy, but that is a different matter. If you look at £1/W long-term cost of wind, the £3000 investment buys 3kW of generating capacity for each taxpayer. That is something like 1kW continuous or 0.5kW/capita, which is not small on any scale.
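A minimal sketch of the investment arithmetic in that comment (the £1/W installed cost, the ~1/3 capacity factor, and the taxpayers-to-population ratio are all assumptions from the comment):

```python
# Per-taxpayer wind investment over thirty years.
annual_invest_gbp = 100
years = 30
cost_per_w = 1.0                 # assumed long-term £1/W for wind
capacity_factor = 1 / 3          # so 3kW peak -> ~1kW continuous
taxpayers_per_capita = 0.5       # roughly half the population pays tax

peak_w = annual_invest_gbp * years / cost_per_w        # W of capacity per taxpayer
continuous_w = peak_w * capacity_factor                # W continuous per taxpayer
per_capita_w = continuous_w * taxpayers_per_capita     # W continuous per capita

print(f"{peak_w:.0f} W peak, {continuous_w:.0f} W continuous, {per_capita_w:.0f} W per capita")
```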

What do you guys think of the inertial electrostatic confinement fusion? From here: http://en.wikipedia.org/wiki/Polywell

This to me looks like a much more promising solution compared to the magnetic toroidal confinement of the old Russian tokamak designs: an electrostatic trap for electrons, shaped by the electromagnets, creates a potential well in the centre that attracts fusible ions, which then fuse there. The inventor (Robert Bussard) worked under Robert Hirsch at the AEC. He gave this lecture at Google, trying to get funding for the next phase after the Navy cut the program: http://video.google.com/videoplay?docid=1996321846673788606

It's hard to believe that he cannot raise a few million to build WB-7 and WB-8, given the billions spent per year on these tokamak cathedrals. He says he needs 200m over 5 years to build a full-scale power-producing prototype.

Those who haven't watched the lecture should do so. I'm in no position to judge the physics, but Bussard is very convincing. Convincing enough to cause one hard core fast-crash PO doomer engineer I know to decide there may be a way out after all.

Bussard claims to have solved all the physics problems of electrostatic containment and scaled the power output by 5 orders of magnitude over earlier fusor designs, basically to the energy break-even point, using deuterium-deuterium reactions. He is promoting the idea that his machine could use the H-boron 11 reaction that would produce no neutron output, eliminating the need for the blanket found in tokamak designs and the problem of the machine becoming radioactive over time.

It all sounds a little too good to be true, but Bussard claims he needs only a few million to build a new demonstration machine to allow external verification of his design. Compared to the cost of ITER, this is chump change. I hope Google gives it to him. Bussard intends to publish all his research material starting this summer, so we shall see what the scientific community makes of it.

I saw this a few months back, and I also found it most interesting.

I should add that I have a background as a physicist (but not a plasma physicist). I had a roommate in graduate school who worked on magnetic mirrors (which ultimately got dropped in favor of the Tokamak), and his reaction would be interesting. There wasn't anything in there that struck me as obviously wrong, but it is people who do plasma for a living whom you have to convince. Some of them are going to try and shoot the thing down without trying to evaluate it - that's human nature. But there will be some who will sit down and look the stuff over with an open mind.

I would tend to agree that when the papers come out is when we will know for certain. Rather than raise a hundred million to build a full-scale machine, I would expect that the next step would be for someone to rebuild the last prototype and see if they can reproduce the results. If that comes out the way that Bussard has promised, then and only then would it make sense to really push to build a full-scale reactor.

If it sounds too good to be true, it probably is. Electrostatic fusion devices are, at best, the brainchild of people who do not understand plasma physics. At worst they are scams to get at people's money. You might part with yours more easily if you bought my bridge in Brooklyn. At least you wouldn't get a headache trying to think about the details of the scam you are falling victim to.

Did you notice that all these devices either fail to deliver OR get "magically" destroyed before they can be validated? That, of course, is due to the international conspiracy of plasma physicists who regularly send out their saboteurs to make sure they will not lose their funding!


Perhaps. But the guy isn't a backyard mechanic messing with stuff he found in the junkyard. He started with a decent reputation and he got funding for many years from the Navy.

Still, I would be most interested in seeing how the plasma physics community receives the papers that he is going to publish. Then we will get a much better indication of how much sense this all makes.

"He started with a decent reputation and he got funding for many years from the Navy."

I believe the Army or the CIA or someone also funded psychics and mind-control techniques for years, so that all by itself proves nothing. Not all military people are born at the deep end of the brain pool. You get technical geniuses among them like Admiral Rickover, and then you get total morons. The people who are funding this thing are more likely of the moron kind.

However... there are good reasons to study these kinds of devices: they are great neutron sources even if they never break even. I wouldn't be surprised to learn that the Navy doesn't really care about fusion but about getting access to a cheap and reliable neutron generator. There are tons of applications for neutrons and a reactor is not always at hand. Not sure I would advertise these efforts as much as they do, though. Most neutron applications they are interested in fall under national security...

"Still, I would be most interested in seeing how the plasma physics community receives the papers that he is going to publish."

First of all, I would assume that he is not going to publish any professional papers. If he did, he would have to go through peer review, which is going to expose errors in the theory, experimental methodology and data analysis. If these papers were to be published, and if they had professional quality, they would simply show that the devices do not work, which will make most experts respond with: "Yawn. What else is new?"

I am sorry to disappoint you, but these kinds of devices have been studied theoretically for a long time. They all suffer from the problem that electrons and ions thermalize long before the fusion reaction is likely to take place. But once you have thermal equilibrium, the same plasma and bremsstrahlung losses apply that limit any other type of fusion reactor. Hence, you are back to the usual Lawson criterion. I don't see what parameter space between accelerator beams and thermal fusion this device is trying to squeeze into. As far as I can see there is no region in between that can break even more easily than thermal plasmas.

There is one simple magnetic confinement concept that seems to work: linear mirror machines. But it turns out that they have to be made giant to achieve EROEI > 1. I believe the smallest of these machines are on the TW scale... i.e. we would only need a couple to power the whole Earth! And currently there is probably not even enough tritium to fill one of these babies even for a test run! Of course people are researching much more elaborate small geometries like levitated dipoles and compact toroids. But all of these machines suffer from the same basic problems.

So let's just assume for a moment that one could build a working IEC! Now you have to solve EXACTLY the same problems as with the Tokamak: you need to thermalize the neutrons and breed tritium, with exactly the same problems described in the article. Your wall materials degrade because the neutron flux is brutal, and because breeding is very inefficient it takes forever to get the tritium cycle going... and then you end up with vast amounts of radioactive reactor parts that need to be stored until they cool down. What a nightmare.

Even if you are not an expert, there is an easy way to distinguish good science from fakes: if people are spending billions of dollars and their lifetimes on something that is hard, it most likely is. Plasma physicists who are part of the community are not morons who missed something a guy can build in his garage. They also don't just care about employment, as some conspiracy theorists would claim. If IEC devices were possible, they would have been invented and made to work 50 years ago by the same people who went from studying pinches to Tokamaks and Stellarators and inertial fusion. These people would have gotten ten to a hundred times more funding than they did to put these devices into anything from your run-of-the-mill power plant to submarines and space propulsion systems. We would probably be colonizing Mars today if these things worked. The logical alternative is simply that they don't work. If you don't believe that, I would suggest taking an introductory class on plasma physics at a university. Part of the material covers the issues of confinement in different geometries. By the end of the class you should be able to decide for yourself whether IECs are possible or not. It's just plasma physics, not rocket science.

You're right to be skeptical, but wrong to dismiss this out of hand without taking even a quick look at the presentation. Bussard does have credentials in the field and clearly knows that his claims will require verification. His reputation is on the line. The problems that you rightly say must be solved by any fusion project are directly addressed in his presentation. He discusses all the various confinement geometries and explains the advantages of the one he has chosen in detail.

Yes, things that sound too good to be true usually are but that should not preclude keeping an open mind about the possibilities, something that I would have thought you of all posters would understand.

If Bussard fails to publish his work (in peer reviewed journals) he will be dismissed, but publish, he clearly intends to do. We will see.

The above poster is right. There is a fundamental issue which must be overcome, and there is no obvious evidence that it has been done.

Specifically, 'near misses' of protons against nuclei and electrons scatter them, causing chaos and descent to thermal equilibrium.

Nuclei typically have to have a near-collision thousands of times before they collide close enough to have a nuclear reaction. This is true in a tokamak and also in the electrostatic devices. Those collisions result in a push back to thermal equilibrium (and hence a much lower probability of velocity tails fast enough for fusion) and energy loss.

There is a theoretical PhD thesis by a guy at MIT from the mid-90s who analyzed, in general form, the best possible cases for non-equilibrium fusion, and the prospects aren't good.

Bussard must have an explanation why this guy was wrong (or assumptions different) and what magic technique he has as a feasible solution to this essential problem.

My best guess is that the electrostatic machine builders make the same mistake as the proponents of perpetual motion machines: they oversimplify one part of their calculation using some "common sense" argument, and that is where the whole thing breaks down. Physics is all about simplifying the theory to the point where one can actually calculate something numerically (in most theories one can't calculate much more than a few trivial systems without these simplifications). What separates professional physicists from charlatans is that the physicists always check whether ALL assumptions are actually correct. Plasma physics is a great example of a decades-long battle to align models and simulation codes with experimental data. They have it figured out by now, I would say. And the result is that everybody who knows how to calculate plasmas understands that the machines have to be huge and the required fields are near the engineering limits of our best technologies, hence the cost is very large.

I was struck by the amount of Tritium needed to kick start the process. It seems like a catch-22. Have I got it wrong?

Good summary, but ...

The other thing overlooked in all this cost / benefit / risk / probability talk is that the subject here, "nuclear fusion", is actually only the high-energy gas-discharge type of fusion, which has already had huge investments sunk into massive capital projects like JET / ITER and other Tokamak-based projects around the world.

There is a whole other nuclear fusion ball-game in low-energy / low-temperature fusion processes to be taken from research to fruition too. Much less macho and capital intensive, and not (so far) reliant on tritium supplies. As with the debates over fission, the big risky stuff gets the headlines over the safer smaller-scale stuff, which is too easily ridiculed and dismissed by the big macho boys with their big balls already in the fire.

This is significantly as much to do with human psychology and behaviour as "hard technology" issues.

Are you talking about various types of "cold fusion" (bubble implosion, etc.)? I was under the impression that it is not yet generally accepted that this works.

No, you have got it right. Although there is relatively little tritium in the chamber at any one time it takes quite a while before the bred tritium is recovered from the blanket. This is especially true of the designs using a pebble bed of lithium based ceramic spheres and beryllium spheres. A fair bit of the tritium stays embedded in them and they are not removed until a fair proportion of the lithium is used up. Also, until it reaches saturation the initially clean structure will soak up a fair bit of tritium. At start-up you have to keep pumping in tritium from stock until the recovery cycle gets up to speed. For a 4GW thermal, 1GW net electrical reactor this is about 4.3kg per week.

This and the small excess over self-sufficiency is why it can take three years from start-up before there is enough spare tritium to fund another start-up.

The Canadian CANDU system uses tritium. So there should be an adequate world supply, as long as those plants are kept open.


good summary of the issue.

Adequate world supply from CANDU reactors?

Nick states above:

Something like a 220kg per year of tritium will be consumed for every 1GW of continuous electrical generation, assuming that 4GW thermal will generate 2GW electrical of which 1GW will be used to provide all the inputs to the system leaving 1GW of output power.

Nearly all the world's supply of non-military tritium comes from the heavy water used to moderate CANDU reactors, and some of these will be closing down in the near future. The supply accumulated over 40 years of operation of CANDU reactors will peak in 2027 at 27kg.

And there's a graph. That doesn't seem adequate to me.
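The mismatch is easy to quantify from the two figures just quoted (220 kg/year of tritium per 1GW net electrical, versus a 27kg peak world stockpile):

```python
# How far does the world's non-military tritium stock go against fusion consumption?
consumption_kg_per_gw_yr = 220   # from the article: per 1 GW net electrical output
candu_peak_stock_kg = 27         # non-military stock peak in 2027, per the article

# how long would the entire CANDU stockpile feed a single 1 GW fusion plant?
weeks = candu_peak_stock_kg / consumption_kg_per_gw_yr * 52
weekly_burn_kg = consumption_kg_per_gw_yr / 52

print(f"entire stockpile lasts a 1 GW plant about {weeks:.1f} weeks")
print(f"weekly burn: {weekly_burn_kg:.1f} kg")
```

The ~4.2 kg/week burn rate also matches the 4.3 kg/week start-up pumping figure quoted earlier in the thread.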

Of course you are right in the context of commercial scale electric power production. We would have to build new tritium plants (not impossible).

That study I quoted is for building ITER and running it.

China is opening about 1GW of new coal-fired generating capacity a week. To envisage this rate of build-up of fusion reactors would require at least a tonne of tritium a year just for start-up charges. This would require the combined tritium output of about 25,000 CANDU reactors.
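A quick sketch of what the "tonne a year" figure implies, using only numbers quoted in this thread (1GW of new capacity per week, and the 4.3 kg/week start-up pumping rate from the earlier comment):

```python
# Reverse-engineer the start-up tritium charge implied by "a tonne a year".
new_capacity_gw_per_year = 52        # ~1 GW per week, per the comment
tritium_tonnes_per_year = 1.0        # "at least a tonne of tritium a year"
startup_pumping_kg_per_week = 4.3    # quoted earlier for a 4GW(th)/1GW(e) reactor

implied_startup_kg_per_gw = tritium_tonnes_per_year * 1000 / new_capacity_gw_per_year
weeks_of_pumping = implied_startup_kg_per_gw / startup_pumping_kg_per_week

print(f"implied start-up charge: ~{implied_startup_kg_per_gw:.0f} kg per 1 GW plant")
print(f"equivalent to ~{weeks_of_pumping:.1f} weeks of start-up pumping")
```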

Presumably though once you have one working fusion reactor, you use it to produce more tritium?

I assume the CANDUs are not optimised to produce tritium? So there might be some significant upside there, too, in terms of increasing yield.

They don't use tritium per se; it is a by-product of neutron capture by the deuterium in the heavy-water moderator...


I noticed on some of the charts that several D-D cycle reactors have almost achieved the same efficiencies that the D-T JET and JT-60 reactors have.

What's the technical issue with using only deuterium in the reactor? It seems to me that fusion would scale up much faster if we did.

You will note on the diagram the Q lines says Qdt. That is the Q factor for a deuterium/tritium plasma. The Lawson criterion for deuterium-deuterium fusion is approximately 100 times bigger. Many of the experimental machines were not capable of safely containing radioactive tritium so limited their experiments to deuterium. From these results it is possible to estimate fairly accurately what a deuterium-tritium plasma would have done in the same machine if it had been safe to do so. To obtain break-even with deuterium only fuel would require an even bigger machine or 100 times better confinement. Given the progress that has been made, the latter is not impossible but nobody knows how at the moment.

Nick, thanks for this informative overview. I don't know whether to laugh or cry. "We" start out on a mission to make clean energy from one of the most abundant elements (H) and end up with a reactor design that uses the scarcest isotope on the planet (3H). It seems to me that our physicists and engineers have lost sight of the objectives in pursuing physics and engineering problems for their own sake.

Add to the scarcity of 3H the notion that the whole of the reactor interior will have to be reconstructed at regular intervals using a robotic arm - I just don't ever see this working. You only need to look at the size of the sucker - it's huge!

All this to recreate on Earth the conditions of our Sun - it has to be simpler to just harvest the Sun's energy here on Earth. It looks like the whole global community is placing its eggs in this one gigantic basket - what if it doesn't work?

As per my earlier post, £1 per taxpayer per year in the participating countries is not "placing all our eggs in one basket"

It's somewhat closer to "a drop in the ocean".


As compared to one penny spent on renewables, you mean? Historically the research spending (in Germany, at least) had fusion at the top and solar, wind etc. close to the bottom of the list. If you look at the return on investment, renewables lead by infinity, and by the year 2100 they will probably still lead by a factor of 100.

I hear that 3He is abundant on the moon....

3He is "abundant", and could be used in an almost neutron-less and therefore much cleaner fusion reactor. But what is the cost to mine the lunar topsoil, extract the 3He and bring it back to Earth? Not to speak of the vastly more difficult confinement conditions for the 3He reaction...

A much easier task would be to connect a cable to the moon and convert its kinetic energy into electricity! All you need is a 400,000km long superstrong cable and a giant winch around the equator. But that is probably much easier than 3He mining and fusion.

AKA: 'Tidal Energy'

The tidal coupling is very weak and there are limits to how much energy can be extracted using available shore lines. The cable to the moon, on the other hand, can make that coupling arbitrarily large. And hey... it is about as likely as a space elevator.


I may have missed the most important part in reading this page. How is the electrical power actually generated here? In a fission reactor you just pass water, or a heat-exchange medium, through the reactor; it gets hot and you boil water. This water creates steam and you pass it through a turbine that turns a generator and you generate electricity.

Do you boil water with this fusion reactor? How is the electrical power actually generated?

Ron Patterson

Yes, you must have missed it. Sorry it was a bit long. I said somewhere in there:
"This heat, plus a little gained from absorbing the hard ultraviolet/soft gamma radiation emitted by the plasma, is transferred out of the chamber by a gas or liquid and used to heat steam, to drive a turbine, which turns an alternator to generate electricity."

Ah, the "beauty" is that fusion works like fission...thermalize the neutrons, boil some water, and spin turbines...

I don't see much hope for fusion, even less so now that I fully appreciate the tritium supply issue. Fusion is simply too expensive, too complex. The truly scary thing is that the vast(!?) amounts of money that have to be raised in political dogfights for this type of research would represent literally a drop in the bucket for one of the big IOCs. Having seen firsthand how big science is funded in the US, it would curl your hair to see how that money actually gets assigned to a given research project.

Here's another thought: by some estimates the US has spent $250 billion on the "War on Marijuana" over the past 10 years. Think of the alternative energy research that could have been done with 1/10th of that.....

It is much worse than that. For the cost of the Iraq war the nation could have reduced its energy dependence by at least 10%, which, incidentally would have kept oil prices way down, further reducing our energy cost, thus paying for more independence. Instead we increased our debt. Including future interest on that debt we gave up the one time chance to reduce our dependence by a very significant amount.

Of course, not all is lost. Washington could decide tomorrow to introduce an additional $1/gallon in gas tax and spend the money on conservation and renewables. Yeah right... that is going to happen... in my dreams.

Great article! thanks!

When I see the cross-sections of the ITER reactor, I can't help thinking about this book:

Incredible Cross-sections of Star Wars, Episode I - The Phantom Menace: The Definitive Guide to the Craft

The technical challenges facing fusion energy are staggering! Even if a proof of concept is reached in 2050, I don't quite see yet how a commercial plant could function. It seems obvious that the self-sustaining plasma conditions (Q>10) cannot be maintained for a long period because of the impact on the plasma container, so the ignition conditions will have to come and go. What would the power output pattern be? The probability of something going wrong at some point in such a complex system will probably be very high and will make the eventual generation of electricity unreliable.

In the EU almost € 10 billion was spent on fusion research up to the end of the 90s and the new ITER reactor alone is budgeted at € 10 billion. It is estimated that up to the point of possible implementation of electricity generation by nuclear fusion, R&D will need further promotion totaling around € 60-80 billion over a period of 50 years or so (src: Wikipedia). For comparison, the cost of the entire Apollo program is estimated at $135 billion (2006 dollars).

It comes back to what Euan said:

All this to recreate on Earth the conditions of our Sun - it has to be simpler to just harvest the Sun's energy here on Earth. It looks like the whole global community is placing its eggs in this one gigantic basket - what if it doesn't work?

Sunlight - we should be aiming for a future where we can live on energy flows - there's no shortage of energy flow from the Sun we just need a way to capture it. If I was investing €10s of billions over decades I'd be investing in photovoltaics with the aim of reducing the capital cost per peak watt from the €5 it is now to €0.05. That's only two orders of magnitude and on the face of it a far simpler challenge than getting fusion working. Not only that but it delivers *something* from day one.
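The "two orders of magnitude" target can be framed as a learning-curve question. A 20% cost reduction per doubling of cumulative production is the rate cited elsewhere in this thread; treat it as an assumption:

```python
import math

# How many doublings of cumulative PV production would a 100x cost reduction take,
# assuming a 20% cost reduction per doubling (the learning rate quoted in the thread)?
cost_now = 5.0        # EUR per peak watt today
cost_target = 0.05    # EUR per peak watt target
learning_rate = 0.20

doublings = math.log(cost_target / cost_now) / math.log(1 - learning_rate)
growth_factor = 2 ** doublings

print(f"{doublings:.1f} doublings of cumulative production needed")
print(f"i.e. production must grow by a factor of about {growth_factor:.1e}")
```

So "only two orders of magnitude" in cost still implies an enormous expansion of production, which is worth keeping in mind when comparing it with the fusion timeline.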

Remember there are lots of ways to use solar that don't need PV. The simplest maybe is a clothesline to dry laundry. I have used one for 60 years and guarantee its cost-effectiveness.

And then all the solar-thermal ways to get electricity. Maybe just too simple-minded to get anybody's enthusiasm??

And how about some windmills and sail ships?

And,if you insist on fusion, not to forget the put-put power plant. Just drop H-bombs one after another into a big hole and pour water in to get steam. Simple, cheap, fun!
Leaves a nice big crater when you're done with it; can be filled with rain and rented out as beach property.

Fascinating piece in this week's New Scientist on using 'flow thru' batteries on a windfarm in Tasmania.


If we crack energy storage, we crack a very big part of the renewables problem. The big problem with solar and wind is the intermittency. There's no other major obstacle to getting much or all of our power from those sources (given the current downward movement in costs per kW of capacity).

OK, I will say it again (what's the use?). Just make a bunch of small (relatively - actually, pretty big) hydrostorage systems everywhere. Water is cheap, sheet steel is cheap, and both last a long time. Pump/turbines are totally well known. So, when the great day comes and wimbi rules the world, every little patch of town will have a big water tank sticking up, with a megawatt-hour gauge on it. Citizens will take great pride, not to mention comfort, from looking up at the high reading on their local megawatt-hour storage meter. Every PV panel and every windmill, not to mention every stirling solar/biomass widget, will be pumping water into that tank from the local duck pond.

And don't stop there. Teenagers are a total pain in the ass because they are UNDEREMPLOYED. Each teenager must have a foot-powered water pump connected to the local storage tank, and has to pump at least a kWh into it every day before getting his chit to go eat.

So much for energy storage, street crime, parental mental collapse and obesity. The world is saved. Now I can go fishing.
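The tank in that comment can be sized with E = mgh. The 50m effective head is an assumed figure for illustration, not from the comment:

```python
# Size a water tank storing 1 MWh of gravitational potential energy, E = m*g*h.
g = 9.81                 # m/s^2
head_m = 50              # assumed average height of the stored water above the duck pond
e_joules = 3.6e9         # 1 MWh in joules

mass_kg = e_joules / (g * head_m)
volume_m3 = mass_kg / 1000          # water density ~1000 kg/m^3
side_m = volume_m3 ** (1 / 3)       # equivalent cube, just for intuition

print(f"~{volume_m3:,.0f} m^3 of water, e.g. a tank roughly {side_m:.0f} m on a side")
```

That is a seriously big tank per MWh, which is why pumped storage usually wants a hill rather than a tower.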

As I commented above, having all that plasma at millions of degrees is a bomb waiting to go off, when something upsets the containment, like an earthquake or a part failure. I'm simply applying Murphy's law.

No, it is not a bomb waiting to go off. Because the throughput is so fast, there is not enough energy in the chamber at any one time to even melt the front wall of the blanket. No failure of the cooling or magnets or electrical controls will breach the containment. The plasma is very hot, but with so little mass it will not heat large masses up. Even a containment fracture will not require evacuation beyond the perimeter, which is more than can be said for many chemical plants. Fusion is very different from fission in this respect. There is no vast thermal mass that will keep generating heat. Fuel does not sit there for months. Have a read of the safety report I gave a link to.

Great article - very informative. I knew tritium was rare, but wow, how rare. Fusion power would be huge for human history as it's perhaps the ultimate energy source in the universe. The applications will be epic in ultimate scale, going beyond terrestrial power generation. At this point, though, the neutron flux is brutal on the engineering. I read an earlier fusion link from The Oil Drum (Kulcinski, Univ. Wisconsin) about next-generation 3He fusion with much reduced neutron issues.
We can say fusion power won't happen in the next decades, but there's only a small cadre of fusion researchers now. It's been victimized by short-term outlooks and oil suppliers. I wonder if the post-ITER projects are going to China.

He3 is also very scarce and expensive. I recall an experiment where the cost of the He3 was more than the cost of the rest of the apparatus: about $250,000 for 375 cc (in its liquid state).


The main supply of helium 3 is from the decay of tritium. The next best source is mining it on the moon. Need I say more?

There is a now mostly discredited theory that the Permian extinction was caused by a huge meteoritic impact, based on trapped He3 in some rocks of extraterrestrial origin in China (unfortunately no one else could replicate the findings, and the original samples are, umm, "no longer available"; anyway, elevated He3 is an extraterrestrial marker). So perhaps instead of looking out for possible Earth-striker asteroids in order to push them away, we should be drawing them in. Just the big ones, mind -- the little ones wouldn't be cost-effective...

"Fusion power would be huge for human history as its perhaps the ultimate energy source in the universe."

Actually, it's not. Most of the energy released in the universe comes ultimately from gravitational collapse. But if you want to tap into fusion, it's really easy: put a solar panel on your roof or just a solar water heater. The Chinese are making these for something like $200 for the simplest models, I believe. Compared to the $100 billion invested in fusion over the next decades that is a steal...

"I read an earlier fusion link from the oil drum (Kulcinski Univ Wisconsin) about next generation 3He fusion with much reduced neutron issues."

Except that basically everyone in the fusion community is sceptical about us being able to reach the necessary confinement conditions with an EROEI>1. 3He fusion is a boondoggle.

"Its been victimized by short term outlooks and oil suppliers."

That is total BS. Fusion has consistently eaten up the largest fraction of the German research budget since the early 1970s. The total investment probably exceeds that in renewables by one to two orders of magnitude. Inform yourself. Or stop trolling.

This is the best overview about fusion I have seen. Thanks!

Wow... what a dire outlook for fusion this is. I always thought it was a poor technology to start with, but looking at the details it just seems to get worse. The fusion experts I talked to a decade ago (back in Munich), were hopeful to make reactors work at a cost roughly four times that of conventional electrical plants (relative to 1990s cost). At the time it still had a slight advantage over solar technology, but that has just completely gone away. At 4 Euro ($5) per Watt, solar can compete today. And since it is expected to drop in cost by roughly 20% for every doubling of production capacity, we can see enormous price advantages in the future. Ultimately, this progression will probably level off around $1-2/W simply because there are non-trivial installation costs for residential solar (somebody will have to climb on your roof to put it up there and I would assume they want to be paid living wages) while industrial solar might come in a little cheaper (but not by much). But even at $2/W solar is vastly more affordable, especially if we take into account that we don't have to wait until 2100 to make use of it. Large scale solar power generation can be installed by 2040, years before the first fusion plants start operation.

So what is the remaining advantage of fusion? Continuous operation. The question, though, is how important that will be. We have seen battery technologies evolve considerably over the last two decades and progress is still being made. Batteries and fuel cells can bridge days to weeks of intermittent supply. Hydroelectric storage can bridge fluctuations on the week-to-month scale. And finally, chemical storage using hydrogen or hydrocarbons can extend that to months, at which point solar (and wind) can supply much of the base load.

I always wonder where solar would be if the tens of billions that went into plasma physics had been invested in it... Not that I mind that the plasma physicists are having fun, I just doubt that it will ever return its money.

You are right that solar is a competitor but photovoltaic struggles to make 15% load factor untracked. Here in the cloudy UK my system achieves just over 10% with only 6% of the annual output during the winter three months when we need most heating. Fusion may make 85% load factor. The comparison for direct replacement is then the cost per 1W of fusion to the cost per 6W of photovoltaic plus the costs of sufficient storage to level out 6W intermittent to 1W continuous over the year.

Photovoltaic may still come out ahead in 30 years' time but it is a closer race than you imply. Batteries may have advanced sufficiently to consider for transport but they are still hopeless for levelling photovoltaic output. €0.15/Wh is about the best you can get and you need at least 2000Wh per 1W of photovoltaic generation to get full year-round levelling. Pumped storage is cheap but geographically limited. Steam electrolysis followed by a pure hydrogen-oxygen gas turbine is probably the cheapest storage that can be put almost anywhere. It has only about 60% round-trip efficiency but it would work, and it is cheap, fairly conventional engineering.
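For what it's worth, the comparison in the paragraph above can be sketched numerically. All figures below are the rough ones quoted in this thread, treated as illustrative assumptions (my ~$0.20/Wh is just the €0.15/Wh converted loosely, and the 6W-of-PV-per-continuous-watt ratio is the one used above):

```python
# Back-of-envelope check of the storage arithmetic in this thread.
# All figures are rough assumptions quoted in the comments, not data.
pv_cost_per_w = 4.0          # ~$4/W installed PV (figure from an earlier comment)
storage_cost_per_wh = 0.20   # ~$0.20/Wh, roughly the EUR 0.15/Wh quoted above
pv_per_continuous_w = 6.0    # W of PV needed per 1 W continuous (UK load factor)
storage_wh_per_pv_w = 2000   # Wh of storage per W of PV for year-round levelling

pv_capex = pv_per_continuous_w * pv_cost_per_w
storage_capex = pv_per_continuous_w * storage_wh_per_pv_w * storage_cost_per_wh
print(f"PV capex per continuous watt:      ${pv_capex:,.0f}")
print(f"Storage capex per continuous watt: ${storage_capex:,.0f}")
print(f"Storage exceeds generation cost by a factor of {storage_capex / pv_capex:.0f}")
```

On these (admittedly crude) numbers, the batteries cost two orders of magnitude more than the panels they are levelling, which is the point being made.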

Fortunately photovoltaic generation does not have to compete on these terms to capture a very substantial market and is doing so. However this still leaves the base load market where it would have to compete on these terms if it is to be a competitor with fusion.

As I said in my article, it will be a very different environment by the time large-scale fusion power becomes available. We will either have achieved something or died in the attempt.

"You are right that solar is a competitor but photovoltaic struggles to make 15% load factor untracked. Here in the cloudy UK..."

I agree... solar is not the right choice for the UK. On the other hand, if you happen to live in the Southwest of the US... it screams at you "PUT ME IN!". I went to Las Vegas a little while ago. The city is surrounded by miles and miles of desert with enormous amounts of solar radiation available for most of the year. There are enormous unused roof areas over there, just as there are anywhere around CA, NV, AZ, TX, etc...

"The comparison for direct replacement is then the cost per 1W of fusion..."

True, but that assumes you get fusion for free once you set the machine in motion. The cost to maintain solar panels is close to zero. Experience shows that one can expect 30 years of lifetime, after which the replacement cost will be a fraction of the original cost for current generation cells. I can only see solar getting cheaper as we go, while fusion will be getting more expensive with everything we learn from here on about what it takes to operate a machine.

"Photovoltaic may still come out ahead in 30 years time but it is a closer race than you imply."

I would bet an expensive case of wine on there being no operational and economic fusion plants 30 years from now while there will be vast amounts of economic PV around.

"Batteries may have advanced sufficiently to consider for transport but they are still hopeless for levelling photovoltaic output."

Batteries are only one way of storing energy, there are many others. Like I said... you can have different solutions on different time scales. There is absolutely no need to store energy for a whole year in either batteries or any other storage system. Storage for a month is as much as it probably takes for most transportation and electricity generation needs. Both could happen with hydroelectric dams and chemical storage. Hydrogen is but one option and the efficiency hit is not a real problem - every generation system has to be oversized and having hydrogen for transportation around might be the way to go, after all.

The sun always shines... it is only the clouds that cause a problem beyond the 24-hour time scale. In some parts of the world clouds are very rare. If I were the Saudis, I would be thinking very hard about what resources I had to replace oil revenue... by covering only 1% of their land area they could generate as much energy as they are getting from producing oil right now. I wouldn't be surprised if Europe ended up getting its energy from the Arabian Peninsula for a long, long time. Northern Africa comes to mind, too.

"However this still leaves the base load market where it would have to compete on these terms if it is to be a competitor with fusion."

As of now I don't see fusion competing with anything seriously, certainly not with wind, coal and fission (wind is limited, coal is dirty and finite, and I personally don't like fission and breeders, so I am advocating solar). I don't even see it working on any level where I would want to invest money in it. And long term it does not have much of a chance to cover needs that go way beyond our current ones, either. In another post I gave a back-of-the-envelope argument for why total power generation capacity cannot exceed current capacity by much more than an order of magnitude without causing global warming directly. Solar plants operating at 30-40% efficiency are as good as any other technology in terms of waste heat generation and have the additional advantage that they can actually modulate planetary climate. If we were to generate power any other way, we would need to put something very similar in place to get the waste heat off the planet, so why not go clean in the first place? Choosing fusion just because it is so much more sci-fi than solar farms is a poor basis for a decision. But I don't even think we have to go to farm-based solutions for now. Residential solar will continue to grow for a very long time, and once we make it part of the roof structure of new homes, it almost comes for free. People just have to get used to it.

There is more than just extra sunlight and lots of otherwise unused space that makes the Southwest of the US better suited to photovoltaic generation. The load better matches the available power, as it is biased by summer cooling, whereas in the UK it is in anti-phase to generation: domestic cooling is almost unknown and winter heating is universal.

The meteorological factors here add to the astronomical ones to emphasise the contrast between summer and winter generation. My 2.7kW array has generated a miserable 5kWh in the last week compared to 100kWh in the middle of last June. This is not a problem while solar penetration is so low and I can sell surplus electricity for about the same as I buy it. Because of this I can use the grid as a vast free energy store. My summer generation pays for a fair part of the winter input to my ground-sourced heat pump, especially as, with a large buffer tank, I can set it up to use a lot of off-peak electricity at night, which is about a third the daytime cost.

This scheme would not work if photovoltaic power were to become a large part of our generation capacity. If it were to become the major part of our generation then, with our pattern of generation and demand, we would be looking at storage over many months and the cost of that would exceed the cost of the generating plant. We have about 12GWh of pumped storage in the UK and not much room for vastly more. Whenever I mention this, Alanfrombigeasy usually comes in and says he is sure we do have room. I don't think he appreciates the problems there would be trying to persuade the Welsh and Scots to have their greatly treasured mountain tops flooded to store energy for mainly English use. Wales and Scotland have low population densities by UK standards but they are still high by the standards of much of the US.
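As a rough scale check on the paragraph above: the 12GWh of pumped storage is the figure mentioned, while the 40GW average UK electrical demand is my own assumed round number, not from the post:

```python
# Scale check (assumed figures): existing UK pumped storage vs average demand.
uk_average_demand_gw = 40.0   # assumed round number for UK average demand
uk_pumped_storage_gwh = 12.0  # pumped storage figure mentioned above

hours_of_cover = uk_pumped_storage_gwh / uk_average_demand_gw
print(f"All UK pumped storage covers about {hours_of_cover * 60:.0f} "
      f"minutes of average demand")

# Seasonal (multi-month) storage would be thousands of GWh by comparison:
one_month_gwh = uk_average_demand_gw * 24 * 30
print(f"One month of average demand is about {one_month_gwh:,.0f} GWh")
```

Minutes of cover versus tens of thousands of gigawatt-hours needed: the gap between what pumped storage provides and what seasonal levelling of PV would require is roughly three orders of magnitude.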

Maintenance of non-tracked photovoltaic generation is certainly low cost. In the nearly three years I have had my array it has been limited to an extension of the mop handle to clean off the bird shit. In fusion the cost of replacing the blanket periodically is sizeable, but if you fiddle around with the cost generator I gave a link to in the post, it is still a fairly low percentage. Amortised capital cost dominates.

Export from North Africa of large-scale solar-generated electricity, either photovoltaic or solar thermal, is a distinct possibility. I have talked about this in previous comments. High-voltage DC links to export it across the Mediterranean are becoming cheaper. I wonder if the solar chimney scheme they are having trouble bringing to fruition in Australia might have better prospects in North Africa with a link to Europe.

Alanfrombigeasy's scheme of bringing geothermally generated electricity from Iceland is also very good. I had better mention this before he does, if he is reading this. He has, I believe, put a lot of good work into it.

Wind is not very limited here. We have lots of stormy coast surrounding a fairly small area and the wind pattern is a much better match to demand. We also have lots of waves and the second highest tides in the world in the Bristol Channel.

In short there is a lot of renewable energy available to the UK and every good reason to expand it, certainly up to 25% of generated power. But this is the point at which storage starts to enter the picture. It is no problem at all at the present level, rises to significance at 20% or so, and rises to dominate costs at 40% or so. Fossil fuel back-up helps a little in the middle range but does not solve the problem at higher levels. Wind and solar have such a peaky distribution in time of available power that you cannot push the directly generated fraction much above 35% before your capacity starts to exceed demand for significant lengths of time while still generating only a small fraction of demand at others. Without storage you have to throw away the available excess power. You hit diminishing returns. Substantial amounts of extra wind or solar capacity fill relatively small parts of the dips while ever-increasing fractions of the available power are thrown away.

If renewable generating capacity is 120% of peak load and the load factor is only 25%, it would probably supply at best 25% of total demand, as some available power would be thrown away. Demand is also very peaky. Peaks of available power will frequently coincide with troughs of demand. The wind speed distribution and the very non-linear speed/power characteristic of wind turbines mean that even with capacity at 120% of peak demand there will fairly frequently be times when only 10% of demand is met. Photovoltaic will of course give zero during the night. Doubling the installed capacity to 240% of peak will only bring that 10% up to 20%, still leaving 80% to be met from other sources, not much of an improvement on the original 90%. It will however greatly increase the power thrown away. Without storage in the thousands-of-GWh range, meeting demand with renewable generating capacity becomes ever more expensive.

It is for this reason that I said that if fusion has any hope of competing, it is for base load and the cost comparison for renewables is cost per watt multiplied by a factor derived from the relative load factors plus the cost of storage to smooth out variations over an annual scale. Even with this handicap it could well be that wind, wave, tide and imported solar and geothermal win.

Re: A timetable has been proposed for the overlapping development of the various proposed devices. It assumes that the only obstacles to its implementation are technical ones and comes with many caveats, but it sees the first commercial power station operational in 2048


Just wanted to emphasize this point. I'll be dead but I'm betting subsistence farming will be all the rage at that point — if you have weapons to keep the freeloading intruders from stealing your food, raping your... — nevermind, screw it. Maybe it won't be that bad.

By 2048 I see wind turbines and PV supplying 50% of the base load thanks to advances in energy storage technology. Not that we will need so much energy per capita in 2048... we will have learned to do much more with less. I agree about one point, though... I will be dead by then myself.

Where's the other 50% going to come from?

A lot can come from hydroelectric plants, as it does now, and nuclear fission will not go away. Finally, there is coal, and that won't go away for a long time either, no matter what we wish for or how bad global warming gets.

Sorry, guys, but you sound like old farts.

Maybe this prototype will never become a working machine, but I gladly contribute my 1 pound a year (rubles in my case) because it’s science. Actually, it’s BIG SCIENCE. Such giant machines are what make me proud of being a human.

You should contribute more to CERN, then. For one thing, it is way bigger science, for another, it is a way bigger machine. And finally, it is guaranteed to work and produce the results it was built to produce.


Chris V. et al. -- excellent and fair summary of the ITER/DEMO magnetic fusion (i.e. tokamak) route to fusion power!! Well done!

Couple of comments:

-First: full disclosure: I work in inertial confinement fusion (ICF). At Lawrence Livermore National Laboratory in California, we're around 4 years from ignition and burn of a DT ICF fusion fuel capsule with energy gain ~10 at the National Ignition Facility ("NIF"; see http://www.llnl.gov/nif/project/pdf/NIF_Facts.pdf). This will be some 10-15 years ahead of ITER. Both NIF and ITER are in the business of demonstrating the scientific basis of fusion burn and energy gain; success on these facilities will be the culmination of some 50 years of immense scientific endeavor in making fusion work at all. The work after NIF/ITER is the path to a truly attractive reactor product (see below) and this probably won't look much like our conventional view of the DEMO today.

- Why are we pursuing fusion? Two reasons:

(1) It's the only energy source indigenous to the earth that will last as long as the earth lasts, i.e. billions of years. Thus, one can view it as a cheap insurance policy (say a few $B/year) for what is a truly unlimited energy source. We have time to get this right!

(2) Fusion's competitor in the near term (this century) is breeder fission. I'm a great fan of fission energy and believe we need a crash program now to offset the CO2 problem (fusion, when it comes, will probably be too late to contribute to the initial stabilization; its job will be long-term CO2 maintenance). Today, a fission breeder appears cheaper and less complex than our conventional roll-forward DEMO-like reactor concepts. However, fission is burdened by the important externals of safety, environment, proliferation and long-term waste disposal. Proliferation is a particular concern for breeders and the associated reprocessing. So, an interesting question is: what are the minimum performance requirements of a fusion power plant in terms of size, cost, complexity, etc., so that fusion is deemed competitive overall with breeder fission when the latter's externals are folded into the equation? The bottom line is that a fusion "core" can, in principle, be bigger and more complex than a fission core and still be competitive overall!

- Regarding solid first walls and blankets in magnetic fusion, Chris Vernon correctly states: "The materials used to make the breeding blanket and particularly the first wall facing the plasma need to survive an extremely severe combination of conditions .... there is almost no chance of a breeder blanket that can survive the full life of the reactor.... the economics of a future power station will depend heavily on how hot the blanket can run and how long it can survive before replacement and how fast it can be replaced. "
Thus, in inertial fusion energy (IFE), we're striving to design our IFE reactors with (neutronically) thick, free-surface flowing liquid walls -- e.g., liquid Li, LiPb or FLiBe -- that are lifetime components. There's no need to shut the reactor down for a blanket change (equivalent to a refueling outage in fission). All solid structural materials are shielded behind this liquid stream; thus they're also lifetime components and qualify for near-surface burial at end of life. (An additional advantage is that we don't need to pay for a separate IFMIF materials test facility to qualify solid materials against high neutron flux damage.)

- Final general comment on fusion. So far we have been talking about "thermonuclear" fusion. Note that the size, cost and complexity of conventional thermonuclear fusion reactors -- both magnetic and inertial -- are fundamentally governed by the need to sustain a minimum value of the plasma temperature of ~10keV (100,000,000-degC) in the face of significant loss processes. This minimum temperature is necessary so that energetic ions in the tail of the thermonuclear Maxwellian plasma have sufficient energy to tunnel appreciably through the repulsive Coulomb barrier and, thereby, induce acceptable fusion reaction rates. Fission, by contrast, has no Coulomb barrier as it's propagated by neutrons. Thus fission can proceed at zero fuel temperature and you can make a small cheap fission core (But, the downside is you can also make a small cheap fission device on a tabletop -- i.e., a bomb!).
Very modest changes in the Coulomb barrier geometry can have a profound impact on the fusion cross section and reaction rates. This suggests that one approach to achieving a fundamental step-change in the physics and, perhaps, the economics of fusion is to circumvent, at some level, this high-temperature barrier threshold of the conventional (thermonuclear) cross section. This would be particularly advantageous if we are ever to realize economically attractive fusion reactors based on the so-called "advanced" fusion fuels like p-11B and p-6Li. At LLNL, we have examined several methods for enhanced barrier-penetration fusion. These include, for example, muon-catalyzed fusion. Here a muon acts as a very heavy electron. It orbits a D-T nuclear couplet at ~0.005 Å and hence screens the Coulomb repulsion down to this radius. The barrier penetration probability goes through the roof, with the profound result that the optimum temperature for muon-catalyzed fusion reduces to only ~600 deg C (compare this with 100,000,000 deg C for thermonuclear fusion). In general, this is an interesting area to explore for a true step change in the fusion reactor realization. Remember, we have an infinite fuel supply!

Hello John
Thanks for the information. My write-up was limited to magnetic fusion because it was about an event organised by the UK Magnetics Society, and magnetic fusion was the subject, not because I discounted other routes to fusion. It is not clear from the link you give what the NIF experiment will involve. The size of the facility is enormous. It makes ITER look quite dainty, but it is not clear what is at the business end. It is one thing for a scientific organisation to use acres when describing its optics, but what is a BB-sized target? You say you will achieve an energy gain of 10. Is that fusion power to optical power? What is the electrical input, and is there a good prospect of achieving a high total energy gain given present laser efficiency?

You talk about liquid walls. One of the things that I have never managed to find is an explanation of how a commercial inertial power generator will work. I believe at the moment you use illumination that practically surrounds the target. How do you get the light through the walls? If you scale the system up to give a gigawatt of average electrical power, as you would need to do to be commercial, and need two or three gigawatts of fusion power delivered in a series of explosions, how do you protect the optics from the blast and the liquid splashing around? Even at 5 or 6 blasts a second, that is close to the equivalent of a tenth of a tonne of TNT in each blast. Will the experiment you describe as due in four years be more than isolated blasts, and will you have any means of collecting the energy?

ITER may not be a power generating reactor in its own right, but with the addition of a breeding blanket (which may be fitted in a later stage) it has the general outlines of a commercial reactor and may even be about the same size. Some of the outline designs of DEMO are no bigger than ITER but have several times the power output. Could you say that NIF bears any relation to a commercial reactor?

Muon catalysed fusion is interesting but from what I have read in the past the short lifetime of muons means the vast bulk of them will decay before they can catalyse a fusion event and the means of generating them used in particle physics has such an appallingly low energy efficiency that there would be no energy gain. Have you come up with anything that will change that?

Reactions like p-11B and p-6Li have Lawson criteria five or six orders of magnitude higher than deuterium-tritium. If deuterium tritium strains our technology in both inertial and magnetic confinement then without muon catalysis they seem out of reach for a long time at least by these means.

You say fusion is the only energy source indigenous to the earth that will last as long as the earth lasts, i.e. billions of years. Geothermal and tidal energy would meet those qualifications. Here in the UK we have in the Bristol Channel, with the second highest tides in the world, the ability, with a multi-basin scheme, to generate over 10GW of continuous power from tides. Solar in its broadest sense (including wind and wave) may not be indigenous but it will be delivered here free of charge for a billion years, if a bit erratically locally.

Looking back on that it seems very negative. I was not setting out to knock you. I hope your experiments go well. I would be delighted if you prove me wrong and get a fusion system up and running before magnetic fusion. It is not so outlandish a proposal that it is not worth the investment.

Just a quick comment on muon-catalyzed fusion... the problem is not the muon lifetime but the sticking fraction. After catalysis the muon has a finite chance of remaining bound to the He nucleus. If it sticks it is removed from the cycle. Depending on the conditions of the D-T target vessel, a muon could catalyze up to about 400 reactions. IIRC the work of Jones et al. demonstrated that you could get to an EROEI of 0.8 for a D-T mixture. It was a bit lower for D-D. However, if you used the neutron flux to breed Pu239 and U233 you would get a net energy gain. Someone should figure out whether it is cheaper to breed fuel this way than in a breeder reactor.

Absolutely fascinating how close we came to getting a back door into fusion power...

Fusion's competitor in the near term (this century) is breeder fission.

Why not conventional, non-breeder fission? This would probably presuppose that extraction of uranium from seawater could be performed sufficiently cheaply (and therefore that breeding or even just reprocessing would be unnecessary) but a Japanese group was claiming it could be. As I recall, reprocessing of ordinary burner reactor spent fuel isn't economical until uranium becomes quite expensive, and breeder reactors have been even more expensive to build and operate than burners.

I was at university in 1960. My physics teacher said that commercial fusion was 40 years off.

Now, in 2007, I am told the commercial fusion is 40 years off.

Great article, but it ain't going to happen. Fusion is the ultimate jam tomorrow.

I have to agree, and I'm afraid the techno-dreamers and Trekkies are going to be disappointed. Just as we have not been to the Moon for 35 years (though we might just get back there), we still imagine going to Mars, which, like a commercial fusion reactor, I think will never happen. In the table above (Dr. Briscoe's) there are so many things still to be resolved, and the description of the technology almost reads like building the Starship Enterprise.

The worry for me is that TPTB will not see it like that and adopt a "burn-everything" (tar sands, CTL, coal gasification) approach, in the hope that that would tide us through to a fusion-powered utopian future. Most likely what we would get from that by mid-century would be uncontrollable greenhouse warming, catabolic collapse as envisioned by J.M. Greer, die-off well underway so that we'd be lucky to be able to build a wood-stove.

Get real, get sustainable.

"The news was greeted with joy by Leo Mascheroni, a maverick fusion physicist who was fired nine years ago by Los Alamos National Laboratory in a head-on collision of scientific ideas.

"I love it," said Mascheroni, who lost his job after he proposed an alternative fusion energy laser that threatened the status-quo research of the secret military fusion program.

He says he was wrongly fired on trumped-up security violations, a contention that later was confirmed by Los Alamos-based DOE security officer Bill Risley, who independently investigated the case for headquarters.

"They punish me, my ideas and my family for 10 years," said Mascheroni, an Argentine immigrant who says he still has a lot of hope for the American system."

Accused at one point of being a spy and hounded by DOE security agents, Mascheroni said, "My case is the opposite, really, of democracy. When they did what they did to me, I couldn't fight back.

"I chose to live in hell for my ideas and I won't let them go now," he said "They still must resolve the scientific issues."

He continues to demand that DOE conduct an independent scientific review of the costly military fusion research program.


We continue to move ahead


Payne hasn't spoken to Leo [Pedro Leonardo] Mascheroni in some years.

Mascheroni is older than Payne, who is 45 days younger and, a ... willing, out-lived saddam.

Mascheroni was working alone.

We are not working alone.


I don't know what happened to Leo.

But what I saw many years ago was very sad.

Mascheroni's idea [My expertise is not in physical science. http://www.walmart.com/catalog/product.do?product_id=759906]
apparently required lots more energy to ignite to produce fusion.

Leo attempted to explain this to me.

Leo may have failed at his legal project.

We're trying to win at our legal project.


Stay tuned for the week of January 22, 2007.


hi dad!
finally got all the way through the article (not all the way through the comments though...). I think I understood about 70-80% of what you were saying, which is pretty good going!
off-topic slightly - I was wondering about that thing you were writing on wind power, and the reliability of over a 20-30% contribution to national supply. It's suggested that DC cables are now economically viable and will become even more so in the future. With DC cables and wind farms spread over a large area - the whole of the UK - thereby spreading the risk of low wind speeds, could the contribution be higher?

Hello Louise,
There are three advantages to DC transmission of high power that are of relevance to wind power but they do little about the need for storage when the proportion of wind powered generation rises.

The first advantage is that you can bury the cable underground or put it under the sea. The changing voltages and currents in underground AC cables induce currents in the ground which heat up the ground and suck power out of the cable. For high power long distance cables these losses are not usually acceptable. DC cables do not have these losses. This enables undersea cables for big wind farms far offshore. Pylons way out to sea are not very practical. In places where it is difficult to get acceptance of onshore wind turbines, not having pylons as well as the turbines in view can help to get them past the planning approvals. Underground cables are still very much more expensive than overground pylons but they are at least technically feasible.

The second advantage is that DC links remove the need for all the generators to be synchronised. In the UK, all generators are linked to a single AC grid. All rotating generators, coal and nuclear powered steam turbines, natural gas powered gas turbines, hydro-powered water turbines and those wind turbines that do not have a DC link, have to rotate exactly in step so the voltage swings positive and negative at the same time all over the grid. When the grid is operating at near maximum load it can get difficult to control. An overload in one place can cause one generator to slow slightly and this will try to drag down the speed of all the others. This can cause very large currents to flow and cause protection devices to trip to isolate it. Extra load can be transferred to the remaining generators, which can trip in turn, rapidly blacking out large areas as has happened several times in America. The larger the grid, the more difficult it is to control, and the unpredictable variation of wind power makes the problem worse. Wind power is not yet a problem in this way in the UK because there is so little of it, but in Germany it is nearing the point where it could be a concern. The link across the English Channel is DC. It needs to be because it is undersea, but it also removes the need to synchronise all of France's generators with those of the UK. Splitting the grid into parts with DC links between them would make the incorporation of more wind power easier.

The third advantage is that a given transmission line can carry more DC power than AC power. The maximum voltage the line can carry is set by the distance between the wires and the length of the stack of glass or ceramic disks that insulate the wires from every pylon. In an AC system the voltage swings positive and negative in a rounded sine wave 50 times a second here in Europe and 60 times a second in America. The insulation must stand the peak voltage of this curve but because the voltage is less than this peak value at other times and indeed is zero twice each cycle, the effective voltage, as far as power transmission is concerned is lower than the peak. An AC transmission line rated at an effective voltage of 275kV will have a peak voltage of 389kV. If the line is converted to DC the voltage can be turned up to a constant 389kV and carry 41% more power. This is not much of an advantage for DC transmission lines built from scratch as the money saved on a smaller transmission line compared to an AC line of the same power rating is offset by the greater cost of the electronic voltage converters at each end that replace the simple transformers of an AC system. However it does allow existing AC transmission lines to be up-rated at lower cost than a new line. This is important in the UK as the wind generators being installed in windy north Scotland are putting a strain on the lines transmitting the power to England where most of the load is.
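The 389kV and 41% figures in the paragraph above are just the square-root-of-two relation between RMS (effective) and peak voltage; a minimal check, using the 275kV UK line rating quoted:

```python
import math

# Why uprating an AC line to DC at its peak insulation rating carries
# ~41% more power (same conductor current assumed).
v_rms = 275e3                  # effective (RMS) AC rating in volts, UK figure
v_peak = v_rms * math.sqrt(2)  # the insulation must already withstand this peak
extra = v_peak / v_rms - 1     # fractional power gain at the same current

print(f"AC effective voltage: {v_rms / 1e3:.0f} kV")
print(f"Peak voltage:         {v_peak / 1e3:.0f} kV")
print(f"Extra DC capacity:    {extra:.0%}")
```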

However the problem remains of diminishing returns on extra wind power above about 25% to 30%, where, without storage, each additional turbine adds an ever-decreasing contribution to the power actually consumed and an ever-increasing contribution to the available power wasted because it is surplus to requirements at that instant. Vanadium flow batteries, such as those in Australia as described in this week's New Scientist, and in Ireland, are a useful contribution even though they are small and expensive. Making the output of a wind system exactly predictable over even a few hours makes the electricity much more valuable in our electricity market system and makes the system much easier to control. The grid controllers have to cope every day with huge swings in demand, bringing in and standing off generation plant to match supply and demand, and can factor wind generation into this schedule when they can be sure it is available. It is the difference between the output predicted from weather forecasts and what is actually generated that causes the problem. However, to push wind energy up to 40% or above economically will require massively more storage. As you know, the scheme I prefer is close to you in Bristol: the Severn barrage with multiple ponds and the ability to use the turbines as pumps to form a huge pumped storage scheme.

I missed the point there about diversity over the country. Yes, of course this helps, but less than many people hope. It is not the rare possibility that all of the UK will be becalmed at the same time that is the problem; it is the much larger possibility that most of the UK will have a wind speed of less than about 5m/s. Because the output power of a wind turbine changes so rapidly with wind speed, fairly small variations in wind conditions across the UK can cause massive changes in electrical generation.
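That rapid change comes from the cube law: below a turbine's rated wind speed, the power in the wind scales with the cube of the speed. A rough sketch of an idealised power curve (the cut-in, rated, and cut-out speeds are illustrative assumptions, not figures from the comment above):

```python
def turbine_power_fraction(v, cut_in=4.0, rated=13.0, cut_out=25.0):
    """Fraction of rated output at wind speed v (m/s), idealised cube-law curve."""
    if v < cut_in or v >= cut_out:
        return 0.0            # too slow to turn, or shut down for safety
    if v >= rated:
        return 1.0            # pitch control caps output at rated power
    # Between cut-in and rated, output rises with the cube of wind speed
    return (v ** 3 - cut_in ** 3) / (rated ** 3 - cut_in ** 3)

# A countrywide drop from 8 m/s to 5 m/s:
for v in (8.0, 5.0):
    print(f"{v} m/s -> {turbine_power_fraction(v):.0%} of rated output")
```

With these assumed parameters, easing from 8m/s to 5m/s cuts output from roughly a fifth of rated power to a few percent, which is why "most of the UK below about 5m/s" is so damaging.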
This graph is taken from a paper called "Why Danish Wind Power Works." It shows the total output from a vast number of turbines all across west Denmark.

Danish Wind Power: http://i117.photobucket.com/albums/o60/Nick_Rouse/Danishwindpower.jpg

There were 9 days when output was over 100% of demand and 54 days when it was less than 1%.
Overall, wind power equalled 27% of Danish demand. Denmark solves the problem by exporting most of its wind power to Norway and Sweden. Norway has enormous hydro-power and can ramp its output up and down to even out the Danish wind power variations. There is a paper, which I have not got a reference to at the moment, showing that the UK wind pattern averages out a bit better than Denmark's, but there will still be massive variation, and we have not got a much larger neighbour with installed hydro power several times larger than our total demand. Diversity smoothing will improve further with substantial amounts of wave and tide power, but we will still need lots of storage.

Thanks for the comprehensive explanation. Is the UK investing in DC? On storage, do DC lines make any difference in terms of making more locations possible for pumped water storage (like up Scottish mountains)? Also, another thing Monbiot is talking about is demand management in a slightly sci-fi-ish way: adding communications technology to fridges and other appliances that don't need to be on all the time. The grid could send messages to the fridges when supply was falling off and they could all switch off to match the supply. And one more thing: what do you think of vehicle-to-grid storage (v2g) like those mentioned here? Sorry for distracting you at work!

Also, another thing Monbiot is talking about is demand management in a slightly sci-fi-ish way: adding communications technology to fridges and other appliances that don't need to be on all the time. The grid could send messages to the fridges when supply was falling off and they could all switch off to match the supply.

This is, as you say, only slightly sci-fi. It's called Dynamic Demand and I wrote some notes on it here:
Intermittency of Renewable Energy

Also, the Climate Change and Sustainable Energy Bill, which was passed into law last June, has a specific requirement for the government to report on the potential for dynamic demand technologies, and the July 06 energy review described dynamic demand as a promising strategy for “managing variability and capacity contribution of renewable generation”. I think it has real potential. The communication channel is already there: it's the 50Hz mains frequency Nick discusses, which on an AC network varies with load.

I also think vehicle-to-grid storage has some mileage. Here's a very simple example: a car may be used for 1 hour per day and may take 3 hours to fully charge, which leaves 20 hours each day to do what it likes. The "do what it likes" bit means it can choose to charge when there is a surplus on the grid and even choose to deliver energy back to the grid when there is a shortage. Put a couple of million of them around the UK and you have a nice soft buffer to help mitigate the variability of renewable sources. A nice side effect is that the buffering capacity will have been paid for by the motorist!
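A back-of-envelope version of that buffer, taking the couple of million cars from above (the per-car battery capacity, export power, and plugged-in fraction are my illustrative assumptions):

```python
def v2g_fleet_buffer(n_vehicles, plugged_in_fraction=0.8,
                     usable_kwh_per_car=10.0, export_kw_per_car=3.0):
    """Rough aggregate storage and export power of a vehicle-to-grid fleet."""
    plugged = n_vehicles * plugged_in_fraction
    energy_gwh = plugged * usable_kwh_per_car / 1e6   # kWh -> GWh
    power_gw = plugged * export_kw_per_car / 1e6      # kW  -> GW
    return energy_gwh, power_gw

energy, power = v2g_fleet_buffer(2_000_000)
print(f"2 million cars: ~{energy:.0f} GWh storage, ~{power:.1f} GW export")
```

With these assumed figures the fleet offers on the order of 16 GWh and nearly 5 GW, which is comparable to a large pumped storage station, though of course only the fraction of cars actually plugged in at any moment can contribute.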