Nuclear fusion: finally viable, but is it desirable?
Nuclear fusion: many have heard of it, but few have a precise understanding of how it works. Fewer still seem to know much about the fusion experiments currently being conducted and how close — or far — they are from making fusion a reality. With the elusive technology tantalizingly close to being realized, now is the time for those outside the scientific community not only to learn more about fusion but also to start asking hard questions about its viability.
The idea for nuclear fusion power has existed since at least the 1930s. Since then, it has become something of a “holy grail” for scientists and engineers — a safe, potentially limitless source of energy that produces almost no pollution. The advantages of nuclear fusion over nuclear fission (the process behind our current nuclear technology) are many: the fuel is (potentially) more readily available and cleaner than uranium; fusion produces no long-lasting nuclear waste; and fusion is much safer than fission, which does not have the best safety record.
With so many advantages to fusion, why then haven’t we just made it standard and moved our world into a new era of guilt-free phone charging?
Well, there is one big problem — the technology to produce fusion energy still doesn’t exist. To build it will require billions of dollars, which is billions more than the billions that have already been poured into fusion research. As we shall see later in this article, fusion is perhaps the best example of what happens when the dreams of science meet the brutal realities of economics.
Fusion is the reaction that gives all the stars, our sun included, their extraordinary ability to generate energy. As such, it is incredibly difficult to recreate. A fusion reactor is, essentially, an artificial star, which might give you some sense of the magnitude of the challenge.
Fusion works like this: When atoms become incredibly hot, the nuclei are “stripped” of their electrons, which means that they are, essentially, floating free. This superheated soup of free nuclei and electrons is called a plasma. Able to move independently, the nuclei start whizzing around very quickly, bumping into one another like kids high on sugar on a bouncy castle. Because nuclei are all positively charged, they would normally repel each other. But at high enough speeds the nuclei will actually collide — like kids butting heads — and at really high speeds they will collide and merge into a single, bigger nucleus. This merging into a heavier nucleus releases massive amounts of energy, and it is this process that we call fusion.
Getting nuclei traveling at such speeds requires tremendous heat and pressure, like that at the center of a star. But recreating those conditions on earth is wildly difficult. This is why, ever since the 1950s, scientists have been talking about fusion technology being “20 years away”, only to be forced to revise that projection every 20 years. So, is it finally working? After so many tantalizing promises, are we now on the brink of creating real fusion? Yes and no. But we are certainly closer than we’ve ever been.
Over the years, there have been many models for fusion reactors. The most promising of all these models is called a “tokamak”. Initially developed in the Soviet Union, this reactor is characterized by a donut-shaped vacuum chamber. The hot plasma is contained within the ring of the donut, pressurized by a magnetic field produced by magnets of extraordinary power. This magnetic field essentially squishes the plasma together, giving it the density required for fusion to take place. The temperature inside the tokamak reaches approximately 150 million degrees Celsius (270 million degrees Fahrenheit), roughly 10 times hotter than the core of the sun.
Once the nuclei fuse, the energy produced is absorbed as heat by the reactor’s walls. This heat is then used to produce steam, which turns turbines and generators — just like in any regular power plant.
Fusion sounds amazing (and it is!) but there’s one glaring problem: the amount of energy required to create the conditions inside the machine is currently much greater than the machine ever produces. In fact, no fusion reactor, even the most advanced, has yet reached breakeven. Which is to say, none has managed to produce more energy than it consumes.
In fact, so far, fusion has been pretty underwhelming. In 1997, the record for the “fusion energy gain factor” was set by the JET tokamak near Oxford, England. More than 20 years later, that record still stands. Even then, JET put out only about two-thirds of the energy needed to break even, let alone generate useful power.
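To put rough numbers on that idea: the gain factor, usually written Q, is simply the fusion power that comes out divided by the heating power that goes in. A minimal sketch, using the figures commonly cited for JET’s 1997 record shot (roughly 16 MW of fusion power for about 24 MW of heating, values not stated in this article):

```latex
Q = \frac{P_{\mathrm{fusion}}}{P_{\mathrm{heating}}}
  \approx \frac{16\ \mathrm{MW}}{24\ \mathrm{MW}} \approx 0.67
```

Breakeven is Q = 1, and a commercial power plant would need a Q well above that.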
The truth is, very few fusion reactors have even been used to produce fusion at all. Rather, they are mostly instruments used to study the behavior of superheated plasma. Even building a fusion reactor that produced a positive energy output would not necessarily be enough to make such an expensive project commercially viable. That would require an energy output large enough to recoup the massive costs of building the equipment and every other expense involved in such a complex project.
Another problem with fusion is its fuel. The fuel of a fusion reactor consists of two heavy isotopes of hydrogen called deuterium and tritium. Deuterium occurs naturally in seawater, which means it’s easy to find. Tritium, however, is a different story. This element has a half-life of only about 12.3 years, which is why there are only about 20 kilos of naturally occurring tritium on the whole planet!
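For readers who want to see the reaction itself, the deuterium-tritium fusion that most reactor designs aim for looks like this (the energy figures are standard textbook values rather than something taken from this article):

```latex
{}^{2}_{1}\mathrm{H} + {}^{3}_{1}\mathrm{H} \;\longrightarrow\;
{}^{4}_{2}\mathrm{He}\ (3.5\ \mathrm{MeV}) + {}^{1}_{0}n\ (14.1\ \mathrm{MeV})
```

Each reaction releases about 17.6 MeV in total, most of it carried off by the neutron, which is exactly the energy that ends up heating the reactor walls described above.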
To obtain enough tritium, the tokamak itself must produce it, using what is known as a “breeding blanket”. Essentially, this is a lithium-containing layer lining the walls of the fusion chamber; neutrons released by the fusion reactions strike the lithium and transmute it into tritium. It’s estimated that commercially viable fusion will require about 300 grams of tritium per day in order to produce 800 megawatts (MW) of power. That’s a lot, and it places a great deal of faith in a technology that has yet to be proven.
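Those numbers can be sanity-checked with a little arithmetic. The sketch below is only a back-of-the-envelope estimate; it assumes roughly 17.6 MeV released per deuterium-tritium reaction and a conventional 40% steam-cycle efficiency, neither of which comes from the article itself:

```python
# Back-of-the-envelope check: does ~300 g of tritium per day plausibly
# correspond to ~800 MW of electrical output?
# Assumptions (not from the article): ~17.6 MeV per D-T reaction and
# ~40% thermal-to-electric conversion in a conventional steam cycle.

AVOGADRO = 6.022e23             # atoms per mole
MEV_TO_JOULES = 1.602e-13       # joules per MeV
ENERGY_PER_REACTION_MEV = 17.6  # energy released by one D-T fusion
THERMAL_TO_ELECTRIC = 0.40      # assumed steam-cycle efficiency

tritium_grams_per_day = 300.0
tritium_molar_mass = 3.016      # grams per mole of tritium

# One tritium nucleus is consumed per reaction.
reactions_per_day = tritium_grams_per_day / tritium_molar_mass * AVOGADRO
thermal_energy_per_day = reactions_per_day * ENERGY_PER_REACTION_MEV * MEV_TO_JOULES

seconds_per_day = 86_400
thermal_power_mw = thermal_energy_per_day / seconds_per_day / 1e6
electric_power_mw = thermal_power_mw * THERMAL_TO_ELECTRIC

print(f"Thermal power:  ~{thermal_power_mw:.0f} MW")   # roughly 2,000 MW
print(f"Electric power: ~{electric_power_mw:.0f} MW")  # roughly 780 MW
```

Run as written, this lands at roughly 780 MW of electricity, reassuringly close to the 800 MW figure above, though it says nothing about the much harder engineering problem of actually breeding tritium that fast in the first place.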
These facts raise a very obvious question: is fusion on earth even possible, or are we wasting our time and money on a sci-fi fantasy? In the next decade we should, at least, finally have a firm answer.
Fusion’s biggest bet is currently being built in the south of France, the location of the ITER project. A true “moon shot” experiment of mind-blowing proportions, ITER is an international collaboration funded mostly by the EU, with major investment from six other members: South Korea, the US, Russia, China, India, and Japan. The aim of ITER is to build a fusion reactor that can produce 500 MW of power. By contrast, JET produced only 16 MW.
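In terms of the gain factor mentioned earlier, ITER’s widely stated design goal (a figure taken from ITER’s own publicity rather than from this article) is about 500 MW of fusion power from roughly 50 MW of heating power injected into the plasma:

```latex
Q_{\mathrm{ITER}} \approx \frac{500\ \mathrm{MW}}{50\ \mathrm{MW}} = 10
```

Against JET’s record of roughly 0.67, that would be an enormous leap.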
As exciting as this technology might be, perhaps ITER’s most inspiring aspect is its commitment to sharing all intellectual property, meaning that no one country can dominate the fusion market. If fusion becomes a reality, it could herald a truly global revolution in energy production.
The scale of the ITER project is awe-inspiring: its tokamak is about eight times the size of the largest ever built, and it must contain forces equivalent to twice the thrust of a space shuttle at take-off; each of its 18 giant magnets weighs as much as a fully loaded jumbo jet; and a seaport had to be specially modified to unload the gigantic components, along with a dedicated road to transport them.
The fact that so many global powers are willing to spend so much on such a complex project points to confidence in the future of fusion technology. This is probably ITER’s biggest selling point — on this scale, failure is not an option. But are we really so sure that the problems of fusion can be overcome? And what about alternate models?
ITER is so big that, like the sun itself, it has sucked almost all global fusion investment into its orbit. The result is that many fusion programs around the world have lost their funding to ITER. In turn, this means that other potential fusion technologies, such as stellarators or laser inertial fusion, have been largely sidelined. ITER’s sheer scale means that it is, essentially, dictating the direction of the science.
MIT experienced ITER’s impact firsthand. Its tokamak, the Alcator C-Mod, was effectively shut down after the federal government cut off its funding and redirected the money to ITER. This forced the university to pivot and form a partnership with a private company called Commonwealth Fusion Systems.
Backed by a Bill Gates-funded investment company, Commonwealth’s research focuses on creating fusion reactors that are only 2% the size of ITER’s tokamak. The key to these small tokamaks is their new magnet technology. The magnets are created using thin layers of superconducting rare-earth barium copper oxide (ReBCO) deposited on metal tape. When chilled to -235 degrees Celsius (-391 Fahrenheit), the tape becomes superconducting and allows the magnets to create pressures inside the tokamak twice those found at the deepest point of the ocean. Commonwealth’s hope is to create smaller, more nimble fusion reactors that, although producing less power, could be built far more cheaply than the ITER monster.
There are a couple of other fusion start-ups, but being in ITER’s shadow means they struggle to be taken seriously. Whether Commonwealth or anyone else will be able to overcome the barriers to commercially viable fusion remains to be seen. But there are other, even bigger questions about fusion — questions that strike at the very motives of the entire project.
Like flying to Mars, fusion would be incredible — not just a new technology, but a project with the potential to change our entire culture. Yet fusion has its critics. Indeed, there are a number of scientists and environmentalists who think that the obsession with fusion is a waste of resources and a distraction from potentially more effective technologies.
Is our obsession with technology itself and the “solutions” it offers the very core of the problem?
Although it is quite probable that fusion will one day be our primary source of energy, that date may be a couple of generations into the future. ITER won’t even be switched on for experiments until 2025, and that’s assuming it runs on schedule (it isn’t). In the meantime, we have an environmental emergency that needs to be addressed — immediately. Is waiting for fusion technology really the best way to address climate change? Why don’t we simply deal with what the Swedes call “the cow on the ice”?
Spending billions of dollars on untested technology does seem wasteful when we could put those same resources into things that we know work. There are many technologies available right now to combat climate change, technologies that could be vastly improved with the kind of funding dedicated to a project like ITER. For the cost of producing a viable fusion program, for instance, we might be able to build all the solar plants and wind farms we need, or design safer nuclear fission reactors — like molten salt reactors — that could be distributed cheaply throughout the world.
Even if we were to build a working fusion reactor, it would probably be a long time before the costs came down enough for developing countries to be able to afford them. The materials, supply chain, and construction would put the technology out of reach for many. Meanwhile, to keep up with their rapidly growing economies, those countries will probably continue their current strategy: investing in cheap fossil fuels.
Seen from this perspective, fusion can seem like an indulgence, a classic case of “techno-optimism” from developed societies with a tendency to miss the forest for the trees. After all, if we in developed countries really want to help the environment, we could do much more by simply consuming less.
However, there remain several good arguments for the creation of nuclear fusion, the strongest of which may be the least practical.
Nuclear fusion is a long-term strategy, and investing long-term is always a good idea, especially on a planet that will likely be our only viable home for the next thousand years. Ultimately, however, fusion is a technology with the potential to unite human beings — if everyone has access to unlimited energy, then battles over resources might suddenly look ridiculous.
Of course, it would be foolish to think that human beings would suddenly become a global happy family simply because we all have endless free energy — our ability to invent reasons to fight one another seems to be almost as creative as our ability to dream new technology. Yet surely the basic goal of striving towards utopia is worthwhile?
We should never discount the intangible values of discovery. We don’t really need to go to Mars — we could just send advanced robots — and yet the prospect of seeing ourselves physically land on another planet excites us in ways that go to the very essence of our humanity. The moon landing was a moment that united the world and propelled our imaginations, leading to achievements that we take for granted but which are jaw-dropping when we consider the tiny timeframe in which they occurred.
Fusion is not practical right now. But its creation would be a chapter in the history of discovery on par with the creation of the first artworks, the splitting of the atom, and the decoding of DNA. To not attempt such a feat would be not merely to deny ourselves a potentially useful technology, but to deny our own humanity.