Fusion of cultures
"With the establishment and explosive growth of the Internet
and other forms of global communication and rapid transportation, currently divergent
human cultures will tend to fuse"
Robert Zubrin - Entering Space - page 7
Cost plus
"Beyond These considerations stands the government contracting
system, known as "cost plus," which has been in place for some time
now in the United States. According to the people who invented the system, it
is essential that corporations be prevented from earning excessive profits on
government contracts. Therefore, rather than negotiate a fixed price for a piece
of hardware and allow the company to make a large profit or loss on the job
depending on what its internal costs might be, regulators have demanded that
the company document its internal costs in detail and then be allowed to charge
a small fixed percentage fee (generally in the 10 percent range) above those
costs as profit. This system has served to multiply the costs of government
contracting tremendously, so much so that it has produced public scandals when
news leaks out about the military paying $700 for a hammer or a toilet seat
cover.
To see how this works, consider the case of the Lockheed Martin corporation,
the largest aerospace contractor in the world. I was employed as a senior, and
later staff, engineer at the prime facility of this company for seven years.
Lockheed Martin almost never accepts hardware contracts on a fixed cost basis.
That is, the company rarely says to the U.S. government, "We will produce
the ABC vehicle for you at a price of $X. If it costs us less than $X to make
it, we will make a profit. If it costs us more, we will take a loss." Instead,
most important contracts are negotiated along the following lines: "We
will produce the ABC vehicle for you at a cost of about $X. We will then add a 10
percent fee to whatever it actually costs us to produce it, to provide the company
with a modest profit." In other words, the more the ABC vehicle costs to
produce, the more money the company makes. Hence, in addition to the vast numbers
of accounting personnel that the cost-plus contracting system necessarily
entails, the company is saturated with "planners," "marketeers,"
and "matrix managers," among swarms of other overhead personnel. Of
the 9,000 people employed at the Lockheed Martin main plant in Denver (where
the Atlas and Titan launch vehicles are made), only about 1,000 actually work
in the factory. The fact that Lockheed Martin is keenly competitive with other
aerospace giants indicates that their overhead structures are similar.
In the context of this regime, government willingness to give such corporations
cost-plus contracts for product improvement can actually serve as a disincentive
for company investment in innovation. A number of years ago, I was part of a
team that proposed a new upper stage for the Titan rocket, which would have
increased the vehicle's performance by 50 percent. Creating the new upper stage
would have required a company investment of about $150 million. (A single Titan
launch sells for between $200 million and $400 million.) The corporation's management
declined, saying, in effect, "If the Air Force wants us to improve the
Titan, they will pay us to do it." As a result, the Titan was not improved
and the company's commercial Titan line was shut down when all of its private-market
business was taken by the slightly less obsolescent French Ariane. The company
didn't mind much, however, as all of its cost-plus U.S. government launch contracts
are protected by law from foreign competition, and it faces no U.S. competitors
in the Titan's payload class."
Robert Zubrin - Entering Space - page 24
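A toy illustration of the incentive difference Zubrin describes (my own sketch in Python, not from the book; the dollar figures are invented, and only the roughly 10 percent fee comes from the text):

# Hypothetical comparison of contractor profit under the two contract types.
# All numbers are invented for illustration; only the 10 percent cost-plus fee
# is taken from the passage above.

def fixed_price_profit(agreed_price, actual_cost):
    """Fixed price: the contractor keeps the difference, so cutting cost raises profit."""
    return agreed_price - actual_cost

def cost_plus_profit(actual_cost, fee_rate=0.10):
    """Cost plus: the contractor is paid cost plus a ~10% fee, so raising cost raises profit."""
    return fee_rate * actual_cost

agreed_price = 400e6  # hypothetical fixed price for the "ABC vehicle"
for actual_cost in (300e6, 400e6, 500e6):
    print(f"cost ${actual_cost / 1e6:.0f}M: "
          f"fixed-price profit ${fixed_price_profit(agreed_price, actual_cost) / 1e6:+.0f}M, "
          f"cost-plus profit ${cost_plus_profit(actual_cost) / 1e6:+.0f}M")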
Reusable vs. Expendable launch systems
"Another factor inflating launch costs is the fact that
all existing launch vehicles are at least partly, and generally wholly, expendable.
This comes as a result of the unique heritage of space-launch systems: Of all
the methods of transportation known to human history, only launch vehicles are
descended from ammunition. When John Glenn traveled to orbit in 1962, he rode
atop an Atlas rocket. The Atlas was an ICBM directly derived from the German
army's V-2, which itself was simply a replacement for the Paris Gun and other
long-range artillery forbidden Germany by the victors of World War I.
One does not recover ammunition after it is fired. Thus, a $300-million Titan-a weapon system designed to deliver warheads to Soviet cities but later used as a transportation system to deliver Gemini astronauts to orbit, Viking to Mars, and Voyager to Neptune-can be used only once. Consider how expensive air travel would be if Boeing 747s were scrapped after one flight, and a 747 costs only about $100 million! If the excessive overhead were squeezed out of the space industry and its immediate suppliers, the Titan could probably be built for $30 million (It's a much simpler machine than a 747), but even at that rate, the practice of expending each booster after a single use would still make space travel orders of magnitude more expensive than any other form of transportation.
So why not make launch systems reusable? Well, there are significant
engineering reasons that favor expendability. If a vehicle is to be expended,
it won't need landing gear, a deceleration system, or a reentry thermal protection
system. Eliminating all these items reduces vehicle weight and therefore increases
payload. Expending a rocket also makes it easier to adopt staging strategies,
in which one rocket is launched from atop another, a practice that also increases
payload or maximum range. In addition, expendability makes the launch system
simpler overall, and drops the technological requirements on subsystems. For
example, rocket engines only have to be designed to start once and endure a
single burn. Nothing needs to be designed for servicing after use, and no special
tools, procedures, or personnel to engage in such servicing need to be developed.
If the launch rate is limited, many more launch vehicles will be needed to service
the payload manifest (the ensemble of payloads available for launch) if the
boosters are expendable than if they are reusable. Therefore, the economics
of mass production will always favor expendables, and a single expendable booster
that is part of a production line will have a much lower manufacturing cost
than a single re-usable vehicle with equivalent lift capability.
A reusable vehicle will require a ground support team to service it, and these
people will have to be paid all year long regardless of how many times the craft
is used. If the launch rate is too low, this payroll could conceivably exceed
the costs of performing the same number of launches with expendables. In the
case of the rather complex Space Shuttle (currently the only operational reusable
system), the ground support team amounts to a virtual standing army, with an
annual program cost of about $5 billion. Thus, at the current launch
rate of about eight per year, a shuttle launch costs close to $600 million.
This is twice that of the pricey Titan IV-Centaur, which offers equivalent
lift capability. But since the shuttle is mostly reusable, if the shuttle launch
rate could be doubled to sixteen per year it could match the Titan costs, and
if it were tripled to twenty-four per year, it could significantly beat them."
Robert Zubrin - Entering Space - page 26-27
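The per-launch arithmetic behind the Shuttle figures above can be made explicit (my own sketch in Python; it assumes, as a simplification, that the ~$5 billion/year program cost is entirely fixed, and uses a rough $300 million Titan IV-Centaur price implied by "twice that"):

# Per-launch cost of a mostly fixed-cost reusable system at various flight rates.
ANNUAL_PROGRAM_COST = 5e9        # dollars per year, from the passage above
TITAN_BENCHMARK = 300e6          # rough expendable benchmark (assumption)

for launches_per_year in (8, 16, 24):
    per_launch = ANNUAL_PROGRAM_COST / launches_per_year
    print(f"{launches_per_year:2d} flights/yr -> ${per_launch / 1e6:4.0f}M per launch "
          f"({per_launch / TITAN_BENCHMARK:.1f}x the Titan benchmark)")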
Political impact of the US space program
"The attempt to cope with these economic realities underlies
much of the pathology associated with the Shuttle program for the past twenty
years. For example, in selling the Shuttle program to Congress during the 1970's,
NASA officials claimed that the Shuttle would fly forty times per year (one
launch every nine days!). This prediction should have aroused skepticism on
two grounds: (a) the technical difficulty in preparing a Shuttle for launch in
so short a time and (b) the lack of a payload manifest large enough to support such
a launch rate. NASA leaders left (a) up to the engineers to solve as best they
could, but attempted to solve (b) themselves through political action. Specifically,
the NASA brass in the late 1970's and the early 1980's obtained agreements from
the White House to the effect that once the Shuttle became fully operational,
all U.S. government payloads would be launched on the Shuttle. That is, NASA
wanted the expendable Deltas, Atlases, and Titans phased out of existence
so that the Shuttle could enjoy a bigger manifest and have its economics improve
accordingly. The Air Force resisted this policy, as they feared that a Shuttle
accident could cause a stand-down of the entire program, which would then make
it impossible to launch vital military reconnaissance and communication satellites
when required. It seems incredible today, but the NASA argument actually carried
the day against the Air Force in Washington's corridors of power. During the
1980's, the expendable "mixed fleet" was in the process of being phased out.
It was only after the Challenger disaster in January 1986 proved the Air Force
concerns were fully justified that President Reagan reversed the decision."
Robert Zubrin - Entering Space - page 27-28
Politics and the International Space Station (ideologically driven scientific
goals)
"The need to increase the launch manifest to justify Shuttle
economics played a central role in the decision to initiate the Space Station
program. In the early 1980s, NASA Deputy Administrator Hans Mark saw clearly
that achieving a shuttle launch rate of twenty-five per year would be impossible
without the manifest created by the construction and supply needs of a permanently
orbiting outpost, which he already supported as a facility for in-space scientific
research (Mark did not believe the forty launches per year touted by earlier
shuttle advocates were feasible under any conditions). Based on this (probably
accurate) assessment, Mark convinced first NASA Administrator James Beggs and
then the Reagan White House of the need for a space station program. The need
to generate a large shuttle manifest also helps to explain the bizarre nature
of the engineering designs that have guided the space station program since
its inception.
The right way to build a Space Station is to build a heavy-lift
launch vehicle and use it to launch the station in a single piece. The United
States launched the Skylab space station in this manner in 1973. Skylab, which
contained more living space than the currently planned International Space Station
(ISS), was built in one piece and launched in a single day. As a result, the
entire Skylab program, end to end from 1968 to 1974, including development,
build, launch, and operation, was conducted at a cost in today's money of about
$4 billion, roughly one-eighth of the anticipated cost of the ISS. In contrast,
the Space Station has gone through numerous designs (of which the current ISS
is the latest), all of which called for over thirty Shuttle launches, each delivering
an element that would be added into an extended ticky-tacky structure on orbit.
Since no one really knows how to do this, such an approach has caused the program
development cost and schedule to explode. In 1993, the recently appointed NASA
Administrator Dan Goldin attempted to deal with this situation by ordering a
total reassessment of the Space Station's design. Three teams, labeled A, B,
and C, were assigned to develop complete designs for three distinct Space Station
concepts. Teams A and B took two somewhat different approaches to the by-then-standard
thirty-Shuttle-launch/on-orbit-assembly concept, whereas team C developed
a Skylab-type design that would be launched in a single throw of a heavy-lift
vehicle (a "Shuttle C" consisting of the Shuttle launch stack but without the
reusable orbiter). The three approaches were then submitted to a blue ribbon
panel organized by the Massachusetts Institute of Technology for competitive
judgment. The M.I.T. panel ruled decisively in favor of option C (a fact that
demonstrated only their common sense, not their brilliance, as C was much cheaper,
simpler, safer, more reliable, and more capable and would have given the nation
a heavy-lift launcher as a bonus). However, based on the need to create Shuttle
launches as well as a desire to have a Space Station design that would allow
modular additions by international partners, Vice President Al Gore and House
Space Subcommittee chairman George Brown overruled the M.I.T. panel. By political
fiat, these gentlemen forced NASA to accept option A, and the space agency
has had to struggle with the task of building the Space Station on that basis
ever since. The result has been a further set of cost and schedule overruns,
the blame for which has been consistently placed on various NASA middle managers
instead of those really responsible."
Robert Zubrin - Entering Space - page 28-29
Fusion of cultures
"The combination of low-cost space access with orbital servicing
operations will also allow the development of global communication systems whose
capabilities will impact society in ways that exceed the imagination of most
people today. For example, such augmented communication constellations could
enable low-cost wristwatch-sized communication devices that would be able to
access on a real-time, interactive basis all the storehouses of human knowledge
from anywhere in the world. In addition, these devices would enable their users
to transmit very high volumes of data-including voice, video, and music-either
to each other or to the system's central libraries. The practical value of such
a system is obvious, but its implications go far beyond the practical into
the social and historical. We will see human society thoroughly linked together,
resulting in deep cultural fusions and a radical generalization of the dissemination
of human knowledge. In a real sense, the establishment of the full range of
global communication services that orbital industry will enable represents the
final step in establishing humanity as a Type I civilization."
Robert Zubrin - Entering Space - page 75-76
Plants
"Plants are enormous consumers of light energy, typically
using about 3,000 kW/acre, 750 MW/km^2. If this does not seem too much consider
this: The amount of sunlight that illuminates the fields of the state of Rhode
Island (not usually thought of as an agricultural giant) is around 2,000,000
MW (or 2 Terawatts), which is comparable to the total electric power currently
generated by all of human civilization."
Robert Zubrin - Entering Space - page 81
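A quick sanity check of the Rhode Island comparison (my own arithmetic in Python; the 3,000 kW/acre figure is from the text, while the ~2,700 km^2 land area of Rhode Island is my own rough figure, not from the book):

# Convert the per-acre insolation figure to a statewide total.
KW_PER_ACRE = 3_000          # from the passage above
ACRES_PER_KM2 = 247.1
RI_AREA_KM2 = 2_700          # approximate land area of Rhode Island (assumption)

mw_per_km2 = KW_PER_ACRE * ACRES_PER_KM2 / 1_000
total_mw = mw_per_km2 * RI_AREA_KM2
print(f"{mw_per_km2:.0f} MW/km^2 -> about {total_mw / 1e6:.1f} TW over Rhode Island")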
Fusion Reactors
"Now, one obvious and frequently noted flaw in this
plan is that fusion reactors do not exist. However, that fact is simply an artifact
of the mistaken priorities of the innocent gentlemen in Washington, D.C., and
similar places who have been controlling scientific research and development's
purse strings for the past few years. Lack of funding, not any insuperable technical
barriers, currently blocks the achievement of controlled fusion. The total budget
for fusion research in the United States currently stands at about $250 million
per year - less than half the cost of a Shuttle launch, or, in real dollars,
about one-third of what it was in 1980. Under these circumstances, the fact
that the fusion program has continued to progress and now is on the brink of
ignition is little short of remarkable."
Robert Zubrin - Entering Space - page 84
D-T Fusion reactions
"Currently, the world's fusion programs are focused on achieving
the easiest fusion reaction, that between deuterium (hydrogen with a nucleus
consisting of one proton and one neutron) and tritium (hydrogen with a nucleus
containing one proton and two neutrons). Deuterium is non-radioactive and occurs
naturally on Earth as 1 atom in 6,000 ordinary hydrogen atoms. It's expensive
(about $10,000/kg), but since an enormous amount of energy (worth about $5 million/kg
at current prices) is released when it burns, this is not really a problem.
Tritium is mildly radioactive with a half-life of 12.33 years, so it does not
occur in nature and has to be manufactured. In a deuterium-tritium (D-T) fusion
reactor, this would be
accomplished by first reacting the fusion fuel as follows:
D + T --> He4 + n (5.2)
Reaction 5.2 yields 17.6 million electron volts (MeV) of energy, about ten million times that of a typical chemical reaction. Of the total yield, 14.1 MeV is carried by the neutron (denoted by "n") and 3.5 MeV by the helium nucleus. The helium nucleus is a charged particle and so is confined in the device's magnetic field, and as it collides with the surrounding deuterium and tritium particles, its energy will heat the plasma. The neutron, however, is uncharged. Unaffected by the magnetic confinement field, it will zip right out of the reaction chamber and crash into the reactor's first wall, damaging the wall's metal structure somewhat in the process, and then plow on until it is eventually trapped in a "blanket" of solid material positioned behind the wall. The blanket will thus capture most of the neutron's energy, and in the process it will be heated to several hundred degrees Celsius. At this temperature it can act as a heat source for high-temperature steam pipes, which can then be routed to a turbine to produce electricity. The blanket itself is loaded with lithium, which has the capacity to absorb the neutron, producing helium and a tritium nucleus or two in the process. The tritium so produced can later be separated out of the blanket materials and used to fuel the reactor. Thus a D-T reactor can breed its own fuel.
However, not all of the neutrons will be absorbed by the lithium. Some will be absorbed by the steel or other structural elements composing the reactor's first wall, blanket cooling pipes, etc. In the process, the reactor's metal structure will become radioactive. Thus, while the D-T fusion reaction itself produces no radioactive wastes, radioactive materials are generated in the reactor's metal structure by neutron absorption. Depending upon the alloys chosen for the reactor structure, a D-T fusion reactor would thus generate about 0.1 to 1 percent as much radioactive waste as a nuclear fission reactor producing the same amount of power. Fusion advocates can point to this as a big improvement over fission, and it is. But the question of whether this will be good enough to satisfy today's and tomorrow's environmentalist lobbies remains open.
Another problem caused by the D-T reactor's neutron release
is the damage caused to the reactor's first wall by the fast flying neutrons.
This damage will accumulate over time, and probably make it necessary to replace
the system's first wall every five to ten years. Since the first wall will be
radioactive, this is likely to be an expensive and time-consuming operation,
one that will impose a significant negative impact on the economics of fusion
power."
Robert Zubrin - Entering Space - page 86-87
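Two quick checks on the D-T numbers quoted above (my own arithmetic in Python; the ~$0.02/kWh wholesale electricity price is my assumption, used only to reproduce the "~$5 million/kg" figure):

# 1) How the 17.6 MeV splits between the neutron and the helium nucleus: in a
#    two-body breakup, momentum conservation gives each product an energy share
#    inversely proportional to its mass.
Q_DT_MEV = 17.6
m_n, m_he4 = 1.0, 4.0                      # approximate mass numbers
E_neutron = Q_DT_MEV * m_he4 / (m_n + m_he4)
E_helium = Q_DT_MEV * m_n / (m_n + m_he4)
print(f"neutron ~{E_neutron:.1f} MeV, helium ~{E_helium:.1f} MeV")   # ~14.1 and ~3.5

# 2) The rough dollar value of the fusion energy in a kilogram of deuterium.
AVOGADRO = 6.022e23
MEV_TO_J = 1.602e-13
atoms_per_kg_D = 1000 / 2.014 * AVOGADRO   # deuterium is about 2 g/mol
joules_per_kg = atoms_per_kg_D * Q_DT_MEV * MEV_TO_J
kwh_per_kg = joules_per_kg / 3.6e6
print(f"~{kwh_per_kg:.2e} kWh/kg -> ~${kwh_per_kg * 0.02 / 1e6:.1f} million/kg at $0.02/kWh")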
D-He3 Fusion reactions
So the key to realizing the promise of cheap fusion free of radioactive
waste is to find an alternative to the D-T reaction, one that does not produce
neutrons. Such an alternative is potentially offered by the reaction of deuterium
with helium-3. This occurs as follows:
D + He3 --> He4 + H1 (5.3)
This reaction produces about 18 MeV of energy and no neutrons. This means that in a D-He3 reactor, virtually no radioactive steel is generated and the first wall will last much longer, since it will be almost free from neutron bombardment. (I say "virtually no" and "almost free" because even in a D-He3 reactor some side D-D reactions will occur between deuteriums, and these will produce a few neutrons.) In addition, no lithium blanket or steam pipes are needed. Instead, the energy produced by the reactor, since it is all in the form of charged particles, could be converted directly to electricity by magnetohydrodynamic means at more than twice the efficiency possible in any steam turbine generator system.
There are two problems, however. In the first place, the D-He3
reaction is harder to ignite than the D-T, requiring a Lawson parameter of about
1 x 10^22 keV-s/m^3. That should not be a fundamental problem; it just means that D-He3
machines need to be a little bigger or more efficient at confinement than D-T
devices. If we can do one, then in a few more years we can do the other. The
bigger problem is that helium-3 does not exist on Earth, but it does on the Moon.
Robert Zubrin - Entering Space - page 86-87
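The same two-body kinematics applied to reaction 5.3 (my own check in Python; the 18.3 MeV value is the commonly quoted figure for this reaction, consistent with the "about 18 MeV" in the text):

# Both products are charged, which is why direct (magnetohydrodynamic)
# conversion becomes possible; the energy splits inversely with mass.
Q_DHE3_MEV = 18.3
m_p, m_he4 = 1.0, 4.0
E_proton = Q_DHE3_MEV * m_he4 / (m_p + m_he4)   # ~14.6 MeV
E_helium = Q_DHE3_MEV * m_p / (m_p + m_he4)     # ~3.7 MeV
print(f"proton ~{E_proton:.1f} MeV, helium ~{E_helium:.1f} MeV, neutrons: none")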
Private companies vs. Government
Government projects are generally inefficient, and it is almost
always the case that private companies can get a job done much more cheaply and
quickly than government agencies attempting equivalent work.
Robert Zubrin - Entering Space - page 114
Cultural fusion and Mars
Humanity needs Mars. An open frontier on Mars will allow for
the preservation of cultural diversity, which otherwise must vanish within the single
global society rapidly being created on Earth.
Robert Zubrin - Entering Space - page 123
The Sabatier Reaction
CO2 + 4H2 --> CH4 + 2H2O (7.4)
Reaction 7.4 is known as the Sabatier reaction and has been
widely performed by the chemical industry on Earth in large-scale one-pass units
since the 1890's. It is exothermic, occurs rapidly, and goes to completion when
catalyzed by ruthenium on alumina pellets at 400°C. I first demonstrated
a compact system appropriate for Mars application uniting this reaction with
a water electrolysis and recycle loop while working at Martin Marietta in Denver
in 1993. The methane produced is great rocket fuel. The water can be either consumed
as such or electrolyzed to make oxygen for propellant or consumable purposes,
and hydrogen, which is recycled.
Robert Zubrin - Entering Space - page 154
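A stoichiometric check of reaction 7.4 (my own arithmetic in Python), showing why the Sabatier reaction is so attractive on Mars: each kilogram of hydrogen brought along is leveraged into several kilograms of propellant and water made mostly from local CO2:

# Mass balance per kilogram of hydrogen feedstock.
M_H2, M_CO2, M_CH4, M_H2O = 2.016, 44.01, 16.04, 18.02   # molar masses, g/mol

mol_reaction = 1000 / (4 * M_H2)   # reaction 7.4 consumes 4 H2 per CO2
kg_CO2 = mol_reaction * M_CO2 / 1000
kg_CH4 = mol_reaction * M_CH4 / 1000
kg_H2O = mol_reaction * 2 * M_H2O / 1000
print(f"1 kg H2 + {kg_CO2:.1f} kg CO2 -> {kg_CH4:.1f} kg CH4 + {kg_H2O:.1f} kg H2O")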
Carbonyls
For example, carbon monoxide can be combined with iron at 110°C
to produce iron carbonyl [Fe(CO)5], which is liquid at room temperature. The
iron carbonyl can then be poured into a mold and heated to about 200°C,
at which point it will decompose. Pure iron, very strong, will be left in the
mold, while the carbon monoxide will be released, allowing it to be used again.
Similar carbonyls can be formed between carbon monoxide and nickel, chromium,
osmium, iridium, ruthenium, rhenium, cobalt, and tungsten. Each of these carbonyls
decomposes under slightly different conditions, allowing a mixture of metal
carbonyls to be separated into its pure components by successive decomposition,
one metal at a time.
Robert Zubrin - Entering Space - page 156
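A sketch of the separation idea in Python (my own illustration; only the ~200°C figure for iron carbonyl comes from the text, and the other decomposition temperatures below are placeholders, not real data):

# Heat a mixed carbonyl liquid in stages; at each stage only the carbonyl that
# decomposes at that temperature deposits its metal, and the CO is recycled.
HYPOTHETICAL_DECOMPOSITION_C = {
    "nickel": 180,    # placeholder value
    "iron": 200,      # roughly per the passage above
    "tungsten": 240,  # placeholder value
}

def separation_schedule(mixture, temperatures=HYPOTHETICAL_DECOMPOSITION_C):
    """Yield (temperature, metal) pairs in the order the metals would deposit."""
    for metal, temp_c in sorted(temperatures.items(), key=lambda kv: kv[1]):
        if metal in mixture:
            yield temp_c, metal

for temp_c, metal in separation_schedule({"iron", "nickel", "tungsten"}):
    print(f"hold at ~{temp_c} C -> pure {metal} deposited, CO recovered for reuse")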
Energy consumption and the human standard of living
To glimpse the probable nature of the human condition a century
hence, it is first necessary for us to look at the trends of the past. The history
of humanity's technological advance can be written as a history of ever-increasing
energy utilization. If we consider the energy consumed not only in daily life
but in transportation and the production of industrial and agricultural goods,
then Americans in the electrified 1990s use approximately three times as much
energy per capita as their predecessors of the steam and gaslight 1890s, who
in turn had nearly triple the per-capita energy consumption of those of the pre-industrial
1790s. Some have decried this trend as a direct threat to the world's resources,
but the fact of the matter is that such rising levels of energy consumption
have historically correlated rather directly with rising living standards and,
if we compare living standards and per-capita energy consumption of the advanced
sector nations with those of the impoverished Third World, continue to do so
today. This relationship between energy consumption and the wealth of nations
will place extreme demands on our current set of available resources. In the
first place, simply to raise the entire present world population to current
American living standards (and in a world of global communications it is doubtful
that any other arrangement will be acceptable in the long run) would require
increasing global energy consumption at least ten times. However, world population
is increasing, and while global industrialization is slowing this trend, it
is likely that terrestrial population levels will at least triple before they
stabilize. Finally, current American living standards and technology utilization
are hardly likely to be the ultimate (after all, even in late twentieth-century
America, there is still plenty of poverty) and will be no more acceptable to
our descendants a century hence than those of a century ago are to us. All in
all, it is clear that the exponential rise in humanity's energy utilization
will continue. In 1998, humanity mustered about 14 terawatts of power (1 terawatt,
TW, equals 1 million megawatts, MW, of power). At the current 2.6 percent rate
of growth we will be using nearly 200 TW by the year 2100. The total anticipated
power utilization and the cumulative energy used (starting in 1998) are given
in Table 8.1.
By way of comparison, the total known or estimated energy resources are given in Table 8.2.
Table 8.1
- Year - Power (TW) - Energy used after 1998 (TW-years) -
2000 - 15 - 29
2025 - 28 - 545
2050 - 53 - 1,520
2075 - 101 - 3,380
2100 - 192 - 7,000
2125 - 365 - 13,700
2150 - 693 - 26,400
2175 - 1,320 - 50,600
2200 - 2,500 - 96,500
Table 8.2
- Resource - Amount (TW-years) -
Known terrestrial fossil fuels - 3,000
Estimated unknown terrestrial fossil fuels - 7,000
Nuclear fission without breeder reactors - 300
Nuclear fission with breeder reactors - 22,000
Fusion using lunar He3 - 10,000
Fusion using Jupiter He3 - 5,600,000,000
Fusion using Saturn He3 - 3,040,000,000
Fusion using Uranus He3 - 3,160,000,000
Fusion using Neptune He3 - 2,100,000,000
Robert Zubrin - Entering Space - pages 158-160
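Table 8.1 can be reproduced from the two numbers in the text (my own arithmetic in Python): about 14 TW in 1998 growing at 2.6 percent per year, with the energy column being the running integral of that exponential, in TW-years:

import math

P0, RATE, BASE_YEAR = 14.0, 0.026, 1998

def power_tw(year):
    """Power in TW, growing exponentially from the 1998 baseline."""
    return P0 * (1 + RATE) ** (year - BASE_YEAR)

def energy_tw_years(year):
    """Cumulative energy used since 1998, in TW-years (integral of power_tw)."""
    t = year - BASE_YEAR
    return P0 * ((1 + RATE) ** t - 1) / math.log(1 + RATE)

for year in (2000, 2025, 2050, 2075, 2100, 2125, 2150, 2175, 2200):
    print(f"{year}: {power_tw(year):7.0f} TW, {energy_tw_years(year):9.0f} TW-years")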
Michio Kaku and anti-nuclear activists protesting probes with RTG's
This fact is disputed by anti-nuclear activists such as Professor
Michio Kaku, a string theorist from City College of New York, who have demonstrated
and filed lawsuits to attempt to block the launch of every recent radioisotope-equipped
probe. According to them, the launching of RTG's represents an intolerable risk
to the Earth's environment, because in the event of a launch failure the plutonium
contained by such devices could break up on reentry and pollute the world. Furthermore,
they maintain, such devices are unnecessary. In a debate with NASA's former
nuclear program director Dr. Gary Bennett prior to the Galileo launch, for example,
Professor Kaku claimed that the mission could be just as well performed powered
by batteries instead.
In fact, the anti-nuclear activists are wrong on both counts. An RTG contains about 100,000 curies (Ci) of plutonium-238. On a personal level this is a nontrivial amount - you certainly wouldn't want it around your house - but on a global level it is utterly insignificant. To put it in perspective, if a launch were to fail and an RTG were to break up and be dispersed into the world's biosphere (instead of staying intact and sinking as a solid brick into the sub-seabed of the Atlantic downrange from Cape Canaveral, which is actually what would happen), it would release a radiological inventory approximately 1/100th of 1 percent as great as that released by a typical nuclear bomb test. It would represent an even smaller fraction of the radiological release emitted by each and every one of the half dozen or so sunken U.S. and Soviet nuclear submarines (such as the Thresher) currently rusting away on the ocean floor. Furthermore, the plutonium-238 used in RTGs is not the right kind to use in atomic bombs and has a half-life of 88 years, so it does not last as a long-lived feature of the Earth's environment. In RTGs, it is present not as a metal, but as plutonium oxide, in which form it is chemically inert. The statement that a reentering RTG could represent a significant threat to the Earth's environment is simply untrue.
Equally wrong is Professor Kaku's assertion that outer solar system probes could be powered by batteries. To see how silly this idea is, consider the Galileo spacecraft, which is powered by two 300-W RTGs and warmed by several hundred 1-W RHUs, for about 800 W in all. For the sake of discussion, let's grant this is overkill and assume that the mission really could get by with just 200 W of power. Good primary batteries can store about 300 W-h/kg. Galileo left Earth in October 1989, and as of August 1998, or 70,000 hours later, was still functioning. At 300 W-h/kg, that would be about 47,000 kg of batteries! (The two RTGs currently on board weigh about 60 kg each; the RHU mass is negligible.) Of course, with this much battery mass, the power requirement would be much greater than 200 W, since the spacecraft would require additional power to keep the batteries from freezing. To keep 47,000 kg of batteries (about 5,000 gallons worth) warm, we would probably need to expend at least 2,000 W. But to supply that power, we would need 470,000 kg of batteries, which would need 20,000 W to keep warm, which would require 4,700,000 kg of batteries, and so on. The mission is clearly impossible on battery power.
In fact, outer solar system exploration needs to move in the
direction of significantly higher power levels if it is to be executed efficiently.
Not only do we need RTG's, we need to move beyond them to actual space nuclear
power reactors that use nuclear fission, rather than mere radioisotope decay,
to generate tens or hundreds of kilowatts. The reason for this is very simple.
On Earth, it has been said, knowledge is power. In the outer solar system, power
is knowledge.
Robert Zubrin - Entering Space - page 176-177
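The battery spiral in the passage above, made explicit (my own sketch in Python, using the figures as stated there: 200 W for 70,000 hours, 300 W-h/kg, and roughly 2,000 W of heater power per 47,000 kg of batteries, i.e., heater power scaling with battery mass):

ENERGY_DENSITY_WH_PER_KG = 300
MISSION_HOURS = 70_000
HEATER_W_PER_KG = 2_000 / 47_000   # rough scaling implied by the text

power_w = 200.0
for step in range(6):
    battery_kg = power_w * MISSION_HOURS / ENERGY_DENSITY_WH_PER_KG
    print(f"iteration {step}: {power_w:12,.0f} W -> {battery_kg:16,.0f} kg of batteries")
    # Heaters must be added to keep the new battery mass from freezing.
    power_w = 200.0 + HEATER_W_PER_KG * battery_kg

# Each pass roughly multiplies the required mass by ten; it never converges,
# so the mission is impossible on battery power.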