It is a bit long, but it outlines very well how the U.S. succeeded
technologically over the last two decades, and how it is now rapidly
going downhill economically and losing its manufacturing base in this one.

Somebody, somewhere had better do something fast. But that is not
likely, at least not until it is probably much too late...

David Bier

http://www.washingtonmonthly.com/features/2005/0503.wallace-wells1.html

Off Track
America's economy is losing its competitive edge and Washington hasn't
noticed.

By Benjamin Wallace-Wells

There is a moment in the lifespan of every cool new gadget—two years
after Bill Gates buys one, a year and a half after the popular press
gets wind of it—that its price drops enough to show up in significant
numbers on the shelves at Best Buy, the electronics superstore. At this
instant, the product becomes accessible for middle class Americans,
something they can imagine themselves buying, and so these electronics
stores have become temples to innovation, the place most Americans go
to get as close to the cutting edge as they dare. On weekend
afternoons, Best Buy is as bustling as a souk, full of grandmothers
and little kids tooling around with digital video cameras and
geeked-out salesmen explaining to the moms that the cell phones in
their hands have nearly the computing power of desktop PCs. But it's
the men who are the most transported, moving from department to
department with gawky reverence. At a Best Buy I visited recently in
Alexandria, Va., I watched one dad gaze in wonder at row upon row of
giant plasma televisions—elegant silver-framed screens that seemed not
just to capture the way the world looks, but to improve upon it. He
watched bees extract honey from flowers, and spiraling footballs drop
into the hands of receivers, and you could almost see a two-part
thought process play out over his face: First, If I wait a year, these
sets will be half the price. Second, Screw it, I'm buying one now!

But there was something else I noticed: Whereas a decade ago the most
creative, groundbreaking stuff came from Silicon Valley, now it all
seemed to come from overseas. The plasma televisions were from Korea;
the computer-like cell phones were from Finland; the feature-packed
digital cameras were from Japan.

During the last six months, we have begun, quietly, to enter a newly
tense moment, with university presidents, business leaders, and
columnists delivering ominous-sounding reports and editorials about
the threat to American innovation posed by a freshly competitive
world—the renewed vitality of western Europe, Japan, and Korea, and the
ravenous growth of China and India. "We no longer have a lock on
technology," David Baltimore, a Nobel laureate and the current
president of the California Institute of Technology, wrote recently in
the Los Angeles Times. "Europe is increasingly competitive, and Asia
has the potential to blow us out of the water."

What worriers like Baltimore are beginning to grasp is that these
changes are emerging just as the American economy is being made more
vulnerable by the movement of manufacturing and service jobs overseas.
As a result, we've become increasingly dependent on maintaining our
edge in discovering the new technologies and applications that create
whole new industries—just as other countries are closing that gap.

This is a fundamentally new threat. In the '70s and '80s, Japanese and
European firms adopted American technology and made key improvements
in process and design to shave costs and increase quality. Now, foreign
companies are making many of the most important breakthroughs
themselves. This shift is part of a change in strategy: instead of
copying our innovations, foreign governments have decided to copy our
very model of innovating. They have studied our centers of invention,
the Silicon Valleys and Research Triangles, where university
scientists, venture capitalists, high-tech entrepreneurs, and
educated, creative workers, many of them from overseas, congregate.
These creative centers, our competitors have learned, were the result
of federal policy—decades of investment in basic scientific research;
patent law changes that allowed universities to capitalize on
discoveries made in their labs; financial reforms that gave rise to
the venture capital industry; and immigration laws that opened the
door to talented foreigners.

Over the last decade, our competitors have implemented similar
policies at home: They have built universities, reformed financial
markets, invited in immigrants, and made the development and adoption
of new technologies national goals. Now, they're reaping the benefits.
The technologies behind plasma screens emerged from, and have been
refined and expanded in, labs run under a research partnership between the
Korean government and the electronics maker Samsung. Europe established its
lead in mobile phones when European countries set a single standard
for mobile communications (American firms are hobbled by lower-quality
spectrum and three competing standards).

Foreign competitors are edging out the United States not just in
today's snazzy consumer goods, but in the technologies that will
define the marketplace in the years to come. Most economists and new
economy thinkers believe that the likeliest candidate for the Next Big
Thing is the research being done in nanotechnology, a catch-all term
for the manipulation of matter at the molecular level. Nanotechnology
could someday be used to repair broken DNA to prevent cancer, create
supercomputers the size of pinheads, or fabricate building materials
with 150 times the strength of steel. American scientists have been
tinkering with nanotechnologies for 20 years. But some of the most
cutting-edge research today is coming from overseas. Last August,
Israeli scientists announced that they'd managed to develop
manipulable nano-wires, tiny organic tools they could use to rearrange
atoms and conduct electricity over microscopic spaces, a breakthrough
a leading MIT nanotechnologist admitted American researchers had been
chasing "for many years." In September, Japanese scientists announced
that they would soon be able to use nano-engineering to build a
computer chip 30 times more powerful than Intel's best. The
breakthrough led American analysts to conclude that the United States
was beginning to lose the race to bring nanotechnology products to
market.

The worry of economists and business leaders is not simply that Japan,
Israel, or South Korea will beat us, the way one football team beats
another. It is, more precisely, that we'll only be able to take
advantage of rising wages in those countries (and afford our own here)
if we continue to create new, cutting-edge products and services to
sell to those countries—and right now America does not seem to be
doing as much of that as we were just a few years ago.

This new competition from other developed countries, and the failure
of America to fully keep pace, is one cause of our anemic job
creation, three years after what was, by historical standards, a brief
and fairly light recession. Another reason, of course, is the rise of
China and India, where U.S. firms have not only moved manufacturing
plants but also "outsourced" service sector jobs. America's employment
base is being squeezed by these two pincers—China and India from
below, and the developed world innovating from above. Over time, those
pincers may come together, as China and India also become proficient
in high-end innovation. China is already opening universities at a
breathtaking clip, while Intel, Hewlett-Packard, Microsoft, and
Verizon have all opened research labs there—the kind that anchored the
development of Silicon Valley. "It's become inevitable," says Ross
Armbrecht, president of the Industrial Research Institute, a think
tank for the research arms of America's corporations, "that
more and more of the most far-reaching innovations will be going
overseas, to India and China, in the near future."

Economics is a negotiation in uncertainties, and so nobody's really
sure what all of these changes will mean for the well-being of the
American middle class. But when you survey economists, policymakers,
and business leaders about America's long-term future, it's hard to
find many rank optimists; there are the Panicked, and then there are
the Merely Tense. Richard Lester, the head of MIT's Center for
Innovation, told me he belongs in the latter camp: "Things look
somewhat bleak in the long-term, but if you look around Boston, at the
incredible concentration of talent and opportunity here, we've still
got a head start, and if we're smart we can probably build on it."
Among the Panicked are economists such as MIT Nobelist Paul Samuelson,
who has recently argued that the rapid spread of innovative capacity
to other countries with lower labor costs makes him doubt the whole
doctrine of "comparative advantage," on which much of modern economics
rests.
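
(For readers unfamiliar with the term, the doctrine is usually taught
with a two-country arithmetic sketch like the one below; the goods and
numbers are a generic textbook illustration, not figures from
Samuelson's paper.)

\[
\begin{array}{lcc}
\text{Labor hours per unit} & \text{Software} & \text{Televisions} \\
\text{Country A} & 1 & 2 \\
\text{Country B} & 4 & 3
\end{array}
\]

Country A is more productive at both goods, but its edge is bigger in
software (4 hours to 1) than in televisions (3 hours to 2), so both
countries end up with more of everything if A specializes in software
and trades for televisions. The doctrine holds that such gains survive
even when one country is better at everything; Samuelson's worry, as
summarized here, is what happens to those gains when lower-wage
countries acquire the capacity to innovate in precisely the goods where
America holds its edge.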

If there's a way to escape this grim future, economists agree, it is
for America to revive its slumping innovation machine. Perhaps
the hottest area of economic research right now centers on
technology: figuring out what exactly the United States did in
the '90s and how we can do it again. In university economics
departments and corporate executive suites across the country, the
sense that we're in a pivotal fight for continued economic preeminence
is already common knowledge.

But in Washington, these new economic realities have barely been noticed.

The heroism of the 30-year mortgage

On an overcast day in mid-December, President Bush assembled a group
of CEOs at the Reagan Building—a behemoth of a federal office complex
that has become the favorite venue for small-government
conservatives—for a conference to promote his economic agenda. The
tone of the conference, so soon after a winning election, was upbeat,
cheery, back-slapping, the happy Chamber of Commerce banter of
executives who have recognized a problem that they know how to fix. At
the end of the day, the president himself took the stage. He said the
economy was fundamentally strong and that government's role would be
to "create an environment that encourages capital flows and job
creation through wise fiscal policy." To do this, he said, he would
ask Congress to privatize Social Security and make his tax cuts
permanent. He compared himself favorably to Franklin Roosevelt. He
left the stage.

During the same conference, two floors up in the very same building, a
group called the Council on Competitiveness held another event for the
press, in which it laid out a very different vision. This group,
composed of 400 blue-chip business executives (the CEOs of IBM,
Pepsi, and General Motors, among others) and university presidents—as
rough an approximation of the American establishment as you could fit
in a single room—was nearly as downbeat as the president was buoyant.
The astonishingly fast rise of international competitors, they warned,
has meant that the American economy has reached an "inflection point,"
a "unique and delicate historic juncture" at which America, "for the
first time in our history…is confronting the prospect of a reverse
brain drain."

The report made a point of noting that the United States remains the
world's dominant economy, the leader in fields ranging from
biotechnology to computers to entertainment, but the CEOs nevertheless
cited worrying evidence that this dominance might not last. For
decades, the United States ranked first in the world in the percentage
of its GDP devoted to scientific research; now, we've dropped behind
Japan, Korea, Israel, Sweden, and Finland. The number of scientific
papers published by Americans peaked in 1992 and has fallen 10
percent; a decade ago, the United States led the world in scientific
publications, but now it trails Europe. For two centuries, a higher
proportion of Americans went to university than did citizens of
any other country; now several nations in Asia and Europe have caught
up. "Those competitor countries…are not only wide awake," said Shirley
Ann Jackson, the president of the American Association for the
Advancement of Science, "but they are running a marathon…and we tend
to run sprints."

While the president's talk focused almost exclusively on the need to
free up capital for investment, these CEOs barely mentioned that as a
problem. Instead, they stressed various below-the-radar government
actions that they felt were undermining America's competitive edge:
security arrangements that have crimped the supply of educated
immigrants; recent cuts in science funding (the president's 2005
budget sliced money for research in 21 of 24 areas); and the
reassigning of what research funding remains to applied research, most
of it in homeland security and the military, and away from the basic
scientific research that economists say is the essential engine of
future economic growth. They also expressed concern about those
policies Washington was not pursuing but should be: broadening access
to patents; increasing research into alternative fuels; and bringing
information technology into the health care market.

When the newspapers reported the event the next day, the president's
speech got front-page treatment. The CEOs' presentation received only
a short item on page E3 of The Washington Post, and no mention at all
in The New York Times. This gap in media coverage reflected not only
the power of a newly elected president to dominate the news, but also
what might be called a macroeconomic bias. When the press and most
Americans think of economic policy, they think of macroeconomic
matters—tax rates, budget deficits, trade balances—whose fluctuations
have instant, tangible effects on interest rates, stock prices, and
exchange rates—things newspaper readers and casual investors can see,
track, and relate to.

But there is another set of ways in which Washington has always
affected the long-term health of the economy: by making investments,
regulatory changes, and infrastructure improvement to spur the economy
forward, creating new industries and giving new tools to old ones.
This category of policies has not traditionally been given a single
name but might best be called "microeconomic policy." Historically,
this has been the heroic side of economic policy: The Louisiana
Purchase may have been a shrewd maneuver for continental expansion,
but it was also a jobs program for landless citizens eager to carve
their own farms in the wilderness—which is how Jefferson sold the
treaty to Congress. The land grant college system, signed into law by
Abraham Lincoln, provided the nation's farmers with expert guidance on
the latest agricultural techniques to improve their crop yields. No
entrepreneur could figure out how to mass produce cars profitably,
writes Harold Evans in his excellent new book They Made America, until
Henry Ford fought an aggressive battle against restrictive patents. The
pharmaceutical, financial, and airline industries blossomed thanks to
the creation of the FDA, SEC, and FAA, which gave customers some
assurance of safety when they popped pills, traded stocks, or boarded
flights. The G.I. Bill provided a generation of veterans with the
college educations they needed to build the post-war middle class. The
creation of the federally guaranteed 30-year mortgage proved the
decisive tool in the growth of the post-war American suburb.

These investments and regulatory changes aren't merely tools of the
past; it is impossible to imagine the '90s boom emerging without them.
Early investment from the Pentagon helped nurture the Internet. The
algorithm that powered Google was developed when co-founder Larry
Page, then a Stanford graduate student, won a federal grant to write a
more efficient sorting and search engine for libraries. The innovative
new medicines that have driven the expansion of the biotech and
pharmaceutical industries arose from university research largely
financed by the National Institutes of Health. The commercialization
of these and other discoveries was financed by a venture capital
industry that developed only after legislation, sponsored by
Republican lawmakers and signed by President Jimmy Carter, enabled an
advisory firm to hold significant stock in a start-up.

For most of the country's history, both political parties have favored
various microeconomic initiatives—though Democrats have been more
comfortable with using government to intervene in the marketplace,
while Republicans have tended towards a laissez-faire approach that
stressed lowering the cost of capital. These tensions sparked big
debates in the 1980s about "industrial policy," with (mostly)
Democrats arguing for various kinds of sector-specific technology
investments and relief from Japanese competition and (mostly)
Republicans arguing that the federal government should cut taxes,
trust the market, and not "pick winners and losers." Still, each party
has traditionally played on both the macro and microeconomic policy
fields. Kennedy cut marginal tax rates when they were excessively high
in the early 1960s. Clinton cut the deficit to reduce interest rates.
Eisenhower built the interstate highway system. Reagan gave crucial
tariff protection to America's then-ailing semiconductor industry.

Under President Bush, however, the GOP's natural economic policy
tendencies have been hyper-charged by a grand political vision. Karl
Rove, Grover Norquist, and other Republican strategists have argued
that massive annual tax cuts and the privatization of Social Security
will not only increase the flow of capital into the marketplace, but
will also put Democrats at a long-term electoral disadvantage and
usher in a new era of GOP dominance. That these policies also require
the government to take on trillions of dollars in extra debt, just as
the first baby boomers are reaching retirement and trade imbalances
are reaching historic levels, is seen by GOP leaders as a risk worth
taking. And so the White House and Congress have pursued tax cutting
and Social Security privatization with relentless focus, to the
exclusion of almost everything else. As The New York Times columnist
Daniel Altman has written, the president has chosen economic advisers
such as N. Gregory Mankiw, Lawrence Lindsey, and R. Glenn Hubbard who
support this singular view. "What you have in Washington now is an
inability to get beyond the macroeconomic, to understand that there
are so many other investments government needs to be making and
actions it ought to be taking, and that our future is going to hinge
in large part on what decisions we make there," Michael Mandel, the
influential economist and columnist for BusinessWeek, told me in
January. "And right now in Washington, they're not even looking at any
of that."

Even when the Bush administration's leading economists discuss
innovation, it is mostly in this light—they argue that reducing the
cost of capital will lead companies to invest in new technologies.
They rely in part on the research of economists such as Dan Sichel of
the Federal Reserve and Dale Jorgenson of Harvard, who examined the
sources of the '90s boom and found that capital availability played an
important role. But not even Jorgenson thinks this was the whole
story: "You need something to invest in, and so all those other things
you're talking about were crucially important too, in the long run,"
he told me in January. "If you're looking at Washington today, you
have to ask, what are they doing to make those investments now?"

Bush v. Newt

The same White House that has been bold, and recklessly so, on
macroeconomic policy has been timid, and recklessly so, on
microeconomic policy. It has made only a few feints at such policies
and investments, and compared to the relentless energy with which the
administration has pursued tax cuts and Social Security reform, its
attention to such microeconomic strategies has been only tepid,
intermittent, Potemkin-like—done to quiet a constituency or send a
political signal.

A good example is broadband. Most experts predict that when a critical
mass of homes and businesses acquire high-speed Internet connections,
an explosion of economic growth will follow as whole new industries,
such as video-conferencing and online video gaming, become possible.
But these new industries are likely to flourish in whichever countries
achieve near-universal broadband first, and at the current pace, that
won't be the United States. For four years, the FCC has pursued a
"deregulatory" telecommunications policy that has effectively blocked
competition, giving phone companies little incentive to build out
their broadband networks. Over the same period, the United States has
dropped from 4th to 10th in the world in percentage of its homes and
businesses with broadband. Not surprisingly, South Korea, which is
first on the list, is now the world's leader in developing online
video games, the fastest-growing segment of an industry that's bigger
than movies, and its software companies are beginning to lure top
American programmers to Seoul.

Early last year, Sen. John Kerry (D-Mass.) began to use a line in his
stump speeches that challenged the president on America's declining
broadband position. The president responded by proposing the goal of
achieving "90 percent broadband access" by 2007. The goal was
bold-sounding but empty: By most measures, 90 percent of Americans
already have "access" to broadband in the sense that they could, if
they wished, sign up for it; the problem is that, compared to other
countries, relatively few Americans have done so.

A similar inattention has held in wireless—a technology that venture
capitalists believe would explode if the government would make a
simple regulatory change. Since the president came into office,
bankers, venture capitalists and economists have been urging the FCC
to reassign unused, high-quality spectrum that is now reserved for
television broadcasters and the military. "Nobody was using this,"
says Wharton's Kevin Werbach; reassigning it was "a no-brainer." The
FCC, under Chairman Michael Powell, did nothing for two years and then
delegated the matter to a Task Force to investigate how best to
reassign spectrum. The task force reported two years ago, but the
commission has still not begun to reassign spectrum. Meanwhile, the
United States has fallen even farther behind European and Asian firms
in wireless technologies.

But there is perhaps no economic sector that is undergoing a more
profound evolution, or in which government investments could make a
bigger difference, than energy. As India and China continue their
rapid industrialization, and with it their need for oil, analysts
predict that the price of oil, already sky-high, will grow even more
prohibitive—which means that whichever companies develop the most
effective alternative fuels and energy-efficiency technology will
revolutionize the industry, and whichever countries can produce those
breakthroughs may become rich on it, the Bahrains of the 21st century.

Right now, however, the United States is not poised to be one of those
countries. Demand in America for electric-gas hybrid cars already
outstrips supply, but Ford is so behind the curve that it's licensing
its hybrid technology from Toyota. Europe, meanwhile, is setting the
pace on the next promising auto technology: clean diesel-electric
hybrids. Companies in Europe and Asia have also made more progress
than have their American counterparts in developing the technology for
crafting energy-efficient appliances, offices, and factories—a
consequence of higher energy taxes and stricter environmental
regulations in those countries.

The Bush administration's most vigorous response to all this has been
to increase the funding for research into hydrogen-powered cars.
Hydrogen technology is promising. But it is also decades away from the
market, and even hydrogen buffs believe the administration has gone
about its program the wrong way, trying to build fuel cells before
figuring out the more daunting challenges of how to extract and
transport hydrogen. Moreover, there's a creeping suspicion that
hydrogen may end up being far too expensive to compete with other,
more feasible, and probably cheaper fuels like biomass ethanol, a
technology in which America happens to be a leader. Betting on a
single alternative fuel source, hydrogen, at the expense of others is
a classic case of "picking winners and losers." The truth is, no one
knows yet which technologies or energy sources will define the future.

A better strategy, says Harvard's John Holdren, would be for the
federal government to raise automobile fuel efficiency (CAFE)
standards, impose a carbon cap-and-trade system for factories and
power-plants, and let the market decide which new energy sources and
technologies are the best. These ideas now have broader backing than
they did a decade ago. The bipartisan National Commission on Energy
Policy issued a report in December calling such measures the most
critical to ensuring America's energy future—and that commission's
members include the CEOs of old-line energy giants such as Exelon and
ConocoPhillips. And, Holdren told me, executives at old economy
companies from Monsanto to Dow Chemical have signed on. "Five years
ago, we didn't have a shot at getting them on board," said Holdren,
"but the situation is getting dire enough that now they're leading the
charge." Still, many sectors, including the automobile and power
industries, vehemently oppose higher CAFE standards and carbon
emission limits, and the president has repeatedly rejected them.

There is no better example of the administration's Potemkin-style
microeconomic policy than the way it has handled the issue of rising
medical costs. Here, the administration has talked a good game. During
last year's presidential campaign, the president vowed to bring health
care out of the "buggy and horse days" by getting the industry to
adopt information technologies, such as electronic medical
record-keeping and systematic case management, which experts
say could save hundreds of billions of dollars and tens of thousands
of lives. To this end, he promised a new $50 million health care IT
initiative. It was an absurdly small amount, and probably no match for
the perverse incentives that keep for-profit medicine from investing
in these technologies (see "Best Care Anywhere," January/February
2005). But at least it was something.

That is, until the president signed his 2005 budget into law, which
zeroed out the $50 million program. David Brailer, the economist and
physician the White House had put in charge of the program, wound up
with no money to do anything to install information technology in
hospitals—no pilot programs, no cash for education, no seminars for
hospital executives. Newt Gingrich, the right's high priest of health
IT, told The New York Times that the president's defunding of his own
program was a "disgrace." (After Gingrich's hue and cry, the White
House put the money back in the proposed 2006 budget it submitted to
Congress, though some insiders remain skeptical that the program will
survive).

Faster, faster

Technology today is diffusing faster than ever. As the Council on
Competitiveness has noted, it took 55 years for the automobile to
spread to a quarter of the country, 35 years for the telephone, 22
years for the radio, 16 years for the personal computer, 13 years for
the cell phone, and only seven years for the Internet. Because
technologies are adopted so quickly, it has become more important than
ever for a country's industries to be at the cutting edge—there's
simply much less catch-up time. (Fall five years behind on building
car factories in the early 20th century and you lost some profits;
fall five years behind on hybrid cars today and you may lose an
industry.)

For this reason, the last four years of drift may have already done
significant damage to America's long-term economic prospects. The pity
is, there was no good reason for the drift. Finding ways to strengthen
border security while still providing enough visas for educated
immigrants and graduate students is hardly the world's most difficult
public policy challenge, and every Fortune 500 corporation in America
would cheer such moves. There are no serious ideological reasons why
both parties couldn't support reform of patent laws (though certain
powerful interest groups would object). It's hard to find a good
excuse for why we're falling behind on broadband, or have failed so
far to reassign valuable wireless spectrum. (Indeed, a country that
until recently had large budget surpluses should by now have found the
money to begin wiring the country with fiber optics, providing
higher-quality connections that can carry large data files far faster
than today's broadband.) And even the most politically difficult
actions, such
as raising CAFE standards and imposing a flexible carbon emissions cap
to spur energy innovation, should have been possible after 9/11, with
the nation willing to make sacrifices and dire warnings coming from
all political wings about our dependence on Middle Eastern oil.

But what worries economists even more than the past four years of
drift is the prospect of continued inaction. The speed of
technological change is now too fast, and the economic competition too
fierce, for America to afford that. There is no law that says the
United States will be the world's preeminent economic power forever.
But neither is there any reason we can't rise to the challenge, as we
did in the 1980s and 1990s. Then, as now, becoming more innovative is
the solution to our problem. But first, we must recognize that we have
a problem.

Benjamin Wallace-Wells is an editor of The Washington Monthly. 




