On Sep 21, 2007, at 6:30 PM, raghu wrote:

If I understand correctly, much of the late 90's productivity gains
came from the computer industry itself. That is, businesses buying
all those computers were not getting much for them, but since the
computer industry is dominated so heavily by fixed costs, any
additional sale of, say, a Pentium chip or Oracle software would be
almost pure profit, thereby generating massive apparent productivity
gains.
http://www.nber.org/~confer/2001/prods01/stiroh.pdf

No that's not Stiroh's argument. Gordon made that argument in the
late 1990s, but subsequently dropped it.

Here's Stiroh's abstract:

This paper examines the link between information technology (IT) and
the U.S. productivity revival in the late 1990s. Industry-level data
show a broad productivity resurgence that reflects both the
production and the use of IT. The most IT-intensive industries
experienced significantly larger productivity gains than other
industries, and a wide variety of econometric tests show a strong
correlation between IT capital accumulation and labor productivity.
To quantify the aggregate impact of IT-use and IT-production, a
novel decomposition of aggregate labor productivity is presented.
Results show that virtually all of the aggregate productivity
acceleration can be traced to the industries that either produce IT
or use IT most intensively, with essentially no contribution from
the remaining industries that are less involved in the IT revolution.

So it's computer production plus heavy computer users (e.g., Wal-Mart).
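The kind of decomposition Stiroh describes can be sketched, roughly, as weighting each industry's labor productivity growth by its nominal output share and summing within groups. The numbers and the three-way grouping below are purely illustrative (Stiroh's actual decomposition also includes reallocation terms, omitted here):

```python
# Hypothetical industry data: (group, nominal output share, labor
# productivity growth in percentage points per year). These figures
# are made up for illustration, not Stiroh's estimates.
industries = [
    ("IT-producing", 0.05, 10.0),
    ("IT-using",     0.45,  2.5),
    ("other",        0.50,  0.3),
]

# Each group's contribution to aggregate labor productivity growth
# is the share-weighted sum of its industries' productivity growth.
contrib = {}
for group, share, lp_growth in industries:
    contrib[group] = contrib.get(group, 0.0) + share * lp_growth

aggregate = sum(contrib.values())
for group, c in contrib.items():
    print(f"{group}: {c:.2f} pts ({100 * c / aggregate:.0f}% of {aggregate:.2f})")
```

With these invented shares, the IT-producing and IT-using groups account for over 90% of the aggregate figure, which is the shape of Stiroh's result even though the magnitudes here are fictitious.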

What is surprising is that productivity did not actually drop but
merely decelerated after the crash. Maybe it has something to do
with mass layoffs, or maybe with some real gains from technology,
like targeted advertising on Google. But I suspect part of the
explanation lies in the accounting of outsourcing activities.

There may well be a problem with accounting for outsourcing.
But trend productivity growth is now below 2%, which is pretty sucky.

Doug
