On Mon, 21 Feb 2005, Doug Henwood cited the Social Security website, which
in its simplest form defined real wages as:

(2) For the years 1978 through 1990, all remuneration reported as
wages on Form W-2 to the Internal Revenue Service for all employees for
income tax purposes, divided by the number of wage earners.

That seems reasonable. But is that basis any different from the one used to calculate the hourly real wage? Because if it's the same, then there shouldn't be any occasion for

upper incomes pulling up the means

since that should happen the same way for both, no? If the basis is the same, then any divergence between the growth (or fall) in annual wages and the growth (or fall) in hourly wages would be due entirely to a change in annual work hours.
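To make that step concrete, here's a toy sketch with made-up numbers (not actual SSA or BLS data): if the same wage records underlie both series, the average annual wage can rise while the average hourly wage stays flat, purely because annual hours per worker rose.

```python
def averages(workers):
    """workers: list of (hourly_wage, annual_hours) tuples."""
    total_pay = sum(w * h for w, h in workers)
    total_hours = sum(h for _, h in workers)
    avg_annual = total_pay / len(workers)   # total W-2 pay / number of wage earners
    avg_hourly = total_pay / total_hours    # total pay / total hours worked
    return avg_annual, avg_hourly

# Hypothetical year 1: ten workers, each at $15/hour for 1,800 hours.
year1 = [(15.0, 1800)] * 10
# Hypothetical year 2: same $15/hour, but annual hours rise to 2,000.
year2 = [(15.0, 2000)] * 10

a1, h1 = averages(year1)
a2, h2 = averages(year2)
print(a1, h1)  # 27000.0 15.0
print(a2, h2)  # 30000.0 15.0 -- annual average up, hourly average unchanged
```

So on a shared basis, any gap between the two series' trends has to come through hours, which is what motivates the question below.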

Is there a stat that confirms that's what happened?  That hours per worker
per week have gone up appreciably in the last 30 years?  I would actually
have thought the opposite, because I would have thought there was a greater
number of part-time workers now than then, for various reasons.

Or is the basis somehow different?  When they calculate the average
hourly real wage, do they leave out some high-end earners for some reason?
Or perhaps they are simply using an entirely different data set -- surveys
of employers (or surveys of households) rather than W-2s?

Michael
