It is well known that real wages in the United States have stagnated in recent decades, but how badly? Are real wages actually lower now than in the past, or have they increased, but just not very rapidly? As this chart shows, it depends on how you adjust for inflation.
Both lines in the chart show the real hourly wages of production and nonsupervisory employees, stated in 2016 dollars. The red line is adjusted using the consumer price index (CPI) from the Bureau of Labor Statistics. The government uses the CPI to adjust Social Security benefits and the value of the Treasury's inflation-adjusted securities (TIPS). The blue line is adjusted using the personal consumption expenditures (PCE) price index from the Bureau of Economic Analysis. The Federal Reserve uses the PCE index as the principal indicator of inflation when setting monetary policy.
The difference is dramatic. According to the CPI, real wages have increased just 8 percent in half a century. According to the PCE index, they have increased 40 percent. Even that is not very impressive over such a long period, but 40 percent is a lot better than 8 percent.
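The arithmetic behind the gap is ordinary deflation: divide a nominal wage by a price index and rescale to base-year dollars. Here is a minimal sketch of that calculation; the wage and index levels below are made up purely to illustrate the mechanics, not actual BLS or BEA data.

```python
# Deflating a nominal wage with two different price indexes.
# All numbers are hypothetical -- not actual CPI or PCE values.

def to_real(nominal, index_then, index_base):
    """Convert a nominal dollar amount to base-year dollars."""
    return nominal * index_base / index_then

# Hypothetical 1965 nominal hourly wage and 1965/2016 index levels.
wage_1965 = 2.60
cpi_1965, cpi_2016 = 31.5, 240.0
pce_1965, pce_2016 = 18.7, 110.0

real_cpi = to_real(wage_1965, cpi_1965, cpi_2016)  # ~19.81 in 2016 dollars
real_pce = to_real(wage_1965, pce_1965, pce_2016)  # ~15.29 in 2016 dollars

# In this example the CPI rose faster than the PCE index, so the 1965
# wage is worth more in CPI-adjusted 2016 dollars -- which makes wage
# growth since 1965 look correspondingly smaller under the CPI.
print(real_cpi, real_pce)
```

The faster an index rises, the more of any nominal wage gain it attributes to inflation rather than to real growth, which is why the same wage series can show 8 percent growth under one deflator and 40 percent under another.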
If you measure from 1972 instead of 1965, real wages as measured by the CPI have actually fallen by 4 percent. Even by the PCE index, they have increased by just 19 percent.
Which is right? Frustratingly, we can't really say that either measure is right or wrong. The two indexes simply make different choices when it comes to the thorny technical issues that bedevil the measurement of inflation—how to adjust for changes in the basket of goods that consumers purchase, how to adjust for quality, and how to adjust for the substitution of cheaper goods for more expensive ones when relative prices change.
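The substitution issue, in particular, can be made concrete. Broadly speaking, the CPI is closer to a fixed-basket (Laspeyres-type) formula, while the PCE index uses a chained Fisher formula that lets the basket shift as consumers substitute toward cheaper goods. A toy two-good example, with invented prices and quantities:

```python
# Toy illustration of substitution bias; prices and quantities invented.
# p0, q0: base-period prices and quantities; p1, q1: current period.
# When the price of good A doubles, consumers buy less A and more B.
p0, q0 = [1.0, 1.0], [10.0, 10.0]
p1, q1 = [2.0, 1.0], [4.0, 16.0]

def dot(p, q):
    return sum(pi * qi for pi, qi in zip(p, q))

laspeyres = dot(p1, q0) / dot(p0, q0)     # fixed base-period basket -> 1.50
paasche   = dot(p1, q1) / dot(p0, q1)     # current-period basket    -> 1.20
fisher    = (laspeyres * paasche) ** 0.5  # geometric mean, ~1.342

# The fixed-basket index ignores the shift toward the cheaper good and
# reports 50% inflation; allowing the basket to adjust brings measured
# inflation down -- and measured real wage growth correspondingly up.
print(laspeyres, paasche, fisher)
```

Both agencies' actual formulas are far more elaborate, but this is the core reason a chained index like the PCE tends to report lower inflation than a fixed-basket one.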
For more on the problems of measuring inflation, see these earlier posts: