Why can’t everyone be above average? – Twin Cities


Edward Lotterman

Simple misunderstandings of terms, or failures in critical thinking, can trip up useful discussion.

I recently heard a discussion about the war in Ukraine full of misunderstanding because one person apparently thought “deaths” and “casualties” are the same. The latter term actually means “dead, wounded, captured or missing.”

It’s the same with economic indicators. It is important, for example, to understand the difference between “person” and “household,” and that the latter is different from “family.” One often reads phrases like “70% of Americans are … ” this or that. But does the speaker mean American persons or households? And they might mean “families,” or even “tax units” for IRS data.

Consider when someone says, “adjusted for inflation, income for middle-class Americans hasn’t risen since the late 1970s.” Someone else responds, “yes it has.” Who is right?

The first person is correct for median income of households, which “includes the related family members and all the unrelated people … who share the housing unit”; that measure shows virtually no change since the late ’70s. But the second person is also right if we look at “people.”

How can this be?

Well, the number of households relative to the population has increased. The number of persons per household has fallen. If you take the middle 10% or 20% of the income distribution, income per person has indeed increased, although not nearly at the rate we saw from 1945 to 1975.
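A few lines of arithmetic make the point concrete. All the dollar figures and household sizes below are invented for illustration; only the logic, flat household income combined with shrinking households, comes from the column:

```python
# Hypothetical numbers: median household income stays flat while
# income per person rises, because households have gotten smaller.
household_income_1979 = 60_000   # inflation-adjusted dollars (assumption)
household_income_now  = 60_000   # unchanged, as the first speaker claims
persons_per_household_1979 = 3.0  # assumption
persons_per_household_now  = 2.5  # smaller households (assumption)

per_person_1979 = household_income_1979 / persons_per_household_1979
per_person_now  = household_income_now / persons_per_household_now

print(per_person_1979)  # 20000.0
print(per_person_now)   # 24000.0 -- a 20% rise, with household income flat
```

Both speakers can point to the same data and be right; they are just dividing by different denominators.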

One person says, “there are 2 million farms in the United States.” Another says, “That can’t be, I heard virtually all the corn in the U.S. is produced on 300,000 farms, the same for soybeans and even fewer for wheat.”

Both are correct. The number of farms has been declining for years, but there still were 2,012,050 U.S. farms in 2021. But it is also true that about 90% of the value of all agricultural sales comes from just 160,000 farms. Corporations taking over? No — 98% of all farms are still family farms and they produce over 90% of sales.

How can this be?

Well, the definition of “farm” is any establishment that sells $1,000 or more in agricultural products. That definition has not changed in over 60 years. So a small-town high school English teacher who has a dozen beef cows in pasture on their acreage is well past the threshold. So is a doddering economist with some Conservation Reserve Program acres and the sale of a few loads of meadow hay.

The 160,000 farms are those selling over $500,000 a year. Cargill? Bill Gates? No. If one of those neat mile-square farms in rural Minnesota bounded by gravel roads, all planted to corn, yielded the state’s 2021 average and was sold today, the check would be $741,000. A dairy milking 250 cows at any one time, small by today’s standards but still manageable by a two-generation family without hired labor, would produce $1.44 million in milk at 2021 production levels and today’s prices.
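The arithmetic behind those two checks is easy to reconstruct. The yield, milk output and price figures below are assumptions chosen to approximate the column’s numbers, not official statistics:

```python
# Rough reconstruction of the column's arithmetic; all inputs are assumptions.
corn_acres = 640            # one square mile, all planted to corn
yield_bu_per_acre = 178     # roughly Minnesota's 2021 average (assumption)
corn_price = 6.50           # dollars per bushel (assumption)
corn_check = corn_acres * yield_bu_per_acre * corn_price
print(f"${corn_check:,.0f}")   # $740,480 -- close to the column's $741,000

cows = 250
milk_lb_per_cow = 24_000    # pounds per cow per year (assumption)
milk_price_per_cwt = 24.00  # dollars per hundredweight (assumption)
milk_check = cows * milk_lb_per_cow / 100 * milk_price_per_cwt
print(f"${milk_check:,.0f}")   # $1,440,000 -- matching the column
```

Either operation clears the $500,000 line with room to spare, which is why “big” farms by sales are still ordinary family operations.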

So much for misunderstandings of definitions.

Then there are problems around simple statistics. The most common is to base decisions on some average without taking into account how much variation there is around that mean.

In discussing homeowners’ insurance, someone asserts: “you are best off taking the biggest deductible possible because, on average, you will save money.” That is true, but it is irrelevant to the core question. It would also be true to say that you should not buy insurance at all because, on average, you will save money. But there is a huge variation in possible losses. People buy insurance precisely to reduce the variation around some average.
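A toy simulation shows both halves of that argument at once. Every number here, the premiums, the loss probability, the loss sizes, is hypothetical; the point is only that the cheaper-on-average choice has the wider spread of outcomes:

```python
import random

# Toy model (all figures hypothetical): a higher deductible saves money
# on average, but widens the range of possible yearly costs -- the very
# variation insurance is bought to reduce.
random.seed(1)

def yearly_cost(deductible, premium, trials=100_000):
    costs = []
    for _ in range(trials):
        # assume a 5% chance of a loss, uniform up to $50,000
        loss = random.uniform(0, 50_000) if random.random() < 0.05 else 0.0
        costs.append(premium + min(loss, deductible))
    mean = sum(costs) / len(costs)
    spread = max(costs) - min(costs)
    return mean, spread

low_mean, low_spread = yearly_cost(deductible=500, premium=1_800)
high_mean, high_spread = yearly_cost(deductible=5_000, premium=1_400)
print(low_mean, low_spread)    # lower spread, higher average cost
print(high_mean, high_spread)  # lower average cost, much wider spread
```

So “you save money on average” is true and beside the point; which deductible is right depends on how much variation you can stomach.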

You save money with a higher deductible, but you get less protection. You save money by buying a 4-cylinder car rather than a 6-cylinder one, but must settle for lower performance. You save money by getting new doors for kitchen cabinets rather than remodeling your kitchen. But the products are not the same. In all cases, it depends on the personal preferences of the buyer; there is no “right” or “wrong” decision to make.

It is also correct that, if all dividends were reinvested, the inflation-adjusted return on stocks in the S&P 500 index is over 9%. That supports the common belief that the average return on stocks is higher than that on bonds or other investments. You hear that from most financial consultants, who may advise a mixed portfolio of stocks and bonds but note that a greater proportion of stocks usually gives greater returns. That gets reduced to someone telling an in-law or neighbor, “you should invest in stocks rather than bonds.”

That an average return holds when computed over some long span of time does not mean it holds for every specific 10- or 20-year or other bracket within that span. Nor does it mean that what was true for the last 95 or 65 or 15 years will be true over the next 20 or 30. Nor does it mean that any individual stock, or even a carefully chosen group of five or 10 or 20 stocks, would match that overall average return. The starting and ending points of the calculation matter a lot, both in what the stock market did and in the rate of inflation.

A counter-example is that if, in August 1929, just before the crash, you had $1,000 in the stocks then making up the Dow Jones average, you would have had to wait until mid-1954 before you ever had that much again. And, if you consider inflation, it would have been 1958. This is without dividends, but those were low over that interval.
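The arithmetic behind such long waits is simple compounding: a crash that wipes out a fraction d of your money takes ln(1/(1−d))/ln(1+r) years of growth at rate r to recover. The Dow’s fall from its 1929 peak to its 1932 low was roughly 89%; the 10% growth rate below is an assumption for illustration:

```python
import math

# Years of steady growth at rate r needed to climb back to a pre-crash peak
# after losing fraction `drawdown` of the portfolio. Illustrative only.
def years_to_recover(drawdown, annual_return):
    return math.log(1 / (1 - drawdown)) / math.log(1 + annual_return)

# An 89% fall (roughly the Dow, 1929 peak to 1932 trough) at 10% a year:
print(round(years_to_recover(0.89, 0.10), 1))  # about 23 years
```

Counted from the 1932 trough, 23 years of steady 10% growth lands in the mid-1950s, which is roughly what actually happened.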

Now that the data are readily available, Hendrik Bessembinder, a professor at Arizona State University, found that if you look at the returns since 1926 on all U.S. stocks ever traded, the return is only slightly above that of T-bills, the very short-term, low-interest Treasury securities. The reason? Just 88 companies, out of the thousands traded, made up over half of the total value created in stock investments over a century. If you bought stocks but did not get into these 88 right at their emergence, you would not get anywhere near as high a return as the figures bandied about as gospel. His work has touched off controversy and a flurry of research, but his findings have generally held up.
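The pattern Bessembinder describes can be mimicked with a skewed distribution. Everything below, the lognormal parameters, the number of stocks, the T-bill figure, is invented for illustration; the sketch only shows how the typical stock can lag T-bills while the average still beats them:

```python
import random
import statistics

# Illustrative skewed (lognormal) distribution of long-run stock outcomes:
# the median stock lags T-bills, the mean beats them, and a thin top slice
# of winners accounts for an outsized share of total wealth created.
random.seed(7)
n = 10_000
# terminal wealth per $1 invested in each of n hypothetical stocks
outcomes = [random.lognormvariate(mu=-0.2, sigma=1.2) for _ in range(n)]

tbill_terminal = 1.3  # hypothetical terminal wealth of $1 in T-bills

median_stock = statistics.median(outcomes)
mean_stock = sum(outcomes) / n
top_1pct_share = sum(sorted(outcomes)[-n // 100:]) / sum(outcomes)

print(median_stock < tbill_terminal)  # the typical stock lags T-bills
print(mean_stock > tbill_terminal)    # yet the average beats them
print(top_1pct_share)                 # top 1% of stocks hold a large share
```

Buy-the-average works only if you actually hold the whole average, winners included.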

One could go on. On average, you may get more out of Social Security if you wait until age 67 to start collecting instead of starting at 62. But some 7% of people alive at their 62nd birthday will die before age 67. One out of nine won’t make it to age 70. So, the thinking may go, why wait until 67 when the money won’t be of use if you’re dead? Yes, but in that case you won’t care anyway. This question of the optimal age to start taking benefits opens a whole series of calculations that merits a column of its own.
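A back-of-envelope version of that calculation, ignoring inflation adjustments, taxes, discounting and survival odds: the roughly 30% reduction for claiming at 62 rather than 67 is the approximate rule for workers whose full retirement age is 67, and the dollar benefit is a made-up figure:

```python
# Simple break-even: at what age does waiting until 67 pull ahead of
# claiming a reduced benefit at 62? All figures are illustrative.
full_monthly = 2_000                  # hypothetical benefit at age 67
early_monthly = 0.70 * full_monthly   # ~30% reduction for claiming at 62

def total_collected(start_age, monthly, age_now):
    # cumulative benefits received by age_now, if claiming at start_age
    return max(0, age_now - start_age) * 12 * monthly

breakeven = next(age for age in range(67, 100)
                 if total_collected(67, full_monthly, age)
                 > total_collected(62, early_monthly, age))
print(breakeven)  # 79 -- waiting pays off only if you live that long
```

Under these simplified assumptions the later claimer pulls ahead around age 79, which is exactly why mortality odds belong in the decision.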

