When Donald Trump says almost four in ten black American youths live in poverty, he’s technically correct.
According to the official poverty measure, 36 percent of African-Americans under the age of 18 fell below the poverty line in 2014. The problem with that statistic is that the official poverty line is a flawed measurement: it doesn't take into account benefits like food stamps and tax credits. The more recent supplemental poverty measure does, and it shows that the earned income and child tax credits lower the poverty rate by 3.1 percentage points, while food stamps (formally, Supplemental Nutrition Assistance Program benefits) cut it by 1.5 percentage points.
“What we’re doing to fight poverty actually does help reduce poverty,” said Gregory Acs, director of the Income and Benefits Policy Center at the Urban Institute, which recently published an analysis of American poverty. “We’re just not measuring it right.”
Though no empirical measure can truly illustrate the day-to-day reality of being poor, the lower count of poor children under the supplemental poverty measure is of course good news. There is, though, a darker side to this more accurate accounting: once out-of-pocket health care costs are included, more elderly Americans are classified as living in poverty.
According to 2014 census data, 10 percent of those 65 and older fell below the official poverty measure. That number rose to 14.4 percent under the supplemental measure. A big reason is that older people are more likely to use costly services such as overnight hospital stays or care in a skilled nursing facility.
The official poverty measure is based on a 1963 formula. It defines the poverty line as three times the cost of a minimal food budget for a family of four, drawn from the 1955 Household Food Consumption Survey and adjusted for inflation each year. The formula is still used because it is tied to several public funding streams, such as cost assistance for insurance and tax credits, and because it is uniform nationwide, Acs explained. This remains the case even though the supplemental measure's poverty rate runs consistently 0.5 to 1 percentage point above the official measure's.
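The mechanics of that 1963 formula are simple enough to sketch: multiply a minimal annual food budget by three, then carry the result forward year by year with inflation. The food cost and inflation figures below are made-up placeholders for illustration, not actual survey or Consumer Price Index values.

```python
def base_threshold(annual_food_cost):
    """Original 1963 rule: poverty line = 3x a minimal annual food budget."""
    return 3 * annual_food_cost

def inflate(threshold, yearly_cpi_ratios):
    """Carry the threshold forward by multiplying by each year's
    inflation ratio (this year's price index / last year's)."""
    for ratio in yearly_cpi_ratios:
        threshold *= ratio
    return threshold

# Hypothetical numbers purely for illustration:
line_1963 = base_threshold(1000)   # a $1,000 food budget -> $3,000 line
line_1966 = inflate(line_1963, [1.02, 1.03, 1.025])  # three years of inflation
```

Note what the sketch leaves out, which is exactly the critique in the article: nothing in this calculation reflects food stamps, tax credits, or out-of-pocket medical costs, which is why the supplemental measure produces different numbers.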
Looking at all age groups, the current official poverty line for a family of four is about $24,000. According to census data, 14.9 percent of people fell below that line, compared with 15.3 percent under the supplemental measure. The supplemental measure, which sets separate thresholds based on housing status, puts the line at $25,844 for homeowners with a mortgage, $21,380 for owners without a mortgage, and $25,460 for renters.
Acs and two of his colleagues used census data and 2013 findings from Columbia University researchers to chart the supplemental gauge against the official measure over several years. One difference between the two is that, during the 2000s and in the immediate fallout of the 2008 financial collapse, the supplemental poverty measure showed a slower increase in the number of poor Americans, in part because food stamp usage rose considerably.
Going further back, with the thresholds adjusted for inflation, the supplemental poverty rate would have been about 25 percent in 1967, when the official rate was about 15 percent. That difference could have been significant (the U.S. had 198.7 million people at the time) in measuring the war on poverty, which had just begun.
Regardless of the measure used, Acs' analysis shows just how far the country has yet to go. Using an interactive graphic that charts historical rates for different age groups, Acs concludes that if the poverty rate that prevailed between 2000 and 2010 were in place today, 11 million fewer people would be considered poor.