
Friday, December 15, 2017

One Trader Reflects On A Bad Trade - The Never-Ending Grain Pain (And Whose Fault It Was)

Authored by Kevin Muir via The Macro Tourist blog,


I have had some bad trades in my day. But lately, one call has been especially atrocious.



For the past couple of years, I have taken stabs on the long side of the grain market. At different times, I have held various positions for different lengths of time, but make no mistake - grains have done nothing but cost me money. Sure, I might have a decent-sounding argument, The Last Remaining Cheap Asset, but the market is indisputably telling me that I am dead wrong.


And it’s hard to sit and watch the grains go down. Day after day. Week after week. Month after month. Like the slow drip of a leaky faucet that no one can fix, it can drive you insane.


Have a look at the 5-year chart for front month Wheat.



Tough to make money writing any blue tickets with that sort of action. All rallies have been opportunities to sell, not the start of any sustainable uptrend.


This recent grain bear market has pushed the big three contracts (wheat, corn and soybeans) down to near all-time lows when measured in real terms.





I don't want to bother with another forecast about why this time will be different, and how the low will be made in the coming weeks. After a certain number of posts, I begin to more closely resemble a degenerate gambler than a cool, calculating macro trader (I think that number might be three, which means it's too late for me, and I do in fact resemble Richard Dreyfuss a whole lot more than George Soros).



And although I poke fun at myself, it's no laughing matter. The amount of economic pain in farming is downright scary. According to an article in The Guardian, Why are America's farmers killing themselves in record numbers?, the stress from low grain prices is fueling a suicide epidemic in the agricultural community.


Once upon a time, I was a vegetable farmer in Arizona. And I, too, called Rosmann. I was depressed, unhappily married, a new mom, overwhelmed by the kind of large debt typical for a farm operation.


 


We were growing food, but couldn’t afford to buy it. We worked 80 hours a week, but we couldn’t afford to see a dentist, let alone a therapist. I remember panic when a late freeze threatened our crop, the constant fights about money, the way light swept across the walls on the days I could not force myself to get out of bed.


 


“Farming has always been a stressful occupation because many of the factors that affect agricultural production are largely beyond the control of the producers,” wrote Rosmann in the journal Behavioral Healthcare. “The emotional wellbeing of family farmers and ranchers is intimately intertwined with these changes.”


 


Last year, a study by the Centers for Disease Control and Prevention (CDC) found that people working in agriculture - including farmers, farm laborers, ranchers, fishers, and lumber harvesters - take their lives at a rate higher than any other occupation. The data suggested that the suicide rate for agricultural workers in 17 states was nearly five times higher than in the general population.


 


After the study was released, Newsweek reported that the suicide death rate for farmers was more than double that of military veterans. This, however, could be an underestimate, as the data collected skipped several major agricultural states, including Iowa. Rosmann and other experts add that the farmer suicide rate might be higher, because an unknown number of farmers disguise their suicides as farm accidents.


 


The US farmer suicide crisis echoes a much larger farmer suicide crisis happening globally: an Australian farmer dies by suicide every four days; in the UK, one farmer a week takes his or her own life; in France, one farmer dies by suicide every two days; in India, more than 270,000 farmers have died by suicide since 1995.



The lightbulb


For the longest time, I had no idea why grain prices were so low. It perplexed me. Central banks around the globe were printing money at an unprecedented pace. All else being equal, you would expect a real asset, like grains, to have rallied in these circumstances. Yeah, sure, advances in farming technology might keep the price of grains pressured, but at the same time, demand has also never been higher, so you would expect the debasement of money to eventually win out and send grain prices skyward.


But more importantly, these situations are usually self correcting. Nothing solves the problem of oversupply like low prices. Except this time. Even with the state of farming littered with heartbreaking stories of ruined families, not enough farmers are giving up planting crops to allow the price to rise.


This conundrum would still be a mystery to me if it weren't for one of my sharp readers, who sent me a note last week. It was actually a response to a post I wrote, Grandma's Bond Portfolio is in Trouble, but Shaeffer Steward of Nesvick Trading Group related it back to the grain market in such an original way that I felt it was too important not to share.


I suggest that while Kevin’s assessment for the economy in general might be eerily accurate, it is ENTIRELY BACKWARDS for agriculture.


 


Before you dismiss my hypothesis, hear me out.


 


I hypothesize that the farm economy is in dire circumstances (recall article I sent you the other day: https://www.dtnpf.com/agriculture/web/ag/news/article/2017/11/20/bankers-gearing-difficult?referrer=twitter#.WhLWFmNMPIE.twitter&DCMP=Todd )


 


Primarily because commodity prices skyrocketed during the 2004-2008 super-cycle, triggered by the ethanol buildout combined with huge demand growth out of China. When the GFC occurred in 2007-2008, many sectors of the economy literally collapsed under their own weight, but agriculture actually thrived because QE provided the accelerant to keep things going. You see, agriculture did exactly what you would've expected given a lower cost of money and greater availability of credit: commodity prices remained rather high, so farmers levered up and borrowed money, and banks were glad to loan it to them, as many were using land as collateral - and after all, land values were based on what people were willing to pay (rent) to farm it, or what sort of return they needed to make it a worthwhile investment.


 


What we've seen happen is massive leveraging and a steadily increasing cost of production (seed, chemical, fertilizer, equipment, insurance, land rents, etc.), and now, as prices come under pressure due to massive global oversupplies, margins have quickly collapsed while the cost structure hasn't responded. Instead, farmers have levered up further by refinancing land and/or selling off some land to keep their bankers going along with them, and the cycle has continued.


 


Why would the banks lend to farmers when they didn’t lend to normal citizens? Why would farmers be willing to borrow money when normal citizens weren’t willing to borrow money? Glad you asked.


 


CROP INSURANCE


 


Specifically, federally subsidized crop insurance.


 


Farmers take extraordinary risks doing what they do BUT they now have access (and have had access) to crop insurance that protects a portion of their historical production and/or projected revenue. When I say “a portion” I mean upwards of 75-85%. When I say “federally subsidized crop insurance” I mean that the federal govt pays upwards of 65% of the premium on behalf of the farmer on some crop insurance policies. WHOA.


 


Let me put figures to it for you. Imagine that you were a farmer and your history showed that your 5-year avg yield (actual production history) on your farm was 55 bu/ac, and at planting time the insurance price for soybeans was $10.19/bu. Let's say that it was going to cost you $550/ac to grow soybeans, so a breakeven type situation if you make ordinary yields at ordinary prices. Imagine that you could guarantee yourself $420.00/ac in revenue ($10.19/bu x 55 bu/ac = $560/ac of revenue x 75% coverage = $420/ac) and it only cost you $3.70/ac. You're paying $3.70/ac to guarantee yourself $420.00/ac in revenue. Pretty cheap, right? Yes, but the REAL cost of that insurance is more like $8.23/ac, with the govt paying $4.53/ac and the farmer paying $3.70.
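To make that arithmetic easy to check, here is a minimal sketch in Python. The variable names are mine, and the figures are the illustrative ones from the example above, not an actual policy quote:

    # Crop-insurance revenue guarantee, using the example figures above.
    # Illustrative only - not drawn from any real policy.
    aph_yield = 55.0          # 5-year actual production history, bu/ac
    insurance_price = 10.19   # $/bu, set at planting
    coverage = 0.75           # 75% revenue coverage election

    expected_revenue = aph_yield * insurance_price    # ~$560/ac
    revenue_guarantee = expected_revenue * coverage   # ~$420/ac

    total_premium = 8.23      # $/ac, full unsubsidized cost
    farmer_premium = 3.70     # $/ac, farmer's share
    govt_subsidy = total_premium - farmer_premium     # ~$4.53/ac

    print(f"Guaranteed revenue: ${revenue_guarantee:.2f}/ac")
    print(f"Farmer pays ${farmer_premium:.2f}/ac; the subsidy covers "
          f"{govt_subsidy / total_premium:.0%} of the premium")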


 


Granted, there are some situations in which you can lose more, and some causes of loss, such as hail, are not covered by basic crop insurance and require a separate policy, but in the grand scheme of things, the cost of protecting 75% of revenue is reasonable enough that farmers buy it and banks make loans that they might not otherwise make sans crop insurance policies. There are also increased risks because the loss calculations are based on futures prices at planting and harvest time and do not address the cash markets, which might have wide, unfavorable basis, so it isn't anywhere near a complete failsafe, but it is enough to keep the borrowed money flowing.


 


Now we need to put it all together. The relatively “cheap” cost of subsidized crop insurance encourages the farmer to take risks he wouldn't take otherwise. The balance sheet equity he has goes a lot further if you consider that he “really” only has $130/ac at risk instead of the full $550/ac, so he's willing to a) stay in the game and b) expand his acreage, because if he hits a home run on larger yields and/or higher prices, then JACKPOT! If it goes bad, he's out $130/ac and it doesn't completely wipe him out - plus he's using the bank's money at very low interest rates.
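That asymmetry falls straight out of the same numbers; a quick sketch (same illustrative figures as above):

    # Downside exposure per acre, with and without the coverage above.
    cost_per_acre = 550.0
    revenue_guarantee = 420.34   # 55 bu/ac x $10.19/bu x 75%

    max_loss_uninsured = cost_per_acre                     # ~$550/ac at risk
    max_loss_insured = cost_per_acre - revenue_guarantee   # ~$130/ac at risk

    print(f"At risk without insurance: ${max_loss_uninsured:.0f}/ac")
    print(f"At risk with 75% coverage: ${max_loss_insured:.0f}/ac")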


 


The farmer not only wants to stay in the game but he wants to grow, so he's bidding up inputs and, more importantly, land rents, because if you don't have the land, then you're out of the game. Revenues continue to be good, in general, so the farm cash flow has meat on the bone, and where there is meat on the bone, the dogs come chewing. Seed costs are higher every year, and sometimes much higher. Equipment costs have gone FREAKIN' PARABOLIC. Land rents have skyrocketed. Since many farmers are self-insured, health insurance prices have… well, you know what they've done. Much of this expansion has been done with debt financing on equipment, meaning that while the interest rates are low, interest costs are accumulating. You see, there HAS been demand for debt from agriculture, and the lenders have seen positive cash flows and the revenue safety net of crop insurance as courage to continue to lend to farmers.


 


Let's take a detour for a moment here - banks have wanted to lend money, but “conservatively,” and if the average consumer really hasn't had the appetite for borrowing money, that makes it a difficult task. If you're a regional bank or small-town bank and you can lend out money on 10-12 month agricultural operating notes to the tune of $500k-2.5 mil each, isn't it much easier to put $10-20 mil to work than if you were making retail loans for cars, houses, etc., particularly since those loans are longer-maturity loans? What if you could effectively put $20 mil out in annual operating loans with 12-month or shorter maturities at 4.5-5.5% via 25-30 loans, PLUS the person borrowing it has 75% revenue protection bought via crop insurance, as well as land and equipment collateralizing the notes, at a time when equipment and land prices are zooming into the stratosphere?!
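A rough sketch of what that book looks like from the lender's side, using midpoint figures picked from the ranges in the note (nothing here comes from an actual bank):

    # A hypothetical small bank's ag operating-loan book, per the note above.
    book = 20_000_000   # $20 mil in annual operating notes
    n_loans = 27        # midpoint of "25-30 loans"
    rate = 0.05         # midpoint of 4.5-5.5%

    avg_note = book / n_loans    # ~$740k, inside the $500k-2.5 mil range
    annual_interest = book * rate

    print(f"Average note: ${avg_note:,.0f}")
    print(f"Interest income: ~${annual_interest:,.0f}/yr on 12-month paper")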


 


You see, the ag community kept growing, and the appetite for debt was there from the start but encouraged by federally subsidized crop insurance. Lenders needed to put money to work, and they found it all too easy to make large operating notes that renewed annually at decent interest rates to individuals/businesses that were a) looking at positive cash flows, b) partially protected by federally subsidized crop revenue protection in the form of crop insurance, and c) collateralized by rapidly appreciating assets (equipment & land). Farmers get to expand, rural America gets a hand, bankers put money to work and everyone lives happily ever after…


 


Until commodity prices come under pressure because the supply side gets overstimulated, the revenue side drops dramatically while the cost side remains sticky, and then we get the massive transfer of equity from the farmer to a variety of beneficiaries, including a) banks in the form of interest, b) landlords in the form of higher rent and higher asset (land) values, c) equipment companies in the form of inflated revenues due to inflated equipment prices, and d) input providers in the form of higher prices for seed, chemical and fertilizer… all being transferred from the farmer's balance sheet.


 


Then you add in the intangible side to the equation: what is the farmer going to do if he decides to quit because he doesn’t want to take all of these risks? If he decides NOT to farm because he sees what is happening in terms of greater and greater risks to his equity what is he going to do to put food on his table? If he doesn’t pay the extra $25/ac land rent to keep a neighbor from renting it out from under him he’ll lose the land and then what will he do? There are only so many jobs “in town” to get and rural America is drying up so what will he do? You see, here is the hard part. He made the decision to get in or stay in the rat race even when he knew that the numbers didn’t make sense because he didn’t see a viable “plan B” and there was a banker standing there able and willing to continue to give him more and more rope until he finally hanged himself when the mouse trap flipped on him.


 


THAT, fine sir, is where we are today in US agriculture.


 


I apologize that this turned out as lengthy as it did BUT I felt that it was a worthwhile exercise to put these thoughts into email and share them with you because you are a student of the markets and also because you will hopefully be joining us for our Commodity Roundtable in January so a better framing of the situation might help you understand the circumstances they are facing.


 


As a macroeconomist, how do we work out from under this situation? What is the roadmap for the US farmer? Higher commodity prices are a temporary fix, as we've seen, because as long as the money is available (available credit) and affordable (low interest rates), the inflationary explosion continues on the cost/input side of the equation. Currently we're shrinking farmer balance sheets until banks won't be able to lend to them any longer, at which time the decisions will be made FOR the farmer, not BY the farmer.



Brilliant! I mean, f’ng brilliant. Shaeffer completely nailed it. The government’s subsidies have created a situation where far too much credit has been extended to an industry. This has caused inflation in prices of the inputs that go into farming, but not the output.


Want another example? Have a look at Student Loans versus tuition inflation.



Tuition inflation has greatly outpaced regular CPI, but it has gone hand in hand with the growth of student debt. Over-allocation of credit has peculiar effects on the pricing of both the inputs and the outputs of the affected area.


What to do about it?


Now I am not sure what to do about Shaeffer's deduction. As long as subsidies exist, it seems that too much money will be allocated to agriculture loans, which will therefore keep grain prices low.


But here’s a thought. Over the past half dozen years, there has been little demand for loans in the regular economy. This has encouraged bankers to lend to farmers with their government backstop.


Yet what will happen if economic activity picks up? Loan demand across all sectors will increase, decreasing the amount of credit extended to farmers. This will occur at a terrible time, as grain prices are near rock-bottom levels. Unfortunately, without as much credit, many of these farmers might be forced to quit. However, that will cause the price of grains to rally - maybe to a more sustainable level where farmers can once again make a living. Ironically, rising interest rates might be the best thing for both farmers and grain prices.


Wait! Did I just make another bullish argument for buying grains?



Yeah, yeah, I did. As Richard Dreyfuss taught me so well, let it ride…



Market On Close in December


What’s that famous Wall Street saying? The dumb money trades in the morning, the smart money trades at the close. Well, astute market watcher Helene Meisler recently highlighted that the Market on Close (MOC) imbalances have consistently been to the sell side lately.



In fact, every single day in December has seen MOC sell imbalances.


Institutions often trade at the close, while the public is more prone to trading closer to the open. There has even been an indicator created to measure this phenomenon.



If we look at the SMART Index, the late day selling shows up clearly with a big retreat from the highs.



So far, the stock market has not followed the SMART Index lower in any meaningful way. But don’t worry, I am sure this distribution by institutions is somehow bullish. After all, don’t you know? Stocks only go higher.


A Perfect Forecast


While I am on the topic of the stock market, earlier in the week Meb Faber noted that Barron's reported:



These strategists are usually bullish, so it’s not terribly surprising. But it does smack of another period when universal optimism also reigned. At the end of 2007, the S&P 500 stood at 1468 and Wall Street’s smartest had the following forecasts:



And where did it close? Down 38.5% to 903. Ooops. Just a little off.


Thanks for reading and have a great weekend,









Thursday, November 23, 2017

Millennials Have Ushered In The "Baby Bust" Cycle

Negative Population Growth, Inc., has issued a November report warning that America is no longer making enough babies to keep pace with deaths. The report blames the 'baby bust' phase on the millennial generation (born 1980-2000), which is having children at record-low rates.



Their attitudes towards marriage, procreation, and materialism changed dramatically after the Great Recession, when the economies of the world came to a screeching halt. After a decade of excessive monetary policy from the Federal Reserve, millennials have been forced to take on an excessive amount of debt - auto loans, consumer debt, and student loans - in an era of wage stagnation. This has fundamentally changed the game for millennials and perhaps changed the course of the United States. The implications of falling birth rates in a low-growth economic environment, coupled with massive amounts of debt, are a perfect storm that will lead to the next crisis.


Falling birth rates in the United States are what some have classified as the 'baby bust'. Like any bubble, there must be a bust cycle, and when it comes to births in the United States, that time is now. According to the report, some demographers are “freaked out by the falling birth rate, an occupational hazard for people who spend their professional lives scrutinizing population statistics”. As the demographic winds shift, the United States is preparing for a 'Japanification' period of lower birth rates and a much older generation straining the economic and healthcare systems.


According to the Centers for Disease Control and Prevention, the number of babies born declined by 338,000, or 8.7%, between 2007 and 2016. Over the same period, the national fertility rate declined from 69.3 births per 1,000 women to a historic low of 62.0 in 2016. For more color: the post-World War II peak was 118 in 1960, and the rate has been in decline ever since.



As a result, the national fertility rate (all ages) broke a bearish flag (chart below) and fell 11% between 2007 and 2016. To keep pace with deaths, women would need to average 2.1 births each, but the figure today is roughly 1.8.
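That percentage change is easy to verify from the CDC figures quoted above; a quick sketch:

    # General fertility rate (births per 1,000 women), per the CDC figures above.
    gfr_2007, gfr_2016 = 69.3, 62.0
    replacement, current = 2.1, 1.8   # births per woman

    pct_change = (gfr_2016 - gfr_2007) / gfr_2007
    print(f"Fertility rate change, 2007-2016: {pct_change:.1%}")
    # Prints ~-10.5%, i.e. roughly the 11% decline cited above.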


“The fertility rate decline is driven entirely by millennial mothers in their teens and twenties,” said the report.


 


“Birth rates for all age groups of women under 30 fell to record lows in 2016,” it added.



Besides poor economic conditions and a transitioning economy, the report added the increased “availability and effectiveness of sex education and contraceptives for males and females” have played a large role in reducing the birth rate for millennials.



Despite demographers freaking out about the falling birth rates, the report offers insight into how others are dealing with the negative trend,




Economists, however, have made peace with the notion that a shrinking population is not necessarily a bad thing. While GDP may slow, a better measure of the country’s economic health – GDP per capita – can benefit.


 


This is especially relevant in a world where robots, AI, and other technologies threaten the jobs of many Americans.




The United States is not alone in this demographic shift toward lower birth rates, as is evident below. Major developed economies and emerging growth economies are feeling similar pain.



The report says “we have been here before,” relating today's economic stress to the 1930s and the late 1970s, both of which coincided with ultra-low birth rates for the younger generation. Interestingly enough, the report asks: Is it different this time?


As the paper suggests, it is different: millennials are increasingly delaying having kids or abandoning the idea altogether.


The report lists four reasons why this time is different:


  • A 2016 study of Census data from Pew Research found nearly one-third of young adults (ages 18-34) live with their parents, slightly more than the proportion that live with a spouse or partner. Not since record keeping began in 1880 has living at home for this age group outpaced living with a spouse. “They’re concentrating more on school, careers and work and less focused on forming new families, spouses or partners and children,” Richard Fry, lead author of the Pew report, said of millennials. Although student debt is often blamed, it may not be the dominant factor: the trend is stronger for those without a college education.

  • When it comes to marriage, millennials say “I don't” more than any previous generation. Research by the Urban Institute finds that if current trends continue, 30.7% of millennial women will remain single by age 40, approximately twice the share of their Gen-X counterparts. The data show similar trends for males. Marriage rates fell drastically during the Great Recession, but they had been declining for years prior to that event. At this point even a return to pre-recession levels will not prevent marriage rates among millennial women from falling below those of Gen-Xers by age 40. Ironically, the aversion of millennial females to marriage may reflect their economic strength vis-à-vis males: “Sharp declines in the earning power of non-college males combined with the economic self-sufficiency of women — rising educational attainment, falling gender gap and greater female control over fertility choices — have reduced the economic value of marriage for women.”

  • A cross-generational study conducted at the Wharton School of Business found more than half (58%) of millennial female undergraduates do not plan to have children. That is nearly three times the 22% of Gen-X female undergraduates who did not want children when surveyed in 1992. Results were similar for male students. (The researchers compared surveys of the Wharton graduating classes of 1992 and 2012.) While Gen-X women felt “motherhood fulfilled their need to help others,” millennial females believe they can serve the greater need by succeeding at work. For millennial men, “doing good” is increasingly connected to creating greater balance between work and family. Not surprisingly, they are less likely to think of themselves as the sole breadwinner. Even millennials who do want children say they do not see a clear path toward it.

  • Immigrants are the wild card. They account for 15% of U.S. millennials, up from 6% of the prior generation. Although birth rates for foreign-born millennials are generally above those of native-born, a recent study by the Center for Immigration Studies finds that the gap is narrowing. From 2008 to 2015: birth rates for foreign-born women ages 15 to 19 fell 50.6% versus a 43% drop for native-born in that age cohort; birth rates for immigrant women 20 to 24 fell 40.5% versus a 28.5% decline for native-born. The Total Fertility Rate – a measure of the number of children a woman can be expected to have in her lifetime based on current patterns – fell 21.5% for immigrant women and 15.4% for native-born women over that period. The implication is clear: When it comes to family size, immigrant millennials have embraced the “smaller is better” ethos of the larger, native-born millennial community. That is good news to those of us who believe a smaller population is in the national interest.

Welcome to the new normal: Millennials will be the first generation for which the American dream will most likely not be attainable, as shown by the homeownership rate below. Since the real estate boom of the 2000s, the homeownership rate for people under thirty-five has fallen off a cliff. The report explores a number of factors behind this trend: student debt and the lingering impact of the Great Recession…



Another new normal: With the introduction of Uber and Lyft, fewer millennials are driving, leading to a shake-up in the auto industry. The conventional wisdom among automakers is that millennials will unlock a new tranche of demand, but that narrative is going cold as the sharing economy disrupts car ownership.



Meanwhile, General Mills in 2016 ran a national advertising campaign targeting the millennial generation titled 'make more babies'… That kind of messaging makes plain that at least one large corporation is well aware of the low-birth-rate trend.



The Washington Examiner sums it all up,




The report explains the shift to smaller families is driven by the poor economy, broken American Dream, and job losses millennials witnessed growing up. 




 









Saturday, October 14, 2017

CDC Says Americans Are Fatter Than Ever Before; 40% Of Adults Now Considered Obese

Despite the best efforts of our political elites to ban sodas, among other products deemed unhealthy by our growing nanny state, Americans just continue to grow ever fatter.  According to a new study from the Centers for Disease Control, 40% of Americans are now obese, a new all-time record high, and over 70% are overweight.  Per NBC:





A troubling new report released Friday by the Centers for Disease Control and Prevention shows that almost 40 percent of American adults and nearly 20 percent of adolescents are obese — the highest rates ever recorded for the U.S.



"It"s difficult to be optimistic at this point," said Dr. Frank Hu, chair of the Department of Nutrition at the Harvard School of Public Health. "The trend of obesity has been steadily increasing in both children and adults despite many public health efforts to improve nutrition and physical activity."



Overall, 70.7 percent of Americans are either overweight or obese, meaning that an unhealthy weight has become the norm, with normal weight Americans — a BMI of less than 25 — now in the minority.



So which states are harboring the largest Americans?  As it turns out, the fried delicacies of the American South aren't so great for the waistline...


Adult Obesity



Meanwhile, a study from Georgia Southern University revealed that it's not just poor eating habits that are causing Americans to pack on the pounds...extreme laziness and binge-watching the latest Netflix series are also contributing factors.





What the CDC report doesn't reveal is why the obesity crisis continues to worsen. A recent study by epidemiologists at Georgia Southern University discovered that fewer Americans, particularly women, are trying to lose weight. Public health experts say that an unhealthy diet and the lack of exercise are still the two biggest culprits.



"There’s still a huge amount of cheap, accessible, highly processed food available everywhere almost anytime," says Hu. "And despite people doing more recreational activity these days, the overall activity level, household activity and occupational activity has decreased in recent years."



In addition to unhealthy foods and a sedentary lifestyle, there could be another possible reason for the increasing obesity rates: sleep deprivation. An estimated 50 million to 70 million Americans suffer from sleep disorders or sleep deprivation, according to the Institute of Medicine.



Inadequate sleep is a risk factor for childhood and adult obesity, says Hu. Sleep-deprived people may be too tired to exercise, may take in more calories, and may undergo changes in the hormones that control appetite.



Perhaps even more disturbing, the CDC also found that 20% of teenagers and 10% of preschoolers are now considered obese.





The continued weight increase in the youngest Americans is especially worrisome for long-term health. One in five adolescents (ages 12–19), one in five kids (ages 6–11), and one in ten preschoolers (ages 2–5) are considered obese, not just overweight.



Obesity is medically defined as having a body-mass index of more than 30. The findings on obese kids in the U.S. come on top of this week's World Health Organization report that childhood obesity is soaring around the world, increasing more than tenfold over the past four decades.
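For reference, BMI is weight in kilograms divided by the square of height in meters; a minimal sketch of the cutoffs used in the report (collapsing underweight into the sub-25 band, as the article does):

    # Body-mass index and the standard cutoffs referenced above.
    def bmi(weight_kg: float, height_m: float) -> float:
        return weight_kg / height_m ** 2

    def category(b: float) -> str:
        if b < 25:
            return "normal weight (or below)"
        if b < 30:
            return "overweight"
        return "obese"

    print(category(bmi(95, 1.75)))  # 95 kg at 1.75 m -> BMI ~31 -> "obese"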



Overweight and obese children are at higher risk of staying obese, and childhood obesity is linked to a higher chance of early death in adulthood.





The consequences of the obesity epidemic are devastating: High blood pressure, diabetes, heart disease and stroke are not only killing millions of Americans annually — the obesity epidemic is also a humongous burden on the American health care system, making up $190 billion a year in weight-related medical bills. 


Luckily, Obamacare socialized the medical cost of these bad habits...so enjoy those McDonald's fries and we'll all share the cost of your blood pressure and cholesterol medications...it's just more "fair" that way.

Monday, October 2, 2017

Visualizing America's Disappearing Workforce

In his September 2017 paper entitled "Where Have All the Workers Gone? An Inquiry into the Decline of the U.S. Labor Force Participation Rate", Alan B. Krueger of Princeton University explores the dramatic fall in labor force participation in the U.S. from 1997 to 2017.


As Statista"s Martin Armstrong shows in the infographic below, over the last twenty years, the rate has fallen the most for the under 20"s, with the share of 16 to 17 year olds in work dropping by 18.4 and 16.2 percentage points for men and women, respectively.


Infographic: America's Disappearing Workforce | Statista


You will find more statistics at Statista


As Krueger reports, last year Italy was the only OECD country that had a lower participation rate for prime-age men than the United States. One of the reasons posited by the research is the opioid crisis currently ravaging the country. Labor force participation rates have fallen more in areas where more opioid pain medication is prescribed. According to the Centers for Disease Control and Prevention, the amount of opioids prescribed in 2015 was three times higher than it was in 1999.


As noted in the paper, while the direction of causality is not clear, a 2017 report by David Mericle entitled "The Opioid Epidemic and the U.S. Economy" states that “the opioid epidemic is intertwined with the story of declining prime-age participation, especially for men, and this reinforces our doubts about a rebound in the participation rate.”


But as we pointed out previously, after spending months, or maybe even years, running very complicated regressions that your simple mind could never possibly understand, Krueger would like for you to believe that it's the growing opioid epidemic that is forcing men to sit on their couches all day rather than look for work.  Here's a summary of his findings from the Brookings Institution:





The increase in opioid prescriptions from 1999 to 2015 could account for about 20 percent of the observed decline in men’s labor force participation (LFP) during that same period.



In “Where have all the workers gone? An inquiry into the decline of the U.S. labor force participation rate” (PDF), Princeton University’s Alan Krueger examines the labor force implications of the opioid epidemic on a local and national level.



Among other findings, the research suggests that:



  • Regional variation in opioid prescription rates across the U.S. is due in large part to differences in medical practices, rather than varying health conditions. Pain medication is more widely used in counties where health care professionals prescribe greater quantities of opioid medication, with a 10 percent increase in opioid prescriptions per capita associated with a 2 percent increase in the share of individuals who report taking a pain medication on any given day (a back-of-the-envelope version of this relationship is sketched after this list). When accounting for individuals' disability status, self-reported health, and demographic characteristics, the effect is cut roughly in half, but remains statistically significant.

  • Over the last 15 years, LFP fell more in counties where more opioids were prescribed. Krueger reaches this conclusion by linking 2015 county-level opioid prescription rates to individual-level labor force data from 1999-2001 and 2014-16.
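Here is the back-of-the-envelope version of the prescription/use association promised above. It is purely illustrative - this is not Krueger's regression, and the 5% base share is made up:

    # Implied share of people reporting pain-medication use on a given day,
    # applying the reported association: +10% prescriptions per capita ->
    # +2% (relative) in the use share, roughly halved once controls are added.
    def implied_use_share(base_share, rx_increase, controlled=False):
        effect_per_10pct = 0.01 if controlled else 0.02
        return base_share * (1 + effect_per_10pct * (rx_increase / 0.10))

    print(f"{implied_use_share(0.05, 0.20):.3f}")        # 5.0% -> 5.2%
    print(f"{implied_use_share(0.05, 0.20, True):.3f}")  # with controls: ~5.1%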


Krueger also provided this very helpful map proving that opioid abuse is highly correlated with unemployment.  Of course, it couldn't possibly be the case that opioid abuse is the result of high unemployment and the associated depression that goes along with it...no, the opioid abuse definitely came first.




So, what is Krueger"s solution to help reverse the seemingly perpetual decline in labor force participation rates?  If you guessed "Obamacare" then you"re absolutely right...and unfortunately, no, that is not a joke...here is the excerpt from page 38 of Krueger"s paper:





Third, addressing the decades-long slide in labor force participation by prime age men should be a national priority. This group expresses low levels of SWB and reports finding relatively little meaning in their daily activities. Because nearly half of this group reported being in poor health, it may be possible for expanded health insurance coverage and preventative care under the Affordable Care Act to positively affect the health of prime age men going forward.



And while we would never presume to be smart enough to question the very thorough, impartial research of a Princeton economist, we do wonder whether it's in any way relevant that labor force participation rates seemingly started to decline in 1965...




...at exactly the same time that welfare spending started to surge?




It"s probably just a coincidence.

Tuesday, September 19, 2017

Small-Town American Budgets Devastated By Opioid Crisis As 41 States Subpoena Big Pharma

A surge in opioid consumption - primarily prescription painkillers, heroin, and fentanyl, a drug 50 to 100 times more powerful than morphine - and the resulting spike in overdose-related deaths are devastating families in rural America.  But the opioid epidemic is laying waste to more than just the broken families it counts among its victims; as Reuters points out today, rural municipalities are finding it nearly impossible to fund the surging costs associated with overdoses, which come in the form of higher emergency call volumes, medical examiner and coroner bills, and overcrowded jails and courtrooms.


As an example, Ross County, Ohio, a county of only 77,000, says its child-services budget has doubled in just five years, and 75% of the children placed into protection come from homes where parents have opioid addictions.





Ross County, a largely rural region of 77,000 people an hour south of Columbus, Ohio, is wrestling with an explosion in opioid-related deaths - 44 last year compared to 19 in 2009. The drug addiction epidemic is shattering not just lives but also stressing the county budget.



About 75 percent of the 200 children placed into state care in the county have parents with opioid addictions, up from about 40 percent five years ago, local officials say. Their care is more expensive because they need specialist counseling, longer stays and therapy.



That has caused a near doubling in the county’s child services budget to almost $2.4 million from $1.3 million, said Doug Corcoran, a county commissioner.



For a county with a general fund of just $23 million, that is a big financial burden, Corcoran said. He and his colleagues are now exploring what they might cut to pay for the growing costs of the epidemic, such as youth programs and economic development schemes.






But it"s not just the cost of child services that is wreaking havoc on municipal budgets as everything from autopsy and toxicology costs to court fees and jail expenses are surging throughout rural America.





Autopsy and toxicology costs there have nearly doubled in six years, from about $89,000 in 2010 to $165,000 in 2016, county data shows.



Court costs are soaring, mainly because of the expense of prosecuting opioid-related crimes and providing the accused with a public defender, local officials say.



The county is using contingency funds to pay for the added coroner costs, said Mike Baker, the county’s top government official. Last year, the county drew $63,000 from those funds, up from $19,000 in 2014, he said. In 2014, the county saw 10 drug-related deaths. In 2016, the number had grown to 53.



In Mercer County, West Virginia, 300 miles (483 km) to the south of Indiana County, opioid-related jail costs are carving into the small annual budget of $12 million for the community of 62,000 people.



The county’s jail expenses are on course to increase by $100,000 this year, compared to 2015. The county pays $48.50 per inmate per day to the jail, and this year the jail is on course to have over 2,000 more “inmate days” compared to 2015, according to county data.



“At least 90 percent of those extra jail costs are opioid-related,” said Greg Puckett, a county commissioner who sits on a national county opioid task force. “We spend more in one month on our jail bill than we spend per month on economic development, our health department and our emergency services combined.”
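The jail arithmetic in that passage roughly foots; a quick check using only the figures given in the quote:

    # Mercer County jail-cost arithmetic, from the Reuters figures above.
    per_inmate_day = 48.50      # $ paid to the jail per inmate per day
    extra_inmate_days = 2_000   # "over 2,000 more inmate days" vs 2015

    extra_cost = per_inmate_day * extra_inmate_days
    print(f"~${extra_cost:,.0f} in added jail costs")  # ~$97k, vs the ~$100k cited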



Meanwhile, as Bloomberg has just noted, attorneys general from 41 states are broadening their investigation into the opioid industry and have served subpoenas to five pharma companies that make the most powerful prescription painkillers.





They announced Tuesday that they had served subpoenas requesting information from five companies that make powerful prescription painkillers and three distributors. Forty-one attorneys general are involved.



The investigation into marketing and sales practices seeks to find out whether the industry's own actions worsened the epidemic.



If the industry cooperates, the investigation could lead to a national settlement.



The Healthcare Distribution Alliance said in a statement that it's not responsible for the volume of opioid prescribing but that it does want to work on solving the public health crisis.



Dozens of local and state governments have already filed, announced or publicly considered lawsuits against drugmakers or distributors.



To add some context to the scale of the opioid epidemic, the California Department of Public Health recently dropped some staggering statistics showing that there are a remarkable number of counties in California where annual prescriptions for painkillers actually exceed the population.





Trinity County is the state’s fourth-smallest, and ended last year with an estimated population of 13,628 people.



Its residents also filled prescriptions for oxycodone, hydrocodone and other opioids 18,439 times, the highest per capita rate in California.



Besides Trinity, other counties with more prescriptions than people include Lake, Shasta, Tuolumne and Del Norte counties. In the Sacramento region, El Dorado, Placer and Sacramento counties had prescription rates above the statewide average, with Yolo County slightly below the state average.



A county’s prescription total represents all opioids dispensed via prescriptions filled at a pharmacy and tracked by the state. Statewide, 15 percent of Californians were prescribed opioids in 2016, ranging from 7.3 percent of residents in tiny Alpine County to almost 27 percent in Lake County.
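The "more prescriptions than people" claim is a simple ratio; a sketch using Trinity County's figures from the quote:

    # Trinity County, CA: opioid prescriptions filled per resident, 2016.
    population = 13_628
    prescriptions = 18_439

    per_capita = prescriptions / population
    print(f"{per_capita:.2f} prescriptions per resident")  # ~1.35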



As might be expected, the scripts per capita are highest in California's more rural northern counties.




So who is participating most in this deadly epidemic?  Well, according to the Centers for Disease Control and Prevention, the biggest abusers of opioids are high-school educated, unemployed, white people living in small towns...





“The following characteristics were associated with higher amounts of opioids prescribed: a larger percentage of non-Hispanic whites; higher rates of uninsured and Medicaid enrollment; lower educational attainment; higher rates of unemployment; (small-town) status; more dentists and physicians per capita; a higher prevalence of diagnosed diabetes, arthritis, and disability; and higher suicide rates,” concluded the authors of a Centers for Disease Control and Prevention study released in July.



“What you’re seeing in California is what you’re seeing in many parts of the country, including Oregon,” Korthuis said. “There are still a lot of rural counties around the U.S. that are awash in prescription opioids.”



Of course, growth in opioid addiction is hardly just a California phenomenon.  According to the CDC's Annual Surveillance Report of Drug-Related Risks and Outcomes, addiction-related deaths are far more prevalent in the rural "rust-belt" states of the Midwest.




Meanwhile, the epidemic is growing far more severe every year with overdose deaths up 167% across the country since 1999.





The rate of drug overdose deaths increased from 6.1 per 100,000 population in 1999 to 16.3 in 2015; for unintentional drug overdose deaths, the rate increased from 4.0 per 100,000 in 1999 to 13.8 in 2015; for drug overdose deaths involving any opioid, the rate increased from 2.9 per 100,000 in 1999 to 10.4 in 2015 (p<0.05); for unintentional drug overdose deaths involving any opioid, the rate increased from 2.1 per 100,000 in 1999 to 9.3 per 100,000 in 2015 (p<0.05). For all four categories of drug overdose deaths, increases in rates were largest from 2013 to 2015, with the rate increasing on average by 9% per year for overall drug overdose deaths (p<0.05), 11% per year for unintentional drug overdose deaths (p<0.05), 15% per year for drug overdose deaths involving any opioid (p<0.05), and 16% for unintentional drug overdose deaths involving any opioid (p<0.05).
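The 167% figure cited above follows directly from those rates; a quick check:

    # Overdose death rates per 100,000, from the CDC figures quoted above.
    overall_1999, overall_2015 = 6.1, 16.3
    opioid_1999, opioid_2015 = 2.9, 10.4

    print(f"Overall: +{overall_2015 / overall_1999 - 1:.0%}")    # ~+167%
    print(f"Any opioid: +{opioid_2015 / opioid_1999 - 1:.0%}")   # ~+259%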





But don"t worry too much because, as Princeton Economist Alan Krueger told us recently, there is a simple solution to the opioid epidemic in the U.S...apparently it can all be solved with just a little more Obamacare.

Sunday, September 10, 2017

Here Are The California Counties Where Annual Opioid Scripts Outnumber People

The California Department of Public Health just dropped some staggering statistics about the level of opioid abuse in America's progressive paradise on the left coast.  As the Sacramento Bee points out, there are a remarkable number of counties in California where annual prescriptions for painkillers actually exceed the population.





Trinity County is the state’s fourth-smallest, and ended last year with an estimated population of 13,628 people.



Its residents also filled prescriptions for oxycodone, hydrocodone and other opioids 18,439 times, the highest per capita rate in California.



Besides Trinity, other counties with more prescriptions than people include Lake, Shasta, Tuolumne and Del Norte counties. In the Sacramento region, El Dorado, Placer and Sacramento counties had prescription rates above the statewide average, with Yolo County slightly below the state average.



A county’s prescription total represents all opioids dispensed via prescriptions filled at a pharmacy and tracked by the state. Statewide, 15 percent of Californians were prescribed opioids in 2016, ranging from 7.3 percent of residents in tiny Alpine County to almost 27 percent in Lake County.



As might be expected, the scripts per capita are highest in California's more rural northern counties.




So who is participating most in this deadly epidemic?  Well, according to the Centers for Disease Control and Prevention, the biggest abusers of opioids are high-school educated, unemployed, white people living in small towns...





“The following characteristics were associated with higher amounts of opioids prescribed: a larger percentage of non-Hispanic whites; higher rates of uninsured and Medicaid enrollment; lower educational attainment; higher rates of unemployment; (small-town) status; more dentists and physicians per capita; a higher prevalence of diagnosed diabetes, arthritis, and disability; and higher suicide rates,” concluded the authors of a Centers for Disease Control and Prevention study released in July.



“What you’re seeing in California is what you’re seeing in many parts of the country, including Oregon,” Korthuis said. “There are still a lot of rural counties around the U.S. that are awash in prescription opioids.”



...oh, and grandma and grandpa are getting high on the reg as well.





In California, residents aged 15 to 29 got 1.7 million prescriptions in 2016, representing 7.2 percent of the state total. That’s down from the 1.9 million prescriptions in 2015, which represented about 7.8 percent of the state total. The age range that featured the largest prescription rate increase were 70- to74-year-olds, whose prescriptions grew from almost 1,354 per 1,000 people in 2015 to 1,394 per 1,000 people in 2016.
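Those shares imply a statewide total; a back-calculation derived from the quoted figures (the total itself is not stated in the article):

    # Implied 2016 statewide opioid prescription total, from the shares above.
    rx_ages_15_29 = 1_700_000   # prescriptions to Californians aged 15-29
    share_of_total = 0.072      # 7.2% of the state total

    implied_total = rx_ages_15_29 / share_of_total
    print(f"Implied statewide total: ~{implied_total / 1e6:.1f}M")  # ~23.6M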





Of course, growth in opioid addiction is hardly just a California phenomenon.  According to the CDC's Annual Surveillance Report of Drug-Related Risks and Outcomes, addiction-related deaths are far more prevalent in the rural "rust-belt" states of the Midwest.




Meanwhile, the epidemic is growing far more severe every year with overdose deaths up 167% across the country since 1999.





The rate of drug overdose deaths increased from 6.1 per 100,000 population in 1999 to 16.3 in 2015; for unintentional drug overdose deaths, the rate increased from 4.0 per 100,000 in 1999 to 13.8 in 2015; for drug overdose deaths involving any opioid, the rate increased from 2.9 per 100,000 in 1999 to 10.4 in 2015 (p<0.05); for unintentional drug overdose deaths involving any opioid, the rate increased from 2.1 per 100,000 in 1999 to 9.3 per 100,000 in 2015 (p<0.05). For all four categories of drug overdose deaths, increases in rates were largest from 2013 to 2015, with the rate increasing on average by 9% per year for overall drug overdose deaths (p<0.05), 11% per year for unintentional drug overdose deaths (p<0.05), 15% per year for drug overdose deaths involving any opioid (p<0.05), and 16% for unintentional drug overdose deaths involving any opioid (p<0.05).





But don"t worry too much because, as Princeton Economist Alan Krueger told us yesterday, there is a simple solution to the opioid epidemic in the U.S...apparently it can all be solved with just a little more Obamacare.

Saturday, June 24, 2017

Mapping The U.S. States That Smoke The Most (And Least)

The number of people smoking in the U.S. has fallen considerably over the years.


The most recent figures from the Centers for Disease Control and Prevention show that just 15 percent of adults are cigarette smokers - down from 20.9 percent in 2005.


There is considerable variation between states though, as this infographic shows.


Infographic: The U.S. States That Smoke The Most | Statista


You will find more statistics at Statista


As Statista"s Martin Armstrong notes, the most smoke-free state is Utah, where 9.1 percent of adults admit to the habit.


On the other end of the scale, Kentucky residents are the nation's most prolific tobacco consumers, with 26 percent smoking.


Outside of the 50 states, Guam actually has the highest rate of smokers, at 27.4 percent. Puerto Rico, on the other hand, has the second-lowest rate - 10.7 percent.

Saturday, May 6, 2017

A New Street Drug Can Kill You By Touching Your Skin: What You Need To Know

Authored by Alice Salles via TheAntiMedia.org,



The opioid epidemic is a real tragedy. It has been devastating states like West Virginia, Vermont, and Maine - among others - and it’s been the number one factor in a major incarceration shift that is still seldom discussed by the media.


But just as the Centers for Disease Control and Prevention (CDC) released a new set of national standards for prescribing painkillers, yet another deadly drug threat began to concern authorities in certain states.


New Hampshire Governor Chris Sununu spoke at a press conference this week, warning that a drug that’s 10,000 times stronger than morphine has made its way into the state. As a result, many first responders have been left scrambling to find a way to handle this new threat.


Carfentanil, a powerful new opioid, has already claimed three lives.


Engineered as an elephant tranquilizer, the drug has a lethal dosage of just 20 micrograms. Since the product can cause deadly effects just by being sprinkled on someone's skin, authorities are highly concerned.





Manchester Fire's EMS Director Chris Hickey is warning New Hampshire residents they must be "hyper, hyper vigilant of what is out there, hyper vigilant of where you put your hands, what you come in contact with."



"There is nothing out there other than going on in hazmat suits on every single overdose that is going to completely protect us. We just have to be super, super careful with it," Hickey told his own crew.



The drug is so powerful that first responders are even having a hard time reversing overdoses when they arrive at emergency locations.


On one occasion, Hickey said, one of his men had to use six to eight doses of Narcan, an overdose reversal drug, to revive a victim - twice the dose used in most cases.


As doctors and first responders notice a pattern, they are also warning the public that Narcan isn’t going to be enough from now on. So what is next?


Fear, of course.


As state and local authorities find themselves panicking over this issue, many will ask for tougher laws. Federal agencies will then intervene, adding further restrictions to the already heavily regulated drug market in the United States. Adding fuel to the fire, the drug war will continue to target opioids like heroin and opium while Congress continues the process of imposing strict limits on some opioid prescriptions.


As more restrictions are applied, users will have a harder time gaining access to the substances they are already addicted to, forcing them to turn to the black market for their fix.


With this, incidents like the ones we’re seeing in New Hampshire will become even more common, prompting further government involvement. As this snowballs into further restrictions, the opioid epidemic will reach unimaginable levels, killing a record number of people, making orphans out of countless children, and creating another boom in U.S. incarceration rates.


While it’s easy to understand why locals in New Hampshire are afraid, the rhetoric and reality on the ground should not be used to push for more heavy-handed intervention from local and federal governments. Instead, it’s time to look deep into how the opioid crisis started, keeping in mind that the government’s own fruitless battle against drugs was the very root of what is now concerning New Hampshire authorities.


Like New Hampshire’s Drug Lab Director Tim Pifer, we agree that “this is certainly unfortunately just the tip of the iceberg.” But just like any iceberg, its base lies in dark, cold waters. Unless we’re ready to be honest with ourselves, finding the courage to dive deep to find where it begins, we will never know how huge this problem really is. And if we’re not willing to look at the root of the problem, we won’t be able to find a proper solution.

Thursday, March 23, 2017

Lead Poisoning In "Dozens Of California Communities" Worse Than Flint, Michigan

California, a state infamous for its environmental protections, including a $65 billion tunnel project being pushed by Governor Jerry Brown so as not to disrupt the habitat of a tiny, non-native fish species, may be facing a lead-poisoning crisis more severe than Flint, Michigan's.


According to blood test data obtained by Reuters, rates of childhood lead poisoning in several California cities surpass those measured in Flint, Michigan, with one Fresno locale showing rates nearly three times higher.


In fact, in Fresno’s downtown 93701 zip code, nearly 14% of children tested showed lead levels at or above 5 micrograms per deciliter of blood, the Centers for Disease Control and Prevention’s current threshold for an elevated reading.  As the CDC noted, no level of lead exposure is safe, but children who test that high warrant an immediate public health response.


In all, per the map below, Reuters found at least 29 California neighborhoods where children had elevated lead tests at rates at least as high as in Flint.  “It’s a widespread problem and we have to get a better idea of where the sources of exposure are,” said California Assembly member Bill Quirk, who chairs the state legislature’s Committee on Environmental Safety and Toxic Materials. 





And while elevated lead levels in children were found to be a fairly widespread issue across California, the poorer regions of the Central Valley, where the majority of California's crops are grown, were found to be particularly at risk.





In all, Fresno County had nine zip code areas where high lead levels among children tested were at least as common as in Flint. The Reuters article in December documented nearly 3,000 locales nationwide with poisoning rates double those found in the Michigan city along the Flint River.



The city of Fresno battles high poverty rates and problems with substandard housing, both risk factors for lead exposure. Some locals are also concerned with drinking water, after unsafe levels of lead were detected in at least 120 Fresno homes last year.



Fresno County’s lead poisoning prevention program conducts outreach across the city, and a program health educator, Leticia Berber, says exposure remains too common.



Still, she expressed surprise at the area’s high rate. “We haven’t looked at it that way compared to Flint,” Berber said.



Of course, California's Public Health Department attempted to downplay the results, saying that comparisons between the state's blood lead testing results and those from other states aren't warranted. It said California tests children deemed most at risk for lead exposure, such as those enrolled in Medicaid or living in older housing.  “Testing of at-risk children, and not all children, skews California results to higher percentage of children tested showing lead exposure,” the state said.


But, as Reuters points out, testing that targets at-risk children is common across much of the country. 


Blood tests can’t determine the cause of a child’s exposure, but potential sources include crumbling old paint, contaminated soil, tainted drinking water or other lead hazards.





Meanwhile, elevated exposure levels were also found to be a problem in larger neighborhoods surrounding San Francisco and Los Angeles as well.





Lead exposure is common in other East Bay areas, including large parts of Oakland, and nearby Emeryville and Fremont, the new data shows.



In January, Oakland city council members introduced a resolution that would require property owners to obtain lead inspections and safety certifications before renting or selling houses and apartments built before 1978, when lead paint was banned.



Emeryville"s city council this month proposed an ordinance to require proof that contractors will adhere to Environmental Protection Agency standards – including safe lead paint removal practices – before they renovate older housing.



Emeryville Vice Mayor John Bauters said paint exposure isn’t the only risk. A long history of heavy industry in the East Bay also left contaminated soil in some areas.



In the Los Angeles area, the prevalence of high blood lead tests reached 5 percent or above in at least four zip codes during 2012.



Since August, a sampling of children tested from the Los Angeles neighborhoods of Westlake, Koreatown and Pico Union revealed about 5 percent with high lead results, said Jeff Sanchez, a public health specialist at Impact Assessment, which helps Los Angeles run its lead poisoning prevention program.



“The more you look,” Sanchez said, “the more you find.”



Perhaps it’s time to divert some of those funds allocated to protect endangered fish and bees to actually help the taxpayers of your state, Governor Brown.

Sunday, March 5, 2017

America’s Miserable 21st Century

Via Nicholas Eberstadt of CommentaryMagazine.com,


On the morning of November 9, 2016, America’s elite—its talking and deciding classes—woke up to a country they did not know. To most privileged and well-educated Americans, especially those living in its bicoastal bastions, the election of Donald Trump had been a thing almost impossible even to imagine. What sort of country would go and elect someone like Trump as president? Certainly not one they were familiar with, or understood anything about.


I


Whatever else it may or may not have accomplished, the 2016 election was a sort of shock therapy for Americans living within what Charles Murray famously termed “the bubble” (the protective barrier of prosperity and self-selected associations that increasingly shield our best and brightest from contact with the rest of their society). The very fact of Trump’s election served as a truth broadcast about a reality that could no longer be denied: Things out there in America are a whole lot different from what you thought. 


Yes, things are very different indeed these days in the “real America” outside the bubble. In fact, things have been going badly wrong in America since the beginning of the 21st century.


It turns out that the year 2000 marks a grim historical milestone of sorts for our nation. For whatever reasons, the Great American Escalator, which had lifted successive generations of Americans to ever higher standards of living and levels of social well-being, broke down around then—and broke down very badly.


The warning lights have been flashing, and the klaxons sounding, for more than a decade and a half. But our pundits and prognosticators and professors and policymakers, ensconced as they generally are deep within the bubble, were for the most part too distant from the distress of the general population to see or hear it. (So much for the vaunted “information era” and “big-data revolution.”) Now that those signals are no longer possible to ignore, it is high time for experts and intellectuals to reacquaint themselves with the country in which they live and to begin the task of describing what has befallen the country in which we have lived since the dawn of the new century.


II


Consider the condition of the American economy. In some circles it is still widely believed, as one recent New York Times business-section article cluelessly insisted before the inauguration, that “Mr. Trump will inherit an economy that is fundamentally solid.” But this is patent nonsense. By now it should be painfully obvious that the U.S. economy has been in the grip of deep dysfunction since the dawn of the new century. And in retrospect, it should also be apparent that America’s strange new economic maladies were almost perfectly designed to set the stage for a populist storm.


Ever since 2000, basic indicators have offered oddly inconsistent readings on America’s economic performance and prospects. It is curious and highly uncharacteristic to find such measures so very far out of alignment with one another. We are witnessing an ominous and growing divergence between three trends that should ordinarily move in tandem: wealth, output, and employment. Depending upon which of these three indicators you choose, America looks to be heading up, down, or more or less nowhere.


From the standpoint of wealth creation, the 21st century is off to a roaring start. By this yardstick, it looks as if Americans have never had it so good and as if the future is full of promise. Between early 2000 and late 2016, the estimated net worth of American households and nonprofit institutions more than doubled, from $44 trillion to $90 trillion. (SEE FIGURE 1.)


Although that wealth is not evenly distributed, it is still a fantastic sum of money—an average of over a million dollars for every notional family of four. This upsurge of wealth took place despite the crash of 2008—indeed, private wealth holdings are over $20 trillion higher now than they were at their pre-crash apogee. The value of American real-estate assets is near or at all-time highs, and America’s businesses appear to be thriving. Even before the “Trump rally” of late 2016 and early 2017, U.S. equities markets were hitting new highs—and since stock prices are strongly shaped by expectations of future profits, investors evidently are counting on the continuation of the current happy days for U.S. asset holders for some time to come.
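

A quick back-of-the-envelope check of that per-family figure (a minimal sketch in Python; the roughly 323 million population estimate for 2016 is my assumption, not a number from the article):

```python
# Back-of-the-envelope check: average net worth per notional family of four.
total_net_worth = 90e12   # household + nonprofit net worth, late 2016 ($)
population = 323e6        # assumed 2016 U.S. population (not from the article)
families_of_four = population / 4

avg_per_family = total_net_worth / families_of_four
print(f"Average net worth per notional family of four: ${avg_per_family:,.0f}")
# -> roughly $1,114,000 -- "over a million dollars," as claimed
```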



A rather less cheering picture, though, emerges if we look instead at real trends for the macro-economy. Here, performance since the start of the century might charitably be described as mediocre, and prospects today are no better than guarded.


The recovery from the crash of 2008—which unleashed the worst recession since the Great Depression—has been singularly slow and weak. According to the Bureau of Economic Analysis (BEA), it took nearly four years for America’s gross domestic product (GDP) to re-attain its late 2007 level. As of late 2016, total value added to the U.S. economy was just 12 percent higher than in 2007. (SEE FIGURE 2.) The situation is even more sobering if we consider per capita growth. It took America six and a half years—until mid-2014—to get back to its late 2007 per capita production levels. And in late 2016, per capita output was just 4 percent higher than in late 2007—nine years earlier. By this reckoning, the American economy looks to have suffered something close to a lost decade.



But there was clearly trouble brewing in America’s macro-economy well before the 2008 crash, too. Between late 2000 and late 2007, per capita GDP growth averaged less than 1.5 percent per annum. That compares with the nation’s long-term postwar 1948–2000 per capita growth rate of almost 2.3 percent, which in turn can be compared to the “snap back” tempo of 1.1 percent per annum since per capita GDP bottomed out in 2009. Between 2000 and 2016, per capita growth in America has averaged less than 1 percent a year. To state it plainly: Had America grown at its postwar, pre-21st-century rate over the years 2000–2016, per capita GDP would be more than 20 percent higher than it is today.
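

That “more than 20 percent” figure follows from compounding the two growth rates quoted above over 16 years. A minimal sketch of the counterfactual arithmetic (using the rates cited in the paragraph; the calculation behind the original figure may differ in detail):

```python
# Compounding the quoted growth rates over 2000-2016 (16 years).
years = 16
actual_rate = 0.01     # <1% per annum realized per capita growth, 2000-2016
postwar_rate = 0.023   # ~2.3% per annum postwar (1948-2000) per capita growth

actual_path = (1 + actual_rate) ** years
counterfactual_path = (1 + postwar_rate) ** years

shortfall = counterfactual_path / actual_path - 1
print(f"Counterfactual per capita GDP gap: {shortfall:.1%}")
# -> about +23%, consistent with "more than 20 percent higher"
```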


The reasons for America’s newly fitful and halting macroeconomic performance are still a puzzlement to economists and a subject of considerable contention and debate. Economists are generally in consensus, however, in one area: They have begun redefining the growth potential of the U.S. economy downwards. The U.S. Congressional Budget Office (CBO), for example, suggests that the “potential growth” rate for the U.S. economy at full employment of factors of production has now dropped below 1.7 percent a year, implying a sustainable long-term annual per capita economic growth rate for America today of well under 1 percent.


Then there is the employment situation. If 21st-century America’s GDP trends have been disappointing, labor-force trends have been utterly dismal. Work rates have fallen off a cliff since the year 2000 and are at their lowest levels in decades. We can see this by looking at the estimates by the Bureau of Labor Statistics (BLS) for the civilian employment rate, the jobs-to-population ratio for adult civilian men and women. (SEE FIGURE 3.) Between early 2000 and late 2016, America’s overall work rate for Americans age 20 and older underwent a drastic decline. It plunged by almost 5 percentage points (from 64.6 to 59.7). Unless you are a labor economist, you may not appreciate just how severe a falloff in employment such numbers attest to. Postwar America never experienced anything comparable.



From peak to trough, the collapse in work rates for U.S. adults between 2008 and 2010 was roughly twice the amplitude of what had previously been the country’s worst postwar recession, back in the early 1980s. In that previous steep recession, it took America five years to re-attain the adult work rates recorded at the start of 1980. This time, the U.S. job market has as yet, in early 2017, scarcely begun to claw its way back up to the work rates of 2007—much less back to the work rates from early 2000.


As may be seen in Figure 3, U.S. adult work rates never recovered entirely from the recession of 2001—much less the crash of ’08. And the work rates being measured here include people who are engaged in any paid employment—any job, at any wage, for any number of hours of work at all.


On Wall Street and in some parts of Washington these days, one hears that America has gotten back to “near full employment.” For Americans outside the bubble, such talk must seem nonsensical. It is true that the oft-cited “civilian unemployment rate” looked pretty good by the end of the Obama era—in December 2016, it was down to 4.7 percent, about the same as it had been back in 1965, at a time of genuine full employment. The problem here is that the unemployment rate only tracks joblessness for those still in the labor force; it takes no account of workforce dropouts. Alas, the exodus out of the workforce has been the big labor-market story for America’s new century. (At this writing, for every unemployed American man between 25 and 55 years of age, there are another three who are neither working nor looking for work.) Thus the “unemployment rate” increasingly looks like an antique index devised for some earlier and increasingly distant war: the economic equivalent of a musket inventory or a cavalry count.


By the criterion of adult work rates, by contrast, employment conditions in America remain remarkably bleak. From late 2009 through early 2014, the country’s work rates more or less flatlined. So far as can be told, this is the only “recovery” in U.S. economic history in which that basic labor-market indicator almost completely failed to respond.


Since 2014, there has finally been a measure of improvement in the work rate—but it would be unwise to exaggerate the dimensions of that turnaround. As of late 2016, the adult work rate in America was still at its lowest level in more than 30 years. To put things another way: If our nation’s work rate today were back up to its start-of-the-century highs, well over 10 million more Americans would currently have paying jobs.


There is no way to sugarcoat these awful numbers. They are not a statistical artifact that can be explained away by population aging, or by increased educational enrollment for adult students, or by any other genuine change in contemporary American society. The plain fact is that 21st-century America has witnessed a dreadful collapse of work.


For an apples-to-apples look at America’s 21st-century jobs problem, we can focus on the 25–54 population—known to labor economists for self-evident reasons as the “prime working age” group. For this key labor-force cohort, work rates in late 2016 were down almost 4 percentage points from their year-2000 highs. That is a jobs gap approaching 5 million for this group alone.
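

Both jobs-gap figures (the “well over 10 million” above and the “approaching 5 million” here) fall out of the same arithmetic: multiply the work-rate decline by the relevant population base. A rough sketch, with population bases that are my illustrative assumptions rather than numbers from the article:

```python
# Translating work-rate declines into jobs-gap headcounts (rough sketch).
adults_20_plus = 250e6     # assumed civilians age 20+, 2016 (illustrative)
prime_age_25_54 = 125e6    # assumed population age 25-54, 2016 (illustrative)

overall_rate_drop = 0.646 - 0.597   # 64.6% -> 59.7%, from the text
prime_rate_drop = 0.04              # "almost 4 percentage points"

print(f"Overall jobs gap:   {adults_20_plus * overall_rate_drop / 1e6:.1f} million")
print(f"Prime-age jobs gap: {prime_age_25_54 * prime_rate_drop / 1e6:.1f} million")
# -> roughly 12 million and 5 million, matching "well over 10 million"
#    and "approaching 5 million"
```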


It is not only that work rates for prime-age males have fallen since the year 2000—they have, but the collapse of work for American men is a tale that goes back at least half a century. (I wrote a short book last year about this sad saga.) What is perhaps more startling is the unexpected and largely unnoticed fall-off in work rates for prime-age women. In the U.S. and all other Western societies, postwar labor markets underwent an epochal transformation. After World War II, work rates for prime-age women surged, and continued to rise—until the year 2000. Since then, they too have declined. Current work rates for prime-age women are back to where they were a generation ago, in the late 1980s. The 21st-century U.S. economy has been brutal for male and female laborers alike—and the wreckage in the labor market has been sufficiently powerful to cancel, and even reverse, one of our society’s most distinctive postwar trends: the rise of paid work for women outside the household.


In our era of no more than indifferent economic growth, 21st–century America has somehow managed to produce markedly more wealth for its wealthholders even as it provided markedly less work for its workers. And trends for paid hours of work look even worse than the work rates themselves. Between 2000 and 2015, according to the BEA, total paid hours of work in America increased by just 4 percent (as against a 35 percent increase for 1985–2000, the 15-year period immediately preceding this one). Over the 2000–2015 period, however, the adult civilian population rose by almost 18 percent—meaning that paid hours of work per adult civilian have plummeted by a shocking 12 percent thus far in our new American century.
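

That 12 percent decline is simply the ratio of the two growth figures just cited. A one-line check:

```python
# Paid hours per adult civilian, 2000-2015: ratio of the two quoted trends.
hours_growth = 0.04        # total paid hours: +4% (BEA, per the text)
population_growth = 0.18   # adult civilian population: ~+18%

per_adult_change = (1 + hours_growth) / (1 + population_growth) - 1
print(f"Change in paid hours per adult civilian: {per_adult_change:.1%}")
# -> about -11.9%, the "shocking 12 percent" decline cited above
```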


This is the terrible contradiction of economic life in what we might call America’s Second Gilded Age (2000—). It is a paradox that may help us understand a number of overarching features of our new century. These include the consistent findings that public trust in almost all U.S. institutions has sharply declined since 2000, even as growing majorities hold that America is “heading in the wrong direction.” It provides an immediate answer to why overwhelming majorities of respondents in public-opinion surveys continue to tell pollsters, year after year, that our ever-richer America is still stuck in the middle of a recession. The mounting economic woes of the “little people” may not have been generally recognized by those inside the bubble, or even by many bubble inhabitants who claimed to be economic specialists—but they proved to be potent fuel for the populist fire that raged through American politics in 2016.


III


So general economic conditions for many ordinary Americans—not least of these, Americans who did not fit within the academy’s designated victim classes—have been rather more insecure than those within the comfort of the bubble understood. But the anxiety, dissatisfaction, anger, and despair that rage within our borders today are not wholly a reaction to the way our economy is misfiring. On the nonmaterial front, it is likewise clear that many things in our society are going wrong and yet seem beyond our powers to correct.


Some of these gnawing problems are by no means new: A number of them (such as family breakdown) can be traced back at least to the 1960s, while others are arguably as old as modernity itself (anomie and isolation in big anonymous communities, secularization and the decline of faith). But a number have roared down upon us by surprise since the turn of the century—and others have redoubled with fearsome new intensity since roughly the year 2000.


American health conditions seem to have taken a seriously wrong turn in the new century. It is not just that overall health progress has been shockingly slow, despite the trillions we devote to medical services each year. (Which “Cold War babies” among us would have predicted we’d live to see the day when life expectancy in East Germany was higher than in the United States, as is the case today?)


Alas, the problem is not just slowdowns in health progress—there also appears to have been positive retrogression for broad and heretofore seemingly untroubled segments of the national population. A short but electrifying 2015 paper by Anne Case and Nobel Economics Laureate Angus Deaton talked about a mortality trend that had gone almost unnoticed until then: rising death rates for middle-aged U.S. whites. By Case and Deaton’s reckoning, death rates rose slightly over the 1999–2013 period for all non-Hispanic white men and women 45–54 years of age—but they rose sharply for those with high-school degrees or less, and for this less-educated grouping most of the rise in death rates was accounted for by suicides, chronic liver cirrhosis, and poisonings (including drug overdoses).


Though some researchers, for highly technical reasons, suggested that the mortality spike might not have been quite as sharp as Case and Deaton reckoned, there is little doubt that the spike itself has taken place. Health has been deteriorating for a significant swath of white America in our new century, thanks in large part to drug and alcohol abuse. All this sounds a little too close for comfort to the story of modern Russia, with its devastating vodka- and drug-binging health setbacks. Yes: It can happen here, and it has. Welcome to our new America.


In December 2016, the Centers for Disease Control and Prevention (CDC) reported that for the first time in decades, life expectancy at birth in the United States had dropped very slightly (to 78.8 years in 2015, from 78.9 years in 2014). Though the decline was small, it was statistically meaningful—rising death rates were characteristic of males and females alike; of blacks and whites and Latinos together. (Only black women avoided mortality increases—their death levels were stagnant.) A jump in “unintentional injuries” accounted for much of the overall uptick.


It would be unwarranted to place too much portent in a single year’s mortality changes; slight annual drops in U.S. life expectancy have occasionally been registered in the past, too, followed by continued improvements. But given other developments we are witnessing in our new America, we must wonder whether the 2015 decline in life expectancy is just a blip, or the start of a new trend. We will find out soon enough. It cannot be encouraging, though, that the Human Mortality Database, an international consortium of demographers who vet national data to improve comparability between countries, has suggested that health progress in America essentially ceased in 2012—that the U.S. gained on average only about a single day of life expectancy at birth between 2012 and 2014, before the 2015 turndown.


The opioid epidemic of pain pills and heroin that has been ravaging and shortening lives from coast to coast is a new plague for our new century. The terrifying novelty of this particular drug epidemic, of course, is that it has gone (so to speak) “mainstream” this time, breaking out of disadvantaged minority communities and into Main Street White America. By 2013, according to a 2015 report by the Drug Enforcement Administration, more Americans died from drug overdoses (largely but not wholly opioid abuse) than from either traffic fatalities or guns. The dimensions of the opioid epidemic in the real America are still not fully appreciated within the bubble, where drug use tends to be more carefully limited and recreational. In Dreamland, his harrowing and magisterial account of modern America’s opioid explosion, the journalist Sam Quinones notes in passing that “in one three-month period” just a few years ago, according to the Ohio Department of Health, “fully 11 percent of all Ohioans were prescribed opiates.” And of course many Americans self-medicate with licit or illicit painkillers without doctors’ orders.


In the fall of 2016, Alan Krueger, former chairman of the President’s Council of Economic Advisers, released a study that further refined the picture of the real existing opioid epidemic in America: According to his work, nearly half of all prime working-age male labor-force dropouts—an army now totaling roughly 7 million men—currently take pain medication on a daily basis.


We already knew from other sources (such as BLS “time use” surveys) that the overwhelming majority of the prime-age men in this un-working army generally don’t “do civil society” (charitable work, religious activities, volunteering), or for that matter much in the way of child care or help for others in the home either, despite the abundance of time on their hands. Their routine, instead, typically centers on watching—watching TV, DVDs, Internet, hand-held devices, etc.—and indeed watching for an average of 2,000 hours a year, as if it were a full-time job. But Krueger’s study adds a poignant and immensely sad detail to this portrait of daily life in 21st-century America: In our mind’s eye we can now picture many millions of un-working men in the prime of life, out of work and not looking for jobs, sitting in front of screens—stoned.


But how did so many millions of un-working men, whose incomes are limited, manage en masse to afford a constant supply of pain medication? OxyContin is not cheap. As Dreamland carefully explains, one main mechanism today has been the welfare state: more specifically, Medicaid, Uncle Sam’s means-tested health-benefits program. Here is how it works (we are with Quinones in Portsmouth, Ohio):





[The Medicaid card] pays for medicine—whatever pills a doctor deems that the insured patient needs. Among those who receive Medicaid cards are people on state welfare or on a federal disability program known as SSI. . . . If you could get a prescription from a willing doctor—and Portsmouth had plenty of them—Medicaid health-insurance cards paid for that prescription every month. For a three-dollar Medicaid co-pay, therefore, addicts got pills priced at thousands of dollars, with the difference paid for by U.S. and state taxpayers. A user could turn around and sell those pills, obtained for that three-dollar co-pay, for as much as ten thousand dollars on the street.



In 21st-century America, “dependence on government” has thus come to take on an entirely new meaning.


You may now wish to ask: What share of prime-working-age men these days are enrolled in Medicaid? According to the Census Bureau’s SIPP survey (Survey of Income and Program Participation), as of 2013, over one-fifth (21 percent) of all civilian men between 25 and 55 years of age were Medicaid beneficiaries. For prime-age people not in the labor force, the share was over half (53 percent). And for un-working Anglos (non-Hispanic white men not in the labor force) of prime working age, the share enrolled in Medicaid was 48 percent.


By the way: Of the entire un-working prime-age male Anglo population in 2013, nearly three-fifths (57 percent) were reportedly collecting disability benefits from one or more government disability programs. Disability checks and means-tested benefits cannot support a lavish lifestyle. But they can offer a permanent alternative to paid employment, and for growing numbers of American men, they do. The rise of these programs has coincided with the death of work for larger and larger numbers of American men not yet of retirement age. We cannot say that these programs caused the death of work for millions upon millions of younger men: What is incontrovertible, however, is that they have financed it—just as Medicaid inadvertently helped finance America’s immense and increasing appetite for opioids in our new century.


It is intriguing to note that America’s nationwide opioid epidemic has not been accompanied by a nationwide crime wave (excepting of course the apparent explosion of illicit heroin use). Just the opposite: As best can be told, national victimization rates for violent crimes and property crimes have both reportedly dropped by about two-thirds over the past two decades. The drop in crime over the past generation has done great things for the general quality of life in much of America. There is one complication from this drama, however, that inhabitants of the bubble may not be aware of, even though it is all too well known to a great many residents of the real America. This is the extraordinary expansion of what some have termed America’s “criminal class”—the population sentenced to prison or convicted of felony offenses—in recent decades. This trend did not begin in our century, but it has taken on breathtaking enormity since the year 2000.


Most well-informed readers know that the U.S. currently has a higher share of its populace in jail or prison than almost any other country on earth, that Barack Obama and others talk of our criminal-justice process as “mass incarceration,” and that well over 2 million men were in prison or jail in recent years. But only a tiny fraction of all living Americans ever convicted of a felony is actually incarcerated at this very moment. Quite the contrary: Maybe 90 percent of all sentenced felons today are out of confinement and living more or less among us. The reason: the basic arithmetic of sentencing and incarceration in America today. Correctional release and sentenced community supervision (probation and parole) guarantee a steady annual “flow” of convicted felons back into society to augment the very considerable “stock” of felons and ex-felons already there. And this “stock” is by now truly enormous.


One forthcoming demographic study by Sarah Shannon and five other researchers estimates that the cohort of current and former felons in America very nearly reached 20 million by the year 2010. If its estimates are roughly accurate, and if America’s felon population has continued to grow at more or less the same tempo traced out for the years leading up to 2010, we would expect it to surpass 23 million persons by the end of 2016 at the latest. Very rough calculations therefore suggest that America’s population of non-institutionalized adults with a felony conviction somewhere in their past had almost certainly broken the 20 million mark by the end of 2016. A little more rough arithmetic suggests that about 17 million men in our general population have a felony conviction somewhere in their CV. That works out to one of every eight adult males in America today.
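

A quick check of that closing ratio (a sketch; the adult-male population base is my assumption, not a figure from the Shannon study):

```python
# Rough check of the "one of every eight adult males" figure.
men_with_felony = 17e6
adult_males = 125e6   # assumed U.S. adult-male population (illustrative)

share = men_with_felony / adult_males
print(f"Share of adult males with a felony conviction: {share:.1%}")
print(f"That is roughly 1 in {adult_males / men_with_felony:.1f}")
# -> ~13.6%, about 1 in 7.4 -- in the ballpark of "one of every eight"
```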


We have to use rough estimates here, rather than precise official numbers, because the government does not collect any data at all on the size or socioeconomic circumstances of this population of 20 million, and never has. Amazing as this may sound and scandalous though it may be, America has, at least to date, effectively banished this huge group—a group roughly twice the total size of our illegal-immigrant population and an adult population larger than that in any state but California—to a near-total and seemingly unending statistical invisibility. Our ex-cons are, so to speak, statistical outcasts who live in a darkness our polity does not care enough to illuminate—beyond the scope or interest of public policy, unless and until they next run afoul of the law.


Thus we cannot describe with any precision or certainty what has become of those who make up our “criminal class” after their (latest) sentencing or release. In the most stylized terms, however, we might guess that their odds in the real America are not all that favorable. And when we consider some of the other trends we have already mentioned—employment, health, addiction, welfare dependence—we can see the emergence of a malign new nationwide undertow, pulling downward against social mobility.


Social mobility has always been the jewel in the crown of the American mythos and ethos. The idea (not without a measure of truth to back it up) was that people in America are free to achieve according to their merit and their grit—unlike in other places, where they are trapped by barriers of class or the misfortune of misrule. Nearly two decades into our new century, there are unmistakable signs that America’s fabled social mobility is in trouble—perhaps even in serious trouble.


Consider the following facts. First, according to the Census Bureau, geographical mobility in America has been on the decline for three decades, and in 2016 the annual movement of households from one location to the next was reportedly at an all-time (postwar) low. Second, as a study by three Federal Reserve economists and a Notre Dame colleague demonstrated last year, “labor market fluidity”—the churning between jobs that among other things allows people to get ahead—has been on the decline in the American labor market for decades, with no sign as yet of a turnaround. Finally, and not least important, a December 2016 report by the “Equality of Opportunity Project,” a team led by the formidable Stanford economist Raj Chetty, calculated that the odds of a 30-year-old’s earning more than his parents at the same age were now just 51 percent, down from 86 percent 40 years ago. Other researchers who have examined the same data argue that the odds may not be quite as low as the Chetty team concludes, but agree that the chances of surpassing one’s parents’ real income have been on the downswing and are probably lower now than ever before in postwar America.


Thus the bittersweet reality of life for real Americans in the early 21st century: Even though the American economy still remains the world’s unrivaled engine of wealth generation, those outside the bubble may have less of a shot at the American Dream than has been the case for decades, maybe generations—possibly even since the Great Depression.


IV


The funny thing is, people inside the bubble are forever talking about “economic inequality,” that wonderful seminar construct, and forever virtue-signaling about how personally opposed they are to it. By contrast, “economic insecurity” is akin to a phrase from an unknown language. But if we were somehow to find a “Google Translate” function for communicating from real America into the bubble, an important message might be conveyed:


The abstraction of “inequality” doesn’t matter a lot to ordinary Americans. The reality of economic insecurity does. The Great American Escalator is broken—and it badly needs to be fixed.


With the election of 2016, Americans within the bubble finally learned that the 21st century has gotten off to a very bad start in America. Welcome to the reality. We have a lot of work to do together to turn this around.