Monthly Archives: September 2017

The Indians’ Win Streak Wasn’t MLB’s Longest — But It Might Be The Most Impressive

It had to end sometime. After sustaining a perfect record and a staggering 142-37 scoring margin over more than three weeks of play, the Cleveland Indians finally lost Friday night, dropping a tight contest to the Kansas City Royals. It was their first loss after winning 22 straight games. Now that The Streak is over, Cleveland can go back to focusing on the playoffs like any contending team.

Just because the Indians can put their streak in the rearview mirror, though, doesn’t mean that we can’t dwell on it a little more. It wasn’t the major league record for consecutive wins — if we include unofficial ties in between victories, the 1916 New York Giants’ 26-game mark still reigns supreme. But we can compare the Indians’ streak to that Giants’ run and determine exactly how difficult baseball’s best winning streaks were in general. (And, because I can’t resist, compare the Indians’ accomplishments with winning streaks in the NBA.)

Depending on how you measure the streak’s likelihood, the chances of a team like Cleveland pulling off their streak might have been as low as 1 in 65,000.

To judge this, I compared all the MLB streaks to one another, assuming they were done by the same, generic contending team. I set up a simulation under which a team with a fixed Elo rating — our method for determining how good a team is at a given moment — would take a crack at the particular opponents (including the location of each game and the opposing starting pitchers) faced by every real MLB team that had a winning streak (again, including streaks interrupted by ties) of at least 18 games since 1901.

A few more technical details of the simulation: I gave all the teams the same fixed rating, 1560, which is the average Elo of a World Series participant since 1903, when the first modern Fall Classic was staged. (For context, the average Elo rating is about 1500.) For comparison’s sake, the Indians’ rating at the beginning of their streak was 1555. I also assumed the streaking team had a five-man starting rotation, with the team’s rotation slot for the initial game of the streak randomized. (The generic team’s Elo pitcher ratings were then based on the long-term average for that slot number in the rotation; the opponent’s was still the real-life version that reflects the actual starters a team faced during its streak. This matters because a team that goes into a potential streak with its No. 5 starter is much less likely to get off on the right foot than a team putting its ace on the mound.)
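The machinery behind this kind of simulation can be sketched compactly. Below is a minimal Python version, assuming the standard logistic Elo win-probability formula; the function names and the simplified rating update are my own, and the real model also folds in home field and starting-pitcher ratings, which this sketch omits:

```python
def elo_win_prob(team_elo, opp_elo):
    """Standard logistic Elo win probability for the team's side."""
    return 1.0 / (1.0 + 10 ** ((opp_elo - team_elo) / 400.0))

def streak_odds(team_elo, opponent_elos, k=4.0):
    """Chance that a team starting at `team_elo` wins every game against
    the listed opponents, with its rating gaining steam after each win.
    The K-factor update here is a simplification of the real model."""
    prob, rating = 1.0, team_elo
    for opp in opponent_elos:
        p = elo_win_prob(rating, opp)
        prob *= p
        rating += k * (1.0 - p)  # winners gain more for beating stronger foes
    return prob
```

With a 1560 team facing 22 opponents rated near the 1497 average, each game is roughly a 59 percent proposition, and the compound odds land on the order of tens of thousands to one.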

After running the first round of simulations, here were the odds of our generic contending team pulling off each streak:

Which MLB winning streak was most impressive?

Probability of a generic contending team matching MLB’s eight longest winning streaks since 1901

YEAR TEAM STREAK LENGTH AVG. OPP. ELO AVG. WIN PROB. GENERIC TEAM ODDS
1916 New York Giants 26 1493.4 65.2% 1 in 76,702
2017 Cleveland Indians 22 1496.7 63.0 29,951
1935 Chicago Cubs 21 1499.6 63.1 19,477
1953 New York Yankees 18 1518.0 58.6 16,752
1947 New York Yankees 19 1506.2 60.8 13,297
1906 Chicago White Sox 19 1507.4 61.4 11,642
2002 Oakland Athletics 20 1489.5 63.7 8,454
1904 New York Giants 18 1471.4 66.4 1,691

According to this model, the hardest streak still belonged to the 1916 Giants — which isn’t too surprising, since they won four more games in a row than the Indians did. And sorry, Billy Beane: the 2002 “Moneyball” A’s also fall behind shorter streaks because of the weak opposition they faced along the way. But one other thing that stands out is the odds, which are much more favorable across the board than if we had simply run the simulations with a .500 team.

(We’ll have to leave the impressiveness of the Dodgers’ feat — winning 52 out of 61 earlier in the season — for another time.)

The difference arises because a 1560-Elo squad is (by design) no ordinary .500 team. Our generic team is predisposed to running off a stretch of games like this, which only makes sense — average teams don’t go on these kinds of tears. And our simulated teams only got hotter as they won — that is, a team’s rating is fixed at 1560 before the streak begins, but it gains steam with each victory, raising the odds of winning again.

But there’s another layer we can add to the simulation to make it more reflective of the conditions under which each streak was actually compiled. Most clubs didn’t use the five-man rotation, for instance, until the 1970s or early ‘80s; likewise, the best teams of the past used to be much stronger Elo-wise, making it more likely we’d see such a run of greatness earlier in baseball history. We can account for these wrinkles by assigning a four-man rotation to teams before 1980, and by adjusting our generic team’s fixed rating to be slightly higher in the past than in later seasons. (The adjustment, which is based on smoothing out changes in the average World Series team’s pre-playoff Elo over time, isn’t huge for most comparisons, but it does drop our fixed rating from about 1580 in 1903 to about 1540 in 2017.) After re-running the numbers with these two tweaks, here’s an amended list of the most difficult streaks:

What if we account for rotation size and era?

Probability of a generic contender matching the longest winning streaks, adjusted for historical spread of Elo ratings and shorter rotations in past

YEAR TEAM STREAK LENGTH AVG. OPP. ELO AVG. WIN PROB. GENERIC TEAM ODDS (ADJ.)
2017 Cleveland Indians 22 1496.6 60.9% 1 in 65,566
1916 New York Giants 26 1493.5 67.2 34,720
1953 New York Yankees 18 1518.0 59.2 13,895
2002 Oakland Athletics 20 1489.4 62.2 13,775
1935 Chicago Cubs 21 1499.7 64.3 12,736
1947 New York Yankees 19 1506.2 61.7 10,223
1906 Chicago White Sox 19 1507.5 63.9 5,313
1904 New York Giants 18 1471.4 68.9 860

In a plot twist, the Indians’ streak now rises to the top — a function of being accomplished in an era of (theoretically) more parity and a higher chance for some scrub pitcher to mess the streak up thanks to a bigger rotation than older teams had.
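The two era tweaks are easy to mimic in code. Here is a minimal Python sketch, assuming a simple linear drift between the endpoint ratings quoted above (the actual adjustment smooths real World Series Elo data, so a straight line is only an approximation, and the function names are my own):

```python
def era_adjusted_rating(year, early=(1903, 1580.0), late=(2017, 1540.0)):
    """Fixed rating for the generic contender, drifting from about 1580
    in 1903 down to about 1540 in 2017 (linear sketch of the smoothed
    adjustment described in the text)."""
    (y0, r0), (y1, r1) = early, late
    year = min(max(year, y0), y1)  # clamp to the modeled range
    return r0 + (year - y0) * (r1 - r0) / (y1 - y0)

def rotation_size(year):
    """Four-man rotations before 1980, five-man from 1980 on."""
    return 4 if year < 1980 else 5
```

Plugging in 1916 gives a generic rating a bit above 1575 and a four-man rotation, matching the intuition that the Giants’ run came in an era of stronger top teams.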

So how does this stack up against notable streaks from another sport like, say, basketball? Using the same Elo-based method (except without any of the fancy starting-pitching adjustments, obviously), I calculated the odds of a generic contending NBA team — with a 1660 Elo, the average for an NBA finalist since 1984, when the league adopted its current playoff format — pulling off some of the longest streaks in pro basketball history. And even the most impressive streaks on the hardwood can’t compare with baseball’s hottest runs.

The longest winning streak in NBA history — the 1972 L.A. Lakers’ 33-gamer — would have a 1 in 720 chance of being accomplished by our generic contender. The Golden State Warriors’ 24-game streak to start the season a couple of years ago raises the bar a bit, with a 1 in 1,879 chance of being achieved by a generic contender, since the Warriors faced a much more difficult slate of opponents. But even a streak as memorable as the Houston Rockets’ 22-gamer from 2008 seems weak (1 in 247 odds) when compared with the baseball streaks we looked at above.

Streaks are nice, but the Indians surely have another accolade in mind: the World Series trophy. As of now, we give them a 1 in 4 chance. Given what they just pulled off, doesn’t seem so hard, does it?

Where Does Boxing Go After Mayweather?

At the age of 40, after a 21-year career and a 50-0 record, Floyd Mayweather, the greatest prizefighter of his era, has walked away. He peddled the most lucrative act in the history of not only boxing, not only sports, but entertainment itself.

When Mayweather fought Manny Pacquiao in 2015, he earned over $200 million for hardly breaking a sweat. The audience who witnessed those 36 minutes had the star power of the Oscars, Emmys, Grammys, ESPYS and White House Correspondents’ Association dinner — combined. Some seats at the MGM Grand sold for $350,000 a pop. The fight obliterated records for gate and pay-per-view numbers. And why not? Mayweather had proven himself the most exciting fighter in the history of boxing — that is, until he stepped into the ring. After the opening bell sounded and the crowd had its chance to lean in toward the “Fight of the Century,” most recoiled before the first round was over, as though reacting to the stench of a spoiled carton of milk. Despite the ensuing bitterness and buyer’s remorse, only two years later, against Conor McGregor last month in Las Vegas, Mayweather was able to both earn and generate nearly as much money — although final estimates have yet to be released — fighting an opponent who had never boxed professionally.

Mayweather is gone. Pacquiao had his bite at the Mayweather apple and is now intent on becoming this generation’s answer to Muhammad Ali, hanging around too long at the fair. So what now? Mayweather was a lightning rod for the sport — many loved him, more loved to hate him, and he defined the big-money barometer of boxing success. Without this conduit, where does boxing’s energy flow? We might glimpse the future this weekend over 12 rounds in Las Vegas.

Two international fighters are ready to assume the mantle of the Face of Boxing. This Saturday, three weeks after the sideshow of Mayweather-McGregor, Mexican phenom Saul “Canelo” Alvarez squares off against undefeated Kazakhstani knockout artist Gennady “Triple G” Golovkin for the WBA, WBC and IBF middleweight titles in one of the most combustible matchups in decades.

After Mayweather, Alvarez has established himself as the sport’s most bankable pay-per-view star, albeit with far smaller numbers. The only blemish on his record is a 2013 defeat to Mayweather himself (not counting a 2006 draw with Jorge Juarez in his fifth fight), in a fight that nearly broke what was then the all-time pay-per-view record Mayweather set against Oscar De La Hoya in 2007. Alvarez has fought seven times since, mainly against stiff competition, and has steadily improved with each outing. Golovkin, meanwhile, is mounting his 19th consecutive title defense.

But how much will the public care? In the U.S., this fight, more than any on the horizon, will prove a bellwether for what the health and complexion of boxing look like in a post-Mayweather era. Examining the top pay-per-view fights over the last couple of decades provides some useful clues and insights into the evolution of what the public sought from boxing and, ultimately, what it got for its money.

Long before Mayweather-McGregor infected the water supply, Mike Tyson single-handedly set boxing on a course where spectacle would supplant substance. After serving a three-year sentence for rape at Indiana’s Plainfield Correctional Facility, the former champion emerged from prison more popular than ever, facing off in 1995 against a walking punchline named Peter McNeeley. The fight was billed with the line “He’s Back,” with Tyson playing the role of the “Jaws” shark and McNeeley the teenager wandering into the ocean with a surfboard. The resulting 89-second disqualification (McNeeley’s manager jumped into the ring) became the most lucrative fight in boxing history, and Tyson was again one of the highest-earning entertainers on earth. Tyson offered a poignant assessment of the American public’s frightening fascination with him: “I can sell out Madison Square Garden masturbating.”

Tyson was a box-office behemoth after that. He continued to generate big numbers the next year when he faced Frank Bruno and the 25-1 underdog Evander Holyfield, who shockingly knocked out Tyson. Their rematch broke records yet again, before Holyfield’s ear was bitten off and spat out into the public consciousness. Only Van Gogh’s might be more famous.

Tyson lost his boxing license for just over a year. All he really had left in his career was offering the public his comeuppance against Lennox Lewis in 2002. It didn’t matter that Tyson was 35, more than a decade past his prime, and fighting one of boxing’s most menacing and dominant heavyweight champions. The public’s infatuation with Tyson ensured that the fight would set the pay-per-view record as the highest-grossing fight in boxing history. Only three years later, in 2005, a worn-out, broken-down Tyson quit on his stool against an Irish journeyman named Kevin McBride. Two hundred and fifty thousand people still paid to watch.

In 2007, two years after Tyson retired, Mayweather had fought on pay-per-view only three times, producing underwhelming numbers that never reached even 400,000 buys. But the public spent $136 million on Mayweather-De La Hoya, and Mayweather used the opportunity to usurp his opponent’s considerable fan base. If he couldn’t win fans over with his style in the ring, he would entice them to pay even more money by exploiting his undefeated record and his willingness to play the “heel” role. “Pretty Boy” Floyd became “Money” Mayweather, and he dipped his toe into “Dancing with the Stars” and the WWE to gain even more exposure and cultural purchase. It worked like a charm. In the decade after the De La Hoya fight — spanning just 11 more bouts, and not counting the astronomical figures yet to be crunched from the McGregor bout — Mayweather generated $1.3 billion in revenue from more than 19.5 million pay-per-view buys.

Mayweather tapped the post-prison Tyson business model of packaging hype and spectacle in ways that transcended the sport. But operating on a different plane during the same era, Pacquiao assumed the pre-prison Tyson business model. Pacquiao earned legions of devoted fans by becoming the sport’s most devastating fighter, one who would steamroll opponents with almost cartoonishly iconic knockout victories. As a result, from 2008 to 2012, there were six Pacquiao fights on pay-per-view that earned more than 1 million buys (against De La Hoya, Shane Mosley, Miguel Cotto, Antonio Margarito and two against Juan Manuel Marquez).

Which brings us to boxing’s next chapter. The story on these pages is a new one. Although Golovkin and Alvarez may be the two most exciting boxers in the sport, neither has crossed the blockbuster threshold established by Tyson. Among the 22 fights estimated to have yielded more than 1 million pay-per-view buys, only three didn’t involve the names Tyson, Mayweather or Pacquiao: De La Hoya versus Trinidad and Holyfield’s fights against George Foreman and Lewis. Canelo’s six non-Mayweather fights on pay-per-view have averaged 575,000 buys and Golovkin’s two each earned fewer than 200,000.

The potential for a great fight is no guarantee that we’ll get one. Andre Ward demonstrated this last year in his pay-per-view disaster against Sergey Kovalev. Ward inherited the pound-for-pound crown from Mayweather, is sponsored by Nike’s Jordan brand, is in his prime, and is the last male American boxer to win an Olympic gold medal. He challenged an undefeated knockout machine in Kovalev in one of the best 50-50 fights on paper in decades. Nobody cared, and the fight generated anemic numbers. Their rematch this year produced even worse numbers. (The first fight had an estimated 165,000 pay-per-view buys; the second got 130,000.)

Both Golovkin and Alvarez are in their primes. They aren’t marketing this fight with dog whistles toward racism, homophobia or misogyny, or with cynical appeals to the lowest common denominator. They simply are offering what promises to be one of the best fights in years. Is that enough? In today’s America, nothing is taken seriously that doesn’t sell. And after years of being hooked on the antiheroic narrative, is America in our modern age willing to pay to watch a fight without one boxer wearing the black hat?

How A Warm Winter Destroyed 85 Percent Of Georgia’s Peaches

2017 has been a bad year for peaches in the Peach State. Georgia’s disruptively warm winter caused the loss of an estimated 85 percent of the peach crop. “We had fruit here in Georgia from the middle of May to about probably the first week of July, and after that we didn’t have anything else,” said Dario Chavez, an assistant professor in peach research and extension at the University of Georgia. (The Georgia peach season typically runs through mid-August.)

As temperatures rise globally because of climate change, Georgia is not the only part of the country where warm winters are causing trouble for farmers. California’s cherry crop took a hit in 2014 because of a warm, dry winter. And in 2012, after a warm February and March brought early blooms, Michigan’s apple crop was decimated by an April frost. Farmers have always been at the mercy of the environment, but now agricultural catastrophes brought on by warm winters seem likely to occur with greater frequency.

For trees that fruit each year (such as peaches, cherries, blueberries, almonds and other fruits and nuts), cool weather is as important as warm. Cold air and less sunlight trigger the release of chemicals that halt trees’ growth, prepare them to withstand freezing temperatures and enable them to resume growing the following spring. When a tree enters this dormant state, it sets a kind of internal seasonal alarm clock that goes off once the tree has spent enough time in chilly temperatures. (Once a tree has had the minimum amount of chill it needs, it knows to get ready to wake up again and clean out the compounds that have protected it through the winter. Warm temperatures typically prompt budding, but if they come before a tree has had sufficient chill time, it may not be fully out of dormancy and prepared for growth.) This countdown is measured in so-called chill hours — the amount of time the temperature is between 32 and 45 degrees Fahrenheit. (There are variants of this basic formula. Growers in warmer climes, where excessive freezing temperatures are not a concern, may simply count the hours below 45 degrees. Some newer models count “chill units,” or portions, in which very warm winter spells are subtracted from the overall chill.) When crops don’t get the chill hours they expect, they can’t properly reset. Buds are delayed, and instead of ripening into juicy, delicious fruit, they remain small and underdeveloped. (The number of chill hours required varies among species and even among varieties of the same species.)
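To make the chill-hour bookkeeping concrete, here is a minimal Python sketch of the basic count and the warm-climate variant described above (the function names are my own; real chill-unit models are more elaborate):

```python
def chill_hours(hourly_temps_f, low=32.0, high=45.0):
    """Classic chill-hour count: hours spent between 32 and 45 degrees F."""
    return sum(1 for t in hourly_temps_f if low <= t <= high)

def chill_hours_warm_climate(hourly_temps_f, high=45.0):
    """Variant for warm climates where hard freezes aren't a concern:
    simply count every hour at or below the 45-degree threshold."""
    return sum(1 for t in hourly_temps_f if t <= high)
```

For hourly readings of 30, 40 and 50 degrees, the classic rule logs one chill hour while the warm-climate variant logs two.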

This last winter, middle Georgia got about 400 chill hours during what Chavez described as the usual dormancy period for peaches (roughly Oct. 1 to Feb. 10). The winter before, while still on the low side, had closer to 600 chill hours. But that 200-hour difference meant several peach varieties that had produced fruit in 2016 never bloomed this year. There are products and techniques that can help stimulate delayed crops, but this year the deficit in chill hours was too large to overcome, Chavez said.

A chill-hour deficit hits places with milder climates, such as the southeastern U.S. and California, especially hard because they get fewer chill hours to begin with.

But Georgia was not the only place with a chill-hour deficit in the last year. According to an analysis by the Midwestern Regional Climate Center’s Vegetation Impact Program, most of the U.S. got fewer chill hours than the average from 1998 to 2013. (Colder-than-normal winters can also lower the number of chill hours if there are more hours below the lower threshold — as in the Pacific Northwest this year.)

Climate change, and the loss in winter chill that can come with it, poses a particular threat to fruit and nut trees and the farmers who depend on them, said Eike Luedeling, a senior scientist at the University of Bonn’s Center for Development Research. Farmers who grow annual crops, such as corn and wheat, replant every year and might be able to adapt more nimbly to a sudden change in the environment, by changing their planting schedule or switching crops (though doing so may be costly). But fruit and nut farmers rely on plants that take much longer than a single growing season to be productive. “You really have to plan for several decades ahead when you plant a tree,” said Luedeling, who has modelled what winter chill hours may look like in the future. “It’s a huge investment.”

Looking ahead, the experience of Georgia peach farmers this year might become more common. Luedeling predicts that only about a quarter to a half of California’s Central Valley, which produces much of America’s fruit and nut crops, will still have enough chill hours by the middle of the 21st century to grow walnuts, apricots, plums and most varieties of peaches and nectarines. (These crops generally need a minimum of 700 winter chill hours.) A separate, global projection from Luedeling shows that while colder areas may not change much over the next century (or may even gain winter chill hours, thanks to more days above freezing), warm areas are likely to see dramatic reductions in chill hours.

Though concerning, these projections are far from certain, Luedeling said. There is still a lot we don’t know about winter chill. Anything above 45 degrees does not count toward the chill hour total in most models, but that threshold is almost certainly not as firm for plants themselves. As Luedeling put it, the cutoff doesn’t “have much biology in it,” but he hopes to build a better model soon that will help fruit producers plan for the future.

Meanwhile, farmers must make decisions now about their plans for the next few years. Peach growers and researchers, for their part, are focused on moving toward varieties that need fewer chill hours to thrive. Chavez, who works closely with growers, some of whose families have been growing peaches in Georgia for three to four generations, said that the time to make changes is now. “The weather is something we cannot control,” he said, but peach farming “is part of the region…. We have to address [it] sooner or later.”

Amazingly, The Indians Are Even Better Than They Seem

The Cleveland Indians are officially on baseball’s greatest hot streak in more than eight decades.

With 21 consecutive victories, they’re now tied for the longest winning streak in MLB history with the 1935 Chicago Cubs, moving one game ahead of the 2002 “Moneyball” Oakland A’s for the American League’s all-time record. (These records don’t include games that were unofficial ties — sorry, 1916 New York Giants.) If the Indians tack on another win, they’ll break a record older than the franchise’s World Series drought — perhaps a portent of more history to be shattered next month.

And yet, even after all that winning, the Indians’ record still (still!) masks a much better ballclub underneath it. That’s right, the team that has won 21 straight is better than you think.

Back in early July, we noted that Cleveland’s then-mediocre record belied the team’s stellar underlying stats, including its outstanding run differential and expected record from BaseRuns (a formula that predicts how many games a team “should” win, given neutral luck within innings and late in close games). At the time, the Indians were struggling to fend off the upstart Minnesota Twins and Kansas City Royals in the AL Central, despite vastly superior metrics. Common sense said Cleveland’s record was bound to catch up with its stats eventually, but in a sport like baseball, you never know.

By now, of course, the Indians have erased all doubt about their standing within the division; they lead the Twins — who are themselves clinging to the AL’s final wild-card spot — by 14 games. (Twenty-one straight wins do have a tendency to boost a team’s division lead.) But according to BaseRuns, Cleveland still should have more wins than they do. In fact, the Tribe’s six-win shortfall between their predicted and actual records (as of Sept. 12; unless otherwise indicated, all numbers in this story do not include stats from Cleveland’s game against Detroit on Wednesday afternoon) is the second-biggest such margin in baseball, behind the Yankees, who should have nine more wins. (The Rockies and Royals, by this measure, should have seven and eight fewer wins, respectively.) The same story goes for Cleveland’s record as predicted by run differential; they’ve fallen six wins short in that department as well.
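For readers curious how a run differential translates into an expected record, the standard tool is the Pythagorean expectation. A minimal Python sketch, using the common 1.83 exponent (the exact formula behind the expected records cited in this story isn’t specified, so treat this as illustrative):

```python
def pythagorean_wins(runs_scored, runs_allowed, games, exponent=1.83):
    """Expected wins from runs scored and allowed (Pythagorean expectation)."""
    rs, ra = runs_scored ** exponent, runs_allowed ** exponent
    return games * rs / (rs + ra)
```

With equal runs scored and allowed, the formula returns a .500 record; a lopsided run differential pushes the expected win total well past it.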

In other words, as good as the Indians have looked this year, they’ve also been incredibly unlucky. Yes, this is akin to saying the person who drew an ordinary straight flush really deserved that royal flush.

Which brings us back to that 2002 Oakland A’s squad. As the subject of Michael Lewis’s book (and, later, its movie adaptation), their chase for 20 straight wins got the Hollywood moment it deserved:

Ironically enough, however — for a team deeply associated with the spread of advanced metrics across baseball — that team’s stats couldn’t really hold a candle to those of this year’s Indians. The 2002 A’s ranked fourth in MLB with a .598 BaseRuns-predicted winning percentage, and also came in fourth with 49.7 total wins above replacement (WAR), taking the average of Baseball-Reference.com’s and FanGraphs’s versions of WAR. The 2017 Indians, meanwhile, lead the league in both BaseRuns winning percentage — Cleveland’s .654 clip is well clear of the second-ranked L.A. Dodgers’ .622 mark — and WAR (with 59.3, prorated to 162 games).

2017 Indians vs. the 2002 A’s

Key metrics and American League ranks for both teams, as of Sept. 12

2002 ATHLETICS 2017 INDIANS
METRIC VALUE AL RANK VALUE AL RANK
Batting average .261 8 .263 3
On base percentage .339 5 .340 2
Slugging percentage .432 7 .450 2
Weighted Runs Created Plus 106 4 106 2
Base running +10.1 5 +9.4 3
Defense -26.8 11 +20.6 3
Adjusted ERA 84 3 75 1
Adjusted FIP 88 3 76 1
Wins above replacement 49.7 3 59.3 1
Winning percentage .636 2 .614 1
Pythagorean winning percentage .591 4 .657 1

Counting stats for 2017 Indians prorated to 162 team games.

Source: Baseball-Reference.com, FanGraphs

On offense, the two clubs were roughly equivalent; Oakland ranked fourth in the AL with a weighted runs created plus (wRC+) of 106, while Cleveland currently ranks second with an identical 106 wRC+. But in terms of pitching and defense, the Tribe dominate. They easily lead the 2017 AL in earned run average (adjusted for park effects) and fielding independent pitching, and are third in FanGraphs’ defensive value metric. (By contrast, the 2002 A’s ranked third in ERA, third in FIP and 11th in defense.) That’s why the Indians have already allowed 217 fewer runs than an average team would have in the same park, through 145 games; by comparison, Oakland “only” allowed 115 fewer runs than average in a full 162-game slate.

These are all reasons to think this year’s Indians can fare better in the postseason than the Moneyball A’s, whose season ended with a heartbreaking five-game division series loss to the Twins. The Indians had their own share of heartbreak last season (to say nothing of 1997), but they’ll be back again and even better than before. The win streak has only helped validate what was always a scary talented team underneath, waiting to break out.

Surviving A Big Storm Doesn’t Mean The Trauma Is Over

The hurricanes in the Gulf and Florida over the last few weeks have left people displaced — and from more than just their homes. Places of worship, community centers, parks and schools are underwater, missing roofs or windows. And those losses can set the social infrastructure of a person’s life adrift. Years after the family is safe and the home is rebuilt, disaster victims could still be struggling with health problems that got a start because of the way a stressful, terrifying situation disrupted their lives. It’s even possible, some researchers say, that the stress and fear alone could create health problems later.

It’s easy to take social support systems for granted, but the role they play in reducing stress and keeping us healthy is crucial. When the federal Disaster Distress Helpline starts taking calls after an event like a hurricane, the requests it fields are less about direct let’s-talk-about-feelings therapy, and more about solving practical problems in survivors’ lives, said Maryann Robinson, chief of the Emergency Mental Health and Traumatic Stress Services Branch of the Substance Abuse and Mental Health Services Administration, which operates the service. There’s the Alcoholics Anonymous member trying to find a meeting that wasn’t flooded out. Or maybe a person with a much-needed prescription, but whose pharmacy was destroyed and whose doctor evacuated out of the state. If those workaday crises aren’t solved, they can start a chain reaction in the lives of survivors — a domino that carries the impact of the trauma far beyond the reach of a storm surge or gust of wind.

In some cases, these kinds of public health aftereffects have eclipsed the scale of the disaster itself. A 2005 World Health Organization report on the 1986 Chernobyl nuclear power plant accident found that the mental health impacts were proving to be a far bigger disaster for public health than radiation exposure. At that point, radiation had directly killed fewer than 50 people and was expected to eventually shorten the lives of an estimated 4,000. Meanwhile, the report concluded, 350,000 people had been forcibly displaced, suffered from public stigma, and believed themselves to be marked for death. The result, according to the report, was increased rates of fear, depression, poverty, alcoholism, unsafe sex and smoking. Chernobyl survivors often lacked access to high-quality medical and educational services, because it was difficult to recruit outside professionals to live in a community full of “tainted” people.

It’s easy to see how a disaster could cause long-term physical harm. In the case of Harvey, multiple industrial facilities released toxic pollution as a result of the storm and flooding, and some survivors were forced to wade through water contaminated by sewage and dangerous chemicals. When Irma hit the Caribbean, about 60 percent of the people on the island of Barbuda were left homeless and 95 percent of the buildings on the island sustained damage — which will probably mean lots of people exposed to mosquito-borne illness, mold and other factors that can cause physical problems down the line.

But it’s not always easy to predict the way a disaster will affect people’s health. Some impacts, particularly those tied to social support systems and the way people interact with their communities, can take years to become apparent. For instance, many of the people who were evacuated from New Orleans because of Hurricane Katrina had little control over where they were evacuated to, and few resources to get them back to New Orleans, or another city, later. Some of those people ended up in communities that were far less walkable than the ones they’d fled — less public transportation, more strip malls, no cohesive downtown. And so, by 2007, those people had higher body mass indexes than similar survivors who were relocated to walkable communities. Not exactly a public health risk you’d expect from a hurricane.

What’s more, it’s possible that the stress of experiencing a disaster — fear for your life, worry about how you’ll rebuild — and the chemical/emotional response that follows could itself cause harm. Some of the best evidence for this also comes from survivors of Hurricane Katrina. That’s because, two years before the storm hit, more than 1,000 New Orleans community college students enrolled in what was supposed to be a study of the support that single parents need to get through school. Researchers collected huge amounts of information on these people — about their health, their incomes, their living arrangements and more. After the storm, the study shifted gears, becoming the Resilience in Survivors of Katrina Project, comparing that baseline data to how people changed after the storm in order to learn more about the long-term effects of surviving a disaster.

Some of the RISK Project studies have found physical changes that are correlated with emotional distress. For instance, in one study, 15 percent of the people surveyed by the RISK Project had reported problems with headaches and migraines before the storm. Seven to 19 months later, when follow-up surveys were done, more than 56 percent of the same population reported those problems. What’s more, said Mariana Arcaya, a professor of urban planning and public health at MIT, her team found that the likelihood of experiencing migraines seemed to increase with the severity of post-traumatic stress disorder symptoms the person reported. And having migraines before the storm wasn’t predictive of experiencing PTSD symptoms afterward — something that helps tease apart whether migraines were causing PTSD or the other way around.

That doesn’t prove emotional distress caused migraines, Arcaya said. But the idea that what happens in your mind and emotions could affect your body doesn’t come completely out of left field. There’s a long history of literature associating post-traumatic stress disorder diagnoses with a wide variety of physical ailments. Various studies have reported increased rates of everything from dizziness and unexplained pain to cardiovascular and gastrointestinal disorders in PTSD patients. On top of that, there’s increasing evidence that stress from things like acute poverty or abuse in early childhood can affect the way kids’ brains develop and change how their genes express certain traits. Scientists even talk about “John Henryism,” a hypothesis built on African-American health data suggesting that people forced to push against racism and other socioeconomic barriers on their way to success end up with worse health. “I think there’s a really exciting and growing science where, you experience something, that changes your psychological state and that changes your body,” Arcaya said.

Carol North, a crisis psychiatrist at the O’Donnell Brain Institute at the University of Texas Southwestern Medical Center, disagreed. Studies trying to find a link between emotional trauma and physical health are plagued by flawed methodologies, she told me. Most don’t have any information on survivors’ lives before the disaster, the way the RISK Project does, nor have they confirmed whether the physical damage being documented has a more obvious cause. For instance, she said, it’s possible emotional distress could trigger an inflammation response and cause a breathing disorder like asthma. But it’s much more likely that asthma was caused by breathing in mold.

But, whether stress responses in the body cause physical problems later or we’re merely dealing with a world in which the stressful impacts of a disaster alter behavior in ways that produce physical problems, reducing the stress still matters. Either way, it’s important to return people to some sense of normal life, helping them navigate basic tasks that have suddenly become infinitely harder, doing what we can to make sure their lives going forward aren’t worse. “What we’ve learned, and what many studies have shown, is to remove the stressor. To provide social first aid,” said Sandro Galea, dean of the Boston University School of Public Health. In the wake of a disaster, programs like the Substance Abuse and Mental Health Services Administration’s call-in line that helps connect people to daily life-restoring resources in their communities are probably crucial. “We need to create systems as soon as possible to return function, or prevent a loss of function,” he said. “That’s really medicine.”

What 100-Year-Old Hurricanes Could Teach Us About Irma

With the torrential rains of Hurricane Harvey, the historic winds of Irma, and Jose still meandering slowly through the Caribbean, the last few weeks have been full of powerful and frequent hurricanes. If the 2017 hurricane season continues like this, it could join 2005 and 2010 as one of the most intense hurricane seasons in recorded history.

Scientists say climate change has probably played a role in shaping this year’s big storms and those previous monster seasons. When scientists build models of the climate system — digital worlds where the physics of a warming atmosphere can spin out thousands of possible futures — they see a clear connection between a hotter planet and hurricanes. In those model Earths, a warmer world makes storms stronger and increases the rainfall they leave behind. In the real world, the planet has gotten warmer, the oceans have gotten warmer, and here we have these intense storms — it all seems to add up.

But, frustratingly, it’s harder to look at the actual hurricanes that have happened and find evidence of those changes playing out the way we’d expect. The planet’s average global temperature has increased, but its hurricanes don’t seem to have changed much. We can tie hurricanes to climate change in theory, but we don’t see a statistical signal of those theories playing out in practice. It could be that the climate hasn’t changed enough yet. It could be that hurricanes have changed, just not in a way that’s apparent statistically. We can’t be sure.

What would it take to change that? Is there data we don’t have now that could make the connections between climate change and hurricanes more explicit? I asked several scientists who study these issues about the data they wish they had. They told me that the hole in our present understanding is tied to a lack of knowledge about what happened in the past. We don’t know if climate change is altering hurricanes now, because we know very little about what hurricanes were doing 100 years ago.

We’re not lacking a historical record of hurricanes. We’re just lacking a complete one. The Atlantic Hurricane Database goes back to 1851, and this raw data suggests that the number and intensity of Atlantic hurricanes have significantly increased over time, and that the increase is correlated with rising surface water temperature. But there’s a catch, said Thomas Knutson, leader of the climate impacts and extremes research team at the National Oceanic and Atmospheric Administration’s Geophysical Fluid Dynamics Laboratory.

Prior to the 1960s, hurricane records were primarily made in two ways: either a storm made landfall in a place that had a weather station reporting back to the U.S. government, or the hurricane was spotted by ships at sea. But not every hurricane strikes the shore, and even if it does it’s not guaranteed to be at its peak intensity. Nor are there sailors in every storm’s path. That means the database doesn’t document all the hurricanes that occurred, or their true strengths. Knutson was part of a group of scientists who analyzed historical ship traffic patterns to estimate how many hurricanes went unnoticed by the official record books. They compared recent hurricane years with those old ship routes and counted how many modern storms wouldn’t have been seen by the old methods of hurricane detection. They found that the difference between reported hurricanes and the likely number of actual hurricanes was big enough to wipe out that trend in increasing hurricane frequency.

After 1965, hurricane data is based on satellite imagery, said Gabriel Vecchi, professor of geosciences at Princeton University. And it’s no big surprise that, after that date, the number of hurricanes on record dramatically rises, as do accurate measurements of wind speed and pressure. This change in tools drastically improved our understanding of the storms and our ability to predict their paths, he said. But the methodological change means the data is contaminated, and can’t easily be used to compare a modern hurricane season to the past. “If I could fix that issue, if I could magically put up a satellite 150 years ago over the Atlantic, that would go a long way towards answering the question about whether these trends are actually real,” he said.

There are some ways to tease more knowledge of hurricanes out of the distant past. Paleotempestology is a field that documents patterns in historic hurricane landfall by looking for evidence in the lands they hit. For instance, Vecchi told me, scientists might look for recurring layers of marine sediments near a freshwater pond on a hurricane-prone island — a possible echo of ocean water driven inland by hurricane forces. That can tell you when an island was hit by a hurricane, and how long it took before another one struck. In some cases, this data can flesh out hurricane strike patterns back 1,000 years — but only for very specific locations. Insurance companies are very interested in this, Knutson said. But the data doesn’t necessarily tell you a lot more about how many hurricanes were in the Atlantic each season. The mysteries of the past still remain.

Filling in the gaps in historical data is not a problem unique to hurricanes, of course, but it is especially difficult — at least compared to some other aspects of the climate system and how those are affected by anthropogenic climate change. For instance, what we know about global average temperature changes is also based on incomplete historical records. But there are more data points for temperature — more recording stations in more places, documenting a thing that happens every day, rather than a rare event that happens a few times a year, said Robert Tuleya, a climate modeler at Old Dominion University’s Center for Coastal Physical Oceanography.

The physics that connects rising levels of greenhouse gases in the atmosphere to rising global temperatures is also a lot less complex, with a clearer path from cause to result, than the physics that connects greenhouse gases to hurricanes. For instance, although rising sea surface temperatures would tend to increase the ferocity of hurricanes — and, indeed, the historic hurricane seasons of 2005 and 2010 were associated with higher water temperatures — rising atmospheric temperatures would tend to make storms milder, Knutson said. So the effects of climate change can have contradictory impacts on hurricane formation. Moreover, Vecchi said, although measuring temperature in a specific spot can tell you something about temperatures nearby, the same isn’t true about hurricanes. The absence of a storm in Louisiana is not evidence of a storm’s absence in Mississippi.

Finally, there’s evidence that Atlantic hurricanes go through multidecadal cycles when their intensity and frequency wax and wane. The early 1970s to mid-1990s appear to contain something of a hurricane drought, compared to what came before and after, Vecchi said. But, because of the problems with the historical data, we don’t really know whether these cycles are a product of natural variability, or, say, the result of increases in dust and factory pollution after World War II, or some combination of both. Which also means that, when we see a trend in increasing hurricane number or ferocity since the 1960s or ‘70s, we don’t know whether that’s a product of climate change, a natural cyclical change, other kinds of human-caused changes to the climate system — or all three at once.

The result of all this is that it’s much easier to spot climate change-related patterns in temperature than it is in hurricanes. The trick is to spot the signal in the noise, Knutson said. And the noise is louder for hurricanes.

Our uncertainty surrounding hurricanes and climate change is built on a data gap that can’t be filled. Barring Vecchi’s magical time-traveling satellite and an abundance of localized information gleaned from paleotempestology, there’s no real way to get what we don’t already have. The only way we’re going to see a clear signal of climate change in real-world hurricane data is to collect more of it, as we go forward through time. We’ll know it when we see it, basically.

But, Vecchi said, we don’t actually need to fill the historical data hole to know whether climate change is increasing the risks hurricanes pose to humans. All we need to know is whether sea level is rising — an effect of climate change that Vecchi’s first-year students replicate in the lab every year, and for which there is a clear statistical signal. Temperatures rise, molecules of water expand, sea levels rise. It’s relatively simple, and it means that even a workaday hurricane of today can produce a higher storm surge than its historical counterparts. The water is already higher to begin with. That means the risks of hurricanes are getting bigger, Vecchi said, regardless of whether climate change altered the storm.

Irma Is Bearing Down On Some Of Florida’s Most Vulnerable Residents

As Irma makes its way up Florida’s Gulf Coast, it’s on a path to disrupt some of the state’s most vulnerable residents. The Florida coasts and flood-prone areas are home to many people who are particularly vulnerable in storms, including people living in mobile homes, migrant laborers and older people, many of whom have limited mobility or health concerns. These are far from the only groups at risk of bearing the burden of a storm as strong and big as Irma, but they each face a set of unique challenges.

Florida has a large population of migrant workers throughout the year — 150,000 to 200,000 people, according to the Florida Department of Health. Some are undocumented (not all migrant workers are undocumented, and the majority of undocumented immigrants in Florida are not migrant workers) and have expressed fear that they might be detained by authorities if they seek shelter. Those fears may have been made worse after NBC News reported that coordinated immigration raids were planned around the country for next week. (NBC later reported that the administration had canceled the raids because of Hurricanes Harvey and Irma.) Even though authorities from several counties have said they won’t be asking for papers at shelter sites, some undocumented immigrants have said they are afraid to evacuate.

Migrant laborers, regardless of immigration status, also frequently live in notoriously shoddy housing that is at particular risk of being devastated by hurricanes.

The Florida Department of Health says enough permits have been provided to ensure that 34,000 migrants have adequate housing, but that leaves a lot of people living in potentially vulnerable conditions. When Hurricane Andrew struck a migrant labor camp in Miami-Dade County in 1992, the camp was demolished. Left with nothing to eat but rancid food, several residents fell ill with food poisoning, The New York Times reported at the time. Conditions have improved since then, but many still live in substandard housing.

Immokalee, an area southeast of Fort Myers, on the southern section of the state’s western coast, is at the heart of Florida’s tomato industry and home to a large farmworker community that has fought a decades-long battle for better wages and working conditions. Still, 44 percent of the area’s population lives below the poverty line, according to the U.S. Census Bureau. Only 4,000 of the 24,000 or so residents were in shelters as of Saturday evening, according to the Naples Daily News.

Mobile homes are among the first places to be evacuated in any storm’s path; ahead of Irma, the state issued voluntary or mandatory evacuations for mobile-home residents in at least 18 counties. Depending on the quality of construction, mobile homes are particularly vulnerable to wind and water damage.

In some areas around Tampa Bay, a large share of the housing is mobile homes. They make up more than 40 percent of the units in some census tracts on the southern side of the Bay. About 4,000 mobile homes are located in that area, according to data from the U.S. Department of Health and Human Services. Although standards have changed for mobile homes in recent decades, Florida has many older mobile homes, which are frequently damaged in big storms.

Florida’s reputation as a retirement center is well earned. The state has a larger share of people age 65 or older — 19.9 percent — than any other state. And a high share of Gulf Coast residents are older adults. In Lee County, where Fort Myers is located, 36.6 percent of households are headed by someone 65 or older, compared with 29.6 percent statewide, according to University of Florida data. In Charlotte County, just north of Lee, 49.5 percent of household heads are 65 or older.

Older residents may live in assisted living facilities, adult family care or hospice. Medical treatments requiring electricity are a challenge in storms, and mobility issues can make it hard to move around. Irma turned west in the final hours before it made U.S. landfall, and the areas surrounding Fort Myers and Tampa were evacuated later than the Miami-Dade area, which had been preparing for days. That left some assisted living facilities moving residents to designated evacuation shelters in the hours before the storm hit.

Of course, there are numerous other groups that will face challenges in the days ahead. Many needs will go unmet as Irma’s cruel chaos upends lives in unexpected ways.

Will Miami’s Skyscrapers Withstand Irma?

As Irma continues to drive its way through the Atlantic Ocean, bringing with it 150+ mph winds, a lot of comparisons are being drawn to Hurricane Andrew, which hit South Florida 25 years ago. That makes sense — it was the last Category 5 hurricane to hit the United States (Irma is currently forecast to be at least a Category 4, with winds of at least 150 mph at landfall), and Andrew made U.S. landfall in South Florida, the way Irma is expected to. Andrew was so bad that its name was retired, and it helped inspire evacuation and emergency management protocols around the country. It’s a living and painful memory for many in South Florida.

But even if Irma turns out to be like Andrew, South Florida is not the place it was 25 years ago. The coastline has seen rapid development since then, especially after the 2008 recession. While that development includes suburban communities springing up from the destruction left by Andrew, encroaching on what was then farmland, there’s also a near-wall of high-rise buildings that dot the area’s coastline. And while construction completed since Andrew arrived in 1992 must adhere to more strict building codes, there’s a lot that’s unknown about how tall buildings, and the materials from which they are constructed, will respond to the high winds currently expected from Irma.

The skyscrapers’ most vulnerable points may be their windows. While glass can usually withstand the intense pressure within a storm, high winds can turn even pebbles into window-shattering missiles. New rules put in place after Andrew required tall buildings to use special, impact-resistant glass in the first 30 feet of construction, where debris is most likely to fly. Engineers test the strength of that glass in laboratory experiments by firing projectiles at high speeds into it to see whether it will shatter. But scientists don’t know the exact size, speed and force of projectiles flying in a hurricane. And despite the use of stronger materials, Hurricane Wilma (a Category 3 storm when it hit the U.S. in 2005) destroyed windows on the upper floors of high-rises throughout downtown Miami. Wilma spurred builders to cover more of their towers in impact-resistant glass, but Wilma only generated peak wind speeds of around 100 mph in Palm Beach County — much less than Irma is expected to bring to South Florida. (New forecasts show Irma drifting west, which may mean weaker winds in Miami. There’s still time for the forecast to shift again before it makes landfall, though.)

Even with lower wind speeds, Andrew and Wilma did much of their damage by hurling gravel from roofs into the windows of nearby buildings. “We all know that you can have the best glass in the world, but you start throwing rocks at it and it changes things,” said Timothy Marshall, an engineer and meteorologist with the firm Haag Engineering who specializes in wind. That interaction between buildings doesn’t get enough attention, Marshall said. Having many high-rises in close proximity, as happens in cities, can create wind tunnels — low-pressure areas where wind travels particularly fast, increasing the load on nearby buildings.

There were high-rises in Miami when Andrew hit; about 51 percent of Miami’s 10+ story high-rises were built prior to that year, according to crowdsourced data from SkyscraperPage.com. (We don’t know of any official national database of high-rises or skyscrapers, and as with any crowdsourced site, the data may be incomplete, but a cross-reference of one Miami neighborhood suggests it is fairly comprehensive.) But we don’t know how many installed impact-resistant windows. And after Wilma, some buildings successfully petitioned to rebuild up to the codes required when they were first constructed, not stricter codes that had been established in 2000. Design standards shifted again in 2012, and SkyscraperPage.com suggests that about 7 percent of Miami high-rises went up since then. The end result of the shifting building codes is a string of high-rises along South Florida’s waterfront that may or may not be up to Irma’s fierce winds.

It’s not just that it’s unclear how these buildings will react, but also that the wind force is even higher on upper stories than it is lower down. A 2003 paper by James L. Franklin and colleagues from the National Oceanic and Atmospheric Administration found that above three stories, wind moves faster than it does below. At 10 stories, it’s 108 percent of the surface wind speed. At 25 stories, it’s 117 percent. That means that 150 mph winds at the surface could equal 175 mph winds on the upper (or middle) floors of a skyscraper.
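As a rough illustration (our own, not part of the Franklin paper), the height adjustment described above amounts to multiplying the surface wind speed by the two ratios quoted in the text:

```python
# Rough illustration of the height adjustment described above, using only
# the two ratios quoted from Franklin et al. (2003): winds at 10 stories
# run about 108 percent of surface speed, and at 25 stories about 117 percent.
HEIGHT_RATIOS = {10: 1.08, 25: 1.17}  # stories -> multiple of surface wind speed

def wind_at_height(surface_mph, stories):
    """Estimate the wind speed at a given story from the surface wind speed."""
    return surface_mph * HEIGHT_RATIOS[stories]

print(wind_at_height(150, 25))  # about 175 mph, matching the figure in the text
print(wind_at_height(150, 10))  # 162 mph at the 10th story
```

The real wind profile varies continuously with height; these two data points are just the ones cited above.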

Marshall himself is headed to Miami to see what impact Irma has on the city. “Testing in a lab is one thing, but testing with Mother Nature could be a little different. Not all debris is two-by-fours,” he said.

There’s also the question of the high-rises that have yet to be completed. The City of Miami warned residents of towers in the downtown area that the 20 to 25 giant construction cranes located in the area are designed to withstand up to 145 mph winds, approximately a Category 4 hurricane. If Irma hits as a Category 4 (with winds up to 156 mph), they could collapse. There are many more cranes looming in other parts of the Miami-Broward-Palm Beach metropolitan area, as well as other construction materials that require securing.

And of course, with the majority of the high-rises being built along the waterfront, they are also likely to bear the brunt of storm surges that are expected to top nine feet.

That flooding can shut down the elevators and electricity of skyscrapers, since generators and fuel stores are often built on the first floor. The combination of storm surge and high waves is the deadliest aspect of most storms. Even in Andrew, known for its high winds, the storm surge was more than 16 feet in some places.

Many of the towers are in mandatory evacuation zones, so most residents should have left by the time the storm hits. For those who stay behind, officials recommend descending to lower floors and riding out the storm in interior stairwells.

Here’s What The NFL Career Passing Leaderboard Will Look Like In 2025

When Peyton Manning took the all-time passing crown in 2015, it felt like history was being rewritten in real time.

Ever since Johnny Unitas took the throne in 1968, the coronation of a new passing king has been a generational event. It’s occurred an average of every 11.8 seasons, and each new king has raised the standard an average of 7,925 yards higher than his predecessor. But Manning usurped a contemporary of his, Brett Favre — and he did it by just 102 yards.

Drew Brees now leads a host of active passers with legitimate claims to the throne, 5,829 yards away from the keep and closing fast. Tom Brady, who just turned 40 and plans to play until he is 45, is not far off either. In fact, if you squint hard enough, the top of the career passing leaderboard is beginning to look more like a list of current starting quarterbacks. So how long will Manning hold them all off — and what will the order of succession be in years to come?

Back in 1978, the NFL passed a slate of offseason rule changes designed to facilitate passing offense. They did exactly that, to dramatic effect. With a new 16-game regular-season schedule, 38-year-old Vikings quarterback Fran Tarkenton blew away his career single-season record in his last year in the NFL; his league-leading 3,468 yards for the season padded his lead atop the all-time passing rankings and pushed him to a career mark that wouldn’t be broken for 17 years.

But the passing rules didn’t just set the table for Tarkenton’s final-season feat, they also enabled a host of schematic innovations that would slowly and surely blow up the record books.

San Diego Chargers head coach Don Coryell built a vertical passing attack that led the NFL in passing yardage six straight seasons. San Francisco 49ers head coach Bill Walsh devised a short-route passing attack that made heavy use of running backs and tight ends as pass-catchers — a system popularly called the West Coast offense. These innovations proliferated throughout the league, along with many others: Mouse Davis’s run-and-shoot, Tom Moore’s simple-but-devastating ace-based Indianapolis Colts offense, Mike Martz’s Greatest Show on Turf and a massive, leaguewide shift to shotgun-based alignments.

Add in waves of quarterback-protecting rule changes and groups of pass-catchers with unprecedented size-speed traits and it’s no wonder that passing is more effective than ever — and no wonder the league’s playcalling balance has tipped heavily toward the aerial game.

In 1978, the average NFL team gained 2,541 yards through the air in a season. In 2016, that average was 3,864 yards, an increase of 52.1 percent. Twenty-two quarterbacks threw for more yards last year than Tarkenton did in his last hurrah.

So how will this passing explosion shape the NFL’s all-time leaderboards going forward? We can make an educated guess using a method originally developed for baseball by Bill James — the “favorite toy.”

Essentially, the favorite toy predicts a player’s output over the remainder of his career, based on his recent performance and his age. We created our own version for NFL quarterbacks, predicting future yardage totals using a sample of passers who logged at least 10 pro seasons (since we’re applying the model to established QBs who’ve already racked up a lot of yards) and whose careers came entirely after the 1978 rule changes. Specifically, the model calculates a QB’s remaining years as equal to 28.6 minus 0.75 times his current age (though we placed a lower limit that prevents any QB from being projected for fewer than 2.2 remaining years). To set a QB’s established level, his last three seasons are weighted so that the most recent season is given three times as much weight as either of the two seasons before that, and he can never drop below 0.5 times his most recent season’s yardage. All yardage totals are adjusted to the 2016 passing environment.

After plugging in the numbers for active quarterbacks, here’s what the favorite toy model thinks is in store for the NFL passing charts over the next decade:

The algorithm predicts New Orleans Saints quarterback Drew Brees will ascend from his current third-place spot to pass Manning during the 2018 season and finish his career the following year with 77,324 yards — 5,384 yards more than Peyton.
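For readers who want to follow along, the favorite-toy arithmetic described above can be sketched in a few lines. This is our own sketch under the stated parameters, not the article’s actual model code, and the sample numbers are hypothetical; yardage inputs are assumed to already be adjusted to the 2016 passing environment.

```python
def favorite_toy(age, last_three_seasons):
    """Project a QB's remaining career passing yards.

    last_three_seasons: yardage totals ordered oldest to most recent,
    already adjusted to the 2016 passing environment.
    """
    y3, y2, y1 = last_three_seasons  # y1 is the most recent season
    # Remaining years: 28.6 minus 0.75 times current age, floored at 2.2
    remaining_years = max(28.6 - 0.75 * age, 2.2)
    # Established level: the most recent season gets triple weight, and the
    # level can never drop below half the most recent season's yardage
    established = max((y3 + y2 + 3 * y1) / 5, 0.5 * y1)
    return remaining_years * established

# A hypothetical 38-year-old coming off seasons of 5,200, 4,900 and 5,000 yards
print(round(favorite_toy(38, (5200, 4900, 5000))))  # → 11044 projected remaining yards
```

At age 38 the remaining-years term has already hit the 2.2-year floor, which is why even elite recent production projects to only a couple more seasons’ worth of yardage.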

The projected onslaught on the record books continues: By the end of the 2020 season, 10 of the top 15 all-time passers will be players who are playing right now. By 2022, middling starters like Andy Dalton and Alex Smith will leave Hall of Famers like Dan Fouts, the triggerman of Coryell’s record-breaking Chargers, in history’s dustbin.

By 2025, the revolution will be complete: Manning, Favre, Dan Marino, John Elway, Warren Moon and Vinny Testaverde will be the only players not playing right now who’ll remain in the Top 25. Matt Ryan and Matthew Stafford will pass Marino for the fifth and sixth spots on the list, and 11 of the top 15 spots will go to active starting QBs.

But Brees remains king.

Our algorithm projects that after Brees passes Manning in 2018, no currently active quarterback will surpass Favre’s 71,838-yard mark, let alone touch Brees’s record. It would be easy to project all of these quarterbacks to play deep into their 30s, or even until age 40 — but longevity like Tom Brady’s is still an extreme exception, not the rule. Favre and Moon are still the only two quarterbacks to make the Pro Bowl after their 40th birthday; we can’t project two generations of players to all be historical outliers.

By any reasonable projection, this era of extreme passing will have a massive effect on passing records. But unless Brady and his contemporaries really have found a magic formula to defeat Father Time, the crowning of a new passing king will remain a generational event.

The End Of DACA Will Ripple Through Families And Communities

Nearly 800,000 people have been granted protection from deportation under the Deferred Action for Childhood Arrivals program, which former President Barack Obama created and the Trump administration rescinded on Tuesday. But unless Congress turns DACA into law, the end of the program will affect far more people than just those who participated in it. Tens of thousands of their children and relatives, as well as their local economies, could be directly affected as well.

Because of the parameters of who is allowed into the program, DACA participants often are in the early stages of their careers and are frequently breadwinners for families that can include young children as well as older parents and grandparents. Participants overwhelmingly come from what are known as mixed-status families, meaning that members have different immigration statuses, ranging from being undocumented to being a U.S. citizen. For these families, having a member with a work permit and a shield from deportation, as a DACA participant does, can provide not only the assurance that their lives won’t be upended, but also financial stability. Those protections are set to phase out over time according to the administration’s announcement.

To qualify for DACA, people must have been brought to the U.S. both before their 16th birthday and before mid-2007. They must also be in high school or have graduated or received a GED, which means they are mostly in their 20s and early 30s. Because most DACA participants came to the U.S. with their parents, their families usually reside in the U.S.

It’s those relatives who — beyond DACA participants themselves — will be most directly affected if the program’s protections end. Around the time the program began in 2012, people who were DACA-eligible had around 202,000 children, according to an analysis of 2009 to 2013 U.S. Census Bureau data by the Migration Policy Institute, a think tank that generally supports liberal immigration policies. Most of those children were 4 years old or younger at the time the research was done, and nearly all were U.S. citizens. Not all of their parents necessarily applied for or were accepted into the program, but in a 2015 online survey, a quarter of DACA participants said they had a child who is a U.S. citizen.

Many DACA recipients also have other relatives who may rely on them. In the 2015 survey, 60 percent said they had a sibling who was a U.S. citizen, and more than three-quarters said they had a parent who was undocumented. Research has found that by having legal status, DACA participants can be “cultural brokers” to relatives who are undocumented, performing family functions that undocumented parents might be afraid to, like accompanying siblings to doctors’ appointments or going to parent-teacher conferences.

The administration has indicated that after DACA ends, participants wouldn’t necessarily become priorities for deportation. But eliminating the program would almost certainly lead to lost jobs and income for participants and their families. DACA increased both employment and income among those eligible for the program, according to research by Nolan Pope, a professor of economics at the University of Maryland. Using data from the U.S. Census Bureau’s American Community Survey, Pope found that 50,000 to 75,000 undocumented people gained employment by 2014 because of DACA. Pope also found that wages increased. Other research has also indicated that DACA improved the economic conditions of and the job opportunities for people in the program. In one online survey of DACA recipients, 71 percent said that their participation in DACA had helped their family financially. Those gains would likely be reversed if the program ended, Pope said.

And the threat of deportation alone would likely have a negative impact on families. Immigration-related stress and anxiety have been shown to have negative health effects — children of undocumented parents suffer high rates of anxiety, are more likely to be food insecure and show delayed cognitive development compared with other children. Generally, researchers believe the stress that stems from the fear of having a parent deported has far-reaching, negative effects on the health of children.

Beyond the participants themselves, families are likely to feel the most immediate impact, but the communities they live in could also be affected. Though there are DACA recipients in every state, people who are eligible for the program tend to be clustered geographically. As of March of this year, 222,795 people — more than a quarter of those who had been approved for DACA — lived in California at the time they applied. An additional 124,300 applicants were in Texas, and Illinois and New York were each home to more than 40,000 applicants. Collectively, those four states provided more than 50 percent of the people who had been approved for DACA since the program began.

The Center for American Progress, a liberal think tank, estimated that the economies of California and Texas would lose $11.6 billion and $6.3 billion, respectively, each year if these workers were removed from the U.S. The libertarian Cato Institute estimates that ending the program would lead to $60 billion in lost revenue to the federal government over a decade, as well as a $280 billion reduction in economic growth during that time. That's a small share of those giant state economies and the federal budget, but because recipients tend to live in certain cities and neighborhoods, local communities are likely to see some impact. People who are DACA-eligible also tend to be employed in certain job sectors, including food service and sales; ending the program could create a noticeable effect on those industries in certain areas.

There are also the people who would enter the program if it doesn’t end. The Migration Policy Institute estimated that as of last year, 1.3 million people were immediately eligible to apply for the program. But that number could soon grow to 1.9 million — 228,000 children who were younger than 15 (too young to qualify) could age into the program, according to the institute, and an additional 398,000 were lacking only a high school diploma or its equivalent.

In the hours after the Trump administration's announcement, members of Congress started to call for a long-term, legislative fix for people who qualify for the program, as did President Trump. If they don't pass one, approximately 300,000 people with DACA will lose their legal status in 2018, and at least as many more will lose it in 2019; the remaining permits will expire in 2020.