
‘Garbage’ models and black boxes? The science of climate disaster planning

“Do these guys know what they are doing? I’m not convinced that they do,” said Upmanu Lall, director of the Columbia Water Center at Columbia University, who has reviewed some firms’ methodologies. “Your models are garbage. And, unfortunately, that’s a problem.”

“It’s not that [they] have some special sauce,” said Rutgers University climate scientist Robert Kopp, who contributes to climate analytics service firms Rhodium Group and First Street Foundation, which publish their methodologies. “[They] don’t want to talk about what [they’re] doing.”

Federal agencies that set public climate policy have been turning to these firms to weigh flood risks to homes and to guide post-disaster rebuilding efforts, aiming to ensure new structures can withstand the effects of a changing climate.

The Federal Deposit Insurance Corp., Federal Emergency Management Agency, National Oceanic and Atmospheric Administration, Department of Housing and Urban Development, Federal Housing Finance Agency and NASA all have met with such firms to explore tools purporting to help protect taxpayers, banks and homes from rising seas, worsening rainstorms and severe droughts linked to climate change.

Some firms deny that they’ve shielded their methodologies from clients, saying that transparency is important to build confidence in their work, even if they don’t broadly disseminate their models.

“Our methodology is documented, shared with and discussed with any customer that wants that level of detail, and over and over again we pass scrutiny on the methodology,” Jupiter Intelligence CEO Rich Sorkin said. “We are heavily opposed to black boxes, but there’s a big difference between black boxes and publicly disclosing to everyone in the world exactly how things are in the most detailed level.”

HUD has hired Jupiter Intelligence as part of a $150,000 coastal modeling project to weigh flood protection systems against various sea-level rise scenarios. Rhodium Group, which publishes its methodology, received a $179,000 National Science Foundation grant last year to study the public health effects and socioeconomic costs of rising temperatures and wildfires.

Other providers that use the more familiar catastrophe models common in the insurance industry are drawing scrutiny, too. KatRisk LLC netted $463,000 in federal contracts last year to work on FEMA’s flood insurance program. Critics suggest, however, that such models do not adequately account for future climate change, leaving federal policy looking backward.

The proliferation of providers has attracted attention from an expanding roster of federal agencies. FHFA issued a request for information on climate change in January and held a March 4 listening session with several service providers to gauge climate risk to mortgages held by government-sponsored enterprises Fannie Mae and Freddie Mac.

“To better understand the risk posed to the Enterprises, along with our RFI, we are assessing various natural disaster and climate datasets,” spokesperson Raffi Williams said in an email.

Demand for the services of climate analytics firms is growing as companies seek more sophisticated tools to plan for climate outcomes. Among the big names that have engaged them are oil giant BP, electric utilities Hawaiian Electric and ConEd, cities such as Miami and New York, and property investment company CBRE Global Investors.

Many of the firms build their analytical models on government-funded research, much as private weather companies customize their products using a backbone of data supplied by the National Weather Service. But weather forecasting is short term, while climate shifts play out over decades. Scientists worry that companies hiring the analytics providers won’t keep up with the latest science, or will hand off their risk management duties to the firms, leaving themselves vulnerable to nasty surprises.

“The market doesn’t know what it’s asking for, which is a huge part of the problem,” said Chris Sampson, co-founder and director of Fathom, a flood risk modeling firm that published its methodology in a peer-reviewed science journal. “You’ve got providers trying to generate solutions when we’re not even sure what the question is to ask yet.”

Companies are likely to see increasing requirements to quantify the physical risks they face from climate change. Last month, the Securities and Exchange Commission began reviewing voluntary climate guidelines it issued in 2010 for public companies and could make such reporting mandatory.

That’s coming amid a surge in shareholder proposals for greater transparency on climate risks and the rise of investing focused on environmental, social and governance, or ESG, metrics. And in addition to the new crop of small firms, the demand for data is beginning to attract big players, such as X, the so-called “moonshot factory” run by Google’s parent, Alphabet.

“Can we bring you a new tool? Can we bring you new data?” Sarah Russell, team lead at X, said at a climate risk conference this month, describing her firm’s strategy of reaching out to companies recently hit by disasters or floods to sign new clients. “It’s only after you’ve been hit when you realize you’ve been working with garbage data.”

The growing emphasis on broader disclosures in the U.S. is in line with efforts by government agencies around the world, many of which are adopting rules inspired by the Task Force on Climate-Related Financial Disclosures, a voluntary climate risk reporting framework.

Many experts believe the SEC will adopt the task force guidelines in some fashion to align with mandatory rules in the EU and Japan, though some hope that both guidelines and any forthcoming U.S. disclosure rules beef up transparency around companies’ physical risk vulnerabilities. The task force, for example, encourages disclosure around broad categories of physical and transition risks, but according to a report by the World Resources Institute, it does not specifically call out hazards like extreme winds, heat, drought, wildfires and ocean acidification that are linked to climate change.

But that type of rigorous disclosure regime will rely heavily on the climate analytics firms, which do not yet operate under a set of best practices or standards. Some climate experts warn that the advisers are overconfident given that current climate models simply aren’t designed to deliver on companies’ requests for narrow, specific predictions over 10, 20 or 30 years — the traditional investment and planning timeline that concerns investors, homeowners and corporations.

“How do we [companies] make our decisions in the future based on climate outputs?” KatRisk CEO Dag Lohmann asked during the recent conference. “I find it a very complicated question. We need these detailed, location-level models for that.”

Emilie Mazzacurati, founder of climate analytics firm Four Twenty Seven, which was bought by credit-ratings agency Moody’s in 2019, said customers are demanding specificity, yet no industry standards exist to weed out bad actors. That creates a danger that some advisers will overpromise and say they can deliver that type of forecast.

“There’s a lot of pressure from clients that we experience from the market saying, ‘Where’s this data?’” said Mazzacurati, who is now the head of climate solutions with Moody’s. “It takes a lot of commitment, dedication to say ‘No,’ and pass on deals and just tell the client ‘I’m sorry, that this is where we’re stuck, because this is how far the data will go.’”

Mazzacurati and Sorkin, of Jupiter, said they walk customers through their methodologies and show them the limitations of the models.

Sorkin said his firm helps inform the kind of scenario analysis that companies use for capital planning decisions for assets with 20- to 30-year lifetimes. Much of his firm’s work has focused on Global 2000 companies, which face mandates from boards or shareholders to become better attuned to climate risk.

“Everyone’s racing to build that capacity so that they understand what tools like ours are good for and what they’re not good for,” he said. “In every new market, there are charlatans and cheats. Customers are kind of slowly learning and sometimes easily deceived. And our view is the market will sort that out over time.”

Jupiter, though, has also drawn skepticism from some climate scientists for the level of granularity it offers clients: effects of weather and climate on a scale of 1 square kilometer. A recent article in the science journal Nature Climate Change cast doubt on that level of detail, saying that climate information on scales less than 1,000 kilometers for a few years or decades into the future is “complex.”

Climate models like those used by the United Nations’ Intergovernmental Panel on Climate Change project out to the year 2100, with the signal of human-caused change becoming clearer later in this century as it diverges from natural climate variability. But those projections operate at continental and, at best, regional scales. For finer detail, modelers can turn to historical data, but relying on that alone could improperly project recent trends onto future years.
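
To illustrate that pitfall, here is a toy sketch, with every number invented, of the kind of naive trend extrapolation scientists caution against: fitting a straight line to a few decades of observations and projecting it forward assumes the recent trend simply continues.

```python
import numpy as np

# Illustrative only: the naive trend extrapolation the article warns
# against. Every number here is invented for demonstration.
rng = np.random.default_rng(0)

years = np.arange(1990, 2021)  # 31 years of synthetic "observations"
# Fake annual flood-height anomalies (meters): a small trend plus noise.
obs = 0.003 * (years - 1990) + rng.normal(0, 0.05, years.size)

# Fit a straight line to the historical record...
slope, intercept = np.polyfit(years, obs, 1)

# ...and project it 30 years forward, as a naive planner might.
future_years = np.arange(2021, 2051)
projection = slope * future_years + intercept

print(f"fitted trend: {slope * 1000:.2f} mm/yr")
print(f"naive 2050 anomaly: {projection[-1]:.3f} m")
# The pitfall: this assumes the recent trend continues unchanged,
# ignoring natural variability, possible acceleration under continued
# warming, and the chance the fitted trend was partly noise.
```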

Sorkin, however, said Jupiter isn’t trying to predict what climate will be like at that granular level in 30 years. Instead, it uses local terrain information and other data that his firm overlays with climate models to ensure engineering standards for infrastructure and investment decisions account for future, climate change-affected conditions.
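
As a rough illustration of that kind of overlay, not Jupiter’s actual method, the sketch below combines an assumed regional sea-level-rise scenario and design storm surge with hypothetical parcel elevations to flag assets sitting below the resulting design water level.

```python
# A minimal sketch, assuming placeholder inputs: overlay a coarse
# sea-level-rise scenario on local elevation data to flag exposed assets.
# The scenario values, parcel names and elevations below are hypothetical,
# not any firm's actual data or methodology.

slr_2050_m = 0.35      # assumed regional sea-level rise by 2050 (scenario)
storm_surge_m = 1.2    # assumed design storm surge for the site

# Hypothetical parcel elevations above today's mean high water, in meters.
parcels = {
    "parcel_a": 0.9,
    "parcel_b": 1.8,
    "parcel_c": 3.5,
}

design_water_level = slr_2050_m + storm_surge_m

for name, elevation_m in parcels.items():
    freeboard = elevation_m - design_water_level
    status = "exposed" if freeboard < 0 else "clear"
    print(f"{name}: freeboard {freeboard:+.2f} m -> {status}")
```

A real workflow would replace those placeholders with gridded climate model output and high-resolution terrain data, which is where the disputed, proprietary assumptions live.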

Jupiter, like Four Twenty Seven, doesn’t post the inner workings of its methodology publicly for peer review. The assumptions each firm uses to inform its models are considered proprietary.

Those assumptions are central to the value proposition the service providers offer, and they could become points of disagreement within the scientific community if revealed. But Sorkin called that argument a “red herring,” saying he would open his firm’s methodology to any third party that wants to assess it.

Curtis Ravenel, a member of the TCFD’s secretariat, said if the climate analytics advisers are to keep the confidence of their clients, they’re going to have to open their processes for review.

“You need transparency,” Ravenel said. “If you want to build trust in a market for the use of this kind of analytical tool, you’ve got to let the users understand the various assumptions and inputs.”
