by Robert Muggah & Carlo Ratti
With the devastating effects of climate change already bearing down on the world’s urban areas, ambitious decarbonization and adaptation promises from municipal leaders could not come soon enough. But making good on these commitments requires scaling up the tools for collecting and analyzing the right information.
With cities facing disastrous climate stresses and shocks in the coming years, one would think they would be rushing to implement mitigation and adaptation strategies. Yet most urban residents are only dimly aware of the risks, because their cities’ mayors, managers, and councils are not collecting or analyzing the right kinds of information.
With more governments adopting strategies to reduce greenhouse-gas (GHG) emissions, cities everywhere need to get better at collecting and interpreting climate data. More than 11,000 cities have already signed up to a global covenant to tackle climate change and manage the transition to clean energy, and many aim to achieve net-zero emissions before their national counterparts do. Yet virtually all of them still lack the basic tools for measuring progress.
Closing this gap has become urgent, because climate change is already disrupting cities around the world. Cities on almost every continent are being ravaged by heat waves, fires, typhoons, and hurricanes. Coastal cities are being battered by severe flooding connected to sea-level rise. And some megacities and their sprawling peripheries are being reconsidered altogether, as in the case of Indonesia’s $34 billion plan to move its capital from Jakarta to Borneo by 2024.
Worse, while many subnational governments are setting ambitious new green targets, over 40% of cities (home to some 400 million people) still have no meaningful climate-preparedness strategy. And the share of cities with such strategies is even lower in Africa and Asia – where an estimated 90% of all urbanization over the next three decades is expected to occur.
We know that climate-preparedness plans are closely correlated with investment in climate action, including nature-based solutions and systemic resilience. But strategies alone are not enough. We also need to scale up data-driven monitoring platforms. Powered by satellites and sensors, these systems can track temperatures inside and outside buildings, alert city dwellers to air-quality issues, and provide high-resolution information on concentrations of specific GHGs (carbon dioxide and nitrogen dioxide) and particulate matter.
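To make the air-quality alerting concrete, here is a minimal sketch of how a monitoring platform might turn a raw particulate-matter reading into a public alert. It uses the US EPA's published formula and pre-2024 PM2.5 breakpoints for the Air Quality Index; real systems use agency-maintained tables and 24-hour averages rather than single spot readings, and the alert threshold below is an illustrative assumption.

```python
# (breakpoint_lo, breakpoint_hi, index_lo, index_hi) — µg/m³ bands mapped to AQI bands
PM25_BREAKPOINTS = [
    (0.0, 12.0, 0, 50),        # Good
    (12.1, 35.4, 51, 100),     # Moderate
    (35.5, 55.4, 101, 150),    # Unhealthy for sensitive groups
    (55.5, 150.4, 151, 200),   # Unhealthy
    (150.5, 250.4, 201, 300),  # Very unhealthy
    (250.5, 350.4, 301, 400),  # Hazardous
    (350.5, 500.4, 401, 500),  # Hazardous
]

def pm25_to_aqi(concentration: float) -> int:
    """Linearly interpolate the AQI within the matching breakpoint band."""
    c = round(concentration, 1)  # EPA works to 0.1 µg/m³ precision
    for bp_lo, bp_hi, i_lo, i_hi in PM25_BREAKPOINTS:
        if bp_lo <= c <= bp_hi:
            return round((i_hi - i_lo) / (bp_hi - bp_lo) * (c - bp_lo) + i_lo)
    raise ValueError(f"PM2.5 reading {c} µg/m³ out of range")

def air_quality_alert(concentration: float, threshold: int = 100) -> bool:
    """True when the AQI exceeds the alert threshold (here, the 'Moderate' ceiling)."""
    return pm25_to_aqi(concentration) > threshold
```

A reading of 35.4 µg/m³ sits exactly at the top of the "Moderate" band (AQI 100), so it does not trigger the default alert, while anything in the next band up does.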
Technology companies are the first movers in this market. For example, Google’s Environmental Insights Explorer aggregates data on building and transportation-related emissions, air quality, and solar potential for municipal officials. And projects such as Climate Watch, Project AirView, Project Sunroof, and the Surface Particulate Matter Network are providing city analysts with historical data, tracking car pollution and methane leaks, and even helping individual users determine the solar-power potential of their homes.
But it is worth remembering that many private-sector climate-data initiatives were built on the back of large-scale, publicly supported programs. The most well-known source of climate data is NASA, which uses satellite data and chemical-dispersion and meteorological models to track emissions and predict the movement of pollutants. Similarly, the US National Oceanic and Atmospheric Administration tracks wildfires and smog (among many other things), and issues data-based forecasts through its National Centers for Environmental Prediction. And in Europe, the Copernicus Atmosphere Monitoring Service generates five-day forecasts based on its tracking of aerosols, atmospheric pollutants, GHGs, and UV-index readings.
Google Earth became a staple resource by organizing and making good use of more than four decades’ worth of historical imagery and data drawn primarily from public sources. Given that the private sector has been capitalizing on these data for years, cities no longer have any excuse for not doing the same. One easily accessible source of city-level data is the World Meteorological Organization’s Global Air Quality Forecasting and Information System, which tracks everything from dust storms to fire and smoke pollution. Another is the United Nations Environment Programme’s Global Environment Platform, which provides high-resolution forecasts.
Some pioneering cities have already started to work with smaller data vendors such as Plume Labs, which crowdsources air-quality data through locally distributed sensors. But while access to data is essential, so, too, are the methods to make it useful. As matters stand, datasets tend to be fragmented across platforms, and even when urban leaders agree that the climate emergency warrants their attention, extracting insight from the details remains a daunting challenge. Cities are generating a chorus of climate data, but have yet to teach it to sing in tune.
Building a harmonious climate-data ecosystem will require an accessible platform to consolidate disparate metrics. Data also need to be streamlined and standardized to improve the monitoring of inputs, outputs, outcomes, and impact. Better data management will improve decision-making and empower ordinary citizens, potentially fostering collaboration and even positive-sum competition among cities. Public, private, and philanthropic partnerships can have a catalytic effect, as was the case when cities such as Amsterdam, Bristol, Chicago, and Los Angeles joined forces with SecDev Group to create an interactive dashboard tracking city vulnerability.
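The streamlining and standardization described above can be sketched as a small adapter layer. The two vendors, field names, and shared schema below are illustrative assumptions, not real APIs: the point is simply that heterogeneous readings (different units, different timestamp formats) are mapped onto one consistent record before analysis.

```python
from datetime import datetime, timezone

def normalize(vendor: str, raw: dict) -> dict:
    """Map a vendor-specific reading onto a shared schema (µg/m³, UTC ISO timestamps).

    Both vendors here are hypothetical, chosen to show typical mismatches:
    vendor_a reports PM2.5 in µg/m³ with epoch-second timestamps, while
    vendor_b reports mg/m³ with ISO-format timestamps.
    """
    if vendor == "vendor_a":
        return {
            "pollutant": "pm2.5",
            "value_ugm3": raw["pm25"],
            "observed_at": datetime.fromtimestamp(raw["ts"], tz=timezone.utc).isoformat(),
        }
    if vendor == "vendor_b":
        return {
            "pollutant": "pm2.5",
            "value_ugm3": raw["PM2_5_mg_m3"] * 1000.0,  # mg/m³ → µg/m³
            "observed_at": raw["timestamp"],
        }
    raise ValueError(f"unknown vendor: {vendor}")

# The same physical reading from both vendors converges on one record shape.
a = normalize("vendor_a", {"pm25": 18.0, "ts": 1700000000})
b = normalize("vendor_b", {"PM2_5_mg_m3": 0.018, "timestamp": "2023-11-14T22:13:20+00:00"})
assert a["value_ugm3"] == b["value_ugm3"]  # both 18.0 µg/m³ after normalization
```

Once readings share a schema, the consolidation platform can compare neighborhoods, cities, and vendors on equal terms, which is what makes the monitoring of inputs, outputs, and outcomes tractable.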
There are, however, some risks to consolidating and standardizing climate data for cities. When global technology vendors flood the market, they can curb local innovation in data collection and analysis. Moreover, by focusing too much on a small set of metrics for every city, we run the risk of Goodhart’s Law: once a measure becomes a target, people start to game it. Consider targets designed to reduce vehicular emissions that result in the production of cars designed to pass emissions tests, rather than cars with lower emissions.
Similarly, when climate data are more centralized, there could be greater incentives for political and corporate interests to skew them in their favor through lobbying and other means. And policymakers will need to ensure that any potentially sensitive or individualized data are kept private and protected, and that datasets and the algorithms they feed avoid reproducing structural biases and discrimination.
Most of these hazards can be identified early and avoided through experimentation, with cities pursuing unique strategies and promising new metrics. But unless cities scale up their monitoring and data-collection systems, they will have little chance of delivering on their climate targets. Better analysis can raise awareness of climate risks, optimize responses, and ensure mitigation and adaptation strategies are more equitable. We cannot manage the climate crisis until we measure it, and we cannot measure it until we can collect and analyze the right information.
Robert Muggah, a co-founder of the Igarapé Institute and the SecDev Group, is a member of the World Economic Forum’s Global Future Council on Cities of Tomorrow and an adviser to the Global Risks Report. He is the co-author (with Ian Goldin) of Terra Incognita: 100 Maps to Survive the Next 100 Years.
Carlo Ratti, Director of the Senseable City Lab at MIT, is Co-Founder of the international design and innovation office Carlo Ratti Associati.