CarbonTracker-CH₄ Isotopic Methane Inverse Fluxes

Global, monthly, 1° resolution methane emission estimates from microbial, fossil, and pyrogenic sources, derived using inverse modeling; version 2025.
Authors

Siddharth Chaudhary, Vishal Gaur

Published

January 1, 2026

Access this Notebook

You can launch this notebook in the US GHG Center JupyterHub (requires access) by clicking the following link: CarbonTracker-CH₄ Isotopic CH₄ Flux Estimates. If you are a new user, you should first sign up for the hub by filling out this request form and providing the required information.

Table of Contents

Data Summary and Application

  • Spatial coverage: Global
  • Spatial resolution: 1° x 1°
  • Temporal extent: January 1998 - December 2023
  • Temporal resolution: Monthly
  • Unit: Grams of methane per square meter per year (g CH₄/m²/year)
  • Utility: Climate Research

For more, visit the CarbonTracker-CH₄ Isotopic Methane Inverse Fluxes data overview page.

Approach

  1. Identify available dates and temporal frequency of observations for the given collection using the US GHG Center (GHGC) API /stac endpoint. The collection processed in this notebook is the CarbonTracker-CH₄ Isotopic Methane Inverse Fluxes data product.
  2. Pass the STAC item into the raster API /collections/{collection_id}/items/{item_id}/{tile_matrix_set_id}/tilejson.json endpoint.
  3. Using folium.plugins.DualMap, visualize two tiles (side-by-side), allowing time point comparison.
  4. After the visualization, perform zonal statistics for a given polygon.
  5. Create a time-series analysis.

About the Data

CarbonTracker-CH₄ Isotopic Methane Inverse Fluxes

Surface methane (CH₄) emissions are derived from atmospheric measurements of methane and its ¹³C carbon isotope content. Different sources of methane contain different ratios of the two stable isotopologues, ¹²CH₄ and ¹³CH₄. This makes it possible to distinguish collocated sources of methane, say from agriculture and from oil and gas exploration, that would otherwise be indistinguishable. The National Oceanic and Atmospheric Administration (NOAA) collects whole air samples from its global cooperative network of flasks (https://gml.noaa.gov/ccgg/about.html), which are then analyzed for methane and other trace gases. A subset of those flasks are also analyzed for ¹³C of methane in collaboration with the Institute of Arctic and Alpine Research at the University of Colorado Boulder. Scientists at the National Aeronautics and Space Administration (NASA) and NOAA used those measurements of methane and ¹³C of methane, in conjunction with a model of atmospheric circulation, to estimate methane emissions separated into three source types: microbial, fossil, and pyrogenic.

For more information regarding this dataset, please visit the CarbonTracker-CH₄ Isotopic Methane Inverse Fluxes data overview page.

Terminology

Navigating data via the US GHGC API, you will encounter terminology that is different from browsing in a typical filesystem. We’ll define some terms here which are used throughout this notebook.

  • catalog: All datasets available at the /stac endpoint
  • collection: A specific dataset, e.g. CarbonTracker-CH₄ Isotopic Methane Inverse Fluxes
  • item: One data file (i.e. granule) in the dataset, e.g. one monthly file of methane inverse fluxes
  • asset: A variable available within the granule, e.g. microbial, fossil, or pyrogenic methane fluxes
  • STAC API: SpatioTemporal Asset Catalogs - Endpoint for fetching metadata about available datasets
  • Raster API: Endpoint for fetching data itself, for imagery and statistics
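As an illustrative sketch of how this terminology maps onto API paths (the endpoint, collection ID, and item ID below are the ones used later in this notebook; no request is actually made here):

```python
# Sketch: how the STAC terminology above maps onto API paths
STAC_API_URL = "https://earth.gov/ghgcenter/api/stac"

catalog_url = f"{STAC_API_URL}/collections"                         # catalog: all datasets
collection_url = f"{catalog_url}/ct-ch4-monthgrid-v2025"            # collection: one dataset
item_url = f"{collection_url}/items/ct-ch4-monthgrid-v2025-202312"  # item: one monthly granule
# assets ("microbial", "fossil", "pyrogenic") are listed inside the item's JSON under "assets"

print(item_url)
```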

Install the Required Libraries

Required libraries are pre-installed on the US GHG Center Hub. If you need to run this notebook elsewhere, please install them with this line in a code cell:

%pip install requests folium rasterstats pystac_client pandas matplotlib --quiet

# Import the following libraries
# For fetching from the Raster API
import requests
# For making maps
import folium
import folium.plugins
from folium import Map, TileLayer
# For talking to the STAC API
from pystac_client import Client
# For working with data
import pandas as pd
# For making time series
import matplotlib.pyplot as plt
# For formatting date/time data
import datetime
# Custom functions for working with GHGC data via the API
import ghgc_utils

Query the STAC API

STAC API Collection Names

Now, you must fetch the dataset from the STAC API by defining its associated STAC API collection ID as a variable. The collection ID, also known as the collection name, for the CarbonTracker-CH₄ Isotopic Methane Inverse Fluxes dataset is ct-ch4-monthgrid-v2025.

# Provide the STAC and RASTER API endpoints
# An endpoint is a location within the API that executes a request on a data collection stored on the server.

# The STAC API is a catalog of all the existing data collections that are stored in the GHG Center.
STAC_API_URL = "https://earth.gov/ghgcenter/api/stac"

# The RASTER API is used to fetch collections for visualization
RASTER_API_URL = "https://earth.gov/ghgcenter/api/raster"

# The collection name is used to fetch the dataset from the STAC API. First, we define the collection name as a variable
# Name of the collection for CarbonTracker-CH₄ Isotopic Methane Inverse Fluxes 
collection_name = "ct-ch4-monthgrid-v2025"
# Fetch the collection from the STAC API using the appropriate endpoint
# The 'pystac_client' library enables an HTTP request
catalog = Client.open(STAC_API_URL)
collection = catalog.get_collection(collection_name)

# Print the properties of the collection to the console
collection

Examining the contents of our collection under the temporal variable, we see that the data is available from January 1998 to December 2023. By looking at the dashboard:time density, we observe that the data is periodic with monthly time density.

# Uncomment the two lines below to list and count every granule in the collection (this can be slow):
# items = list(collection.get_items())
# print(f"Found {len(items)} items")

# The search function lets you search for items within a specific date/time range
search = catalog.search(
    collections=collection_name,
    datetime=['2023-01-01T00:00:00Z','2023-12-31T00:00:00Z']
)
# Take a look at the items we found
for item in search.item_collection():
    print(item)
<Item id=ct-ch4-monthgrid-v2025-202312>
<Item id=ct-ch4-monthgrid-v2025-202311>
<Item id=ct-ch4-monthgrid-v2025-202310>
<Item id=ct-ch4-monthgrid-v2025-202309>
<Item id=ct-ch4-monthgrid-v2025-202308>
<Item id=ct-ch4-monthgrid-v2025-202307>
<Item id=ct-ch4-monthgrid-v2025-202306>
<Item id=ct-ch4-monthgrid-v2025-202305>
<Item id=ct-ch4-monthgrid-v2025-202304>
<Item id=ct-ch4-monthgrid-v2025-202303>
<Item id=ct-ch4-monthgrid-v2025-202302>
<Item id=ct-ch4-monthgrid-v2025-202301>
# Examine the first item in the collection
# Keep in mind that a list starts from 0, 1, 2... therefore items[0] is referring to the first item in the list/collection
items = search.item_collection()
items[0]
# Restructure our items into a dictionary where keys are year-month strings (e.g. "2023-07")
# Then we can query more easily by date/time
items_dict = {item.properties["start_datetime"][:7]: item for item in items}
print(items_dict)
{'2023-12': <Item id=ct-ch4-monthgrid-v2025-202312>, '2023-11': <Item id=ct-ch4-monthgrid-v2025-202311>, '2023-10': <Item id=ct-ch4-monthgrid-v2025-202310>, '2023-09': <Item id=ct-ch4-monthgrid-v2025-202309>, '2023-08': <Item id=ct-ch4-monthgrid-v2025-202308>, '2023-07': <Item id=ct-ch4-monthgrid-v2025-202307>, '2023-06': <Item id=ct-ch4-monthgrid-v2025-202306>, '2023-05': <Item id=ct-ch4-monthgrid-v2025-202305>, '2023-04': <Item id=ct-ch4-monthgrid-v2025-202304>, '2023-03': <Item id=ct-ch4-monthgrid-v2025-202303>, '2023-02': <Item id=ct-ch4-monthgrid-v2025-202302>, '2023-01': <Item id=ct-ch4-monthgrid-v2025-202301>}
# Before we go further, let's pick which asset to focus on for the remainder of the notebook.
# We'll focus on microbial sources of CH4 fluxes, so our asset of interest is:
asset_name = "microbial"

Creating Maps using Folium

You will now explore changes in the microbial CH₄ flux for two different dates/times. You will visualize the outputs on a map using folium.

Fetch Imagery from Raster API

Here we get information from the Raster API, which we will add to our map in the next section.

# Specify two date/times that you would like to visualize, using the format of items_dict.keys()
dates = ["2023-07","2023-01"]

Below, we use some statistics of the raster data to set upper and lower limits for our color bar. These are saved as the rescale_values, and will be passed to the Raster API in the following step(s).

# Extract collection name and item ID for the first date
observation_date_1 = items_dict[dates[0]]
collection_id = observation_date_1.collection_id
item_id = observation_date_1.id
# Select relevant asset (microbial CH4 emissions)
object = observation_date_1.assets[asset_name]
raster_bands = object.extra_fields.get("raster:bands", [{}])
# Print the raster bands' information
raster_bands
[{'scale': 1.0,
  'offset': 0.0,
  'sampling': 'area',
  'data_type': 'float64',
  'histogram': {'max': 941.1490746355813,
   'min': 0.0,
   'count': 11,
   'buckets': [64783, 13, 0, 1, 1, 0, 0, 0, 1, 1]},
  'statistics': {'mean': 1.016579340810176,
   'stddev': 6.800038470233832,
   'maximum': 941.1490746355813,
   'minimum': 0.0,
   'valid_percent': 0.00154320987654321}}]
# Use mean, scaled stddev, and minimum to generate an appropriate color bar range.
rescale_values = {
    "max": raster_bands[0]['statistics']['mean'] + 2.5*raster_bands[0]['statistics']['stddev'],
    "min": raster_bands[0]['statistics']['minimum'],
}

print(rescale_values)
{'max': 18.016675516394756, 'min': 0.0}
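As a sanity check, the printed range can be reproduced by hand from the raster band statistics shown earlier (mean ≈ 1.0166, standard deviation ≈ 6.8000, minimum 0.0):

```python
# Reproduce the color bar range by hand from the printed band statistics
mean, stddev, minimum = 1.016579340810176, 6.800038470233832, 0.0
vmax = mean + 2.5 * stddev  # upper limit: mean plus 2.5 standard deviations
print({"max": vmax, "min": minimum})  # matches the rescale_values printed above
```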

Now, you will pass the item ID, collection name, asset name, and rescale values to the Raster API endpoint, along with a colormap. This step is done twice, once for each date/time you will visualize; it tells the Raster API which collection, item, and asset you want to view, and which colormap and color bar range to use. The API returns a JSON with information about the requested image. Each image will be referred to as a tile.

# Choose a colormap for displaying the data
# Colormap names follow Matplotlib conventions; the Raster API expects lowercase, so we lowercase the name when building the request
# For more information on Colormaps in Matplotlib, please visit https://matplotlib.org/stable/users/explain/colors/colormaps.html
color_map = "PuRd"
# Make a GET request to retrieve information for your first date/time
tile_matrix_set_id = "WebMercatorQuad"

observation_date_1_tile = requests.get(
    f"{RASTER_API_URL}/collections/{collection_id}/items/{item_id}/{tile_matrix_set_id}/tilejson.json?"
    f"&assets={asset_name}"
    f"&color_formula=gamma+r+1.05&colormap_name={color_map.lower()}"
    f"&rescale={rescale_values['min']},{rescale_values['max']}"
).json()

# Print the properties of the retrieved granule to the console
observation_date_1_tile
{'tilejson': '2.2.0',
 'version': '1.0.0',
 'scheme': 'xyz',
 'tiles': ['https://earth.gov/ghgcenter/api/raster/collections/ct-ch4-monthgrid-v2025/items/ct-ch4-monthgrid-v2025-202307/tiles/WebMercatorQuad/{z}/{x}/{y}@1x?assets=microbial&color_formula=gamma+r+1.05&colormap_name=purd&rescale=0.0%2C18.016675516394756'],
 'minzoom': 0,
 'maxzoom': 24,
 'bounds': [-180.0, -90.0, 180.0, 90.0],
 'center': [0.0, 0.0, 0]}
# Repeat the above for your second date/time
# Note that we do not calculate new rescale_values for this tile, because we want date tiles 1 and 2 to have the same colorbar range for best visual comparison.
observation_date_2 = items_dict[dates[1]]
# Extract collection name and item ID
collection_id = observation_date_2.collection_id
item_id = observation_date_2.id

observation_date_2_tile = requests.get(
    f"{RASTER_API_URL}/collections/{collection_id}/items/{item_id}/{tile_matrix_set_id}/tilejson.json?"
    f"&assets={asset_name}"
    f"&color_formula=gamma+r+1.05&colormap_name={color_map.lower()}"
    f"&rescale={rescale_values['min']},{rescale_values['max']}"
).json()

# Print the properties of the retrieved granule to the console
observation_date_2_tile
{'tilejson': '2.2.0',
 'version': '1.0.0',
 'scheme': 'xyz',
 'tiles': ['https://earth.gov/ghgcenter/api/raster/collections/ct-ch4-monthgrid-v2025/items/ct-ch4-monthgrid-v2025-202301/tiles/WebMercatorQuad/{z}/{x}/{y}@1x?assets=microbial&color_formula=gamma+r+1.05&colormap_name=purd&rescale=0.0%2C18.016675516394756'],
 'minzoom': 0,
 'maxzoom': 24,
 'bounds': [-180.0, -90.0, 180.0, 90.0],
 'center': [0.0, 0.0, 0]}

Generate Map

First, we’ll define the Area of Interest (AOI) as a GEOJSON. This will be visualized as a filled polygon on the map.

# The AOI is currently set to Eastern Canada, North America.
aoi = {
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "properties": {},
      "geometry": {
        "coordinates": [
          [
            # [longitude, latitude]
            [-106.81091327586626,58.13717287115446],  # Northwest Bounding Coordinate
            [-106.81091327586626,46.689085955377266], # Southwest Bounding Coordinate
            [-84.5565048510494,46.689085955377266],   # Southeast Bounding Coordinate
            [-84.5565048510494,58.13717287115446],    # Northeast Bounding Coordinate
            [-106.81091327586626,58.13717287115446]   # Closing the polygon at the Northwest Bounding Coordinate
          ]
        ],
        "type": "Polygon"
      }
    }
  ]
}

We will use the DualMap format from folium to visualize our two dates side-by-side for the area of interest. Below we add the desired layers to our map and add markers to identify the date/times shown.

# Initialize the map, specifying the center of the map and the starting zoom level.
# 'folium.plugins' allows mapping side-by-side via 'DualMap'
# Map is centered on the position specified by "location=(lat,lon)"
map_ = folium.plugins.DualMap(location=(52, -95.3), zoom_start=4)

# Define the first map layer using the tile fetched for the first date
# The TileLayer library helps in manipulating and displaying raster layers on a map
map_layer_observation_date_1 = TileLayer(
    tiles=observation_date_1_tile["tiles"][0], # Path to retrieve the tile
    attr="US GHG Center", # Set the attribution
    opacity=0.8, # Adjust the transparency of the layer
    name=f"{dates[0]} {items[0].assets['microbial'].title}",
    overlay=True,
)
# Add the first layer to the Dual Map
# This will appear on the left side, specified by 'm1'
map_layer_observation_date_1.add_to(map_.m1)


# Define the second map layer using the tile fetched for the second date
map_layer_observation_date_2 = TileLayer(
    tiles=observation_date_2_tile["tiles"][0], # Path to retrieve the tile
    attr="US GHG Center", # Set the attribution
    opacity=0.8, # Adjust the transparency of the layer
    name=f"{dates[1]} {items[0].assets['microbial'].title}",
    overlay=True,
)
# Add the second layer to the Dual Map
# This will appear on the right side, specified by 'm2'
map_layer_observation_date_2.add_to(map_.m2)

# Display AOI on both maps
folium.GeoJson(aoi, name="Eastern Canada, North America").add_to(map_)

# Add a layer control to switch between map layers
folium.LayerControl(collapsed=False).add_to(map_)

# Add colorbar
# We can use 'generate_html_colorbar' from the 'ghgc_utils' module 
# to create an HTML colorbar representation.
legend_html = ghgc_utils.generate_html_colorbar(color_map,rescale_values,label='g CH4/m2/year',dark=True)

# Add colorbar to the map
map_.get_root().html.add_child(folium.Element(legend_html))

# Visualize the Dual Map
map_

This visualization illustrates the difference in microbial activity between warm and cold conditions, a contrast that is most pronounced at high latitudes.

Calculate Zonal Statistics

To perform zonal statistics, first we need to create a polygon. In this use case we are creating a polygon in Eastern Canada, North America.

# Give your AOI a name to be used in your time series plot later on.
aoi_name='Eastern Canada'
# This AOI is defined as a GEOJSON.
aoi = {
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "properties": {},
      "geometry": {
        "coordinates": [
          [
            # [longitude, latitude]
            [-106.81091327586626,58.13717287115446],  # Northwest Bounding Coordinate
            [-106.81091327586626,46.689085955377266], # Southwest Bounding Coordinate
            [-84.5565048510494,46.689085955377266],   # Southeast Bounding Coordinate
            [-84.5565048510494,58.13717287115446],    # Northeast Bounding Coordinate
            [-106.81091327586626,58.13717287115446]   # Closing the polygon at the Northwest Bounding Coordinate
          ]
        ],
        "type": "Polygon"
      }
    }
  ]
}
# Quick Folium map to visualize this AOI
map_ = folium.Map(location=(53, -96.5), zoom_start=3)
# Add AOI to map
folium.GeoJson(aoi, name=aoi_name).add_to(map_)
# Add data layer to visualize number of grid cells within AOI
map_layer_observation_date_1.add_to(map_)
# Add a quick colorbar
legend_html = ghgc_utils.generate_html_colorbar(color_map,rescale_values,label='g CH4/m2/year',dark=True)
map_.get_root().html.add_child(folium.Element(legend_html))
map_

Generate the statistics for the AOI using a function from the ghgc_utils module, which fetches the data and its statistics from the Raster API.

%%time

# Statistics will be returned as a Pandas DataFrame
df = ghgc_utils.generate_stats(items,aoi,url=RASTER_API_URL,asset=asset_name)
Generating stats...
Done!
CPU times: total: 125 ms
Wall time: 5.85 s
# Take a look at the first 12 rows of our statistics DataFrame
df.head(12)
datetime min max mean count sum std median majority minority unique histogram valid_percent masked_pixels valid_pixels percentile_2 percentile_98 date
0 2023-12-01T00:00:00Z 0.00000000000000000000 3.62569945866464937723 0.21689204032833558911 259.83999633789062500000 56.35722696463234626663 0.34170620609935964396 0.08315918546211259477 0.00000000000000000000 0.00000000020436660041 284.00000000000000000000 [[243, 27, 18, 6, 2, 1, 1, 0, 0, 1], [0.0, 0.3... 100.00000000000000000000 0.00000000000000000000 299.00000000000000000000 0.00000000000000000000 1.23969260611512033243 2023-12-01 00:00:00+00:00
1 2023-11-01T00:00:00Z 0.00000000000000000000 9.65621520252479115243 1.91368808162943437878 259.83999633789062500000 497.25270412245714624078 1.96920024583056751943 1.18970362699910436888 0.00000000000000000000 0.00000000020158782885 284.00000000000000000000 [[133, 64, 36, 24, 15, 9, 4, 2, 9, 3], [0.0, 0... 100.00000000000000000000 0.00000000000000000000 299.00000000000000000000 0.00000000000000000000 8.24678021895863899715 2023-11-01 00:00:00+00:00
2 2023-10-01T00:00:00Z 0.00000000000000000000 21.22260689968696922847 4.38855340187157416665 259.83999633789062500000 1140.32169987094721363974 4.32595241039683742201 3.10408787696688825974 0.00000000000000000000 0.00000000019523825532 284.00000000000000000000 [[117, 74, 41, 24, 14, 8, 7, 6, 4, 4], [0.0, 2... 100.00000000000000000000 0.00000000000000000000 299.00000000000000000000 0.00000000000000000000 17.76963466216665921138 2023-10-01 00:00:00+00:00
3 2023-09-01T00:00:00Z 0.00000000000000000000 32.88236327250060497818 6.85258087224398693138 259.83999633789062500000 1780.57458874897702116868 6.26483652095947096683 5.31746358790270434724 0.00000000000000000000 0.00000000020564613451 284.00000000000000000000 [[99, 87, 49, 22, 22, 3, 3, 10, 3, 1], [0.0, 3... 100.00000000000000000000 0.00000000000000000000 299.00000000000000000000 0.00000000000000000000 25.98792068057545634474 2023-09-01 00:00:00+00:00
4 2023-08-01T00:00:00Z 0.00000000000000000000 31.29306962914854750579 8.84035730314411871689 259.83999633789062500000 2297.07840927461256796960 6.98330592492334290000 7.22081945945246772567 0.00000000000000000000 0.00000000023240369398 284.00000000000000000000 [[68, 63, 60, 43, 13, 27, 11, 4, 5, 5], [0.0, ... 100.00000000000000000000 0.00000000000000000000 299.00000000000000000000 0.00000000000000000000 27.89952789301664992649 2023-08-01 00:00:00+00:00
5 2023-07-01T00:00:00Z 0.00000000000000000000 19.33305170554107732528 5.20216520569034823751 259.83999633789062500000 1351.73058799568207177799 3.84207466251003859625 4.44568954508543523474 0.00000000000000000000 0.00000000020965566769 284.00000000000000000000 [[68, 62, 62, 36, 31, 15, 12, 9, 3, 1], [0.0, ... 100.00000000000000000000 0.00000000000000000000 299.00000000000000000000 0.00000000000000000000 14.71027508313649789784 2023-07-01 00:00:00+00:00
6 2023-06-01T00:00:00Z 0.00000000000000000000 20.29374180826931706179 5.86151205669399111287 259.83999633789062500000 1523.05527134586850479536 4.04534474195380155948 5.54409114110687628596 0.00000000000000000000 0.00000000021718600980 284.00000000000000000000 [[59, 63, 51, 56, 27, 15, 19, 5, 2, 2], [0.0, ... 100.00000000000000000000 0.00000000000000000000 299.00000000000000000000 0.00000000000000000000 15.80983199938553696029 2023-06-01 00:00:00+00:00
7 2023-05-01T00:00:00Z 0.00000000000000000000 9.28279604183251016991 1.83671959966837761158 259.83999633789062500000 477.25321405156319087837 1.71020118192042680505 1.45776437738241293474 0.00000000000000000000 0.00000000021395661339 284.00000000000000000000 [[119, 71, 37, 21, 26, 8, 13, 2, 1, 1], [0.0, ... 100.00000000000000000000 0.00000000000000000000 299.00000000000000000000 0.00000000000000000000 5.95586946832760411041 2023-05-01 00:00:00+00:00
8 2023-04-01T00:00:00Z 0.00000000000000000000 5.28129444682195359206 0.58634228746183969516 259.83999633789062500000 152.35517782683484711015 0.79729049143525698717 0.19628264909060727517 0.00000000000000000000 0.00000000021424270800 284.00000000000000000000 [[186, 55, 21, 12, 8, 9, 5, 2, 0, 1], [0.0, 0.... 100.00000000000000000000 0.00000000000000000000 299.00000000000000000000 0.00000000000000000000 3.05290233903151575490 2023-04-01 00:00:00+00:00
9 2023-03-01T00:00:00Z 0.00000000000000000000 2.17215021499507754399 0.18548663673365564653 259.83999633789062500000 48.19684700960073087117 0.27586226984231487780 0.07265506670182525495 0.00000000000000000000 0.00000000020481905131 284.00000000000000000000 [[209, 51, 24, 8, 3, 1, 0, 1, 1, 1], [0.0, 0.2... 100.00000000000000000000 0.00000000000000000000 299.00000000000000000000 0.00000000000000000000 0.88559581720632785107 2023-03-01 00:00:00+00:00
10 2023-02-01T00:00:00Z 0.00000000000000000000 2.03172278773908532301 0.13550572716740108548 259.83999633789062500000 35.20980765094070363830 0.22818084809979991001 0.06501649157463539053 0.00000000000000000000 0.00000000021514247616 284.00000000000000000000 [[233, 44, 13, 3, 3, 0, 0, 1, 1, 1], [0.0, 0.2... 100.00000000000000000000 0.00000000000000000000 299.00000000000000000000 0.00000000000000000000 0.81772193963552808338 2023-02-01 00:00:00+00:00
11 2023-01-01T00:00:00Z 0.00000000000000000000 1.94565681939492862718 0.13461782914320416515 259.83999633789062500000 34.97909623158495406869 0.22468925646561960230 0.06139719731702760613 0.00000000000000000000 0.00000000019794429308 284.00000000000000000000 [[228, 46, 13, 7, 1, 0, 0, 2, 1, 1], [0.0, 0.1... 100.00000000000000000000 0.00000000000000000000 299.00000000000000000000 0.00000000000000000000 0.77574581947777010438 2023-01-01 00:00:00+00:00

Time-Series Analysis

Let’s look at the mean methane flux within our AOI over the course of a year. The code below generates a time series plot from the table above.

# Figure size: 20 representing the width, 10 representing the height
df = df.sort_values(by="datetime")
fig = plt.figure(figsize=(10,5))

# Change 'which_stat' below if you would rather look at a different statistic, like minimum or maximum.
which_stat = "mean"

plt.plot(
    [d[0:7] for d in df["datetime"]], # X-axis: sorted datetime
    df[which_stat], # Y-axis: the chosen statistic of CH4 flux
    color="red", # Line color
    linestyle="-", # Line style
    linewidth=2, # Line width
    label=f"{items[0].assets[asset_name].title}", # Legend label
)

# Display legend
plt.legend()

# Insert label for the X-axis
plt.xlabel("Month")

# Insert label for the Y-axis
plt.ylabel("g CH₄/m²/year")
plt.xticks(rotation = 90)

# Label every other month to keep the x-axis readable
plt.xticks([d[0:7] for d in df["datetime"]][::2])

# Insert title for the plot
plt.title(f"{which_stat.capitalize()} Monthly Flux from {items[0].assets[asset_name].title} sources within {aoi_name} AOI (2023)")

# Add data citation
plt.text(
    [d[0:7] for d in df['datetime']][0],           # X-coordinate of the text
    df[which_stat].min(),                  # Y-coordinate of the text

    # Text to be displayed
    f"Source: {collection.title}",                  
    fontsize=8,                             # Font size
    horizontalalignment="left",              # Horizontal alignment
    verticalalignment="top",                 # Vertical alignment
    color="blue",                            # Text color
)


# Plot the time series
plt.show()

This is a visual representation of the seasonal cycle of microbial methane emissions in Eastern Canada over the span of one year.
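To pull the peak of that seasonal cycle out of the table programmatically, one could find the month with the largest mean flux. A minimal sketch, using mean values copied (rounded) from the zonal-statistics DataFrame printed above:

```python
import pandas as pd

# Monthly mean microbial flux (g CH4/m2/year), rounded, copied from the
# zonal-statistics DataFrame printed above
means = {
    "2023-01": 0.1346, "2023-02": 0.1355, "2023-03": 0.1855, "2023-04": 0.5863,
    "2023-05": 1.8367, "2023-06": 5.8615, "2023-07": 5.2022, "2023-08": 8.8404,
    "2023-09": 6.8526, "2023-10": 4.3886, "2023-11": 1.9137, "2023-12": 0.2169,
}
series = pd.Series(means)
print(series.idxmax())  # prints "2023-08", the peak of the seasonal cycle
```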

Summary

In this notebook we have successfully completed the following steps for the CarbonTracker-CH₄ Isotopic CH₄ Flux Estimates dataset:

  1. Install and import the necessary libraries
  2. Fetch the collection from STAC using the appropriate endpoints
  3. Count the number of existing files (granules) within the collection
  4. Map and compare the CH₄ inverse fluxes over an area of interest for two distinctive dates/times
  5. Generate zonal statistics for an area of interest (AOI)
  6. Generate a time-series graph of the CH₄ inverse fluxes for a specified region

If you have any questions regarding this user notebook, please contact us using the feedback form.

Back to top