# Import the following libraries
import requests
import folium
import folium.plugins
from folium import Map, TileLayer
from pystac_client import Client
import branca
import pandas as pd
import matplotlib.pyplot as plt
EMIT Methane Point Source Plume Complexes

Approach

- Identify available dates and temporal frequency of observations for the given collection using the GHGC API /stac endpoint. The collection processed in this notebook is the Earth Surface Mineral Dust Source Investigation (EMIT) methane emission plumes data product.
- Pass the STAC item into the raster API /stac/tilejson.json endpoint.
- Using folium.Map, visualize the plumes.
- After the visualization, perform zonal statistics for a given polygon.
About the Data
The Earth Surface Mineral Dust Source Investigation (EMIT) instrument builds upon NASA’s long history of developing advanced imaging spectrometers for new science and applications. EMIT launched to the International Space Station (ISS) on July 14, 2022. The data shows high-confidence, research-grade methane plumes from point source emitters - updated as they are identified - in keeping with Jet Propulsion Laboratory (JPL) Open Science and Open Data policy. For more information regarding this dataset, please visit the EMIT Methane Point Source Plume Complexes data overview page.
Install the Required Libraries
Required libraries are pre-installed on the GHG Center Hub. If you need to run this notebook elsewhere, please install them with this line in a code cell:
%pip install requests folium rasterstats pystac_client pandas matplotlib --quiet
Querying the STAC API
First, we are going to import the required libraries. Once imported, they allow us to execute queries against the GHG Center Spatio Temporal Asset Catalog (STAC) Application Programming Interface (API), where the granules for this collection are stored.
# Provide the STAC and RASTER API endpoints
# An endpoint refers to a location within the API that executes a request on a data collection nested on the server.
# The STAC API is a catalog of all the existing data collections that are stored in the GHG Center.
STAC_API_URL = "http://ghg.center/api/stac"

# The RASTER API is used to fetch collections for visualization
RASTER_API_URL = "https://ghg.center/api/raster"
# The collection name is used to fetch the dataset from the STAC API. First, we define the collection name as a variable
# Name of the collection for methane emission plumes
collection_name = "emit-ch4plume-v1"
# Fetch the collection from the STAC API using the appropriate endpoint
# The 'requests' library makes the HTTP request possible
collection = requests.get(f"{STAC_API_URL}/collections/{collection_name}").json()

# Print the properties of the collection to the console
collection
Examining the contents of our collection under the temporal variable, we note that data is available from August 2022 to May 2023. By looking at the dashboard time density, we can see that observations are conducted daily and non-periodically (i.e., there are plume emissions for multiple places on the same dates).
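The temporal extent lives in the collection document itself. As a minimal sketch, the snippet below parses a sample dict that mirrors the STAC collection structure (the `interval` values here are illustrative placeholders; the real ones come from the `collection` response fetched above):

```python
# Read the temporal extent from a STAC collection document.
# The sample dict below is illustrative; real values come from
# requests.get(f"{STAC_API_URL}/collections/{collection_name}").json()
from datetime import datetime

sample_collection = {
    "extent": {
        "temporal": {
            # STAC collections store [start, end] ISO-8601 intervals;
            # a null end would mean the collection is still being updated.
            "interval": [["2022-08-10T00:00:00Z", "2023-05-22T00:00:00Z"]]
        }
    }
}

start, end = sample_collection["extent"]["temporal"]["interval"][0]
start_dt = datetime.fromisoformat(start.replace("Z", "+00:00"))
end_dt = datetime.fromisoformat(end.replace("Z", "+00:00"))
print(f"Data available from {start_dt:%B %Y} to {end_dt:%B %Y}")
```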
def get_item_count(collection_id):
    count = 0
    items_url = f"{STAC_API_URL}/collections/{collection_id}/items"

    while True:
        response = requests.get(items_url)

        if not response.ok:
            print("error getting items")
            exit()

        stac = response.json()
        count += int(stac["context"].get("returned", 0))
        next_links = [link for link in stac["links"] if link["rel"] == "next"]

        if not next_links:
            break
        items_url = next_links[0]["href"]

    return count
# Check total number of items available
number_of_items = get_item_count(collection_name)
items = requests.get(f"{STAC_API_URL}/collections/{collection_name}/items?limit={number_of_items}").json()["features"]
print(f"Found {len(items)} items")
# Examining the first item in the collection
items[0]
Below, we are entering the minimum and maximum values to provide our upper and lower bounds in rescale_values.
Exploring Methane Emission Plumes (CH₄) using the Raster API
In this notebook, we will explore global methane emission plumes from point sources. We will visualize the outputs on a map using folium.
# To access the year value from each item more easily, this will let us query more explicitly by year and month (e.g., 2020-02)
items = {item["id"][20:]: item for item in items}
asset_name = "ch4-plume-emissions"
# Fetching the min and max values for a specific item
rescale_values = {
    "max": items[list(items.keys())[0]]["assets"][asset_name]["raster:bands"][0]["histogram"]["max"],
    "min": items[list(items.keys())[0]]["assets"][asset_name]["raster:bands"][0]["histogram"]["min"],
}
Now we will pass the item id, collection name, and rescale_values to the Raster API endpoint. We will do this for only one item so that we can visualize the event.
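The request below assembles the tile URL with chained f-strings. Equivalently, the query string can be built with `urllib.parse.urlencode`, which URL-encodes parameter values automatically. This is only a sketch using sample values from this notebook; the `rescale` bounds are placeholders rather than the real histogram min/max:

```python
# Assemble the same tilejson query with urlencode, which handles escaping.
# Parameter names mirror the f-string version used in this notebook.
from urllib.parse import urlencode

RASTER_API_URL = "https://ghg.center/api/raster"  # as defined earlier
params = {
    "collection": "emit-ch4plume-v1",
    "item": "20230418T200118_000829",  # sample item ID used in this notebook
    "assets": "ch4-plume-emissions",
    "color_formula": "gamma r 1.05",   # spaces become '+' when encoded
    "colormap_name": "magma",
    "rescale": "0,1500",               # placeholder min,max bounds
}
tile_url = f"{RASTER_API_URL}/stac/tilejson.json?{urlencode(params)}"
print(tile_url)
```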
# Select the item ID which you want to visualize. The item ID is in the format yyyymmdd followed by the timestamp. This ID can be extracted from the COG name as well.
item_id = "20230418T200118_000829"
color_map = "magma"
methane_plume_tile = requests.get(
    f"{RASTER_API_URL}/stac/tilejson.json?collection={items[item_id]['collection']}&item={items[item_id]['id']}"
    f"&assets={asset_name}"
    f"&color_formula=gamma+r+1.05&colormap_name={color_map}"
    f"&rescale={rescale_values['min']},{rescale_values['max']}",
).json()
methane_plume_tile
Visualizing CH₄ Emission Plume
# Set the initial zoom and center of the map for the plume layer
map_ = folium.Map(location=(methane_plume_tile["center"][1], methane_plume_tile["center"][0]), zoom_start=13)

map_layer = TileLayer(
    tiles=methane_plume_tile["tiles"][0],
    attr="GHG",
    opacity=1,
)
map_layer.add_to(map_)

# Visualizing the map
map_
Calculating Zonal Statistics
To perform zonal statistics, first we need to create a polygon. In this use case we will create a polygon around the plume.
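Conceptually, zonal statistics reduce the raster pixels that fall inside a polygon to summary values (min, max, mean). The notebook delegates this to the raster API, but the idea can be sketched locally with a tiny in-memory "raster" and a boolean mask standing in for the polygon test (both arrays below are made up for illustration):

```python
# Conceptual sketch of zonal statistics: aggregate only the pixels
# whose centers fall inside the area of interest (AOI).
from statistics import mean

raster = [  # fake 3x3 plume enhancement values
    [10.0, 12.0,  0.0],
    [ 9.0, 15.0,  1.0],
    [ 0.0, 14.0, 11.0],
]
inside = [  # True where the pixel center lies inside the AOI polygon
    [True,  True,  False],
    [True,  True,  False],
    [False, True,  True ],
]

pixels = [
    raster[r][c]
    for r in range(len(raster))
    for c in range(len(raster[r]))
    if inside[r][c]
]
zonal = {"min": min(pixels), "max": max(pixels), "mean": mean(pixels)}
print(zonal)
```

The raster API's /cog/statistics endpoint (and the rasterstats package installed earlier) perform this same masking and aggregation against the actual cloud-optimized GeoTIFF.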
# Plume AOI
plumes_coordinates = items[item_id]["geometry"]["coordinates"]
methane_plume_aoi = {
    "type": "Feature",
    "properties": {},
    "geometry": {
        "coordinates": plumes_coordinates,
        "type": "Polygon",
    },
}
# We'll plug in the coordinates for a location
# central to the study area and a reasonable zoom level
region_name = "Place_Holder"  # please put the name of the place you are trying to visualize
aoi_map = Map(
    tiles="OpenStreetMap",
    location=[
        plumes_coordinates[0][0][1],
        plumes_coordinates[0][0][0],
    ],
    zoom_start=12,
)
folium.GeoJson(methane_plume_aoi, name=region_name).add_to(aoi_map)
aoi_map
# Check total number of items available
items = requests.get(
    f"{STAC_API_URL}/collections/{collection_name}/items?limit={number_of_items}"
).json()["features"]
print(f"Found {len(items)} items")
# Explore the first item
items[0]
# The bounding box should be passed to the geojson param as a geojson Feature or FeatureCollection
def generate_stats(item, geojson):
    result = requests.post(
        f"{RASTER_API_URL}/cog/statistics",
        params={"url": item["assets"][asset_name]["href"]},
        json=geojson,
    ).json()
    print(result)
    return {
        **result["properties"],
        "item_id": item["id"][20:],
    }

for item in items:
    print(item["id"])
    break
With the function above, we can generate the statistics for the area of interest.
%%time
stats = [generate_stats(item, methane_plume_aoi) for item in items]
stats = [stat for stat in stats if stat["statistics"]["b1"]["mean"] is not None]
stats
def clean_stats(stats_json) -> pd.DataFrame:
    df = pd.json_normalize(stats_json)
    df.columns = [col.replace("statistics.b1.", "") for col in df.columns]
    # df["date"] = pd.to_datetime(df["datetime"])
    return df

df = clean_stats(stats)
df
plume_tile_2 = requests.get(
    f"{RASTER_API_URL}/stac/tilejson.json?collection={items[0]['collection']}&item={items[0]['id']}"
    f"&assets={asset_name}"
    f"&color_formula=gamma+r+1.05&colormap_name={color_map}"
    f"&rescale={rescale_values['min']},{rescale_values['max']}",
).json()
plume_tile_2
# Use bbox initial zoom and map
# Set up a map located w/in event bounds
plume_tile_2_coordinates = items[0]["geometry"]["coordinates"]
aoi_map_bbox = Map(
    tiles="OpenStreetMap",
    location=[
        plume_tile_2_coordinates[0][0][1],
        plume_tile_2_coordinates[0][0][0],
    ],
    zoom_start=10,
)
map_layer = TileLayer(
    tiles=plume_tile_2["tiles"][0],
    attr="GHG",
    opacity=1,
)
map_layer.add_to(aoi_map_bbox)
aoi_map_bbox
Summary

In this notebook we have successfully completed the following steps for the STAC collection for the EMIT Methane Point Source Plume Complexes dataset:

1. Install and import the necessary libraries
2. Fetch the collection from STAC collections using the appropriate endpoints
3. Count the number of existing granules within the collection
4. Map the methane emission plumes
5. Generate statistics for the area of interest (AOI)
If you have any questions regarding this user notebook, please contact us using the feedback form.