Week 4B: Working with Raster Data

Sep 24, 2020

Housekeeping

  • Homework #2 due today
  • Homework #3 due on Thursday (10/8)
    • Part 1: Working with eviction data in Philadelphia
    • Part 2: Exploring the NDVI in Philadelphia
  • Come to office hours!
    • Tuesday (Nick) 7:30am-9am and 6-7:30pm (Philadelphia time)
    • Thursday (Eugene) 10:30am - 12:30pm (Phila time)
    • Sign up for time slots on Canvas calendar

Follow along in lecture on Binder!

All of the lecture slides will run on Binder, the data is available to load, etc. Binder has an exact copy of the file structure in the GitHub repo.

https://github.com/MUSA-550-Fall-2020/week-4


Last time: A set of coordinated visualization libraries in Python

hvplot

  • Quickly generate interactive plots from your data
  • Seamlessly handles pandas and geopandas data
  • Relies on Holoviews and Geoviews under the hood
  • Fancy, magic widgets for grouping charts by a third column
  • Great built-in interaction toolbar powered by bokeh: box select, lasso, pan, zoom, etc...

An interface just like the pandas plot() function, but much more useful.
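
For example, a minimal sketch of the parallel interface (the DataFrame here is made up for illustration):

import pandas as pd
import hvplot.pandas  # adds the .hvplot accessor to DataFrames

df = pd.DataFrame({"x": range(10), "y": [i ** 2 for i in range(10)]})

df.plot(x="x", y="y")    # static matplotlib chart
df.hvplot(x="x", y="y")  # interactive bokeh chart, same calling convention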

In [1]:
# Our usual imports
import pandas as pd
import geopandas as gpd
import numpy as np
from matplotlib import pyplot as plt
%matplotlib inline
In [2]:
# This will add the .hvplot() function to your DataFrame!
import hvplot.pandas

# This registers "bokeh" as the desired backend for the interactive plots
import holoviews as hv
hv.extension("bokeh")

Last time: hvplot has great support for geopandas

Example: Interactive choropleth of median property assessments in Philadelphia

In [3]:
# Load the data
url = "https://raw.githubusercontent.com/MUSA-550-Fall-2020/week-3/master/data/opa_residential.csv"
data = pd.read_csv(url)

# Create the Point() objects
data['geometry'] = gpd.points_from_xy(data['lng'], data['lat'])

# Create the GeoDataFrame
data = gpd.GeoDataFrame(data, geometry='geometry', crs="EPSG:4326")
In [4]:
# load the Zillow data from GitHub
url = "https://raw.githubusercontent.com/MUSA-550-Fall-2020/week-3/master/data/zillow_neighborhoods.geojson"
zillow = gpd.read_file(url)
In [5]:
# Important: Make sure the CRS match
data = data.to_crs(zillow.crs)

# perform the spatial join
data = gpd.sjoin(data, zillow, op='within', how='left')
In [6]:
# Calculate the median market value per Zillow neighborhood
median_values = data.groupby("ZillowName", as_index=False)["market_value"].median()

# Merge median values with the Zillow geometries
median_values = zillow.merge(median_values, on="ZillowName")
print(type(median_values))
<class 'geopandas.geodataframe.GeoDataFrame'>
In [7]:
median_values.head()
Out[7]:
ZillowName geometry market_value
0 Academy Gardens POLYGON ((-74.99851 40.06435, -74.99456 40.061... 185950.0
1 Allegheny West POLYGON ((-75.16592 40.00327, -75.16596 40.003... 34750.0
2 Andorra POLYGON ((-75.22463 40.06686, -75.22588 40.065... 251900.0
3 Aston Woodbridge POLYGON ((-75.00860 40.05369, -75.00861 40.053... 183800.0
4 Bartram Village POLYGON ((-75.20733 39.93350, -75.20733 39.933... 48300.0

Take advantage of GeoViews

Let's add a tile source underneath the choropleth map

In [8]:
import geoviews as gv
import geoviews.tile_sources as gvts
In [9]:
%%opts WMTS [width=800, height=800, xaxis=None, yaxis=None]

choro = median_values.hvplot(c='market_value', 
                             width=500, 
                             height=400, 
                             alpha=0.5, 
                             geo=True,
                             cmap='viridis', 
                             hover_cols=['ZillowName'])
gvts.ESRI * choro
Out[9]:

hvplot also works with raster data

  • So far we've been working mainly with vector data using geopandas: lines, points, polygons
  • The basemaps we've been adding are an example of raster data
  • Raster data:
    • Gridded or pixelated
    • Maps easily to an array — it's the representation for digital images (see the sketch below)
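
For instance, here is a minimal sketch of a single-band raster as a plain numpy array (the values are made up):

import numpy as np

# A tiny 3x3 single-band "raster": each entry is one pixel's value, and
# the row/column position encodes where that pixel sits on the grid
raster = np.array([
    [10, 12, 11],
    [13, 15, 14],
    [11, 13, 12],
])
print(raster.shape)  # (3, 3): 3 rows by 3 columns of pixels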

Continuous raster examples

  • Multispectral satellite imagery, e.g., the Landsat satellite
  • Digital Elevation Models (DEMs)
  • Maps of canopy height derived from LiDAR data.

Categorical raster examples

  • Land cover or land-use maps.
  • Tree height maps classified as short, medium, and tall trees.
  • Snowcover masks (binary snow or no snow)

Important attributes of raster data

1. The coordinate reference system


2. Resolution

The spatial distance that a single pixel covers on the ground.


3. Extent

The bounding box that covers the entire extent of the raster data.

4. Affine transformation

Pixel space --> real space

  • A mapping from pixel space (rows and columns of the image) to the x and y coordinates defined by the CRS
  • Typically a six-parameter matrix defining the origin, pixel resolution, and rotation of the raster image (see the sketch below)
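
As a minimal sketch of how this works (using the affine package that rasterio relies on; the numbers are rounded from the Landsat file we'll open below):

from affine import Affine

# An illustrative transform: 30m square pixels, top-left corner at
# x=476064.4, y=4443066.9, and no rotation. The 30.0 and -30.0 entries
# encode the pixel resolution (y decreases as the row index increases).
transform = Affine(30.0, 0.0, 476064.4,
                   0.0, -30.0, 4443066.9)

# Pixel space --> real space: the convention is (column, row) -> (x, y)
x, y = transform * (0, 0)
print(x, y)  # 476064.4 4443066.9, the top-left corner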

Multi-band raster data

Each band measures the light reflected from different parts of the electromagnetic spectrum.

Color images are multi-band rasters!

The raster format: GeoTIFF

  • A standard .tif image format with additional spatial information embedded in the file, which includes metadata for:
    • Geotransform, e.g., extent, resolution
    • Coordinate Reference System (CRS)
    • Values that represent missing data (NoDataValue)

Tools for raster data

  • Lots of different options: really just working with arrays
  • One of the first options: Geospatial Data Abstraction Library (GDAL)
    • Low level and not really user-friendly
  • A more recent, much more Pythonic package: rasterio

We'll use rasterio for the majority of our raster analysis

Getting started with rasterio

Analysis is built around the "open()" command which returns a "dataset" or "file handle"

In [10]:
import rasterio as rio
In [11]:
# Open the file and get a file "handle"
landsat = rio.open("./data/landsat8_philly.tif")
landsat
Out[11]:
<open DatasetReader name='./data/landsat8_philly.tif' mode='r'>

Let's check out the metadata

In [12]:
# The CRS
landsat.crs
Out[12]:
CRS.from_epsg(32618)
In [13]:
# The bounds
landsat.bounds
Out[13]:
BoundingBox(left=476064.3596176505, bottom=4413096.927074196, right=503754.3596176505, top=4443066.927074196)
In [14]:
# The number of bands available
landsat.count
Out[14]:
10
In [15]:
# The band numbers that are available
landsat.indexes
Out[15]:
(1, 2, 3, 4, 5, 6, 7, 8, 9, 10)
In [16]:
# Number of pixels in the x and y directions
landsat.shape
Out[16]:
(999, 923)
In [17]:
# The 6 parameters that map from pixel to real space
landsat.transform
Out[17]:
Affine(30.0, 0.0, 476064.3596176505,
       0.0, -30.0, 4443066.927074196)
In [18]:
# All of the meta data
landsat.meta
Out[18]:
{'driver': 'GTiff',
 'dtype': 'uint16',
 'nodata': None,
 'width': 923,
 'height': 999,
 'count': 10,
 'crs': CRS.from_epsg(32618),
 'transform': Affine(30.0, 0.0, 476064.3596176505,
        0.0, -30.0, 4443066.927074196)}

Reading data from a file

Tell the file which band to read, starting with 1

In [19]:
# This is just a numpy array
data = landsat.read(1)
data
Out[19]:
array([[10901, 10618, 10751, ..., 12145, 11540, 14954],
       [11602, 10718, 10546, ..., 11872, 11982, 12888],
       [10975, 10384, 10357, ..., 11544, 12318, 12456],
       ...,
       [12281, 12117, 12072, ..., 11412, 11724, 11088],
       [12747, 11866, 11587, ..., 11558, 12028, 10605],
       [11791, 11677, 10656, ..., 10615, 11557, 11137]], dtype=uint16)

We can plot it with matplotlib's imshow

In [20]:
fig, ax = plt.subplots(figsize=(10,10))

img = ax.imshow(data)

plt.colorbar(img)
Out[20]:
<matplotlib.colorbar.Colorbar at 0x7ff7c0371390>

Two improvements

  • A log-scale colorbar
  • Add the axes extent

Adding the axes extent

Note that imshow needs a bounding box of the form: (left, right, bottom, top).

See the documentation for imshow if you forget the right ordering...(I often do!)

Using a log-scale color bar

Matplotlib has a built in log normalization...

In [21]:
import matplotlib.colors as mcolors
In [22]:
# Initialize
fig, ax = plt.subplots(figsize=(10, 10))

# Plot the image
img = ax.imshow(
    data,
    norm=mcolors.LogNorm(),  # Use a log colorbar scale
    extent=[  # Set the extent of the images
        landsat.bounds.left,
        landsat.bounds.right,
        landsat.bounds.bottom,
        landsat.bounds.top,
    ],
)

# Add a colorbar
plt.colorbar(img)
Out[22]:
<matplotlib.colorbar.Colorbar at 0x7ff7adfd4890>

Overlaying vector geometries on raster plots

Two requirements:

  1. The CRS of the two datasets must match
  2. The extent of the imshow plot must be set properly

Let's add the city limits

In [23]:
city_limits = gpd.read_file("./data/City_Limits.geojson")
In [24]:
# Convert to the correct CRS!
print(landsat.crs.data)
{'init': 'epsg:32618'}
In [25]:
city_limits = city_limits.to_crs(landsat.crs.data['init'])
In [26]:
# Initialize
fig, ax = plt.subplots(figsize=(10, 10))

# The extent of the data
landsat_extent = [
    landsat.bounds.left,
    landsat.bounds.right,
    landsat.bounds.bottom,
    landsat.bounds.top,
]

# Plot!
img = ax.imshow(data, 
                norm=mcolors.LogNorm(), 
                extent=landsat_extent)

# Add the city limits
city_limits.plot(ax=ax, facecolor="none", edgecolor="white")

# Add a colorbar and turn off axis lines
plt.colorbar(img)
ax.set_axis_off()

What band did we plot??

Landsat 8 has 11 bands spanning the electromagnetic spectrum from visible to infrared.

How about a typical RGB color image?

First let's read the red, green, and blue bands (bands 4, 3, and 2):

Note

The .indexes attributes tells us the bands available to be read.

Important: they are not zero-indexed!

In [27]:
landsat.indexes
Out[27]:
(1, 2, 3, 4, 5, 6, 7, 8, 9, 10)
In [28]:
rgb_data = landsat.read([4, 3, 2])
In [29]:
rgb_data.shape 
Out[29]:
(3, 999, 923)

Note the data has the shape: (bands, height, width)

In [30]:
fig, axs = plt.subplots(nrows=1, ncols=3, figsize=(12, 6))

cmaps = ["Reds", "Greens", "Blues"]
for i in [0, 1, 2]:

    # This subplot axes
    ax = axs[i]

    # Plot
    ax.imshow(rgb_data[i], norm=mcolors.LogNorm(), extent=landsat_extent, cmap=cmaps[i])
    city_limits.plot(ax=ax, facecolor="none", edgecolor="black")

    # Format
    ax.set_axis_off()
    ax.set_title(cmaps[i][:-1])

Side note: using subplots in matplotlib

You can specify the subplot layout via the nrows and ncols keywords to plt.subplots. If you have more than one row or column, the function returns:

  • The figure
  • An array of Axes objects.
In [31]:
fig, axs = plt.subplots(nrows=1, ncols=3)

Note

When nrows or ncols > 1, I usually name the returned axes variable "axs" vs. the usual case of just "ax". It's useful for remembering we got more than one Axes object back!

In [32]:
# This has length 3 for each of the 3 columns!
axs
Out[32]:
array([<AxesSubplot:>, <AxesSubplot:>, <AxesSubplot:>], dtype=object)
In [33]:
# Left axes
axs[0]  
Out[33]:
<AxesSubplot:>
In [34]:
# Middle axes
axs[1]
Out[34]:
<AxesSubplot:>
In [35]:
# Right axes
axs[2]
Out[35]:
<AxesSubplot:>

Use earthpy to plot the combined color image

A useful utility package to perform the proper re-scaling of pixel values

In [36]:
import earthpy.plot as ep
In [37]:
# Initialize
fig, ax = plt.subplots(figsize=(10,10))

# Plot the RGB bands
ep.plot_rgb(rgb_data, rgb=(0, 1, 2), ax=ax);

We can "stretch" the data to improve the brightness

In [38]:
# Initialize
fig, ax = plt.subplots(figsize=(10,10))

# Plot the RGB bands
ep.plot_rgb(rgb_data, rgb=(0, 1, 2), ax=ax, stretch=True);

Conclusion: Working with multi-band data is a bit messy

Can we do better?

Yes! HoloViz to the rescue...

xarray

  • A fancier version of numpy arrays
  • Designed for gridded data with multiple dimensions
  • For raster data, those dimensions are band, y (latitude), and x (longitude), with the pixel values stored as the array's data

Let's try it out...

In [39]:
import xarray as xr
In [45]:
xr.open_rasterio?
In [40]:
ds = xr.open_rasterio("./data/landsat8_philly.tif")
ds
Out[40]:
<xarray.DataArray (band: 10, y: 999, x: 923)>
[9220770 values with dtype=uint16]
Coordinates:
  * band     (band) int64 1 2 3 4 5 6 7 8 9 10
  * y        (y) float64 4.443e+06 4.443e+06 4.443e+06 ... 4.413e+06 4.413e+06
  * x        (x) float64 4.761e+05 4.761e+05 4.761e+05 ... 5.037e+05 5.037e+05
Attributes:
    transform:      (30.0, 0.0, 476064.3596176505, 0.0, -30.0, 4443066.927074...
    crs:            +init=epsg:32618
    res:            (30.0, 30.0)
    is_tiled:       0
    nodatavals:     (nan, nan, nan, nan, nan, nan, nan, nan, nan, nan)
    scales:         (1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0)
    offsets:        (0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0)
    AREA_OR_POINT:  Area
In [41]:
type(ds)
Out[41]:
xarray.core.dataarray.DataArray
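
Because "band" is now a labeled dimension, we can pull out a single band by its coordinate value rather than by position. A quick sketch using the DataArray we just opened:

# Select the red band (band 4) by its coordinate label
red_band = ds.sel(band=4)
print(red_band.shape)  # (999, 923): a single 2D band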

Now the magic begins: more hvplot

Still experimental, but very promising — interactive plots with widgets for each band!

In [42]:
import hvplot.xarray
In [43]:
img = ds.hvplot.image(width=700, height=500, cmap='viridis')
img
Out[43]:

More magic widgets!

Two key raster concepts

  • Using vector + raster data
    • Cropping or masking raster data based on vector polygons
  • Raster math and zonal statistics
    • Combining multiple raster data sets
    • Calculating statistics within certain vector polygons

Cropping: let's trim to just the city limits

Use rasterio's builtin mask function:

In [44]:
from rasterio.mask import mask
In [45]:
masked, mask_transform = mask(
    dataset=landsat,
    shapes=city_limits.geometry,
    crop=True,  # remove pixels not within boundary
    all_touched=True,  # get all pixels that touch the boundary
    filled=False,  # do not fill cropped pixels with a default value
)
In [46]:
masked
Out[46]:
masked_array(
  data=[[[--, --, --, ..., --, --, --],
         [--, --, --, ..., --, --, --],
         ...,
         [--, --, --, ..., --, --, --]],

        ...,

        [[--, --, --, ..., --, --, --],
         [--, --, --, ..., --, --, --],
         ...,
         [--, --, --, ..., --, --, --]]],
  mask=[[[ True,  True,  True, ...,  True,  True,  True],
         [ True,  True,  True, ...,  True,  True,  True],
         ...,
         [ True,  True,  True, ...,  True,  True,  True]],

        ...,

        [[ True,  True,  True, ...,  True,  True,  True],
         [ True,  True,  True, ...,  True,  True,  True],
         ...,
         [ True,  True,  True, ...,  True,  True,  True]]],
  fill_value=999999,
  dtype=uint16)
In [47]:
masked.shape
Out[47]:
(10, 999, 923)
In [48]:
# Initialize
fig, ax = plt.subplots(figsize=(10, 10))

# Plot the first band
ax.imshow(masked[0], cmap="viridis", extent=landsat_extent)

# Format and add the city limits
city_limits.boundary.plot(ax=ax, color="gray", linewidth=4)
ax.set_axis_off()

Writing GeoTIFF files

Let's save our cropped raster data. To save raster data, we need both the array of the data and the proper meta-data.

In [49]:
out_meta = landsat.meta
out_meta.update(
    {"height": masked.shape[1], "width": masked.shape[2], "transform": mask_transform}
)
print(out_meta)

# write small image to local Geotiff file
with rio.open("data/cropped_landsat.tif", "w", **out_meta) as dst:
    dst.write(masked)
{'driver': 'GTiff', 'dtype': 'uint16', 'nodata': None, 'width': 923, 'height': 999, 'count': 10, 'crs': CRS.from_epsg(32618), 'transform': Affine(30.0, 0.0, 476064.3596176505,
       0.0, -30.0, 4443066.927074196)}
In [50]:
landsat
Out[50]:
<open DatasetReader name='./data/landsat8_philly.tif' mode='r'>

Note: we used a context manager above (the "with" statement) to handle the opening and subsequent closing of our new file.

For more info, see this tutorial.
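
For a rough sense of what the context manager does, here is an equivalent version written out by hand (a sketch; the "with" form above is the idiomatic one):

# Roughly equivalent to the "with" block above
dst = rio.open("data/cropped_landsat.tif", "w", **out_meta)
try:
    dst.write(masked)
finally:
    dst.close()  # the context manager guarantees this cleanup runs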

Now let's do some raster math

Often called "map algebra"...

The Normalized Difference Vegetation Index

  • A normalized index of greenness ranging from -1 to 1, calculated from the red and NIR bands.
  • Provides a measure of the amount of live green vegetation in an area

Formula:

NDVI = (NIR - Red) / (NIR + Red)

Healthy vegetation reflects NIR and absorbs visible light, so the NDVI for green vegetation is high (close to 1).
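
For example, a leafy pixel reflecting strongly in the NIR (say 0.5) and weakly in the red (say 0.1) gives NDVI = (0.5 - 0.1) / (0.5 + 0.1) ≈ 0.67, while bare pavement with similar NIR and red reflectance yields an NDVI near 0. (These reflectance values are just illustrative.)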

Get the data for the red and NIR bands

NIR is band 5 and red is band 4

In [51]:
masked.shape
Out[51]:
(10, 999, 923)
In [52]:
# Note that the indexing here is zero-based, e.g., band 1 is index 0
red = masked[3]
nir = masked[4]
In [53]:
def calculate_NDVI(nir, red):
    """
    Calculate the NDVI from the NIR and red landsat bands
    """
    
    # Convert to floats
    nir = nir.astype(float)
    red = red.astype(float)
    
    # Get valid entries
    check = np.logical_and( red.mask == False, nir.mask == False )
    
    # Where the check is True, return the NDVI, else return NaN
    ndvi = np.where(check,  (nir - red ) / ( nir + red ), np.nan )
    return ndvi 
In [54]:
NDVI = calculate_NDVI(nir, red)
In [55]:
fig, ax = plt.subplots(figsize=(10,10))

# Plot NDVI
img = ax.imshow(NDVI, extent=landsat_extent)

# Format and plot city limits
city_limits.plot(ax=ax, edgecolor='gray', facecolor='none', linewidth=4)
plt.colorbar(img)
ax.set_axis_off()
ax.set_title("NDVI in Philadelphia", fontsize=18);

Let's overlay Philly's parks

In [56]:
# Read in the parks dataset
parks = gpd.read_file('./data/parks.geojson')
In [57]:
# Print out the CRS
parks.crs
Out[57]:
<Geographic 2D CRS: EPSG:4326>
Name: WGS 84
Axis Info [ellipsoidal]:
- Lat[north]: Geodetic latitude (degree)
- Lon[east]: Geodetic longitude (degree)
Area of Use:
- name: World
- bounds: (-180.0, -90.0, 180.0, 90.0)
Datum: World Geodetic System 1984
- Ellipsoid: WGS 84
- Prime Meridian: Greenwich
In [58]:
# Convert to landsat CRS
parks = parks.to_crs(landsat.crs.data['init'])
In [59]:
fig, ax = plt.subplots(figsize=(10,10))

# Plot NDVI
img = ax.imshow(NDVI, extent=landsat_extent)

# NEW: add the parks
parks.plot(ax=ax, edgecolor='crimson', facecolor='none', linewidth=2)

# Format and plot city limits
city_limits.plot(ax=ax, edgecolor='gray', facecolor='none', linewidth=4)
plt.colorbar(img)
ax.set_axis_off()
ax.set_title("NDVI vs. Parks in Philadelphia", fontsize=18);

It looks like it worked pretty well!

How about calculating the median NDVI within the park geometries?

This is called zonal statistics. We can use the rasterstats package to do this...

In [60]:
from rasterstats import zonal_stats
In [61]:
stats = zonal_stats(parks, NDVI, affine=landsat.transform, stats=['mean', 'median'])
/Users/nhand/opt/miniconda3/envs/musa-550-fall-2020/lib/python3.7/site-packages/rasterstats/io.py:301: UserWarning: Setting nodata to -999; specify nodata explicitly
  warnings.warn("Setting nodata to -999; specify nodata explicitly")

The zonal_stats function

Zonal statistics of raster values aggregated to vector geometries.

  • First argument: vector data
  • Second argument: raster data to be aggregated
  • Also need to pass the affine transform and the names of the stats to compute (see the nodata note below)
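
Side note: the UserWarning above appeared because rasterstats doesn't know which value marks missing pixels in our NDVI array (we used NaN). A minimal sketch of how to silence it, assuming rasterstats' nodata keyword:

# Declare the nodata value explicitly to avoid the warning
stats = zonal_stats(parks, NDVI, affine=landsat.transform,
                    stats=['mean', 'median'], nodata=np.nan)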
In [62]:
stats
Out[62]:
[{'mean': 0.5051087300747117, 'median': 0.521160491292894},
 {'mean': 0.441677258928841, 'median': 0.47791901454593105},
 {'mean': 0.4885958179164529, 'median': 0.5084981442485412},
 {'mean': 0.3756264171005127, 'median': 0.42356277391080177},
 {'mean': 0.45897667566035816, 'median': 0.49204860985900256},
 {'mean': 0.4220467885671325, 'median': 0.46829433706341134},
 {'mean': 0.38363107808041097, 'median': 0.416380868090128},
 {'mean': 0.4330996771737791, 'median': 0.4772304906066629},
 {'mean': 0.45350534061220404, 'median': 0.5015001304461257},
 {'mean': 0.44505112880627273, 'median': 0.4701653486700216},
 {'mean': 0.5017048095513129, 'median': 0.5108644957689836},
 {'mean': 0.23745576631277313, 'median': 0.24259729272419628},
 {'mean': 0.31577818701756977, 'median': 0.3306512446567765},
 {'mean': 0.5300954792288528, 'median': 0.5286783042394015},
 {'mean': 0.4138823685577781, 'median': 0.4604562737642586},
 {'mean': 0.48518030764354636, 'median': 0.5319950233699855},
 {'mean': 0.3154294503884359, 'median': 0.33626983970826585},
 {'mean': 0.4673660755293326, 'median': 0.49301485937514306},
 {'mean': 0.38810332525412405, 'median': 0.4133657898874572},
 {'mean': 0.4408030558751575, 'median': 0.45596817840957254},
 {'mean': 0.5025746121592748, 'median': 0.5077718684805924},
 {'mean': 0.3694178483720184, 'median': 0.3760657145169406},
 {'mean': 0.501111461720193, 'median': 0.5050854708805356},
 {'mean': 0.5262850765386248, 'median': 0.5356888168557536},
 {'mean': 0.48653727784657114, 'median': 0.5100931149323727},
 {'mean': 0.4687375187461541, 'median': 0.49930407348983946},
 {'mean': 0.4708292376095287, 'median': 0.48082651365503404},
 {'mean': 0.4773326667398458, 'median': 0.5156144807640011},
 {'mean': 0.4787564298905878, 'median': 0.49155885289073287},
 {'mean': 0.44870821225970553, 'median': 0.4532736333178444},
 {'mean': 0.28722653079042143, 'median': 0.2989290105395561},
 {'mean': 0.5168607841969154, 'median': 0.5325215878735168},
 {'mean': 0.4725611895576703, 'median': 0.49921839394549977},
 {'mean': 0.46766118115625893, 'median': 0.49758167570710404},
 {'mean': 0.4886459911745514, 'median': 0.5020622294662287},
 {'mean': 0.46017276593316403, 'median': 0.4758438120450033},
 {'mean': 0.4536300550044786, 'median': 0.49233559560028084},
 {'mean': 0.4239182074098442, 'median': 0.4602468966932257},
 {'mean': 0.31202502605597016, 'median': 0.32572928232562837},
 {'mean': 0.2912164528160494, 'median': 0.2946940937582767},
 {'mean': 0.3967233668480246, 'median': 0.3967233668480246},
 {'mean': 0.3226210938878899, 'median': 0.35272291143510365},
 {'mean': 0.4692543355097313, 'median': 0.502843843019927},
 {'mean': 0.2094971073454244, 'median': 0.2736635627619322},
 {'mean': 0.4429390753150177, 'median': 0.49684622604615747},
 {'mean': 0.532656111900622, 'median': 0.5346870838881491},
 {'mean': 0.5053390017291615, 'median': 0.5413888437144251},
 {'mean': 0.500764874094713, 'median': 0.5121577370584547},
 {'mean': 0.4213318206046151, 'median': 0.46379308008197284},
 {'mean': 0.43340892620907207, 'median': 0.4818377611583778},
 {'mean': 0.5069321976848545, 'median': 0.5160336741725328},
 {'mean': 0.4410269675104542, 'median': 0.4870090058426777},
 {'mean': 0.4923323198113241, 'median': 0.5212096391398666},
 {'mean': 0.45403610108483955, 'median': 0.4947035355619755},
 {'mean': 0.2771909854304859, 'median': 0.2843133053564756},
 {'mean': 0.48502792988760635, 'median': 0.5001936488730003},
 {'mean': 0.45418598340923805, 'median': 0.48402058939985293},
 {'mean': 0.32960811687490216, 'median': 0.3285528031290743},
 {'mean': 0.5260210746257522, 'median': 0.5444229172432882},
 {'mean': 0.4445493328049692, 'median': 0.4630808572279276},
 {'mean': 0.4495963972554633, 'median': 0.4858841111806047},
 {'mean': 0.5240412096213553, 'median': 0.5283402835696414},
 {'mean': 0.13767686126646383, 'median': -0.02153756296999343}]
In [63]:
len(parks)
Out[63]:
63
In [64]:
# Store the median value in the parks data frame
parks['median_NDVI'] = [s['median'] for s in stats] 
In [65]:
parks.head()
Out[65]:
OBJECTID ASSET_NAME SITE_NAME CHILD_OF ADDRESS TYPE USE_ DESCRIPTION SQ_FEET ACREAGE ZIPCODE ALLIAS NOTES TENANT LABEL DPP_ASSET_ID Shape__Area Shape__Length geometry median_NDVI
0 7 Wissahickon Valley Park Wissahickon Valley Park Wissahickon Valley Park None LAND REGIONAL_CONSERVATION_WATERSHED NO_PROGRAM 9.078309e+07 2084.101326 19128 MAP SITE Wissahickon Valley Park 1357 1.441162e+07 71462.556702 MULTIPOLYGON (((484101.476 4431051.989, 484099... 0.521160
1 8 West Fairmount Park West Fairmount Park West Fairmount Park LAND REGIONAL_CONSERVATION_WATERSHED NO_PROGRAM 6.078159e+07 1395.358890 19131 MAP SITE West Fairmount Park 1714 9.631203e+06 25967.819064 MULTIPOLYGON (((482736.681 4428824.579, 482728... 0.477919
2 23 Pennypack Park Pennypack Park Pennypack Park LAND REGIONAL_CONSERVATION_WATERSHED NO_PROGRAM 6.023748e+07 1382.867808 19152 Verree Rd Interpretive Center MAP SITE Pennypack Park 1651 9.566914e+06 41487.394790 MULTIPOLYGON (((497133.192 4434667.950, 497123... 0.508498
3 24 East Fairmount Park East Fairmount Park East Fairmount Park LAND REGIONAL_CONSERVATION_WATERSHED NO_PROGRAM 2.871642e+07 659.240959 19121 MAP SITE East Fairmount Park 1713 4.549582e+06 21499.126097 POLYGON ((484539.743 4424162.545, 484620.184 4... 0.423563
4 25 Tacony Creek Park Tacony Creek Park Tacony Creek Park LAND REGIONAL_CONSERVATION_WATERSHED NO_PROGRAM 1.388049e+07 318.653500 19120 MAP SITE Tacony Creek Park 1961 2.201840e+06 19978.610852 MULTIPOLYGON (((491712.882 4429633.244, 491713... 0.492049

Make a quick histogram of the median values

Nearly all are positive, indicating an abundance of green vegetation...

In [66]:
# Initialize
fig, ax = plt.subplots(figsize=(8,6))

# Plot a quick histogram 
ax.hist(parks['median_NDVI'], bins='auto')
ax.axvline(x=0, c='k', lw=2)

# Format
ax.set_xlabel("Median NDVI", fontsize=18)
ax.set_ylabel("Number of Parks", fontsize=18);

Let's make a choropleth, too

In [67]:
# Initialize
fig, ax = plt.subplots(figsize=(10,10))

# Plot the city limits
city_limits.plot(ax=ax, edgecolor='black', facecolor='none', linewidth=4)

# Plot the median NDVI
parks.plot(column='median_NDVI', legend=True, ax=ax, cmap='viridis')

# Format
ax.set_axis_off()

Let's make it interactive!

Make sure to check the crs!

In [68]:
parks.crs
Out[68]:
<Projected CRS: EPSG:32618>
Name: WGS 84 / UTM zone 18N
Axis Info [cartesian]:
- E[east]: Easting (metre)
- N[north]: Northing (metre)
Area of Use:
- name: World - N hemisphere - 78°W to 72°W - by country
- bounds: (-78.0, 0.0, -72.0, 84.0)
Coordinate Operation:
- name: UTM zone 18N
- method: Transverse Mercator
Datum: World Geodetic System 1984
- Ellipsoid: WGS 84
- Prime Meridian: Greenwich

The CRS is "EPSG:32618" and since it's not in 4326, we'll need to specify the "crs=" keyword when plotting!

In [69]:
# trim to only the columns we want to plot
cols = ["median_NDVI", "SITE_NAME", "geometry"]

# Plot the parks colored by median NDVI
p = parks[cols].hvplot.polygons(c="median_NDVI", 
                                geo=True, 
                                crs=32618, 
                                cmap='viridis', 
                                hover_cols=["SITE_NAME"])

# Plot the city limit boundary
cl = city_limits.hvplot.polygons(
    geo=True,
    crs=32618,
    alpha=0,
    line_alpha=1,
    line_color="black",
    hover=False,
    width=700,
    height=600,
)

# combine!
cl * p
Out[69]:

Exercise: Measure the median NDVI for each of Philadelphia's neighborhoods

  • A GeoJSON file of Philly's neighborhoods is available in the "data/" folder
  • Once you measure the median value for each neighborhood, make the following charts:
    • A histogram of the median values for all neighborhoods
    • Bar charts of the neighborhoods with the 20 highest and 20 lowest values
    • A choropleth showing the median values across the city

You're welcome to use matplotlib/geopandas, but encouraged to explore the new hvplot options!

In [70]:
# Load the neighborhoods
hoods = gpd.read_file("./data/zillow_neighborhoods.geojson").to_crs(landsat.crs.data['init'])
In [71]:
# Calculate the zonal statistics
stats_by_hood = zonal_stats(hoods, NDVI, affine=landsat.transform, stats=['mean', 'median'])
In [72]:
# Store the median value in the neighborhood data frame
hoods['median_NDVI'] = [s['median'] for s in stats_by_hood]

Make a histogram of median values

In [73]:
# Initialize
fig, ax = plt.subplots(figsize=(8,6))

# Plot a quick histogram 
ax.hist(hoods['median_NDVI'], bins='auto')
ax.axvline(x=0, c='k', lw=2)

# Format
ax.set_xlabel("Median NDVI", fontsize=18)
ax.set_ylabel("Number of Neighborhoods", fontsize=18);

Plot a (static) choropleth map

In [74]:
# Initialize
fig, ax = plt.subplots(figsize=(10,10))

# Plot the city limits
city_limits.plot(ax=ax, edgecolor='black', facecolor='none', linewidth=4)

# Plot the median NDVI
hoods.plot(column='median_NDVI', legend=True, ax=ax)

# Format
ax.set_axis_off()

Plot an (interactive) choropleth map

In [75]:
# trim to only the columns we want to plot
cols = ["median_NDVI", "ZillowName", "geometry"]

# Plot the parks colored by median NDVI
hoods[cols].hvplot.polygons(c="median_NDVI", 
                            geo=True, 
                            crs=32618, 
                            width=700, 
                            height=600, 
                            cmap='viridis',
                            hover_cols=["ZillowName"])
Out[75]:

Plot a bar chart of the neighborhoods with the 20 largest median values

In [76]:
# calculate the top 20
cols = ['ZillowName', 'median_NDVI']
top20 = hoods[cols].sort_values('median_NDVI', ascending=False).iloc[:20]

top20.hvplot.bar(x='ZillowName', y='median_NDVI', rot=90)
Out[76]:

Plot a bar chart of the neighborhoods with the 20 smallest median values

In [77]:
cols = ['ZillowName', 'median_NDVI']
bottom20 = hoods[cols].sort_values('median_NDVI', ascending=True).iloc[:20]


bottom20.hvplot.bar(x='ZillowName', y='median_NDVI', rot=90)
Out[77]:

That's it!

  • Next week: introduction to APIs and downloading data from the web in Python
  • Pre-recorded lecture will be posted on Sunday
  • See you next Thursday!