Introduction to Remote Sensing, Landsat, and Google Earth Engine

Photo by NASA on Unsplash

Remote sensing is the process of analyzing geospatial images, often taken by satellites. These multi-band images, which may contain more than the typical three visible bands (red, green, and blue), can be rendered into custom visuals that highlight features a standard color (RGB) image cannot. (Recall that a color image from a digital camera is a composite of three layers: red, green, and blue.)

https://en.wikipedia.org/wiki/File:RGBLayers.svg

Open-source satellite images are now publicly available through Google Earth Engine’s repository and can be analyzed on its platform. Landsat 7 and Landsat 8 are two of the satellites currently imaging the Earth’s surface.

History of Landsat

It is worth briefly covering the history of the Landsat program, created jointly by NASA (National Aeronautics and Space Administration) and the USGS (United States Geological Survey).

In 1972, NASA and the USGS launched the Earth Resources Technology Satellite (ERTS-1), which was renamed Landsat 1 in 1975. Its objective was to capture images of the Earth’s surface. Each image covered a region of 170 km x 185 km (106 mi x 115 mi) at a resolution of 80 meters per pixel, and the satellite revisited the same location every 18 days.

Landsat 1

The onboard sensor recorded four separate spectral bands:
B4 = Green (0.5–0.6 µm)
B5 = Red (0.6–0.7 µm)
B6 = Near Infrared 1 (0.7–0.8 µm)
B7 = Near Infrared 2 (0.8–1.1 µm)

NASA/USGS decommissioned Landsat 1 in 1978 but launched newer satellites beforehand to avoid a gap in coverage. Many years later, Landsat 7 launched in 1999, capturing 8-band images at a resolution of 30 meters per pixel. However, the images remained difficult to obtain, and even a single one was expensive to purchase.

B1 = Blue (0.45–0.52 µm)
B2 = Green (0.52–0.6 µm)
B3 = Red (0.63–0.69 µm)
B4 = Near Infrared (0.77–0.90 µm)
B5 = Shortwave Infrared 1 (1.55–1.75 µm)
B6 = Thermal Infrared, low/high gain (10.40–12.50 µm)
B7 = Shortwave Infrared 2 (2.08–2.35 µm)
B8 = Panchromatic (0.52–0.90 µm)

In 2008, the USGS and NASA made Landsat images free of charge, opening the door to independent study. The most notable study was by Matthew Hansen of the University of Maryland, who analyzed global forest gain and loss. The study also came with an app that lets users explore deforestation geographically. This research required processing terabytes of data, and showcasing an analysis at that scale intrigued many people.

Today, anyone can analyze open-source images through Google Earth Engine (GEE), which hosts many datasets (including Landsat, Sentinel, and others). GEE is a little different from Google Earth: it serves as a two-part cloud-computing platform, first storing open-source images/collections and second rendering those images to highlight certain topics. For example, the Normalized Difference Vegetation Index (NDVI; a measure of live green vegetation) can be computed using the following bands:

NDVI = (Near Infrared − Red) / (Near Infrared + Red)

https://developers.google.com/earth-engine/tutorials/community/modis-ndvi-time-series-animation

The index values range from -1 to 1, where higher values indicate denser green vegetation.
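The formula above is simple enough to sketch as a per-pixel computation in plain JavaScript. This is a minimal illustration, not GEE code; the reflectance values and the `ndvi` function name are hypothetical, and which bands count as NIR and Red depends on the sensor (e.g., B4/B3 on Landsat 7, B5/B4 on Landsat 8):

```javascript
// Compute NDVI from near-infrared and red reflectance values.
// Inputs are surface reflectance fractions in [0, 1]; the band
// mapping (which band is NIR vs. Red) depends on the satellite.
function ndvi(nir, red) {
  return (nir - red) / (nir + red);
}

// Hypothetical pixel: healthy vegetation reflects strongly in NIR
// but absorbs red light, pushing NDVI toward 1.
console.log(ndvi(0.5, 0.1).toFixed(2)); // 0.67

// Water reflects more red than NIR, so NDVI goes negative.
console.log(ndvi(0.1, 0.5).toFixed(2)); // -0.67
```

Bare soil and rock tend to land near 0, which is why the index separates vegetation so cleanly from other land cover.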

The GEE platform (https://code.earthengine.google.com/) does require an account (sign-up is quick) and operates with JavaScript. There is also a Python API; however, I found it easier to learn JavaScript through the tutorials at https://developers.google.com/earth-engine/guides.
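To give a feel for what a GEE script looks like, here is a minimal sketch that composites a Landsat 8 collection and maps NDVI. It only runs inside the GEE Code Editor (the `ee` and `Map` objects are provided by the platform), and the date range and map coordinates below are illustrative assumptions:

```javascript
// Sketch for the GEE Code Editor (JavaScript API). The dates and
// coordinates are illustrative; swap in your own area of interest.
var collection = ee.ImageCollection('LANDSAT/LC08/C01/T1_SR')
  .filterDate('2020-06-01', '2020-09-01');

// Median composite, then NDVI from Landsat 8 bands: B5 = NIR, B4 = Red.
var ndvi = collection.median().normalizedDifference(['B5', 'B4']);

Map.setCenter(-76.95, 38.85, 10); // hypothetical point of interest
Map.addLayer(ndvi, {min: -1, max: 1, palette: ['blue', 'white', 'green']}, 'NDVI');
```

The built-in `normalizedDifference` method computes (B5 − B4) / (B5 + B4) per pixel, so you rarely need to write the band math by hand.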

Hello! My name is Albert Um.
