Data Fusion
Combining multiple remote sensing data sources can provide valuable information that no single sensor yields on its own. Observation-level, or pixel-based, fusion combines pixels from different sources to form an image containing new information. Two widely used examples of pixel-based fusion are pan-sharpening and the fusion of radar with multispectral optical imagery. Pan-sharpening blends a high-resolution panchromatic image with a lower-resolution multispectral image to produce a high-resolution multispectral image. Combining radar and optical imagery, in turn, merges complementary spectral information that can mitigate the drawbacks of each product (such as cloud cover in optical images), while also improving temporal resolution through more frequent overpasses.
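To make the pan-sharpening idea concrete, here is a minimal sketch of one classic pixel-based approach, the Brovey transform, using NumPy. This is an illustrative implementation, not the method used by any particular script below: it assumes the multispectral bands have already been resampled to the panchromatic grid, and the array shapes and function name are choices made for this example.

```python
import numpy as np

def brovey_pansharpen(ms, pan):
    """Brovey-transform pan-sharpening (illustrative sketch).

    ms  : array of shape (bands, H, W), multispectral bands already
          resampled to the panchromatic pixel grid
    pan : array of shape (H, W), panchromatic band at full resolution

    Each multispectral band is rescaled by the ratio of the panchromatic
    intensity to the mean multispectral intensity, injecting the pan
    band's fine spatial detail into every band.
    """
    ms = ms.astype(np.float64)
    pan = pan.astype(np.float64)
    intensity = ms.mean(axis=0)                 # per-pixel mean over bands
    ratio = pan / np.maximum(intensity, 1e-6)   # guard against division by zero
    return ms * ratio                           # broadcasts (H, W) over each band
```

The Brovey transform is simple and fast but is known to distort band ratios (and hence colors) when the panchromatic and multispectral spectral ranges differ; more elaborate methods such as Gram-Schmidt or wavelet-based fusion trade speed for spectral fidelity.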
Available scripts
- Detection of Lake Extent Changes with Landsat
- Mapping Soybean and Maize NDVI with Sentinel-1 and Sentinel-2
- Sentinel-2 with cloudy parts replaced by Sentinel-1
- Ship detection with Sentinel-1 and Sentinel-2
- Built-up area detection with Sentinel-1 and Sentinel-2
- Sentinel-3 OLCI true color under Sentinel-5P products
- DEM contour lines over true color Landsat 8
- Forest fire progression monitoring with Sentinel-2 and Sentinel-1
- S2 L2A Enhancement using S3 SLSTR F2 For Wildfire Detection
- Historic NDVI changes with Landsat 4-5 TM and Landsat 8
- Sand-Oriented Land Cover Classification with Sentinel-1 and Sentinel-2
- Thermal visualization and water in wetlands with Landsat 8 L1 & L2
- Land Surface Temperature with S3 SLSTR and OLCI
- Visualising floods using Sentinel-1 overlaid on Sentinel-2 imagery