What’s New in ENVI 5.3

As the geospatial industry continues to evolve, so too does the software. Here’s a look at what’s new in ENVI 5.3, the latest release of the popular image analysis software from Exelis VIS.

ENVI

  • New data formats and sensors. ENVI 5.3 now provides support to read and display imagery from Deimos-2, DubaiSat-2, Pleiades-HR and Spot mosaic tiles, GeoPackage vectors, Google-formatted SkySat-2, and Sentinel-2.
  • Spectral indices. In addition to the numerous indices already included in ENVI (more than 60), new options include the Normalized Difference Mud Index (NDMI) and Modified Normalized Difference Water Index (MNDWI).
  • Atmospheric correction. The Quick Atmospheric Correction (QUAC) algorithm has been updated with the latest enhancements from Spectral Sciences, Inc. to help improve algorithm accuracy.
  • Digital elevation model. Users can now download the GMTED2010 DEM (7.5 arc seconds resolution) from the Exelis VIS website for use in improving the accuracy of Image Registration using RPC Orthorectification and Auto Tie Point Generation.
  • Point clouds. If you subscribe to the ENVI Photogrammetry Module (separate license from ENVI), then the Generate Point Clouds by Dense Image Matching tool is now available for generating 3D point clouds from GeoEye-1, IKONOS, Pleiades-1A, QuickBird, Spot-6, WorldView-1,-2 and -3, and the Digital Point Positioning Data Base (DPPDB).
  • LiDAR. The ENVI LiDAR module has been merged with ENVI and can now be launched directly from within the ENVI interface.
  • Geospatial PDF. Your views, including all currently displayed imagery, layers and annotations in those views, can now be exported directly to geospatial PDF files.
  • Spatial subset. When selecting files to add to the workspace, the File Selection tool now includes options to subset files by raster, vector, region of interest or map coordinates.
  • Regrid raster. Users can now regrid raster files to custom defined grids (geographic projection, pixel size, spatial extent and/or number of rows and columns).
  • Programming. The latest ENVI release also includes dozens of new tasks, too numerous to list here, that can be utilized for developing custom user applications in ENVI and ENVI Services Engine.

To learn more about the above features and improvements, as well as many more, read the latest release notes or check out the ENVI help documentation.

ENVI 5.3

From here to there – and everywhere – with Geospatial Cloud Computing

Reposted from Exelis VIS, Imagery Speaks, June 30, 2015, by James Goodman, CEO HySpeed Computing.

In a previous article we presented an overview of the advantages of cloud computing in remote sensing applications, and described an upcoming prototype web application for processing imagery from the HICO sensor on the International Space Station.

First, as a follow-up, we’re excited to announce the availability of the HICO Image Processing System – a cloud computing platform for on-demand remote sensing image analysis and data visualization.

HICO IPS - Chesapeake Bay - Chlorophyll

HICO IPS allows users to select specific images and algorithms, dynamically launch analysis routines in the cloud, and then see results displayed directly in an online map interface. System capabilities are demonstrated using imagery collected by the Hyperspectral Imager for the Coastal Ocean (HICO) on the International Space Station, and example algorithms are included for assessing coastal water quality and other nearshore environmental conditions.

This is an application server, not just a map server: HICO IPS delivers on-demand image processing of real physical parameters, such as chlorophyll concentration, inherent optical properties, and water depth.

The system was developed using a combination of commercial and open-source software, with core image processing performed using the recently released ENVI Services Engine. No specialized software is required to run HICO IPS. You just need an internet connection and a web browser to run the application (we suggest using Google Chrome).

Beyond HICO, and beyond the coastal ocean, the system can be configured for any number of different remote sensing instruments and applications. It thus provides an adaptable cloud computing framework for rapidly implementing new algorithms and applications, and for making those applications and their output readily available to the global user community.

However, this is but one application. Significantly greater work is needed throughout the remote sensing community to leverage these and other exciting new tools and processing capabilities. To participate in a discussion of how the future of geospatial image processing is evolving, and see a presentation of the HICO IPS, join us at the upcoming ENVI Analytics Symposium in Boulder, CO, August 25-26.

With this broader context in mind, and as a second follow-up, we ask an important question: how are we, as an industry and as a research community, going to get from here to there?

The currently expanding diversity and volume of remote sensing data presents particular challenges for aggregating data relevant to specific research applications, developing analysis tools that can be extended to a variety of sensors, efficiently implementing data processing across a distributed storage network, and delivering value-added products to a broad range of stakeholders.

Based on lessons learned from developing the HICO IPS, here we identify three important requirements needed to meet these challenges:

  • Data and application interoperability need to continue evolving. This need speaks to the use of broadly accessible data formats, expansion of software binding libraries, and development of cross-platform applications.
  • Improved mechanisms are needed for transforming research achievements into functional software applications. Greater impact can be achieved, larger audiences reached, and application opportunities significantly enhanced, if more investment is made in remote sensing technology transfer.
  • Robust tools are required for decision support and information delivery. This requirement necessitates development of intuitive visualization and user interface tools that will assist users in understanding image analysis output products as well as contribute to more informed decision making.

These developments will not happen overnight, but the pace of the industry indicates that such transformations are already in process and that geospatial image processing will continue to evolve at a rapid rate. We encourage you to participate.

About HySpeed Computing: Our mission is to provide the most effective analysis tools for deriving and delivering information from geospatial imagery. Visit us at hyspeedcomputing.com.

To access the HICO Image Processing System: http://hyspeedgeo.com/HICO/

Sunglint Correction in Airborne Hyperspectral Images Over Inland Waters

Announcing a recent publication in Revista Brasileira de Cartografia (RBC) – the Brazilian Journal of Cartography. The full text is available open-access online: Streher et al., 2014, RBC, International Issue 66/7, 1437-1449.

Title: Sunglint Correction in Airborne Hyperspectral Images Over Inland Waters

Authors: Annia Susin Streher, Cláudio Clemente Faria Barbosa, Lênio Soares Galvão, James A. Goodman, Evlyn Marcia Leão de Moraes Novo, Thiago Sanna Freire Silva

Abstract: This study assessed sunglint effects, also known as the specular reflection from the water surface, in high-spatial and high-spectral resolution airborne images acquired by the SpecTIR sensor under different view-illumination geometries over the Brazilian Ibitinga reservoir (Case II waters). These effects were corrected using the Goodman et al. (2008) and the Kutser et al. (2009) methods, and a Kutser et al. (2009) variant based on the continuum removal technique to calculate the oxygen absorption band depth. The performance of each method for reducing sunglint effects was evaluated by a quantitative analysis of pre- and post-sunglint correction reflectance values (residual reflectance images). Furthermore, the analysis was supported by inspection of the reflectance differences along transects placed over homogeneous masses of water and over specific portions of the scenes affected and non-affected by sunglint. Results showed that the algorithm of Goodman et al. (2008) produced better results than the other two methods, as it approached zero amplitude reflectance values between homogeneous water masses affected and non-affected by sunglint. The Kutser et al. (2009) method also presented good performance, except for the most contaminated sunglint portions of the scenes. When the continuum removal technique was incorporated into the Kutser et al. (2009) method, results varied with the scene and were more sensitive to atmospheric correction artifacts and instrument signal-to-noise ratio characteristics.

Keywords: hyperspectral remote sensing; specular reflection; water optically active substances; SpecTIR sensor

Figure 5. Deglinted SpecTIR hyperspectral images of Ibitinga reservoir (São Paulo, Brazil) and the resulting reflectance profiles after correction by the methods of: (a) Goodman et al. (2008); (b) Kutser et al. (2009); and (c) modified Kutser et al. (2009).

Streher et al. 2015 Fig 5 Deglint

Linking Coral Reef Remote Sensing and Field Ecology: It’s a Matter of Scale

Announcing a recent publication in the Journal of Marine Science and Engineering (JMSE). The full text is available open-access online: Lucas and Goodman, 2015, JMSE, vol. 3(1): 1-20.

Authors: Matthew Q. Lucas and James Goodman

Abstract: Remote sensing shows potential for assessing biodiversity of coral reefs. Important steps in achieving this objective are better understanding the spectral variability of various reef components and correlating these spectral characteristics with field-based ecological assessments. Here we analyze >9400 coral reef field spectra from southwestern Puerto Rico to evaluate how spectral variability and, more specifically, spectral similarity between species influences estimates of biodiversity. Traditional field methods for estimating reef biodiversity using photoquadrats are also included to add ecological context to the spectral analysis. Results show that while many species can be distinguished using in situ field spectra, the addition of the overlying water column significantly reduces the ability to differentiate species, and even groups of species. This indicates that the ability to evaluate biodiversity with remote sensing decreases with increasing water depth. Due to the inherent spectral similarity amongst many species, including taxonomically dissimilar species, remote sensing underestimates biodiversity and represents the lower limit of actual species diversity. The overall implication is that coral reef ecologists using remote sensing need to consider the spatial and spectral context of the imagery, and remote sensing scientists analyzing biodiversity need to define confidence limits as a function of both water depth and the scale of information derived, e.g., species, groups of species, or community level.

Keywords: coral reefs; remote sensing; field spectra; scale; ecology; biodiversity; conservation

Figure 8. Estimates of biodiversity

Figure 8. Estimates of biodiversity calculated using the exponential of Shannon entropy, exp(H′), illustrating influence of increasing spectral similarity amongst reef species as a function of increasing water depth: 0* is biodiversity obtained from photoquadrats, 0** is biodiversity calculated using only those species considered prevalent or sizable enough to significantly influence the remote sensing signal (i.e., species included in the spectral measurements for this study area), and 0–10 is biodiversity calculated with consideration for optical similarities amongst species (i.e., based on hierarchical clustering of reflectance spectra as influenced by the overlying water column).
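For readers unfamiliar with the metric, exp(H′) is the exponential of Shannon entropy (the Hill number of order 1), which converts entropy into an effective number of equally common species. Below is a minimal Python sketch, using made-up abundance values purely for illustration:

    import numpy as np

    def exp_shannon(counts):
        """Exponential of Shannon entropy, exp(H'): the effective number of
        equally common species implied by the observed abundances."""
        p = np.asarray(counts, dtype=float)
        p = p / p.sum()                  # relative abundances
        p = p[p > 0]                     # ignore absent species
        h = -np.sum(p * np.log(p))       # Shannon entropy H'
        return np.exp(h)

    # Hypothetical example: four species with uneven percent cover
    print(exp_shannon([40, 30, 20, 10]))   # ~3.6 effective species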

A Look at What’s New in ENVI 5.2

Earlier this month Exelis Visual Information Solutions released ENVI 5.2, the latest version of their popular geospatial analysis software.

ENVI 5.2

ENVI 5.2 includes a number of new image processing tools as well as various updates and improvements to current capabilities. We’ve already downloaded our copy and started working with the new features. Here’s a look at what’s included.

A few of the most exciting new additions to ENVI include the Spatiotemporal Analysis tools, Spectral Indices tool, Full Motion Video player, and improved integration with ArcGIS:

  • Spatiotemporal Analysis. Just like the name sounds, this feature provides users the ability to analyze stacks of imagery through space and time. Most notably, tools are now available to build a raster series, where images are ordered sequentially by time, to reproject images from multiple sensors into a common projection and grid size, and to animate and export videos of these raster series.
  • Spectral Indices. Expanding on the capabilities of the previous Vegetation Index Calculator, the new Spectral Indices tool includes 64 different indices, which in addition to analyzing vegetation can also be used to investigate geology, man-made features, burned areas and water. The tool conveniently filters the available indices based on the spectral characteristics of the input image, so when you launch it you’ll only see those indices that can be calculated from your imagery.
  • Full Motion Video. ENVI 5.2 now supports video, allowing users not only to play video, but also to convert video files to time-enabled raster series and extract individual video frames for analysis using standard ENVI tools. Supported file formats include Skybox SkySat video, Adobe Flash Video and Shockwave Flash, Animated GIF, Apple QuickTime, Audio Video Interleaved, Google WebM Matroska, Matroska Video, Motion JPEG and JPEG2000, MPEG-1 Part 2, MPEG-2 Transport Stream, MPEG-2 Part 2, MPEG-4 Part 12 and MPEG-4 Part 14.
  • Integration with ArcGIS. Building on the interoperability originally introduced in ENVI 5.0, additional functionality has been added for ENVI to interact seamlessly with ArcGIS, including the ability to integrate analysis tools and image output layers in a concurrent session of ArcMap. For those working in both software domains, this helps simplify geospatial workflows and more closely integrate raster and vector analyses.

Other noteworthy additions in this ENVI release include:

  • New data types. ENVI 5.2 now provides support to read and display imagery from AlSat-2A, Deimos-1, Gaofen-1, Proba-V S10, Proba-V S1, SkySat-1, WorldView-3, Ziyuan-1-02C and Ziyuan-3A, as well as data formats GRIB-1, GRIB-2, Multi-page TIFF and NetCDF-4.
  • NNDiffuse Pan Sharpening. A new pan sharpening tool based on nearest neighbor diffusion has been added, which is multi-threaded for high-performance image processing.
  • Scatter Plot Tool. The previous scatter plot tool has been updated and modernized, allowing users to dynamically switch bands, calculate spectral statistics, interact with ROIs, and generate density slices of the displayed spectral data.
  • Raster Color Slice. This useful tool has also been updated, particularly from a performance perspective, providing dynamic updates in the image display according to parameter changes made in the tool.

For those interested in implementing ENVI in the cloud, this release is accompanied by ENVI Services Engine 5.2, an enterprise version of ENVI that facilitates on-demand, scalable, web-based image processing applications. As an example, HySpeed Computing is currently developing a prototype implementation of ESE for processing hyperspectral imagery from the HICO sensor on the International Space Station. The HICO Image Processing System will soon be publicly available for testing and evaluation by the community. A link to access the system will be provided on our website once it is released.

HICO IPS

To learn about the above features, and many more not listed here, see the video from Exelis VIS and/or read the latest release notes on ENVI 5.2.

We’re excited to put the new tools to work. How about you?

Enhancing the Landsat 8 Quality Assessment band – Detecting snow/ice using NDSI

This is the third installment in a series on developing a set of indices to automatically delineate features such as clouds, snow/ice, water and vegetation in Landsat imagery.

In this series of investigations, the challenge we have given ourselves is to utilize relatively simple indices and thresholds to refine some or all of the existing Landsat 8 quality assessment procedure, and wherever possible to also maintain backward compatibility with previous Landsat missions.

The two previous articles focused on Differentiating water using NDWI and Using NDVI to delineate vegetation.

Landsat 8 Lake Tahoe Snow/Ice

Here we explore the Normalized Difference Snow Index (NDSI) (Dozier 1989, Hall et al. 1995; Hall and Riggs 2014) to demonstrate how this index can be utilized to delineate the presence of snow/ice.

Note that NDSI is already included in the Landsat 8 quality assessment procedure; however, as currently implemented, NDSI is used primarily as a determining parameter in the decision trees for the Cloud Cover Assessment algorithms.

Additionally, as discussed in the MODIS algorithm documentation (Hall et al. 2001), NDSI has some acknowledged limits, in that snow can sometimes be confused with water, and that lower NDSI thresholds are occasionally needed to properly identify snow covered forests. This suggests that NDSI performance can be improved through integration with other assessment indices.

For now we consider NDSI on its own, but with plans to ultimately integrate this and other indices into a rule-based decision tree for generating a cohesive overall quality assessment.

NDSI is calculated using the following general equation: NDSI = (Green – SWIR)/(Green + SWIR). To calculate this index for our example Landsat 8 images in ENVI, we used Band Math (Toolbox > Band Ratio > Band Math) to implement the following equation (float(b3)-float(b6))/(float(b3)+float(b6)), where b3 is Band-3 (Green), b6 is Band-6 (SWIR), and the float() operation converts the integer pixel values to floating point to avoid integer overflow and integer division.

After visually inspecting output to develop thresholds based on observed snow/ice characteristics in our test images, results of the analysis indicate the following NDSI snow/ice thresholds: low confidence (NDSI ≥ 0.4), medium confidence (NDSI ≥ 0.5), and high confidence (NDSI ≥ 0.6).
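For readers who prefer scripting to the Band Math dialog, the calculation and thresholds are straightforward to reproduce elsewhere. The following is a minimal Python/NumPy sketch of the same logic (our illustration, not part of the ENVI workflow); it assumes the Green and SWIR bands are already loaded as integer arrays named b3 and b6.

    import numpy as np

    def ndsi(green, swir):
        """NDSI = (Green - SWIR) / (Green + SWIR), mirroring the Band Math
        expression (float(b3)-float(b6))/(float(b3)+float(b6))."""
        green = green.astype(np.float32)
        swir = swir.astype(np.float32)
        with np.errstate(divide="ignore", invalid="ignore"):
            return (green - swir) / (green + swir)

    def snow_ice_confidence(index):
        """Classify NDSI into the confidence levels proposed in this post:
        0 = none, 1 = low (>= 0.4), 2 = medium (>= 0.5), 3 = high (>= 0.6)."""
        conf = np.zeros(index.shape, dtype=np.uint8)
        conf[index >= 0.4] = 1
        conf[index >= 0.5] = 2
        conf[index >= 0.6] = 3
        return conf

    # Hypothetical usage with Landsat 8 Band 3 (Green) and Band 6 (SWIR-1):
    # confidence = snow_ice_confidence(ndsi(b3, b6))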

Example 1: Lake Tahoe

This example illustrates output from a Landsat 8 scene of Lake Tahoe acquired on April 12, 2014 (LC80430332014102LGN00). For this image, both the NDSI output and QA assessment successfully differentiate snow/ice from other image features. The only significant difference, as can be observed here in the medium confidence output, is that the QA assessment identifies two lakes to the east and southeast of Lake Tahoe that are not included in the NDSI output. Without knowledge of ground conditions at the time of image acquisition, however, it is not feasible to assess the relative accuracy of whether these are, or are not, ice-covered lakes. Otherwise, the snow/ice output is in agreement for this image.

Landsat 8 Lake Tahoe NDSI Snow/Ice

Example 2: Cape Canaveral

This example illustrates output from a Landsat 8 scene of Cape Canaveral acquired on October 21, 2013 (LC80160402013294LGN00). Given its location, there is not expected to be any snow/ice identified in the image, as is the case for the high confidence NDSI output. However, for the medium and low confidence NDSI output there is some confusion with clouds, and for the QA assessment there is confusion with clouds and sand. This suggests a need to either incorporate other indices to refine the snow/ice output and/or include some geographic awareness in the analysis to eliminate snow/ice in regions where it is not expected to occur.

Landsat 8 Cape Canaveral NDSI Snow/Ice

Stay tuned for future posts on other Landsat 8 assessment options, as well as a discussion on how to combine the various indices into a single integrated quality assessment algorithm.

In the meantime, we welcome your feedback on how these indices perform on your own images.

– –

Dozier, J. (1989). Spectral signature of alpine snow cover from the Landsat Thematic Mapper. Remote Sensing of Environment, 28, p. 9-22.

Hall, D. K., Riggs, G. A., and Salomonson, V. V. (1995). Development of methods for mapping global snow cover using Moderate Resolution Imaging Spectroradiometer (MODIS) data. Remote Sensing of Environment, 54(2), p. 127-140.

Hall, D. K., Riggs, G. A., Salomonson, V. V., Barton, J. S., Casey, K., Chien, J. Y. L., DiGirolamo, N. E., Klein, A. G., Powell, H. W., and Tait, A. B. (2001). Algorithm theoretical basis document (ATBD) for the MODIS snow and sea ice-mapping algorithms. NASA GSFC. 45 pp.

Hall, D. K., and Riggs, G. A. (2014). Normalized-Difference Snow Index (NDSI). in Encyclopedia of Snow, Ice and Glaciers, Eds. V. P. Singh, P. Singh, and U. K. Haritashya. Springer. p. 779-780.

Enhancing the Landsat 8 Quality Assessment band – Using NDVI to delineate vegetation

This is the second installment in a series on developing alternative indices to automatically delineate features such as clouds, snow/ice, water and vegetation in Landsat imagery.

The previous article focused on utilizing the Normalized Difference Water Index to differentiate water from non-water (see Differentiating water using NDWI).

In this series of investigations, the challenge we have given ourselves is to utilize relatively simple indices and thresholds to refine some or all of the existing Landsat 8 quality assessment procedure, and wherever possible to also maintain backward compatibility with previous Landsat missions.

Landsat8 Lake Tahoe Vegetation

In this article we explore one of the most commonly used vegetation indices, the Normalized Difference Vegetation Index (NDVI) (Kriegler et al. 1969, Rouse et al. 1973, Tucker 1979), to see how it can be utilized to delineate the presence of vegetation. Since the Landsat 8 quality assessment band currently does not include output for vegetation, NDVI seems like a logical foundation for performing this assessment.

NDVI is typically used to indicate the amount, or relative density, of green vegetation present in an image; however, here we adapt this index to more simply indicate confidence levels with respect to the presence of vegetation.

To calculate NDVI in ENVI, you can either directly use the included NDVI tool (Toolbox > Spectral > Vegetation > NDVI) or calculate NDVI yourself using Band Math (Toolbox > Band Ratio > Band Math). If using Band Math, then implement the following equation (float(b5)-float(b4))/(float(b5)+float(b4)), where b4 is Band-4 (Red), b5 is Band-5 (NIR), and the float() operation converts the integer pixel values to floating point to avoid integer overflow and integer division.

After visually inspecting output to develop thresholds based on observed vegetation characteristics in our test images, results of the analysis indicate the following NDVI vegetation thresholds: low confidence (NDVI ≥ 0.2), medium confidence (NDVI ≥ 0.3), and high confidence (NDVI ≥ 0.4).
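As in the snow/ice post, the same calculation can be sketched outside ENVI. The short Python/NumPy snippet below (an illustration, not the workflow used here) highlights that NDVI shares the normalized-difference form used throughout this series; the band arrays b4 and b5 are assumed to be loaded already.

    import numpy as np

    def normalized_difference(a, b):
        """Generic (a - b) / (a + b); for NDVI, a = NIR (Band 5), b = Red (Band 4)."""
        a = a.astype(np.float32)
        b = b.astype(np.float32)
        with np.errstate(divide="ignore", invalid="ignore"):
            return (a - b) / (a + b)

    # Vegetation confidence from the thresholds above (illustrative usage):
    # ndvi = normalized_difference(b5, b4)
    # low, medium, high = ndvi >= 0.2, ndvi >= 0.3, ndvi >= 0.4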

Example 1: Lake Tahoe

This example illustrates output from a Landsat 8 scene of Lake Tahoe acquired on April 12, 2014 (LC80430332014102LGN00). The NDVI output for this image successfully differentiates vegetation from water, cloud, snow/ice and barren/rocky land. Note particularly how the irrigated agricultural fields to the east and southeast of Lake Tahoe are appropriately identified, and how the thresholds properly indicate increased vegetation trending westward of Lake Tahoe as one transitions downslope from the Sierra Nevada into the Central Valley of California.

Landsat8 Lake Tahoe NDVI Vegetation

Example 2: Cape Canaveral

This example illustrates output from a Landsat 8 scene of Cape Canaveral acquired on October 21, 2013 (LC80160402013294LGN00). As with the Lake Tahoe example, NDVI once again performs well at differentiating vegetation from water, cloud and barren land. Given the cloud extent and high prevalence of both small and large water bodies present in this image, NDVI demonstrates a robust capacity to effectively delineate vegetation. Such results are not unexpected given the general acceptance and applicability of this index in remote sensing science.

Landsat8 Cape Canaveral NDVI Vegetation

We’ll continue to explore other enhancements in future posts, and ultimately combine the various indices into a single integrated quality assessment algorithm.

In the meantime, we’re interested in hearing your experiences working with Landsat quality assessment and welcome your suggestions and ideas.

– –

Kriegler, F.J., W.A. Malila, R.F. Nalepka, and W. Richardson (1969). Preprocessing transformations and their effects on multispectral recognition. Proceedings of the Sixth International Symposium on Remote Sensing of Environment, p. 97-131.

Rouse, J. W., R. H. Haas, J. A. Schell, and D. W. Deering (1973). Monitoring vegetation systems in the Great Plains with ERTS, Third ERTS Symposium, NASA SP-351 I, p. 309-317.

Tucker, C. J. (1979). Red and photographic infrared linear combinations for monitoring vegetation. Remote Sensing of Environment, 8(2), p. 127-150.

Enhancing the Landsat 8 Quality Assessment band – Differentiating water using NDWI

(Update: 09-23-2014) Just added – see our related post on Using NDVI to delineate vegetation.

Are you working with Landsat 8 or other earlier Landsat data? Are you looking for solutions to automatically delineate features such as clouds, snow/ice, water and vegetation? Have you looked at the Landsat 8 Quality Assessment band, but find the indicators don’t meet all your needs?

If so, you’re not alone. This is a common need in most remote sensing applications.

After recently exploring the contents of the Quality Assessment (QA) band for examples from Lake Tahoe and Cape Canaveral (see Working with Landsat 8), it became readily apparent that there is room for improvement in the quality assessment indicators. So we set out to identify possible solutions to help enhance the output.

Landsat8 Lake Tahoe - Water

The challenge we gave ourselves was to utilize only relatively simple indices and thresholds to further refine some or all of the existing Landsat 8 quality assessment procedure, and wherever possible to also maintain backward compatibility with previous Landsat missions.

As a first step, let’s explore how the Normalized Difference Water Index (NDWI), as described by McFeeters (1996), can be utilized to differentiate water from non-water.

To calculate NDWI in ENVI, we used Band Math (Toolbox > Band Ratio > Band Math) to implement the following equation (float(b3)-float(b5))/(float(b3)+float(b5)), where b3 is Band-3 (Green), b5 is Band-5 (NIR), and the float() operation converts the integer pixel values to floating point to avoid integer overflow and integer division.

The NDWI output was visually inspected to develop thresholds based on known image and landscape features. Additionally, as with the QA band, rather than identify a single absolute threshold, three threshold values were used to indicate low, medium and high confidence that water is present.

As a caveat at this stage, note that this analysis currently only incorporates two example test images, which is far from rigorous. Many more examples would need to be incorporated to perform thorough calibration and validation of the proposed index. It is also expected that developing a robust solution will entail integrating the different indices into a rule-based decision tree (e.g., if snow/ice or cloud, then not water).

Results of the NDWI analysis for water indicate the following: low confidence (NDWI ≥ 0.0), medium confidence (NDWI ≥ 0.06), and high confidence (NDWI ≥ 0.09).
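To make the example end-to-end, the sketch below reads the two bands from the per-band Level-1 GeoTIFFs, computes NDWI, and bins the result into the three confidence levels. This is our own Python illustration (not part of the original analysis); the rasterio package and the _B3/_B5 file naming, consistent with the _BQA file discussed in our earlier QA post, are assumptions.

    import numpy as np
    import rasterio

    SCENE = "LC80430332014102LGN00"   # Lake Tahoe example (assumed file naming)

    def read_band(path):
        """Read a single-band Landsat 8 GeoTIFF as float32."""
        with rasterio.open(path) as src:
            return src.read(1).astype(np.float32)

    green = read_band(SCENE + "_B3.TIF")   # Band 3 (Green)
    nir = read_band(SCENE + "_B5.TIF")     # Band 5 (NIR)

    with np.errstate(divide="ignore", invalid="ignore"):
        ndwi = (green - nir) / (green + nir)

    # 0 = not water, 1 = low, 2 = medium, 3 = high confidence (thresholds above);
    # NaN (e.g., fill) pixels are mapped below the lowest threshold first
    confidence = np.digitize(np.nan_to_num(ndwi, nan=-1.0), bins=[0.0, 0.06, 0.09])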

Example 1: Lake Tahoe

This example illustrates output for a subset Landsat 8 scene of Lake Tahoe acquired on April 12, 2014 (LC80430332014102LGN00). Here we see improvement over the QA band water index, which exhibits significant confusion with vegetation. The NDWI output performs very well at the high confidence level, but includes some confusion with snow/ice and cloud at the low and medium confidence levels. We expect much of this confusion can be resolved once a decision tree is incorporated into the analysis.

Landsat8 Lake Tahoe Quality Assessment - Water

Example 2: Cape Canaveral

This example illustrates output for a subset Landsat 8 scene of Cape Canaveral acquired on October 21, 2013 (LC80160402013294LGN00). As with the previous example, there is significant improvement over the existing QA band water index. There is again some confusion with cloud at the low and medium confidence levels, but strong performance at the high confidence level. As a result, this output also shows promise as the foundation for further improvements using a decision tree.

Landsat8 Cape Canaveral Quality Assessment - Water

We’ll continue to explore other enhancements in future posts. In the meantime, we’d love to hear your experiences working with Landsat quality assessment and welcome your suggestions and ideas.

– –

McFeeters, S. K. (1996). The use of the Normalized Difference Water Index (NDWI) in the delineation of open water features. International Journal of Remote Sensing, 17(7), 1425-1432.

Working with Landsat 8 – Using and interpreting the Quality Assessment (QA) band

So you’ve downloaded a Landsat 8 scene and are eager to begin your investigation. As you get started, let’s explore how the Quality Assessment band distributed with the data can be used to help improve your analysis.

Landsat8 Lake Tahoe

What is the QA band?

As summarized on the USGS Landsat 8 product information website: “Each pixel in the QA band contains a decimal value that represents bit-packed combinations [QA bits] of surface, atmosphere, and sensor conditions that can affect the overall usefulness of a given pixel.”

“Rigorous science applications seeking to optimize the value of pixels used in a study will find QA bits useful as a first level indicator of certain conditions. Otherwise, users are advised that this file contains information that can be easily misinterpreted and it is not recommended for general use.”

What are QA bits?

Rather than utilize multiple bands for indicating conditions such as water, clouds and snow, the QA band integrates this information into 16-bit data values referred to as QA bits. As a result, a significant amount of information is packed into a single band; however, this also means that certain steps are required to extract the multi-layered information content from the integrated QA bits.

“The pixel values in the QA file must be translated to 16-bit binary form to be used effectively. The gray shaded areas in the table below show the bits that are currently being populated in the Level 1 QA Band, and the conditions each describe. None of the currently populated bits are expected to exceed 80% accuracy in their reported assessment at this time.”

Landsat8 QA Bands

“For the single bits (0, 1, 2, and 3):

  • 0 = No, this condition does not exist
  • 1 = Yes, this condition exists.”

“The double bits (4-5, 6-7, 8-9, 10-11, 12-13, and 14-15) represent levels of confidence that a condition exists:

  • 00 = Algorithm did not determine the status of this condition
  • 01 = Algorithm has low confidence that this condition exists (0-33 percent confidence)
  • 10 = Algorithm has medium confidence that this condition exists (34-66 percent confidence)
  • 11 = Algorithm has high confidence that this condition exists (67-100 percent confidence).”

How are QA bits calculated?

QA bit values are calculated at various stages during the radiometric and geometric correction process. An overview of the algorithms used for calculating QA bits is provided in the LDCM CAL/VAL Algorithm Description Document.

The single QA bits (0-3) are used to signify: missing data and pixels outside the extent of the image following geometric correction (designated fill); dropped lines (dropped frame); and pixels hidden from sensor view by the terrain (terrain occlusion).

The double QA bits (4-15) are calculated using the LDCM Cloud Cover Assessment (CCA) system, which consists of several intermediate CCA algorithms whose results are merged to create final values for each Landsat 8 scene. The algorithms utilize a series of spectral tests, and in one case a statistical decision tree model, to assess the presence of cloud, cirrus cloud, snow/ice, and water.

As the name implies, the heritage of the CCA system is based on cloud detection; hence algorithms are directed primarily at identifying clouds, with secondary attention to snow/ice and water. Keep this in mind when interpreting results, particularly with respect to water discrimination, which is reportedly poor in most cases.

How do I use QA bits?

While it is feasible to translate individual QA bits into their respective information values, or implement thresholds to extract specific values or ranges of values, this isn’t practical for accessing the full information content contained in the QA band.
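As a concrete illustration of what translating individual QA bits involves, the Python sketch below pulls a single-bit flag and a two-bit confidence field out of the packed 16-bit values with bitwise operations. This is our own example, not part of the USGS tooling, and the bit positions shown should be checked against the USGS QA table referenced above.

    import numpy as np

    def qa_single_bit(qa, bit):
        """True where a single-bit flag (e.g., bit 0 = designated fill) is set."""
        return ((qa >> bit) & 1).astype(bool)

    def qa_confidence(qa, start_bit):
        """Two-bit confidence value (0-3) stored at start_bit and start_bit + 1."""
        return (qa >> start_bit) & 0b11

    # Illustrative usage on a QA array read from the *_BQA.TIF file
    # (assumed bit positions; confirm against the USGS table):
    # qa = ...                              # 16-bit unsigned integer array
    # fill_mask = qa_single_bit(qa, 0)      # designated fill
    # cloud_conf = qa_confidence(qa, 14)    # 3 = high confidence cloud
    #
    # A single pixel value can also be inspected as 16-bit binary:
    # format(int(qa[100, 100]), "016b")

For anything beyond a quick check of this kind, however, the L-LDOPE Toolbelt described below remains the more practical route.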

Instead, try using the L-LDOPE Toolbelt, a no-cost tool available from the USGS Landsat 8 website that includes “functionality for computing histograms, creating masks, extracting statistics, reading metadata, reducing spatial resolution, band and spatial subsetting, and unpacking bit-packed values… the new tool [also] extracts bits from the OLI Quality Assessment (QA) band to allow easy identification and interpretation of pixel condition.”

Note that the L-LDOPE Toolbelt does not include a graphical user interface, but instead operates using command-line instructions. So be sure to download the user guide, which includes the specific directions for implementing the various executables.

L-LDOPE Toolbelt example

As an example, let’s walk through the steps needed to unpack the QA bits from a Landsat 8 image of Lake Tahoe using a Windows 7 x64 desktop system:

  • Unzip the L-LDOPE Toolbelt zip file and place the contents in the desired local directory.
  • Open the Windows Command Prompt (All Programs > Accessories > Command Prompt) and navigate to the respective ‘bin’ directory for your operating system (‘windows64bit_bin’ in our example).
  • For simplicity, copy the QA file (e.g., LC80430332014102LGN00_BQA.TIF) to the same ‘bin’ directory as identified in the previous step. For users familiar with command-line applications the data can be left in a separate directory with the executable command adjusted accordingly.
  • Execute the unpacking application (unpack_oli_qa.exe) using the following command (typed entirely on one line):

Landsat8 Unpack QA

  • The above example extracts all the QA bits using the default confidence levels and places them in separate output files.
  • Refer to the user guide for instructions on how to change the defaults, extract only select QA bits, and/or combine output into a single file.

Example 1: Lake Tahoe

This example illustrates QA output for a subset Landsat 8 scene of Lake Tahoe acquired on April 12, 2014 (LC80430332014102LGN00). Note that snow/ice in the surrounding mountains is identified with reasonable accuracy, cloud discrimination is also reasonable but includes significant confusion with snow/ice, and water is poorly characterized, including many extraneous features beyond just water bodies.

Landsat8 Lake Tahoe QA

Example 2: Cape Canaveral

This example illustrates QA output for a subset Landsat 8 scene of Cape Canaveral acquired on October 21, 2013 (LC80160402013294LGN00). Here the cloud discrimination is reasonable but includes confusion with beach areas along the coastline, the snow/ice output interestingly misidentifies some cloud and beach areas, and water discrimination is again poorly defined.

Landsat8 Cape Canaveral QA

With these examples in mind, it is worth repeating: “Rigorous science applications seeking to optimize the value of pixels used in a study will find QA bits useful as a first level indicator of certain conditions. Otherwise, users are advised that this file contains information that can be easily misinterpreted and it is not recommended for general use.”

Be sure to keep this in mind when exploring the information contained in the QA band.

For more info on the L-LDOPE Toolbelt: https://landsat.usgs.gov/L-LDOPE_Toolbelt.php

For more info on Landsat 8: https://landsat.usgs.gov/landsat8.php

 

VISualize 2014 – Call for abstracts now open

UPDATE (6-April-2015): Announcing the ENVI Analytics Symposium – taking place in Boulder, CO from August 25-26, 2015. Those looking for the VISualize symposium, which has been indefinitely postponed, should consider attending the inaugural ENVI Analytics Symposium as a great opportunity to explore the next generation of geoanalytic solutions.

Just announced!  VISualize 2014, the annual IDL & ENVI User Group Meeting hosted by Exelis Visual Information Solutions, will be taking place October 14-16 at the World Wildlife Fund in Washington, DC.

HySpeed Computing is honored to once again be co-sponsoring this year’s VISualize. We are excited to speak with you and see your latest remote sensing applications.

At this year’s meeting HySpeed Computing will be presenting results from our latest project – a prototype cloud computing system for remote sensing image processing and data visualization. We hope to see you there.

Abstract submission deadline is September 12. Register today!

VISualize2014

“Please join us at VISualize 2014, October 14th – 16th, at the World Wildlife Fund in Washington, DC. This three day event explores real-world applications of ENVI and IDL with a specific focus on Modern Approaches for Remote Sensing & Monitoring Environmental Extremes.

Suggested topics include:

  • Using new data platforms such as UAS, microsatellites, and SAR sensors for environmental assessments
  • Land subsidence monitoring and mapping techniques
  • Remote sensing solutions for precision agriculture mapping
  • Drought, flood, and extreme precipitation event monitoring and assessment
  • Wildfire and conservation area monitoring, management, mitigation, and planning
  • Monitoring leaks from natural gas pipelines

Don’t miss this excellent opportunity to connect with industry thought leaders, researchers, and scientists.”

Register today!